
S N S Lakshman

Expertise: Informatica PowerCenter, Pentaho Data Integration, SQL

Experience: 4 Years 8 Months

E-mail: [email protected]

Mobile No: +91-7204496813

Cognizant Technology Solutions

Hyderabad

Career Objective:

To associate with an organization that progresses dynamically, provides me an opportunity to update my knowledge and enhance my skills in new technologies, and lets me be part of a team that excels in working towards the growth of the organization, giving me personal and professional satisfaction.

Professional Summary:

 Hands-on experience in developing mappings in Pentaho Data Integration using SQL Server.
 Hands-on experience in developing mappings and workflows in Informatica PowerCenter using Oracle 10g.
 Created and altered database tables as per client requirements.
 Hands-on experience in deploying Pentaho KTR files.
 Experience in version control using Git.
 Experience in raising Change Requests for the deployment process.
 Experience in code review and managing pull requests raised in Bitbucket and GitHub.
 Good client-relation skills and the drive to complete tasks effectively and efficiently where customer service and technical skills are demanded.
 Good team member with a positive attitude: self-motivated, a quick learner, and willing to adapt to new challenges and new technologies.

Technical Skills & Tools:

ETL Tools : Informatica PowerCenter 9.x/10.x, Pentaho Data Integration 9.1

Database : SQL Server, Oracle 10g

Scheduling Tools : Autosys

IDE Tools : IntelliJ


Version Management Tools : Git

Build Tools : Maven

Domain Knowledge: Banking, Retail, Media

Education:

S. No.  Education               Institution                                 Percentage

1       B. Tech (2013-18)       G V P College of Engg (A), JNTU Kakinada    58.98

2       Intermediate (2011-13)  Sri Chaitanya Junior College                79.6

3       S S C (2010-11)         Bhashyam Public School                      68.5

CERTIFICATIONS:

Certified in SQL (Basic) and SQL (Intermediate) on HackerRank

PROFESSIONAL EXPERIENCE:

COGNIZANT Oct 2021 - March 2023

Associate – Projects

1. Project: Mortgage Project


Domain : Banking

Role : ETL Developer

Description:

In the Mortgage project, data is extracted from SQL Server on a daily basis and sent to downstream systems. Some jobs use Pentaho Data Integration and others use Informatica PowerCenter. Jobs are triggered through Autosys and routed through the Mortgage-Data-Feed engine; the data is saved in an S3 bucket and moved by File Mover to the respective downstream systems.

Responsibilities:

 Analysed Informatica PowerCenter mappings and migrated them to Pentaho transformations.
 Created ETL mappings using Pentaho Data Integration and sent the data downstream.
 Created control files to trigger the Autosys jobs at a specific time.
 Created File Mover routes for Pentaho jobs.
 Effectively used the Table Input, Select Values, Calculator, Call DB Procedure, Merge Join and Table Output steps (a representative Table Input query is sketched below).
 Deployed KTR files through the Jules CI/CD pipeline.
 Performed unit testing at various levels of the ETL; actively involved in code reviews and fixing errors in the code.
 Monitored and provided daily production support for Pentaho jobs, fixing issues and handling incidents.
 Changed the Autosys commands from Informatica to Pentaho; the jobs run through the Mortgage-Data-Feed engine.
 Created ServiceNow (SNOW) tickets for PROD deployment.
 Involved in Informatica production support.
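
A representative Table Input extract of the kind used in these jobs might look like the sketch below; the table and column names (dbo.MORTGAGE_LOANS, feed_date, and so on) are illustrative assumptions, not the project's actual schema.

    -- Hypothetical daily extract for a Pentaho Table Input step (SQL Server).
    -- Table and column names are assumed for illustration only.
    SELECT loan_id,
           borrower_id,
           principal_amount,
           interest_rate,
           feed_date
    FROM   dbo.MORTGAGE_LOANS                    -- assumed source table
    WHERE  feed_date = CAST(GETDATE() AS DATE)   -- pull only today's records
    ORDER  BY loan_id;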

Environment Tools:

Pentaho Data Integration 9.1, Informatica PowerCenter 10.2, SQL Server Management Studio, Autosys, Swagger, File Mover.

PROFESSIONAL EXPERIENCE:

IBM Sep 2020 – Oct 2021

Software Engineer

2. Project : ASDA

Domain : Retail

Role : ETL Developer

Description :

ASDA, a British supermarket retail giant, became a subsidiary of Walmart after a takeover in July 1999 and currently ranks third by market share in the UK. ASDA operates online grocery delivery and pick-up services at more than 300 stores, delivering over a billion items across thousands of truck trips each year. A majority of the annual cost of fulfilment is attributed to the in-store order-picking process, as most e-commerce grocery orders are picked and shipped from ASDA stores. Improving in-store order-fulfilment velocity ensures that more items are processed per picker-hour, thereby improving the capacity of online orders fulfilled in a day (directly impacting revenue), increasing on-time deliveries and driving operational costs down.
Responsibilities:

 Interacted with the customer's business users to understand business requirements.
 Developed/modified Informatica mappings, mapplets, workflows and worklets.
 Extracted data from different sources such as Oracle and flat files.
 Created connected and unconnected Lookup transformations to look up data from the source and target tables.
 Worked on application decommissioning activities; involved in designing Slowly Changing Dimension Type 1 and Type 2 logic (a Type 2 sketch follows this list).
 Extensively used ETL to load data from Oracle and flat files into the data warehouse.
 Performed unit testing and integration testing of Informatica mappings.
 Implemented various performance-tuning techniques.
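
As a sketch of the Slowly Changing Dimension Type 2 pattern referenced above, the expire-then-insert logic can be expressed in Oracle SQL roughly as follows; DIM_CUSTOMER, STG_CUSTOMER and their columns are assumed names, not the project's actual schema.

    -- Hypothetical SCD Type 2 load: expire the current row, then insert the new version.
    -- All table and column names are illustrative assumptions.
    UPDATE dim_customer d
    SET    d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);      -- a tracked attribute changed

    INSERT INTO dim_customer
           (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');    -- changed or brand-new customers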

Environment:

Informatica PowerCenter 10.x, SQL Developer, JIRA, UNIX, Shell Scripting, Agile.

PROFESSIONAL EXPERIENCE:

IBM Apr 2019 – Sep 2020

3. Project : Solaris Comcast

Domain : Media

Role : ETL Developer

Description:

Rogers Inc. sells Video on Demand (VOD) services to its clients. The VOD service statistics are collected and stored within various source systems. The purpose of the project is to integrate all statistics into one holistic view so that the business can use the outputs for reporting and data analysis. The project also includes a reconciliation process with various validation rules within the ETL, so that corrupt data can be pushed out to unreconciled tables and reconciled later. Solaris is delivered to consumers via the Maestro/COMCAST platform. VST requires data for settlement purposes. Transactional TVOD data (Events, Credits or Reversals) covers the library of videos for which subscribers pay per rental; the record is created using Maestro, as the rental is eligible for rating and billing. Exadata receives asset information from Hadoop, and it is agreed between the Exadata and Hadoop teams that the latter sends a separate row for each of the asset types (Movie, Poster, Preview, Title).

Exadata is responsible for consolidating all column values for each Package Asset ID (sketched below); the asset data goes through the reconciliation process before the legacy tables are updated.
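
As an illustration of that consolidation step, a conditional-aggregation query of roughly the following shape would collapse the per-type rows into a single row per Package Asset ID; STG_ASSET_FEED and its columns are assumed names used only for this sketch.

    -- Hypothetical pivot of one row per asset type into one row per package.
    -- Table and column names are illustrative assumptions.
    SELECT package_asset_id,
           MAX(CASE WHEN asset_type = 'Movie'   THEN asset_id END) AS movie_asset_id,
           MAX(CASE WHEN asset_type = 'Poster'  THEN asset_id END) AS poster_asset_id,
           MAX(CASE WHEN asset_type = 'Preview' THEN asset_id END) AS preview_asset_id,
           MAX(CASE WHEN asset_type = 'Title'   THEN asset_id END) AS title_asset_id
    FROM   stg_asset_feed
    GROUP  BY package_asset_id;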
Responsibilities:
 Responsible for gathering the suite of business requirements and preparing Source-to-Target Mapping specifications and transformation rules.
 Created the Source-to-Target Mapping specification document.
 Involved in system study, analysing the requirements by meeting the client and designing the system.
 Developed mappings, reusable objects, transformations and mapplets using the Mapping Designer, Transformation Developer and Mapplet Designer.
 Extracted data from different sources such as Oracle and flat files.
 Designed and developed complex aggregate, join and lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL tool.
 Used the Update Strategy transformation to update the target dimension tables.
 Created connected and unconnected Lookup transformations to look up data from the source and target tables.
 Involved in performance tuning for sources, targets, mappings, sessions and the server.
 Developed a batch file to automate the task of executing the different workflows and sessions associated with the mappings on the development server.
 The PeopleSoft Application Engine was used to load the data marts.
 Created test cases and completed unit, integration and system tests for the data warehouse.
 Joined tables originating from Oracle.
 Wrote test cases and test conditions for various derivatives and subject areas.
 Actively participated in team meetings and discussions to propose solutions to problems.

Environment:
Informatica PowerCenter 9.6, SQL Developer, UNIX, JIRA, Shell Scripting

PROFESSIONAL EXPERIENCE:

IBM June 2018 – Mar 2019


4. Project : TDP
Domain : Banking
Team Size : 9
Role : ETL Developer

Description :
This application mainly targets the ICF 7.4 to TDP integration, i.e., checking whether the new source systems implemented on ICF 7.4 serve all the business logic properly, and whether the data populated in the EDW target has the right data types coming straight from the new ICF source systems (views). It also covers downstream feeds such as the Tax, Liquidity and Cash On Hand reports, and the Sigma feed, which will no longer be needed.

Responsibilities:
 Gathered the suite of business requirements and prepared Source-to-Target Mapping specifications and transformation rules.
 Created the Source-to-Target Mapping specification document.
 Involved in system study, analysing the requirements by meeting the client and designing the system.
 Developed mappings, reusable objects, transformations and mapplets using the Mapping Designer, Transformation Developer and Mapplet Designer.
 Extracted data from different sources such as Oracle and flat files.
 Designed and developed complex aggregate, join and lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using the Informatica ETL tool.
 Used the Update Strategy transformation to update the target dimension tables.
 Created connected and unconnected Lookup transformations to look up data from the source and target tables.
 Involved in performance tuning for sources, targets, mappings, sessions and the server.
 Used PL/SQL and UNIX shell scripts for scheduling the sessions in Informatica.
 Wrote SQL and PL/SQL for implementing business rules and transformations.
 Developed a batch file to automate the task of executing the different workflows and sessions associated with the mappings on the development server.
 The PeopleSoft Application Engine was used to load the data marts.
 Created test cases and completed unit, integration and system tests for the data warehouse.
 Joined tables originating from Oracle.
 Wrote test cases and test conditions for various derivatives and subject areas.
 Actively participated in team meetings and discussions to propose solutions to problems.
 Prepared the QA test plan, test cases and QA sign-off documents.
 Prepared test cases and the test plan in HP ALM.
 The QA team is to validate the DB data in ICF 7.4.
 Analysed the business requirements according to the DMS logic.
 Analysed the user's tasks and developed a model of the tasks and the flow of work between them.
 The QA team is to validate the end-to-end testing from source table columns to target table columns, covering business requirements, hard-coded values and straight copies.
 The QA team is to validate the workflow dependencies.
 The QA team is to validate the trigger-file functionality.
 The QA team is to validate the Command task and Event Wait task.
 Verified that, in order to pull cash pool accounts, the logic being used is BANKACCOUNTS.BANKACCTTYPEID='INT' (a validation sketch follows this list).
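
A QA check for that cash pool rule could take roughly the following shape; the BANKACCOUNTS.BANKACCTTYPEID='INT' predicate comes from the requirement itself, while the remaining table and column names are assumptions made for this sketch.

    -- Hypothetical validation: every loaded cash pool account must satisfy the rule.
    -- EDW_CASH_POOL_ACCOUNTS and BANKACCTID are assumed names; only the
    -- BANKACCOUNTS.BANKACCTTYPEID = 'INT' condition is from the requirement.
    SELECT t.bankacctid
    FROM   edw_cash_pool_accounts t
    WHERE  NOT EXISTS (SELECT 1
                       FROM   bankaccounts b
                       WHERE  b.bankacctid     = t.bankacctid
                       AND    b.bankaccttypeid = 'INT');
    -- Any rows returned indicate accounts that were pulled without the required type.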

Tools Used : Informatica 9.5.1, PL/SQL, Oracle 11g, HP ALM, UNIX, Shell Scripting.
