Talend Resume


SUMMARY:

 7+ years of IT experience in design, development, maintenance, enhancement, and production support, including data warehousing and legacy applications, using ETL tools such as Talend and Informatica.
 IT experience across industries including Financial; CRM address standardization (establishing global customer processes for address cleansing and standardized customer data); Telecom (CRM, billing, provisioning, order management, inventory systems); and US Healthcare (RxClaim pharmacy PBM, covering integration, implementation, SDLC, claims, benefits, eligibility, Medicare & Medicaid rebates, and Iris).
 Experience adhering to software methodologies such as Waterfall and Agile.
 Experienced in using TBD and Talend Data Fabric tools (Talend DI, Talend MDM, Talend DQ, Talend Data Preparation, ESB, TAC).
 Experienced in using all Talend DI, MDM, DQ, DP, and ESB components.
 Experienced in match/merge in MDM, running match rules to check the effectiveness of the MDM process on data.
 Experienced in data ingestion projects loading data into a data lake from multiple source systems using Talend Big Data.
 Experienced in Talend service-oriented web services using SOAP, REST, and XML/HTTP technologies with Talend ESB components.
 Experienced in scheduling Talend jobs using Talend Administration Console (TAC).
 Experienced in ETL Talend Data Fabric components, using features such as context variables and MySQL, Oracle, and Hive database components.
 Good understanding of relational database management systems; experience integrating data from various data sources such as Oracle, MS SQL Server, MySQL, and flat files.
 Experience with Talend developing processes for extracting, cleansing, transforming, integrating, and loading data into data mart databases.
 Capable of processing large sets of structured, semi-
structured and unstructured data and supporting systems application
architecture.
 Good experience with Talend architecture and Talend installation.
 Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modelling, data mining, machine learning, and advanced data processing; experienced in optimizing ETL workflows.
 Good experience with Big Data and Hadoop ecosystem components such as Pig, Hive, Sqoop, Flume, and MapReduce.
 Experience in debugging, error handling, and performance tuning of sources, targets, jobs, etc.
 Profound experience with Informatica PowerCenter 9.6.1/9.1/8.6/8.1 and Power Exchange.
 Good Knowledge and experience in using distributed Environments for
providing BI solutions to build analytics systems.
 Experience in writing UNIX scripts for Informatica jobs that are used
for data movement and transformation purpose.
 Good Knowledge and experience in performance tuning in the live
systems for ETL jobs that are built on Informatica as well as Talend.
 Experience in writing database objects such as stored procedures and triggers for Oracle and MySQL; good knowledge of PL/SQL and hands-on experience writing medium-complexity SQL queries.
 Experience in converting stored procedure logic into ETL requirements.
 Hands-on experience with scheduling tools such as Talend Administrator, Autosys, Informatica Scheduler, and Control M for ETL jobs.
 Hands-on experience working with Jira, Rally, HP QC (Quality Center), Perforce (version control), and BMC Remedy (ticketing & change-request management).
 Experience leading mid-size teams and serving as an offshore-onsite team coordinator.

TECHNICAL SKILLS:

ETL Tool: Talend Data Fabric 6.2, 6.1, Informatica Power Center
9.6.1/9.1/8.6/8.1 , Informatica IDQ

ETL Scheduling Tool: TAC, Informatica Scheduler, Autosys, Control M, UC4 Manager

Databases: Netezza, Oracle 10g & 11i, Teradata, MS SQL Server, DB2, Hive,
MySQL

DB Tool: Toad, SQL Developer, WinSQL, Squirrel Client & SQL Assistant

Other Tools: HP QC, BMC-Remedy, TTS

Operating Systems: Windows family, Linux, and Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Senior ETL Talend Developer

Responsibilities:

 Responsible for building jobs based on the ETL specification documents.
 Responsible for profiling source-system data using different DQ techniques.
 Responsible for developing DI jobs to load data from PeopleSoft to Oracle ERP.
 Responsible for defining a reusable framework for all interfaces covering extracting, loading, zipping, and archiving files.
 Experienced in creating context loads using parameter variables.
 Experienced in implementing end-to-end project solutions.
 Experienced in using Talend with Big Data via Hive, Sqoop, and HDFS components.
 Experienced in loading data into the data lake.
 Developed and executed Talend DI jobs for one module of the Rebates system.
 Extensively used Talend Administration Console (TAC) to run jobs on different servers by passing various context parameters.
 Knowledgeable in scheduling jobs on TAC.
 Used SVN as the version control tool for Talend jobs.
 Developed jobs to perform address standardization, where customer addresses are standardized and loaded into an HBase table using Talend jobs.
 Developed joblets that are reused in different processes in the flow.
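The context-parameter runs described above can be illustrated with a small sketch. This is a hypothetical example, not taken from the actual project: it turns a Talend-style context file (key=value lines) into the --context_param arguments an exported Talend job launcher expects; file names and keys are placeholders.

```shell
#!/bin/sh
# Sketch: convert a Talend-style context file (key=value lines) into
# --context_param arguments for an exported job script.
# File names and keys below are illustrative only.

build_context_args() {
  ctx_file="$1"
  args=""
  while IFS='=' read -r key value; do
    [ -z "$key" ] && continue          # skip blank lines
    args="$args --context_param $key=$value"
  done < "$ctx_file"
  echo "$args"
}

# Example use (job script name is hypothetical):
# build_context_args dev.properties | xargs ./RebatesJob_run.sh
```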

Confidential, Milwaukee, WI

Senior ETL Talend Developer (DQ/DI/MDM)

Responsibilities:

 Responsible for building jobs based on the ETL specification documents.
 Responsible for profiling source-system data using different DQ techniques.
 Responsible for developing DI jobs to implement address validation, cleansing, and standardization in Talend ETL using components such as tRecordMatching, tFuzzyMatch, and tMatchGroup, along with other DI, DP, and DQ components, using features such as context variables and database components.
 Responsible for understanding & deriving the new requirements from
Business Analysts/Stakeholders.
 Responsible for data ingestion into the data lake from multiple source systems using Talend Big Data.
 Responsible for MDM of customer data using Talend MDM, covering customers, suppliers, products, assets, agencies, stores, address standardization, reference data, and employees; MDM is about creating and managing the golden records of the business.
 Configured match rule set property by enabling search by rules in MDM
according to Business Rules.
 Responsible for developing the model in Talend MDM, as well as the DI jobs that populate the REF/XREF tables and create data stewardship tasks.
 Experience in using Talend MDM components such as tMDMBulkLoad, tMDMClose, tMDMCommit, tMDMConnection, tMDMDelete, tMDMInput, tMDMOutput, tMDMReceive, tMDMRollback, tStewardshipTaskDelete, tStewardshipTaskInput, and tStewardshipTaskOutput.
 Responsible for ingesting data from multiple source systems into Hive.
 Responsible for running Talend jobs using TAC.
 Responsible for developing jobs using ESB components such as tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse to make service calls for customer DUNS numbers.
 Experienced with match and merge scenarios for different domains.
 Responsible for preparing the logical/physical design of the tables involved in this new enhancement within the EDW.
 Created tables, indexes, partitioned tables, materialized views, stored procedures, and packages in the Oracle database.
 Worked on designing table layouts in Hive and scripts to write data into Hadoop.
 Responsible for modelling the new requirements based on BDD method
for ELT applications.
 Developed reusable UNIX scripts for executing Teradata utility jobs for data loading and extraction.
 Interacted directly with business owners and provided tested solutions for user sign-off.
 Responsible for accepting ad hoc modifications/new enhancements from business users for existing and new applications, and incorporated the changes without affecting current functionality.
 Responsible for improving/maximizing ELT job performance by modifying queries/mappings and partitioning sessions using performance-tuning techniques.
 Led both the development and support teams.
 Responsible for monitoring the Daily/Weekly/Monthly Jobs that are
scheduled using UC4 manager.

Environment: Talend Data Fabric 6.2, 6.1, Oracle, Hive, MYSQL, Visual Studio,
SOAP and Restful web services, TAC
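The Hive/data-lake ingestion described in this role is the kind of load typically driven by Sqoop. As a minimal sketch (connection URL, table, and database names are placeholders, not from the original project), the function below composes, but does not run, the import command:

```shell
#!/bin/sh
# Sketch: compose a Sqoop import that lands a source table in Hive.
# JDBC_URL and the table/database arguments are placeholders.

sqoop_hive_import_cmd() {
  src_table="$1"
  hive_db="$2"
  printf 'sqoop import --connect %s --table %s --hive-import --hive-database %s --hive-table %s --num-mappers 4\n' \
    "$JDBC_URL" "$src_table" "$hive_db" "$src_table"
}

# Example:
# JDBC_URL='jdbc:oracle:thin:@//dbhost:1521/ORCL'
# sqoop_hive_import_cmd CUSTOMERS lake | sh   # run the composed command
```

Composing the command first keeps the connection details in one place and makes the wrapper easy to dry-run in job logs.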

Confidential

Senior ETL Informatica Consultant

Responsibilities:

 Developed a number of complex Informatica mappings, mapplets, and reusable transformations to implement the business logic and load data incrementally (delta load/full load).
 Extracted data from flat files and heterogeneous relational databases into the staging area and populated the data warehouse using SCD logic to maintain history.
 Created and used workflow variables, session parameters, and mapping parameters; partitioning; incremental aggregation; scheduler; indirect-method loading; constraint-based loading; incremental loading; target load plan; SCD Type 1 and Type 2; and tasks.
 Worked on optimizing and tuning the Teradata views and SQL’s to
improve the performance of batch and response time of data for users.
 Created Informatica mappings to load data using transformations like Filter, Expression, Router, Sorter, Rank, Transaction Control, Source Qualifier, Stored Procedure, SQL Transformation, Normalizer, Sequence Generator, Aggregator, Union, Joiner, Update Strategy, Dynamic Lookup, and connected and unconnected Lookups.
 Implemented error-handling strategies in mappings where required and applied default values.
 Experienced in performance tuning of SQL/PLSQL queries and at the Informatica level.
 Identified and resolved the bottlenecks in source, target,
transformations, mappings and sessions to improve performance.
 By examining logs and busy percentages, identified whether the bottleneck was in the reader, writer, or transformation thread, and fixed it accordingly to improve performance.
 Implemented Slowly Changing Dimensions (SCD Type 1 and Type 2).
 Maintained Code standards in warehouse metadata, naming standards
and warehouse standards for future application development
 Created & maintained tables, views, synonyms and indexes from
Logical database design document.
 Extensively Worked on Extraction, Transformation and Load
(ETL) process using PL/SQL to populate the tables
in OLTP and OLAP Data Warehouse Environment
 Tested the mappings using the Test Load option and Unit Test cases
 Performed the Unit and Integration Testing, which validated that the
data is mapped correctly which provides a qualitative check of overall
data flow.
 Created and Documented ETL Test Plans, Test Cases, Expected
Results, Assumptions and Validations
 Created Teradata external loader connections such as MLoad, Upsert and Update, and FastLoad while loading data into target tables in the Teradata database.
 Provided technical support in the deployment of workflows, worklets, and sessions for weekly releases using Repository Manager, and maintained the runbook accordingly.
 Data corrections: identified problems using custom SQL/PLSQL scripts and resolved them using PL/SQL procedures and functions.
 Performed regular maintenance of billing activities and resolved daily issues using SQL/PLSQL.
 Efficient design and construction of the system to extract data from
operational systems into the defined physical data model per the
business and technical requirement.
 Review the requirement documents with the Client Data Analyst team
and the Business teams as Business System analyst.
 Strong experience in production support, implementation, and RCA; fixed issues per SLA and provided L1 and L2 support to the team and client for normal and emergency production issues.
 Worked for one year on a proof of concept to migrate the project to the Talend tool.
 Developed and executed Talend DI jobs for one module of the Rebates system.
 Extensively used Talend Administration Console (TAC) to run jobs on different servers by passing various context parameters.
 Knowledgeable in scheduling jobs on TAC.
 Used SVN as the version control tool for Talend jobs.
 Developed jobs to perform address standardization, where customer addresses are standardized and loaded into an HBase table using Talend jobs.
 Developed joblets that are reused in different processes in the flow.

Environment: Informatica 9.6.1 PC, Informatica 9.6.1 IDQ Developer, Teradata, SQL/PLSQL.
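The weekly workflow deployments and releases in this role would typically be kicked off from a UNIX wrapper via Informatica's pmcmd. The sketch below composes the pmcmd call; the service, domain, folder, and workflow names are placeholders, and credentials are passed via environment variables (-uv/-pv) rather than inline:

```shell
#!/bin/sh
# Sketch: compose the pmcmd call used to start an Informatica workflow from
# a UNIX wrapper. INT_SERVICE/INFA_DOMAIN and the folder/workflow arguments
# are placeholders; INFA_USER/INFA_PASS name the env vars holding credentials.

pmcmd_start_cmd() {
  folder="$1"
  workflow="$2"
  printf 'pmcmd startworkflow -sv %s -d %s -uv INFA_USER -pv INFA_PASS -f %s -wait %s\n' \
    "$INT_SERVICE" "$INFA_DOMAIN" "$folder" "$workflow"
}

# Example:
# INT_SERVICE=IS_DEV INFA_DOMAIN=Domain_Dev
# pmcmd_start_cmd EDW_FOLDER wf_daily_load
```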

Confidential

ETL Informatica - Netezza Developer

Responsibilities:

 Responsible for remodelling the existing business logic into new Netezza models for the EDW.
 Understood the existing SQL Server stored procedure logic and converted it into ETL requirements.
 XML Generation Process - Identifying the required NZ source tables from
the re-modelled NZ tables
 Creating the hybrid mappings for XML generation process for different
frequencies
 Identified the rule sets to be applied to each client's info along with members' & providers' info.
 Validated the data received, generated XML files for each client, and transferred them to the required third parties/downstream systems.
 Modifying the generated XML files using XML
formatter/Validator/Beautifier as per business owner/third-party
requirements
 Preparing the UNIX scripts for SFTP of XML files to different vendors on
external Servers.
 Unit testing and System Testing of mappings scheduling the ETL jobs
using Control M scheduler
 Monitoring the daily/weekly DW ETL workflows

Environment: Informatica 9.1 PC, Oracle, Netezza, MySQL Server, Autosys, Toad, WinSQL, XMLReader, Windows XP & UNIX.
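The SFTP delivery of XML files to external vendors described above is usually scripted as a batch file fed to sftp -b. The sketch below builds such a batch file; directory paths, host, and user are placeholders:

```shell
#!/bin/sh
# Sketch: build an sftp batch file that uploads every XML file in a local
# directory to a vendor's remote directory. Paths and host are placeholders.

make_sftp_batch() {
  src_dir="$1"
  remote_dir="$2"
  batch="$3"
  {
    echo "cd $remote_dir"
    for f in "$src_dir"/*.xml; do
      [ -e "$f" ] && echo "put $f"    # skip when the glob matched nothing
    done
    echo "bye"
  } > "$batch"
}

# Then transfer non-interactively, e.g.:
# make_sftp_batch /data/out/xml /inbound/acme /tmp/acme.batch
# sftp -b /tmp/acme.batch vendor@external-host
```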

Confidential

Senior ETL Informatica Developer

Responsibilities:

 Understood the existing PL/SQL procedures and re-engineered the logic into Informatica requirements.
 Extracted source definitions from Oracle and flat files
 Developed mappings and workflows as per the new target databases
 Converting Oracle stored procedures into Type 1 & Type 2 mappings as per new business requirements.
 Loaded data from files to Netezza tables (stage area) using the NZLOAD utility, and to HDFS files using UNIX scripts.
 Understood Oracle GoldenGate data integration application errors and supported the Oracle DBA team in fixing the issues.
 Unit testing and System Testing
 Scheduling the ETL jobs using Informatica scheduler
 Monitoring the daily/weekly DW ETL workflows
 Fixing the issues occurring on daily ETL load

Environment: Informatica 9.0.1 PC, Oracle 11g, Netezza, WinSQL, Toad, Windows XP and UNIX
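The NZLOAD-based file loads mentioned in this role can be sketched as a small command composer; host, database, credentials, and file/table names below are placeholders, and the function only builds the command string:

```shell
#!/bin/sh
# Sketch: compose an nzload call for loading a pipe-delimited file into a
# Netezza staging table. NZ_HOST/NZ_DB/NZ_USER/NZ_PW are placeholder env vars.

nzload_cmd() {
  table="$1"
  datafile="$2"
  printf "nzload -host %s -db %s -u %s -pw %s -t %s -df %s -delim '|'\n" \
    "$NZ_HOST" "$NZ_DB" "$NZ_USER" "$NZ_PW" "$table" "$datafile"
}

# Example:
# NZ_HOST=nzhost NZ_DB=edw_stg NZ_USER=etl NZ_PW=secret
# nzload_cmd STG_CLAIMS /data/claims.dat | sh
```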

Confidential

ETL Informatica Developer

Responsibilities:

 Prepared the ETL high-level design document and published and reviewed it with various teams, such as modelling, testing, and the end users.
 Created mappings using Informatica PowerCenter Designer 8.6, as per the ETL specification document, and implemented a reject report for finding rejected records along with the reason for rejection in ETL.
 Developed Informatica mappings for TYPE 1, 2, 3 & hybrid Slowly Changing Dimensions with two levels of staging to cleanse data.
 Creating multi-group sources (Multi Group Application SQ) using Informatica Power Exchange for mainframe sources.
 Unit testing and System Testing
 Scheduling the Informatica jobs using the Autosys tool on a weekly/monthly basis
 Involved in system Integration testing support and responsible to drive
until SIT signoff
 Lead a team of three members for new enhancements and delivering the
tested code.
 Interacted with business owners in status meetings to understand their requirements for enhancing and fixing current issues in the existing system, and came up with recommendations on improvements to the current applications.

Environment: Informatica 8.6.1 PC & PX, Oracle, Netezza, Autosys, Toad, Squirrel, Windows XP, UNIX

Confidential

ETL Informatica Developer

Responsibilities:

 Transforming the Oracle PL/SQL procedure logic into ETL requirements
 Created mappings using Informatica PowerCenter Designer 8.6, as per the ETL specification document, and implemented a reject record report for finding rejected records along with the reason for rejecting those records in the ETL process.
 Developed Informatica mappings for TYPE 1 and TYPE 2 Slowly Changing Dimensions with two levels of staging to cleanse data
 Unit testing and System Testing
 Scheduling the Informatica jobs using the Autosys tool on a weekly/monthly basis
 Interacted with business owners in status meetings to understand their requirements for enhancing and fixing current issues in the existing system, and came up with recommendations on improvements to the current applications.

Environment: Informatica 8.6.1, Oracle (PL/SQL), Netezza, Autosys, Toad, Squirrel, Windows XP, UNIX

Confidential

ETL Informatica Developer

Responsibilities:

 Creating mappings using Informatica PowerCenter Designer 8.1, as per the ETL specification document, and creating the jobs for the mappings.
 Monitoring the daily/weekly/monthly/quarterly jobs and fixing issues
 Developed Informatica mappings for TYPE 1 and TYPE 2 Slowly Changing
Dimensions
 Unit testing and System Testing

Environment: Informatica 8.1, Oracle, Toad, Windows XP & UNIX
