Krishna - Informatica IICS Developer


Name: Hari Krishna

Job Title: Sr. Informatica IICS Developer


PROFESSIONAL SUMMARY

 Around 10 years of IT experience in the analysis, design, development, testing, and maintenance of ETL/ELT solutions across industry domains such as Banking, Insurance, Media & Entertainment, Financial Services, and Automotive.
 5+ years of strong experience in the areas of Data Integration, Data Warehousing, Application Integration, and Business Intelligence.
 Extensively worked with Informatica products: Informatica Intelligent Data Management Cloud (IDMC), also known as Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter (10.4), Data Quality, PowerExchange, and MDM.
 5+ years of experience using Informatica Cloud (IICS) services such as Cloud Application Integration, Cloud Data Integration, Cloud Data Quality, Mass Ingestion, API Manager, API Portal, Administrator, Monitor, Application Integration Console, Data Synchronization, and Data Replication.
 Developed database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL.
 Led the successful implementation of IDMC solutions, overseeing the integration of Intelligent Data
Management Cloud tools to streamline data workflows.
 Collaborated with cross-functional teams to ensure seamless deployment and integration of IDMC
technologies within the organization.
 Experience in using Snowflake as a cloud-based data warehousing platform to store, manage, and
analyze large volumes of data.
 Proficient in designing and implementing data models in Snowflake, including creating tables, views,
and schemas to optimize data organization.
 Knowledge of connecting IICS to popular cloud platforms like AWS, Azure, and Snowflake Cloud to
facilitate seamless data transfer.
 Proficient in leveraging Databricks for analytics purposes.
 Experience with Databricks clusters, notebooks, and job scheduling.
 Working knowledge of Apache Spark for processing big data within Databricks.
 Integrated Databricks with data lakes and data warehouses such as AWS S3, Azure Data Lake Storage, Snowflake, and Redshift.
 Demonstrated proficiency in Snowflake, including data modeling, schema design, and the
implementation of data warehousing best practices.
 Configured and optimized Snowflake warehouse environments for optimal performance and scalability.
 Used various sources and targets such as APIs, SQL Server, flat files, and Salesforce.
 Expertise in implementing Azure service offerings such as Azure Cloud Services, Azure Storage, IIS, Azure Active Directory (AD), Azure Resource Manager (ARM), Azure Blob Storage, Azure VMs, Azure SQL Database, Azure Functions, Azure Service Fabric, Azure Monitor, and Azure Service Bus.
 Good knowledge in Data warehousing concepts, Relational Database Management Systems and
Dimensional modeling (Star schema & Snowflake schema).
 Experience working with cloud-based database solutions including Azure Synapse, Azure Data Lake Store, AWS Redshift, and Snowflake.
 Provided training and knowledge transfer sessions to client teams and end-users to ensure smooth
transition to the new Snowflake environment.
 Demonstrated ability to integrate data from various sources, including structured and semi-structured
datasets, using cloud-based technologies and ETL methodologies.
 Proficient in creating and maintaining data models within Snowflake, optimizing data structures for
performance and query efficiency.
 Worked with Microsoft SQL Server, Snowflake Cloud, AWS S3, RDS, Redshift, DynamoDB, JSON, XML, XSD, Excel files, REST APIs, etc.
 Worked on connectors such as SFTP and FTP, overriding the existing functionality to suit the requirements using MuleSoft.
 Familiarity with other data integration and ETL tools such as Talend, AWS Glue, or Azure Data Factory,
and experience in evaluating and selecting the best tool for a given project based on technical and
business requirements.
 Experience in designing, optimizing, and maintaining multi-platform database environments, including
DB2, Oracle, SQL Server, Netezza, and Cassandra, with a focus on data integrity and high availability.
 Developed and maintained custom integrations between Workday and other systems, such as Jira, NetSuite, OpenAir, and SAP, using IICS connectors and APIs.
 Expertise in designing and developing ETL pipelines for big data processing, integration, and analysis.
 Experience with database systems like SQL Server, Oracle, MySQL, and MongoDB.
 Experience working with the Informatica IICS tool, using it effectively for data migration from multiple source systems into the Snowflake data warehouse.
 Demonstrated success in building RESTful API and SOAP (WSDL) integrations using IICS.
 Migrated an on-premises enterprise data warehouse to a cloud-based Snowflake data warehousing solution and enhanced the data architecture to use Snowflake as a single data platform for all analytical purposes.
 Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python.
 Built ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, writing SQL queries against Snowflake (see the sketch at the end of this summary).
 Worked in Azure System Engineering, network operations and data engineering.
 Experience in analysis, design, development, and implementation as a Data Engineer.
 Implemented Azure Databricks clusters, notebooks, jobs, and autoscaling.
 Experience in using GitLab, GitHub, and Bitbucket as version control platforms to manage repositories, track issues, and collaborate on projects.
 Experience designing and building web environments on AWS, including working with ELB, RDS, and S3.
 Good experience in implementing CDC (Change Data Capture)/delta extraction from source systems using various approaches.
 Worked on Joiner, Filter, Aggregator, Router, Sorter, Sequence Generator, Hierarchy Parser, and Expression transformations in Informatica Cloud (IICS) Data Integration.
 Good exposure to Informatica Cloud (IICS) Application Integration components such as service connectors and Assignment, Create, and Service steps.
 Good knowledge of processing API-sourced data in Informatica Cloud (IICS) Application Integration.
 Prepared technical design documents and runbooks after completion of development.
 Good knowledge of the ticketing tool Jira.
 Knowledge of Python scripting.
 Good working experience in Agile and Waterfall methodologies.
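
Illustrative sketch (not project code) for the Python/SnowSQL and CDC bullets above: a watermark-driven incremental extract from a relational source into a Snowflake staging table. It assumes the snowflake-connector-python and pyodbc packages; the connection details, control table (etl_watermark), and source/staging tables (dbo.orders, stage_orders) are hypothetical placeholders.

```python
# Hypothetical watermark-based incremental (CDC-style) load into Snowflake.
import pyodbc
import snowflake.connector

SRC_DSN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-host;DATABASE=sales;UID=etl;PWD=***"

def run_incremental_load():
    sf = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="EDW", schema="STAGE",
    )
    src = pyodbc.connect(SRC_DSN)
    try:
        sf_cur = sf.cursor()
        # 1. Read the last successful watermark from a control table.
        sf_cur.execute(
            "SELECT last_loaded_at FROM etl_watermark WHERE table_name = %s",
            ("ORDERS",),
        )
        last_loaded_at = sf_cur.fetchone()[0]

        # 2. Pull only rows changed since the watermark (delta extraction).
        src_cur = src.cursor()
        src_cur.execute(
            "SELECT order_id, customer_id, amount, updated_at "
            "FROM dbo.orders WHERE updated_at > ?",
            last_loaded_at,
        )
        rows = src_cur.fetchall()

        if rows:
            # 3. Load the delta into a Snowflake staging table.
            sf_cur.executemany(
                "INSERT INTO stage_orders (order_id, customer_id, amount, updated_at) "
                "VALUES (%s, %s, %s, %s)",
                [tuple(r) for r in rows],
            )
            # 4. Advance the watermark to the latest change timestamp just loaded.
            sf_cur.execute(
                "UPDATE etl_watermark SET last_loaded_at = %s WHERE table_name = %s",
                (max(r[3] for r in rows), "ORDERS"),
            )
        sf.commit()
    finally:
        src.close()
        sf.close()

if __name__ == "__main__":
    run_incremental_load()
```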

EDUCATION
 Master of Science in Computer Science from Northwestern Polytechnic University, California - 2016
 Bachelor of Technology in Computer Science and Engineering, JNTUK - 2013
TECHNICAL SKILLS

ETL Tools: Informatica Intelligent Cloud Services (IICS) Application Integration and Data Integration services, Informatica PowerCenter (10.4), Informatica Data Quality (IDQ), Informatica PowerExchange (PWX), SSIS (SQL Server Integration Services), DataStage, Oracle GoldenGate CDC, SQL Server Data Tools, Visual Studio

Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Entities, Attributes, Cardinality, ER Diagrams, ERP

Reporting Tools: OBIEE, MicroStrategy

Databases: SQL, Oracle, DB2, PL/SQL, Teradata, Greenplum, Snowflake, AWS Redshift

Development Process Models: Waterfall, Agile, Spiral, Prototype Model

Scheduling Tools: Autosys, CA ESP Workstation

Languages: C, C++, Java, PHP, Android, UNIX shell scripts, Python

Office Package: MS Office 2013/2010/2007/XP/2003/2000

Web Technologies: Servlets, HTML, XML, CSS, JavaScript

Scripting: UNIX, MS-DOS, Linux, Windows

PROFESSIONAL EXPERIENCE

Client: JAZZPHARMA (Redwood City, CA, USA) August 2022 - Present


Role: Sr. Informatica Cloud Developer
Project Environment: IICS (Cloud Application Integration, Cloud Data Integration), Informatica PowerCenter, Microsoft SQL Server, AWS, APIs
Responsibilities:
 Developed batch and real-time ETL integration data pipelines using IICS CAI and CDI services and Informatica PowerCenter (10.4).
 Built bidirectional integrations using IICS CAI for real-time data integration/exchange with Salesforce Platform Events.
 Built IICS components such as mappings, mapplets, mapping tasks, taskflows, business services, data replication and data synchronization tasks, file listeners, and hierarchical schemas.
 Created IICS CAI service connectors, custom connections, application connections, process objects and
processes.
 Utilized Informatica Cloud's integration Platform as a Service (iPaaS) and Software as a Service (SaaS) delivery models.
 Integrated various data sources and destinations using IICS, including cloud-based and on-premises
systems like Snowflake.
 Conducted thorough data analysis and mapping exercises to understand source system data structures
and define the target schema in Snowflake.
 Actively participated in system upgrades, enhancements, and migration projects, providing expertise in
IICS integration design and deployment for Workday and its connected systems.
 Extensively used Swagger files, business services, Web Service, and REST V2 connectors to integrate with third-party applications.
 Strong command of SQL for querying and manipulating data within Snowflake, including complex joins, subqueries, and data transformations.
 Experience in optimizing Snowflake queries and workloads to ensure efficient and cost-effective data
processing.
 Designed and developed complex ETL solutions using Informatica Intelligent Cloud Services (IICS) for
various data sources such as relational databases, flat files, and web services.
 Created and maintained mappings, workflows, and tasks using IICS Designer tool to extract, transform,
and load data into target systems such as data warehouses and operational data stores.
 Implemented advanced ETL concepts such as data quality, data validation, error handling, and
performance tuning to ensure accurate and efficient data integration processes.
 Worked with business stakeholders and technical teams to gather requirements, define data mappings,
and provide data integration solutions that meet business needs and data governance standards.
 Participated in code reviews, unit testing, and integration testing to ensure high-quality code and
smooth deployment of ETL jobs in production environments.
 Created IICS data replication tasks to replicate Salesforce objects into SQL Server tables.
 Extensively implemented IICS mass ingestion services such as application ingestion, database ingestion, file ingestion, and streaming ingestion tasks.
 Designed and implemented CI/CD pipelines with Bamboo to automate build, test, and deployment processes for various applications.
 Designed integrations with full loads, incremental loads, and log-based Change Data Capture (CDC) techniques.
 Designed and developed end-to-end data pipelines using Databricks and Informatica IICS.
 Ensured data quality and reliability in data pipelines, including error handling and monitoring.
 Capable of automating data engineering workflows and scheduling tasks to ensure timely and reliable
data processing.
 Developed integrations against third-party applications, mainly RESTful API integrations using IICS (see the illustrative sketch at the end of this section).
 Designed and implemented complex data mappings and transformations within Informatica IICS to
meet business requirements, such as data cleansing, enrichment, and aggregation before loading into
Redshift.
 Optimized ETL workflows in Informatica IICS to maximize performance and efficiency, minimizing data
transfer and transformation times, and reducing Redshift query loads.
 Implemented incremental data loading strategies to efficiently update Redshift with changed data, reducing processing times and resource usage.
 Handled event driven S3 files processing using IICS CAI and integrated with CDI mapping task and task
flows.
 Constructed IICS mappings, mapping tasks, and processes to extract data from various sources such as Oracle and SQL Server and load it into the Amazon Redshift data warehouse.
 Extracted data from various on-premises systems and pushed the data to AWS Redshift.
 Implemented IICS processes with event-driven Salesforce Platform Events and AWS S3 monitoring.
 Developed and enforced data quality and security policies, contributing to improved data integrity and
regulatory adherence.
 Ability to collaborate with cross-functional teams, including data scientists, data analysts, and business
stakeholders, to understand requirements and deliver data-driven solutions.
 Strong analytical and problem-solving skills, with attention to detail and accuracy.
 Excellent communication and documentation skills, with the ability to explain complex technical
concepts to non-technical audiences.
 Experienced in using Informatica Cloud for data integration and ETL processes, including designing and
implementing complex data workflows, mapping data between systems, and transforming data to meet
business requirements.
 Created PowerShell scripts for automating repetitive tasks such as FTP transfers, archival, and file validation steps.
 Integrated IICS CDI jobs (mapping tasks, taskflows) into IICS CAI processes, and vice versa.
 Responsible for unit and system testing.
 Performed performance tuning in IICS as well as query tuning.
 Provided assistance as needed to the Operations team.
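
Illustrative sketch for the RESTful API integration bullet above: the general pattern of paged REST extraction with basic error handling, shown in Python for clarity (in the project this was implemented with IICS rather than hand-written code). The endpoint, token, and field names are hypothetical placeholders.

```python
# Hypothetical paged REST API extraction with basic error handling.
import requests

BASE_URL = "https://api.example.com/v1/invoices"   # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

def fetch_all_invoices(page_size: int = 100) -> list:
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=HEADERS,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()   # surface HTTP errors to the caller/scheduler
        batch = resp.json().get("data", [])
        if not batch:
            break                 # no more pages to read
        records.extend(batch)
        page += 1
    return records

if __name__ == "__main__":
    rows = fetch_all_invoices()
    print(f"Fetched {len(rows)} invoice records for staging")
```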

Client: NYC DOITT (New York, NY, USA) Jan 2022 - August 2022
Role: Sr. ETL Informatica Cloud Developer
Project Environment: IICS (Cloud Application Integration, Cloud Data Integration), Informatica PowerCenter, Microsoft SQL Server, Salesforce, ServiceNow, SAP systems, ERP, Python, GCP, AWS, CI/CD, JavaScript, Redshift
Project Description:
IBM is helping the NYC DOITT (Department of Information Technology & Telecommunications) Data Share project convert and transform its legacy technical infrastructure, which resided on MQ/iWay, to the latest Informatica Cloud, Salesforce Cloud, and Azure.
Responsibilities:
 Worked on API source data.
 Used different sources such as APIs, JSON files, SQL Server tables, Snowflake, Workday, and SAP in Informatica Cloud (IICS) integrations.
 Extracted data from Snowflake and pushed the data into an Azure warehouse instance to support reporting requirements.
 Worked on ETL jobs to migrate data from on-premises systems to Snowflake in the cloud by generating JSON and CSV files to support Catalog API integration.
 Worked on Informatica Cloud Application Integration steps such as Assignment, Create, Jump, Parallel Paths, and Data Decision steps in ICRT.
 Created service connectors in Informatica Cloud (IICS) Application Integration (ICRT) to get data from APIs.
 Worked with connectors in Informatica Cloud ICS and ICRT.
 Created Informatica Cloud Real Time processes to run Informatica Cloud jobs per business needs.
 Familiarity with integrating Snowflake into various data pipelines and ETL processes for seamless data
movement.
 Knowledge of Snowflake's security features and best practices for ensuring data privacy and
compliance.
 Implemented efficient data storage and retrieval strategies using IDMC, resulting in improved data
accessibility and reduced latency.
 Optimized storage costs by implementing intelligent data tiering and lifecycle management within the
IDMC environment.
 Collaborated with cloud service providers to optimize IDMC deployment for performance and cost-
effectiveness
 Utilized IDMC automation features to create scalable and repeatable processes for data ingestion,
transformation, and distribution.
 Conducted performance optimizations based on monitoring data, resulting in enhanced overall system
efficiency and responsiveness.
 Collaborated with business stakeholders to understand their data management requirements and
translated them into effective IDMC solutions.
 Experience in managing and administering Snowflake accounts, users, and resources to maintain
system health and performance.
 Implemented logic to handle thousands of records in Informatica Cloud ICRT processes.
 Implemented archiving logic after processing the files.
 Developed ETL batch and real-time integration data pipelines using IICS.
 Worked on Data Synchronization, Data Replication, and mapping configuration tasks in Informatica Cloud (IICS) Data Integration.
 Experience in designing, developing, and maintaining ETL processes using Python.
 Proficiency in Python programming and knowledge of data structures, algorithms, and object-oriented
programming concepts.
 Successfully integrated SAP systems with Informatica Intelligent Cloud Services (IICS) to streamline data
and process workflows, ensuring efficient data management and automation.
 Designed and implemented data transformation solutions within IICS to facilitate data quality,
enrichment, and compliance for SAP systems, improving overall data integrity.
 Demonstrated expertise in using SnowSQL to interact with the Snowflake data warehousing platform.
 Executed complex SQL queries using SnowSQL to extract, transform, and analyze data from Snowflake databases.
 Designed and customized CI/CD pipelines to accommodate specific project requirements, integrating
unit tests, code quality checks, and deployment steps.
 Implemented parallel and sequential stages in pipelines to optimize build and deployment times.
 Involved in business meetings to understand business requirements and designed flows based on those requirements.
 Worked on IICS Application Integration components such as Processes, Service Connectors, and Process Objects.
 Used IICS as the integration tool to integrate NYC DOITT data across multiple sources and multiple consumers.
 Experience working with IICS concepts relating to data integration, Monitor, Administrator,
deployments, schedules.
 Integrated Jira, the project management tool, with Workday to automate employee onboarding, offboarding, and project allocation processes, reducing manual effort and improving data accuracy.
 Collaborated with other developers and architects to design and implement scalable and resilient ETL
architectures using IICS Cloud Integration Hub, Cloud Data Integration, and Cloud Application
Integration.
 Provided technical guidance and mentorship to junior ETL developers, and contributed to the
development of best practices and standards for ETL development using IICS.
 Stayed up-to-date with the latest industry trends and emerging technologies related to ETL, cloud
computing, data integration, and big data, and applied them to improve ETL processes and solutions.
 Helped write scripts in Python for extracting data from JSON and XML files (see the sketch at the end of this section).
 Involved in all phases of Informatica Cloud Services and Informatica Cloud Real Time service integration activities with Oracle, REST APIs, and SOAP web services.
 Worked with client technical team and other professional consultants for resolving data discrepancy,
completing requirements gathering, and performing system testing activities.
 Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
 Worked closely and effectively in a team environment, this includes members from other functions,
vendors, and external partners.
 Designed and implemented a scalable and cost-effective data lake on Azure Data Lake Storage (ADLS),
enabling data scientists to access and analyze vast amounts of structured and unstructured data
efficiently.
 Proficient in designing Oracle database schemas and data models to meet business requirements.
 Attended weekly and daily status calls.
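
Illustrative sketch for the Python scripting bullets above: extracting records from JSON and XML files using only the Python standard library. The file paths and element/field names are hypothetical placeholders.

```python
# Hypothetical JSON/XML extraction helpers prior to database load.
import json
import xml.etree.ElementTree as ET

def load_json_records(path: str) -> list:
    """Assumes the file holds {"records": [{"id": ..., "name": ..., "amount": ...}, ...]}."""
    with open(path, encoding="utf-8") as fh:
        payload = json.load(fh)
    return payload.get("records", [])

def load_xml_records(path: str) -> list:
    """Assumes <records><record><id/><name/><amount/></record>...</records>."""
    root = ET.parse(path).getroot()
    rows = []
    for rec in root.findall("record"):
        rows.append({
            "id": rec.findtext("id"),
            "name": rec.findtext("name"),
            "amount": float(rec.findtext("amount", default="0")),
        })
    return rows

if __name__ == "__main__":
    all_rows = load_json_records("extract.json") + load_xml_records("extract.xml")
    print(f"Parsed {len(all_rows)} records ready for database load")
```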

Client: Wells Fargo (Charlotte, NC, USA) March 2020 - Dec 2021
Role: Sr. Informatica/ETL Developer
Project Description:
Wells Fargo & Company is an American multinational financial services company. The goal of the RDR team is to normalize instrument-level detail pertaining to the balance sheet and ensure the instrument detail ties back to the corporate General Ledger. Systems where the accounting on the SOR is not correct, or where the SOR is unable to generate the necessary accounting, are migrated to process through RDR.

Project Environment: Informatica Cloud, Salesforce, ServiceNow, Microsoft SQL Server 2008, Microsoft Visual Studio, Python, AWS, Redshift, CI/CD

Responsibilities:
 Migrated Informatica PowerCenter mappings to Informatica Cloud mappings.
 Used Informatica PowerCenter for ETL (extraction, transformation, and loading) of data from heterogeneous source systems into the target database.
 Worked with Informatica PowerCenter tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and transformations.
 Implemented Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, and Update Strategy transformations in mappings.
 Involved in Preparing Unit Test Cases.
 Used various add-on connectors in Informatica Cloud such as XML, XML Target, and Excel.
 Developed integrations such as flat file to ServiceNow and XML to JSON.
 Ran Informatica Cloud Data Integration tasks from Application Integration processes using the Service step.
 Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
 Communicated with business customers to discuss the issues and requirements.
 Extensive knowledge in designing and developing complex Informatica mappings using Java, Connected and Unconnected Lookup, Update Strategy, Router, Expression, SQL, and Source Qualifier transformations.
 Worked with the Control-M tool to automate batch process scheduling.
 Developed ETL programs using Informatica to implement business requirements. Wrote Python scripts to parse XML documents as well as JSON-based REST web services and load the data into the database.
 Led and managed end-to-end migration projects from legacy data warehousing systems to Snowflake,
in collaboration with Informatica ETL tools.
 Oversaw project planning, scoping, resource allocation, and timeline management to ensure successful
migration within defined deadlines.
 Wrote ORM code for generating complex SQL queries and built reusable code and libraries in Python for future use (see the sketch at the end of this section).
 Implemented NetSuite integration with Workday to synchronize financial and accounting data,
including employee expenses, purchase orders, and general ledger entries, ensuring data consistency
and eliminating data silos.
 Collaborated with business stakeholders and data owners to define data quality rules and standards,
resulting in a significant reduction of data anomalies within the first six months of deployment.
 Worked closely with software developers to debug software and system problems.
 Profiled Python code for optimization and memory management and implemented multithreading functionality.
 Experience in Dimensional Modeling in creating various data marts like Star and Snowflake Schema,
Identifying Facts and Dimensions (SCD Type I, SCD Type II), Physical and logical data modeling
 Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
 Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional
and performance Testing.
 Implemented software across all environments using API and ETL tools (Informatica PowerCenter 10.2 HF2).
 Modified mappings according to the business requirements for incremental fixes using Transformation Developer and Mapping Designer.
 Used relational SQL wherever possible to minimize data transfer over the network.
 Extracted data from various heterogeneous sources such as DB2, Salesforce, mainframes, Teradata, and flat files using Informatica PowerCenter and loaded the data into the target database.
 Developed and deployed ETL job workflows with reliable error/exception handling and rollback within the MuleSoft framework.
 Worked on the MuleSoft Anypoint API Platform, designing and implementing Mule APIs.
 Used Informatica PowerCenter to load data from different data sources such as XML, flat files, Oracle, Teradata, and Salesforce.
 Worked on pre-session and post-session UNIX scripts for automation of ETL jobs using Control-M schedulers and was involved in migration and deployment of ETL code.
 Developed code using Perl scripts and UNIX shell scripts that support the main functionalities of the application.
 Maintained comprehensive documentation of database structures, PL/SQL code, and procedures.
 Produced user guides and technical documentation for database-related processes.
 Involved in integration and deployment CI/CD.
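
Illustrative sketch for the ORM bullet above: building reusable, composable SQL from Python. SQLAlchemy Core is used here as an assumed library (the actual project library is not named above), and the connection string, table, and column names are hypothetical placeholders.

```python
# Hypothetical reusable query builder using SQLAlchemy Core.
from sqlalchemy import MetaData, Table, create_engine, func, select

engine = create_engine("postgresql://etl:***@dbhost/edw")   # hypothetical DSN
metadata = MetaData()
orders = Table("orders", metadata, autoload_with=engine)    # reflect an existing table

def customer_totals(min_amount: float = 0.0):
    """Reusable query: total order amount per customer above a threshold."""
    return (
        select(orders.c.customer_id, func.sum(orders.c.amount).label("total_amount"))
        .where(orders.c.amount >= min_amount)
        .group_by(orders.c.customer_id)
        .order_by(func.sum(orders.c.amount).desc())
    )

if __name__ == "__main__":
    with engine.connect() as conn:
        for customer_id, total in conn.execute(customer_totals(min_amount=100.0)):
            print(customer_id, total)
```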

Client: Polycom (San Jose, California, USA) February 2018 to February 2020
Role: Sr. Informatica/ETL Developer
Project Description: Polycom is a multinational corporation that develops video, voice, and content collaboration and communication technology. I worked on a project called ZYME, which included designing and implementing a data repository for actual, schedule, and time code data from the five divisions (Northeast, West and Central, APAC, EMEA) and processing the data to provide a unified view to PeopleSoft.
Project Environment: (Informatica Cloud, UNIX, Oracle, Tivoli, PowerBi, Tableau, Oracle 11g/10g, JSON,
Greenplum, IDQ, PostgreSQL).

Responsibilities:
 Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
 Communicated with business customers to discuss the issues and requirements.
 Extensive knowledge in designing and developing complex Informatica mappings using Java, Connected and Unconnected Lookup, Update Strategy, Router, Expression, SQL, and Source Qualifier transformations.
 Used various add-on connectors in Informatica Cloud such as XML, XML Target, and Excel.
 Developed integrations such as flat file to ServiceNow and XML to JSON.
 Ran Informatica Cloud Data Integration tasks from Application Integration processes using the Service step.
 Good exposure to connectors for connecting third-party applications using service connectors with the help of APIs.
 Worked extensively on different types of transformations like Joiner, Aggregator, Router, Expression
and Normalizer, connected / unconnected Lookups, Data Masking and Update strategy
transformations.
 Used parameter files with parameters and variables to store multiple DB connections for the sources and targets.
 Involved in providing 24/7 production support to solve critical issues.
 Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session
Level.
 Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager
 Worked on troubleshooting in production support and was responsible for fixing failed tests on the production server.
 Developed ETL processes for data transformation and integration between Oracle databases and other
systems.
 Utilized technologies like Oracle Data Pump and SQL*Loader for data movement.
 Shared the work with offshore team members.
 Created scripts in PostgreSQL to retrieve data and load it into Greenplum (see the sketch at the end of this section).
 Developed several complex IDQ mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer using Informatica PowerCenter.
 Experience in ActiveVOS workflow design and creation of Human Tasks; extensive experience in MDM 10.x with IDQ and ActiveVOS 9.2.
 Imported the IDQ address standardization mappings into Informatica Designer as mapplets.
 Utilized Informatica IDQ to complete initial data profiling and matching/removal of duplicate data.
 Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for creating rules.
 Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
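
Illustrative sketch for the Greenplum loading bullet above: bulk-loading a CSV extract into a Greenplum/PostgreSQL staging table. It assumes the psycopg2 package (Greenplum speaks the PostgreSQL protocol); the connection details, schema, table, and file names are hypothetical placeholders.

```python
# Hypothetical truncate-and-load of a CSV extract into a Greenplum staging table.
import psycopg2

def load_csv_to_greenplum(csv_path: str) -> None:
    conn = psycopg2.connect(
        host="gp-master", dbname="edw", user="etl_user", password="***", port=5432
    )
    try:
        with conn, conn.cursor() as cur:
            # Clear the staging table, then stream the file in bulk with COPY.
            cur.execute("TRUNCATE TABLE stage.orders_extract")
            with open(csv_path, encoding="utf-8") as fh:
                cur.copy_expert(
                    "COPY stage.orders_extract FROM STDIN WITH (FORMAT csv, HEADER true)",
                    fh,
                )
    finally:
        conn.close()

if __name__ == "__main__":
    load_csv_to_greenplum("orders_extract.csv")
```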
Client: BBVA Compass Bank (Birmingham, AL, USA) January 2017 to January 2018
Role: ETL / Data Warehouse Developer
Project Environment: Informatica PowerCenter 9.0.1, Erwin, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, Toad, and UNIX shell scripts.

Responsibilities:
 Worked with Business Analysts (BAs) to analyze data quality issues and find the root cause of each problem, along with the proper solution to fix the issue.
 Understood the business requirements based on functional specifications to design the ETL methodology in technical specifications.
 Involved in development of logical and physical data models that capture the existing state.
 Used Erwin's Model Mart for effective model management, sharing, dividing, and reusing model information and designs for productivity improvement.
 Extracted the data from various heterogeneous sources like DB2, SQL Server and Flat files to the target
database.
 Documented the process for resolving issues, involving analysis, design, construction, and testing for data quality issues.
 Maintained warehouse metadata, naming standards and warehouse standards for future application
development.
 Acted as a technology specialist to propose best practices for enterprise data integration (ETL, data replication, data services) during project execution, software implementation, and upgrades; played a crucial role in defining the data replication services for Facets SQL Server tables.
 Dealt with data issues in the staging flat files; after the data was cleaned up, it was sent to the targets.
 Prepared UNIX shell scripts, which were scheduled in Autosys for automatic execution at specific times.
 Extracted data from flat files and an Oracle database and applied business logic to load them into the central Oracle database.
 Involved in making data model changes and other changes to the transformation logic in existing mappings according to the business requirements for incremental fixes.
 Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation
Developer, and Mapping Designer.
 Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for creating rules.
 Created Informatica workflows and IDQ mappings for batch and real time.
 Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules
for maintaining Data Quality
 Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
 Performed SQL and PL/SQL tuning and application tuning using tools such as SQL Trace, TKPROF, and AUTOTRACE.
 Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner,
Router, Filter, and Union in developing the mappings to migrate the data from source to target.
 Used Connected and Unconnected Lookup transformations and lookup caches for looking up data from relational tables and flat files. Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
Client: Smartchip (Hyderabad, India) May 2013 to September 2015
Role: ETL Developer/Analyst
Project Environment: Informatica PowerCenter 8.1, Erwin, Oracle 9i, UNIX, Sybase, MS SQL Server, Windows 2000.

Responsibilities:
 Involved in the requirement definition and analysis support for Data warehouse efforts.
 Documented and translated user requirements into system solutions; developed implementation plan
and schedule.
 Designed fact and dimension tables for Star Schema to develop the Data warehouse.
 Involved in the development of data mart and populating the data marts using Informatica.
 Performance tuning of the ETL by utilizing dynamic cache for lookups and partitioning the sessions.
 Created dimensions and facts in physical data model using ERWIN tool.
 Used Informatica Designer to create complex mappings using different transformations to move data to
a Data Warehouse
 Developed mappings in Informatica to load the data from various sources into the Data Warehouse,
using different transformations like Source Qualifier, Look up, Aggregator, Stored Procedure, Update
Strategy, Joiner, Filter.
 Scheduled the sessions to extract, transform, and load data into the warehouse database based on business requirements.
 Loaded flat file data into the data warehouse using Informatica.
 Created the global repository, groups, and users, and assigned privileges using Repository Manager.
 Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Server Manager.
 Handled common data warehousing problems like tracking dimension changes using SCD Type 2 mappings.
 Used the Email task for on-success and on-failure notifications.
 Used the Decision task for running different tasks in the same workflow.
 Assisted team members with their various Informatica needs.
 Developed and maintained technical documentation regarding the extract, transformation, and load
process.
 Responsible for the development of system test plans, test case creation, monitoring progress of
specific testing activities against plan, and successfully completing testing activities within the requisite
project timeframes.
