Cloud RanganathJasti
Email: [email protected]
Expert in Extract, Transform, and Load (ETL) data flows using SSIS; created
workflows to extract data from SQL Server and flat-file sources and load it into various
business entities.
Extensive experience in gathering user requirements and translating them into
technical and system specifications, and in managing client, vendor, and partner relationships.
Extensive experience with business information systems, focusing on database
architecture, data modeling, data analysis, programming, and application integration.
Experience in creating tables, views, stored procedures, and indexes; troubleshooting
database issues; and performance tuning of stored procedures and databases.
Automated data loading from Snowflake's internal stage into Snowflake tables by
creating Python client applications that call the Snowpipe REST API endpoints with the
names of the Snowpipes.
Created AutoSys jobs to run the Python applications responsible for loading the data
automatically on a scheduled basis.
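A minimal sketch of what such a client might do, using only the standard library: build the Snowpipe `insertFiles` REST endpoint URL for a named pipe and assemble the POST request that asks the pipe to ingest staged files. The account, pipe, and file names are placeholders, and key-pair JWT generation is out of scope here (the token is passed in).

```python
# Hedged sketch of a Snowpipe REST client; account/pipe/file names are
# illustrative placeholders, not values from any real deployment.
import json
import urllib.request

def insert_files_url(account: str, pipe_name: str) -> str:
    """Snowpipe REST endpoint that triggers ingestion for the named pipe."""
    return (f"https://{account}.snowflakecomputing.com"
            f"/v1/data/pipes/{pipe_name}/insertFiles")

def build_request(account: str, pipe_name: str,
                  staged_files: list, jwt: str) -> urllib.request.Request:
    """Build the POST request; auth uses a key-pair JWT bearer token."""
    body = json.dumps({"files": [{"path": p} for p in staged_files]}).encode()
    return urllib.request.Request(
        insert_files_url(account, pipe_name),
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {jwt}"},
        method="POST",
    )
```

An AutoSys job would then simply invoke a script that builds and sends this request on schedule.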
Good experience migrating data solutions from other databases to Snowflake, GCP
BigQuery, and AWS Redshift using various AWS services.
Experience in developing data engineering routines using the Snowpark API with Python.
Experience with Snowflake multi-cluster warehouses and in-depth knowledge of
Snowflake database, schema, and table structures.
Expertise in Snowflake data modelling, ELT using Snowflake SQL, and implementing
complex stored procedures and standard DWH and ETL concepts using the Snowpark API
with Python.
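A common ELT building block in Snowflake is an upsert into a target table via `MERGE`. As a hedged illustration of the pattern (table and column names are hypothetical; in a Snowpark routine the generated statement would be executed through the session), a small SQL builder might look like:

```python
# Illustrative MERGE-statement builder for a Snowflake ELT upsert.
# Target/source/column names are examples, not from a real schema.
def build_merge_sql(target: str, source: str, key: str, cols: list) -> str:
    """Render a MERGE that updates matched rows and inserts new ones."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    insert_cols = ", ".join([key] + cols)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + cols)
    return (f"MERGE INTO {target} t USING {source} s ON t.{key} = s.{key} "
            f"WHEN MATCHED THEN UPDATE SET {set_clause} "
            f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) "
            f"VALUES ({insert_vals})")
```

Keeping the statement generation in one place makes the same upsert logic reusable across staging-to-warehouse loads.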
Experience in building and architecting multiple data pipelines and end-to-end ETL and ELT
processes for data ingestion and transformation using GCP Cloud Storage, BigQuery, Dataflow,
Composer, and Dataproc.
Created tables, views, complex stored procedures, and functions in BigQuery for
consumption by client applications responsible for BI reporting and analytics.
Leveraged Python and Apache Beam, executed on Cloud Dataflow, to run data
validation between raw source files and BigQuery tables.
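In a Beam pipeline this kind of reconciliation typically joins the two sides by key; stripped of the Beam plumbing, the core check (row counts match and no source keys are missing from the target) can be sketched in plain Python. Field positions and the report shape are illustrative assumptions:

```python
# Hedged sketch of source-vs-target reconciliation logic; in practice this
# would run inside a Dataflow pipeline over the raw files and BigQuery rows.
from collections import Counter

def validate(source_rows, target_rows, key_idx=0):
    """Compare row counts and the multiset of keys between two datasets."""
    src_keys = Counter(r[key_idx] for r in source_rows)
    tgt_keys = Counter(r[key_idx] for r in target_rows)
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted((src_keys - tgt_keys).elements()),
    }
    report["match"] = (report["source_count"] == report["target_count"]
                      and not report["missing_in_target"])
    return report
```

The resulting report can be logged or written back to a validation table for auditing.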
Used S3 storage for summarized business data and leveraged Athena for SQL queries
and analysis.
Developed data ingestion modules (both real-time and batch data loads) to load data into
various layers in S3 and Redshift using AWS Glue, AWS Lambda, and AWS Step Functions.
Created S3 buckets for data storage in the AWS cloud and managed bucket policies and
lifecycle rules per the organization's guidelines.
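As a hedged example of such a lifecycle rule, the dictionary below is in the shape accepted by boto3's `put_bucket_lifecycle_configuration`; the prefix, transition, and expiration values are illustrative, not the organization's actual policy:

```python
# Illustrative S3 lifecycle configuration; rule ID, prefix, and day counts
# are example values only.
def lifecycle_rules(transition_days: int = 90, expire_days: int = 365) -> dict:
    """Archive summarized data to Glacier, then expire it after a year."""
    return {"Rules": [{
        "ID": "archive-then-expire",
        "Status": "Enabled",
        "Filter": {"Prefix": "summary/"},
        "Transitions": [{"Days": transition_days,
                         "StorageClass": "GLACIER"}],
        "Expiration": {"Days": expire_days},
    }]}
```

Centralizing the rule as code makes it easy to apply the same policy consistently across buckets.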
Created PySpark Glue jobs to implement data transformation logic in AWS and stored the
output in a Redshift cluster.
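The row-level transform at the heart of such a Glue job (the kind of function one might apply via `DynamicFrame.map` before the Redshift write) can be sketched as a pure function; the column names and cleanup rules here are hypothetical:

```python
# Hedged sketch of a per-record transform a Glue job might apply before
# loading Redshift; column names and rules are illustrative.
def transform(record: dict) -> dict:
    """Normalize a raw record: trim/uppercase the name, coerce the amount."""
    out = dict(record)
    out["customer_name"] = (record.get("customer_name") or "").strip().upper()
    out["amount"] = round(float(record.get("amount") or 0), 2)
    return out
```

Keeping the transform pure makes it unit-testable outside the Glue runtime.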
Experience in creating data governance policies, data dictionaries, reference data,
metadata, data lineage, and data quality rules.
Educational Qualification:
Bachelor of Science (Computer Science), 2003, Andhra University.
One-year Post Diploma in Software Engineering, 2004, NTTF IT Centre, Chennai.
Software Profile:
Created PySpark Glue jobs to implement data transformation logic in AWS and stored the
output in a Redshift cluster.
Used S3 storage for summarized business data and leveraged Athena for SQL queries
and analysis.
Validated the data from SQL Server to Snowflake to ensure an apples-to-apples
match.
Environment: AWS S3, Lambda, Glue, EMR, Athena, Redshift, PySpark, SQL Server 2016, SSIS,
Snowflake, Python, Snowpark API, REST API endpoints, AutoSys scheduler,
GCP, BigQuery, Cloud Dataflow, Dataproc, Cloud SQL, Pub/Sub, Composer.
GCCP (Global Client Communication Platform) is a client reporting application used to generate
various kinds of reports, such as client books and factsheets, for Invesco's global institutional
clients located across APAC, EMEA, and the USA. The system allows the Reporting, Marketing,
and Compliance teams to review and approve documents and finally publish them to external
clients and the Invesco website.
Environment: SQL Server 2012, SSIS, SSAS, XML, TFS, MS .NET Web Services, DeskNet
Content Welder 8.2, Power BI, MDX, DAX.
VUE Compensation Management is a powerful, flexible, and intuitive tool that makes it easy for
insurance organizations to organize and streamline complex commission and incentive
programs. The system creates the reports and information necessary to manage the business
and build strategic business knowledge. Agent Management, Agent License, Agent
Appointment, Agent Contract, Carrier, Product, Insured, Plan, Financial, Reports, Tracking and
Controlling, Admin, and Document Management are the components present in VUE.
Environment: SQL Server 2005/2008, SSIS, SSRS, flat files, XML, CSV, VBScript.
The Broker Unit Database is used to record data related to the brokers who bring business to the
company in the form of customers, who are referred to as Introductions. The main purpose and
scope of the Broker Unit Database system is to record the details of brokers, professionals, and
Introductions, and to generate different types of reports as required by the business.
Environment: ASP.Net 2.0, ADO.Net 2.0, SQL Server 2005, IIS 5.0, C#.Net, JavaScript.
Major Responsibilities:
Involved in the ETL process and developed medium-to-complex SSIS packages.
Migrated data from flat files and Excel spreadsheets to SQL Server databases using SQL
Server Integration Services (SSIS).
Made extensive use of stored procedures: wrote new stored procedures, T-SQL, and UDFs,
and modified and tuned existing ones so that they perform well.
Created package configurations, deployed Integration Services projects, and scheduled SSIS
packages via jobs.