
Ashish Pingili

Data Analyst
Email: pingiliashish@gmail.com, Phone: +1 (551) 775-1106

Dedicated and results-driven Data Analyst with 7 years of experience analyzing and interpreting complex datasets to drive business decisions. Proficient in data visualization, statistical analysis, and data mining techniques, with advanced proficiency in data manipulation languages such as SQL, Python, R, and Java, and in data visualization tools including Tableau and Power BI. An excellent communicator and collaborator, committed to continuous professional development in the ever-evolving field of data analysis.

Professional Summary:

● Proficient in Python, R, SQL, and Java with hands-on experience in end-to-end data projects.

● Excellent knowledge of various machine learning techniques.

● Strong understanding of DBMS and cloud platforms such as Azure.

● Experienced in blockchain technologies ensuring data integrity and security.

● Expertise in data analysis, data processing, and data modeling.

● Conducted data collection, cleaning, and transformation using SQL, Python, and Pandas to prepare structured datasets for analysis.

● Performed advanced data analysis and generated actionable insights through statistical techniques such as regression analysis, clustering, and hypothesis testing.

● Developed and maintained complex SQL queries and stored procedures to extract and manipulate data from large relational databases.

● Developed ad hoc reports using Power Pivot and Power Query to anticipate significant efforts and evaluate business needs.

● Expertise in writing complex DAX queries and expressions in Power BI and Power Pivot.

● Worked with Power View in Excel to filter and sort data, and with Power Maps for reporting.

● Contributed to the creation of machine learning models for predicting user churn, leading to a 10% reduction in customer attrition.

● Created interactive data visualizations and dashboards in Tableau, providing stakeholders with a clear and intuitive representation of key performance indicators.

● Created Power Pivot Reports in Excel and uploaded them to SharePoint to deliver ad-hoc reports.

● Made use of Power View to create charts and other visualizations.

● Collaborated with cross-functional teams to define key metrics, track performance, and provide data-driven recommendations for process improvements.

● Utilized machine learning models in Python (e.g., scikit-learn) to build predictive models for customer behavior, resulting in a 15% increase in customer retention.

● Assisted in the design and implementation of a real-time dashboard using Power BI to monitor website traffic and user engagement metrics.

● Conducted A/B testing on website features and marketing campaigns to optimize user experience and increase conversion rates.

● Implemented data quality checks and monitoring processes to ensure the accuracy and reliability of data, reducing errors by 20%.

● Streamlined data reporting by automating daily, weekly, and monthly reports using Python and scheduling tools like Apache Airflow.

● Participated in data governance initiatives to maintain data integrity, compliance, and security in alignment with industry standards and regulations.

● Collaborated with the data engineering team to optimize ETL processes and improve data extraction and loading times using Apache Spark.
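The hypothesis and A/B testing mentioned above amount, in the simplest case, to a two-proportion z-test on conversion rates. A minimal pure-Python sketch follows; the sample counts are illustrative, not from any real campaign:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal CDF
    return z, p_value

# Illustrative A/B numbers: 120/2400 (5.0%) vs 156/2400 (6.5%) conversions.
z, p = two_proportion_z_test(120, 2400, 156, 2400)
```

With these numbers the test rejects the null at the 5% level, which is the kind of signal used to decide whether a website variant actually moved conversion.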

Technical Skills:
Programming Languages: Python, R, Java, SAS, SQL, and C.
Development Environments: IntelliJ, Visual Studio, Eclipse, Jupyter Notebook, and Android Studio.
Data Analysis Tools: Python libraries (Pandas, NumPy, Scikit-learn, TensorFlow), Excel.
Machine Learning: Linear Regression, Random Forest, SVM, Neural Networks, and building predictive models.
Databases & Tools: SQL, MySQL, MongoDB, HeidiSQL, Hadoop, Azure SQL, Oracle, MS Access, Business Intelligence Development Studio (BIDS), SQL Profiler, SQL Query Analyzer, AWS.
Visualization & Reporting Tools: Tableau, Power BI, MS Reporting Server (SSRS), Excel Power View, XML Spy.
Data Modeling Tools: Power Designer, MS Visio, ERWIN.
Operating Systems: Windows, Mac, UNIX, Linux, and ChromeOS.
Data Analysis Skills: Decision-making, Risk assessments, Data mining, Blockchain (Ardor, Lightweight Contracts).

Professional Experience:
Walgreens, Chicago
April 2023 – Current
Data Analyst
Description:

Developed and maintained a sophisticated sales forecasting model, incorporating historical sales data,
seasonality trends, and external factors, resulting in a 12% reduction in excess inventory and a 10% increase in
on-shelf availability of critical pharmaceutical products. Conducted in-depth analysis of inventory turnover
rates, identifying slow-moving and obsolete items, and recommending clearance strategies that led to a 7%
reduction in holding costs.

Responsibilities:

 Aggregated, grouped, and merged data from multiple databases and tables using SQL to build datasets for modeling.

 Built an optimized logistic regression email-response-rate model to predict whether a customer will buy an insurance policy.

 Designed production code for the logistic regression model to automate the daily email marketing campaign.

 Built production-level OLS regression models on different segments to predict sales price, with an adjusted R² over 60%, beating the previous model.

 Automated binning and visualization of variables using SAS macros and Power BI, reducing variable exploration time by more than 80%.

 Created linear splines to transform continuous variables so they relate linearly to target variables.

 Employed Python libraries such as Boto3, Pandas, and NumPy for advanced data analytics and manipulation.

 Performed coefficient consistency tests to check the consistency of covariates across different segments.

 Standardized data formats to improve usability and data quality.

 Automated weblog extraction via Airflow DAGs, simplifying data processing.

 Developed interactive dashboards in Power BI to monitor key performance indicators and project progress.

 Managed project versions and progress through Git and JIRA.

 Integrated Power BI with SQL databases to create real-time, data-driven reports and visualizations.
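The linear-spline transformation used above can be sketched in a few lines of Python: each knot adds a hinge term, and a linear model fit on the expanded features captures a piecewise-linear relationship with the target. The knot locations here are illustrative; a real model would place them from the data:

```python
def linear_spline_features(x, knots):
    """Expand a continuous variable into linear-spline (hinge) basis terms:
    [x, max(0, x - k1), max(0, x - k2), ...]. A linear model fit on these
    terms yields a piecewise-linear fit with a bend at each knot."""
    return [x] + [max(0.0, x - k) for k in knots]

# One transformed row for x = 7 with illustrative knots at 3 and 6.
row = linear_spline_features(7.0, knots=[3.0, 6.0])   # -> [7.0, 4.0, 1.0]
```

Because each hinge term is zero below its knot, the fitted slope can change at every knot while the overall function stays continuous.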

Environment:

Python, SAS, R, SQL, PySpark, MySQL, OLS, PCA, SAS Enterprise Miner, JMP, Excel, Git, JIRA, Power BI.

Infosys
Nov 2020 – Jan 2023
Data Analyst

Description:
Providing decision-making support across the Toyota North American enterprise is one of the core objectives of the NAC Program. The Pilot BoM and Part Costs Requirements project developed detailed business requirements for creating and maintaining Bills of Material (eBoM). The overall goal of the project is to implement a technology solution that supports business processes by improving the accuracy, speed, and frequency of actual product cost development and by improving stakeholder access to BoM and part cost data.

Responsibilities:
 Employed SQL Server Integration Services to implement ETL (Extract, Transform, and Load) for data migration and data staging.
 Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed it in Azure Databricks.
 Implemented Apache Airflow for authoring, scheduling, and monitoring data pipelines.
 Developed mappings using transformations such as Expression, Filter, Joiner, and Lookup for better data massaging and to migrate clean, consistent data.
 Configured and upgraded the On-Premises Data Gateway between on-premises data sources such as SQL Server and cloud services such as Azure Analysis Services and the Power BI service.
 Designed several DAGs (Directed Acyclic Graphs) for automating ETL pipelines.
 Created a variety of T-SQL stored procedures, triggers, views, and table additions/changes for data loading, transformation, and extraction.
 Contributed to the creation of the data warehouse's data models and recommended data strategies and out-of-scope procedures.
 Converted intricate business calculations and logic into code.
 Created shell and Python scripts to carry out procedures or tasks that SnapLogic does not yet support.
 Managed SnapLogic servers, pipelines, and scheduled or triggered jobs.
 Carried out all development activities, including design, development, testing, and quality-assurance support, in accordance with the SDLC methodology.
 Presented a variety of reports successfully while working in agile sprints.
 Independently developed or helped develop performance-tracking metrics.
 Developed new stored procedures, user-defined functions, and user-defined triggers for application developers; developed SQL scripts for scheduling and adjusting; tuned and optimized SQL queries using the execution plan and SQL Profiler.
 Architected and managed Redshift clusters, optimizing query performance through query tuning, indexing, and data compression techniques.
 Implemented robust data security and access control measures in compliance with industry standards, ensuring data confidentiality and integrity.
 Used SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell primarily for data migration.
 Developed ad hoc reports using Power Pivot and Power Query to anticipate significant efforts and evaluate business needs.
 Created dimensional models using ERWIN and other data modeling tools; analyzed both structured and unstructured data through data profiling.
 Created Power Pivot reports in Excel and uploaded them to SharePoint to deliver ad hoc reports as needed by clients.
 Made use of Power View to create charts and other visualizations.
 Used Power View in Excel to filter and sort data, and created Power Maps for reporting.
 Supported multiple SQL Agent jobs, packages, and reports for other teams/offshore.
 Worked on a daily automatic file upload procedure and numerous data loads.
 Designed and automated several manual data loading tasks.
 Used SQL Server as a DBA, developer, and analyst for the Batlas Power project.
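An ETL DAG of the kind described above is, at bottom, a dependency graph whose tasks must run in topological order. A minimal pure-Python sketch with the standard library's `graphlib` follows; the task names are illustrative, not from the actual pipelines:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each task maps to the tasks it depends on.
etl_dag = {
    "extract_weblogs": [],
    "clean_and_stage": ["extract_weblogs"],
    "load_warehouse": ["clean_and_stage"],
    "refresh_dashboard": ["load_warehouse"],
}

# static_order() yields an execution order that respects every dependency,
# which is exactly what a scheduler like Airflow computes for its DAGs.
run_order = list(TopologicalSorter(etl_dag).static_order())
```

In Airflow the same structure is declared with operators and `>>` dependencies; the scheduler then runs each task only after all of its upstream tasks succeed.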

Environment:

Azure services, Python, Java, PowerShell, MS Power BI, MS SQL Server, SSIS, SQL Profiler, Apache Airflow.

Kricon IT Solutions
Jan 2017 – Nov 2020
SQL / Power BI Developer

Description:
A leading global financial services company, offering financial advice in all aspects of investment banking,
private banking, and asset management. The Investment Bank provides a diversified global portfolio of financial
products and services to institutional, hedge fund, corporate and government clients around the world.
The Investment Banking Department provides the full range of advisory, origination, and capital-raising services, serving customers in community banking, consumer, commercial, and small-business lending, and wealth management.

Responsibilities:
 Created visualizations to make data easily accessible across the organization.
 Prepared ad hoc analyses and ran reports related to specific sales, promotions, and events, and performed analysis to enhance business decision-making.
 Scheduled and monitored ETL processes using DTS Exec utilities and batch files. Designed and implemented ETL job restartability using checkpoints and optimal package design techniques in SSIS.
 Identified data quality issues and supported data governance by participating in necessary activities.
 Developed various T-SQL stored procedures, triggers, views, and table additions/changes for data loading, transformation, and extraction.
 Wrote complex T-SQL queries, stored procedures, triggers, views, and indexes using DML and DDL commands and user-defined functions to implement the business logic.
 Implemented Copy Activity and custom Azure Data Factory pipeline activities.
 Involved in architecture design using Power BI to connect to various data sets (Hive, CSV, Excel) through Azure Data Lake to create ad hoc reports.
 Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.
 Utilized Power Query in Power BI to pivot and unpivot the data model for data cleansing and data massaging.
 Implemented several DAX functions for various fact calculations for efficient data visualization in Power BI.
 Created SSIS packages using Pivot Transformation, Fuzzy Lookup, Derived Columns, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, and Execute Package Task to generate underlying data for reports and to export cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.

 Involved in designing and developing data warehouses, data marts, and business intelligence using multi-dimensional models such as Star and Snowflake schemas, and in developing cubes using MDX.
 Scheduled packages to keep extracting data from OLTP at specific time intervals.
 Worked on formatting SSRS reports using global variables and expressions.
 Created various visualizations such as waterfall, funnel, matrix visualizations, scatter plots, combo charts, gauges, cards, and KPIs.
 Published Power BI Desktop models to the Power BI service to create highly informative dashboards, collaborate using workspaces and apps, and get quick insights about datasets.
 Experienced in publishing Power BI Desktop reports created in Report view to the Power BI service.
 Provided security through row-level security implementation.
 Created various Power BI reports that included charts, filters, scorecards, drilldown and drill-through, cascade, and parameterized reports involving conditional formatting.
 Mentored business power users to create reports/dashboards using Power BI.
 Used Power Query to acquire data and Power BI Desktop to design rich visuals.
 Developed custom calculated measures using DAX in Power BI to satisfy business requirements.
 Developed and implemented complex XML schemas for a large e-commerce website, resulting in improved data organization and faster load times.
 Collaborated with a team of developers to troubleshoot and resolve issues with XML data exchange between systems, ensuring seamless integration across platforms.
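The unpivot step done with Power Query above turns a wide table into long format: one (id, attribute, value) record per cell. A minimal pure-Python sketch follows; the column names are illustrative, not from an actual data model:

```python
def unpivot(rows, id_col, value_cols):
    """Melt wide rows into long-format records, mirroring Power Query's
    'Unpivot Columns' step: one (id, attribute, value) record per cell."""
    return [
        {id_col: row[id_col], "attribute": col, "value": row[col]}
        for row in rows
        for col in value_cols
    ]

# One wide row with two month columns becomes two long-format records.
wide = [{"store": "A", "jan": 10, "feb": 12}]
long_format = unpivot(wide, "store", ["jan", "feb"])
```

Long format is what most BI tools and DAX measures prefer, because a single "value" column can be filtered and aggregated by the "attribute" column instead of by a growing list of month columns.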

Environment: SSIS, SSAS, SSRS, SSMS, Visual Studio, SharePoint, Power BI, Azure, C#, TFS, JIRA, XML Spy.
