Ankita


Email: asinghcareers147@gmail.com
Phone: +1 8722588518
LinkedIn: https://www.linkedin.com/in/ankita-s-88a47b234/

Software Engineer
---------------------------------------------------------------------------------------------------------------
PROFESSIONAL SUMMARY:

● 8 years of professional experience as a Software Engineer, a proficient coder in multiple languages, with experience
in the design, development, and implementation of Python and client-server applications, RESTful services, AWS,
Java, and SQL
● Developed and reviewed SQL queries using joins and window functions in Power BI to validate static and
dynamic data for data visualization
● Experienced in working with various stages of Software Development Life Cycle (SDLC), Software Testing Life Cycle
(STLC) and QA methodologies from project definition to post-deployment documentation.
● Experience designing, coding, and debugging operations, reporting, data analysis, and web applications utilizing Python
● Experienced in implementing object-oriented Python, hash tables (dictionaries), multithreading, Django,
MySQL, exception handling, and collections
● Worked with MVC-style frameworks like Django alongside HTML, CSS, XML, JavaScript, jQuery, and Bootstrap.
● Strong experience of software development in Python (libraries used: NumPy, Matplotlib, python-twitter,
pandas, NetworkX, MySQLdb for database connectivity) and IDEs such as Sublime Text and Spyder
● Hands-on experience working with various Relational Database Management Systems (RDBMS) like MySQL,
Microsoft SQL Server, Oracle, and non-relational (NoSQL) databases like MongoDB and Cassandra
● Experienced in developing Web Services with Python programming language - implementing JSON
based RESTful and XML based SOAP web services
● Proficient in performing data analysis and data visualization using Python libraries in PyCharm.
● Experience in using Version Control Systems like GIT to keep the versions and configurations of the code organized
● Exposure to CI/CD tools - Jenkins for Continuous Integration, Ansible for continuous deployment.
● Experienced with containerizing applications using Docker
● Experience in maintaining and executing build scripts to automate development and production builds.
● Experience with the Amazon Web Services (AWS) cloud platform: EC2, Virtual Private Clouds (VPCs), storage models
(EBS, S3, instance storage), and Elastic Load Balancers (ELBs)
● Designed the data models used in data-intensive AWS Lambda applications aimed at complex analysis, creating
analytical reports for end-to-end traceability, lineage, and definition of key business elements
from Aurora
● Implemented automated local user provisioning for instances created in AWS Cloud and Google Cloud.
● Excelled at creating AMIs (Amazon Machine Images) that utilize ELB (Elastic Load Balancing) and Auto Scaling.
● Implemented a 'serverless' architecture using API Gateway, Lambda, and DynamoDB, and deployed code to AWS
from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from
an S3 bucket
● Experience with Unit testing/ Test driven Development (TDD), Load Testing
● Good experience with testing tools like JIRA and HP ALM for bug tracking. Experience in building frameworks and
automating complex workflows using Python for test automation
● Experience implementing server-side technologies with RESTful APIs and MVC design patterns using Node.js
and the Django framework
● Hands-on experience using NoSQL databases like MongoDB and Cassandra
● Relational databases like Oracle, SQLite, PostgreSQL, and MySQL
● Experience deploying applications to heterogeneous application servers: Tomcat, WebLogic, and Oracle
Application Server
● Good knowledge of Amazon AWS services like EMR and EC2, which provide fast and efficient
processing of Big Data
● Proficient in object-oriented design, with extensive experience of Python-C/C++ binding using Boost.Python
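The serverless API Gateway + Lambda + DynamoDB pattern claimed above can be illustrated with a minimal sketch. The DynamoDB write is injected as a callable (standing in for a boto3 table's put_item) so the handler can run without AWS access; all names here are illustrative, not taken from the actual projects.

```python
import json

def handler(event, context, put_item=None):
    """Sketch of an API Gateway (proxy integration) -> Lambda -> DynamoDB
    handler. `put_item` stands in for a DynamoDB table write, e.g.
    boto3.resource("dynamodb").Table("items").put_item; injected here so
    the sketch is runnable locally."""
    body = json.loads(event.get("body") or "{}")
    if "id" not in body:
        # API Gateway's proxy integration expects this response shape
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}
    if put_item is not None:
        put_item(Item=body)  # hypothetical table write
    return {"statusCode": 200, "body": json.dumps({"saved": body["id"]})}
```

Decoupling the storage call this way also makes the handler unit-testable, in line with the TDD experience listed above.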

TECHNICAL SKILLS:
Programming Languages: Python 2.X, Python 3.X, Java, SQL, C, C++
Web technologies: HTML, CSS, XHTML, Java Script, jQuery, AJAX, XML, JSON
Web servers: WebLogic, Apache Tomcat 5.5/6.0/8.0
Databases: MySQL, PLSQL, Oracle, Microsoft SQL, PostgreSQL, MongoDB
Python Web frameworks: Django, Pyramid, Flask, web2Py.
Development IDEs: PyCharm, PyDev (Eclipse), NetBeans, MS Visio, Sublime Text, Notepad++
Operating Systems: Linux, Windows 10/8/7/Vista/XP, Mac
Web Services: SOAP, RESTful
Big Data: PySpark
Version Control: Git, GitHub
Build Tools: GNU, Apache Ant, Apache Maven, Buck, Bit-Bake, Boot, Grunt
Methodologies: Agile, Scrum, Waterfall

PROFESSIONAL EXPERIENCE
Company: NREL (National Renewable Energy Laboratory), Denver, CO June 2023 – Present
Description: NREL (National Renewable Energy Laboratory) is a research-based organization focused on renewable
energy and energy efficiency. I was part of the Stratus Cloud Team within the Advanced Computing Operations (ACO)
group, specializing in advanced technologies like IoT and machine learning (ML). My responsibilities included gathering
requirements, translating business details into technical design, and participating in all stages of the SDLC. I developed
front-end and back-end modules using Python and Django, implemented digital twins for hydrogen plants using AWS
TwinMaker, and led the development of CloudBot, a research assistant leveraging AWS technologies. I also managed
RESTful APIs, developed custom Power BI dashboards, and created Docker containers.
Role: Software Engineer
Responsibilities:

● Gathering requirements and translating the Business details into Technical design.
● Participated in all stages of the software development lifecycle (SDLC): design, development, testing, and
implementation.
● Developed entire frontend and backend modules using Python on Django Web Framework by implementing MVC
architecture.
● Spearheaded development of digital twins for hydrogen plants utilizing AWS TwinMaker. Implemented real-time
analytics and dashboard functionalities using AWS IoT Core and edge computing technologies, leveraging Kafka for
seamless data stream management and Grafana for visualization. Achieved 40-70% enhancement in operational
insights and decision-making capabilities.
● Led development of CloudBot, an LLM-powered research assistant, integrating AWS SageMaker, Bedrock, and LangChain
ReAct. Designed to streamline researchers' access to cloud and native applications at NREL, it achieved a 40% boost in
insights-extraction efficiency; continuously optimized model performance through iterative tuning.
● Designed and developed REST API service by leveraging AWS Lambda, API Gateway, and DynamoDB to streamline data
processing, fostering real-time communication between clients and servers, and delivering a 40% reduction in response time,
resulting in significantly enhanced efficiency and scalability for dynamic application requirements.
● Implemented responsive user interface and standards throughout the development and maintenance of the
website using the HTML, CSS, JavaScript, jQuery.
● Worked on MongoDB database concepts such as locking, transactions, indexing, sharding, replication, and schema
design.
● Developed custom Microsoft Power BI dashboards and visualizations to deliver actionable insights
● Created Docker containers leveraging existing Linux containers and AMIs, in addition to building
Docker containers from scratch.
● Implemented Google OAuth flow
● Writing SQL queries to support product and other teams with ad-hoc reporting
● Used JIRA to track and manage development work.
● Developed a wrapper in Python for instantiating multi-threaded applications.
● Created RESTful web services for Catalog and Pricing with Django MVT, MySQL, and MongoDB.
● Fixed bugs, providing production support, enhanced applications by improving code reuse and performance by
making effective use of various design patterns.
● Deployed and monitored scalable infrastructure on Amazon web services (AWS).
● Implemented monitoring and established best practices around using Elastic search and used AWS Lambda to run
code without managing servers.
● Experience with the Amazon Web Services (AWS) cloud platform: EC2, Virtual Private Clouds (VPCs), storage models
(EBS, S3, instance storage), and Elastic Load Balancers (ELBs).
● Implemented a 'serverless' architecture using API Gateway, Lambda, and DynamoDB, and deployed code to AWS
from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from
an S3 bucket.
● Generated graphical reports using the Python packages NumPy and Matplotlib.
● Implemented task object to interface with data feed framework and invoke database message service setup and
update functionality.
● Performed efficient delivery of code based on principles of Test-Driven Development (TDD) and continuous
integration to keep in line with Agile Software Methodology principles.
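The Lambda-from-S3-events bullet above can be sketched as follows. The event shape is the standard S3 notification format that AWS delivers to a Lambda function; the "processing" here is only a stand-in (a deployed function would fetch each object with boto3 and do real work).

```python
def extract_s3_objects(event):
    """Pull (bucket, key) pairs out of the S3 notification event that
    AWS delivers to a Lambda function configured as an S3 event target."""
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            objects.append((bucket, key))
    return objects

def lambda_handler(event, context):
    # In the deployed function each object would be fetched with boto3
    # and processed; this sketch just reports what arrived.
    return {"processed": extract_s3_objects(event)}
```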

Environment: Python 3.7, Django 1.7, HTML5, CSS, JSON, JavaScript, AJAX, RESTful web services, MongoDB, MySQL,
jQuery, SQLite, Docker, Windows Server 2012, AWS (EC2, S3), PyUnit, Jenkins, Selenium Automation Testing.
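The multi-threaded wrapper mentioned in the responsibilities above could look something like the following stdlib-only sketch; the interface is invented for illustration, not the actual wrapper's API.

```python
import threading

def run_threaded(fn, args_list):
    """Sketch of a wrapper that fans a function out across threads,
    one per argument tuple, and collects results in input order."""
    results = [None] * len(args_list)

    def worker(i, args):
        results[i] = fn(*args)  # each thread fills its own slot

    threads = [threading.Thread(target=worker, args=(i, a))
               for i, a in enumerate(args_list)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait for all workers before returning
    return results
```

Because each thread writes to a distinct list index, no lock is needed for the results; this pattern suits I/O-bound work, where Python threads overlap well despite the GIL.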

Company: NREL (National Renewable Energy Laboratory), Denver, CO Jun 2022 – May 2023
Role: Software Engineer
Description: NREL (National Renewable Energy Laboratory) is a research-based organization focused on renewable
energy and energy efficiency. As a full-time Software Engineer, I continued my work with the Stratus Cloud Team within
the Advanced Computing Operations (ACO) group. My role involved communicating regularly with business and IT
leadership, building and deploying jobs using Airflow, and developing ETL pipelines for data ingestion into S3. I developed
a Single Sign-On (SSO) cloud portal website using Django. I also wrote SQL queries, developed stored procedures,
coordinated with team members on test scripts, and optimized Spark jobs for faster data processing.
Responsibilities:
● Communicated regularly with business and IT leadership.

● Built and Deployed jobs using Airflow.

● Responsible for data extraction and data ingestion from different data sources into S3 by creating ETL pipelines
using Spark and Hive.
● Developed a Single Sign-On (SSO) cloud portal website using the Django framework, incorporating microservices,
web services, and serverless patterns.
● Developed enterprise dashboards in Power BI to meet the needs of stakeholders

● Wrote SQL queries using Joins, Window Functions, CTE to extract data from multiple tables

● Specialized in developing stored procedures/functions, packages and database triggers using Oracle SQL and
PL/SQL.
● Coordinated with other team members to write and generate test scripts and test cases for numerous user stories.

● Analyzed existing SQL scripts and redesigned them in PySpark SQL for faster performance.

● Knowledgeable in AWS Services including S3 and RDS.

● Hands-on experience in creating Docker containers and Docker consoles for managing the application life cycle.

● Involved in building database models, APIs, and Views utilizing Python, to build an interactive web-based solution.

● Experience in handling, configuration, and administration of databases like MySQL and NoSQL databases
like MongoDB and Cassandra.
● Created a PySpark data frame to bring data from DB2 into Amazon S3 and optimized the Spark jobs to run on a
Kubernetes cluster for faster data processing.
● Used GIT to collaboratively interact with the other team members.

● Involved in Agile methodologies, daily scrum meetings and sprint planning.

● Used Django configuration to manage URLs and application parameters.

● Knowledge in Data mining and Data warehousing using ETL Tools and Proficient in Building reports and dashboards
in Tableau (BI Tool)
● Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
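The joins / window functions / CTE work above can be illustrated with a self-contained example. The stdlib sqlite3 module stands in for the Oracle/MySQL databases actually used (SQLite supports window functions since 3.25), and the table and figures are invented.

```python
import sqlite3

# Toy data standing in for real reporting rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE usage(customer TEXT, month TEXT, amount REAL);
    INSERT INTO usage VALUES
        ('a', '2023-01', 10), ('a', '2023-02', 30),
        ('b', '2023-01', 20), ('b', '2023-02', 5);
""")

query = """
WITH monthly AS (              -- CTE isolating the rows of interest
    SELECT customer, month, amount FROM usage
)
SELECT customer, month, amount,
       SUM(amount) OVER (PARTITION BY customer ORDER BY month)
           AS running_total    -- window function: per-customer running sum
FROM monthly
ORDER BY customer, month;
"""
rows = conn.execute(query).fetchall()
```

The window function computes a running total per customer without collapsing rows, which is exactly the kind of query a GROUP BY alone cannot express.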
Environment: Python, Django, HTML5, CSS, XML, AJAX, jQuery, Pyquery, PostgreSQL, Eclipse, Git
Client: Amdocs, India Jun 2015 – May 2021
Description: Amdocs is a leading provider of software and services to communications and media companies, enhancing
customer experiences through innovative solutions. I designed RESTful APIs to improve the efficiency of Amdocs Billing
Products for AT&T, reducing processing time by 20%. I led a seamless migration of 1 million data records from Oracle to
MySQL and optimized billing products, resulting in a 25% boost in system efficiency. My role also included developing and
debugging software tools, creating automated solutions, and providing production maintenance support.

Role: Software Engineer


Responsibilities:

● Designed RESTful APIs for microservices to meet specific business requirements for enhancing the efficiency and
responsiveness of Amdocs Billing Products for AT&T, resulting in a 20% reduction in processing time.
● Used UML Tools to develop Use Case diagrams, Class diagrams, Collaboration and Sequence Diagrams, State
Diagrams and Data Modeling.
● Led and validated the migration of 1 million data records from Oracle to MySQL, collaborating with senior engineers to
ensure a seamless transition with a performance-focused architecture approach.
● Enhanced existing automated solutions, such as the Inquiry Tool for automated Asset Department reporting, and
added new features and fixed bugs.
● Created database using MySQL, wrote several queries to extract/store data.

● Developed, tested and debugged software tools utilized by clients and internal customers

● Extracted and loaded data using Python scripts and PL/SQL packages

● Optimized Amdocs OSS and BSS billing products, leading rigorous network testing with XML files and SoapUI,
achieving a 25% boost in system efficiency. Demonstrated strong troubleshooting skills and seamless integration of
critical functionalities.
● Installed and configured MongoDB cluster nodes on different AWS EC2 instances.

● Supported Java application for Media portal management.

● Wrote Python scripts to parse XML documents and load the data into the database.

● Created unit-test/regression-test frameworks for existing and new code.

● Debugged and tested applications and fine-tuned performance.

● Provided maintenance support in the production environment.
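The XML-parsing-and-load bullet above can be sketched with stdlib xml.etree.ElementTree. The element and attribute names below are illustrative, not the actual Amdocs feed schema; each parsed tuple would become one INSERT in the load step.

```python
import xml.etree.ElementTree as ET

# Invented sample document standing in for a real data feed.
SAMPLE = """<accounts>
  <account id="1001"><name>Acme</name><balance>250.00</balance></account>
  <account id="1002"><name>Globex</name><balance>99.50</balance></account>
</accounts>"""

def parse_accounts(xml_text):
    """Parse account elements into (id, name, balance) rows ready
    for a parameterized database INSERT."""
    root = ET.fromstring(xml_text)
    rows = []
    for acct in root.iter("account"):
        rows.append((
            int(acct.get("id")),
            acct.findtext("name"),
            float(acct.findtext("balance")),
        ))
    return rows
```

Keeping parsing separate from loading means the same rows can feed MySQL, Oracle, or a test fixture unchanged.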

Environment: Python, Django, HTML5, CSS, XML, AJAX, Bootstrap, jQuery, Pyquery, PostgreSQL, Eclipse, Git, Linux, Shell
Scripting.
