Durga Prasad - Resume
[email protected]
____________________________________________________________________________________
OBJECTIVE:
To obtain a challenging and rewarding position in a dynamic organization where my education and
experience align with the company's goals.
PROFESSIONAL SUMMARY
13+ years of IT experience in the analysis, design, development, documentation, implementation,
and testing of software systems in Java, J2EE, and Internet technologies.
Developed new libraries in a microservices architecture using REST APIs and Spring Boot
Transformed legacy applications into a suite of AWS/Azure cloud-hosted microservices using
Spring Boot
Worked in an Agile framework, collaborating with business and research teams on story grooming,
reviewing story/acceptance criteria, and performance metrics
Extensive experience in developing microservices using Spring Boot and Netflix OSS (Eureka,
Hystrix), following domain-driven design.
Experience designing API specifications using BIAN and Open Banking standards
Strong core development experience with Java 8 features
Good knowledge of software development across the SDLC using methodologies such as
Agile and Waterfall
Experience using build/deploy tools such as Jenkins and Docker for continuous integration and
deployment of microservices.
Good exposure to multithreading and concurrency
Experience implementing Java design patterns such as Singleton, Factory, Decorator, and
Strategy in the development of multi-tier distributed enterprise applications.
Hands-on experience with build and deployment tools including Gradle and Maven; logging and
debugging using SLF4J and Log4j
Good experience in unit and integration testing using JUnit, Mockito, and Cucumber
Experience writing queries in SQL and NoSQL stores such as MongoDB and Couchbase
Good experience with Kafka producers and consumers, Kafka Streams, connectors, and Kafka
cluster setup; good knowledge of consumer groups, topics, partitions, etc.
Implemented DAOs and entities using the Hibernate API and HQL.
Participated in daily stand-up meetings as part of the Agile process to report day-to-day
progress on the work done
Experience in the Banking, Payments, Retail, Telecom, and Travel domains
Skills:
Java 8, Spring Boot, Microservices, Design Patterns, Design Principles, Kafka Messaging System,
Kafka Streams, Kafka Connectors, Multithreading, Collections, Data Structures and Algorithms,
SQL, NoSQL, REST APIs, CI/CD, Docker, Kubernetes, AWS, Azure
PROFESSIONAL EXPERIENCE:
Currently working as Java Technical Lead at SID Global Digital Solutions, Hyderabad (Oct
2022 – 2024; clients: BDO Bank, ICICI Bank)
Worked as Lead Engineer at Tesco India Pvt Ltd, Hyderabad (Jan 2022 – 2022, via
JouelsToWatts Business Private Limited)
Worked as Senior Lead at Gap It India Pvt Ltd, Hyderabad (Jan 2021 – 2021, via Synergy
Global)
Worked as Lead at Wells Fargo India Pvt Ltd, Hyderabad (Feb 2019 – 2021, via Synechron
Technologies)
Worked as Analyst 1 Apps Prog at BA Continuum India Pvt Ltd, Hyderabad (November
2014 – 2018)
Worked as Software Engineer at Verizon Data Service India Pvt. Ltd, Hyderabad (April
2012 – November 2014)
Worked as Technical Consultant at Sonata-Software Pvt. Ltd, Hyderabad (Feb 2011 – April
2012)
PROJECT EXPERIENCE:
To build customer convenience by providing a reliable and efficient payment solution for corporate
customers.
To ensure a seamless and stress-free transaction journey for the client: assuring that transactions
succeed, avoiding the need for frequent status checks on timeout and pending statuses, removing
the need to remember error codes, and reducing complexity.
Technology: Core Java 8, Spring Boot, Microservices, Kafka Messaging System (Kafka Streams,
KTable, Kafka Connectors, Non-Blocking Retry, Schedulers, Processor APIs, Kafka Joins), Design
Patterns, Docker, AWS
Responsibilities:
Working as Kafka architect: gathering requirements from business stakeholders based on use
cases, analyzing whether Kafka is required, and setting up the Kafka cluster
Experience using Kafka Streams to route multiple payment systems such as NEFT, RTGS, and
FT, and using SQL and KTables to map data from topic to topic and perform joins on matched
and unmatched data
Created designs based on the BRS and the Kafka template, covering source and destination
system details, data volume, and frequency
Implemented producers and consumers, created topics and partitions, and deployed the code
to Kafka clusters in Control Center
Experience managing exceptions and failure scenarios
Experience with Kafka connectors such as the JDBC source and sink connectors
Experience creating KStreams and KTables and processing data based on the different
payment modes such as NEFT, IMPS, RTGS, UPI, and FT
Experience using non-blocking retry mechanisms and schedulers
Experience implementing the debit status check API
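As an illustration of the non-blocking retry mechanism mentioned above, here is a minimal sketch in Java: failed work is re-scheduled on a ScheduledExecutorService instead of blocking the consuming thread. The flaky task, delays, and attempt counts are invented for illustration and do not reflect the production configuration.

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of a non-blocking retry: a failed attempt schedules the next
// attempt (with a doubled delay) rather than sleeping on the caller's thread.
public class NonBlockingRetry {
    private static final ScheduledExecutorService SCHEDULER =
            Executors.newSingleThreadScheduledExecutor();

    static CompletableFuture<String> withRetry(Callable<String> task, int attemptsLeft, long delayMs) {
        CompletableFuture<String> result = new CompletableFuture<>();
        SCHEDULER.schedule(() -> {
            try {
                result.complete(task.call());
            } catch (Exception e) {
                if (attemptsLeft <= 1) {
                    result.completeExceptionally(e);
                } else {
                    // Chain the next attempt without blocking any thread.
                    withRetry(task, attemptsLeft - 1, delayMs * 2)
                            .whenComplete((v, err) -> {
                                if (err != null) result.completeExceptionally(err);
                                else result.complete(v);
                            });
                }
            }
        }, delayMs, TimeUnit.MILLISECONDS);
        return result;
    }

    public static void main(String[] args) throws Exception {
        AtomicInteger calls = new AtomicInteger();
        // Hypothetical task that fails twice, then succeeds on the third attempt.
        Callable<String> flaky = () -> {
            if (calls.incrementAndGet() < 3) throw new RuntimeException("transient");
            return "ok";
        };
        System.out.println(withRetry(flaky, 5, 10).get()); // ok
        SCHEDULER.shutdown();
    }
}
```

In a Kafka setting the same idea is usually realized with retry topics and delayed consumption rather than an in-process scheduler; this sketch only shows the non-blocking structure.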
CMS API Banking - corporate customers process transactions via API banking;
supports internet and mobile banking
Technology: Core Java 8, Spring Boot, Microservices, Kafka Messaging System (Kafka Streams,
KTable, Kafka Connectors, Non-Blocking Retry, Schedulers, Processor APIs, Kafka Joins), Design
Patterns, Docker, AWS, High-Level and Low-Level System Design, API Specifications, BIAN
Standards, Open Banking Standards
Responsibilities:
Made design and functional enhancements to consumer-facing Android and iOS mobile
banking applications
Assisted customers with online banking issues
Filed cases to correct customers' online issues
Serviced customers with bill-pay issues and lost payments
Experience with third-party libraries and APIs; ran Fortify scans to verify that application
security protocols were met
Designed high-level and low-level system designs
Created Swagger specifications
Applied BIAN (Banking Industry Architecture Network) and Open Banking standards
Worked with stakeholders to deliver the right solution
Maintained friendly and professional customer interactions
Reviewed designs and code quality
Transporting goods from different depots to Tesco stores in an optimized way across the
European region
Technology: Core Java 8, Spring Boot, Microservices, Kafka Messaging System, Design Patterns,
Docker, Azure, PostgreSQL, MySQL
Responsibilities:
Reduced the time to fetch the stores for each depot from 50 seconds to 7 seconds by adding an
index and optimizing the queries, which significantly improved performance
To validate CSV headers, we created a .yml file with a regex pattern for each header and mapped
the incoming headers against the expected headers; this simplified header validation for each
CSV and avoided a lot of boilerplate code
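A minimal sketch of that validation approach, with the header names and regex patterns hard-coded here instead of loaded from the .yml file; the column names (depotId, storeId, quantity) are invented for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Validates a CSV header row against one regex per expected column,
// replacing per-column hand-written checks with a data-driven table.
public class CsvHeaderValidator {
    // In the real setup these pairs would be loaded from a .yml config file.
    private final Map<String, Pattern> expected = new LinkedHashMap<>();

    public CsvHeaderValidator() {
        expected.put("depotId", Pattern.compile("(?i)depot[_ ]?id"));
        expected.put("storeId", Pattern.compile("(?i)store[_ ]?id"));
        expected.put("quantity", Pattern.compile("(?i)qty|quantity"));
    }

    /** Returns true when every expected header matches the column at the same position. */
    public boolean validate(String headerLine) {
        String[] columns = headerLine.split(",");
        if (columns.length != expected.size()) return false;
        int i = 0;
        for (Pattern p : expected.values()) {
            if (!p.matcher(columns[i++].trim()).matches()) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        CsvHeaderValidator v = new CsvHeaderValidator();
        System.out.println(v.validate("Depot_Id,Store Id,Qty"));  // true
        System.out.println(v.validate("Depot_Id,Qty,Store Id"));  // false (order matters)
    }
}
```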
Using CompletableFuture, we processed most of the demands in parallel instead of
sequentially
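The parallel processing described above can be sketched with CompletableFuture; processDemand is a hypothetical stand-in for the real per-demand work.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Fans out all demands onto a thread pool at once and then joins the
// results, instead of processing each demand sequentially in a loop.
public class ParallelDemands {
    static String processDemand(String demand) {
        return demand.toUpperCase(); // placeholder for the real per-demand work
    }

    public static void main(String[] args) {
        List<String> demands = List.of("neft", "rtgs", "imps");
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Kick off all demands concurrently.
        List<CompletableFuture<String>> futures = demands.stream()
                .map(d -> CompletableFuture.supplyAsync(() -> processDemand(d), pool))
                .toList();

        // Wait for all to finish; results keep the original submission order.
        List<String> results = futures.stream()
                .map(CompletableFuture::join)
                .toList();

        System.out.println(results); // [NEFT, RTGS, IMPS]
        pool.shutdown();
    }
}
```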
Transformed a monolithic app into microservices using Spring Boot
Centralized configuration and centralized logging by deploying to the Azure cloud and Splunk,
respectively
Coordinated with the technical architects on detailed design and implemented Java 8-based
applications
Outlined documents and performed program coding and testing in compliance with the approved
life-cycle methodologies
Fixed and troubleshot Java code issues and technical issues for designers and developers during
the project lifecycle
Resolved application-based issues through debugging, market research, and investigation
Validated the different types of demands across the Tesco depots and stores
Implemented Kafka producer and consumer configuration on the Kafka setup
Implemented replay of failed messages in Kafka using the offset ID
Used various design patterns in the application: Strategy, Factory, etc.
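As a minimal sketch of the Strategy pattern applied to payment-mode routing (the modes are from the projects above; the handler bodies and message strings are invented for illustration):

```java
import java.util.Map;

// Each payment mode is handled by its own strategy; the router looks the
// strategy up by mode instead of branching with if/else per payment type.
public class PaymentRouter {
    interface PaymentStrategy {
        String process(double amount);
    }

    private static final Map<String, PaymentStrategy> STRATEGIES = Map.of(
            "NEFT", amt -> "NEFT batch queued for " + amt,
            "RTGS", amt -> "RTGS real-time transfer of " + amt,
            "IMPS", amt -> "IMPS instant transfer of " + amt
    );

    static String route(String mode, double amount) {
        PaymentStrategy s = STRATEGIES.get(mode);
        if (s == null) throw new IllegalArgumentException("Unknown mode: " + mode);
        return s.process(amount);
    }

    public static void main(String[] args) {
        System.out.println(route("NEFT", 1000.0));   // NEFT batch queued for 1000.0
        System.out.println(route("RTGS", 250000.0)); // RTGS real-time transfer of 250000.0
    }
}
```

Adding a new payment mode then only requires registering one more strategy, leaving the router untouched.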
CMS is an engine that receives feeds from multiple upstream systems and processes them to the
downstream system (FLM) in three phases: filtering, validating, and transforming the data. We
processed projections and actuals to downstream systems.
Technology: Core Java 8, Spring Boot, Microservices, Design Patterns, Docker, AWS, PostgreSQL
Responsibilities:
S3D (Single Source Standing Data) is an application that handles all product and party data; we
received data from upstream systems and processed it to downstream systems, based on the requests
of the different downstream systems, through GLOSS (Global Settlement Systems). It currently
serves 30 downstream systems.
Technology: Core Java, Spring Boot, Microservices, Design Patterns, Docker, PostgreSQL
Responsibilities:
Interacted with the client and peer developers to understand the technology specifications derived
from the business requirements for the modules within the project
Used multithreading to implement the report generation and price batch applications for
source system feeds
Coordinated with QA on application testing
Worked on defect fixes and performed PR reviews
Involved in the rules-engine implementation for processing fixed-income instruments, using the
concurrency package and Collections as part of the implementation
Debugged and fixed production issues as a priority and in a timely manner
Involved in core development, deployment, and server-side configuration, including
programming in Core Java 8 and JVM tuning to utilize the CPU and heap efficiently
Involved in understanding the input data requirements and mapping them to the design, then
developing and testing in every sprint under the Agile methodology
Experience using the Facade and Decorator design patterns
Responsibilities:
Musky is the middle layer between IRIS.Plus and the inventories/platforms and serves to connect
them; an easy way to understand the functionality of both Musky and IRIS.Plus is to document the
communication between them.
EDUCATION:
JNTU | Hyderabad
B.Tech, Computer Science