Experienced Data Analyst proficient in data migration, ETL processes, and data verification.
Skilled in extracting, transforming, and loading data from various sources using Python, APIs, and shell scripting.
Proficient in developing frameworks and business rules for enhanced data quality and integrity, resulting in streamlined decision-making and improved migration processes.
Adept at creating comprehensive dashboard reports for clear visualization and proactive data cleansing strategies.
Collaborative team player with strong communication skills, capable of understanding and documenting business processes and requirements.
Data Analyst
BT Group
12.2020 - Current
Project: Migration of B2B and B2C accounts, products, and services from legacy systems to the Salesforce CRM stack.
Extracted data from Siebel CRM, billing, inventory, and attribute sources using Python, APIs, and shell scripting, enabling efficient data integration into GCP BigQuery for in-depth analysis (see the extraction sketch below).
Developed a robust framework and business rules using BigQuery SQL and routines, substantially improving data quality, capability, and integrity metrics.
Accelerated migration decision-making by 80% by creating comprehensive dashboard summary reports that give a clear view of each account's eligibility for migration.
Enhanced data quality and integrity while increasing the number of eligible accounts for migration through proactive data cleansing strategies.
Provided actionable insights through comprehensive reports and dashboards, supporting strategic initiatives.
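A minimal Python sketch of the extract, load, and quality-check flow described in the first two bullets above. The API endpoint, BigQuery table name, column names, and the sample business rule are hypothetical placeholders rather than details from the actual project; it assumes the requests and google-cloud-bigquery libraries with default application credentials.

```python
"""Sketch: pull records from a source-system API, load them into a
BigQuery staging table, and run one illustrative data-quality rule.
All names below (URL, table, columns) are hypothetical placeholders."""
import requests
from google.cloud import bigquery

API_URL = "https://example.internal/api/accounts"    # hypothetical endpoint
TABLE_ID = "my-project.migration_staging.accounts"   # hypothetical table


def extract_accounts() -> list[dict]:
    """Pull account records from a legacy-system REST API."""
    response = requests.get(API_URL, timeout=60)
    response.raise_for_status()
    return response.json()


def load_to_bigquery(client: bigquery.Client, rows: list[dict]) -> None:
    """Load the extracted rows into the BigQuery staging table."""
    job = client.load_table_from_json(rows, TABLE_ID)  # asynchronous load job
    job.result()  # block until the load completes


def run_quality_check(client: bigquery.Client) -> int:
    """Example business rule: count accounts missing a billing reference."""
    sql = f"SELECT COUNT(*) AS bad_rows FROM `{TABLE_ID}` WHERE billing_ref IS NULL"
    return next(iter(client.query(sql).result()))["bad_rows"]


if __name__ == "__main__":
    bq = bigquery.Client()
    records = extract_accounts()
    load_to_bigquery(bq, records)
    print(f"Loaded {len(records)} rows; {run_quality_check(bq)} failed the billing rule")
```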
ETL Developer
BT Group
01.2019 - 12.2020
Project: DGS migration
Successfully implemented ETL processes using Oracle procedures, functions, and packages, resulting in a 20% reduction in data processing time.
Designed and executed transformation logic based on product mapping, improving data accuracy by 15% (see the mapping sketch below).
Streamlined complex reporting and data archival systems, reducing storage costs by 25%.
Extracted data from diverse source systems including XML, JSON, Excel, Oracle DBLink, API, and cloud platforms, ensuring seamless integration and reducing data retrieval time by 30%.
Managed bulk B2B data to restructure product models, enhancing product adaptability and reducing time-to-market by 20%.
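A condensed Python sketch of the multi-source extraction and product-mapping transformation mentioned above. The file paths, column names, and mapping entries are illustrative assumptions; in the project the transformation logic ran in Oracle procedures rather than pandas.

```python
"""Sketch: combine JSON and Excel extracts, then remap legacy product
codes to the target product model. Paths, columns, and mapping values
are hypothetical examples."""
import json

import pandas as pd

# Hypothetical legacy-to-target product mapping
PRODUCT_MAP = {"LEGACY_BB_40": "FIBRE_ESSENTIAL", "LEGACY_BB_80": "FIBRE_PLUS"}


def read_sources(json_path: str, excel_path: str) -> pd.DataFrame:
    """Combine records from a JSON extract and an Excel extract."""
    with open(json_path) as fh:
        json_df = pd.DataFrame(json.load(fh))
    excel_df = pd.read_excel(excel_path)
    return pd.concat([json_df, excel_df], ignore_index=True)


def apply_product_mapping(df: pd.DataFrame) -> pd.DataFrame:
    """Map legacy product codes onto the target product model."""
    df["target_product"] = df["legacy_product"].map(PRODUCT_MAP)
    unmapped = int(df["target_product"].isna().sum())
    if unmapped:
        print(f"Warning: {unmapped} rows have no product mapping")
    return df


if __name__ == "__main__":
    products = apply_product_mapping(read_sources("products.json", "products.xlsx"))
    print(products.head())
```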
Oracle PL/SQL Developer
Synchrony Financial
06.2016 - 12.2018
Projects: Checkfree & Massflip
Designed and developed tables, views, materialized views, stored procedures, packages, triggers and functions
Implemented advanced bulk processing techniques such as FORALL, collections, PL/SQL tables and BULK COLLECT to enhance performance
Implemented records, tables, and collections (nested tables and arrays) to optimize query performance by minimizing context switching
Used implicit cursors, explicit cursors, and REF CURSORs for a wide range of database operations
Wrote UNIX shell scripts for daily file-processing tasks such as renaming, extracting, unzipping, and stripping unwanted characters before loading the data into base tables
Created a multi-threaded shell script to extract bulk data from the database through parallel processes, significantly reducing extraction time (see the parallel-extraction sketch below)
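The parallel bulk-extraction idea from the last bullet, rendered in Python for readability (the original implementation was a UNIX shell script spawning parallel database sessions). The connection details, table, columns, and ID ranges are hypothetical, and it assumes the python-oracledb driver.

```python
"""Sketch: split an extract into ID ranges and pull each range in its
own process, writing one CSV per range. Connection details, table,
and ranges are hypothetical placeholders."""
import csv
from concurrent.futures import ProcessPoolExecutor

import oracledb  # python-oracledb driver; credentials below are placeholders

DSN = "host.example.com/ORCLPDB1"  # hypothetical database service


def extract_range(bounds: tuple[int, int]) -> str:
    """Extract one ID range to its own CSV file."""
    start, end = bounds
    out_path = f"extract_{start}_{end}.csv"
    with oracledb.connect(user="etl_user", password="change_me", dsn=DSN) as conn:
        cur = conn.cursor()
        cur.execute(
            "SELECT account_id, status FROM accounts "
            "WHERE account_id BETWEEN :lo AND :hi",
            {"lo": start, "hi": end},
        )
        with open(out_path, "w", newline="") as fh:
            csv.writer(fh).writerows(cur)  # cursor iterates row tuples
    return out_path


if __name__ == "__main__":
    ranges = [(1, 100_000), (100_001, 200_000), (200_001, 300_000)]
    with ProcessPoolExecutor(max_workers=3) as pool:
        for path in pool.map(extract_range, ranges):
            print("wrote", path)
```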
Database Developer
HCL Technologies Limited
11.2013 - 06.2016
Implemented high-performance queries using PL/SQL, SQL, and T-SQL code with collections and BULK COLLECT, resulting in a 20% reduction in query execution time and improved overall database performance.
Successfully resolved daily support tickets by conducting data analysis and implementing bug fixes, ensuring timely delivery of solutions and maintaining a client satisfaction rate of 95%.
Maintained different versions of code using SVN (version control), leading to improved code management and collaboration efficiency within the development team.
Enhanced query performance by 25% through optimization techniques such as Explain Plan, Tkprof, and trace utility, resulting in faster data retrieval and processing times.
Developed complex batch processes in PL/SQL to handle large volumes of data, increasing data processing efficiency by 30% and reducing manual intervention.
Implemented database triggers to enforce integrity constraints and security measures, ensuring data consistency.
Created multiple shell scripts to automate batch report distribution to a wider audience, reducing manual effort by 50% and improving report delivery speed (see the distribution sketch below).
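A small Python sketch of the automated report-distribution idea from the final bullet (the original automation was written as shell scripts). The SMTP host, sender, recipients, and report path are hypothetical placeholders.

```python
"""Sketch: email a generated batch report to a distribution list.
Host, addresses, and file path are hypothetical examples."""
import smtplib
from email.message import EmailMessage
from pathlib import Path

SMTP_HOST = "smtp.example.internal"          # hypothetical mail relay
RECIPIENTS = ["ops-team@example.com"]        # hypothetical distribution list


def send_report(report_path: str) -> None:
    """Attach the report file and send it to the distribution list."""
    report = Path(report_path)
    msg = EmailMessage()
    msg["Subject"] = f"Daily batch report: {report.name}"
    msg["From"] = "batch-reports@example.com"
    msg["To"] = ", ".join(RECIPIENTS)
    msg.set_content("Attached is today's batch report.")
    msg.add_attachment(
        report.read_bytes(),
        maintype="application",
        subtype="octet-stream",
        filename=report.name,
    )
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    send_report("daily_batch_report.csv")
```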
Education
Bachelor of Engineering in Electronics & Communication
Bannari Amman Institute of Technology
04.2012
Skills
Data Analysis
Data Migration
Data Modeling
Data Cleaning
SQL
PL/SQL
GCP BigQuery & Cloud Storage
T-SQL
MySQL
Python
PostgreSQL
Microsoft SQL Server
Certification
Google Associate Cloud Engineer
Microsoft Azure Data Engineer
Microsoft Azure Data Fundamentals