Rahul Badoni

Barnehurst, London

Summary

Results-driven Senior Data Engineer with 15 years of experience, including over 4 years specializing in Azure-based data solutions and 11 years focused on Data Warehousing, ETL, and Oracle PL/SQL development. Expertise includes designing and implementing robust data models, lakehouse architectures, and comprehensive end-to-end ETL/ELT pipelines utilizing Azure Data Factory (ADF), Databricks (PySpark, Delta Lake), ADLS Gen2, and SQL. Proven track record in data migration, REST API integrations, and Power BI (DAX) data modeling, complemented by strong skills in Azure DevOps and Git-based version control to ensure data quality, compliance, and security. Committed to developing scalable, reliable, and business-driven data solutions that align seamlessly with organizational data strategies.

Overview

15 years of professional experience
1 Certification

Work History

Senior Data Engineer

LTIMindtree
11.2019 - 07.2025
  • Led the design and development of scalable Azure-based analytics and lakehouse platforms, significantly improving data processing performance and reliability.
  • Designed and implemented data models and ETL/ELT pipelines using Azure Data Factory (ADF), Azure Databricks (PySpark, Delta Lake), and Azure Data Lake Storage Gen2 within a Medallion architecture (Bronze, Silver, Gold) framework.
  • Developed dynamic, parameterized ADF pipelines with Linked Services, Datasets, Integration Runtimes, and Copy Activities for reusable, efficient, and maintainable data workflows.
  • Extracted and ingested data from REST APIs and on-premises Oracle systems using Self-hosted Integration Runtime (SHIR) to integrate disparate data sources into the Azure ecosystem.
  • Standardized cloud-based data warehousing and transformation processes, reducing development effort and accelerating project delivery timelines.
  • Optimized PySpark transformations in Databricks by leveraging Delta Lake, partitioning, caching, and cluster tuning, improving job performance by over 30%.
  • Implemented data validation, reconciliation, and automated quality checks using ADF, Databricks, and SQL, ensuring accuracy and reliability of production datasets (a short PySpark sketch of this pattern follows this list).
  • Integrated with Azure Key Vault for secure credential, secret, and token management, aligning with organizational data security and compliance standards.
  • Collaborated with data architects, analysts, and data scientists to translate analytical and reporting requirements into scalable, governed data models.
  • Designed, integrated, and optimized data pipelines across cloud and on-premises systems, ensuring seamless and reliable data flow across IT environments.
  • Implemented CI/CD pipelines using Azure DevOps and ARM templates to automate deployment of ADF, Databricks, and data infrastructure assets across environments.
  • Ensured adherence to data governance, GDPR, and organizational security policies, maintaining confidentiality and compliance across data environments.
  • Mentored and guided a team of Azure Data Engineers, facilitating Agile ceremonies, sprint planning, and technical reviews to improve team productivity and quality delivery.
  • Recommended and deployed new technologies and best practices for data engineering, continuously improving performance, scalability, and maintainability.
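
For illustration, a minimal PySpark sketch of the Bronze-to-Silver Medallion pattern referenced above. The storage paths, column names, and threshold are hypothetical placeholders rather than project code, and it assumes a Spark environment with Delta Lake support already configured.

```python
# Minimal sketch of a Bronze -> Silver Medallion step with Delta Lake.
# Paths, columns, and thresholds are hypothetical; on Databricks the
# SparkSession ("spark") and storage credentials would already exist.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze_path = "abfss://bronze@<storage-account>.dfs.core.windows.net/orders"
silver_path = "abfss://silver@<storage-account>.dfs.core.windows.net/orders"

# Read the raw Bronze Delta table.
bronze = spark.read.format("delta").load(bronze_path)

# Cleansing and typing for the Silver layer.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())           # reject rows with no key
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Simple reconciliation guard: fail fast if cleansing dropped too many rows.
in_count, out_count = bronze.count(), silver.count()
if out_count < in_count * 0.95:
    raise ValueError(f"Row loss too high: {in_count} -> {out_count}")

# Write partitioned Delta output; date partitions enable pruning downstream.
silver.write.format("delta").mode("overwrite") \
    .partitionBy("order_date").save(silver_path)
```

Partitioning by date keeps downstream queries prunable; the row-count guard stands in for the fuller reconciliation checks described above.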

Data Engineer

LTI - Larsen & Toubro Infotech
12.2015 - 10.2019
  • Performed requirement analysis, including direct interaction with business users.
  • Provided data profiling solutions for a strategic ledger accounting platform using Ab Initio/DQS.
  • Worked as a Technical Lead on a Data Warehousing/Enterprise Business Intelligence project.
  • Produced complex technical and design solutions in Ab Initio.
  • Worked extensively on all phases of the SDLC, from requirement gathering, analysis, design, development, and testing through support and deployment of Data Warehouse applications using Ab Initio and UNIX scripting.
  • Developed standards for coding, testing, debugging, and implementation.
  • Provided expertise in technical analysis and resolved technical issues during project delivery.
  • Read data from varied sources, including flat files, delimited files, XML files, and database tables; developed generic graphs and plans in Ab Initio.
  • Provided Level 3 support, resolving critical issues and ensuring system availability.
  • Worked with the DevOps team to release code into different program lanes.
  • Analyzed and fixed defects.
  • Ensured application development and enhancements were in line with Citi's coding standards.
  • Analyzed existing Autosys jobs to understand end-to-end processes built in Sybase IQ/ASE and planned the development of equivalent processes in Oracle PL/SQL.
  • Developed procedures in Oracle Database using Oracle SQL and PL/SQL (see the sketch after this list).
  • Designed packages, procedures, and functions in Oracle Database.
  • Scheduled jobs using the Autosys scheduler.
  • Worked actively on production release activities, ensuring planned runs completed successfully.
  • Mentored the team on impact analysis, development, and implementation.
  • Provided technical support to the team.
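
For illustration, a hypothetical sketch of invoking an Oracle stored procedure from Python via the python-oracledb driver. The connection details, procedure name, and parameters are invented, and the migration logic itself was written in PL/SQL rather than Python.

```python
# Hypothetical example of invoking an Oracle stored procedure from Python
# using the python-oracledb driver; credentials, DSN, and the procedure
# name are placeholders, not details of any real system.
import oracledb

with oracledb.connect(user="etl_user", password="***",
                      dsn="dbhost:1521/ORCLPDB1") as conn:
    with conn.cursor() as cur:
        rows_loaded = cur.var(int)  # OUT parameter for the loaded row count
        # Equivalent to: BEGIN load_daily_ledger(:run_date, :rows); END;
        cur.callproc("load_daily_ledger", ["2019-06-30", rows_loaded])
        print("rows loaded:", rows_loaded.getvalue())
    conn.commit()
```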

Software Engineer

Syntel
07.2010 - 12.2015
  • Developed clear technical designs and user requirement documents; understood technical and functional specifications.
  • Gathered and analyzed requirements, estimated effort for assigned tasks, and produced technical designs and task plans.
  • Designed and developed complex graphs and plans in Ab Initio.
  • Worked on the database side to store data pulled by Ab Initio.
  • Communicated regularly with customers on the status of issue resolution.
  • Advised customers on best practices for development and integration processes.
  • Handled L3 and warranty support after successful production go-lives.
  • Worked flexible hours during Data Warehouse, migration, and integration go-lives.
  • Optimized application architecture for better performance, making designs reliable and robust.
  • Guided the team at different stages and helped members resolve technical and business problems.
  • Performed component design, development, unit testing, troubleshooting, and debugging of the application.
  • Studied system flow, data usage, and work processes; investigated problem areas; followed the software development lifecycle.
  • Worked as an ETL developer (Data Warehouse specialist), enhancing the current application using the Ab Initio ETL tool (a rough analogue of this flat-file ETL pattern is sketched after this list).
  • Used UNIX scripting to create project scripts and incorporated them into Ab Initio plans and graphs.
  • Maintained version control of all code across the development, test, and production environments.
  • Gave knowledge transfers to new project joiners and distributed requirements efficiently among them.
  • Supervised all release processes, ensuring they ran smoothly during cutover hours.
  • Ensured deliveries were quality-driven, met timelines, and were handed over to the test team.
  • Participated in code reviews and test case reviews, ensuring developed code met requirements.
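
Ab Initio graphs are built in a proprietary environment and cannot be shown directly; as a loose Python analogue of the delimited-file ETL pattern above, the sketch below reads a pipe-delimited file, applies a simple reject rule, and loads a staging table. The file layout, table name, and SQLite target are hypothetical stand-ins.

```python
# Loose Python analogue of a delimited-file -> staging-table ETL step;
# the real work used Ab Initio graphs, so everything here is illustrative.
import csv
import sqlite3  # stand-in target; production used enterprise databases

def load_orders(path: str, conn: sqlite3.Connection) -> int:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders "
        "(order_id TEXT PRIMARY KEY, amount REAL, region TEXT)"
    )
    loaded = 0
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh, delimiter="|"):
            if not row["order_id"]:          # simple reject rule
                continue
            conn.execute(
                "INSERT OR REPLACE INTO stg_orders VALUES (?, ?, ?)",
                (row["order_id"], float(row["amount"]), row["region"].strip()),
            )
            loaded += 1
    conn.commit()
    return loaded

if __name__ == "__main__":
    with sqlite3.connect(":memory:") as conn:
        print("rows loaded:", load_orders("orders.psv", conn))
```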

Education

Master of Computer Applications (MCA) - Information Technology

Graphic Era University
Dehradun
05.2008

Skills

  • Azure Data Factory
  • Azure Databricks
  • Azure Monitoring
  • Azure Integration Runtime
  • Azure Blob Storage
  • Azure Data Lake Storage
  • Azure SQL Database
  • Linked Services
  • Datasets
  • Copy Activities
  • SQL data manipulation
  • Python
  • PySpark
  • Jenkins
  • REST API Integration
  • Data structure design
  • ETL
  • Data Warehousing
  • Performance Tuning
  • Data Integration
  • Lakehouse Architecture
  • Data Analysis
  • Data Validation
  • Data Quality
  • Data Management
  • Data Governance
  • Business Intelligence
  • Version Control (Git)
  • CI/CD Pipelines (Azure DevOps)
  • JIRA
  • Agile
  • Stored Procedures
  • Big Data
  • Problem-solving
  • Decision-making
  • Teamwork
  • Collaboration
  • Effective Communication
  • Relationship Building and Networking
  • Continuous Learning

Certification

AWS Certified Cloud Practitioner, Amazon Web Services Training and Certification, 12/01/22, www.credly.com/badges/9f1...

Languages

English
Full Professional
Hindi
Native or Bilingual

Personal Information

Visa Status: Skilled Dependent Visa holder with full work rights; immediately available for opportunities
