Data Engineer with 5 years of experience in designing and developing data pipelines and data-intensive applications. Highly enthusiastic, with exceptional algorithm development and problem-solving skills.
Over 18 years of solid IT development experience.
Betfred aims to establish a foundational data platform, moving from an SQL Server-based data warehouse to an AWS-based system with real-time and batch capabilities.
For Historical data, the aim is to facilitate seamless data migration and ensure timely data availability for initial data products and reports.
For Transactional data, the goal is to eliminate staging databases and create direct connections to transactional systems, thereby consolidating the data structure and creating an effective data platform.
A unified data model will consist of centralized entities to support multi-country and multi-franchise (channel) operations. The objective is to standardize data ingestion, transformation, and reporting, thereby providing consistency, reducing overhead, and eliminating redundancy.
Establish a Centralized Data Lake as the Unified Storage Layer in Amazon S3. DMS tasks are used for migrating data; Debezium and MSK are used for transactional data. Iceberg is used as the table format for data in S3. Iceberg enables schema evolution and partitioned table management, supporting long-term storage and efficient access to historical data.
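As an illustrative sketch of the raw storage layer (not the actual implementation), the following assumes a Glue Spark session with the Iceberg connector enabled and the Glue Data Catalog registered as the glue_catalog Iceberg catalog; database, table, column, and bucket names are placeholders.

# Minimal sketch: registering a raw-layer Iceberg table in S3 from a Glue Spark session.
# Assumes the job is started with the Iceberg connector and the Glue Data Catalog
# configured as the "glue_catalog" Iceberg catalog; all names below are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS glue_catalog.raw.betting_activity (
        bet_id      STRING,
        customer_id STRING,
        stake       DECIMAL(18, 2),
        placed_at   TIMESTAMP
    )
    USING iceberg
    PARTITIONED BY (days(placed_at))
    LOCATION 's3://example-data-lake/raw/betting_activity/'
""")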
Data Standardization, Transformation and Harmonization using Glue pipelines. Data from the raw layer is transformed through AWS Glue ETL jobs.
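A simplified Glue ETL sketch of the raw-to-curated step, assuming the Iceberg tables set up above; the transformation rules, table, and column names are illustrative only.

# Simplified Glue ETL sketch: read a raw-layer table, standardize columns,
# and write a curated Iceberg table. Table and column names are illustrative.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

raw = spark.table("glue_catalog.raw.betting_activity")

curated = (
    raw.dropDuplicates(["bet_id"])
       .withColumn("customer_id", F.upper(F.col("customer_id")))
       .withColumn("placed_date", F.to_date("placed_at"))
)

curated.writeTo("glue_catalog.curated.betting_activity").createOrReplace()

job.commit()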
Central Data Warehousing in Amazon Redshift. Load transformed data into Redshift, where each core data product (e.g., Customer, Betting Activity, Promotions) will be represented as unified tables.
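A hypothetical sketch of the load step, issuing a COPY from the curated S3 layer through the Redshift Data API; the cluster, database, schema, path, and IAM role names are placeholders.

# Hypothetical sketch: load curated S3 data into a unified Redshift table via COPY,
# submitted through the Redshift Data API. All identifiers below are placeholders.
import boto3

client = boto3.client("redshift-data")

client.execute_statement(
    ClusterIdentifier="example-dwh",
    Database="analytics",
    DbUser="etl_user",
    Sql="""
        COPY dwh.customer
        FROM 's3://example-data-lake/curated/customer/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS PARQUET;
    """,
)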
Data Governance and Access Control. Apply governance policies using AWS Lake Formation and IAM roles.
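As an illustration of such a policy, the sketch below grants column-level read access on a curated table through Lake Formation; the role ARN, database, table, and column names are assumptions.

# Hypothetical sketch: grant column-level SELECT on a curated table via Lake Formation.
# Role ARN, database, table, and column names are placeholders.
import boto3

lf = boto3.client("lakeformation")

lf.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/analyst"},
    Resource={
        "TableWithColumns": {
            "DatabaseName": "curated",
            "Name": "customer",
            "ColumnNames": ["customer_id", "country", "registration_date"],
        }
    },
    Permissions=["SELECT"],
)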
Build the Common Data Model (CDM) for Data Products to standardize entity definitions (e.g., Customer, Bet, Promotion) and their attributes.
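To illustrate what a standardized entity definition might look like, here is a hypothetical sketch of a Customer entity in the CDM; the attributes shown are assumptions, not the actual model.

# Hypothetical sketch of a CDM entity definition; attributes are illustrative only.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class Customer:
    customer_id: str          # surrogate key shared across source systems
    country_code: str         # supports multi-country operations
    franchise: str            # channel / franchise, e.g. retail or online
    registration_date: date
    marketing_opt_in: bool
    loyalty_tier: Optional[str] = None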
Implement the Data Product Layer for Self-Service Analytics in QuickSight
The VBIT (Vodafone Business IT) program consists of migrating Vodafone Business workloads from multiple sources to the GCP Datahub, which will gradually be used for analytics, BI use cases, and other Vodafone Business use cases.
Design and develop batch pipelines using Dynamo and Data Fusion to consume the CPQ API, apply transformations, and write to BigQuery.
Design and develop new Dynamo templates and plugins.
Develop Composer DAGs to orchestrate Dynamo pipelines.
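A minimal Composer (Airflow) DAG sketch of such an orchestration; since Dynamo is an internal framework, the trigger step is represented by a placeholder PythonOperator, and the DAG id, schedule, and task names are assumptions.

# Minimal Composer (Airflow) DAG sketch. Dynamo is an internal framework, so the
# trigger step below is a placeholder; DAG id, schedule, and task names are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def trigger_dynamo_pipeline(**context):
    """Placeholder for the call that launches the Dynamo / Data Fusion pipeline."""
    print("Triggering CPQ batch pipeline for", context["ds"])


with DAG(
    dag_id="vbit_cpq_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_pipeline = PythonOperator(
        task_id="trigger_dynamo_pipeline",
        python_callable=trigger_dynamo_pipeline,
    )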
The CIAS (Cognitive Intelligence and Automation) project is a Europe-wide automation platform where more than 80% of first-line support is automated. The GCP project was created to replace expensive Cloudera analytics with GCP data ingestion and analytics.
Involved in the design and development of Cloud Functions to process NiFi checksum and data files (see the sketch after this list).
Designed and developed a DAG for post-processing.
Configured and deployed BIF for NiFi-to-GCP data ingestion.
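The sketch referenced above: a hypothetical GCS-triggered Cloud Function that validates a NiFi-delivered data file against a companion .md5 checksum file; the bucket layout, file naming, and checksum convention are assumptions, not the actual implementation.

# Hypothetical sketch: GCS-triggered Cloud Function validating a NiFi-delivered data file
# against its companion .md5 checksum file. Naming conventions are assumptions.
import hashlib

from google.cloud import storage


def validate_checksum(event, context):
    """Triggered when an object is finalized; checksum files themselves are skipped."""
    bucket_name = event["bucket"]
    name = event["name"]
    if name.endswith(".md5"):
        return

    client = storage.Client()
    bucket = client.bucket(bucket_name)

    data = bucket.blob(name).download_as_bytes()
    expected = bucket.blob(name + ".md5").download_as_text().strip()

    actual = hashlib.md5(data).hexdigest()
    if actual != expected:
        raise ValueError(f"Checksum mismatch for {name}: {actual} != {expected}")

    print(f"Checksum OK for {name}")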
The VBIT (Vodafone Business IT) program consists of migrating Vodafone Business workloads from multiple sources to the GCP Datahub platform, which will gradually be used for analytics, BI use cases, and other Vodafone Business use cases.
The Work Force Management (WFM) program consists of migrating agent information pertaining to voice, chat, and email data from several systems in the Local Market into a single cloud-based system.
The Neuron EDS (NEDS) solution is a consolidated, centralized platform for onboarding external data sources. External Data Sources (EDS) refers to data coming from non-Vodafone sources.
Digital Twins (DT) are simulations that represent the current state of an asset or process. Data is collected regularly to update this virtual replica. DT provides an understanding of current and predicted future performance under different levers, tactics, and scenarios.
BC Ferries is one of the largest ferry systems in the world.
The project involves implementing a new e-commerce platform based on Hybris. Involved in the implementation of Ferry and Vacation booking, integration with FFDEI Microservices, and the manage-booking, cancellation, and refund processes.
Stack - Java, Spring 4.x, Hybris 6.x, REST APIs, Kafka, Microservices, Tomcat, Maven, MySQL, Solr, Git
easyJet is a British low-cost carrier headquartered at London Luton Airport.
This project, called FCP (Future Commerce Platform), is the re-platforming of the current easyJet commercial platform.
Involved in the design and implementation of Spring-based REST APIs consumed by various channels, including the customer-facing website, Airport Helpdesk, Customer Services, and B2B clients.
Stack - Java 8, Spring 4.x, Hybris 6.x, REST APIs, Tomcat, Maven, MySQL, Solr, ActiveMQ, Bamboo, Git
EE is the largest mobile network operator in the UK.
This project involves a new e-commerce implementation of the EE website.
GCP - Data Fusion, BigQuery, Dataflow (Apache Beam), Pub/Sub, Composer, Cloud Functions, Google Cloud Storage, Cloud Dataproc, Cloud SQL, Data Studio, Dynamo, CDAP, Vault
Sun Certified Java Professional (SCJP)