
Kishore Padman

Data Engineer
Carshalton, London

Summary

Data Engineer with 5 years of experience designing and developing data pipelines and data-intensive applications, backed by over 18 years of solid IT development experience. Highly enthusiastic, with exceptional algorithm development and problem-solving skills.

Overview

21 years of professional experience
5 certificates

Work History

AWS Data Engineer

Betfred
London, United Kingdom
10.2024 - Current

Betfred aims to establish a foundational data platform, moving from a SQL Server-based data warehouse to an AWS-based system with real-time and batch capabilities. For historical data, the aim is to facilitate seamless migration and ensure timely data availability for initial data products and reports. For transactional data, the goal is to eliminate staging databases and create direct connections to transactional systems, consolidating the data structure into an effective data platform. A unified data model of centralized entities supports multi-country and multi-franchise (channel) operations; the objective is to standardize data ingestion, transformation, and reporting, providing consistency, reducing overhead, and eliminating redundancy.

  • Establish a centralized data lake as the unified storage layer in Amazon S3. DMS tasks migrate historical data; Debezium and MSK handle transactional data. Iceberg is the table format for data in S3, enabling schema evolution and partitioned table management for long-term storage and efficient access to historical data
  • Data standardization, transformation, and harmonization using Glue pipelines: data from the raw layer is transformed through AWS Glue ETL jobs (illustrated in the sketch after this list)
  • Central data warehousing in Amazon Redshift: load transformed data into Redshift, where each core data product (e.g., Customer, Betting Activity, Promotions) is represented as unified tables
  • Data governance and access control: apply governance policies using AWS Lake Formation and IAM roles
  • Build the Common Data Model (CDM) for data products to standardize entity definitions (e.g., Customer, Bet, Promotion) and their attributes
  • Implement the data product layer for self-service analytics in QuickSight
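As a minimal sketch of the raw-to-curated Glue step above (not the actual Betfred code; the bucket, catalog, table, and column names are hypothetical), a Glue-style PySpark job writing a partitioned Iceberg table might look like this:

# Sketch of a Glue-style PySpark job: read raw data landed by DMS/Debezium
# from S3, standardize it, and write a partitioned Iceberg table.
# All names (bucket, catalog, database, table, columns) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("raw_to_iceberg")
    # Register an Iceberg catalog backed by the Glue Data Catalog;
    # Glue 4.0 jobs expose these same settings via --datalake-formats.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-datalake/warehouse/")
    .getOrCreate()
)

# Raw layer: CDC/batch output from DMS tasks and Debezium topics.
raw = spark.read.parquet("s3://example-datalake/raw/bets/")

# Standardization: enforce the types expected by the unified data model.
curated = (
    raw.withColumn("bet_ts", F.to_timestamp("bet_ts"))
       .withColumn("stake", F.col("stake").cast("decimal(12,2)"))
)

# Iceberg handles schema evolution and partition management on write.
(curated.writeTo("lake.betting.bets")
        .partitionedBy(F.days("bet_ts"))
        .createOrReplace())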

GCP Data Engineer

Vodafone UK
Newbury, United Kingdom
03.2024 - 08.2024

The VBIT (Vodafone Business IT) program migrates Vodafone Business workloads from multiple sources to the GCP Datahub, which is gradually adopted for analytics, BI, and other Vodafone Business use cases.

  • Design and develop batch pipelines using Dynamo and Data Fusion to consume the CPQ API, apply transformations, and write to BigQuery
  • Design and develop new Dynamo templates and plugins
  • Develop Composer DAGs to process Dynamo pipelines (see the sketch after this list)
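A minimal sketch of such a Composer DAG, assuming a hypothetical Data Fusion instance, region, and pipeline name (not the actual VBIT values):

# Airflow (Cloud Composer) DAG that triggers a deployed Data Fusion
# batch pipeline on a daily schedule. All names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)

with DAG(
    dag_id="cpq_to_bq_daily",
    start_date=datetime(2024, 3, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    start_pipeline = CloudDataFusionStartPipelineOperator(
        task_id="start_cpq_pipeline",
        instance_name="datafusion-dev",   # Data Fusion instance (hypothetical)
        location="europe-west2",          # GCP region (hypothetical)
        pipeline_name="cpq_api_to_bq",    # deployed batch pipeline (hypothetical)
    )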

GCP Data Engineer

Vodafone UK
London, United Kingdom
10.2023 - 03.2024

The CIAS (Cognitive Intelligence and Automation) project is a Europe-wide automation platform where more than 80% of first-line support is automated. The GCP project was created to replace expensive Cloudera analytics with GCP data ingestion and analytics.

  • Involved in the design and development of Cloud Functions to process NiFi checksum and data files (see the sketch after this list)
  • Designed and developed DAGs for post-processing
  • Configured and deployed BIF for NiFi-to-GCP data ingestion
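A minimal sketch of such a Cloud Function, assuming a GCS finalize trigger and a hypothetical ".md5" sidecar convention for the NiFi checksum files:

# Background Cloud Function triggered when an object lands in GCS:
# validate a data file against its NiFi-produced MD5 checksum sidecar.
import hashlib

from google.cloud import storage

def validate_checksum(event, context):
    """Triggered by a GCS object finalize event (gen-1 background function)."""
    bucket_name, blob_name = event["bucket"], event["name"]
    if not blob_name.endswith(".md5"):
        return  # only react to checksum sidecar files

    client = storage.Client()
    bucket = client.bucket(bucket_name)

    expected = bucket.blob(blob_name).download_as_text().strip()
    data_blob = bucket.blob(blob_name[: -len(".md5")])

    digest = hashlib.md5(data_blob.download_as_bytes()).hexdigest()
    if digest != expected:
        raise ValueError(f"Checksum mismatch for {data_blob.name}")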

GCP Data Engineer

Vodafone UK
08.2022 - 10.2023

The VBIT (Vodafone Business IT) program migrates Vodafone Business workloads from multiple sources to the GCP Datahub platform, which is gradually adopted for analytics, BI, and other Vodafone Business use cases.

  • Design and develop batch pipelines using Dynamo to process files from SFTP and Oracle, apply transformations, and write to BigQuery (see the sketch after this list)
  • Design and develop a new Dynamo template for the Oracle use case
  • Develop Data Fusion plugins for checksum validation and similar tasks
  • Design and develop EIM processing and transformation
  • Develop Composer DAGs to process Dynamo pipelines
  • Support the Intelligent Pricing application deployment in GCP
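A minimal sketch of the SFTP-to-BigQuery flow behind the first bullet, with paramiko and the BigQuery client standing in for the actual Dynamo pipeline; hostnames, paths, and table names are hypothetical:

# Pull a CSV from an SFTP drop and load it into a BigQuery table.
import paramiko
from google.cloud import bigquery

def load_sftp_file(host: str, user: str, key_path: str, remote_path: str,
                   table_id: str) -> None:
    # Download the file from the SFTP server to local disk.
    key = paramiko.RSAKey.from_private_key_file(key_path)
    transport = paramiko.Transport((host, 22))
    transport.connect(username=user, pkey=key)
    sftp = paramiko.SFTPClient.from_transport(transport)
    local_path = "/tmp/landing.csv"
    sftp.get(remote_path, local_path)
    transport.close()

    # Load the CSV into BigQuery, letting the service infer the schema.
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    with open(local_path, "rb") as fh:
        client.load_table_from_file(fh, table_id, job_config=job_config).result()

load_sftp_file("sftp.example.com", "svc_user", "/secrets/id_rsa",
               "/outbound/orders.csv", "example-project.dataset.orders")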

GCP Data Engineer

Vodafone UK
01.2021 - 08.2022

The Work Force Management (WFM) program migrates agent information pertaining to voice, chat, and email data, coming from several systems in the local market, into one single cloud-based system.

  • Design and develop Data Fusion batch pipelines to process files from various sources and write to the Verint SFTP server; pipelines involve complex transformations, calculations, encryption, and decryption
  • Develop Data Fusion plugins for GPG encryption/decryption, Vault connectivity, SFTP integration, Amazon S3 integration, etc. (see the sketch after this list)
  • Design and develop Data Fusion streaming pipelines to stream real-time data from Pub/Sub, apply transformations, and write to the Verint SFTP server
  • Design and develop a Dataflow job to stream real-time data from an API to Google Cloud Pub/Sub
  • Build Jenkins pipelines for CI/CD
  • Create Collibra configuration files for data dictionary management
  • Provide KT and training for the support team
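A minimal sketch of the encrypt/decrypt logic the GPG plugins wrap; the real plugins are CDAP/Java, so python-gnupg stands in here, and the recipients, paths, and passphrase are hypothetical:

# Encrypt an outbound extract and decrypt an inbound file with GPG.
# Assumes the relevant keys are already loaded in the keyring.
import gnupg

gpg = gnupg.GPG(gnupghome="/tmp/gnupg")

# Encrypt an extract for the recipient before dropping it on SFTP.
with open("/tmp/agents.csv", "rb") as fh:
    encrypted = gpg.encrypt_file(fh, recipients=["ops@example.com"],
                                 output="/tmp/agents.csv.gpg")
assert encrypted.ok, encrypted.status

# Decrypt an inbound file using the private key in the keyring.
with open("/tmp/agents.csv.gpg", "rb") as fh:
    decrypted = gpg.decrypt_file(fh, passphrase="example-passphrase",
                                 output="/tmp/agents_plain.csv")
assert decrypted.ok, decrypted.status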

GCP Data Engineer

Vodafone UK
11.2021 - 08.2022

The Neuron EDS (NEDS) solution is a consolidated and centralized platform to onboard external data sources (EDS), i.e., data coming from non-Vodafone sources.

  • Design and develop Data Fusion batch pipelines to process files from SFTP sources, apply transformations, and write to BigQuery and GCS
  • Design and develop Data Fusion batch pipelines to process real-time data from APIs, apply transformations, and write to BigQuery and GCS
  • Pipelines involve complex transformations and pseudonymisation using the HPE Voltage plugin (see the sketch after this list)
  • Design and develop DAGs in Airflow (Cloud Composer)
  • Build Jenkins pipelines for CI/CD
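A minimal sketch of the pseudonymisation idea, with a keyed HMAC-SHA256 standing in for the proprietary HPE Voltage format-preserving encryption; the key and field names are hypothetical:

# Deterministic field-level pseudonymisation: the same input always maps
# to the same token, so pseudonymised columns remain joinable.
import hashlib
import hmac

SECRET_KEY = b"example-rotation-managed-key"  # in practice, fetched from Vault

def pseudonymise(value: str) -> str:
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"msisdn": "447700900123", "plan": "business-5g"}
record["msisdn"] = pseudonymise(record["msisdn"])
print(record)  # {'msisdn': '<64-char token>', 'plan': 'business-5g'}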

GCP Data Engineer

Vodafone UK
08.2020 - 12.2020

A Digital Twin (DT) is a simulation that represents the current state of an asset or process. Data is regularly collected to keep this virtual replica up to date. The DT provides an understanding of current and predicted future performance, given the application of different levers, tactics, and scenarios.

  • Create BigQuery and Cloud SQL schemas based on the requirements
  • Configure NiFi services to move files from the local market to the cloud storage location
  • Design and develop a Dataflow job using BIF to load data from Cloud Storage into BigQuery; the job validates files, transforms the data, and writes to BigQuery (see the sketch after this list)
  • Configure Cloud Composer to trigger the Dataflow job on a schedule
  • Provide support for pipeline deployment
  • Create Collibra configuration files for data dictionary management
  • Provide KT and training for the support team
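A minimal Apache Beam sketch of that validate-and-load flow, assuming hypothetical bucket, table, and field names:

# Dataflow-style Beam pipeline: read CSV lines from Cloud Storage, drop rows
# that fail validation, and append valid rows to BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_and_validate(line: str):
    """Yield a BigQuery row dict for valid lines; silently skip bad ones."""
    parts = line.split(",")
    if len(parts) != 3 or not parts[0]:
        return  # malformed row
    try:
        yield {"asset_id": parts[0], "reading": float(parts[1]), "ts": parts[2]}
    except ValueError:
        return  # non-numeric reading

with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "ReadFromGCS" >> beam.io.ReadFromText(
            "gs://example-dt-landing/readings/*.csv", skip_header_lines=1)
        | "Validate" >> beam.FlatMap(parse_and_validate)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:dt_dataset.readings",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )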

Senior Developer

BC Ferries
10.2018 - 08.2020

BC Ferries is one of the largest ferry systems in the world.

The project involves implementing a new e-commerce platform based on Hybris. Involved in the implementation of ferry and vacation booking, integration with FFDEI microservices, and managing the booking, cancellation, and refund processes.

Stack: Java, Spring 4.x, Hybris 6.x, REST APIs, Kafka, Microservices, Tomcat, Maven, MySQL, Solr, Git

Senior Developer

Portaltech Reply, London
07.2014 - 10.2018

easyJet is a British low-cost airline headquartered in London.

This project, FCP (Future Commerce Platform), is the re-platforming of the current easyJet commercial platform.

Involved in the design and implementation of Spring-based REST APIs consumed by various channels, including the customer-facing website, airport helpdesk, customer services, and B2B clients.

Stack: Java 8, Spring 4.x, Hybris 6.x, REST APIs, Tomcat, Maven, MySQL, Solr, ActiveMQ, Bamboo, Git

Senior Developer

BAE Systems, London
06.2013 - 04.2014

EE is the largest mobile network operator in the UK.

This project involves a new e-commerce implementation of the EE website.

Senior Developer

Greenlight Digital, London
03.2013 - 06.2013

Senior Developer

Conexus, Basingstoke
07.2012 - 03.2013

Senior Developer

Portaltech, London
06.2010 - 05.2012

Java Developer

Target Limited, USA (Deputation from TCS)
08.2006 - 03.2010

Java Developer

Cognizant Technology Solutions, India
07.2004 - 07.2006

Education

Cochin University of Science and Technology (CUSAT), India

Skills

    GCP - Data Fusion, BigQuery, Dataflow (Apache Beam), Pub/Sub, Composer, Cloud Functions, Google Cloud Storage, Cloud Dataproc, Cloud SQL, Data Studio, Dynamo, CDAP, Vault

Certification

Sun Certified Java Programmer (SCJP)

Timeline

AWS Data Engineer

Betfred
10.2024 - Current

GCP Data Engineer

Vodafone UK
03.2024 - 08.2024

GCP Data Engineer

Vodafone UK
10.2023 - 03.2024

GCP Data Engineer

Vodafone UK
08.2022 - 10.2023

GCP Data Engineer

Vodafone UK
11.2021 - 08.2022

GCP Data Engineer

Vodafone UK
01.2021 - 08.2022

GCP Data Engineer

Vodafone UK
08.2020 - 12.2020

Senior Developer

BC Ferries
10.2018 - 08.2020

Senior Developer

Portaltech Reply, London
07.2014 - 10.2018

Senior Developer

BAE Systems, London
06.2013 - 04.2014

Senior Developer

Greenlight Digital, London
03.2013 - 06.2013

Senior Developer

Conexus, Basingstoke
07.2012 - 03.2013

Senior Developer

Portaltech, London
06.2010 - 05.2012

Java Developer

Target Limited, USA (Deputation from TCS)
08.2006 - 03.2010

Java Developer

Cognizant Technology Solutions, India
07.2004 - 07.2006

Cochin University of Science and Technology (CUSAT), India