
Vinay Varsani

Wembley

Summary

Data Engineering consultant with over 5 years' experience delivering robust data pipelines and digital products across multiple use cases, using a wide range of technologies. Highly experienced in leveraging Azure services to optimise data solutions and drive business value.

Overview

6 years of professional experience

Work history

Senior Data Consultant

Mesh-AI
London
02.2023 - Current

Client: Liberty Specialty Markets

Led delivery of a Pricing Data Mart on Redshift, targeting loss-ratio reduction with estimated savings of $10 million.

  • Led the design, architecture and delivery of ingestion pipelines on AWS, using Step Functions, Lambdas and multiple API calls to hydrate the data.
  • Deployed all infrastructure and pipeline code via CDK with TypeScript.
  • Collaborated with CDO and stakeholders to define requirements and drive successful delivery.
  • Modelled data in Redshift with dbt.


Client: National Grid

Built digital products for regulatory reporting and outage planning.

  • Led backend development as a Python software engineer, building FastAPI services.
  • Emphasised software engineering best practices, including scalability and thorough testing.
  • Built a Python library to automate Capex reporting.
  • Worked with the business to understand requirements and shape product delivery through iterative development.


Client: Total Energies

Led technical delivery of a new Data Mesh platform.

  • Built a greenfield Azure data platform using Terraform and implemented CI/CD with GitHub Actions.
  • Developed a carbon emissions data product using Azure Functions and Azure Data Factory, significantly improving reporting accuracy.
  • Placed key emphasis on data governance with Azure Purview.
  • Conducted extensive client engagement to understand data challenges and deliver tailored solutions.
  • Up-skilled the client team and onboarded them onto the new platform.

Consultant

Infinity Works (Part of Accenture)
London
10.2021 - 02.2023

Client: Imperial Health Care Trust

Delivered a data migration onto Azure/Snowflake.

  • Key emphasis placed on security during all stages of development.
  • Migrated transformations from SSIS and MSSQL stored procedures into dbt.
  • Built ADF pipelines for ELT jobs into Snowflake.
  • Orchestrated dbt through a Docker image run in an Azure container, triggered via ADF.
  • Built a greenfield cloud platform around dbt.
  • Led multiple requirements gathering workshops and scope feasibility assessments to agree on product deliverables.
  • Up-skilled client engineers through formal knowledge transfers and peer programming exercises.
  • Developed CI/CD pipelines through Azure DevOps.

Client: AICPA

Delivered a Data Platform to enable reporting and analytics.

  • Data Warehouse built with Snowflake.
  • Data transformations and data models delivered using dbt.
  • IaC approach with platform infrastructure managed and deployed using Terraform.
  • Integrated Fivetran/Function Apps to perform Extract/Load into Snowflake (deployed using Terraform) from multiple source systems.
  • Up-skilled client support team with the aforementioned technologies.
  • Developed CI/CD pipelines through Azure DevOps.

Data Engineer

AXA UK
London
04.2020 - 10.2021
  • Built and maintained data pipelines using PySpark for machine-learning models and Power BI dashboards.
  • Worked with data scientists on feature engineering, model build and model deployment.
  • Deployed models via a Flask API; acted as lead contact between analytics and IT.
  • Served as technical lead on the data engineering framework (Kedro).
  • Led a POC using MLflow and Azure ML to improve the end-to-end model build.
  • Built an in-house automation and pipeline-monitoring framework.
  • Built a .NET C# ingestion pipeline to process emails from blob storage for model build, including an anonymisation solution using a PySpark UDF.
  • Built Azure DevOps release pipelines for CI/CD.

Data Engineer

Tata Consultancy Services | Lloyds Banking Group
London
08.2019 - 04.2020
  • Primarily worked on ingestion into the data lake (Hadoop) and data warehouse (Hive), gaining strong programming skills in Scala, Java and shell scripting.
  • Completed stories using Scala, Java and Unix, continually developing new skills and best practices.
  • Performed unit testing and non-functional testing on high-volume data.
  • Utilised Spark to perform transformations in both Java and Scala.
  • Primarily used Hive and HBase to store, validate and transform data.

Business Analyst/Product Owner

Tata Consultancy Services | Lloyds Banking Group
London
03.2019 - 08.2019
  • Responsible for an Agile scrum team.
  • Communicated with stakeholders to prioritise stories, then refined them with the scrum team.
  • Led scrum ceremonies including stand-ups, backlog refinement, sprint reviews and retrospectives.
  • Responsible for resolving blockers.

Education

BSc - Physics

University College London
London
06.2018

Skills

  • Data Engineering: Python, SQL, Azure, Databricks, PySpark, Snowflake, Terraform, dbt
  • DevOps: Git, CI/CD, Azure DevOps, Docker, GitHub Actions
  • Tools & Frameworks: FastAPI, Kedro, MLflow

Interests

  • Ranked LTA tennis player, winning three Grade 4 tournaments.
  • Climbed Mount Kilimanjaro for charity.
  • Recently completed the Yorkshire Three Peaks challenge.
  • Keen footballer.

Timeline

Senior Data Consultant

Mesh-AI
02.2023 - Current

Consultant

Infinity Works (Part of Accenture)
10.2021 - 02.2023

Data Engineer

AXA UK
04.2020 - 10.2021

Data Engineer

Tata Consultancy Services | Lloyds Banking Group
08.2019 - 04.2020

Business Analyst/Product Owner

Tata Consultancy Services | Lloyds Banking Group
03.2019 - 08.2019

BSc - Physics

University College London