Vinaykumar Hiremath

Barking, United Kingdom

Summary

Accomplished Azure Data and AI Architect with extensive expertise in programming languages such as Python, PySpark, and SQL, and a strong foundation in data engineering practices including ETL, ELT, and data architecture. Proficient in leveraging advanced technologies such as Azure Databricks and AWS to deliver efficient data solutions. Demonstrates domain expertise across the MLEU, RCGTH, BFSI, and PS sectors. Skilled in implementing generative AI models and platforms such as GPT-4, Anthropic Claude, Genie, and Azure AI Foundry to enhance business intelligence capabilities. Committed to utilising agile methodologies to deliver innovative solutions that align with organisational goals.

Overview

9 years of professional experience
4 years of post-secondary education
1 certification

Work history

Databricks Solution Architect

Public Sector
London, UK
09.2025 - 11.2025
  • Facilitated implementation of Medallion Architecture and Unity Catalog workspaces with the Platform Architecture team.
  • Led discussions with stakeholders to gather insights on services and existing information sources.
  • Developed and proposed solution architecture for Azure Databricks, guiding the implementation process.
  • Implemented Databricks Medallion Architecture, enhancing data processing capabilities.
  • Implemented data sharing and network security architecture to enable secure data access.
  • Analysed existing information sources and conducted data cleanup to improve quality.
  • Recommended establishment of data squad team to implement Dataflow Pipelines in Azure architecture.
  • Collaborated with business analysts and service designers to compile findings and propose delivery directory maintenance options.

Databricks Data Architect

Utilities
London, UK
01.2025 - 11.2025
  • Led discussions with stakeholders and project teams to identify services, activities, and information sources.
  • Proposed solution architecture for Azure Databricks and collaborated on implementation.
  • Developed Databricks DLT metadata framework to enhance data management.
  • Analysed existing information sources and executed cleanup for improved efficiency.
  • Collaborated with business analysts, user researchers, and service designers to compile findings.
  • Presented options for maintaining delivery directory based on comprehensive analysis.
  • Migrated legacy systems to the cloud platform, ensuring minimal impact on business logic.

Databricks Data Architect

Liberty Mutual Insurance
London, UK
01.2020 - 01.2025
  • Built generic ETL pipelines for data ingestion and loading on Azure Databricks.
  • Facilitated communication with business users, analysts, architects, and product owners for seamless project execution.
  • Developed Azure Data Factory pipelines and Databricks jobs, leveraging notebooks and delta tables.
  • Orchestrated deployment processes through Azure DevOps and managed code repositories in GitLab.
  • Automated data quality checks to maintain accuracy and integrity across the data lifecycle.
  • Mentored new team members in Scaled Agile Framework practices.
  • Collaborated effectively within Global Risk Solutions Project at Liberty Mutual Insurance.

Databricks Data Architect

Matalan Retail
Liverpool, UK
02.2018 - 12.2019
  • Technology: Microsoft Azure Databricks (Unity Catalog & PySpark), Azure Data Lake Storage, ADF, Snowflake, SQL Server, SFTP, DBT.
  • Description: Worked on a data modernisation project migrating data from an on-premises SAS database to the Azure cloud.
  • Led discussions with business stakeholders and the enterprise architect to understand the current system's data, dependencies, processes, business rules, and transformation rules, and to shape a plan and solution for migrating to the Azure cloud platform following the medallion architecture, using Azure Data Lake, Azure Databricks, and Snowflake to process and consume the data.
  • Worked with the cloud platform team to provision the environment, enable connectivity between services, and set up access for the project.
  • Worked with other vendor teams on data and technical requirements.
  • Owned data migration discovery, estimation, end-to-end solution design, and coordination.
  • Helped the team resolve technical challenges during development, resolve dependencies, raise risks, and identify mitigations to ensure development completed on time.
  • Handled release management and worked with business SMEs on UAT, post-production defect fixing, and warranty closure.
  • Collaborated closely with cross-functional teams to define technical requirements for new initiatives.

Lead Data Engineer

ANZ Bank
Sydney, Australia
03.2017 - 02.2018
  • Environment: Azure, MSBI, SQL Server
  • Description: The Singapore regulatory reporting project addressed the initial tranche of fifteen returns required by the Monetary Authority of Singapore, enabling the reporting team to generate and submit the reports to the regulator.
  • Designed and implemented end-to-end data solutions on Azure, including data ingestion, transformation, and storage processes, resulting in improved data accessibility and analysis capabilities.
  • Developed data integration pipelines using Azure Data Factory to extract, transform, and load data from diverse sources, ensuring seamless data flow and integrity.
  • Implemented data transformation logic using SQL, stored procedures, and SSIS, enabling efficient data processing, data enrichment, and data quality checks.

Data Engineer

Microsoft Corporation
08.2016 - 02.2017
  • Developed tabular model reports for US and 11 regions in GPF for financial planning.
  • Analysed LLG impact on views over base tables, delivering a comprehensive checklist to business stakeholders.
  • Modified load processes based on new LLG mapping table for finance geography dimension.
  • Reviewed consumer database views to assess LLG impact and prepared detailed spreadsheets for onsite coordinator.
  • Developed robust ETL pipelines to facilitate quicker data retrieval.
  • Mentored junior engineers on best practices in big-data handling, nurturing their professional growth along the way.
  • Migrated legacy systems to cloud platforms, resulting in increased scalability and reliability.

Education

Bachelor of Engineering - Electronics and Communications Engineering

Visvesvaraya Technological University
Belagavi Karnataka
06.2004 - 03.2008

Skills

  • Programming Languages: Python, PySpark, SQL, SOQL, Databricks SQL, Spark SQL, Apache Spark, R
  • Data Engineering: ETL, ELT, Data Ingestion Framework, Data Migration, Data Orchestration, Data Validation, DLT, Delta Lake, Databricks Workflows, Lakeflow, Declarative Pipelines
  • Technology Expertise: Azure Databricks, Unity Catalog, Databricks Lakehouse, Azure Synapse Analytics, Azure Data Factory, dbt Labs, RStudio, Posit, Azure DevOps, GitHub
  • Domain Expertise: MLEU, RCGTH, BFSI, PS
  • Generative AI: GPT-4, RAG models, AWS Nova Pro, Anthropic Claude Sonnet, Genie, AgentBricks, AgentCore
  • Methodology: Agile, Kanban

Certification

  • Azure Certifications: AZ-900 (Azure Fundamentals), DP-203 (Azure Data Engineer Associate), DP-700 (Fabric Data Engineer Associate)
  • Databricks Certifications: Databricks Lakehouse Fundamentals, Data Engineer Associate, Data Engineer Professional, Solution Architect Essentials, Solution Architect Champion
  • AWS Certifications: AWS Cloud Practitioner, AWS Data Engineer Associate
  • Gen AI Certifications: AI-900 (Azure AI Fundamentals), Databricks Gen AI Fundamentals, Databricks Gen AI Engineer Associate

Accomplishments

  • Successfully designed and delivered data platform in Azure for a large manufacturing and retail customer.
  • Winner of the AWS GPL Hackathon held in London, demonstrating innovative problem-solving and cloud architecture skills in a competitive environment.
  • Actively participated in Cognizant's Vibe Coding Week, contributing to the Guinness World Record for the largest online generative AI-assisted hackathon. Developed an innovative prototype as part of a global initiative to democratize AI innovation and accelerate enterprise-wide transformation.
  • Actively engaged with the Databricks community through initiatives such as staffing, training, and team motivation, while also participating in global events including meetups, technical sessions, summits, and the Databricks World Tour to foster collaboration and drive innovation.

Affiliations

  • Tennis, swimming, cricket

Languages

English
Fluent
