
Bhanu Chunduri

Berkshire

Summary

Azure-certified DevOps & DataOps engineer, experienced in building and operationalising large-scale enterprise cloud solutions, data lakes, and applications

Overview

8 years of professional experience
3 Certifications

Work History

DevOps Cloud Data Engineer (Azure)

End Client: UK Public Sector (DWP, MoJ, MoD)
12.2020 - Current
  • Designed and implemented data storage and data processing using Azure Data Factory (ADF) pipelines
  • Implemented and optimised data pipelines connecting operational systems and data to analytics and BI systems within Azure
  • Strong, in-depth hands-on experience with Microsoft Azure in building, deployment, and automation, along with networking and security components in Azure cloud environments
  • Conducted complex data mapping and built scalable Azure data pipelines to ensure data integrity and efficient processing
  • Engaged with stakeholders to gather requirements, communicate progress, and ensure alignment with business objectives
  • Built an Azure Synapse Analytics data warehousing platform that enables multiple tenants to ingest and transform data
  • In-depth knowledge of SQL, Python scripting, and Azure platform systems, including SQL Managed Instance (SQL MI)
  • Skilled in troubleshooting data conversion issues and optimizing performance during database transitions
  • Delivered end-to-end data clusters on the Azure cloud platform using Azure DevOps (ADO)
  • Expertly managed and deployed VMware Cloud Foundation (VCF) for scalable and secure cloud infrastructure
  • Implemented VCF to streamline operations, enhance data center efficiency, and support multi-cloud environments
  • Proficient in multiple Azure services, including Azure Storage accounts, Azure SQL, and Azure AD
  • Delivered Azure infrastructure as code using templates and built pipelines on the Azure platform
  • Built AKS (Azure Kubernetes Service) clusters and Azure Pipelines (CI/CD), deploying with Terraform
  • Provided environment management services supporting Atlassian tools and CI/CD procedures
  • Deployed various Azure networking services: WAF (firewall) policies, VNet peering, and virtual networks (VNets)
  • Designed and deployed OpenText NetIQ products: Identity Manager, Designer, and NetIQ Analyzer
  • Built and maintained scalable data pipelines using Azure Data Factory, Databricks, and Synapse to ingest, transform, and store data
  • Designed and implemented data modeling best practices to optimize data storage and retrieval
  • Migrated Hadoop Cloudera clusters from Physical to Cloud platform
  • Used Terraform to manage infrastructure as code and implemented CI/CD pipelines using Jenkins
  • Implemented security measures such as PKI, IAM, and PAM to ensure secure access to data
  • Designed and implemented infrastructure as code using Terraform to manage and maintain cloud infrastructure on Azure
  • Built and maintained CI/CD pipelines using Jenkins for automated software deployment
  • Worked closely with development teams to ensure successful software delivery, and with senior stakeholders to explain complex data concepts in business-friendly terms
  • Expertise with Kubernetes, including installation on VMs, and strong scripting and automation skills in Bash, Python, and Ruby
  • Extensive experience with Terraform and Ansible for infrastructure management and configuration
  • Familiar with Postgres and Apache Kafka for efficient data management and processing.

Cloud & DevOps Platform Engineer

RBS / NatWest
07.2017 - 12.2020
  • Expert-level knowledge of Azure Data Factory, Databricks, and Azure Data Lake Store
  • Hands-on with Azure Data Factory, Hive, Spark, Kafka, and HBase; excellent recent experience developing data pipelines in Python, PySpark, and YAML
  • Deployed Azure data platforms: Databricks, Data Lake, Data Factory, and HDInsight
  • Strong knowledge of Kubernetes, Docker, Puppet, Ansible, Terraform, CI/CD, and GitHub
  • Hands-on experience with performance tuning and troubleshooting cloud environments
  • Architected cloud-based data platforms using a wide range of the Azure Services including Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Cosmos DB, Azure Data Factory and Azure Blob storage
  • Architected, designed, and set up a full StreamSets/Kafka cluster for a project, covering data ingestion, transfer, and ETL
  • Part of the environment management services team overseeing the design, build, and maintenance of software testing environments
  • Hands-on experience implementing big data solutions using the Azure Data Platform, with knowledge of PowerShell, ARM templates, and IaC/PaaS tools and services
  • Good knowledge and experience migrating legacy applications into the Cloud using Microsoft Azure
  • Worked with Azure DevOps; automated and deployed big data clusters using REST APIs
  • Exposure to BI, ETL tools and AI linked to Azure
  • Good knowledge of data warehousing and data modelling tools.

DevOps Data Engineer

Lloyds Banking Group
01.2017 - 06.2017
  • End-to-end support of existing production Hortonworks HDP 2.3–2.5 infrastructure in the Data Lake team
  • Upgraded HDP clusters from 2.3 to 2.5 and Ambari from 2.2 to 2.4
  • Designed, built, implemented, and provided end-to-end support for (4 × 20)-node production and DR clusters
  • Good understanding of all ecosystem components: HDFS, HBase, Hive, Kafka, Spark, Solr, Ranger
  • Extensive knowledge of the Hadoop ecosystem and integration with LDAP/Kerberos, AD, and SSL
  • Configured alerting and monitoring for several Hadoop ecosystems and processes, integrated with Tivoli
  • Assisted Hadoop developers with ingestion, extraction, and enrichment across the data lake estate.

Hadoop Big Data & Azure Cloud Consultant

RBS (Royal Bank Of Scotland)
03.2016 - 12.2016
  • Installed, configured, and provided end-to-end support and administration of Cloudera CDH (5.12) clusters in production
  • Designed and deployed core Hadoop ecosystem components: HDFS, YARN, Impala, Hive, MapReduce, Apache Pig, Sqoop, and Oozie
  • Part of the Hadoop production support team, implementing bug fixes and major production upgrades
  • Designed and deployed high-availability solutions to avoid single points of failure for key big data components
  • Mentored global teams for knowledge transfer of BAU tasks, including morning checks; assisted with onboarding users onto the data lake platform.

Skills

5 years’ experience working in multi-cloud environments and Azure cloud infrastructure services, mainly focused on Azure, along with a strong DevOps background

Certification

Microsoft Certified: Azure Data Engineer Associate (DP-200, DP-201)

Accomplishments

Delivered various projects involving server and data centre migrations to the cloud, on time. Excellent customer-facing skills, along with expertise in technical project support and management. Provided a high level of service delivery, leading to increased customer satisfaction.

Clearance

SC (Active), 03/2026

Baseline Checks

Completed successfully with Disclosure Scotland & CRB Checks

Personal Information

  • Citizenship: British Citizen
  • Driving Licence: Full and clean UK driving licence

Education

High School Diploma

Bachelor's in Computer Science

Timeline

DevOps Cloud Data Engineer (Azure) - End Client: UK Public Sector (DWP, MoJ, MoD)
12.2020 - Current
Cloud & DevOps Platform Engineer - RBS / NatWest
07.2017 - 12.2020
DevOps Data Engineer - Lloyds Banking Group
01.2017 - 06.2017
Hadoop Big Data & Azure Cloud Consultant - RBS (Royal Bank Of Scotland)
03.2016 - 12.2016
Bachelor's in Computer Science