Priyanka B

Watford, Hertfordshire

Summary

Around 5 years of experience as a Data Analyst, with strong experience designing, analyzing, adapting, implementing, testing, and maintaining database solutions through Business Intelligence solutions.

  • Excellent experience with the entire Software Development Life Cycle (SDLC), including Agile, Scrum, Waterfall, UML, and Project Management Methodologies.
  • As a Data Analyst, responsible for Data Modeling, Enterprise Data Management, Data Presentation, Visualization, Optimization, Risk Assessment, Predictive Analysis, advanced Data Analysis, Business Research and Quantitative Analysis.
  • Experience publishing Power BI Desktop reports to dashboards on the Power BI server, and in designing and developing Power BI graphical and visualization solutions based on business requirements.
  • Experience in creating Power BI Dashboards (Power View, Power Query, Power Pivot, and Power Map).
  • Experience in OLTP/OLAP environment and Data Warehouse. Extensive experience in SQL Server Analysis Services OLAP Cubes, and Data Mining.
  • Experience in designing, developing and deploying Business Intelligence solutions using SSIS, SSRS, SSAS, Power BI.
  • Gained experience in Data Warehousing methodologies and concepts, including star schemas, snowflake schemas, ETL processes, dimensional modelling and reporting tools.
  • Extensive experience in developing Stored Procedures, Views, Cursors, Triggers and Complex Queries in MS SQL Server.
  • Extensive experience in using SQL Server Integration Services (SSIS) to build Data Integration and Workflow Solutions, Extract, Transform and Load (ETL) solutions for Data warehousing applications.
  • Experience in creating multiple types of reports using tools like Power BI and SSRS.
  • Strong SQL development experience writing complex and nested T-SQL queries. Installed and configured Power BI Gateways to keep dashboards and reports up to date.
  • Built and published Power BI reports utilizing complex calculated fields, table calculations, filters, and parameters.
  • Experience in creating Power Pivot, Power Query, Power View and SSRS reports using tabular models (DAX), cubes (MDX) and SQL queries.
  • Implemented a one-time data migration of multistate-level data from SQL Server to Snowflake using Python and SnowSQL.
  • Extensive experience working closely with stakeholders, vendors, and business users.
  • Hands-on experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
  • Gained experience in version control using Git, enabling efficient collaboration, and tracking code changes throughout the project lifecycle.
  • Gained knowledge of automating the build, test, and deployment processes for software projects requiring continuous integration and delivery (CI/CD) using Jenkins and Azure DevOps.
  • Extensive experience in text analytics, statistical data mining, and data visualization, applying R, Python, and Tableau to numerous business problems.
  • Knowledge of essential Hadoop ecosystem components, including MapReduce, YARN, Hive, HBase, Flume, Sqoop, PySpark, Spark SQL, and Kafka, to create and implement enterprise applications.
  • Experience using Tableau Desktop to design and develop visual components providing process automation, workflow, and dashboard display frameworks.
  • Extensive experience with various data processing platforms and languages including Oracle PL/SQL, SQL Server T-SQL, MySQL and PostgreSQL.
  • Experience in SDLC methodologies including Waterfall and Agile, and in Azure DevOps.
  • Experience in coding Unix/Windows scripts for file transfers.

Overview

5 years of professional experience
1 Certification

Work history

Data Analyst

EY
06.2022 - 02.2025

Client: Deutsche Bank

Description: The bank's main activities are loan origination and deposit generation, and it offers the entire spectrum of personal and business banking services and products. My duties included creating, implementing, and maintaining all databases, including complex queries, triggers, and stored procedures. I also helped with the administration of several bank databases in both development and production environments.


Responsibilities:

  •  Created measures, calculated columns and relationships, and performed time series analysis using DAX in Power BI.
  •  Created different drill-down reports using Power BI Desktop, applying sound data analysis techniques.
  •  Designed and developed SSIS (ETL) packages to validate, extract, transform and load data from OLTP system to the Data warehouse and Report-Data mart.
  •  Responsible for creating and changing the visualizations in Power BI reports and Dashboards on client requests.
  •  Used Power Query to implement ETL methods on the Power BI Desktop, combining diverse data sources, including SQL Server, spreadsheets, and external data obtained through web services, into a single, centralized data warehouse.
  •  Developed complex stored procedures and views to generate various drill-through reports, parameterized reports and linked reports using SSRS.
  •  Created and stored T-SQL queries to validate data between different sources and Power BI.
  •  Involved in source code management with GitHub using Git push and pull operations, and created a local Git repository so that the source code could be managed locally.
  •  Involved in building Data Marts and multi-dimensional models like Star Schema and Snowflake Schema.
  •  Created stored procedures and scheduled them in Azure Data Factory, and created data factories in Azure Data Factory.
  •  Designed and built dimensions and cubes with star schema using SQL Server Analysis Services (SSAS).
  •  Designed and documented logical and physical database designs for Enterprise Application (OLTP), Data Warehouses (OLAP), NoSQL databases.
  • Created Spark applications using PySpark and Spark SQL for data extraction, transformation, and aggregation from various file formats, uncovering insights into customer usage patterns.
  • Created reports utilizing SSRS, Excel services, Power BI and deployed them on SharePoint Server as per business requirements.
  • Collaborated with key internal and external stakeholders to gather and analyze needs and requirements to design and implement robust analytic solutions to support the identified needs.
  • Good understanding of Azure DevOps practices for continuous integration/delivery, creating CI/CD release pipelines for release management to push changes to higher environments.
  • Gained knowledge of data ingestion to one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and of processing the data in Azure Databricks.
  • Built and published Tableau reports utilizing complex calculated fields, table calculations, filters, and parameters.
  • Developed Spark applications using Scala and Spark SQL for testing and data processing.
  • Track record of modifying Jira projects with a wide range of schemes, complex processes, workflows, authorization schemes, and notification schemes.
  • Followed Agile (Scrum) methodology for application development.


Environment: Power BI, NoSQL, SSAS, SSIS, SSRS, OLTP, OLAP, PySpark, Spark, Azure SQL, Azure DW, Azure Data Lake, Azure Databricks, Scala, T-SQL, Git, GitHub, Azure Data Factory, Power Query, ETL, Power BI Desktop, Excel, DAX, Star Schema, Snowflake Schema, Jira, Tableau, CI/CD, Azure DevOps.

Data Analyst

Value Labs
01.2020 - 06.2022

Client: Capital One

Description: The project's goal is to provide a scalable and trustworthy solution that can track transactional data in real time from diverse sources and spot any problems. It aims to fix mistakes as quickly as possible to avoid financial losses and to preserve data integrity so that the data can be trusted for further study.


Responsibilities:

  • Designed, developed, modified and enhanced database structures and database objects; data warehouse and data mart designs efficiently support BI and end-user requirements.
  • Developed interactive data visualizations and developed automated processes for updating content used in standardized dashboards.
  • Wrote expressions in SSRS and fine-tuned reports; created many drill-through and drill-down reports using SSRS.
  • Created Power BI reports using Tabular SSAS models as source data in Power BI Desktop and published reports to the Power BI service.
  • Implemented version control using Git for ETL processes, data mapping, analytics scripts, dashboard development, and report templates, ensuring collaboration and tracking changes.
  • Worked with various SSIS tasks like Execute SQL Task, Bulk Insert Task, Data Flow Task, FTP Task, and Send Mail Task.
  • Generated Databricks notebooks using SQL, Python, and PySpark, and automated notebooks using tasks.
  • Leveraged advanced Power BI DAX calculations and cutting-edge conditional formatting for comprehensive data analysis.
  • Designed Extraction, Transformation and Loading (ETL) processes using SSIS to extract data from flat files, Excel files and SQL Server databases.
  • Daily tasks included processing client-supplied external files and, on occasion, transferring data across servers using older DTS packages in a SQL Server 2000/2005/2008/2016 environment.
  • Wrote T-SQL scripts to validate and correct inconsistent data in the staging database before loading data into the database.
  • Performed extensive data modelling to differentiate between the OLTP and Data Warehouse data models.
  • Utilized monitoring and deployment tools to efficiently manage Power BI Service CI/CD and data flow operations.
  • Worked on Web jobs, AZURE data factory, Logic Apps for application and project development in Azure.
  • Developed automated reports and scheduled updates using Tableau publishing and scheduling features, providing stakeholders with up-to-date insights.
  • Utilized Jira for project task management, requirement tracking, and collaboration among team members, ensuring timely delivery and effective communication.
  • Actively applied Agile Methodology practices for development of the Microsoft reporting environment.


Environment: Power BI, Git, ETL, SSIS, SSRS, SQL, OLTP, PySpark, Python, CI/CD, Azure Data Factory, Logic Apps, Azure, Tableau, Jira, Agile, DAX, Power BI Desktop, T-SQL.

Skills


  • BI Tools: Power BI Desktop, Power Query, Power Pivot, Power Map, Power View
  • Data Modelling: ETL, Star Schema, Snowflake Schema
  • Reporting Tools: SSRS, Power BI, Tableau
  • Database Technologies: MySQL, SQL Server 2005/2008/2008 R2/2012/2014/2016
  • Cloud: Azure, Azure DevOps, Jenkins, CI/CD
  • Hadoop: MapReduce, YARN, Hive, HBase, Flume, Sqoop, PySpark, Spark SQL, Kafka
  • ETL Tools: SSIS, Power Query
  • Visualization Tools: Tableau Desktop, Python
  • Development Tools: T-SQL, NoSQL
  • Bug Tracking Tool: Jira
  • Operating Systems: Windows, Unix, Linux
  • Methodologies: Agile, Scrum, Waterfall

Certification

Microsoft Certified: Power BI Data Analyst Associate
