Client: Deutsche Bank
Description: The bank's main activities are loan origination and deposit generation, and it offers the full spectrum of personal and business banking services and products. My duties included creating, implementing, and maintaining all databases, including complex queries, triggers, and stored procedures, as well as assisting with the administration of several bank databases in both development and production environments.
Responsibilities:
- Created measures, calculated columns, and relationships, and performed time-series analysis using DAX in Power BI.
- Created drill-down reports in Power BI Desktop, along with the supporting measures and calculated columns, to demonstrate sound data analysis techniques.
- Designed and developed SSIS (ETL) packages to validate, extract, transform, and load data from OLTP systems into the data warehouse and reporting data mart.
- Created and modified visualizations in Power BI reports and dashboards per client requests.
- Used Power Query to implement ETL methods in Power BI Desktop, combining diverse data sources, including SQL Server, spreadsheets, and external data obtained through web services, into a single centralized data warehouse.
- Developed complex stored procedures and views to generate drill-through, parameterized, and linked reports using SSRS.
- Wrote T-SQL queries to validate data between source systems and Power BI.
- Managed source code in GitHub using Git push and pull operations, and created a local Git repository for local source control.
- Built data marts and multi-dimensional models using star and snowflake schemas.
- Created stored procedures and scheduled them in Azure Data Factory, and created data factories in Azure Data Factory.
- Designed and built dimensions and cubes with star schemas using SQL Server Analysis Services (SSAS).
- Designed and documented logical and physical database designs for enterprise applications (OLTP), data warehouses (OLAP), and NoSQL databases.
- Created Spark applications using PySpark and Spark SQL for data extraction, transformation, and aggregation from various file formats, uncovering insights into customer usage patterns.
- Created reports using SSRS, Excel Services, and Power BI, and deployed them to SharePoint Server per business requirements.
- Collaborated with key internal and external stakeholders to gather and analyze needs and requirements to design and implement robust analytic solutions to support the identified needs.
- Applied Azure DevOps practices for continuous integration/delivery, creating CI/CD release pipelines to promote changes to higher environments.
- Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
- Built and published Tableau reports using complex calculated fields, table calculations, filters, and parameters.
- Developed Spark applications in Scala and Spark SQL for testing and data processing.
- Configured Jira projects with a wide range of schemes, including complex processes, workflows, permission schemes, and notification schemes.
- Followed Agile (Scrum) methodology for application development.
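The star-schema modeling described above can be sketched in miniature: a fact table keyed to dimension tables, queried with joins and aggregation. This is an illustrative sketch using an in-memory SQLite database; all table and column names are hypothetical, not taken from the actual bank schema.

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimensions.
# All table and column names are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_loan (
    loan_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_loan VALUES
    (100, 1, 20240101, 5000.0),
    (101, 2, 20240101, 7500.0),
    (102, 1, 20240201, 2500.0);
""")

# A typical drill-down query: total loan amount per customer per month.
rows = cur.execute("""
    SELECT c.name, d.year, d.month, SUM(f.amount)
    FROM fact_loan f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY c.name, d.year, d.month
    ORDER BY c.name, d.year, d.month
""").fetchall()
print(rows)
```

The same shape scales to real dimensional models: surrogate keys on the dimensions, foreign keys on the fact, and drill-down achieved by grouping on progressively finer dimension attributes.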
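The cross-source validation mentioned above (comparing data between source systems and Power BI) typically reconciles summary metrics such as row counts and totals. The following is a minimal Python sketch of that idea; the in-memory lists stand in for the real sources, and all names and values are hypothetical.

```python
# Hypothetical stand-ins for two sources: rows from an OLTP system and
# the corresponding extract loaded into the reporting layer.
def summarize(rows):
    """Row count and total amount: the reconciliation metrics compared."""
    return {"row_count": len(rows), "total": round(sum(amount for _, amount in rows), 2)}

source_rows = [("ACC-1", 120.50), ("ACC-2", 310.00), ("ACC-3", 55.25)]
report_rows = [("ACC-1", 120.50), ("ACC-2", 310.00), ("ACC-3", 55.25)]

source_summary = summarize(source_rows)
report_summary = summarize(report_rows)

# Validation passes only when both metrics match exactly.
is_valid = source_summary == report_summary
print(source_summary, is_valid)
```

In practice the same comparison would be expressed as T-SQL aggregate queries run against each source, with mismatches flagged for investigation before the reporting data is trusted.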
Environment: Power BI, Power BI Desktop, Power Query, DAX, NoSQL, SSAS, SSIS, SSRS, OLTP, OLAP, PySpark, Spark, Scala, T-SQL, ETL, Azure SQL, Azure DW, Azure Data Lake, Azure Databricks, Azure Data Factory, Azure DevOps, CI/CD, Git, GitHub, Excel, Star Schema, Snowflake Schema, Jira, Tableau.