· Experience in building robust architectures to organize data flow from heterogeneous and homogeneous systems using SQL Server Integration Services (SSIS) and Data Transformation Services (DTS).
· Experience in optimizing databases by creating clustered and non-clustered indexes (see the sketch after this list).
· Experience in creating jobs, alerts, and SQL Mail notifications, and in scheduling DTS and SSIS packages.
· Good knowledge of developing and extending OLAP cubes, dimensions, and data source views; MOLAP, HOLAP, and ROLAP storage modes; partitioning, aggregations, calculated members, perspectives, and MDX queries.
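As a concrete illustration of the indexing work above, a minimal sketch that issues CREATE INDEX statements through pyodbc; the server, database, and dbo.Orders table are hypothetical placeholders, not names from an actual engagement.

```python
# Minimal sketch: creating a clustered and a non-clustered index via pyodbc.
# The server, database, and dbo.Orders table are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDB;Trusted_Connection=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# Clustered index: physically orders the table on the key (one per table).
cursor.execute(
    "CREATE CLUSTERED INDEX IX_Orders_OrderID ON dbo.Orders (OrderID);"
)

# Non-clustered index: a separate structure covering frequent lookup columns.
cursor.execute(
    "CREATE NONCLUSTERED INDEX IX_Orders_CustomerID "
    "ON dbo.Orders (CustomerID) INCLUDE (OrderDate);"
)
conn.close()
```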
As an Operations Manager for a Capital Reporting Regulatory Data Platform, I oversee the development, execution, and operational efficiency of a highly critical system responsible for generating capital risk numbers. The platform runs complex financial calculators that process billions of trade data points to ensure accurate and timely regulatory submissions to financial regulators in the UK and Europe.
Given the platform's mission-critical nature, my role is to ensure it is efficiently developed, managed, and optimized to maintain system reliability, scalability, and compliance. Any failure or inefficiency could lead to significant monetary penalties and reputational damage to the firm. To mitigate risks, I focus on operational excellence, automation, incident management, and proactive monitoring, ensuring seamless and error-free capital reporting in a highly regulated financial environment.
Roles & Responsibilities
The bank is on a digital transformation journey from a legacy Microsoft platform to a modern big data ecosystem. Existing applications and data are migrated to the new platform, and new use cases are delivered to ensure the best possible customer experience.
Roles & Responsibilities
· Architected and designed a modern data platform on an Apache-centric open-source stack: Spark (batch), Flink (streaming), Airflow (orchestration; see the DAG sketch after this list), OpenShift (container hosting), Hive (analytics), Druid (fast analytics), Atlas (data lineage), Ranger (governance).
· Demonstrated tool capabilities to business stakeholders through real-time POCs, explaining the business benefits of the new platform.
· Evaluated new business use cases and selected the right tech stack for robust solutions that deliver business objectives.
· Designed the data lake zones (Raw, Refined, Trusted, Consumption) and deployed dbt (data build tool) to build on-demand data models for end users.
· Compared data observability tools such as Amazon Deequ and Great Expectations, and designed platform logging and monitoring services using ELK and Prometheus.
· Created sidecar containers using Fluentd to ship all platform events to ELK.
· Used Cassandra and HBase as purpose-built data stores for persistent storage alongside PostgreSQL; experienced in creating PVs (Persistent Volumes) and PVCs (Persistent Volume Claims) to mount volumes to pods.
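As referenced above, a minimal Airflow DAG sketch showing how a nightly Spark batch job and a downstream dbt build might be orchestrated on such a platform; the DAG id, schedule, file paths, and commands are illustrative assumptions.

```python
# Minimal sketch of an Airflow DAG orchestrating a nightly Spark batch job
# followed by a dbt model rebuild. All ids, paths, and commands are
# illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="capital_batch_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="0 2 * * *",  # run daily at 02:00
    catchup=False,
) as dag:
    # Submit the Spark batch job that refines raw trade data.
    refine_trades = BashOperator(
        task_id="refine_trades",
        bash_command="spark-submit --master yarn /jobs/refine_trades.py",
    )

    # Run dbt to rebuild downstream consumption-zone models.
    build_models = BashOperator(
        task_id="build_models",
        bash_command="dbt run --project-dir /dbt/consumption",
    )

    refine_trades >> build_models
```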
Global Payments is a payment gateway providing payment services to small vendors. The project covered the end-to-end order-processing mechanism for small vendors globally; data was loaded from different sources through integration services, with cubes and reporting maintained for analysis and process tracking.
Roles & Responsibilities
· Credit card payment interface project that helps small merchants process payments with Visa, Mastercard, AMEX, and Discover cards to grow their business.
· Built an MDM service for all onboarding merchants' data using the IBM Initiate tool, minimizing errors in equipment shipment and gateway setup.
· Merchants are onboarded to the system in two ways: through the UI and through batch processing (see the batch-path sketch after this list).
· Created SSIS packages for onboarding merchants and processing their payments.
· Created complex stored procedures for UI-related tasks that complete the onboarding process.
· Designed and reviewed the logical data model for optimal system performance.
· Created SSAS cubes for measuring merchant statistics and providing monthly reports, such as transaction frequency and trend analysis.
· Created SSRS reports for scheduled report delivery.
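A minimal sketch of the batch onboarding path referenced above. The production flow was implemented as SSIS packages (configured in the SSIS designer rather than code); this Python version only mirrors the logic, and the file name, connection string, and staging table are hypothetical.

```python
# Illustrative sketch of batch merchant onboarding: read a merchants file
# and stage each record for downstream validation and gateway setup.
# File name, connection string, and dbo.MerchantStaging are hypothetical.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=PaymentsDB;Trusted_Connection=yes;"
)
cursor = conn.cursor()

with open("merchants_batch.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Stage each merchant record; downstream stored procedures
        # validate it and complete the onboarding.
        cursor.execute(
            "INSERT INTO dbo.MerchantStaging (MerchantId, Name, CardTypes) "
            "VALUES (?, ?, ?);",
            row["merchant_id"], row["name"], row["card_types"],
        )

conn.commit()
conn.close()
```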
Humana is a healthcare company that was growing inorganically through acquisitions. To maintain data efficiently and integrate data from the acquired companies, we built a single shared data repository covering data integration, MDM, data masking, and data isolation, catering to enterprise-wide data needs.
Roles & Responsibilities
· Enterprise-wide data warehousing initiative to collate all data into a central repository and provide real-time data availability for interfacing apps.
· Responsible for creating a common interface gateway to integrate acquired organizations' data for efficient business operations.
· Architected and planned with multiple teams, mapping upstream and downstream dependencies to minimize business impact.
· Data quality reporting with COTS and custom-built tools to guard against data loss and corruption, both critical to effective business delivery.
· Created Informatica workflows for mainframe source systems and SSIS packages for TXT, CSV, and RDBMS data sources.
· Created complex stored procedures for data manipulation in real-time systems.
· Performed MDM on existing group and member data and on acquired companies' data, minimizing privacy issues and ensuring compliance for member data (see the matching sketch after this list).
· Built SSAS cubes and MDX queries for executive summary-level dashboards delivered through SSRS and OBIEE.
· Project execution and people management for a 30-member team.
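A minimal sketch of the kind of deterministic record matching an MDM consolidation performs, as referenced above; the match key, field names, and sample records are illustrative assumptions, not the project's actual matching rules.

```python
# Minimal sketch of deterministic member matching during MDM consolidation.
# The match key (normalized surname + first initial + date of birth) and
# the sample records are illustrative assumptions.
from collections import defaultdict

def match_key(member: dict) -> tuple:
    """Normalize the fields used to decide whether two records match."""
    return (
        member["last_name"].strip().lower(),
        member["first_name"].strip().lower()[:1],  # first initial only
        member["dob"],
    )

def consolidate(records: list[dict]) -> list[list[dict]]:
    """Group records sharing a match key into candidate master clusters."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[match_key(rec)].append(rec)
    return list(clusters.values())

members = [
    {"first_name": "Jane", "last_name": "Doe", "dob": "1980-04-02", "src": "legacy"},
    {"first_name": "J.", "last_name": "doe ", "dob": "1980-04-02", "src": "acquired"},
]
print(consolidate(members))  # both records land in one cluster
```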
Textron, as a parent organization, wanted to understand spend patterns across multiple business units, covering the direct and indirect costs incurred by the business. This was my first data warehouse project and involved SSIS for integration and SSAS for analytics.
Roles & Responsibilities
· The client, a manufacturing company making small aircraft, helicopters, golf carts, and more, wanted to calculate the direct and indirect spend associated with the organization.
· Primary responsibility was to design a system that acquires data from the organization's different platforms and processes it into the required format.
· Designed the EDW with multiple stages to transform data using SSIS packages.
· Automated a custom data quality tool for error reporting and for measuring data standards.
· Created SSAS cubes and MDX queries for dashboards, with BusinessObjects as the end reporting tool for drilling down to granular data.
· Automated pre- and post-processing validations to enforce data quality (see the sketch after this list).
· Performance-tuned stored procedures on SQL Server 2005.
· Project planning and execution.
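A minimal sketch of the pre-/post-processing validations referenced above, assuming simple illustrative checks (required fields before load, row-count reconciliation after); the checks and field names are assumptions, not the project's actual rules.

```python
# Minimal sketch of pre-/post-load validations around an EDW load.
# The checks and the "amount" field are illustrative assumptions.
def pre_load_checks(rows: list[dict]) -> list[str]:
    """Validate the incoming batch before it enters the staging area."""
    errors = []
    if not rows:
        errors.append("empty batch")
    for i, row in enumerate(rows):
        if row.get("amount") is None:
            errors.append(f"row {i}: missing amount")
        elif row["amount"] < 0:
            errors.append(f"row {i}: negative amount")
    return errors

def post_load_checks(source_count: int, loaded_count: int) -> list[str]:
    """Reconcile row counts after the load completes."""
    if loaded_count != source_count:
        return [f"row count mismatch: source={source_count}, loaded={loaded_count}"]
    return []

batch = [{"amount": 120.5}, {"amount": None}]
print(pre_load_checks(batch))   # ['row 1: missing amount']
print(post_load_checks(2, 2))   # []
```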
Cloud Technologies (Certified): Azure, AWS
Message Queues: RabbitMQ, JMS Queues, Kafka, Azure Service Bus, Event Hub
Data Visualization: Power BI, SSRS, Tableau, Apache Superset
API Management: MuleSoft, Azure API Management
Orchestration Tools: Apache Airflow, Apache NiFi, Talend, Azure Data Factory (ADF), Control-M
Logging & Monitoring: ELK (Elasticsearch, Logstash, Kibana), Prometheus, Azure Monitor, AWS CloudWatch, Fluentd
DWH Cloud/Appliances: Teradata, Netezza, Microsoft PDW, Snowflake, Azure Synapse, AWS Redshift, Microsoft MSBI 2016/2019, Informatica
Batch/Streaming Tools: Apache Flink, Spark Streaming, Apache Storm, Kafka Streams, dbt (data build tool), Apache Spark (PySpark)
Data Observability Tools: Great Expectations, Amazon Deequ, Databand
Data Governance & Security: Collibra, Apache Atlas, Apache Ranger, MSBI Data Quality Services, Informatica Data Quality
Databases: MS SQL Server, MySQL, Oracle, HBase, Hive Metastore, Trino, Apache Druid, PostgreSQL
Serverless: AWS Lambda, Azure Functions, Azure Logic Apps
Programming Languages: Python, Java, C#, JavaScript, R
Project Planning: Jira, MS Project, Excel
Data Stores: Cassandra, Redshift, S3, Azure Blob Storage, NooBaa, Samba file server
DevOps Tools: GitLab, Maven, Jenkins, GitLab Runner, Kubernetes, OpenShift, Docker, Terraform, Azure ARM templates, Nexus Repository, AWS CloudFormation
SRE Tools: Grafana, Dynatrace, AppDynamics, ServiceNow, Apache ECharts