Ramya Somashekar

Bournemouth

Summary

Dynamic Data Domain Architect at JPMorgan Chase with a proven track record in technical leadership and performance optimization. Expert in cloud computing and data management who has developed innovative solutions that enhanced operational efficiency. Skilled in Python programming, having successfully automated data visualizations that drive impactful insights and strategic decision-making across the enterprise. Demonstrates strong analytical, communication, and teamwork skills, with a proven ability to adapt quickly to new environments. Eager to contribute to team success and to further develop professional skills, bringing a positive attitude and a commitment to continuous learning and growth.

Overview

16 years of professional experience

Work History

Data Domain Architect

JPMorgan Chase
01.2021 - Current
  • Translated business and technical requirements into architectural blueprints to achieve business objectives.
  • Developed and maintained database solutions in AWS to support business operations.
  • Developed and maintained data ingestion pipelines using Alteryx Designer and Python scripts to extract, transform, and load data from various structured and unstructured source systems.
  • Automated repetitive data integration tasks by designing robust Alteryx workflows to support business intelligence, reporting, and analytics use cases.
  • Utilized Python scripting to handle custom transformations, API integrations, and advanced data manipulations not natively supported by Alteryx.
  • Performed data validation, cleansing, and enrichment to ensure accuracy, consistency, and quality before loading into downstream systems.
  • Integrated data from multiple systems such as SQL Server, Oracle, Excel, REST APIs, CSVs, and cloud-based sources into centralized data repositories.
  • Scheduled and monitored Alteryx workflows using Alteryx Server, ensuring reliable and timely data refresh cycles.
  • Collaborated with business analysts and data stakeholders to gather requirements and translate them into scalable ingestion solutions.
  • Documented pipeline logic, data lineage, and dependencies to support ongoing maintenance and knowledge sharing.
  • Analyzed large volumes of data to identify trends and uncover patterns, signals, and hidden stories within the data.
  • Built robust and interactive data visualizations in Tableau and Qlik Sense, enabling users to track and analyze 15 key business metrics across multiple domains.
  • Created intuitive dashboards and reports that provided actionable insights, supporting data-driven decision-making for stakeholders and business units.
  • Developed reusable data models and calculated fields to streamline performance and maintain consistency across visualizations.
  • Implemented drill-down capabilities, filters, and KPIs to provide both summary-level and detailed insights based on user roles.
  • Identified, reviewed, and evaluated data management metrics to recommend ways to strengthen data management across the enterprise.
  • Performed root-cause analysis on data-related system problems to recommend or execute corrective action.
  • Provided technical leadership and delivered innovative products and services to address customer specific requirements.

Big Data Analyst - Testing

Cognizant Technologies
01.2016 - 12.2020
  • Collaborated closely with product owners to gather and analyze business requirements, ensuring alignment with technical solutions.
  • Conducted detailed data analysis to identify enrichment needs, reporting data structures, and API consumption requirements.
  • Participated in Agile ceremonies and contributed to PI planning for end-to-end QA and testing activities.
  • Designed and developed a reusable PySpark testing framework to automate validation of big data pipelines.
  • Built Apache NiFi processors to automate batch data ingestion workflows, enabling seamless integration across ingestion pipelines.
  • Utilized LSF GUI for job scheduling and monitoring across environments.
  • Designed and implemented database schemas for audit services to monitor pipeline executions.
  • Created data simulation scripts to produce real-time Kafka messages, supporting robust testing of streaming data applications.
  • Wrote Hive queries to validate golden records and ensure data quality in the Active Copy.
  • Developed scripts to automate reference and lookup data setup by invoking enterprise data service APIs.
  • Constructed HQL and MongoDB queries based on business logic to validate categorized datasets.
  • Conducted product demos with stakeholders to review deliverables and define acceptance criteria (Definition of Done).
  • Performed acceptance and performance testing on APIs and automated performance testing procedures.
  • Executed performance testing on big data systems to validate scalability and throughput.
  • Collaborated with BizOps teams for production deployments, monitoring, and issue resolution.
  • Automated DDL validation in the cloud using Python scripts.
  • Validated configuration database setups within SQL Azure, ensuring integrity and accuracy.
  • Verified ADF pipeline executions to confirm seamless data flow from source systems to Azure Data Lake Storage (ADLS).
  • Designed and analyzed Tableau dashboards, created SQL queries, and generated detailed and summary-level reports.
  • Configured groups, hierarchies, and datasets to enhance dashboard functionality and insights.
  • Conducted parallel testing between on-premises and Azure cloud environments to ensure feature parity and continuity during migration.

Test Engineer

Attra Info Tech Solutions
03.2015 - 12.2015
  • Contributed to the development of key components within the test automation framework, following best practices and coding standards.
  • Implemented the Page Object Model (POM) design pattern to enhance maintainability and reusability of test scripts.
  • Actively participated in the execution of automation test scripts across various test cycles.
  • Maintained and updated existing automation scripts to support regression testing for assigned modules, ensuring continued test coverage.
  • Logged and tracked bugs using SPIRA test management tool, collaborating with development teams to ensure timely resolution.

Senior Script Engineer

Webcetera Software Solutions (Ezlynx)
11.2009 - 03.2015
  • Executed and maintained automation scripts to support regression testing across assigned modules, ensuring ongoing functionality and stability.
  • Designed and implemented data validation automation using Unix shell scripting for back-end verification.
  • Developed and committed user-defined functions (UDFs) to a shared function library, enhancing script reusability and efficiency.
  • Automated interactions with Microsoft Excel, including importing, exporting, and updating data using the Excel Application Object.
  • Performed screen scraping of web applications to extract data and validate UI elements for automated test scenarios.
  • Logged, tracked, and managed defects using JIRA, collaborating with development teams for issue resolution and re-testing.
  • Contributed to the maintenance and continuous improvement of the automation framework to support evolving test needs.

Education

Bachelor of Science - Computer and Information Sciences

BNM Institute of Technology
India
03.2008

Skills

  • AWS Cloud computing
  • Alteryx
  • Terraform
  • Data integration
  • Predictive analysis
  • Big data technologies
  • Python programming
  • ETL Tools - NiFi, Informatica
  • Dashboard creation - Tableau/Qlik Sense/QuickSight
  • Predictive modeling
  • Trend analysis
  • SQL reporting
  • Agile methodologies
  • Team oversight
  • Continuous integration
