Senior Data Migration and ETL Developer

KPMG India Services LLP

Bengaluru/Bangalore

Not disclosed

Work from Office

Full Time

Min. 5 years

Job Details

Job Description

Consultant

Role Overview:

We are seeking a highly skilled Senior Developer specializing in Data Migration and ETL to lead complex data integration initiatives across enterprise platforms. The ideal candidate will have extensive experience with SAP S/4HANA, Azure Databricks, and Azure Data Factory (ADF), and will ensure the secure, efficient migration, transformation, and orchestration of large-scale datasets.

 

Key Responsibilities:

  • Design & Develop ETL Pipelines:
    Build and optimize end-to-end ETL workflows for migrating data from SAP S/4HANA and other source systems into target platforms using ADF and Databricks (an illustrative sketch follows this list).
  • Data Migration Strategy:
    Implement robust migration frameworks, including staging, transformation, and reconciliation processes, ensuring data integrity and compliance with business rules.
  • Integration with SAP Ecosystem:
    Use the SAP Data Provisioning Agent and connectors to extract data from ECC, S/4HANA, BW, and Datasphere, integrating with the Databricks Lakehouse for advanced analytics.
  • Performance Optimization:
    Tune Spark jobs, cluster configurations, and ADF pipelines for high throughput and cost efficiency.
  • Governance & Security:
    Apply best practices for data governance, metadata management, and secure data handling across environments (Dev, QA, Prod).
  • Collaboration:
    Work closely with architects, data engineers, and business stakeholders to align migration activities with enterprise data strategy.
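
For illustration, the sketch below shows the kind of pipeline step the first responsibility describes: reading a staged S/4HANA extract from ADLS Gen2 and landing it as a Delta table from Databricks. It is a minimal example only; every path, table, and column name in it is a hypothetical placeholder rather than part of any actual project.

```python
# Illustrative sketch only. Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sap-staging-to-delta").getOrCreate()

# Hypothetical ADLS Gen2 location holding a staged S/4HANA material master extract.
staging_path = "abfss://staging@examplelake.dfs.core.windows.net/sap/mara/"
target_table = "lakehouse.bronze.sap_mara"  # hypothetical Unity Catalog table

# Read the staged extract (assumed to be Parquet files written by the extraction job).
raw = spark.read.format("parquet").load(staging_path)

# Example transformations: normalise the key column and stamp the load time
# so downstream reconciliation can tie rows back to a load run.
cleansed = (
    raw.withColumn("MATNR", F.trim(F.col("MATNR")))
       .withColumn("_load_ts", F.current_timestamp())
)

# Initial full load; incremental runs would typically use a Delta MERGE instead.
cleansed.write.format("delta").mode("overwrite").saveAsTable(target_table)
```

In practice a step like this usually runs as a Databricks job or notebook activity triggered from an ADF pipeline, with paths and table names supplied as pipeline parameters rather than hard-coded.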
 

Required Skills & Experience:

  • Technical Expertise:
    • SAP S/4HANA data extraction and integration patterns, including hands-on expertise with the SAP S/4HANA Migration Cockpit and SAP Data Services.
    • Azure Data Factory (ADF) orchestration and pipeline development.
    • Azure Databricks (PySpark, Delta Lake, Unity Catalog).
    • Strong SQL and Python skills for data transformation.
  • Data Architecture Knowledge:
    • Experience with ADLS Gen2, hierarchical namespaces, and Delta Lake format.
    • Familiarity with Common Data Model (CDM) and enterprise data platforms.
  • ETL & Migration Frameworks:
    • Hands-on experience with staging, mapping, reconciliation dashboards, and metadata-driven transformations (see the sketch after this list).
  • Cloud & Big Data Ecosystem:
    • Understanding of Azure Integration Services and hyperscaler tools (ADF, AWS Glue, etc.).
  • Soft Skills:
    • Strong problem-solving, analytical thinking, and stakeholder communication.
    • Ability to lead migration projects and mentor junior developers.
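
As a rough illustration of the reconciliation work listed above, the sketch below compares row counts and a simple key checksum between a staged source table and its migrated target. The table and column names are assumptions for the example; a metadata-driven framework would normally read them from configuration and publish the outcome to a reconciliation dashboard instead of printing it.

```python
# Illustrative sketch only. Table names and the key column are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("migration-reconciliation").getOrCreate()

source_table = "lakehouse.staging.sap_mara"  # hypothetical staged extract
target_table = "lakehouse.bronze.sap_mara"   # hypothetical migrated table
key_column = "MATNR"                         # hypothetical business key

def profile(table_name: str):
    """Return the row count and a simple checksum over the key column."""
    df = spark.table(table_name)
    return df.agg(
        F.count(F.lit(1)).alias("row_count"),
        F.sum(F.hash(F.col(key_column)).cast("long")).alias("key_checksum"),
    ).first()

src, tgt = profile(source_table), profile(target_table)

if (src["row_count"], src["key_checksum"]) == (tgt["row_count"], tgt["key_checksum"]):
    print(f"Reconciled: {tgt['row_count']} rows match between source and target.")
else:
    print(f"Mismatch: source={src.asDict()} target={tgt.asDict()}")
```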
 

Preferred Qualifications:

  • 5+ years of experience in data engineering and migration projects.
  • SAP certification or proven experience in SAP data integration.
  • Experience in Agile delivery and CI/CD practices.
 

Experience Level

Senior Level

Job role

Work location

Bangalore, Karnataka, India

Department

Data Science & Analytics

Role / Category

DBA / Data warehousing

Employment type

Full Time

Shift

Day Shift

Job requirements

Experience

Min. 5 years

About company

Name

KPMG India Services LLP

Job posted by KPMG India Services LLP
