KPMG India Services LLP

Senior Data Migration and ETL Developer

KPMG India Services LLP
Bengaluru/Bangalore
Not disclosed
Work from Office
Full Time
Min. 5 years

Job Description

Consultant

Role Overview:

We are seeking a highly skilled Senior Developer specializing in Data Migration and ETL to lead complex data integration initiatives across enterprise platforms. The ideal candidate will have extensive experience in SAP S/4HANA, Azure Databricks, and Azure Data Factory (ADF), ensuring seamless migration, transformation, and orchestration of large-scale datasets in a secure and efficient manner.

 

Key Responsibilities:

  • Design & Develop ETL Pipelines:
    Build and optimize end-to-end ETL workflows for migrating data from SAP S/4HANA and other source systems into target platforms using ADF and Databricks.
  • Data Migration Strategy:
    Implement robust migration frameworks, including staging, transformation, and reconciliation processes, ensuring data integrity and compliance with business rules.
  • Integration with SAP Ecosystem:
    Utilize SAP Data Provisioning Agent and connectors for extracting data from ECC/S4HANA, BW, and Datasphere, integrating with Databricks Lakehouse for advanced analytics.
  • Performance Optimization:
    Tune Spark jobs, cluster configurations, and ADF pipelines for high throughput and cost efficiency.
  • Governance & Security:
    Apply best practices for data governance, metadata management, and secure data handling across environments (Dev, QA, Prod).
  • Collaboration:
    Work closely with architects, data engineers, and business stakeholders to align migration activities with enterprise data strategy.
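To illustrate the reconciliation step mentioned above, here is a minimal, purely illustrative sketch in Python: it compares source and target record sets by key and reports matched, missing, and mismatched records. All field names and sample values are hypothetical; a real migration framework would run such checks at scale (e.g., in Spark) against staging tables.

```python
# Illustrative only: a toy post-migration reconciliation check.
# Compares source and target rows by a key column and counts
# matched, missing, and mismatched records. All names are hypothetical.

def reconcile(source_rows, target_rows, key="id"):
    """Return counts of matched, missing, and mismatched records."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = [k for k in src if k not in tgt]
    mismatched = [k for k in src if k in tgt and src[k] != tgt[k]]
    matched = len(src) - len(missing) - len(mismatched)
    return {"matched": matched,
            "missing": len(missing),
            "mismatched": len(mismatched)}

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 99}]
print(reconcile(source, target))
# {'matched': 1, 'missing': 1, 'mismatched': 1}
```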
 

Required Skills & Experience:

  • Technical Expertise:
    • SAP S/4HANA data extraction and integration patterns, including hands-on expertise with the SAP Data Migration Cockpit and SAP Data Services.
    • Azure Data Factory (ADF) orchestration and pipeline development.
    • Azure Databricks (PySpark, Delta Lake, Unity Catalog).
    • Strong SQL and Python skills for data transformation.
  • Data Architecture Knowledge:
    • Experience with ADLS Gen2, hierarchical namespaces, and Delta Lake format.
    • Familiarity with Common Data Model (CDM) and enterprise data platforms.
  • ETL & Migration Frameworks:
    • Hands-on experience with staging, mapping, reconciliation dashboards, and metadata-driven transformations.
  • Cloud & Big Data Ecosystem:
    • Understanding of Azure Integration Services and hyperscaler tools (ADF, AWS Glue, etc.).
  • Soft Skills:
    • Strong problem-solving, analytical thinking, and stakeholder communication.
    • Ability to lead migration projects and mentor junior developers.
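The "metadata-driven transformations" bullet above refers to a common pattern in which a mapping specification, rather than hard-coded logic, drives how source fields are renamed and cast. A minimal sketch, with hypothetical field names chosen for illustration:

```python
# Illustrative only: a toy metadata-driven transformation.
# The MAPPING spec (the "metadata") drives field renaming and casting,
# so adding a field means editing data, not code. Names are hypothetical.

MAPPING = [
    # (source_field, target_field, cast)
    ("MATNR", "material_id", str),
    ("MENGE", "quantity", float),
]

def transform(row, mapping=MAPPING):
    """Apply the mapping spec to one source record."""
    return {tgt: cast(row[src]) for src, tgt, cast in mapping}

print(transform({"MATNR": "000123", "MENGE": "4.5"}))
# {'material_id': '000123', 'quantity': 4.5}
```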
 

Preferred Qualifications:

  • 5+ years of experience in data engineering and migration projects.
  • SAP certification or proven experience in SAP data integration.
  • Experience in Agile delivery and CI/CD practices.
 


Experience Level
Senior Level

Job role

Work location
Bangalore, Karnataka, India

Department
Data Science & Analytics

Role / Category
DBA / Data warehousing

Employment type
Full Time

Shift
Day Shift

Job requirements

Experience
Min. 5 years

About company

Name
KPMG India Services LLP

Job posted by KPMG India Services LLP
