KPMG India Services LLP

Assistant Manager - ERP Data Integration and Azure Data Engineering

KPMG India Services LLP
Bengaluru/Bangalore
Not disclosed
Work from Office
Full Time
Min. 6 years

Job Description

Assistant Manager-GDC

 Roles & responsibilities 
Role Overview: Lead enterprise ERP data integration projects focused on extracting financial data across diverse ERP platforms within an audit-centric environment. Define and execute technical strategy leveraging Databricks and Azure to deliver secure, scalable, and compliant solutions. The Senior Data Engineer will own strategic data initiatives, make architecture decisions, and guide the engineering team in extracting, validating, analyzing, and visualizing data from on-premises and cloud sources. This role requires the ability to convert stakeholder and business requirements into well-structured design and technical documentation covering data lineage, pipeline architecture, and analytics frameworks.
The role will be responsible for:
Development
Drive implementation and optimization of audit technology platforms including KPMG Clara, Validis, and Informatica.
Define and execute onboarding strategies for new clients, including secure agent setup, ERP role provisioning, and data mapping validation. Collaborate with client IT teams, audit teams, and infrastructure/security stakeholders to ensure seamless deployment.
Build robust ETL workflows using Databricks notebooks, PySpark, and Azure Data Factory.
Implement Delta Lake architectures for ERP data storage and processing.
Develop connectors for ERP systems (SAP, Oracle, Dynamics) using APIs and secure protocols.

Automate data ingestion and transformation processes leveraging Azure orchestration tools.
Develop advanced Databricks notebooks using PySpark and SQL.
Design, code, verify, test, document, amend, and refactor complex programs/scripts, applying agreed standards and tools to achieve a well-engineered result.
Actively participate in the design, development, and implementation of fixes and enhancements to existing or new modules.
Build talent in the team on emerging technologies and train professionals to acquire working knowledge and certifications.
Experience with any visualization tool for building dashboards, and knowledge of low-code platforms (e.g., Power Apps), is an added advantage.
Knowledge of preparing, transforming, and delivering high-quality data from diverse components (lakehouses, data warehouses, semantic models, notebooks, etc.) using Microsoft Fabric is an added advantage.
Having a certification in any ERP technical or functional area is a plus.
Identify opportunities for improvement, propose enhancements to existing systems, and explore new techniques by leveraging AI/Gen AI.
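To illustrate the kind of ETL work described above, here is a minimal sketch of a Databricks/Delta Lake ingestion step. All paths, table names, and column names are hypothetical, and a real Databricks cluster would supply the `spark` session; this is a sketch of the pattern, not a production pipeline.

```python
# Minimal sketch of an ERP ingestion step: read a raw general-ledger extract,
# standardize types, deduplicate, and append to a Delta table partitioned by
# posting date. Paths and column names (posting_date, amount, document_id)
# are hypothetical.

def load_gl_extract(spark, source_path, target_path):
    # Imported here because pyspark is provided by the Databricks runtime.
    from pyspark.sql import functions as F

    raw = (spark.read.format("csv")
                .option("header", True)
                .load(source_path))

    clean = (raw
             .withColumn("posting_date", F.to_date("posting_date"))
             .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
             .dropDuplicates(["document_id"]))

    (clean.write.format("delta")
          .mode("append")
          .partitionBy("posting_date")
          .save(target_path))
```

Partitioning by `posting_date` is one common choice for audit workloads, since period-end testing typically filters by date range.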

Mandatory technical & functional skills
Databricks architecture, Unity Catalog, and Azure RBAC
Python, SQL and Spark
Azure Data Factory
Azure Data Lake Storage
7+ years of total IT Experience in ETL, Databricks and Microsoft Azure
Strong expertise in ERP systems, including their implementation and design principles. Familiarity with the Financial module is essential.
Proficient in Database architecture and data modeling
Experience in building ETL/ELT processes and Data Ingestion/Data Migration.
Ability to monitor, troubleshoot, and optimize the performance of Databricks notebooks, Azure Data Factory, and Synapse workloads.
Must have hands-on experience with Azure cloud services: Azure Databricks, Azure Data Lake, Synapse Analytics, Logic Apps, Azure Functions, Azure SQL DB, and Azure Data Factory.
Must have strong knowledge of Python and PySpark, with the ability to write and debug PySpark scripts for data workflows.
Must have experience with concepts such as partitioning, optimization, and performance tuning.
Experience with visualization tools or Power Apps is an added advantage.
Good to have: knowledge of KQL (Kusto Query Language) and Azure REST APIs.
Experience writing SQL queries and/or database management skills in Oracle/SQL Server; working knowledge of data warehousing is preferred.
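In an audit context, the SQL skills above often take the form of reconciliation checks: comparing control totals between a source extract and the loaded target table. A self-contained sketch (using the standard-library sqlite3 purely for illustration; table and column names are hypothetical):

```python
# Reconciliation sketch: verify that row counts and amount totals match
# between a hypothetical source extract (gl_source) and loaded target
# (gl_target). In practice the same SQL pattern runs against Oracle,
# SQL Server, or a Databricks SQL warehouse.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE gl_source (doc_id TEXT, amount NUMERIC);
    CREATE TABLE gl_target (doc_id TEXT, amount NUMERIC);
    INSERT INTO gl_source VALUES ('D1', 100.00), ('D2', -40.50);
    INSERT INTO gl_target VALUES ('D1', 100.00), ('D2', -40.50);
""")

count_diff, amount_diff = con.execute("""
    SELECT (SELECT COUNT(*)    FROM gl_source) - (SELECT COUNT(*)    FROM gl_target),
           (SELECT SUM(amount) FROM gl_source) - (SELECT SUM(amount) FROM gl_target)
""").fetchone()

# A zero difference on both checks means the extract reconciles to the load.
print(count_diff, amount_diff)
```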

This role is for you if you have the following:
Educational qualifications
BSc/BCA/MSc/MCA/B.Tech. 
Work experience
6 to 8 years of professional experience

Experience Level

Mid Level

Work location
Bangalore, Karnataka, India
Department
Software Engineering
Role / Category
DBA / Data warehousing
Employment type
Full Time
Shift
Day Shift

Job requirements

Experience
Min. 6 years

About company

Name
KPMG India Services LLP
Job posted by KPMG India Services LLP
