KPMG India Services LLP

Assistant Manager - ERP Data Integration and Azure Data Engineering

KPMG India Services LLP
Bengaluru/Bangalore
Not disclosed
Work from Office
Full Time
Min. 6 years

Job Description

Assistant Manager-GDC

 Roles & responsibilities 
Role Overview: Lead enterprise ERP data integration projects focused on extracting financial data across diverse ERP platforms within an audit-centric environment. Define and execute technical strategy leveraging Databricks and Azure to deliver secure, scalable, and compliant solutions. The Senior Data Engineer will own strategic data initiatives, make architecture decisions, and guide the engineering team in extracting, validating, analyzing, and visualizing data from on-premises and cloud sources. This role requires the ability to convert stakeholder and business requirements into well-structured design and technical documentation covering data lineage, pipeline architecture, and analytics frameworks.
The role will be responsible for:
Development
Drive implementation and optimization of audit technology platforms including KPMG Clara, Validis, and Informatica.
Define and execute onboarding strategies for new clients, including secure agent setup, ERP role provisioning, and data mapping validation. Collaborate with client IT teams, audit teams, and infrastructure/security stakeholders to ensure seamless deployment.
Build robust ETL workflows using Databricks notebooks, PySpark, and Azure Data Factory.
Implement Delta Lake architectures for ERP data storage and processing.
Develop connectors for ERP systems (SAP, Oracle, Dynamics) using APIs and secure protocols.

Automate data ingestion and transformation processes leveraging Azure orchestration tools.
Develop advanced Databricks notebooks using PySpark and SQL.
Design, code, verify, test, document, amend, and refactor complex programs/scripts, applying agreed standards and tools to achieve a well-engineered result.
Actively participate in the design, development and implementation of fixes and enhancements to existing modules or new modules.
Build talent in the team on emerging technologies and train professionals to acquire working knowledge and certifications.
Experience with any visualization tool for building dashboards and knowledge of low-code platforms (Power Apps, etc.) is an added advantage.
Knowledge of preparing, transforming, and delivering high-quality data from diverse components (lakehouses, data warehouses, semantic models, notebooks, etc.) using Microsoft Fabric is an added advantage.
Having a certification in any ERP technical or functional area is a plus.
Identify opportunities for improvement, propose enhancements to existing systems, and explore new techniques by leveraging AI/Gen AI.
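The extract-validate-load work described above can be sketched as follows. This is a simplified, hypothetical illustration in plain Python (in practice such logic would run as PySpark in a Databricks notebook, per the responsibilities above); the schema fields (`doc_id`, `account`, `debit`, `credit`) are assumptions for the sketch, not taken from any actual pipeline.

```python
from decimal import Decimal

# Hypothetical, simplified illustration of the "validate" step in an
# extract-validate-load flow for ERP journal-entry extracts: check that
# required fields are present and that each document's debits balance
# its credits before the data is loaded downstream.

REQUIRED_FIELDS = {"doc_id", "account", "debit", "credit"}  # assumed schema

def validate_journal_lines(lines):
    """Return (valid_docs, errors): doc IDs whose lines are complete and balanced."""
    errors = []
    by_doc = {}
    for i, line in enumerate(lines):
        missing = REQUIRED_FIELDS - line.keys()
        if missing:
            errors.append(f"row {i}: missing fields {sorted(missing)}")
            continue
        by_doc.setdefault(line["doc_id"], []).append(line)

    valid_docs = []
    for doc_id, doc_lines in by_doc.items():
        # Decimal avoids float rounding surprises on financial amounts.
        debits = sum(Decimal(str(l["debit"])) for l in doc_lines)
        credits = sum(Decimal(str(l["credit"])) for l in doc_lines)
        if debits == credits:
            valid_docs.append(doc_id)
        else:
            errors.append(f"doc {doc_id}: debits {debits} != credits {credits}")
    return valid_docs, errors
```

In a real Databricks workflow the same balance check would typically be expressed as a PySpark groupBy/aggregate over a Delta table rather than a Python loop.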

Mandatory technical & functional skills
Databricks architecture, Unity Catalog, and Azure RBAC
Python, SQL and Spark
Azure Data Factory
Azure Data Lake Storage
7+ years of total IT experience in ETL, Databricks, and Microsoft Azure
Strong expertise in ERP systems, including their implementation and design principles; familiarity with the Financial module is essential.
Proficient in database architecture and data modeling
Experience in building ETL/ELT processes and data ingestion/data migration.
Able to monitor, troubleshoot, and optimize the performance of Databricks notebooks, Azure Data Factory, and Synapse workloads.
Hands-on experience with Azure cloud services: Azure Databricks, Azure Data Lake, Synapse Analytics, Logic Apps, Azure Functions, Azure SQL DB, and Azure Data Factory.
Strong knowledge of Python and PySpark, with the ability to write PySpark scripts for developing and debugging data workflows.
Experience with concepts such as partitioning, optimization, and performance tuning.
Any experience with visualization tools or Power Apps is an added advantage.
Knowledge of KQL (Kusto Query Language) and Azure REST APIs is good to have.
Experience writing SQL queries and/or database management skills in Oracle/SQL Server; working knowledge of data warehousing is preferred.
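The partitioning and performance-tuning concepts in the list above rest on deterministically mapping a key to a partition so work can be spread across workers. A minimal conceptual sketch, in plain Python, of the idea behind Spark's repartition()/partitionBy() (this illustrates the concept only, not how Spark itself is invoked; the `company` field is an assumed example key):

```python
import hashlib

def partition_key(value: str, num_partitions: int) -> int:
    """Deterministically map a key (e.g. an ERP company code) to a partition index."""
    digest = hashlib.md5(value.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

def partition_records(records, key_field: str, num_partitions: int):
    """Bucket records by hashed key so each bucket could be processed in parallel."""
    buckets = {i: [] for i in range(num_partitions)}
    for rec in records:
        buckets[partition_key(str(rec[key_field]), num_partitions)].append(rec)
    return buckets
```

Because the hash is deterministic, all records sharing a key land in the same bucket, which is what makes per-partition joins and aggregations efficient; skewed keys concentrating in one bucket is exactly the tuning problem the skills above address.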

This role is for you if you have the following:
Educational qualifications
BSc/BCA/MSc/MCA/B.Tech. 
Work experience
6-8 years of professional experience

 

 

Experience Level

Mid Level

Job role

Work location
Bangalore, Karnataka, India
Department
Software Engineering
Role / Category
DBA / Data warehousing
Employment type
Full Time
Shift
Day Shift

Job requirements

Experience
Min. 6 years

About company

Name
KPMG India Services LLP
Job posted by KPMG India Services LLP
