Senior Azure Data Engineer
KPMG India Services LLP
Bengaluru/Bangalore
Salary: Not disclosed
Job Description
Senior Engineer - SA1
Roles & responsibilities
Role Overview:
The Senior Associate 1 – Azure Data Engineer will be part of the GDC Technology Solutions (GTS) team, contributing to the Audit Data & Analytics domain. This role involves developing proficiency in KPMG proprietary Data & Analytics tools and applying audit methodology to deliver impactful solutions.
The ideal candidate should possess strong expertise in data engineering, transformation, and integration using Azure technologies, playing a critical role in enabling audit engagements through data-driven insights.
As part of this team, the candidate will be responsible for extracting and processing datasets from client ERP systems (such as SAP, Oracle, and Microsoft Dynamics) and other sources. The goal is to provide actionable insights through data transformation, ETL processes, and dashboarding solutions for audit and internal teams. The role also includes developing innovative solutions using a diverse set of tools and technologies.
Key Responsibilities:
Data Transformation & Integration
• Perform data transformations using Azure Databricks, Azure Data Factory, and Python.
• Implement and customize data mapping changes within Databricks using PySpark.
• Integrate Databricks with ERP or third-party systems via APIs and develop Python/PySpark notebooks to apply business transformation logic aligned with the common data model (a minimal illustrative sketch follows this section).

Data Handling & Workflow Development
• Design and manage data processing pipelines and workflows using Azure Databricks and Azure Data Factory (ADF) to support analytics and other data-driven applications.
• Apply advanced concepts such as partitioning, optimization, and performance tuning for efficient data processing.

Performance Optimization & Debugging
• Debug, optimize, and tune the performance of large-scale data processing workflows.
• Resolve issues with minimal guidance and propose effective solutions.

Audit Engagement Support
• Collaborate with audit engagement teams and client IT teams to extract and transform data.
• Interpret results and deliver meaningful audit insights through reports.
• Prepare supporting audit documentation and review it with attention to detail.

Leadership & Collaboration
• Maintain accurate project status for self and assigned team members.
• Coach junior team members on best practices and ensure knowledge transfer for low-complexity engagements.

Domain Expertise (added advantage)
• Experience with General Ledger/Sub-Ledger analysis and development of risk assessment or substantive routines for Audit/Internal Audit (Big 4 experience preferred).

Additional Skills
• Working knowledge of KQL (Kusto Query Language) and Azure REST APIs is a plus.
• Enthusiasm to explore and apply Azure AI services in business processes.
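For illustration, here is a minimal sketch of the kind of PySpark notebook cell this role describes: reading a raw ERP extract, applying a mapping/transformation step, and writing the result partitioned for efficient downstream reads. All paths, field names, and mapping rules below are hypothetical placeholders (this is not KPMG's actual common data model), and a Databricks environment with Delta Lake is assumed.

```python
# Illustrative sketch only: hypothetical paths, field names, and logic.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.appName("gl_transform_example").getOrCreate()

# Read a raw general-ledger extract (hypothetical ADLS mount path).
gl_raw = spark.read.format("delta").load("/mnt/raw/erp/gl_entries")

# Example mapping step: rename an ERP-specific field to a common-data-model
# name and derive a posting-period column for partitioning.
gl_mapped = (
    gl_raw
    .withColumnRenamed("BELNR", "document_number")  # SAP-style field, illustrative
    .withColumn("posting_period", F.date_format("posting_date", "yyyy-MM"))
    .filter(F.col("amount").isNotNull())
)

# Write partitioned by posting period so downstream audit routines can
# prune reads to only the periods under review.
(
    gl_mapped.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("posting_period")
    .save("/mnt/curated/gl_entries")
)
```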
Education Requirements
• B.Tech/B.E./MCA (Computer Science / Information Technology)

Technical Skills
• Minimum 4–6 years in data engineering with strong exposure to Azure Databricks, Azure Data Factory, PySpark, and Python.
• Proficiency in Azure, Databricks notebooks, SQL, and Python/PySpark notebook development.
• Strong knowledge of ETL tools and processes.
• Hands-on experience with Azure Databricks, Azure Data Factory (ADF), and Azure Data Lake Storage (ADLS).
• Comprehensive knowledge of Azure cloud services.
• Experience with Databricks notebooks for building transformations and creating tables (see the sketch after this list).
• Experience with Alteryx and Power BI for data analysis and visualization.
• Experience with Microsoft Fabric and Azure AI services is an added advantage.
• Understanding of audit processes, financial data structures, and risk assessment routines.
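As a companion to the earlier transformation sketch, this is one way a notebook might register curated output as a queryable table. The database, table, and path names are hypothetical, and a Databricks environment with Delta Lake is again assumed.

```python
# Illustrative sketch only: hypothetical database, table, and path names.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("create_table_example").getOrCreate()

# Ensure the target database exists, then register the curated Delta
# output as a table so reports and ad-hoc SQL can query it directly.
spark.sql("CREATE DATABASE IF NOT EXISTS audit_curated")
spark.sql("""
    CREATE TABLE IF NOT EXISTS audit_curated.gl_entries
    USING DELTA
    LOCATION '/mnt/curated/gl_entries'
""")

# Quick sanity check: row counts per posting period.
spark.sql("""
    SELECT posting_period, COUNT(*) AS row_count
    FROM audit_curated.gl_entries
    GROUP BY posting_period
    ORDER BY posting_period
""").show()
```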
Enabling Skills
• Excellent analytical, problem-solving, and troubleshooting abilities.
• Critical thinking: able to examine numbers, trends, and data and draw new conclusions from the findings.
• Attention to detail and a good team player.
• Quick learning ability and adaptability.
• Willingness and capability to deliver within tight timelines.
• Effective communication skills.
• Flexibility with work timings and willingness to work on different projects/technologies.
• Ability to collaborate with business stakeholders to understand data requirements and deliver solutions.

Experience Level
Senior Level

Job role
Work location
Bangalore, Karnataka, India
Department
Data Science & Analytics
Role / Category
Data Science & Machine Learning
Employment type
Full Time
Shift
Day Shift
Job requirements
Experience
Min. 4 years
About company
Name
KPMG India Services LLP
Job posted by KPMG India Services LLP