KPMG India Services LLP

Senior Data Engineering Developer

KPMG India Services LLP
Bengaluru/Bangalore
Not disclosed
Work from Office
Full Time
Min. 6 years

Job Description

Assistant Manager (SA2) - Developer (Jr Developer, Databricks)

 Roles & responsibilities 
Role Overview: 
The Senior Associate 2 – Data Engineering Developer is a senior individual contributor responsible for designing, building, and optimizing scalable data pipelines, data models, and integration solutions. The role requires strong hands‑on expertise in modern data engineering tools and cloud platforms (Azure/AWS/GCP), along with the ability to work independently, collaborate with cross‑functional teams, and drive high‑quality data delivery.


Key Responsibilities:

Perform and oversee data transformation and integration projects using Azure Databricks, Azure Data Factory, and Python/PySpark.
Design and manage scalable data processing pipelines and workflows to support analytics and audit requirements.
Apply advanced concepts such as partitioning, optimization, and performance tuning for efficient data processing.
Integrate Databricks with ERP or third-party systems via APIs, developing robust business transformation logic.
Design, develop, and maintain robust ETL/ELT pipelines using modern data engineering tools. 
Implement scalable data ingestion from structured and unstructured sources. 
Ensure pipelines are optimized for performance, reliability, and reusability. 
Develop automated data quality checks, validation frameworks, and monitoring solutions.
Develop end-to-end data workflows using cloud services (Azure Data Factory, Synapse, Databricks).
Debug, optimize, and resolve issues in large-scale data workflows, proposing effective solutions with minimal guidance.
Foster a collaborative and inclusive team environment, promoting knowledge sharing and best practices.
Maintain accurate project status for self and team, ensuring timely delivery of milestones.
Coach junior team members and facilitate knowledge transfer for engagements of varying complexity.
Resolve issues with minimal guidance and propose effective solutions.
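The automated data-quality checks mentioned above can be sketched as a simple validation pass. This is a hypothetical illustration in plain Python; in the role itself such checks would typically run inside a Databricks/PySpark job, and the field names below are invented for the example.

```python
def validate_rows(rows, required_fields):
    """Split records into valid and rejected based on required fields.

    A minimal data-quality gate: a record is rejected if any required
    field is missing or empty, and the rejection notes which fields failed.
    """
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejected

# Hypothetical sample records (field names invented for illustration).
records = [
    {"id": 1, "amount": 250.0, "currency": "INR"},
    {"id": 2, "amount": None, "currency": "INR"},  # fails the check
]
good, bad = validate_rows(records, ["id", "amount", "currency"])
```

In a production pipeline the rejected records would feed a monitoring or quarantine table rather than being silently dropped.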

Data Handling & Workflow Development
Demonstrate advanced knowledge of Azure cloud services, including Azure Databricks, Data Factory, Data Lake Storage, and related technologies.
Utilize tools such as Alteryx and Power BI for data analysis and visualization.
Apply understanding of audit processes, financial data structures, and risk assessment routines.
Explore and apply Azure AI services to enhance business processes.

Performance Optimization & Debugging
Debug, optimize, and tune performance for large-scale data processing workflows.
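Partitioning, one of the tuning concepts this role calls out, can be illustrated with a hash-partitioning sketch. Spark performs this internally (for example via `df.repartition(n, key)`); the plain-Python version below only demonstrates the idea, and the sample data is invented.

```python
def hash_partition(records, key, num_partitions):
    """Assign each record to a partition by hashing its key column.

    All records sharing a key land in the same partition, which is what
    lets per-key operations (joins, aggregations) run without shuffling.
    """
    partitions = [[] for _ in range(num_partitions)]
    for rec in records:
        idx = hash(rec[key]) % num_partitions
        partitions[idx].append(rec)
    return partitions

# Hypothetical order records keyed by customer.
orders = [{"customer": c, "total": t}
          for c, t in [("a", 10), ("b", 20), ("a", 5), ("c", 7)]]
parts = hash_partition(orders, "customer", 4)
```

Skewed keys (one customer with most of the rows) would overload a single partition, which is exactly the kind of issue the debugging responsibility above covers.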

        

 

Mandatory  technical & functional skills
Proficiency in Azure, Databricks notebooks, SQL, and Python/PySpark development.
Working knowledge of at least one ERP system and of the data flows within it.
Strong knowledge of ETL tools and processes.
Hands-on experience with Azure Databricks, Azure Data Factory (ADF), and Azure Data Lake Storage (ADLS).
Comprehensive knowledge of Azure cloud services.
Experience with Databricks notebooks for building transformations and creating tables.
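The last skill above, building transformations in notebooks before materializing a table, typically boils down to a group-and-aggregate step (in PySpark, `df.groupBy(...).sum(...)` followed by `df.write.saveAsTable(...)`). A hypothetical plain-Python stand-in for that step, with invented column names:

```python
from collections import defaultdict

def aggregate_by_key(rows, key, value):
    """Group rows by `key` and sum `value`, like a groupBy().sum() step."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

# Hypothetical raw sales rows; the aggregate would back a summary table.
sales = [
    {"region": "south", "amount": 100.0},
    {"region": "north", "amount": 40.0},
    {"region": "south", "amount": 60.0},
]
summary = aggregate_by_key(sales, "region", "amount")
# summary == {"south": 160.0, "north": 40.0}
```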

This role is for you if you have the following:
Educational qualifications 
B.Tech/B.E./MCA (Computer Science / Information Technology)

Work experience
Minimum 6–8 years of experience in data engineering, with deep expertise in Azure technologies.

 

 

Experience Level

Entry Level

Work location: Bangalore, Karnataka, India
Department: Data Science & Analytics
Role / Category: Data Science & Machine Learning
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 6 years

About company

Name: KPMG India Services LLP
Job posted by KPMG India Services LLP
