Senior Data Engineering Developer
KPMG India Services LLP
Job Description
Assistant Manager (SA2) - Developer (Junior Developer, Databricks)
Roles & responsibilities
Role Overview:
The Senior Associate 2 – Data Engineering Developer is a senior individual contributor responsible for designing, building, and optimizing scalable data pipelines, data models, and integration solutions. The role requires strong hands‑on expertise in modern data engineering tools and cloud platforms (Azure/AWS/GCP), along with the ability to work independently, collaborate with cross‑functional teams, and drive high‑quality data delivery.
Key Responsibilities:
Perform and oversee data transformation and integration projects using Azure Databricks, Azure Data Factory, and Python/PySpark.
Design and manage scalable data processing pipelines and workflows to support analytics and audit requirements.
Apply advanced concepts such as partitioning, optimization, and performance tuning for efficient data processing.
Integrate Databricks with ERP or third-party systems via APIs, developing robust business transformation logic.
Design, develop, and maintain robust ETL/ELT pipelines using modern data engineering tools.
Implement scalable data ingestion from structured and unstructured sources.
Ensure pipelines are optimized for performance, reliability, and reusability.
Develop automated data quality checks, validation frameworks, and monitoring solutions.
Develop end-to-end data workflows using cloud services (Azure Data Factory, Synapse, Databricks).
Debug, optimize, and resolve issues in large-scale data workflows, proposing effective solutions with minimal guidance.
Foster a collaborative and inclusive team environment, promoting knowledge sharing and best practices.
Maintain accurate project status for self and team, ensuring timely delivery of milestones.
Coach junior team members and facilitate knowledge transfer for engagements of varying complexity.
Resolve issues with minimal guidance and propose effective solutions.
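The partitioning concept mentioned above is central to efficient Spark processing. As a minimal, plain-Python sketch (not actual Spark code), this shows the hash-partitioning idea Spark applies when a DataFrame is repartitioned by a key column; the column names are illustrative assumptions:

```python
# Illustrative only: how hash partitioning distributes rows across buckets,
# the same idea Spark uses when repartitioning a DataFrame by a key column.
from collections import defaultdict

def hash_partition(rows, key, num_partitions):
    """Assign each row to a bucket based on the hash of its key column."""
    partitions = defaultdict(list)
    for row in rows:
        bucket = hash(row[key]) % num_partitions
        partitions[bucket].append(row)
    return partitions

# Hypothetical sample data: region is the partitioning key.
rows = [{"id": i, "region": r} for i, r in enumerate(["APAC", "EMEA", "APAC", "AMER"])]
parts = hash_partition(rows, "region", 2)
# Rows sharing a region always land in the same bucket, which is what
# makes per-key aggregations and joins efficient after repartitioning.
```

Because every row with the same key hashes to the same bucket, a downstream per-key aggregation never needs to look outside one partition.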
Data Handling & Workflow Development
Demonstrate advanced knowledge of Azure cloud services, including Azure Databricks, Data Factory, Data Lake Storage, and related technologies.
Utilize tools such as Alteryx and Power BI for data analysis and visualization.
Apply understanding of audit processes, financial data structures, and risk assessment routines.
Explore and apply Azure AI services to enhance business processes.
Performance Optimization & Debugging
Debug, optimize, and tune performance for large-scale data processing workflows.
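The automated data quality checks and validation frameworks mentioned among the responsibilities can be sketched in a platform-agnostic way. This is a simplified illustration of the pattern (rule names, fields, and data are hypothetical), which in practice the role would express as PySpark expectations or ADF validation activities:

```python
# Minimal sketch of an automated data quality check: each named rule is a
# predicate applied to every record; the report says which rules passed.
def run_checks(records, rules):
    """Apply each named rule to all records; return a pass/fail report."""
    return {name: all(rule(r) for r in records) for name, rule in rules.items()}

# Hypothetical audit-style records and rules.
records = [
    {"invoice_id": "INV-001", "amount": 120.5},
    {"invoice_id": "INV-002", "amount": 89.0},
]
rules = {
    "invoice_id_not_null": lambda r: bool(r.get("invoice_id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
report = run_checks(records, rules)
# → {"invoice_id_not_null": True, "amount_non_negative": True}
```

Keeping rules as named predicates makes the framework easy to extend and makes failed checks directly reportable to monitoring.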
Mandatory technical & functional skills
Proficiency in Azure, Databricks notebooks, SQL, and Python/PySpark development.
Knowledge of at least one ERP system and of the data flows within it.
Strong knowledge of ETL tools and processes.
Hands-on experience with Azure Databricks, Azure Data Factory (ADF), and Azure Data Lake Storage (ADLS).
Comprehensive knowledge of Azure cloud services.
Experience with Databricks notebooks for building transformations and creating tables.
This role is for you if you have the below
Educational qualifications
B. Tech/B.E/MCA (Computer Science / Information Technology)
Work experience
Minimum 6–8 years of experience in data engineering, with deep expertise in Azure technologies.
Other details
Location: Bengaluru/Bangalore. This is an on-site role and cannot be done from home.
Salary will depend on your skills, experience, and performance in the interview.
Strong communication skills are required. Both male and female candidates can apply.
No work-related deposit needs to be made at any point during employment.
To apply, use the apna app: click the apply button and contact HR directly to schedule your interview.