JP Morgan Services India Pvt Ltd

Lead Data Engineer

Pune
Not disclosed
Work from Office
Full Time
Min. 5 years

Job Description

Lead Data Engineer

Embrace this pivotal role as an essential member of a high performing team dedicated to reaching new heights in data engineering. Your contributions will be instrumental in shaping the future of one of the world’s largest and most influential companies.


As a Lead Data Engineer at JPMorgan Chase within Consumer & Community Banking, on the Connected Commerce Travel Technology team, you are an integral part of an agile team that works to enhance, build, and deliver large-scale data collection, storage, access, and analytics in a secure, stable, and scalable way. Leverage your deep technical expertise and problem-solving capabilities to drive significant business impact and tackle a diverse array of challenges spanning multiple data pipelines, data architectures, and other data consumers.

Job responsibilities

  • Lead the design, development, and maintenance of robust, scalable cloud-based data processing pipelines and infrastructure, ensuring adherence to engineering standards, governance frameworks, and industry best practices.
  • Architect and refine data models for large-scale datasets, optimizing for efficient storage, high-performance retrieval, and advanced analytics while upholding data integrity and quality.
  • Partner with cross-functional teams to translate complex business requirements into effective, scalable data engineering solutions that drive organizational value.
  • Champion a culture of innovation and continuous improvement, proactively identifying and implementing enhancements to data infrastructure, processing workflows, and analytics capabilities.
  • Define and execute data strategy, including the development of enterprise data models and the management of end-to-end data infrastructure—from design and construction to installation and ongoing maintenance of large-scale processing systems.
  • Drive data quality initiatives, ensure seamless data accessibility for analysts and data scientists, and maintain strict compliance with data governance and regulatory requirements.
  • Align data engineering practices with business objectives, ensuring solutions are both technically sound and strategically relevant.
  • Author, review, and approve technical requirements and architectural designs, and lead process re-engineering efforts to deliver cost-effective, high-impact business solutions.

 

Required qualifications, capabilities, and skills

  • Expert in at least two programming languages (Python and Java)
  • Expert in at least one distributed data processing framework (Spark)
  • Expert in at least one cloud data lakehouse platform (AWS data lake services or Databricks; alternatively Hadoop)
  • Expert in at least one scheduling/orchestration tool (Airflow; alternatively AWS Step Functions or similar)
  • Expert with relational and NoSQL databases.
  • Expert in data structures, data serialization formats (JSON, AVRO, Protobuf, or similar), and big-data storage formats (Parquet, Iceberg, or similar)
  • Proficiency in microservices architecture, serverless computing, and distributed cluster computing tools such as Docker and Kubernetes
  • Proficiency in one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.)
  • Experience with test-driven development (TDD) or behavior-driven development (BDD) practices, as well as working with continuous integration and continuous deployment (CI/CD) tools.
  • Experience organizing and leading design workshops, coding sessions, and hackathons to promote a culture of excellence and innovation in data engineering.
  • Expertise in architecting reusable, future-ready design patterns that address diverse use cases across the organization.

 

Preferred qualifications, capabilities, and skills
  • Hands-on experience with Infrastructure as Code (IaC) tools, preferably Terraform; experience with AWS CloudFormation is also valued.
  • Proficiency in cloud-based data pipeline technologies such as Fivetran, dbt, Prophecy.io, or similar platforms.
  • Experience with front-end development frameworks, ideally React; familiarity with Angular is also beneficial.
  • Strong working knowledge of the Snowflake data platform.
  • Experience in budgeting and resource allocation for data engineering projects.
  • Proven ability to manage vendor relationships effectively.

Experience Level

Senior Level

Work location: Pune, Maharashtra, India
Department: Data Science & Analytics
Role / Category: Data Science & Machine Learning
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 5 years

About company

Name: JP Morgan Services India Pvt Ltd
Job posted by JP Morgan Services India Pvt Ltd
