AWS Data Engineering Manager
Ernst & Young LLP (EY India)
Pune
Not disclosed
Job Details
Job Description
EY - GDS Consulting - AI and Data - AWS Data Engineer - Manager
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data & Analytics (D&A) – AWS Data Engineering Manager
As part of the EY-GDS Data & Analytics (D&A) organization, you will lead teams in solving complex business challenges through large-scale data engineering and cloud transformation initiatives. Our clients span Banking, Insurance, Manufacturing, Healthcare, Retail, Supply Chain, Finance, and other global industries. You will help organizations modernize their data ecosystems, build scalable analytics platforms, and unlock actionable insights.
The Opportunity
We are seeking an experienced AWS Data Engineering Manager with deep expertise in PySpark, SQL, ETL engineering, AWS data services, and modern data lakehouse architectures. This is a leadership role within a rapidly growing practice, offering the chance to guide high-impact, enterprise-grade data programs.
Key Responsibilities
Technical Leadership
- Architect, design, and oversee scalable ETL/ELT pipelines using PySpark, SQL, Python, and AWS data services.
- Lead the implementation of data lakehouse solutions using AWS S3, Glue, Iceberg, and other cloud-native components.
- Drive migration of on-premises data workloads to AWS, ensuring performance, reliability, scalability, and cost optimization.
- Define and standardize metadata-driven ingestion frameworks and medallion (Bronze/Silver/Gold) architecture patterns.
- Provide direction on Spark job optimization, distributed processing, and performance tuning.
Delivery & Governance
- Lead teams in building and operationalizing data pipelines using orchestration tools such as Astronomer (Airflow), AWS Step Functions, and managed workflows.
- Ensure adherence to data quality frameworks, best practices, and coding standards.
- Review architecture, design, and code artifacts; troubleshoot complex technical issues.
- Collaborate with cross-functional teams including BI, data science, and product teams for seamless data delivery.
People & Stakeholder Management
- Mentor and guide data engineers and senior engineers in technical delivery.
- Work directly with business and technical stakeholders, translating requirements into scalable solutions.
- Facilitate Agile/Scrum delivery across multi-functional teams.
Optional (Good to Have)
- Experience with Databricks (Delta Lake, PySpark notebooks, Unity Catalog).
- Familiarity with modern governance frameworks and MLOps/DevOps integrations.
Skills and Attributes for Success
- 9+ years of overall IT experience, with 5+ years in AWS-based data engineering and at least 2 years in a leadership/managerial capacity.
- Advanced hands-on expertise in:
- PySpark, SQL, Python
- AWS S3, Glue, Lambda, Step Functions, CloudWatch
- ETL/ELT design and data lake/lakehouse architectures
- Apache Iceberg or similar table formats
- Airflow/Astronomer or equivalent orchestration tools
- Strong understanding of structured/semi-structured data formats (Parquet, JSON, CSV, XML).
- In-depth knowledge of DW concepts, dimensional modeling, and performance optimization.
- Practical experience with CI/CD frameworks (GitHub, Azure DevOps, Jenkins).
- Proven analytical, problem-solving, and troubleshooting capabilities.
- Excellent communication, leadership, and stakeholder management skills.
To Qualify, You Must Have
- Bachelor’s or Master’s degree in Computer Science, IT, or a related field.
- 9+ years of industry experience with significant hands-on exposure to cloud data engineering.
- Experience designing and managing production-grade AWS data platforms.
- Demonstrated success leading Agile/Scrum delivery teams.
- Ability to own deliverables end-to-end with a proactive, self-driven approach.
Ideally, You’ll Also Have
- Prior client-facing experience and ability to influence senior stakeholders.
- Experience delivering in multi-environment, large-scale enterprise data landscapes.
- Exposure to Databricks, Delta Lake, or governance frameworks such as Unity Catalog.
What We Look For
We seek technically strong, innovative, and adaptable leaders who enjoy mentoring teams, solving complex data challenges, and driving continuous improvement in a fast-paced environment.
What Working at EY Offers
- Opportunities to work on diverse, industry-leading, and high-impact data programs.
- Access to continuous learning, coaching, and tailored career development.
- A collaborative, inclusive, and global work environment.
- Flexibility to manage work in a way that suits you best.
- A culture that supports innovation, knowledge-sharing, and growth.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Job role
Work location
Pune, MH, IN, 411014
Department
IT & Information Security
Role / Category
IT Security
Employment type
Full Time
Shift
Day Shift
Job requirements
Experience
Min. 9 years
About company
Name
Ernst & Young LLP (EY India)
Job posted by Ernst & Young LLP (EY India)