Data Engineering Manager - Azure & GCP Platforms
PricewaterhouseCoopers Pvt Ltd (PwC)
Job Description
IN-Manager_Azure DE + GCP_Data and Analytics_Advisory_Bangalore
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager
Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary:
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities:
Design, develop, and maintain scalable data pipelines using Python and Spark (PySpark/Scala) across Azure and GCP platforms.
Build and manage data workflows using Databricks Workflows and Apache Airflow / Cloud Composer DAGs.
Develop and optimize Delta Lake tables on Databricks, ensuring data reliability, performance, and governance.
Implement and manage Databricks Unity Catalog for data access control and metadata management.
Work with BigQuery for large-scale data warehousing and analytics.
Develop event-driven and batch data processing solutions using Pub/Sub, Cloud Dataflow, and Cloud Functions.
Implement ML pipelines on Databricks, including experimentation tracking using MLflow.
Collaborate with data scientists, analytics teams, and business stakeholders to deliver end-to-end data solutions.
Ensure best practices for data quality, security, cost optimization, and performance tuning.
Support CI/CD pipelines and production deployments for data engineering workloads.
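To illustrate the kind of transformation logic the pipeline responsibilities above involve, here is a minimal plain-Python sketch of a groupBy/aggregate step. All names (`Event`, `aggregate_by_user`, the field names) are hypothetical illustrations, not code from this role; in practice this step would run distributed via PySpark on Databricks rather than in-memory.

```python
from dataclasses import dataclass


@dataclass
class Event:
    """A single input record (hypothetical schema for illustration)."""
    user_id: str
    amount: float


def aggregate_by_user(events: list[Event]) -> dict[str, float]:
    """Sum amounts per user -- the shape of a groupBy/agg transformation
    that a PySpark pipeline would perform at scale; shown here in plain
    Python purely as a sketch."""
    totals: dict[str, float] = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals


if __name__ == "__main__":
    sample = [Event("u1", 10.0), Event("u1", 5.5), Event("u2", 2.0)]
    print(aggregate_by_user(sample))  # {'u1': 15.5, 'u2': 2.0}
```

In a Databricks pipeline, the equivalent step would read from and write to Delta Lake tables and be scheduled by a Databricks Workflow or an Airflow/Cloud Composer DAG.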
Mandatory skill sets:
Azure Databricks
Python, PySpark
Advanced Spark (PySpark / Scala)
Databricks Delta Tables
Databricks Workflows
Unity Catalog
Apache Airflow / Cloud Composer DAGs
GCP Native Services:
BigQuery
Cloud Composer
Cloud Functions
Pub/Sub
Preferred skill sets:
Machine Learning on Databricks
MLflow for experiment tracking and model lifecycle management
Google Cloud Dataflow
Cloud Data Fusion
Experience with event-driven architecture
Exposure to multi-cloud (Azure + GCP) data platforms
Knowledge of data governance, security, and compliance best practices
Years of experience required:
6 to 10+ Years
Education qualification:
BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above)
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: GCP Dataflow
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date: May 14, 2026
Experience Level: Mid Level