Deutsche Bank

Google Cloud Platform Data Engineer - Assistant Vice President

Pune
Not disclosed
Work from Office
Full Time
Min. 10 years

Job Description


Job Title: GCP Data and AI Engineer, AVP

Location: Pune, India

Role Description:

As an AVP, Senior GCP Data and AI Engineer, you will be a lead technologist and individual contributor at the forefront of our data and artificial intelligence initiatives. This role requires an expert builder with a deep, practical understanding of system architecture and modern design patterns. You will be responsible for the hands-on development of our most complex data pipelines and generative AI solutions, from design through to production deployment, ensuring the solutions you build are scalable, resilient, and forward-thinking.

What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy

  • Best-in-class leave policy

  • Gender-neutral parental leave

  • 100% reimbursement under childcare assistance benefit (gender neutral)

  • Sponsorship for industry-relevant certifications and education

  • Employee Assistance Program for you and your family members

  • Comprehensive Hospitalization Insurance for you and your dependents

  • Accident and Term life Insurance

  • Complimentary health screening for employees aged 35 years and above

Your key responsibilities:

As a Senior Engineer, your hands-on responsibilities will include:

AI & Application Development:

  • Design, build, and operationalize sophisticated AI applications, including production-grade RAG pipelines on GCP.

  • Leverage the Vertex AI platform to train, fine-tune, and deploy machine learning and generative AI models.

  • Design and implement complex agentic workflows to automate and optimize business processes.

  • Develop, consume, and host mission-critical REST APIs using Python, deployed as containerized applications on Cloud Run.

Data Engineering & Pipelines:

  • Design, develop, and maintain scalable batch and streaming data pipelines using Python, SQL, Cloud Composer, and Pub/Sub.

  • Develop and optimize complex SQL queries in BigQuery for large-scale data analysis, extraction, and transformation.

  • Automate data quality and ETL testing procedures using Python and SQL.

Infrastructure & Operations:

  • Develop and deploy all cloud infrastructure as code using Terraform.

  • Implement and manage CI/CD pipelines to ensure smooth and reliable deployment of data and AI applications.

  • Serve as a key escalation point for complex L3 production issues, providing expert troubleshooting and resolution.

Your skills and experience

Mandatory Engineering Skills:

  • 10+ years of IT experience as a hands-on engineer with a proven track record of building and deploying large-scale data systems.

  • Expert-Level Languages: Deep proficiency in Python and advanced SQL, including complex query optimization and data modeling.

  • Cloud Platform: Extensive hands-on experience building solutions on GCP. Experience with Azure or AWS is also valuable.

  • Core GCP Services: Mastery of BigQuery, Cloud Composer (or Apache Airflow), and Cloud Run in production environments.

  • Infrastructure & Automation: Proficient in defining infrastructure as code using Terraform and designing robust CI/CD pipelines.

Advanced Technical & Design Expertise: We expect candidates to have deep, practical experience in the following areas, with a portfolio of projects demonstrating their expertise.

  • System Design: Strong understanding of modern data patterns, distributed systems, and architectural best practices.

  • Vertex AI Platform: Extensive, hands-on experience building solutions using the Vertex AI platform, including model fine-tuning, custom training, and scalable endpoint deployment.

  • Generative AI Systems: Proven experience building and deploying production-grade RAG (Retrieval-Augmented Generation) systems. Deep understanding of LLMs (like Gemini), vector databases, and embedding models.

  • Agentic Architecture: Strong practical knowledge of agentic patterns, multi-agent systems, and complex workflows. Awareness of emerging frameworks such as Google's Agent Development Kit (ADK) and concepts like the A2A (agent-to-agent) protocol and agent cards is highly desirable.

  • MLOps & Application Hosting: Demonstrable experience in operationalizing AI models and hosting applications using containers (Docker).

How we’ll support you

  • Training and development to help you excel in your career

  • Coaching and support from experts in your team

  • A culture of continuous learning to aid progression

  • A range of flexible benefits that you can tailor to suit your needs

About us and our teams

Please visit our company website for further information:

https://www.db.com/company/company.html

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.

Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.

We welcome applications from all people and promote a positive, fair and inclusive work environment.

Experience Level

Executive Level

Job role

Work location: Pune - Margarpatta, India
Department: Data Science & Analytics
Role / Category: DBA / Data warehousing
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 10 years

About company

Name: Deutsche Bank
Job posted by Deutsche Bank

