KPMG India Services LLP

Assistant Manager - GCP and Databricks Data Engineering

KPMG India Services LLP
Bengaluru/Bangalore
Not disclosed
Work from Office
Full Time
Min. 7 years

Job Description

GCP and Databricks Assistant Manager Hiring

  • Bachelor's or higher degree in Computer Science or a related discipline, or equivalent experience (minimum 5 years of work experience).
  • At least 7 years of consulting or client service delivery experience with Databricks and GCP.
  • Proficient in designing, building, and operationalizing large-scale enterprise data solutions using at least four GCP services among Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Composer, and GCS.
  • Proficient hands-on programming experience in Spark with Scala (or Python/Java).
  • Proficient in building production-level ETL/ELT data pipelines, from data ingestion to consumption.
  • Data engineering knowledge (e.g., data lakes; data warehouses such as Redshift, Hive, or Snowflake; integration; migration).
  • Excellent communicator (written and verbal, formal and informal).
  • Experience using software version control tools (Git/Bitbucket/CodeCommit).

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.

  • Ability to multi-task under pressure and work independently with minimal supervision.

  • Must be a team player and enjoy working in a cooperative and collaborative team environment.

  • Proficient in Databricks concepts such as volumes, notebooks, clusters, and workflows.

A GCP and Databricks Data Engineer is responsible for designing, building, and maintaining an organization's data infrastructure using GCP and Databricks cloud services. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The GCP + Databricks Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency.
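For candidates unfamiliar with the ingestion-to-consumption pipeline pattern the role describes, a rough sketch may help. This is an illustration only, not part of the job description: it uses plain Python rather than the Dataflow, Dataproc, or Databricks tooling the role actually requires, and all names in it are hypothetical.

```python
# Minimal extract-transform-load sketch (illustrative only; real pipelines
# for this role would run on Dataflow, Dataproc, or Databricks workflows).
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Ingestion: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Cleaning: drop rows with missing amounts and cast types."""
    return [
        {"id": row["id"], "amount": float(row["amount"])}
        for row in rows
        if row.get("amount")
    ]


def load(rows: list[dict]) -> dict:
    """Consumption: aggregate into a summary a downstream report could read."""
    return {"row_count": len(rows), "total": sum(r["amount"] for r in rows)}


raw = "id,amount\n1,10.5\n2,\n3,4.5\n"
summary = load(transform(extract(raw)))  # {'row_count': 2, 'total': 15.0}
```

In a production setting each stage would be a separate, monitored job (for example, a Composer-orchestrated task), which is what the monitoring and troubleshooting responsibilities above refer to.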

Preferred Skills

  • Proficient in designing, building, and operationalizing large-scale enterprise data solutions using at least four GCP services among Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Composer, and GCS.
  • Proficient hands-on programming experience in Spark with Scala (or Python/Java).
  • Proficient in building production-level ETL/ELT data pipelines, from data ingestion to consumption.
  • Data engineering knowledge (e.g., data lakes; data warehouses such as Redshift, Hive, or Snowflake; integration; migration).
  • Excellent communicator (written and verbal, formal and informal).
  • Experience using software version control tools (Git/Bitbucket/CodeCommit).

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.

  • Ability to multi-task under pressure and work independently with minimal supervision.

  • Must be a team player and enjoy working in a cooperative and collaborative team environment.

  • Proficient in Databricks concepts such as volumes, notebooks, clusters, and workflows.


Experience Level

Mid Level

Work location: Bangalore, Karnataka, India
Department: Data Science & Analytics
Role / Category: DBA / Data Warehousing
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 7 years

About company

Name: KPMG India Services LLP
Job posted by KPMG India Services LLP
