Goldman Sachs Services Pvt Ltd

Data Engineer - Lakehouse and AI Data Platform

Hyderabad
Not disclosed
Work from Office
Full Time
Min. 1 year

Job Description

Data Engineering - Data, Lakehouse and AI Data Platform Engineer - Hyderabad - Analyst

Role Summary

As a Data Engineer in the Lakehouse and AI Data Platform team, you will design, build, test and support data pipelines and curated datasets on the firm’s modern data platform. You will work across ingestion, transformation, modelling, optimisation and data quality, helping to deliver data products that are reliable, scalable and fit for purpose. Where there are gaps in platform functionality, you may also contribute to shared tooling or framework components that improve how the platform is used and operated.

The role is suited to engineers who are comfortable writing code, working with SQL and distributed data processing, and solving practical delivery problems in a team environment. More experienced candidates may also contribute to technical design, platform standards and the shaping of delivery approaches across a wider set of use cases.

Key Responsibilities

Pipeline Engineering

Build, enhance and support batch and streaming data pipelines on the Lakehouse and AI data platform.
Refactor or modernise existing data flows where needed to improve reliability, performance and maintainability.
Where needed, build reusable tooling to improve delivery, consistency and operational support.
Ensure data pipelines are production-ready, well tested and operationally supportable.
Data Modelling and Curation

Develop raw, refined and curated datasets that support analytics, reporting and AI use cases.
Apply sound data modelling principles to represent business entities, relationships and historical change accurately.
Work with consumers to shape data products that are usable, well documented and aligned to business needs.
Data Quality and Reconciliation

Implement controls to validate completeness, accuracy and consistency of data across pipelines and datasets.
Use reconciliation approaches to build confidence in production outputs and investigate breaks where they arise.
Contribute to clear standards for testing, monitoring and issue resolution.
Contribute to practical improvements in testing, monitoring or reconciliation tooling where these strengthen platform reliability and day-to-day delivery.
Skills and Experience

Required

1+ years of experience
Bachelor’s or master’s degree in a relevant discipline, or equivalent practical experience, with evidence of strong quantitative skills or data engineering expertise.
Strong hands-on programming experience in Python or Java.
Good working knowledge of SQL, including troubleshooting, optimisation and data analysis.
Ability to learn new tools, internal platforms and delivery workflows quickly.
Familiarity with software engineering fundamentals, including version control, testing, release discipline and CI/CD practices.

Data Engineering Capability

Understanding of temporal data modelling, including the handling of historical state and change over time.
Knowledge of schema design, schema evolution and data compatibility considerations.
Understanding of partitioning, clustering and other techniques used to improve data performance at scale.
Ability to make sensible design choices across normalized and denormalized models, and between natural and surrogate keys.
Practical approach to data quality, reconciliation and root-cause analysis.
Experience building or supporting production data pipelines in a collaborative engineering environment.
Experience working with distributed data processing frameworks such as Apache Spark.
Working knowledge of common data formats such as JSON, Avro and Parquet.
More experienced candidates may also demonstrate:

Stronger ownership of technical design across multiple datasets or pipeline domains.
Experience guiding implementation standards, code quality and engineering practices within a team.
Ability to lead delivery for a workstream, manage dependencies and support less experienced engineers.

Technology Environment

The role will involve working with a modern and evolving data stack. Candidates are not expected to have deep expertise in every tool from day one but should bring relevant experience and the ability to work across comparable technologies.

Examples of technologies in scope include:

Data processing and logic: ANSI SQL, Apache Spark, Kafka
Data formats: JSON, Avro, Parquet
Platforms and storage: Snowflake, Apache Iceberg, Databricks, Hadoop ecosystem technologies, Sybase IQ
Engineering and deployment: CI/CD tooling, containerized or Kubernetes-based deployment approaches where relevant
You will also work with internal data management and platform tooling, so a practical and adaptable engineering mindset is important.

 

What We Are Looking For

We are looking for engineers who can deliver well-structured, reliable solutions in production and who take ownership of the quality of what they build. The role suits candidates who are technically strong, pragmatic and comfortable working in a fast-paced environment where data platforms support important business outcomes.

Stronger candidates will typically demonstrate:

sound judgement in technical trade-offs
attention to detail in data correctness and testing
a clear and structured approach to problem solving
willingness to work closely with stakeholders and partner teams
an interest in developing long-term expertise within the firm

Job role

Work location: Hyderabad, Telangana, India
Department: Data Science & Analytics
Role / Category: DBA / Data warehousing
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 1 year

About company

Name: Goldman Sachs Services Pvt Ltd
Job posted by Goldman Sachs Services Pvt Ltd


The salary offered will depend on your skills, experience and performance in the interview.

The candidate should have completed the required education and have at least 1 year of relevant experience to apply for this job.

The candidate should have sound communication skills for this job.

Both Male and Female candidates can apply for this job.

No, it is not a work from home job and cannot be done online.

No work-related deposit needs to be made during your employment with the company.

Go to the apna app and apply for this job. Click on the apply button and call HR directly to schedule your interview.
