Google Cloud Platform Data Engineer
Sutherland Global Services Private Limited
GCP Data Engineer
Company Description
Sutherland is seeking a reliable, technically strong GCP Data Engineer to play a key role in driving our continued product growth and innovation. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!
Job Description
Key Responsibilities:
- Design and implement real-time data ingestion pipelines using Pub/Sub and Kafka Streams for healthcare data formats (HL7, FHIR)
- Build a robust Bronze layer as the single source of truth, storing raw, untransformed data in Cloud Storage
- Develop streaming ingestion patterns using Dataflow for real-time data capture with minimal transformation
- Implement batch loading processes using Dataproc for large-volume data from diverse sources (logs, databases, APIs)
- Apply schema inference and basic data type adjustments while preserving raw data lineage
- Design partitioning strategies in Cloud Storage for efficient historical data archival and retrieval
- Establish data landing zone controls including audit logging, versioning, and immutability patterns
- Create automated workflows using Cloud Composer for orchestrating ingestion pipelines
- Implement data catalog and metadata management for raw data assets
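As an illustration of the Bronze-layer pattern described above — landing each payload untouched while attaching lineage metadata and a lightly inferred schema — here is a minimal sketch in plain Python. All names are hypothetical; in a Dataflow pipeline this logic would typically live inside an Apache Beam `DoFn` between the Pub/Sub source and a Cloud Storage sink.

```python
import hashlib
import json
from datetime import datetime, timezone

def to_bronze_record(raw_payload: bytes, source: str) -> dict:
    """Wrap a raw message in a Bronze-layer envelope (illustrative).

    The payload is preserved verbatim (hex-encoded here for JSON safety);
    only metadata and a best-effort schema guess are added, so the raw
    data remains the single source of truth.
    """
    try:
        parsed = json.loads(raw_payload)
        # Basic schema inference: record top-level field names and types.
        inferred_schema = (
            {k: type(v).__name__ for k, v in parsed.items()}
            if isinstance(parsed, dict) else None
        )
    except (ValueError, UnicodeDecodeError):
        # Non-JSON input (e.g. HL7 v2 pipe-delimited segments) stays opaque.
        inferred_schema = None

    return {
        "source": source,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
        "payload_sha256": hashlib.sha256(raw_payload).hexdigest(),  # lineage / dedup key
        "inferred_schema": inferred_schema,
        "raw_payload": raw_payload.hex(),  # untransformed original bytes
    }

record = to_bronze_record(
    b'{"patient_id": "123", "resourceType": "Patient"}', "fhir-feed"
)
```

The envelope would then be written to a date-partitioned Cloud Storage path, leaving all heavier transformation to downstream (Silver/Gold) layers.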
Qualifications
Required Skills:
- 5+ years of experience with GCP services (Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Composer)
- Strong expertise in Apache Kafka, Kafka Streams, and event-driven architectures
- Proficiency in Python and/or Java for data pipeline development using Apache Beam SDK
- Experience with healthcare data standards (HL7, FHIR) and handling semi-structured data
- Hands-on experience with streaming frameworks (Apache Beam, Dataflow) for near-real-time ingestion
- Knowledge of file formats and compression (JSON, Avro, Parquet) for raw data storage
- Understanding of CDC patterns, incremental loading, and data versioning strategies
- Experience with Cloud Storage lifecycle management and cost optimization
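The lifecycle-management skill above usually means expressing storage-class transitions as a bucket policy; a sketch of one is below (the age thresholds are illustrative, not prescribed by this role):

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 365}
    }
  ]
}
```

A policy like this can be applied with `gsutil lifecycle set lifecycle.json gs://<bucket>`, moving raw Bronze-layer objects to cheaper storage classes as they age.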
Preferred Qualifications:
- GCP Professional Data Engineer certification
- Experience with Confluent Platform or managed Kafka on Google Cloud
- Familiarity with healthcare compliance requirements (HIPAA) and data residency
- Background in log aggregation platforms (Fluentd, Logstash) and observability
- Knowledge of data lake security patterns and IAM controls
Additional Information
All your information will be kept confidential according to EEO guidelines.
Experience Level
Mid Level
This is a full-time, on-site role based in Hyderabad.