Google Cloud Platform Data Engineer
Sutherland Global Services Private Limited, Hyderabad
Salary: Not disclosed
Job Description
GCP Data Engineer
Company Description
Sutherland is seeking a reliable and technical person to join us as a GCP Data Engineer who will play a key role in driving our continued product growth and innovation. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!
Key Responsibilities:
- Design and implement real-time data ingestion pipelines using Pub/Sub and Kafka Streams for healthcare data formats (HL7, FHIR)
- Build a robust Bronze layer as the single source of truth, storing raw, untransformed data in Cloud Storage
- Develop streaming ingestion patterns using Dataflow for real-time data capture with minimal transformation
- Implement batch loading processes using Dataproc for large-volume data from diverse sources (logs, databases, APIs)
- Apply schema inference and basic data type adjustments while preserving raw data lineage
- Design partitioning strategies in Cloud Storage for efficient historical data archival and retrieval
- Establish data landing zone controls including audit logging, versioning, and immutability patterns
- Create automated workflows using Cloud Composer for orchestrating ingestion pipelines
- Implement data catalog and metadata management for raw data assets
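The partitioning responsibility above can be sketched in Python. This is a minimal illustration only, not part of the job posting: the bucket name, layout, and helper function are hypothetical, and it assumes a Hive-style `key=value` prefix scheme so downstream engines (BigQuery external tables, Dataproc jobs) can prune by partition when reading the Bronze layer.

```python
from datetime import datetime, timezone

# Hypothetical bucket name for illustration.
BRONZE_BUCKET = "gs://example-bronze-layer"

def bronze_object_path(source: str, fmt: str, event_time: datetime) -> str:
    """Build a Hive-style partitioned object path for a raw data batch.

    source: logical feed name, e.g. "hl7" or "fhir"
    fmt:    storage format of the raw payload, e.g. "avro" or "json"
    """
    ts = event_time.astimezone(timezone.utc)
    return (
        f"{BRONZE_BUCKET}/source={source}"
        f"/dt={ts:%Y-%m-%d}/hour={ts:%H}"
        f"/part-{ts:%Y%m%dT%H%M%S}.{fmt}"
    )

# Example: a raw HL7 batch captured at 2024-05-01 13:45 UTC lands under
# .../source=hl7/dt=2024-05-01/hour=13/...
path = bronze_object_path(
    "hl7", "avro", datetime(2024, 5, 1, 13, 45, tzinfo=timezone.utc)
)
```

Partitioning by source and event date keeps raw objects immutable and append-only, which supports the audit-logging and historical-retrieval requirements listed above.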
Qualifications
Required Skills:
- 5+ years of experience with GCP services (Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Composer)
- Strong expertise in Apache Kafka, Kafka Streams, and event-driven architectures
- Proficiency in Python and/or Java for data pipeline development using Apache Beam SDK
- Experience with healthcare data standards (HL7, FHIR) and handling semi-structured data
- Hands-on experience with streaming frameworks (Apache Beam, Dataflow) for near-real-time ingestion
- Knowledge of file formats and compression (JSON, Avro, Parquet) for raw data storage
- Understanding of CDC patterns, incremental loading, and data versioning strategies
- Experience with Cloud Storage lifecycle management and cost optimization
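The incremental-loading pattern listed above can be illustrated with a watermark check. This is a hedged sketch, not the employer's implementation: the function, field names, and in-memory rows are hypothetical, and in a real pipeline the watermark would persist in a state store (for example, a Cloud Storage object) rather than a local variable.

```python
def incremental_batch(rows, last_watermark):
    """Return rows newer than the stored watermark, plus the new watermark.

    rows: iterable of dicts carrying a monotonically increasing
          "updated_at" change timestamp (a common CDC convention).
    """
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    # Advance the watermark only if new rows arrived.
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

# Hypothetical change feed: only rows changed after the last run are loaded.
rows = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]
fresh, wm = incremental_batch(rows, last_watermark=200)
# fresh contains ids 2 and 3; wm advances to 310
```

Each run then loads only the delta since the previous watermark, which is what keeps large-volume batch loads from Dataproc or database extracts tractable.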
Preferred Qualifications:
- GCP Professional Data Engineer certification
- Experience with Confluent Platform or Google Cloud Managed Service for Apache Kafka
- Familiarity with healthcare compliance requirements (HIPAA) and data residency
- Background in log aggregation platforms (Fluentd, Logstash) and observability
- Knowledge of data lake security patterns and IAM controls
Additional Information
All your information will be kept confidential according to EEO guidelines.
Experience Level: Mid Level
Job role
Work location: Hyderabad, TS, India
Department: Data Science & Analytics
Role / Category: DBA / Data warehousing
Employment type: Full Time
Shift: Day Shift
Job requirements
Experience: Min. 5 years
About company
Name: Sutherland Global Services Private Limited
Job posted by: Sutherland Global Services Private Limited