Data Platform Engineer

Accenture India Private Limited

Bengaluru/Bangalore

Not disclosed

Work from Office

Full Time

Min. 3 years

Job Details

Job Description

Custom Software Engineer

Project Role : Custom Software Engineer
Project Role Description : Develop custom software solutions to design, code, and enhance components across systems or applications. Use modern frameworks and agile practices to deliver scalable, high-performing solutions tailored to specific business needs.
Must have skills : Python (Programming Language)
Good to have skills : PostgreSQL
Minimum 3 year(s) of experience is required
Educational Qualification : 10 years of full-time education

Role Overview:

The Data Platform Engineer owns the design, build, and evolution of the firm's cloud-native data platform, focusing on ingestion, transformation, orchestration, metadata, and data quality across AWS and Snowflake.
Core Responsibilities
Build and maintain data ingestion pipelines from internal and external sources into AWS S3 (Raw & Standard zones)
Develop and operate Airflow DAGs on EKS for ingestion, transformation, notifications, and data quality
Implement and maintain dbt models for Snowflake transformations
Manage Snowflake integrations, including external stages (S3) and environment separation (Prod / Non-Prod)
Own data quality frameworks, checks, and orchestration metadata (stored in RDS)
Integrate with AWS Glue / Data Catalog (SaaS) for metadata, lineage, and discovery with Atlan
Ensure data reliability, scalability, and cost efficiency across the platform
Partner with analytics and downstream consumers (BI tools, reporting, Cognos/Tableau)
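The data-quality responsibility above can be sketched in plain Python. This is a minimal, hypothetical example of the kind of check pass a pipeline task might run before promoting a batch from the Raw to the Standard zone; the check names, column names, and thresholds are illustrative assumptions, not the platform's actual framework.

```python
# Minimal data-quality check sketch. All names and thresholds below are
# illustrative assumptions, not the firm's actual framework.

def check_not_null(rows, column):
    """Fail if any row has a null in the given column."""
    bad = sum(1 for r in rows if r.get(column) is None)
    return ("not_null:" + column, bad == 0, bad)

def check_min_rows(rows, minimum):
    """Fail if the batch is suspiciously small."""
    return ("min_rows:" + str(minimum), len(rows) >= minimum, len(rows))

def run_checks(rows, checks):
    """Run each check; return (all_passed, per-check results)."""
    results = [check(rows) for check in checks]
    return all(ok for _, ok, _ in results), results

# Example batch with one null record_id (hypothetical data)
batch = [
    {"record_id": "R1", "amount": 100.0},
    {"record_id": None, "amount": 250.0},
]
passed, results = run_checks(
    batch,
    [lambda r: check_not_null(r, "record_id"), lambda r: check_min_rows(r, 1)],
)
print(passed)  # False: the not_null check caught one bad row
```

In practice such results would be written to the orchestration metadata store (RDS, per the responsibilities above) and a failure would gate promotion of the batch.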

Required Skills:

Strong experience with cloud data platforms on AWS
Hands-on experience with Airflow, dbt, and Snowflake
Proficiency in SQL and Python
Experience working with S3-based data lakes (raw/curated patterns)
Familiarity with metadata management, data catalogs, and data governance concepts
Experience with containerized workloads (e.g., on EKS)
Skills Must Haves:
AWS Services (IAM, S3, SNS, SQS, API Gateway, Lambda, DynamoDB, EKS)
Airflow
SQL and Python
CI/CD systems (GitHub, Azure DevOps, Octopus)
Linux and scripting skills (Bash, Python, PowerShell)

Nice To Have(s):
dbt and Snowflake
PySpark, plus knowledge of Glue Catalog, IAM cross-account access, and secure data sharing
Experience implementing data quality frameworks (custom or open-source)
Exposure to hybrid architectures (on-prem + cloud) and on-prem integrations (e.g., file drops, legacy ETL)

Work location

Bengaluru

Department

IT & Information Security

Role / Category

IT Infrastructure Services

Employment type

Full Time

Shift

Day Shift

Job requirements

Experience

Min. 3 years

About company

Name

Accenture India Private Limited

Job posted by Accenture India Private Limited
