Senior ETL Developer

Johnson Controls Ltd

Pune

Salary: Not disclosed

Work from Office

Full Time

Min. 5 years

Job Details

Job Description

Senior ETL Developer

Location: Pune

Employment Type: Full-time

About the Role:
We are seeking an experienced Senior ETL Developer to join our data engineering team. In this role, you will lead the design, development, and optimization of robust data pipelines that power our analytics, reporting, and business intelligence initiatives. You will play a key role in building scalable, reliable ETL/ELT processes in a modern cloud environment, with a strong emphasis on Azure-native tools and integration with leading data movement platforms.

Key Responsibilities:

  • Design, develop, and maintain complex data pipelines for extracting, transforming, and loading large volumes of data from diverse sources into Snowflake.
  • Lead end-to-end ETL/ELT solution architecture using Azure Data Factory (ADF) as the primary orchestration tool, including pipelines, data flows, linked services, triggers, and monitoring.
  • Implement and manage data ingestion processes using Fivetran for automated connector-based integrations, with the ability to customize and troubleshoot as needed.
  • Write efficient, reusable SQL scripts for data transformation, validation, quality checks, and performance tuning.
  • Develop custom Python scripts and modules to handle complex transformations, error handling, API integrations, or scenarios not covered by no-code/low-code tools.
  • Collaborate with data architects, analysts, BI developers, and stakeholders to gather requirements and translate business needs into technical data solutions.
  • Ensure data integrity, security, and compliance throughout the pipeline lifecycle.
  • Monitor, troubleshoot, and optimize pipeline performance, implementing alerting and logging mechanisms.
  • Mentor junior team members and contribute to best practices in data engineering.

Required Qualifications:

  • 5–10 years of hands-on experience in ETL development, with a proven track record of designing and implementing enterprise-grade data pipelines.
  • Strong expertise in Azure Data Factory (required) — including authoring pipelines, data flows (mapping & wrangling), integration runtimes, and CI/CD deployment practices.
  • Practical experience working with Fivetran for automated data ingestion and replication (familiarity required; deep hands-on experience preferred).
  • Advanced proficiency in SQL (query optimization, complex joins, window functions, stored procedures) — experience with Azure SQL Database, Snowflake, or similar preferred.
  • Solid Python programming skills for data manipulation (pandas, requests, etc.), custom transformations, and scripting automation.
  • Experience with cloud storage solutions (Azure Data Lake, Blob Storage) and modern data architectures.
  • Strong understanding of data modeling, schema design, and data quality principles.
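For candidates gauging fit, the Python and data-quality bullets above translate to everyday pandas work. The following is a minimal illustrative sketch only (not code from the employer); the column names `order_id`, `amount`, and `order_date` are hypothetical:

```python
# Sketch of the kind of transformation-and-validation scripting described
# above: deduplicate a raw extract, cast types, and drop rows that fail
# basic quality checks before loading downstream. Column names are
# hypothetical examples, not part of the job description.
import pandas as pd

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate, type-cast, and validate a raw orders extract."""
    df = raw.drop_duplicates(subset="order_id").copy()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Quality check: reject rows that failed type coercion or carry a
    # non-positive amount, rather than propagating bad data.
    valid = df["order_date"].notna() & (df["amount"] > 0)
    return df[valid].reset_index(drop=True)

raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", "-4", "7"],
    "order_date": ["2024-01-02", "2024-01-02", "2024-01-03", "bad-date"],
})
clean = clean_orders(raw)
print(len(clean))  # rows surviving dedup + validation
```

In a production pipeline this logic would typically run as an ADF custom activity or a scheduled job feeding Snowflake, with the rejected rows logged for the alerting mentioned above.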

Preferred Qualifications:

  • Experience with Snowflake cloud data platform.
  • Familiarity with CI/CD tools (Azure DevOps, Git) for data pipeline deployments.
  • Exposure to additional ETL/ELT tools (e.g., SSIS, Airflow, dbt) or reverse ETL platforms.
  • Knowledge of data governance, security (RBAC, encryption), and compliance standards.

Experience Level

Senior Level

Job role

Work location

Pune Cerebrum, India

Department

Data Science & Analytics

Role / Category

Data Science & Machine Learning

Employment type

Full Time

Shift

Day Shift

Job requirements

Experience

Min. 5 years

About company

Name

Johnson Controls Ltd

Job posted by Johnson Controls Ltd
