Data Engineer

Hirdhav Technologies Private Limited

Goregaon West, Mumbai

₹60,000 - ₹80,000 monthly

Fixed

₹60,000 - ₹80,000

Earning Potential

₹80,000

Work from Office

Full Time

Min. 3 years

Good (Intermediate / Advanced) English

Job Details


Job highlights

Urgently hiring

55 applicants

Job Description

Key Responsibilities


• ETL and Data Pipeline Development

o Design and implement end-to-end, cloud-native data pipelines on Azure for ingesting, transforming, and delivering client and transaction data to external applications and APIs.

o Modernise data integrations by replacing legacy CSV-based feeds with automated, scalable, and resilient data products.

o Develop metadata-driven ingestion frameworks using Azure Data Factory and Databricks.

o Ensure data delivery meets required formats, structures, quality standards, and delivery schedules for external integrations.

o Design, build, and maintain enterprise data assets, including the Enterprise Data Warehouse (EDW) using Data Vault 2.0, data marts, and business-oriented data products.

o Build and support Azure Analysis Services models, including both multidimensional and tabular cubes.
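The metadata-driven ingestion framework mentioned above can be sketched in a few lines of Python: each feed is described by a config record, and one generic routine generates the copy activity for every feed, the way an ADF pipeline would iterate a control table. The feed names, paths, and table names below are purely illustrative assumptions, not part of any actual pipeline.

```python
# Illustrative sketch of metadata-driven ingestion: every source feed is
# described by a config record, and one generic routine handles all of them.
from dataclasses import dataclass

@dataclass
class FeedConfig:
    name: str
    source_path: str    # e.g. an ADLS Gen2 folder (hypothetical)
    target_table: str
    key_columns: list

def build_copy_activities(feeds):
    """Turn feed metadata into generic copy-activity definitions."""
    return [
        {
            "activity": f"copy_{feed.name}",
            "source": feed.source_path,
            "sink": feed.target_table,
            "dedupe_keys": feed.key_columns,
        }
        for feed in feeds
    ]

# Adding a new feed then means adding one metadata row, not new code.
feeds = [
    FeedConfig("clients", "raw/clients/", "edw.client_hub", ["client_id"]),
    FeedConfig("transactions", "raw/txn/", "edw.txn_link", ["txn_id"]),
]
activities = build_copy_activities(feeds)
```

The design choice here is that onboarding a new source becomes a configuration change rather than a code change, which is what makes the framework scalable.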


• Data Analysis and Wrangling

o Collaborate closely with end users, Business Data Analysts, and Product Owners to understand business objectives and data requirements.

o Perform ad hoc data exploration, profiling, and wrangling using Azure Synapse Analytics, Databricks, and SQL.

o Translate business requirements and data mappings into technical ETL logic and robust data models.


• Non-Functional Delivery and Controls

o Define and implement non-functional requirements covering latency, throughput, monitoring, alerting, resilience, and recovery.

o Build automated reconciliation, data completeness checks, and exception handling mechanisms.

o Ensure all solutions comply with standards for data quality, security, auditability, and regulatory requirements.
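The automated reconciliation and completeness checks above can be sketched as follows; this is a minimal illustration assuming simple row-count, key-set, and amount-total comparisons between a source extract and the loaded target, with a raised exception standing in for the alerting hook.

```python
# Hypothetical sketch of an automated reconciliation check: compare row
# counts, key completeness, and amount totals between source and target,
# and raise so the pipeline's alerting/exception handling can react.
def reconcile(source_rows, target_rows, key="txn_id", amount="amount"):
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "keys_complete": {r[key] for r in source_rows}
                         == {r[key] for r in target_rows},
        "totals_match": sum(r[amount] for r in source_rows)
                        == sum(r[amount] for r in target_rows),
    }
    failures = [name for name, ok in checks.items() if not ok]
    if failures:
        raise ValueError(f"Reconciliation failed: {failures}")
    return checks

source = [{"txn_id": 1, "amount": 100.0}, {"txn_id": 2, "amount": 250.0}]
target = [{"txn_id": 1, "amount": 100.0}, {"txn_id": 2, "amount": 250.0}]
result = reconcile(source, target)
```

In a real pipeline the failure branch would typically publish a metric or alert rather than just raising, but the shape of the control is the same.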


• Integration and Testing

o Develop and execute integration tests to validate API interactions, ensuring payloads are correctly received, processed, and persisted.

o Work with API specifications, including authentication mechanisms, pagination, and error handling, to build robust and reliable integrations.
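The pagination and error-handling concerns above can be sketched like this; `fetch_page` is a stand-in for a real authenticated HTTP call, and the page shape (`items` plus a `next_page` cursor) is an assumption, not any specific vendor API.

```python
# Illustrative sketch of paginated API ingestion with a loop guard.
# fetch_page simulates an authenticated HTTP GET returning one page.
def fetch_page(pages, page_number):
    """Simulated API response: a page of items plus a next-page cursor."""
    if page_number >= len(pages):
        return {"items": [], "next_page": None}
    nxt = page_number + 1 if page_number + 1 < len(pages) else None
    return {"items": pages[page_number], "next_page": nxt}

def ingest_all(pages, max_pages=1000):
    """Follow next_page cursors until exhausted, guarding against
    infinite pagination loops with a hard page cap."""
    results, page = [], 0
    for _ in range(max_pages):
        body = fetch_page(pages, page)
        results.extend(body["items"])
        if body["next_page"] is None:
            break
        page = body["next_page"]
    return results

pages = [[{"id": 1}, {"id": 2}], [{"id": 3}]]
records = ingest_all(pages)
```

A production version would add authentication headers, retry with backoff on transient HTTP errors, and persistence of each page, but the cursor-following structure is the core of the integration.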


• Deployment, DevOps and Operations

o Deliver solutions through Azure DevOps pipelines, adhering to best practices for version control, code reviews, CI/CD, and release management.

o Build and maintain infrastructure-as-code (IaC) deployment pipelines.

o Support deployments to lower and production environments, including operational support during go-live.

o Share support and operational responsibilities within the engineering team.


• Collaboration and Architecture

o Collaborate with data architects, platform engineers, and compliance teams to ensure alignment with the overall data strategy, regulatory requirements, and architectural standards.

o Contribute to the design and documentation of data architecture supporting screening, transaction monitoring, and broader analytics use cases.


Essential Technical Skills

• Strong background in data engineering, ETL development, and data integration using Azure technologies such as Data Factory, Databricks, Synapse Analytics, and ADLS Gen2.

• Excellent SQL / T-SQL skills, including stored procedures, functions, and performance optimisation.

• Extensive hands-on experience with SQL Server and SSIS.

• Proven experience integrating with REST APIs, including authentication, pagination, error handling, and performance optimisation.

• Demonstrated ability to build resilient, high-performance ETL and orchestration processes.

• Strong understanding of data modelling methodologies, including Data Vault 2.0 and dimensional (Kimball) modelling.

• Experience designing, building, and maintaining Azure Analysis Services models (tabular and multidimensional).

• Proficient in delivering non-functional excellence, including monitoring, alerting, reconciliation, resilience, and performance tuning.

• Experience with source control (preferably Git) and CI/CD tools such as Azure DevOps.

• Solid understanding of deployment pipelines and infrastructure-as-code (IaC) concepts.
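As a small illustration of the Data Vault 2.0 modelling named above, a hub record pairs a business key with a deterministic hash key and load metadata; the column names below are illustrative assumptions, and MD5 is used only as a common hash choice.

```python
# Minimal sketch of a Data Vault 2.0 hub record: normalised business key,
# deterministic hash key, load timestamp, and record source.
import hashlib
from datetime import datetime, timezone

def hub_record(business_key, record_source):
    # Normalise before hashing so the same key always yields the same
    # hash key regardless of case or stray whitespace in the feed.
    normalised = business_key.strip().upper()
    hash_key = hashlib.md5(normalised.encode()).hexdigest()
    return {
        "hub_client_hk": hash_key,       # illustrative column names
        "client_bk": normalised,
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }

rec = hub_record("CUST-001", "crm_feed")
```

The deterministic hash key is what lets hubs, links, and satellites be loaded independently and in parallel, which is the main operational appeal of the Data Vault approach.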


Core Competencies

• Ability to translate business and data requirements into scalable, robust engineering solutions.

• Strong data analysis and problem-solving skills, with keen attention to detail.

• Excellent collaboration skills across engineering, architecture, compliance, and business teams.

• Comfortable working in Agile and DevOps-driven delivery environments.


Desirable / Advantageous

• Experience with financial crime or client screening platforms such as Dow Jones, Actimize, or similar solutions.

• Familiarity with event-driven architectures, data products, or microservices-based systems.

• Knowledge of banking client data models and KYC, AML, and sanctions control frameworks.

• Understanding of financial markets.

• Some programming experience in C# and .NET Core.

Job role

Work location

Goregaon West, Mumbai, Maharashtra, India

Department

Data Science & Analytics

Role / Category

Data Science & Machine Learning

Employment type

Full Time

Shift

Day Shift

Job requirements

Experience

Min. 3 years

Education

Graduate

English level

Good (Intermediate / Advanced) English

Gender

Any gender

About company

Name

Hirdhav Technologies Private Limited

Address

Goregaon West, Mumbai, Maharashtra, India

Job posted by Hirdhav Technologies Private Limited
