Synechron Technologies

Azure Data Engineer - Databricks Specialist

Pune
Not disclosed
Work from Office
Full Time
Min. 2 years

Job Description

Azure Data Engineer – Databricks, Lakehouse Architecture & Cloud Data Security

Job Summary

Synechron is seeking a highly skilled Azure Data Engineer with extensive expertise in Databricks and Lakehouse architecture to lead the development of scalable, high-performance data pipelines supporting enterprise analytics solutions. In this role, you will design, implement, and optimize large-scale data workflows across cloud platforms, ensuring compliance, security, and operational efficiency. Your contributions will enable data-driven decision-making and support our organization’s digital transformation objectives with innovative data management practices.


Software Requirements

Required:

  • Proven experience with Azure Cloud Services, including Azure Data Factory, Azure Data Lake, and Azure Databricks (5+ years; latest runtime versions preferred)

  • Deep expertise in Databricks components: Delta Lake, Unity Catalog, Lakehouse architecture, Delta Live Tables (DLT) pipelines, and table-update triggers

  • Strong hands-on knowledge of Spark and PySpark for large-scale data processing and transformation

  • Experience developing and managing Databricks Asset Bundles for deployment automation

  • Experience with SQL (Azure SQL, Data Warehouse, or relational databases) for data modeling and validation

  • Familiarity with GitLab or similar version control tools for collaboration and artifact management


Preferred:

  • Experience with streaming frameworks like Spark Streaming or Structured Streaming

  • Knowledge of advanced Databricks runtime optimizations and configuration tuning

  • Exposure to end-to-end data governance and security compliance frameworks (e.g., GDPR, HIPAA)
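For context on the Asset Bundles requirement above: a Databricks Asset Bundle is declared in a `databricks.yml` file at the project root. A minimal sketch is shown below; the workspace URL, job name, and notebook path are hypothetical placeholders, not values from this posting.

```yaml
# databricks.yml — minimal Asset Bundle sketch (names are illustrative)
bundle:
  name: analytics-pipelines

targets:
  dev:
    workspace:
      host: https://adb-0000000000000000.0.azuredatabricks.net

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform.py
```

A bundle like this is typically validated and deployed with `databricks bundle validate` and `databricks bundle deploy -t dev`, which is what makes it suitable for CI/CD-driven deployment automation.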


Overall Responsibilities

  • Lead design and implementation of data pipelines supporting enterprise analytics, reporting, and AI integrations across the Azure cloud ecosystem

  • Build, test, and deploy Lakehouse solutions using Databricks, Delta Lake, and related components

  • Optimize big data workflows for performance, cost-efficiency, and scalability in cloud environments

  • Develop automated deployment strategies using Databricks Asset Bundles and CI/CD pipelines

  • Manage data security, access controls, and compliance using tools like Unity Catalog and Azure security features

  • Collaborate with analytics, data science, and security teams to incorporate AI/ML models into data workflows

  • Conduct performance tuning, troubleshoot issues, and continuously improve data quality and pipeline reliability

  • Maintain detailed documentation of architecture, data flow, security policies, and system configurations

  • Lead efforts to migrate legacy systems to cloud-native architectures supporting scalability and operational resilience


Technical Skills (By Category)

Programming and Data Processing (Essential):

  • PySpark, Spark SQL, Delta Lake (latest runtime preferred)

  • Python for data scripting and automation

  • SQL: Azure SQL, Data Warehouse, or equivalent
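As a minimal, self-contained illustration of the SQL-based validation referenced above, the sketch below uses Python's built-in sqlite3 as a stand-in for Azure SQL; the staging table and checks are hypothetical examples, not part of this role's actual stack.

```python
import sqlite3

# Stand-in for an Azure SQL validation step: before loading a staging
# table downstream, check for NULL keys and duplicate order IDs.
# Table and column names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (2, 25.5), (NULL, 5.0);
""")

# Count rows whose primary key is missing.
null_keys = conn.execute(
    "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL"
).fetchone()[0]

# Count distinct order IDs that appear more than once.
dupes = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT order_id FROM stg_orders
        WHERE order_id IS NOT NULL
        GROUP BY order_id HAVING COUNT(*) > 1
    )
""").fetchone()[0]

print(null_keys, dupes)  # 1 NULL key, 1 duplicated order_id
```

In practice the same queries would run against Azure SQL or a warehouse table as a pipeline quality gate, failing the load when either count is non-zero.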


Frameworks & Libraries:

  • Databricks Delta Lake, Unity Catalog, Delta Live Pipelines

  • Spark Structured Streaming (preferred)

  • Data governance and metadata management tools


Cloud Technologies:

  • Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse (preferred)

  • Cloud security best practices and access management


Data Management & Governance:

  • Data lineage, data quality tools, security policies compliant with regulations

DevOps & Automation:

  • CI/CD pipelines using Azure DevOps, GitLab, or Jenkins

  • Infrastructure as Code: Terraform, Azure Resource Manager (ARM) templates


Experience Requirements

  • 5+ years of experience designing, developing, and supporting large-scale data pipelines within cloud environments, preferably Azure

  • Proven expertise in Databricks and Lakehouse architecture in enterprise settings

  • Demonstrated experience integrating AI/ML workflows within data pipelines (preferred)

  • Sound knowledge of data governance, security, and compliance practices supporting enterprise standards

  • Experience supporting migration from legacy data platforms to cloud-native architectures


Day-to-Day Activities

  • Architect and develop scalable data workflows supporting enterprise analytics, reporting, and AI initiatives

  • Build, tune, and optimize Delta Lake and Lakehouse data solutions for performance and reliability

  • Automate data pipeline deployments and infrastructure using Asset Bundles and CI/CD tools

  • Manage data security, privacy, and compliance leveraging Unity Catalog and cloud security features

  • Troubleshoot and resolve production issues, analyze system bottlenecks, and optimize data workflows

  • Collaborate with data scientists, BI teams, and enterprise architects to refine infrastructure and data models

  • Document architecture, data lineage, and security controls for audit and compliance readiness

  • Support data migration, platform upgrades, and cloud infrastructure provisioning


Qualifications

  • Bachelor’s or Master’s degree in Data Engineering, Computer Science, or related fields

  • 5+ years supporting large-scale data pipelines, cloud data architectures, and analytics platforms

  • Certifications such as Azure Data Engineer Associate or equivalent are preferred

  • Hands-on experience supporting regulated and secure data environments in enterprise organizations


Professional Competencies

  • Strong analytical and troubleshooting skills for complex data workflows and pipelines

  • Excellent stakeholder communication and collaboration skills

  • Leadership qualities for mentoring team members and guiding best practices

  • Adaptability to evolving data technologies, security standards, and industry regulations

  • Results-oriented focus on operational excellence, security, and scalability in data environments

SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.


Experience Level
Senior Level

Job role
Work location: Pune - Hinjewadi (Ascendas), India
Department: Data Science & Analytics
Role / Category: DBA / Data warehousing
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 2 years

About company

Name: Synechron Technologies
Job posted by Synechron Technologies
