Synechron Technologies

Associate Specialist - Python Data Engineer

Pune
Not disclosed
Work from Office
Full Time
Min. 10 years

Job Description

Data Engineer (Python, PySpark, Azure Databricks) | Cloud Data Pipelines, Migration & Governance

Job Summary
Synechron is seeking a skilled Data Engineer with expertise in cloud platforms, large-scale data pipelines, and data security to support our enterprise analytics and data migration initiatives. This role involves designing, developing, and managing scalable data solutions leveraging Python, PySpark, and Azure Databricks. The successful candidate will enable data-driven decision-making, ensure compliance with industry standards, and contribute to our digital transformation roadmap across regulated industries.

Software Requirements

Required Software Proficiency:

  • Python (version 3.7 or higher) — extensive hands-on experience in developing automation scripts and data processing solutions

  • PySpark — demonstrated ability in processing large data volumes on Spark clusters

  • Azure Databricks — expertise in developing and managing data pipelines within the Databricks environment

  • SQL — strong query writing, data validation, and performance tuning capabilities in databases like SQL Server, PostgreSQL, or Oracle

  • Azure Data Factory — experience in orchestrating and automating data workflows via Azure services

  • ADLS Gen2 — familiarity with large-scale data storage solutions supporting analytics projects

  • Azure Synapse — understanding of data warehousing and integration in Azure ecosystem

  • Azure Key Vault — knowledge of securing sensitive data and managing secrets

Preferred Software Skills:

  • Databricks Workflows — experience in creating automated, scheduled data workflows in Databricks

  • REST APIs — skills in integrating external data sources and services via API calls

  • Unity Catalog — familiarity with data cataloging, data governance, and metadata management in Databricks

  • Docker — experience in containerizing data pipelines for deployment and scalability

  • Power BI / Tableau — basic experience in creating dashboards and visualizing data insights
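As a simple illustration of the SQL data-validation skill listed above, the following sketch runs a completeness check with Python's built-in sqlite3 standing in for SQL Server, PostgreSQL, or Oracle (the table and column names are hypothetical):

```python
import sqlite3

# In-memory database standing in for an enterprise RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO customers (email, country) VALUES (?, ?)",
    [("a@example.com", "IN"), (None, "IN"), ("c@example.com", None)],
)

# Data-validation query: count rows failing basic completeness checks.
missing_email, missing_country = conn.execute(
    "SELECT SUM(email IS NULL), SUM(country IS NULL) FROM customers"
).fetchone()
print(missing_email, missing_country)  # 1 1
```

The same pattern scales up: in production the validation query would run against the source and target of a migration, and the counts would feed a reconciliation report.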

Overall Responsibilities

  • Design, develop, and optimize scalable ETL/ELT data pipelines supporting enterprise analytics, migration, and compliance goals

  • Build scalable and resilient data architectures within cloud environments using Azure Data Lake, Databricks, and Synapse Analytics

  • Collaborate closely with data scientists, business analysts, and application teams to translate data requirements into technical solutions

  • Conduct data validation, reconciliation, and security assessments to uphold data governance and compliance standards like GDPR and HIPAA

  • Lead data migration efforts, supporting infrastructure upgrades and cloud adoption initiatives

  • Implement and maintain data security policies, encryption standards, and access controls across platforms

  • Monitor pipeline performance, troubleshoot issues, and optimize resource utilization for cost-effective scalability

  • Automate workflows and infrastructure provisioning using Terraform, Azure DevOps, or similar tools

  • Document data architecture, data lineage, and operational procedures for ongoing support and compliance audits

Technical Skills (By Category)

  • Languages & Data Tools (Essential):

    • Python (3.7+) — scripting and automation for data processing

    • PySpark — large-scale data processing in Spark environments

    • SQL — query development, performance optimization, and validation

  • Databases & Data Management:

    • Relational: SQL Server, PostgreSQL, Oracle

    • Data lakes: Azure Data Lake Storage Gen2

    • Data warehousing: Azure Synapse Analytics

  • Cloud & Infrastructure:

    • Azure Data Factory, Databricks, Synapse, ADLS Gen2 — primary cloud data platform stack

    • Optional: Azure DevOps, Terraform for automation and provisioning

  • Frameworks & Libraries:

    • PySpark, Delta Lake, Dataflow integration in Databricks

  • Security & Governance:

    • Data encryption, role-based access control, GDPR/HIPAA compliance practices
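The role-based access control practice named above can be sketched as a role-to-permission lookup (a minimal illustration; the role and permission names are hypothetical, not actual platform policy):

```python
# Hypothetical role-to-permission mapping for a data platform.
ROLE_PERMISSIONS = {
    "data_engineer": {"read_raw", "write_curated", "run_pipeline"},
    "analyst": {"read_curated"},
    "auditor": {"read_curated", "read_audit_log"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Return True if the given role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("analyst", "write_curated"))     # False
print(is_allowed("data_engineer", "run_pipeline"))  # True
```

In Azure Databricks, this kind of mapping is typically enforced declaratively (e.g. via Unity Catalog grants or Azure RBAC role assignments) rather than in application code.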

Experience Requirements

  • 6+ years of experience supporting large-scale, cloud-based data pipelines and architectures

  • Proven ability to design and optimize ETL/ELT workflows supporting business, compliance, and migration needs

  • Demonstrated expertise in data security, privacy policies, and governance in regulated industries

  • Experience with cloud migration projects, integrating multiple data sources, and automating processes

  • Strong troubleshooting skills and ability to improve performance and reliability of data systems

Day-to-Day Activities

  • Develop, test, and optimize data pipelines supporting analytics, reporting, and migration

  • Engage with business and technical teams to translate requirements into scalable data architecture

  • Monitor system performance, troubleshoot failures, and implement performance improvements

  • Conduct data validation, security assessments, and compliance checks aligned with industry regulations

  • Automate workflows and infrastructure setup using Terraform and Azure DevOps pipelines

  • Document data processes, architecture, security policies, and procedures

  • Support ongoing data migration and cloud infrastructure upgrades

  • Collaborate regularly with data scientists, analysts, and platform teams for continuous improvement

Qualifications

  • Bachelor’s or Master’s degree in Data Engineering, Computer Science, or related fields

  • 6+ years supporting enterprise data environments with cloud-native tools and big data frameworks

  • Relevant certifications such as Azure Data Engineer or GCP Professional Data Engineer are a plus

  • Experience supporting secure, compliant, and high-performance data platforms in regulated environments

Professional Competencies

  • Strong analytical and troubleshooting skills for complex data workflows

  • Leadership qualities for guiding junior team members and supporting best practices

  • Effective communication skills for stakeholder engagement and cross-team collaboration

  • Adaptability to evolving cloud technologies and industry standards

  • Innovative mindset with a focus on automation, data quality, and operational security

  • Time management and organizational skills to prioritize tasks effectively in dynamic environments

SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality and diversity in an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Experience Level: Senior Level

Work location: Pune - Hinjewadi (Ascendas), India
Department: Data Science & Analytics
Role / Category: DBA / Data warehousing
Employment type: Full Time
Shift: Day Shift

Job requirements

Experience: Min. 10 years

About company

Name: Synechron Technologies
Job posted by Synechron Technologies

