Data Engineer with Expertise in Ab-initio, Python, Spark & Cloud Data Platforms

Synechron Technologies

Pune

Salary: Not disclosed

Work from Office

Full Time

Min. 7 years

Job Details

Job Description

Job Summary
Synechron is seeking a highly experienced Ab-initio & Python Developer to oversee and develop scalable data pipeline solutions. This role is critical to advancing our organization’s data engineering capabilities by leading impact-driven projects that leverage Ab-initio, Python, and modern data processing tools. The incumbent will play a key role in designing, migrating, and optimizing large-scale data workflows to ensure efficient data management aligned with business objectives.

Software Requirements

Required Software Proficiency:

  • Ab-initio (including graphs, plans, PDL, DML) — extensive hands-on experience with migration to Python-based pipelines is essential

  • Python (version 3.8 or higher) — deep expertise and practical implementation experience

  • Pandas and PySpark — proficiency in large-scale data processing and transformation

  • SQL and data storage formats: Parquet, Delta — experience in optimization and management

  • Version control: Git — solid understanding of workflows, environments, and artifact management

Preferred Software Skills:

  • CI/CD tools: GitHub Actions, Jenkins, or equivalent — experience with automation, testing, and deployment pipelines

  • Containerization: Docker — extensive experience in containerizing applications and deploying to cloud data platforms

  • Cloud platforms: Azure, AWS, GCP — practical experience deploying data solutions in cloud environments

  • AI/ML tools (e.g., GitHub Copilot, code assistants) — application in daily development and code reviews

Overall Responsibilities

  • Lead the design, development, testing, and deployment of scalable data pipelines utilizing Ab-initio and Python

  • Drive migration efforts from legacy Ab-initio workflows to Python-based data processing services

  • Collaborate with stakeholders to understand data needs, translating them into efficient, well-documented solutions

  • Ensure data quality, security, and performance across platforms and processes

  • Foster best practices in code reuse, testing, and documentation; promote design thinking from discovery to production

  • Mentor team members on data engineering techniques, tools, and emerging best practices

  • Stay current with industry trends in data engineering, cloud computing, and automation to inform technical strategies

Technical Skills (By Category)

  • Programming Languages:

    • Essential: Python (deep expertise), Ab-initio (hands-on experience)

    • Preferred: Additional languages such as Java or Scala for broader data ecosystem integration

  • Databases/Data Management:

    • Extensive experience with data storage formats such as Parquet and Delta Lake

    • Proficiency in SQL tuning and data warehouse management

  • Cloud Technologies:

    • Experience deploying data pipelines on Azure, AWS, or GCP

    • Knowledge of cloud-native data services and infrastructure as code practices

  • Frameworks and Libraries:

    • Expertise with Pandas, PySpark, Spark SQL — proven ability to process large datasets efficiently

  • Development Tools and Methodologies:

    • Strong familiarity with CI/CD pipelines, Git workflows, and automated testing (pytest or equivalent)

    • Agile methodologies for project management and team collaboration

  • Security Protocols:

    • Understanding of data security, compliance standards, and encryption within cloud environments

Experience Requirements

  • Minimum of 7 years of professional experience in data engineering or related roles, with a record of impactful projects

  • Proven hands-on expertise with Ab-initio and Python in large-scale data environments

  • Experience in migrating workflows from legacy to modern pipelines is strongly preferred

  • Demonstrated success in leading impact-driven, cross-functional technical initiatives

  • Prior experience working within Agile teams and managing stakeholder expectations

Day-to-Day Activities

  • Develop, optimize, and deploy data workflows leveraging Ab-initio and Python tools

  • Strategize migration of legacy workflows into efficient Python-based processes

  • Collaborate with data scientists, analysts, and other stakeholders to meet data requirements

  • Conduct code and design reviews to ensure adherence to high standards of quality and security

  • Resolve production and technical issues through troubleshooting and root cause analysis

  • Document solutions thoroughly and present findings clearly to technical and non-technical audiences

  • Stay updated on industry developments to continuously enhance engineering practices

Qualifications

  • Bachelor’s degree in Computer Science, Data Engineering, Engineering, or related field; Master’s degree preferred

  • Relevant certifications in Python, cloud platforms, or data tools (e.g., AWS Certified Data Analytics, GCP Professional Data Engineer) are advantageous

  • Ongoing commitment to professional learning, especially in new data processing technologies and automation tools

Professional Competencies

  • Strong analytical skills with the ability to troubleshoot complex data issues effectively

  • Leadership and mentorship abilities to guide technical teams and foster knowledge sharing

  • Excellent communication skills to articulate technical concepts and collaborate with diverse stakeholders

  • Adaptability to evolving technologies and project demands with a continuous learning approach

  • Strategic thinking from discovery to deployment, emphasizing sustainable, scalable solutions

  • Effective time management to prioritize tasks and meet project deadlines

SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disability statuses to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.

Experience Level

Senior Level

Work location

Pune - Hinjewadi (Ascendas), India

Department

Data Science & Analytics

Role / Category

DBA / Data warehousing

Shift

Day Shift
