Senior Data Engineer
EXL SERVICES
Job Description
Assistant Vice President
Senior Data Engineer (T-SQL, Snowflake, DBT, Prefect, CI/CD, DevOps)
Position Overview:
As a Senior Data Engineer at EXL, you will leverage your expertise in T-SQL, Snowflake, DBT, and Prefect to design and implement data pipelines and workflows that power critical business insights. You will play a key role in the replication of stored procedures from SQL Server to Snowflake and will also be responsible for integrating CI/CD practices using Jenkins and GitLab. Strong communication skills and the ability to collaborate with cross-functional teams will be key to success in this role. If you have a passion for DevOps and data engineering, this is the perfect opportunity for you.
Key Responsibilities:
- Data Pipeline Development & Optimization: Design, develop, and maintain robust, scalable, and efficient data pipelines to move, transform, and load data into Snowflake from various data sources. Ensure data pipelines are optimized for performance and cost.
- Stored Procedure Replication: Leverage your expertise in T-SQL to replicate complex stored procedures from SQL Server to Snowflake, ensuring that the replicated logic works seamlessly and performs at scale in the cloud environment (a sketch of this kind of translation follows this list).
- DBT (Data Build Tool) Management: Use DBT to create and manage data transformations within Snowflake. Build reusable, efficient, and reliable data models and transformations that are aligned with business needs.
- Workflow Automation with Prefect: Implement and manage automated workflows using the Prefect scheduler to orchestrate data pipelines and ETL/ELT processes. Ensure reliability, monitoring, and alerting for critical data workflows (see the orchestration sketch after this list).
- CI/CD and DevOps Integration: Design, develop, and maintain CI/CD pipelines for data projects using tools such as Jenkins, GitLab, or similar. Integrate these pipelines into the development lifecycle to automate testing, deployment, and version control for data workflows.
- Collaboration & Communication: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver data solutions that meet organizational goals. Communicate complex technical concepts to non-technical stakeholders.
- Code Versioning & Management: Use GitLab or similar version control systems to manage the development and deployment of data engineering solutions. Foster best practices in collaborative code management, branching, and version control.
- Performance Tuning: Continuously monitor, troubleshoot, and optimize data pipelines for performance, ensuring they scale efficiently with large data volumes and meet SLAs.
- Data Quality & Documentation: Ensure data integrity and quality across the pipeline. Maintain clear, detailed documentation for code, architecture, and processes to enable knowledge sharing within the team.
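To make the stored-procedure replication work concrete, here is a minimal sketch of how a T-SQL upsert procedure might be recreated in Snowflake Scripting and deployed through the Python connector. The procedure name (update_daily_sales), the table names, and the credentials are all hypothetical; the snowflake-connector-python package is assumed.

import snowflake.connector

# Snowflake Scripting equivalent of a hypothetical T-SQL procedure
# (CREATE PROCEDURE dbo.update_daily_sales @run_date DATE AS ...).
SNOWFLAKE_PROC = """
CREATE OR REPLACE PROCEDURE update_daily_sales(run_date DATE)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
    -- A single MERGE replaces the UPDATE/INSERT pair typical of the T-SQL original
    MERGE INTO daily_sales AS t
    USING (
        SELECT order_date, SUM(amount) AS total
        FROM raw_orders
        WHERE order_date = :run_date
        GROUP BY order_date
    ) AS s
    ON t.order_date = s.order_date
    WHEN MATCHED THEN UPDATE SET t.total = s.total
    WHEN NOT MATCHED THEN INSERT (order_date, total) VALUES (s.order_date, s.total);
    RETURN 'ok';
END;
$$
"""

def deploy_procedure() -> None:
    # Credentials are placeholders; in practice they would come from a secrets store.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="my_wh", database="my_db", schema="my_schema",
    )
    try:
        conn.cursor().execute(SNOWFLAKE_PROC)
    finally:
        conn.close()

Note that Snowflake Scripting differs from T-SQL in details such as error handling (EXCEPTION blocks rather than TRY/CATCH), so replication of this kind is a translation exercise rather than a copy-paste.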
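Similarly, below is a minimal sketch of the Prefect orchestration described above, assuming Prefect 2.x and the dbt CLI on the path. The flow, task, and model names are illustrative, and a dedicated integration such as prefect-dbt could replace the subprocess call in practice.

import subprocess
from prefect import flow, task

@task(retries=3, retry_delay_seconds=60)
def load_raw_orders() -> None:
    # Placeholder extract/load step (e.g. a COPY INTO issued via the connector)
    print("loading raw_orders into Snowflake")

@task
def run_dbt_models() -> None:
    # Build the DBT transformation layer on top of the freshly loaded data
    subprocess.run(["dbt", "run", "--select", "marts"], check=True)

@flow(name="daily-elt", log_prints=True)
def daily_elt() -> None:
    load_raw_orders()
    run_dbt_models()

if __name__ == "__main__":
    daily_elt()

Task retries plus Prefect's built-in state tracking provide the reliability and alerting hooks mentioned above; a schedule would be attached by registering the flow as a deployment.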
Qualifications:
- Experience:
- 8+ years of hands-on experience as a Data Engineer, with deep expertise in T-SQL, Snowflake, DBT, Prefect, and CI/CD practices.
- Proven experience replicating SQL Server stored procedures to Snowflake, ensuring optimized performance and functional accuracy.
- Strong experience with CI/CD pipelines using Jenkins, GitLab, or similar tools.
- Solid understanding of DevOps principles and the ability to integrate CI/CD into data engineering workflows.
- Technical Skills:
- Advanced proficiency in T-SQL for writing complex queries, stored procedures, and performance optimization.
- Experience with Snowflake, including data modeling, schema design, and performance tuning.
- Experience with DBT for creating and managing data transformations.
- Familiarity with Prefect for workflow orchestration and data pipeline scheduling.
- Strong background in CI/CD and DevOps practices using Jenkins, GitLab, or similar platforms (a sketch of the kind of check such pipelines run appears after the Qualifications list).
- Python experience, particularly for data processing or orchestration tasks, is preferred but not mandatory.
- Database & ETL Skills:
- Extensive experience with SQL Server and Snowflake, with the ability to replicate, optimize, and integrate stored procedures and data pipelines.
- Proficient in building, deploying, and maintaining ETL/ELT pipelines for large-scale data processing.
- Collaboration & Communication:
- Strong communication and interpersonal skills, with the ability to work effectively with cross-functional teams and convey complex technical information clearly.
- Proven ability to manage multiple projects simultaneously, with a focus on quality, timeliness, and business value.
- Education:
- A bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field is required.
- Advanced certifications or training in data engineering, DevOps, or cloud technologies is a plus.
- Soft Skills:
- Strong problem-solving abilities and an analytical mindset.
- Ability to work in a fast-paced, collaborative, and evolving environment.
- A self-starter with strong attention to detail and the ability to work both independently and in teams.
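As a concrete, hypothetical illustration of the CI/CD and data-quality expectations above, the sketch below shows the kind of post-deployment smoke test a Jenkins or GitLab pipeline might run with pytest. The table name, credentials, and check itself are assumptions, not a prescribed implementation.

import snowflake.connector

def row_count(conn, table: str) -> int:
    # Issue a simple COUNT(*) against the target table
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

def test_daily_sales_is_populated():
    # A smoke test a CI job can run after each deployment of the pipeline
    conn = snowflake.connector.connect(
        account="my_account", user="ci_user", password="***",
        warehouse="ci_wh", database="my_db", schema="my_schema",
    )
    try:
        assert row_count(conn, "daily_sales") > 0, "daily_sales is empty"
    finally:
        conn.close()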
Job role
Work location: Gurgaon
Department: Data Science & Analytics
Role / Category: Data Science & Machine Learning
Employment type: Full Time
Shift: Day Shift
Job requirements
Experience: Min. 8 years
About company
Name: EXL SERVICES