Senior Technical Consultant - Cloud Data Platform (Snowflake)

Blue Yonder Pvt Ltd

Coimbatore

Not disclosed

Work from Office

Full Time

Min. 6 years

Job Description

Sr. Technical Consultant - Cloud - Snowflake

Scope:

  • The Data Platform team partners with business and engineering groups to deliver scalable Snowflake solutions.

  • The L4 role focuses on execution: translating architectural patterns into highly optimized, secure, and reliable data pipelines and models for specific business domains.

  • The position involves automating ingestion, optimizing complex queries, and ensuring platform stability while guiding junior data engineers.

Technical Environment:

  • Snowflake Platform: Multi-Cluster Warehouses, Snowpipe, Tasks, Streams, Zero-Copy Cloning, Time Travel, Apache Iceberg tables.

  • Data Engineering & Scripting: Advanced SQL, Python (Snowpark), Java/Scala (UDFs/UDTFs), dbt (Data Build Tool).

  • Integrations & Orchestration: Apache Airflow, Fivetran, Kafka, Spark, Trino, external catalogs (AWS Glue, Polaris).

  • Governance & Security: Hierarchical RBAC, Dynamic Data Masking, Row Access Policies, Object Tagging, Secure Data Sharing.

  • Platform Enhancements: Snowpark Container Services, Snowflake Cortex (AI/ML), Search Optimization Service, Materialized Views.

  • DataOps/Agile: CI/CD pipelines, Git, GitHub Actions/GitLab, Terraform (Infrastructure-as-Code), Agile delivery.
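The DataOps tooling above (dbt, CI/CD, Git) revolves around deploying dependent models in the right order. As a rough, stdlib-only sketch of that idea, the snippet below topologically sorts a hypothetical model-dependency graph (the model names are illustrative, not from a real dbt project) using Python's `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each model maps to the models it depends on.
deps = {
    "stg_orders": set(),
    "stg_customers": set(),
    "dim_customer": {"stg_customers"},
    "fct_orders": {"stg_orders", "dim_customer"},
}

def deployment_order(graph):
    """Return models in an order that is safe to build (dependencies first)."""
    return list(TopologicalSorter(graph).static_order())

print(deployment_order(deps))
```

dbt computes this ordering from `ref()` calls automatically; the sketch only illustrates why a CI/CD pipeline builds staging models before marts.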

What you’ll do:

  • Design & Architect: Design robust dimensional data models and domain-specific data pipelines. Implement standard FinOps and security patterns defined by senior architects.

  • Develop & Deliver: Configure continuous data ingestion (Snowpipe/Streams), write modular transformations using dbt and Python (Snowpark), and build task orchestrations.

  • Guide & Govern: Review code and data models for team members, ensuring adherence to CI/CD standards and SQL best practices.

  • Operate & Optimize: Identify and rewrite bottleneck queries, optimize micro-partition clustering, and handle L3 technical escalations for pipeline failures.
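Much of the ingestion work described above (Snowpipe/Streams feeding dimension tables) boils down to an idempotent merge. The sketch below is a minimal pure-Python stand-in for the Snowflake MERGE a task would run against a dimension table; the table and column names are invented for illustration:

```python
# Type-1 upsert: update matched rows in place, insert unmatched ones.
# A simplified stand-in for `MERGE INTO dim_customer USING stream_rows ...`.

def merge_type1(dim, changes, key="customer_id"):
    """Merge `changes` into `dim`; both are lists of dicts keyed by `key`."""
    merged = {row[key]: dict(row) for row in dim}
    for change in changes:
        merged.setdefault(change[key], {}).update(change)
    return list(merged.values())

dim_customer = [{"customer_id": 1, "city": "Chennai"}]
stream_rows = [
    {"customer_id": 1, "city": "Coimbatore"},  # matched -> update
    {"customer_id": 2, "city": "Madurai"},     # unmatched -> insert
]
print(merge_type1(dim_customer, stream_rows))
```

Running the merge twice over the same stream rows yields the same table, which is the property that makes pipeline retries after an L3 escalation safe.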

What we are looking for:

  • Bachelor’s degree in Computer Science, Data Engineering, or a related technical field.

  • 6–8 years of IT/Data experience, with 3–5 years specifically in deep Snowflake development and pipeline architecture.

  • Strong expertise in Snowflake core architecture, caching layers, and warehouse sizing.

  • Deep proficiency in Advanced SQL, Python, and data modeling (Dimensional/Kimball).

  • Hands-on experience with dbt, Airflow, and CI/CD pipelines for database deployments.
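The warehouse-sizing expertise asked for above rests on simple credit arithmetic: for standard Snowflake warehouses, credit burn roughly doubles with each size step (X-Small = 1 credit/hour, Small = 2, and so on). A back-of-envelope sketch, assuming those published per-size rates (verify against your edition and region):

```python
# Credit burn doubles per warehouse size step for standard warehouses.
SIZES = ["X-Small", "Small", "Medium", "Large", "X-Large"]

def credits(size, hours, clusters=1):
    """Estimate credits consumed by `clusters` running at `size` for `hours`."""
    return (2 ** SIZES.index(size)) * hours * clusters

print(credits("Medium", 2))             # 4 credits/hour * 2 hours -> 8
print(credits("Small", 1, clusters=3))  # multi-cluster: 2 * 1 * 3 -> 6
```

This is why right-sizing matters: a Medium warehouse that finishes a job in a quarter of the time of an X-Small costs the same credits, but an oversized warehouse that sits partly idle does not.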

Our Values

If you want to know the heart of a company, take a look at its values. Ours unite us. They are what drive our success, and the success of our customers. Does your heart beat like ours? Find out here: Core Values

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Experience Level

Senior Level

Work location

IND - Coimbatore (708), India

Department

Data Science & Analytics

Role / Category

Business Intelligence & Analytics

Shift

Day Shift
