Snowflake Manager
KPMG India Services LLP
Bengaluru/Bangalore
Salary: Not disclosed
Job Details
Job Description
Manager - Snowflake
Description
About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
Lighthouse Overview
The Lighthouse practice focuses on delivering data, analytics, cloud and engineering solutions to its clients.
Responsibilities
The job description is provided below for your reference:
- 7+ years of experience in data warehouse and data lake implementations as a Data Engineer/ETL Developer.
- Deep understanding of Kafka architecture and concepts (topics, partitions, clustering, brokers), and related tools (Kafka Connect, Kafka Streams, Schema Registry).
- Strong experience with relevant AWS services for data streaming and infrastructure management (e.g., MSK, EC2, S3, Lambda, IAM, VPC).
- Expertise in one or more programming languages commonly used with Kafka, such as Java, Scala, or Python.
- Strong knowledge of and experience with data lake and data warehouse architecture and concepts in the cloud.
- Expert in data ingestion (ETL and ELT), dimensional modeling, data quality, and data validation.
- Strong working experience in Snowflake, including warehouses, stored procedures, streams, Snowpipe, tasks, stages, storage integrations, and ingestion frameworks and tools.
- Ability to develop ELT/ETL pipelines to move data to and from the Snowflake data store using a combination of Python, advanced SQL, and Snowflake SnowSQL (a minimal loading sketch follows this list).
- Strong hands-on experience with requirements gathering and analysis, coding, testing, implementation, maintenance, and review.
- Extensive development experience using advanced SQL, Python, and other scripting languages (Perl, shell, etc.).
- Experience in working with core data engineering services in AWS, MS Azure, or other cloud providers.
- Good hands-on experience with workflow management tools such as Airflow and Control-M (an illustrative Airflow DAG follows this list).
- Solid experience with application support and resolving production issues.
- Strong customer support skills and adaptability to changing business needs.
- Good project management, communication, and interpersonal skills, with experience in Agile methodologies (Kanban and Scrum).
- Lead project efforts and perform proof-of-concept (POC) analysis of technology solutions.
- Experience with containerization technologies (e.g., Docker, Kubernetes)
- Experience with DevOps, Git, CI/CD
- Experience directly managing a team in an onsite/offshore development environment.
- Experience with ServiceNow data is preferred.
- Strongly preferred: 3 to 5 years of experience working with Kafka platforms in the areas below.
- Design, implement, and maintain Kafka producers, consumers, and stream processing applications using languages like Java, Scala, or Python (an illustrative producer/consumer sketch follows this list).
- Deploy, manage, and optimize Kafka clusters and related applications on AWS services such as Amazon Managed Streaming for Apache Kafka (MSK), EC2, S3, Lambda, and CloudWatch.
- Develop and manage end-to-end data pipelines involving Kafka Connect, Kafka Streams, and other data integration tools.
- Ensure the performance, scalability, and reliability of Kafka-based systems, including cluster tuning, monitoring, and troubleshooting.
- Implement security best practices for Kafka on AWS, including authentication, authorization (ACLs), and data encryption.
- Manage Kafka Schema Registry for data serialization and schema evolution.
- Develop real-time stream processing applications using Apache Spark Streaming, Kafka Streams, or AWS Lambda (an illustrative Spark Structured Streaming sketch follows this list).
- Implement complex event processing (CEP) patterns for real-time analytics.
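To make the Snowflake ingestion expectations above concrete, here is a minimal sketch of a batch load through an internal stage using the snowflake-connector-python package. The account, credentials, warehouse, stage, file, and table names are all illustrative placeholders, not details from this role.

```python
# Minimal sketch: load a local CSV into Snowflake via an internal stage
# and COPY INTO, using snowflake-connector-python.
# All connection parameters and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder credentials
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # PUT uploads the local file to a named internal stage.
    cur.execute("PUT file:///tmp/orders.csv @raw_stage AUTO_COMPRESS=TRUE")
    # COPY INTO bulk-loads the staged, compressed file into the table.
    cur.execute("""
        COPY INTO orders
        FROM @raw_stage/orders.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
finally:
    conn.close()
```

In production this pattern is typically automated with Snowpipe or Snowflake tasks rather than ad-hoc COPY statements.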
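For the workflow orchestration requirement, a minimal Airflow DAG might look like the following; the dag_id, schedule, and callable body are assumed placeholders.

```python
# Minimal sketch of an Airflow DAG that runs one daily load task.
# dag_id, schedule, and the callable body are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_snowflake():
    # Placeholder for the real ELT step (e.g., the COPY INTO shown above).
    print("loading batch into Snowflake")


with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
    catchup=False,
) as dag:
    PythonOperator(task_id="load", python_callable=load_to_snowflake)
```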
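As a sketch of the Kafka producer/consumer work described above, the snippet below uses the confluent-kafka Python client. The broker address (in practice an MSK bootstrap string), topic, and consumer group id are placeholders.

```python
# Minimal sketch of a Kafka producer and consumer with confluent-kafka.
# Broker, topic, and group id are placeholders.
from confluent_kafka import Consumer, Producer

BROKERS = "localhost:9092"  # placeholder; an MSK bootstrap endpoint on AWS
TOPIC = "orders"

# Produce one message and wait for delivery.
producer = Producer({"bootstrap.servers": BROKERS})
producer.produce(TOPIC, key="order-1", value='{"amount": 42}')
producer.flush()

# Consume messages from the same topic.
consumer = Consumer({
    "bootstrap.servers": BROKERS,
    "group.id": "demo-group",          # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        print(msg.key(), msg.value())
finally:
    consumer.close()
```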
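And for the real-time stream processing item, a minimal Spark Structured Streaming job reading from Kafka could look like this; broker and topic names are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Minimal sketch: read a Kafka topic with Spark Structured Streaming and
# echo parsed values to the console. Requires the spark-sql-kafka
# connector package; broker and topic names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder
    .option("subscribe", "orders")
    .load()
)

# Kafka rows carry binary key/value columns; cast the value to a string.
parsed = events.select(col("value").cast("string").alias("payload"))

query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```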
Qualifications
Bachelor’s degree in Computer Science, Information Systems, or a related field.
Job role
Work location
EGL Pebble Beach - KPMG, Bangalore, Karnataka, India
Department
IT & Information Security
Role / Category
IT Security
Employment type
Full Time
Shift
Day Shift
Job requirements
Experience
Min. 5 years
About company
Name
KPMG India Services LLP
Job posted by KPMG India Services LLP
Apply on company website