Java Kafka Developer

KPMG India Services LLP

Bengaluru/Bangalore

Not disclosed

Work from Office

Full Time

Min. 3 years

Job Details

Job Description

DEQ :: CON - Java Kafka Developer

Description

About KPMG in India

KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada.

KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Responsibilities

The Kafka Engineer is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions. This role involves configuring, deploying, and monitoring Kafka clusters to ensure the high availability and scalability of data streaming services. The Kafka Engineer collaborates with cross-functional teams to integrate Kafka into various applications and ensures optimal performance and reliability of the data infrastructure.

Kafka Engineers play a critical role in driving data-driven decision-making and enabling real-time analytics, contributing directly to the company’s agility, operational efficiency, and ability to respond quickly to market changes. Their work supports key business initiatives by ensuring that data flows seamlessly across the organization, empowering teams with timely insights and enhancing the customer experience.

Typical duties and responsibilities

  1. Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
  2. Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
  3. Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
  4. Collaborate with development teams to integrate Kafka into applications and services.
  5. Develop and maintain Kafka connectors such as JDBC, MongoDB, and S3 connectors, along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights.
  6. Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance.
  7. Optimize Kafka configurations for performance, reliability, and scalability.
  8. Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
  9. Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
  10. Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
  11. Stay current with the latest Kafka features, releases, and industry best practices to continuously improve the data infrastructure.
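The configuration and reliability work described in items 2, 3 and 7 typically starts with producer settings. A minimal sketch using only the JDK's `Properties` class with standard Kafka client configuration keys; the broker addresses and specific values are illustrative assumptions, not a definitive tuning recommendation:

```java
import java.util.Properties;

public class ProducerConfigSketch {

    // Build a producer configuration tuned for reliability.
    // The keys are standard Kafka client config names; the values
    // (including the bootstrap addresses) are illustrative assumptions.
    static Properties reliableProducerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker1:9092,broker2:9092");
        // Require acknowledgement from all in-sync replicas before a write succeeds.
        props.setProperty("acks", "all");
        // Retry transient failures; idempotence prevents duplicates on retry.
        props.setProperty("retries", "5");
        props.setProperty("enable.idempotence", "true");
        // Serialize keys and values as UTF-8 strings.
        props.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        Properties p = reliableProducerProps();
        System.out.println("acks=" + p.getProperty("acks"));
    }
}
```

These properties would be passed to a `KafkaProducer` constructor in a real application; `acks=all` together with idempotence is the usual starting point when durability matters more than raw throughput.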

Required skills

  • 3-5 years of experience working with Apache Kafka in a production environment.
  • Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
  • Experience with Kafka security, including SSL, SASL, and ACLs.
  • Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
  • Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink.
  • Solid understanding of distributed systems, data streaming, and messaging patterns.
  • Proficiency in Java, Scala, or Python for Kafka-related development tasks.
  • Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
  • Experience with tools like ZooKeeper, Schema Registry, and Kafka Connect.
  • Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
  • Excellent communication and collaboration skills to work effectively with cross-functional teams and stakeholders.
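The broker/topic/partition knowledge called out above boils down to one core idea: every record with the same non-null key is routed to the same partition, which preserves per-key ordering. A simplified sketch of that routing; note that Kafka's actual default partitioner hashes keys with murmur2, while this illustration uses `String.hashCode` purely for brevity:

```java
public class PartitionSketch {

    // Simplified keyed-record routing: the same key always maps to the
    // same partition, so per-key ordering is preserved across the topic.
    // Kafka's real default partitioner uses murmur2 hashing; hashCode is
    // used here purely for illustration.
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the result is non-negative, then take modulo.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 6;
        // Same key, same partition on every call:
        System.out.println(partitionFor("order-42", partitions)
                == partitionFor("order-42", partitions)); // true
    }
}
```

This is why choosing a good key (e.g. an order ID rather than a timestamp) matters: it determines both ordering guarantees and how evenly load spreads across partitions.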

Equal employment opportunity information

KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

Qualifications 

BE/BTech/MCA/MBA or an equivalent education

Job role

Work location

EGL Pebble Beach - KPMG, Bangalore, Karnataka, India

Department

Software Engineering

Role / Category

Software Backend Development

Employment type

Full Time

Shift

Day Shift

Job requirements

Experience

Min. 3 years

About company

Name

KPMG India Services LLP

Job posted by KPMG India Services LLP
