Azure Data Engineer

KPMG India Services LLP

Hyderabad

Not disclosed

Work from Office

Full Time

Min. 2 years

Job Details

Job Description

Azure Data Engineer - Associate Consultant - Hyderabad

Specific Job Title: Data Engineer

Cost Center: Managed Services

Area of interest: Data Integration

 

Data Engineer: Azure

Level: Associate Consultant

Location: Hyderabad

Key Technologies:

  • 2–4 years of experience in data engineering or a related field.
  • Proficiency in Azure Data Factory (ADF) for building pipelines and workflows.
  • Hands-on experience with Azure Data Lake Storage (ADLS) for managing data lakes.
  • Strong working knowledge of Databricks and PySpark.
  • Proficiency in Power BI development, including report and dashboard creation.
  • Azure Service Fabric knowledge is a plus.
  • Basic understanding of Python for scripting and automation.
  • Experience in designing and maintaining ETL/ELT workflows.
  • Familiarity with CI/CD pipelines and version control systems such as Git and Azure DevOps.
  • Good problem-solving and communication skills.

Key Responsibilities:

  • Develop and maintain scalable data pipelines using Azure Data Factory (ADF) and Databricks.
  • Perform data extraction, transformation, and loading (ETL/ELT) from various sources into Azure Data Lake Storage (ADLS).
  • Implement data processing workflows using Databricks and PySpark for structured and unstructured data.
  • Design and develop Power BI reports and dashboards to meet business stakeholders’ needs.
  • Troubleshoot and resolve issues in Power BI reports.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Ensure data accuracy, consistency, and security across all stages of the data lifecycle.
  • Write clean and efficient Python scripts for data manipulation and workflow automation.
  • Monitor and optimize pipeline performance, and troubleshoot issues as they arise.
  • Stay updated on Azure and data engineering best practices to recommend and implement improvements.

Certifications:

PL-300: Power BI Data Analyst Associate
DP-700: Fabric Data Engineer Associate (Optional)

 

Roles:

  • Client Support: Provide expert support and troubleshooting for customers’ applications, ETL processes, and data integrations, resolving issues in a timely manner to ensure minimal disruption to client operations.
  • System Monitoring: Monitor client environments for performance, stability, and security, implementing proactive measures to prevent potential issues.
  • Configuration and Optimization: Assist clients with the configuration and optimization of their customers’ applications and integration systems to align with business requirements and improve efficiency.
  • Documentation and Reporting: Maintain accurate documentation of client environments, issues resolved, and changes made. Provide regular reports to clients on system performance and areas for improvement.
  • Training and Knowledge Transfer: Deliver training sessions and knowledge transfer to client teams, empowering them to effectively use and manage their customers’ applications and integration systems.

Keywords:

Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Databricks, PySpark, Azure Fabric, Python, CI/CD, Power BI, PL-300, DP-700
Qualification

  • Any bachelor’s or master’s degree



Work Location

  • Hyderabad


Experience Level

Mid Level

Job role

Work location

Hyderabad, Telangana, India

Department

Data Science & Analytics

Role / Category

DBA / Data warehousing

Employment type

Full Time

Shift

Day Shift

Job requirements

Experience

Min. 2 years

About company

Name

KPMG India Services LLP

Job posted by KPMG India Services LLP

Apply on company website