Big Data Engineer
Nagarro, Noida
Job Description
Job Title: Data Engineer
Job Summary:
We are looking for a Senior Data Engineer with expertise in building scalable data pipelines, designing data frameworks, and working with modern data technologies. The ideal candidate should have strong experience with Scala, Apache Spark, and streaming technologies such as Kafka, Kinesis, or Flink.
Key Responsibilities:
- Design, develop, and maintain data pipelines to support business and analytics needs.
- Build data frameworks for unit testing, data lineage tracking, and automation.
- Optimize and enhance existing data processing workflows for scalability and performance.
- Work with Apache Spark to process large-scale datasets efficiently.
- Implement real-time data streaming solutions using Kafka, Kinesis, or Flink (an illustrative sketch of such a pipeline follows this list).
- Collaborate with cross-functional teams, including Data Scientists, Analysts, and Software Engineers, to drive data-driven decision-making.
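The responsibilities above center on Spark-based batch and streaming work in Scala. As a point of reference, here is a minimal sketch of such a pipeline: a Spark Structured Streaming job that reads events from a Kafka topic, applies a simple transformation, and writes Parquet output. The broker address, topic name, and output/checkpoint paths (localhost:9092, events, /tmp/events-out, /tmp/events-checkpoint) are placeholder values, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-pipeline")
      .getOrCreate()

    // Read raw events from Kafka (placeholder broker and topic).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers key/value as binary; cast the value to a string
    // and tag each record with its ingestion timestamp.
    val events = raw
      .select(col("value").cast("string").as("payload"))
      .withColumn("ingested_at", current_timestamp())

    // Write the stream to Parquet with checkpointing so the job can
    // recover its offsets after a restart.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/tmp/events-out")
      .option("checkpointLocation", "/tmp/events-checkpoint")
      .start()

    query.awaitTermination()
  }
}
```

The checkpointLocation option is what lets the job resume from its last committed Kafka offsets after a restart, which is the standard way to make such a pipeline fault-tolerant; running it requires the spark-sql-kafka connector on the classpath.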
Required Qualifications:
- 5+ years of experience in building data pipelines.
- 5+ years of experience in developing data frameworks for unit testing, data lineage tracking, and automation (see the test sketch after this list).
- Proficiency in Scala is required.
- Strong working knowledge of Apache Spark.
- Experience with streaming technologies such as Kafka, Kinesis, or Flink.
- Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus.
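As a point of reference for the unit-testing requirement above, here is a minimal sketch of a pipeline test written with ScalaTest against a local SparkSession. The transformation under test (addLoadDate) and the sample data are hypothetical examples, not details from this posting.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._
import org.scalatest.funsuite.AnyFunSuite

object Transformations {
  // Hypothetical transformation under test: tag each record with a load date.
  def addLoadDate(df: DataFrame): DataFrame =
    df.withColumn("load_date", current_date())
}

class TransformationsSpec extends AnyFunSuite {
  // Local SparkSession so the test runs without a cluster.
  private lazy val spark = SparkSession.builder()
    .master("local[2]")
    .appName("pipeline-tests")
    .getOrCreate()

  test("addLoadDate adds a load_date column and keeps every row") {
    import spark.implicits._
    val input = Seq(("a", 1), ("b", 2)).toDF("id", "amount")

    val result = Transformations.addLoadDate(input)

    assert(result.columns.contains("load_date"))
    assert(result.count() == input.count())
  }
}
```

Running transformations against a local[2] session keeps such tests fast and independent of any cluster.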
Preferred Qualifications:
- Experience with big data processing frameworks such as Hadoop or Databricks.
- Strong understanding of distributed computing principles.
- Knowledge of data modeling, ETL development, and performance tuning.