AWS Data Engineer
Job Description
Skills: AWS Data Engineer, ETL workflows, ETL pipelines, PySpark scripts, Airflow DAGs, AWS EMR, Spark SQL
Job Description: AWS Data Engineer
Job Sector: IT, Software, Internet, Analytics
Interviewer / Hiring Manager Comments
Immediate joiner preferred; LinkedIn profile is a must.
Job Type: Permanent
Working: Hybrid (Office + Home)
Country: India
Location: Pune, Chennai, Trichy, Greater Noida, Bengaluru
Experience: 5 - 15 Years
Candidate Salary Range / End Rate: Rs. 15,00,000 - 22,00,000
AWS Data Engineer with a minimum of 5 to 7 years of experience.
Collaborate with business analysts to understand and gather requirements for existing or new ETL pipelines.
Connect with stakeholders daily to discuss project progress and updates.
Work within an Agile process to deliver projects in a timely and efficient manner.
Design and develop Airflow DAGs to schedule and manage ETL workflows.
Transform SQL queries into Spark SQL code for ETL pipelines.
Develop custom Python functions to handle data quality and validation.
Write PySpark scripts to process data and perform transformations.
Perform data validation and ensure data accuracy and completeness by creating automated tests and implementing data validation checks.
Run Spark jobs on AWS EMR clusters using Airflow (see the illustrative sketches after this list).
Implement best practices for data engineering, including data modeling, data warehousing, and data pipeline architecture.
Collaborate with other members of the data engineering team to improve processes and implement new technologies.
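For illustration only, a minimal sketch of the kind of Airflow DAG this role involves: submitting a PySpark step to an existing EMR cluster and waiting for it to complete. The cluster ID, S3 script path, connection ID, and job names are placeholders, and the sketch assumes the apache-airflow-providers-amazon package is installed.

```python
# Illustrative sketch only: schedule a PySpark job on an existing EMR cluster.
# Cluster ID, S3 paths, and connection IDs below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

SPARK_STEP = [
    {
        "Name": "daily_sales_etl",  # hypothetical step name
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit",
                "--deploy-mode", "cluster",
                "s3://my-bucket/scripts/sales_etl.py",  # placeholder script path
            ],
        },
    }
]

with DAG(
    dag_id="emr_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
    catchup=False,
) as dag:
    # Submit the PySpark step to a running EMR cluster (placeholder cluster ID).
    add_step = EmrAddStepsOperator(
        task_id="add_spark_step",
        job_flow_id="j-XXXXXXXXXXXXX",
        aws_conn_id="aws_default",
        steps=SPARK_STEP,
    )

    # Block until the submitted step succeeds or fails.
    watch_step = EmrStepSensor(
        task_id="watch_spark_step",
        job_flow_id="j-XXXXXXXXXXXXX",
        step_id="{{ task_instance.xcom_pull(task_ids='add_spark_step')[0] }}",
        aws_conn_id="aws_default",
    )

    add_step >> watch_step
```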
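Similarly, a minimal PySpark sketch of the SQL-to-Spark-SQL and data-validation duties: register a view, run a Spark SQL transformation, and apply a simple completeness check before writing. Table names, columns, and S3 paths are placeholders, not part of the actual project.

```python
# Illustrative sketch only: Spark SQL transformation plus a basic data-quality gate.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

# Read raw data (placeholder path and format).
orders = spark.read.parquet("s3://my-bucket/raw/orders/")
orders.createOrReplaceTempView("orders")

# A source SQL query expressed as Spark SQL.
daily_totals = spark.sql("""
    SELECT order_date,
           customer_id,
           SUM(order_amount) AS total_amount
    FROM orders
    WHERE order_status = 'COMPLETED'
    GROUP BY order_date, customer_id
""")

def validate_not_null(df, columns):
    """Fail fast if any of the given columns contain NULLs (basic data-quality check)."""
    for col in columns:
        null_count = df.filter(df[col].isNull()).count()
        if null_count > 0:
            raise ValueError(f"Data quality check failed: {null_count} NULLs in '{col}'")

validate_not_null(daily_totals, ["order_date", "customer_id", "total_amount"])

# Write the curated output, partitioned by date (placeholder path).
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://my-bucket/curated/daily_totals/"
)
```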