Data Platform Engineer (Python / Airflow / AWS / Snowflake)

Epergne Solutions | Pune

Job Description

Epergne Solutions is looking for a Data Platform Engineer (Python + Airflow Data Engineer).

Total Experience: 6+ years

Relevant Experience: 5+ years

Mandatory Skills: Python, Apache Airflow

Budget: As per market standards

Work Location: Pune, Bangalore

Mode: Work from Office

Position Type: Contract

Payroll: Epergne Solutions

Detailed Job Description (Roles and Responsibilities)

We are seeking a highly skilled Data Engineer with 3 to 6 years of experience and a strong background in AWS technologies. The ideal candidate will have a deep understanding of Apache Airflow and its integration within the AWS ecosystem, enabling efficient data pipeline orchestration and management.

Responsibilities:

  • Design, develop, and maintain complex data pipelines using Python for efficient data processing and orchestration.
  • Collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment.
  • Implement data integration and transformation processes to ensure optimal performance and reliability of data pipelines.
  • Optimize and fine-tune existing data pipelines / Airflow to improve efficiency, scalability, and maintainability.
  • Troubleshoot and resolve issues related to data pipelines, ensuring smooth operation and minimal downtime.
  • Work closely with AWS services like S3, Glue, EMR, Redshift, and other related technologies to design and optimize data infrastructure.
  • Develop and maintain documentation for data pipelines, processes, and system architecture.
  • Stay updated with the latest industry trends and best practices related to data engineering and AWS services.
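For illustration only (this sketch is not part of the job description): the pipeline work described above typically follows an extract-transform-load pattern, which Airflow tasks then orchestrate. A minimal, standard-library-only Python sketch with hypothetical field names (`city`, `amount`):

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Normalize values and drop incomplete rows (the 'transform' step)."""
    out = []
    for row in rows:
        if not row.get("amount"):
            continue  # skip rows missing the amount field
        out.append({
            "city": row["city"].strip().title(),  # normalize casing
            "amount": float(row["amount"]),       # cast to numeric type
        })
    return out


def load(rows: list[dict]) -> dict:
    """Aggregate totals per city, standing in for a warehouse load."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["city"]] = totals.get(row["city"], 0.0) + row["amount"]
    return totals


raw = "city,amount\npune,10\nPUNE,5\nbangalore,\n"
result = load(transform(extract(raw)))
print(result)  # {'Pune': 15.0}
```

In a production setting, each of these functions would typically become an Airflow task (for example, via `PythonOperator` or the TaskFlow API), with S3 or Glue handling the actual extract and Redshift or Snowflake the load.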

Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proficiency in Python and SQL for data processing and manipulation.
  • Minimum 5 years of experience in data engineering, specifically working with Apache Airflow and AWS technologies.
  • Strong knowledge of AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
  • Understanding of Snowflake is preferred.
  • Experience with optimizing and scaling data pipelines for performance and efficiency.
  • Good understanding of data modeling, ETL processes, and data warehousing concepts.
  • Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.
  • Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders.

Preferred Qualifications:

  • AWS certification(s) related to data engineering or big data.
  • Experience working with big data technologies like Snowflake, Spark, Hadoop, or related frameworks.
  • Familiarity with other data orchestration tools in addition to Apache Airflow.
  • Knowledge of version control tools such as Git and Bitbucket.

Preference will be given to candidates who can join within 30 days or on a shorter notice period.
