Full-time

Data Engineer - PySpark, Airflow, AWS

Posted by NucleusTeq • Indore, Madhya Pradesh, India

🕒 March 03, 2026

About the Role

Key Responsibilities:

● Develop Apache Airflow DAGs and PySpark ETL pipelines for high-volume data processing.

● Write optimized SQL queries for data transformation and aggregation.

● Build data products serving business processes, executive KPIs, and product analytics.

● Implement data quality and monitoring solutions.

● Optimize pipeline performance and troubleshoot production issues.

● Collaborate with cross-functional teams.

● Monitor production pipelines (KLO, i.e. keep-the-lights-on support).


Required skills:

● 10+ years of data engineering experience, including 7+ years dedicated to the big data stack.

● Expert in Python and PySpark (DataFrame API, Spark SQL).

● Advanced SQL skills (window functions, complex queries)…
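The window-function skill listed above can be sketched with a short example. The table and column names below are hypothetical, and Python's built-in sqlite3 module stands in for the warehouse engine a production pipeline would actually target:

```python
import sqlite3

# In-memory database standing in for a warehouse table (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, day TEXT, amount REAL);
INSERT INTO sales VALUES
  ('north', '2026-03-01', 100.0),
  ('north', '2026-03-02', 150.0),
  ('south', '2026-03-01', 200.0),
  ('south', '2026-03-02', 50.0);
""")

# Window function: running total of sales per region, ordered by day.
rows = conn.execute("""
SELECT region, day, amount,
       SUM(amount) OVER (
           PARTITION BY region ORDER BY day
       ) AS running_total
FROM sales
ORDER BY region, day
""").fetchall()

for row in rows:
    print(row)
# ('north', '2026-03-01', 100.0, 100.0)
# ('north', '2026-03-02', 150.0, 250.0)
# ...
```

The `PARTITION BY ... ORDER BY` clause computes a per-region cumulative sum without collapsing rows, which is the kind of transformation that a plain `GROUP BY` aggregation cannot express.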

Ready to Apply?

Submit your application today and take the next step in your career journey with NucleusTeq.

Apply Now