About the Role
Responsibilities
- Design, develop, and maintain data pipelines and ETL workflows using AWS services.
- Implement orchestration and automation for data workflows.
- Work with large datasets to ensure data integrity, scalability, and performance.
- Collaborate with stakeholders to understand data requirements and deliver solutions.
- Deploy changes directly to production environments with confidence and accountability.
- Support migration efforts to Databricks and optimize workflows for performance.
Qualifications
- Experience with data lake architectures, big data technologies, and data pipeline orchestration.
- Familiarity with CI/CD practices for data engineering.
- AWS Certification (e.g., AWS Certified Data Analytics - Specialty or Solutions Architect) is a plus.
- Strong problem‑solving skills and attention to detail.
- Expert‑level fluency in AWS services rele...
Ready to Apply?
Submit your application today and take the next step in your career journey with Businesslist.