About the Role
ID: 14686
**What you will do**
- Design and implement data pipelines using DBT for transformation and modeling;
- Manage and optimize data warehouse solutions on Snowflake;
- Develop and maintain ETL processes using Fivetran for data ingestion;
- Utilize Terraform for infrastructure as code (IaC) to provision and manage resources in AWS, Snowflake, Kubernetes, and Fivetran;
- Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions;
- Implement workflow automation using ArgoWorkflows to streamline data processing tasks;
- Ensure data quality and integrity throughout the data lifecycle.
**Must haves**
- Bachelor’s degree in Computer Science, Engineering, or a related field;
- 5+ years of experience working with Python;
- Proven experience as a Data Engineer with a focus on DBT, Snowflake, ArgoWorkflows, and Fivetran;
- Strong SQL skills for data manipulation and querying;
- Experience with cloud platfo...
Ready to Apply?
Submit your application today and take the next step in your career journey with AgileEngine.