About the Role
- Design and architect scalable data platforms using Databricks and Apache Spark.
- Lead end-to-end implementation of data lakes, lakehouses, and analytics solutions.
- Define data engineering, security, and governance best practices.
- Collaborate with data engineers, data scientists, and business stakeholders.
- Optimize performance, cost, and reliability of Databricks workloads.
What You'll Bring
- Strong hands-on experience with Databricks, Spark, and Delta Lake.
- Proficiency in Python/Scala and SQL for big data processing.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Solid understanding of data architecture, ETL/ELT, and data modeling.
- Excellent communication and solution-design skills.
Ready to Apply?
Submit your application today and take the next step in your career journey with Unison Group.