Job Title: Data Transformation Engineer
Requirements
Hands-on experience with:
* dbt for data modeling and transformation.
* Snowflake or similar cloud data warehouse platforms.
* Airflow for workflow orchestration.
* Bitbucket/GitHub for source control.
* SQL, including writing complex queries and performance tuning.
* Data modeling principles and best practices.
Responsibilities
* Design, develop, and maintain scalable data transformation workflows using dbt.
* Manage and query data in Snowflake to ensure accuracy, performance, and compliance.
* Orchestrate data pipelines and scheduling using Airflow.
* Implement version control and collaborative development practices with Bitbucket/GitHub.
* Write and optimise SQL queries for data extraction, transformation, and loading.
* Collaborate with data engineers and analysts to deliver high-quality data solutions.
* Monitor and troubleshoot data workflows to ensure reliability and efficiency.
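To illustrate the kind of orchestration glue this role involves, here is a minimal sketch of how a pipeline task might assemble a dbt invocation before handing it to a scheduler such as Airflow. The function name `build_dbt_command` and its parameters are illustrative, not part of any specific stack; the `dbt run` flags shown (`--target`, `--select`, `--full-refresh`) are standard dbt CLI options.

```python
from typing import List

def build_dbt_command(models: List[str],
                      target: str = "prod",
                      full_refresh: bool = False) -> List[str]:
    """Build a `dbt run` command line scoped to specific models.

    A hypothetical helper: an orchestrator task would pass the
    resulting list to a subprocess or a BashOperator-style wrapper.
    """
    cmd = ["dbt", "run", "--target", target]
    if models:
        # Restrict the run to the selected models.
        cmd += ["--select", " ".join(models)]
    if full_refresh:
        # Rebuild incremental models from scratch.
        cmd.append("--full-refresh")
    return cmd

# Example: run one staging model against a dev target.
print(build_dbt_command(["stg_orders"], target="dev"))
```

In practice the scheduler (Airflow in this stack) would wrap such a command in a task, so that scheduling, retries, and logging are handled by the orchestrator rather than ad-hoc scripts.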
Nice-to-Haves
* Experience with Data Vault modeling for enterprise data architecture.
* Familiarity with AWS or other cloud platforms.
* Proficiency in Python for scripting and automation.
* Knowledge of CI/CD practices for data workflows.
* Exposure to data governance tools and practices.