Data Platform Engineer (Azure Databricks)
Fully remote from Spain | €25–30/hour | 12-month contract (autónomo setup)

We are looking for a Data Platform Engineer to support and evolve multiple Unity Catalog–enabled Azure Databricks environments. This role is focused on platform engineering and operations, NOT on building data pipelines or ML models. You will work on improving, automating, and scaling the platform itself.

Key Responsibilities:

Platform Engineering & Operations
- Improve implementation patterns across jobs and platform components
- Identify and resolve scalability, reliability, and maintainability bottlenecks
- Standardise workspace configurations and environment structures
- Monitor platform health and performance

Unity Catalog Management
- Maintain consistent and secure Unity Catalog configurations
- Ensure governance structures (catalogs, schemas, grants, service principals) are efficient and developer-friendly

Performance & Cost Optimisation
- Optimise compute configurations and autoscaling behaviour
- Analyse Spark jobs, Delta Lake usage, and storage layout for performance and cost efficiency

Automation & Infrastructure-as-Code
- Automate environments using Databricks Asset Bundles (DABs) and Terraform
- Ensure reproducible, versioned environments aligned with CI/CD principles
- Minimise manual configuration

Documentation & Guidance
- Document platform patterns and operational runbooks
- Support engineers and analysts working with Databricks

Required Experience:
- 3–5 years in platform (data) engineering
- Strong hands-on experience with Azure Databricks
- Strong knowledge of Unity Catalog (access models, governance, service principals, workspace configs)
- Solid PySpark and SQL skills
- Understanding of performance optimisation (partitioning, Z-ordering, caching, data layout)
- Experience with CI/CD, Git workflows, and Infrastructure-as-Code

Bonus: MLOps experience

If you're based in Spain and available for a 12-month autónomo contract, feel free to apply or message me directly.