
Senior Data Engineer (Azure, Databricks, Airflow)

Capitole
Published 12 hours ago
Description

About the role

We are looking for a Senior Data Engineer to join a modern, cloud-based data platform team within a leading international tech company.

In this role, you will design, build, and evolve scalable data pipelines and data products within a Lakehouse architecture on Azure, leveraging tools such as Databricks, Data Factory, and Airflow.

You will work at the intersection of data engineering and platform evolution, contributing not only to building data products but also to improving the underlying platform — including metadata-driven frameworks, data quality, observability, and governance.

This is a highly hands-on role with strong ownership, where you will collaborate with cross-functional teams and play a key part in shaping how data is consumed across the business.

If you enjoy working in modern data environments, solving complex data challenges, and building reliable and scalable solutions — this could be a great fit.

What you'll do

Design and build scalable, reliable, and reusable data pipelines using Azure Data Factory and Apache Airflow

Develop data transformations in Azure Databricks (PySpark / SQL) following Medallion Architecture (Bronze, Silver, Gold layers)

Optimize performance, cost, and reliability of data workloads

Contribute to the evolution of the data platform (metadata-driven orchestration, observability, data quality)

Support the migration of legacy data solutions to modern Lakehouse architecture

Implement and improve data quality frameworks (e.g. Soda, Great Expectations)

Ensure pipelines are observable, testable, and production-ready

Collaborate with Run/Operations teams to troubleshoot incidents and ensure platform stability

Participate in incident management, root cause analysis, and continuous improvement initiatives

Contribute to data cataloging, lineage, and governance (e.g. Unity Catalog)
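To make the "metadata-driven orchestration" and Medallion (Bronze/Silver/Gold) ideas above concrete, here is a minimal sketch in plain Python — not PySpark or Airflow, and every name (`TableConfig`, the layer functions, the sample config) is illustrative, not part of any real framework. The point is only the pattern: a per-table metadata object drives which transformation steps run at each layer.

```python
# Illustrative metadata-driven pipeline: a per-table config object
# decides which cleaning steps run, mirroring Bronze -> Silver -> Gold.
from dataclasses import dataclass, field


@dataclass
class TableConfig:
    name: str
    source: str
    # Cleaning steps are declared as data, not hard-coded in the pipeline.
    transformations: list = field(default_factory=list)


def bronze_ingest(rows, cfg):
    # Bronze: land raw records as-is, tagged with their source system.
    return [dict(r, _source=cfg.source) for r in rows]


def silver_clean(rows, cfg):
    # Silver: apply the per-table cleaning steps declared in the metadata.
    for fn in cfg.transformations:
        rows = [fn(r) for r in rows]
    return rows


def gold_aggregate(rows, cfg):
    # Gold: a business-level rollup (here, a record count per source).
    counts = {}
    for r in rows:
        counts[r["_source"]] = counts.get(r["_source"], 0) + 1
    return counts


def run_pipeline(rows, cfg):
    return gold_aggregate(silver_clean(bronze_ingest(rows, cfg), cfg), cfg)


cfg = TableConfig(
    name="orders",
    source="crm",
    # One declared cleaning step: drop keys whose value is None.
    transformations=[lambda r: {k: v for k, v in r.items() if v is not None}],
)
result = run_pipeline([{"id": 1, "bad": None}, {"id": 2, "bad": None}], cfg)
print(result)  # {'crm': 2}
```

In a real Lakehouse the same shape appears with Delta tables instead of lists and Databricks/Airflow tasks instead of function calls; the metadata object is what lets one generic pipeline serve many tables.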

Must Have

4+ years of experience in Data Engineering / Analytics Engineering / BI

Strong expertise in: Python (PySpark); SQL (advanced level)

Hands-on experience with: Azure (Data Factory, cloud data platforms), Databricks & Delta Lake, Apache Airflow

Solid understanding of: Data Lake / Data Warehouse / Lakehouse architectures, Medallion architecture, ETL / ELT design patterns, Metadata-driven approaches

Experience with performance optimization, partitioning, and scalable data pipelines

Understanding of batch and streaming pipelines

Familiarity with DevOps / DataOps practices

Strong problem-solving skills and ownership mindset

Ability to work with both technical and business stakeholders

Fluent English

Nice to Have

Experience with Infrastructure as Code (Terraform, ARM, Bicep)

Experience with CI/CD pipelines (Azure DevOps, GitHub)

Exposure to Data Quality tools (Great Expectations, Soda)

Experience with Data Catalogs (Unity Catalog, DataHub, Atlan, etc.)

Experience with monitoring/logging tools (Grafana, Azure Log Analytics)

Background in AdTech or digital environments

Experience with BI tools (Power BI, Tableau)

Location: Barcelona – Hybrid (2 days onsite)

Why join this project?

People first – diverse and inclusive culture in an international environment.

Modern cloud platforms and large-scale, global projects.

High team stability and collaborative culture.

€1200 per year training budget and continuous learning opportunities.

Flexible compensation model.

Private health insurance and benefits package.

Flexible working hours and hybrid model.

Wellhub: fitness, wellness, and mental health support.

Football and paddle tennis teams sponsored by Capitole.

Team buildings, global events, and strong tech communities.

Want to know more about us? Click here and discover all the details.

Curious about our culture? Check out what people are saying about us on Glassdoor.

We know that not every candidate will meet 100% of the requirements. If your profile doesn't match perfectly but you believe you can add value, we'd still love to hear from you!

Ready for the challenge? Apply now and be part of a global team driving cloud innovation and security.

Empowering People, Unlocking Innovation.

Information Security Notice

* The employee will have access to confidential information related to Capitole and the assigned project.
* Compliance with internal security and information protection policies is mandatory.
* NDA signature required.
