Data Architect (Azure, Databricks, Data Products)

Lugo (27001)
Capitole
Published on 2 April
Description

Data Architect (Azure, Databricks, Data Platform & Products)
Please read all the information about this opportunity carefully, then use the application button below to submit your CV and apply.
About the role
We are looking for a Data Architect to join a modern, cloud-based data platform team within an international environment.
In this role, you will contribute to the design, evolution, and implementation of a scalable data platform on Azure, while also working on data products aligned with business domains.
You will operate at the intersection of data engineering and architecture, translating business needs into robust, scalable, and well-governed data solutions. Depending on your experience, you will either drive architectural decisions end-to-end or contribute to the design and implementation of data products within a broader architecture.
You will collaborate with cross-functional teams (Data, Product, Business) and play a key role in shaping how data is structured, governed, and consumed across the organization.
If you enjoy working with modern data platforms, designing data models, and building impactful data solutions, this could be a great fit.
What you'll do
Design and evolve data architecture and data models within a Lakehouse environment (Azure + Databricks)
Translate business requirements into scalable data products and domain-oriented models
Work with Medallion Architecture (Bronze, Silver, Gold) and modern data modeling patterns (Kimball, SCDs, hierarchical data)
Contribute to the design and implementation of data pipelines (batch & streaming) using tools like Data Factory and Airflow
Ensure performance, scalability, and cost optimization of data workloads
Support and/or lead the evolution of the data platform (Delta Lake, Unity Catalog, governance frameworks)
Define and implement data governance standards, including data quality, metadata, lineage, and access control
Collaborate with stakeholders to define data contracts, SLAs, and data quality expectations
Contribute to data cataloging and metadata management (DataHub, Atlan, OpenMetadata, etc.)
Work closely with engineering, product, and business teams to enable self-service analytics
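To illustrate one of the modeling patterns named above, here is a minimal sketch of a Type 2 slowly changing dimension (SCD2) upsert in plain Python. In a Lakehouse setting this bookkeeping would typically be done with a Delta Lake MERGE; the table, column, and function names here are hypothetical, not taken from this posting.

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key, tracked, today):
    """Close out changed dimension rows and append new current versions.

    dim_rows : list of dicts with key, tracked attrs, valid_from,
               and valid_to (None means the row is current)
    incoming : list of dicts with key and tracked attrs (latest snapshot)
    """
    # Index the currently-valid version of each entity by its key.
    current = {r[key]: r for r in dim_rows if r["valid_to"] is None}
    for row in incoming:
        old = current.get(row[key])
        if old is not None and all(old[c] == row[c] for c in tracked):
            continue  # no attribute change: keep the existing current row
        if old is not None:
            old["valid_to"] = today  # expire the superseded version
        new_row = dict(row)
        new_row["valid_from"] = today
        new_row["valid_to"] = None  # new current version
        dim_rows.append(new_row)
    return dim_rows

# Example: a customer moves city, producing a second versioned row.
dim = [{"customer_id": 1, "city": "Lugo",
        "valid_from": date(2023, 1, 1), "valid_to": None}]
dim = scd2_upsert(dim, [{"customer_id": 1, "city": "Barcelona"}],
                  "customer_id", ["city"], date(2024, 6, 1))
```

After the upsert, the original row is closed out with a `valid_to` date and a new current row carries the updated attribute, preserving full history for downstream consumers.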
Must Have
4+ years of experience in Data Engineering / Analytics Engineering / Data Architecture
Strong expertise in SQL and solid understanding of data modeling principles
Hands-on experience with: Azure (Data Factory, Storage, cloud data services), Databricks (Spark, Delta Lake), Apache Airflow
Strong understanding of: Lakehouse architecture and Medallion design, Data modeling patterns (dimensional modeling, SCDs, domain-driven approaches), ETL / ELT pipelines and data transformation
Experience working with cloud-based data platforms
Ability to think in terms of data products and data domains, not only datasets
Strong problem-solving skills and ownership mindset
Ability to collaborate with both technical and business stakeholders
Fluent English
Nice to Have
Experience with Unity Catalog, data governance, and access control (RBAC, IAM)
Exposure to data quality frameworks (Great Expectations, Soda)
Experience with data catalog tools (DataHub, Atlan, OpenMetadata, Alation)
Knowledge of Python / PySpark
Experience with Salesforce, SAP, or other operational systems
Background in AdTech or digital environments
Experience with BI tools (Power BI, Tableau, Looker)
Understanding of metadata-driven architectures and self-service analytics
Barcelona – Hybrid (flexible model)
Why join this project?
People first – diverse and inclusive culture in an international environment.
Modern cloud platforms and large-scale, global projects.
High team stability and collaborative culture.
€1200 per year training budget and continuous learning opportunities.
Flexible compensation model.
Private health insurance and benefits package.
Flexible working hours and hybrid model.
Wellhub: fitness, wellness, and mental health support.
Football and paddle tennis teams sponsored by Capitole.
Team buildings, global events, and strong tech communities.
Want to know more about us? Click here and discover all the details.
Curious about our culture? Check out what people are saying about us on Glassdoor.
We know that not every candidate will meet 100% of the requirements. If your profile doesn't match perfectly but you believe you can add value, we'd still love to hear from you!
Ready for the challenge? Apply now and be part of a global team driving cloud innovation and security.
Empowering People, Unlocking Innovation.
Information Security Notice
- The employee will have access to confidential information related to Capitole and the assigned project.
- Compliance with internal security and information protection policies is mandatory.
