DATA ENGINEER – CASER
Position Overview
Caser Grupo Helvetia is seeking an Azure Databricks DevOps Engineer to join its IT Asset Management team at the Helvetia Service Hub in Madrid. The role involves designing, developing, and optimizing data pipelines and workflows, utilizing technologies such as Azure Databricks, Airflow, Unity Catalog, Data Factory, and Git. The successful candidate will work closely with data engineers, architects, and stakeholders to deliver high-quality data solutions supporting analytics and operational processes.
Responsibilities
1. Design, develop, and optimize ETL/ELT data pipelines with Azure Databricks and Apache Spark.
2. Manage Unity Catalog for metadata, access control, and governance.
3. Develop and orchestrate workflows using Apache Airflow and Azure Data Factory.
4. Collaborate with teams to define data models, standards, and best practices.
5. Monitor and troubleshoot data pipelines for efficiency and error handling.
6. Write clean, modular PySpark/Scala code following best practices.
7. Use Git for version control, CI/CD, and code quality assurance.
8. Engage with stakeholders to align solutions with business needs.
9. Optimize performance for Spark transformations and distributed computing.
10. Stay updated with advancements in Azure and big data technologies.
Minimum Requirements
1. Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
2. Experience managing data pipelines in Azure Databricks.
3. Proficiency in Apache Spark, PySpark, or Scala for large-scale data processing.
4. Knowledge of Azure Data Services, including Data Factory and Unity Catalog.
5. Experience with Apache Airflow for workflow orchestration.
6. Understanding of data modeling and ETL best practices.
7. Experience with Git and CI/CD practices.
8. Problem-solving skills for debugging and optimizing workflows.
9. Knowledge of data governance, security, and compliance in cloud environments.
10. Excellent communication and teamwork skills.
Desirable Skills
1. Azure certifications or related cloud certifications.
2. Scala programming experience.
3. Familiarity with dbt and streaming technologies like Kafka or Spark Streaming.
4. Knowledge of financial instruments and asset management data.
Languages
English – Advanced (required for collaboration with the Basel team).