People First: trust, respect, and professional development.
- Cutting-edge projects with global clients and modern technologies.
- Tailored training programs: up to €1,200/year per employee.
- Private health insurance, flexible compensation, and Wellhub for your overall wellbeing.
- Active tech communities to learn and share knowledge.
- Amazing team events (yes, we know how to have fun too 😉).
We are seeking a skilled Data Engineer to work with an international German client in the automotive sector. The ideal candidate will have a strong background in designing and implementing robust ETL pipelines, with a focus on Azure cloud technologies and big data processing frameworks.
Design, develop, and maintain ETL pipelines using Azure Data Factory to process data from multiple sources into Azure Data Lake and other destinations.
Leverage Databricks and Spark environments for data transformation, processing, and analytics.
Collaborate with cross-functional teams to translate business requirements into data solutions.
Maintain data catalogs and ensure accurate documentation within Databricks.
Optimize and troubleshoot existing ETL workflows to improve performance and reliability.
Stay current with Azure data engineering best practices.
Bachelor’s degree in Computer Science, Engineering, or related field.
Proven experience with Azure Data Factory, Databricks, Azure Data Lake, SQL, Python, Spark, and PySpark.
Strong proficiency in SQL and Python for data manipulation and querying.
Certification in Azure Data Engineering or related field.
Knowledge of data governance principles and best practices.
Experience with DevOps CI/CD pipelines, especially GitHub workflows.
Location: 100% Remote
Schedule: Flexible, with reduced hours on Fridays
Language: English (C1)
The employee will adhere to information security policies:
- Will have access to confidential information related to Capitole and the project they are working on.