Capitole keeps growing and we want to do it with you!

We are seeking a Data Engineer - Azure ETL Specialist (only residents in Spain): a skilled engineer proficient in building robust ETL (Extract, Transform, Load) pipelines using Azure Data Factory, Databricks, SQL, and Azure Data Lake. The ideal candidate will have a strong background in data engineering with a focus on Azure technologies, particularly Azure Data Factory and Databricks.

Key Responsibilities:
• Design, develop, and maintain ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into Azure Data Lake and other target destinations.
• Use Databricks for advanced data transformations, processing, and analytics, ensuring optimal performance and scalability.
• Collaborate with cross-functional teams to understand data requirements and implement solutions that meet business needs.
• Develop and maintain data catalogs within Databricks, ensuring accurate documentation and metadata management.
• Optimize and troubleshoot existing ETL processes to improve efficiency, reliability, and performance.
• Stay up to date with the latest Azure technologies and data engineering best practices.
• Actively participate in code reviews, documentation, and knowledge-sharing sessions.

Requirements:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience building ETL pipelines with Azure Data Factory, Databricks, SQL, and Azure Data Lake.
• Strong proficiency in SQL for data manipulation and querying.
• Proficiency in a programming language such as Python, Java, or Scala.
• Strong knowledge of Apache Spark.
• Excellent problem-solving skills and attention to detail.
• Ability to work effectively both independently and as part of a team in a fast-paced environment.
• Strong communication and interpersonal skills.
Preferred Qualifications:
• Certification in Azure Data Engineering or a related field.
• Knowledge of data governance principles and best practices.
• Familiarity with DevOps practices for CI/CD pipelines using GitHub workflows.
• Familiarity with the SPIRE working model is highly desirable.
• Experience in the finance sector or similar regulated industries is a plus.

We're great, but with you we'll be even better.

For this you will have:
- A budget of 1.200€ for individual training to spend on whatever you want (technological events, books, training, certifications, etc.).
- Monthly follow-up with your team for continuous feedback.
- Teleworking.
- Flexible working hours to help you balance your professional and family life.
- Private medical insurance paid in full by Capitole.
- Flexible remuneration (restaurant tickets, transport and/or childcare).
- Andjoy (Gymforless).
- Discounts on major brands for employees (Club Capitole).

Get to know the whole family:
- Team building every two months - don't miss the summer party or the Christmas dinner!
- Football team sponsored by Capitole.
- Technological communities where you can share your knowledge and ideas with the other teams - sharing internal knowledge is essential!
- Last but not least, a TEAM! Don't you know us yet? Discover us!

See what people are saying about us.

Don't hesitate to send us your profile - we are looking forward to meeting you!

The employee will adhere to the information security policies:
- You will have access to confidential information relating to Capitole and the project on which you are working.
- You will have to comply with the security and internal policies of the company and the client.
- You will have to sign an NDA.