HAYS is a British multinational company that offers recruitment and human resources services in 33 countries worldwide and is listed on the London Stock Exchange. We are currently looking for two Cloud Data Engineers to collaborate remotely with international clients headquartered in Málaga and Barcelona, Spain.
What are the requirements for these two positions?
Azure-focused position:
* 4+ years of experience in data engineering.
* Experience with Azure.
* Experience with PySpark and Databricks in an Azure environment.
* Python programming experience.
* Strong verbal and written skills in English.
Nice to have:
* Experience working with Snowflake.
* Preferably an Azure/Databricks-certified engineer.
GCP-focused position:
* 4+ years of experience with Google Cloud Platform (GCP).
* Python and Java programming experience, including design, programming, and unit testing (pytest, JUnit).
* Experience using Terraform.
* Experience using version control (Git), Kafka or Pub/Sub, and Docker.
* Experience using Apache Beam with Dataflow.
* Knowledge of clean code principles.
* Strong verbal and written skills in English.
Nice to have:
* Experience working with Snowflake.
* Preferably a GCP-certified engineer.
What are the main tasks?
GCP-focused position:
* As part of the team, you will be responsible for maintaining and extending the customer data platform, which is built on various GCP services (e.g. Dataflow, Cloud Functions).
* Develop and maintain scalable data pipelines using GCP.
* Platform development is based on Python and Terraform.
* Furthermore, you will work with SQL-related technologies such as Google BigQuery or Snowflake, as well as dimensional data models, to support advanced analytics and reporting capabilities.
* Design, programming, unit testing (pytest), quality assurance, and documentation.
* Work closely in an agile team with members located in other countries (the team language is English).
Azure-focused position:
* Develop and maintain scalable data pipelines using Microsoft Azure services, leveraging PySpark, Python, and Databricks.
* Build and deploy CI/CD pipelines in Azure DevOps, automating data workflows and ensuring seamless integration and delivery of data solutions.
* Design and implement data models using industry-standard tools such as ER/Studio, ERwin, Oracle Data Modeling, or Toad Data Modeling.
* Work closely in an agile team with members located in other countries (the team language is English).
What can we offer you?
* Fully remote work; however, you must be based in Spain.
* Integration into an international, long-term project at a solid company with good future prospects.
We are looking for profiles like yours: passionate about technology and eager to take on a new challenge. If that sounds like you, apply to the offer with your CV so we can tell you more!
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting