Position: Data Engineer (GCP)
Location: Spain (Remote)
Contract Type: Permanent or Freelance
Description
We are seeking an experienced Cloud Data Engineer with strong expertise in designing, developing, and deploying cloud‑based data solutions on Google Cloud Platform (GCP). The ideal candidate will have deep technical skills in data pipelines, BigQuery, SQL, Python, Airflow, and enterprise cloud architectures.
Key Responsibilities
* Design, develop, and deploy high‑performance data pipelines for streaming and batch data sources.
* Build and maintain data lakes and data warehouses on GCP with focus on quality, security, scalability, and reliability.
* Design enterprise cloud solutions and support customer project implementations.
* Work with BigQuery, SQL optimization, data ingestion, data tokenization, and Airflow orchestration.
* Write high‑quality Python code, applying sound data structures, algorithms, and large‑scale system design principles.
* Automate infrastructure provisioning using DevOps and CI/CD tools.
* Implement production‑grade cloud and big‑data solutions using managed services.
* Collaborate with data scientists, analysts, and software engineers to meet the organization's data needs.
* Validate and optimize SQL queries for performance and cost efficiency.
* Optional: Develop Looker dashboards and work on real‑time streaming architectures with strong event‑delivery semantics.
Required Skills (Must-Have)
(3–5 years each)
* Python – Mandatory
* BigQuery – Mandatory
* SQL – Mandatory
* Airflow
* GCP – Google Cloud Platform
* Data Lakes
* Data Warehouses
* Cloud Data Engineering
Additional Requirements
* Google Cloud Professional Data Engineer Certification
* Fluent communication in English and Spanish
Nice to Have
* Experience in Data Engineering beyond the required stack
* Additional cloud engineering experience with Google Cloud services
* Experience with Looker
* Experience with event‑stream processing and semantics
* Experience in technical consulting