What you’ll do
We are seeking a skilled and passionate Data Modeler / Data Engineer / ETL Developer to join our data engineering team and design, build, and maintain data warehouse solutions on Google Cloud Platform. As a key contributor, you will design and implement scalable data models, develop robust data pipelines, and ensure seamless data integration and data quality across platforms. Your work will directly support business insights, product innovation, and strategic decisions by making high-quality, well-structured data easily accessible.
- Analyze business requirements and translate them into effective data models and technical designs.
- Design and maintain conceptual, logical, and physical data models for data warehouses (DWHs).
- Build and manage scalable, reliable, and secure data pipelines (ETL/ELT) in hybrid and cloud environments.
- Ensure data consistency, integrity, and quality across data platforms.
- Collaborate closely with data architects, analysts, and software developers to deliver end-to-end solutions.
- Continuously optimize data flows for performance and cost-efficiency.
Success in this role will be measured by:
- Accuracy and scalability of data models.
- Pipeline stability and performance metrics (e.g., run time, error rate).
- Reduction in data latency and processing costs.
- Positive feedback from cross-functional collaborators.
- Contributions to documentation and knowledge-sharing within the team.
Who you are
Core competencies, knowledge and experience:
Technical Requirements
To perform at a top-tier level, the candidate should have strong expertise in:
Data Modeling and Architecture
- Solid experience in data modeling techniques (3NF, star/snowflake schemas, data vault) and data architecture best practices.
- Proficiency in SQL.
- Experience in metadata management, data lineage, and governance practices.
Cloud and DevOps
- Experience with cloud data platforms, preferably Google Cloud Platform (GCP); knowledge of BigQuery, Teradata, Oracle.
- Familiarity with CI/CD pipelines using tools such as Jenkins, GitHub Actions, or GitLab CI.
Tools & Technologies
- Hands-on with data preparation and profiling tools (e.g., Talend, Trifacta, Collibra).
- Experience working in Agile/Scrum environments.
Other Competencies
- Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field.
- 3+ years of experience in data warehouse implementation or a related field.
- Professional certification in cloud platforms (e.g., GCP Data Engineer, AWS Big Data, Azure Data Engineer) is a plus.
- Fluent in spoken and written English.
What We Offer
- Bonus on top of the gross salary.
- Flexible working hours from Monday to Thursday, and an intensive schedule on Fridays.
- Intensive Summer Schedule during July and August.
- Up to 20 days per year of 100% remote work from other locations.
- Private Health and Life Insurance for employees.
- 25 vacation days, plus December 24th and 31st off.
- Access to an online learning platform for continuous training.