What you’ll do
We are seeking a skilled and passionate Data Modeler / Data Engineer / ETL Developer to join our data engineering team, where you will design, build, and maintain data warehouse solutions on Google Cloud Platform. As a key contributor, you will design and implement scalable data models, develop robust data pipelines, and ensure seamless data integration and data quality across platforms. Your work will directly support business insights, product innovation, and strategic decisions by enabling high-quality, well-structured, and easily accessible data.
Analyze business requirements and translate them into effective data models and technical designs.
Design and maintain conceptual, logical, and physical data models for data warehouses (DWHs).
Build and manage scalable, reliable, and secure data pipelines (ETL/ELT) in hybrid and cloud environments (see the ELT sketch after this list).
Ensure data consistency, integrity, and quality across data platforms.
Collaborate closely with data architects, analysts, and software developers to deliver end-to-end solutions.
Continuously optimize data flows for performance and cost-efficiency.
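To make the ELT expectation concrete, here is a minimal upsert sketch in BigQuery Standard SQL. All dataset, table, and column names (staging.orders_raw, dwh.orders, and their columns) are hypothetical placeholders introduced for illustration, not systems referenced in this posting.

```sql
-- Minimal ELT upsert sketch in BigQuery Standard SQL.
-- All dataset, table, and column names are illustrative placeholders.
MERGE dwh.orders AS target
USING (
  -- Deduplicate staging data: keep only the latest record per order.
  SELECT * EXCEPT (rn)
  FROM (
    SELECT
      order_id,
      customer_id,
      order_total,
      updated_at,
      ROW_NUMBER() OVER (
        PARTITION BY order_id ORDER BY updated_at DESC
      ) AS rn
    FROM staging.orders_raw
  )
  WHERE rn = 1
) AS source
ON target.order_id = source.order_id
WHEN MATCHED AND source.updated_at > target.updated_at THEN
  UPDATE SET
    customer_id = source.customer_id,
    order_total = source.order_total,
    updated_at  = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (order_id, customer_id, order_total, updated_at)
  VALUES (source.order_id, source.customer_id,
          source.order_total, source.updated_at);
```

Because only newer staging rows overwrite the target, re-running the statement after a failed load is safe, which keeps the pipeline idempotent.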
Success in this role will be measured by:
Accuracy and scalability of data models.
Pipeline stability and performance metrics (e.g., run time, error rate).
Reduction in data latency and processing costs.
Positive feedback from cross-functional collaborators.
Contributions to documentation and knowledge-sharing within the team.
Who you are
Core competencies, knowledge and experience:
Technical Requirements
To perform at a top-tier level in this role, you should have strong expertise in the following areas:
Data Modeling and Architecture
Solid experience in data modeling techniques (3NF, star/snowflake schemas, data vault) and data architecture best practices (a star-schema sketch follows this list).
Proficiency in SQL.
Experience in metadata management, data lineage, and governance practices.
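As a hedged illustration of the star-schema technique named above (all table and column names are hypothetical, not an actual schema from this role), the sketch below shows a central fact table carrying surrogate keys into two dimension tables:

```sql
-- Star schema sketch: one fact table joined to dimensions via
-- surrogate keys. All names are illustrative placeholders.
CREATE TABLE dwh.dim_customer (
  customer_key  INT64 NOT NULL,  -- surrogate key
  customer_id   STRING,          -- natural/business key
  customer_name STRING,
  region        STRING
);

CREATE TABLE dwh.dim_date (
  date_key  INT64 NOT NULL,      -- e.g. 20250131
  full_date DATE,
  month     INT64,
  year      INT64
);

CREATE TABLE dwh.fact_sales (
  customer_key INT64 NOT NULL,   -- references dwh.dim_customer
  date_key     INT64 NOT NULL,   -- references dwh.dim_date
  quantity     INT64,
  revenue      NUMERIC
);
```

A snowflake schema would normalize the dimensions further (e.g., moving region into its own table), while a data vault model would instead use hubs, links, and satellites.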
Cloud and DevOps
Experience with cloud data platforms, preferably Google Cloud Platform (GCP); knowledge of BigQuery, Teradata, and Oracle (see the BigQuery sketch below).
Familiarity with CI/CD pipelines using tools such as Jenkins, Gi
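As one illustration of the GCP and cost-efficiency themes above, the sketch below (hypothetical analytics.events dataset and columns, not from this posting) shows a BigQuery table partitioned by date and clustered by a frequently filtered column; queries that filter on the partition column scan fewer bytes and therefore cost less:

```sql
-- Partitioned and clustered table sketch (BigQuery).
-- analytics.events and its columns are illustrative placeholders.
CREATE TABLE analytics.events (
  event_date DATE,
  user_id    STRING,
  event_name STRING
)
PARTITION BY event_date
CLUSTER BY user_id;

-- Filtering on the partition column prunes partitions,
-- reducing bytes scanned and therefore query cost.
SELECT event_name, COUNT(*) AS event_count
FROM analytics.events
WHERE event_date BETWEEN '2025-01-01' AND '2025-01-31'
GROUP BY event_name;
```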