Agap2 is looking for a Data Engineer to join our team and contribute to innovative projects within the aeronautics industry. You will work with cutting-edge cloud technologies such as AWS, Airflow, dbt, Snowflake, ClickHouse, and S3 to design, build, and operate scalable data solutions that enable advanced analytics and decision-making.
Design and manage both batch and streaming pipelines, using Airflow for orchestration, dbt for data transformations, and AWS S3 for storage.
Develop and fine-tune data models and ELT processes in Snowflake and/or ClickHouse, writing efficient SQL and Python code.
Apply testing, validation, and documentation to guarantee data quality, while contributing to governance and security standards.
Track and troubleshoot pipeline performance, improving efficiency and controlling costs.
Participate in Agile workflows (sprints, JIRA) and follow coding best practices with GitHub for reviews, version control, and CI/CD.
Partner with analysts, data scientists, and business teams to deliver trusted and valuable datasets.
More than 2 years of professional experience in data engineering or a related role.
Strong skills in SQL and Python, with proven experience using Airflow and dbt.
Knowledge of AWS (S3, IAM basics) and experience with at least one warehouse technology (Snowflake or ClickHouse).
Familiarity with testing (dbt tests, pytest), CI/CD tooling (GitHub Actions), and observability practices.
Expertise in performance tuning for Snowflake/ClickHouse and cost optimization in AWS.
Awareness of data governance and security practices.
The chance to take part in challenging projects within a dynamic environment.
A permanent contract from day one.
A hybrid work model combining remote and on-site days.