Analytics Engineer – Data Modelling & Transformation
Please read the following job description carefully to make sure the role fits your profile before submitting your application.
About the role
We're looking for an Analytics Engineer to join a high-impact unit driving advanced analytics and data transformation initiatives across our client's organization.
This role sits at the intersection of data engineering, modelling, and analytics, with a strong focus on transforming heterogeneous raw data into scalable, structured, and analytics-ready data models.
You'll work with modern data platforms such as ClickHouse, Snowflake, and Microsoft Fabric (Lakehouses, Warehouses, Notebooks), contributing to the development of robust data foundations that power dashboards, analytical applications, and AI-driven insights.
If you're passionate about data modelling, SQL performance, and building clean, scalable data layers, this role is for you.
What you'll do
Explore and analyze heterogeneous raw data sources to identify structure, quality issues, and modelling opportunities.
Design and build scalable dimensional data models optimized for analytical consumption (see the sketch after this list).
Develop efficient data transformations using T-SQL and Python.
Standardize and structure data for dashboards and analytical applications.
Perform ad hoc analysis to support business decision-making.
Work with modern platforms including ClickHouse, Snowflake, and Microsoft Fabric (Lakehouses, Warehouses, Notebooks).
Use tools such as DBeaver, SQL Server Management Studio, and VSCode.
Leverage Copilot and GenAI tools to enhance development productivity.
Document data models, transformation logic, and architecture decisions clearly and thoroughly.
Collaborate closely with data, AI, and business stakeholders within the Skynet team.
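Purely as an illustration of the modelling work described above, here is a minimal sketch in Python with pandas; every table and column name below is invented for the example, and real pipelines in this role would run in T-SQL or on the platforms listed above. It reshapes a denormalized raw export into a small star schema: one surrogate-keyed dimension per entity, plus a fact table holding the measures.

```python
import pandas as pd

# Hypothetical raw export (all names invented for illustration):
# one wide, denormalized row per order line.
raw = pd.DataFrame({
    "order_id":   [1001, 1002, 1003],
    "order_date": ["2024-01-05", "2024-01-06", "2024-01-06"],
    "customer":   ["ACME", "ACME", "Globex"],
    "product":    ["Widget", "Gadget", "Widget"],
    "amount_eur": [120.0, 80.5, 99.9],
})

# Dimensions: one row per distinct entity, keyed by a surrogate integer.
dim_customer = (raw[["customer"]].drop_duplicates().reset_index(drop=True)
                .rename_axis("customer_key").reset_index())
dim_product = (raw[["product"]].drop_duplicates().reset_index(drop=True)
               .rename_axis("product_key").reset_index())

# Fact table: surrogate keys plus the measures, ready for analytical queries.
fact_sales = (raw.merge(dim_customer, on="customer")
                 .merge(dim_product, on="product")
                 [["order_id", "order_date", "customer_key",
                   "product_key", "amount_eur"]])

print(fact_sales)
```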
Must Have
4+ years of experience in T-SQL and data modelling.
Strong experience in data transformation and dimensional modelling (star schemas, performance optimization).
Solid experience using Python for data transformation, analysis, and API integration (illustrated in the sketch after this list).
Ability to work with heterogeneous data sources and design scalable data layers.
Degree in Engineering, Mathematics, Statistics, or a related technical field.
Familiarity with GenAI tools to support coding and development workflows.
Strong analytical mindset and attention to data quality and structure.
Clear documentation and communication skills.
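To illustrate the Python and API-integration point above, here is a minimal sketch: the endpoint URL, the "results" envelope, and the field names are all assumptions for the example, not a real API. It pulls JSON records, flattens them, and applies light typing before they would land in a staging layer.

```python
import pandas as pd
import requests

# Hypothetical endpoint; the URL and field names are assumptions.
API_URL = "https://api.example.com/v1/orders"

def fetch_orders(url: str) -> pd.DataFrame:
    """Pull JSON records from an API and flatten them into a table."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # json_normalize flattens nested JSON into columns for modelling.
    return pd.json_normalize(response.json()["results"])

if __name__ == "__main__":
    orders = fetch_orders(API_URL)
    # Light typing/cleanup before loading into a staging layer.
    orders["order_date"] = pd.to_datetime(orders["order_date"], errors="coerce")
    print(orders.dtypes)
```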
Nice to Have
Experience with ClickHouse, Snowflake, or Microsoft Fabric.
Knowledge of Power BI and analytical reporting use cases.
Experience in modern data architectures (Lakehouse, hybrid warehouse models).
Exposure to AI/ML data preparation workflows.
Experience working in data-driven or AI-focused teams.
Why join this project?
People first – diverse and inclusive culture in an international environment.
Modern cloud platforms and large-scale, global projects.
High team stability and collaborative culture.
€1200 annual training budget and continuous learning opportunities.
Flexible compensation model.
Private health insurance and benefits package.
Flexible working hours and hybrid model.
Wellhub: fitness, wellness, and mental health support.
Football and paddle tennis teams sponsored by Capitole.
Team buildings, global events, and strong tech communities.
Curious about our culture? Check out what people are saying about us on Glassdoor.
We know that not every candidate will meet 100% of the requirements. If your profile doesn't match perfectly but you believe you can add value, we'd still love to hear from you!
Ready for the challenge? Apply now and be part of a global team driving data and AI innovation.
Empowering People, Unlocking Innovation.
Information Security Notice
* The employee will have access to confidential information related to Capitole and the assigned project.
* Compliance with internal security and information protection policies is mandatory.
* NDA signature required.