(Urgent Search) Data Engineer, Toledo
Client:
Altia
Location:
Toledo
Job Category:
Other
EU work permit required: Yes
Job Views:
2
Posted:
27.04.2025
Expiry Date:
11.06.2025
Job Description:
At Altia, we have spent 30 years creating digital solutions prepared for the future, capable of generating real value and driving significant change. We are driven by a clear purpose: to grow by enabling growth, and to do so in a sustainable and lasting way. We believe that our impact is meaningful only if we contribute positively and everyone evolves in the process. We are an international team of professionals who, since 1994, have combined their energy and innovative vision to work on relevant projects for organizations that are drivers of change.
Our approach is end-to-end, developing customized solutions and integrating products from leading manufacturers. We foster innovation and technological renewal through a broad range of services and products.
We are expanding our Data team and are looking for a Data Engineer with experience in building data ingestion pipelines and developing ETL processes. We seek candidates with over 5 years of experience in developing and implementing ETL jobs for data warehouses using SQL and Python, who are eager to learn and grow professionally with us in a supportive environment.
Your main responsibilities will include:
1. Creating ingestion pipelines from various data sources.
2. Developing and maintaining ETL processes using SQL and Python.
3. Building aggregated Metrics Dashboards.
4. Working with AWS or Azure Big Data tools (Glue, Athena, Redshift, Kinesis, Databricks, Azure Analytics, Data Explorer).
5. Utilizing cloud storage and functions (S3, Blob, Lambda, Azure Functions).
6. Applying data engineering methodologies (data warehouse, data lake, star schema).
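The SQL-and-Python ETL work described above can be pictured with a minimal sketch. All names here (the "fact_sales" table, the sample rows, the in-memory SQLite target standing in for a cloud warehouse such as Redshift) are hypothetical, chosen only so the example is self-contained:

```python
# Minimal ETL sketch: extract rows, transform them, load a warehouse table,
# then run the kind of aggregate query a metrics dashboard would use.
import sqlite3

# Extract: in practice this would read from S3/Blob storage or a source database.
raw_rows = [
    ("2025-04-01", "ES", "149.90"),
    ("2025-04-01", "PT", "80.00"),
    ("2025-04-02", "ES", "25.50"),
]

# Transform: parse amounts into floats and normalize country codes.
clean_rows = [(day, country.lower(), float(amount))
              for day, country, amount in raw_rows]

# Load: write into a star-schema-style fact table
# (SQLite stands in for the real warehouse engine).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (day TEXT, country TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", clean_rows)

# Aggregate: total sales per country, the raw material for a dashboard metric.
totals = dict(conn.execute(
    "SELECT country, SUM(amount) FROM fact_sales GROUP BY country"
))
print(totals)  # {'es': 175.4, 'pt': 80.0}
```

Real pipelines differ mainly in scale and plumbing (scheduled jobs, incremental loads, cloud storage), but the extract/transform/load shape is the same.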
Requirements:
* University degree in computer science, engineering, mathematics, or a related field.
* More than 5 years of experience in data engineering.
* Experience in developing and implementing ETL jobs.
* Proficiency with SQL and Python.
* Experience with AWS or Azure Big Data tools.
* Knowledge of cloud storage and functions.
* Understanding of data warehouse, data lake, and star schema concepts.
* Advanced English skills.
Preferred qualifications include:
* AWS certifications (Solutions Architect, Data Analytics Specialty).
* Experience with AWS CloudFormation or Terraform.
* Knowledge of CI/CD pipelines (AWS CodePipeline/CodeBuild/CodeDeploy).
* Knowledge of data APIs and information security.
* Experience with Git, Jira, and Bitbucket.
* Familiarity with the Gartner ODM framework.
We offer:
* Remote work model.
* Flexible schedule and summer hours.
* Continuous training plan.
* Opportunity to work with major clients in both public and private sectors.
* Flexible and competitive compensation.
* An open and collaborative culture.