Job description
As a Data Engineer within the corporate division, you will be involved in data splitting processes. We require solid knowledge of data models and experience with tools such as Databricks and Airflow. Programming skills are highly valued, and experience with additional platforms is considered a plus.
- Design, create, and evolve data products by programming data pipelines and using models of structured/unstructured raw data.
- Ensure the best experience for data product users by measuring SLOs and satisfaction levels and implementing what is learned.
- Provide real-time solutions using modern tools and programming languages.
- Maintain, develop, and orchestrate the pipelines that build curated datasets.
- Ensure that quality standards and data governance are achieved within the corporate Data area.
- Guarantee consistent definitions of dimensions and indicators across the company.
- Develop comprehensive data capture, transformation, and visualisations for self-service analytics.
- Collaborate and communicate with other team members and the common platform team to develop and standardize high-quality solutions.
- Coach colleagues and users on how to access the right data, helping eliminate barriers to self-service and team autonomy.
Profile Requirements
**Must**:
- 3 years of experience in a similar position
- Extensive experience with SQL
- Extensive experience with Python and PySpark.
- Experience with Databricks
- Experience working with data processing frameworks such as Spark, Flink, Kafka Streams, or Beam.
- Familiarity developing data pipelines with scheduling tools such as Airflow
- Proficiency in building well-tested code
- Querying tools: Hive, Athena, Presto, BigQuery
- Good development practices (SOLID, testing) and data care (rigor for data quality).
- Self-starter and detail-oriented: you can complete projects with minimal supervision
- Excellent communication skills and ability to interact with multidisciplinary teams
- Intermediate knowledge of GitHub
- Experience working with Agile methodologies
- Fluent in English
**Nice to have knowledge**:
- Experience with Docker and container orchestration tools like Kubernetes, ECS, or Docker Swarm.
- Experience working with a Hadoop distribution: Hortonworks, Cloudera, Amazon EMR, Azure HDInsight, Google Cloud Dataproc, Databricks
- NoSQL and analytical databases: Redshift, Cassandra, HBase.
- Knowledge of Tableau.
- Machine learning tools: scikit-learn, Spark ML, Kubeflow, MLflow
- Storytelling skills
Who are we?
Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1000 clients across the globe, we have been rolling out solutions in major projects for over a decade - this is made possible by an international team of 7500 people spread across 5 continents and more than 60 countries. Our solutions focus on four Business Lines: Information System & Digital, Telecom, Life Sciences, and Engineering. We're focused on building and nurturing a top talent community where all our team members can achieve their full potential. Amaris is your stepping stone to cross rivers of change, meet challenges, and achieve all your projects with success.
**Brief Call**: Our process typically begins with a brief virtual/phone conversation to get to know you! The objective? Learn about you, understand your motivations, and make sure we have the right job for you!
**Interviews**: The average number of interviews is 3 (the number may vary depending on the level of seniority required for the position). During the interviews, you will meet people from our team: your line manager of course, but also other people related to your future role. We will talk in depth about you, your experience, and skills, but also about the position and what will be expected of you. Of course, you will also get to know Amaris: our culture, our roots, our teams, and your career opportunities!
**Case study**: Depending on the position, we may ask you to take a test. This could be a role play, a technical assessment, a problem-solving scenario, etc.
We look forward to meeting you!