Data Engineer - PagoNxt
Country: Spain
**PagoNxt is looking for a Data Engineer**, preferably based in our Boadilla del Monte office in Madrid.
**WHY YOU SHOULD CONSIDER THIS OPPORTUNITY**
**PagoNxt** is a new company wholly owned by Santander that brings our most disruptive payments & trade businesses together into a single, autonomous company, creating one of the largest private fintech companies in the world. This new company will deliver payment & trade solutions for merchants and consumers with compelling technology, aiming to become a leading general payments provider in the medium term. In the short term, PagoNxt Payments (Payments Hub) will centralize all Santander Group payments worldwide on its proprietary public cloud technology, optimizing the group's processes and costs through cloud economies of scale and synergies.
To achieve these goals, PagoNxt is organized into three different businesses:
- Merchant Solutions, for merchants and acquirers.
- PagoNxt Payments (Payments Hub), a cloud-based payments processor and payments data ecosystem, and Trade Solutions, for corporates and SMEs.
- Consumer Solutions, digital products and services for individuals.
This role will be part of our PagoNxt Payments team and will be responsible for delivering fast and efficient big data architecture and data products in our data lake to our clients: all the banks and companies owned by Santander that will migrate their payments processing to PagoNxt Payments cloud technology.
Santander is proud to be an organization that offers equal opportunities regardless of gender identity, culture, and disability. Our mission is to help more people and businesses prosper. We embrace a strong risk culture, and all our professionals at all levels are expected to take a proactive and responsible approach toward risk management.
**WHAT YOU WILL BE DOING**
As a **Data Engineer**, you will join a squad developing and implementing the functionalities of a data lake where all the company's payments events will be ingested and processed to create data products for our clients, serving purposes such as internal Management Information (MI), regulatory reporting, client data consumption, financial crime prevention (AML), business analytics, and data science.
You will be part of a multidisciplinary team responsible for the end-to-end definition, implementation, testing, and deployment in a public cloud-based architecture, following a data mesh culture, agile methodologies, and high standards in programming practices, data visualization, data governance, process automation, cybersecurity, analytics development (ML/DL), pipeline deployment, etc.
We need someone like you to help us on several fronts:
- Implementation of new ingestions, data models, transformations, visualizations and analysis based on requirements captured by the Product Owners in user stories.
- Definition and implementation of automated test suites (unit, schema, E2E...) to ensure the correct execution of the implemented software.
- Strongly encouraging the adoption of a data mesh culture across the company.
- Participating in the meetings and dynamics of the squad and the projects.
- Follow-up, execution and improvement/optimization of existing processes.
- Collaborating with different stakeholders and management to make strategic architectural and design decisions.
- Complying with defined quality standards.
EXPERIENCE
- 2+ years of relevant experience.
EDUCATION
- Technical degree in computer science engineering, mathematics, physics, statistics, or equivalent.
- Master’s degree in big data / data engineering.
SKILLS & KNOWLEDGE
- English essential; Spanish desirable but not required.
- Working with data in cloud environments (preferably AWS).
- Spark SQL (ETL, data preparation) and Spark Core (Spark process optimization).
- Strong Python programming skills (data-related work is a must, but not exclusively).
- GitHub
- Visualization tools (Power BI, Tableau...)
- Data warehousing / data modelling
- Data governance and security
- Agile methodologies
NICE TO HAVE
- Kafka
- Elasticsearch
- Hadoop ecosystem
- API technical knowledge
- Working with hierarchical text files (JSON, XML...)
- Payments knowledge (ISO 20022, SWIFT)
- Financial crime knowledge (screening, transaction monitoring)
- Data science (ML/DL)
- Production deployments of ML pipelines in distributed environments
OTHER INFORMATION
- If you want to know more about us, follow us on _
**Languages**:
- Spanish