The Role

This administrator will be a subject matter expert responsible for the operational health, performance, and governance of the core components of our modern data stack, built on Microsoft Azure. The role focuses on the administration, optimisation, and security of our Azure Data Lake, Fivetran ingestion platform, Kafka streaming platform, Airflow orchestration environment, and Snowflake data warehouse, ensuring a reliable and efficient environment for all data operations.

Requirements

Administer the Snowflake data warehouse (running on Azure), including user/role management, resource monitoring, cost optimisation, and implementing security controls.
Manage and operate the Kafka/Confluent Cloud platform, including topic management, monitoring cluster health, and ensuring the reliability of real-time data streams.
Administer the Fivetran platform, managing connectors, monitoring data ingestion pipelines, and ensuring reliable integration with sources and destinations.
Manage and operate the Airflow environment, including the deployment of DAGs, managing connections, monitoring scheduler and worker health, and ensuring reliable execution of data workflows.
Govern the Azure Data Lake (ADLS Gen2), managing access controls (IAM) and storage policies, and ensuring it serves as a reliable landing zone for raw data.
Act as the technical point of contact for platform-level issues, liaising with security, network, and infrastructure teams to manage integrations within the Azure ecosystem.
Ideal Candidate Profile

Experience: Solid experience in Data Platform Administration, Data Operations, or as a Data Engineer with a strong focus on platform management within an Azure environment.
Azure Ecosystem Expertise: Deep, hands-on experience with the Microsoft Azure cloud. Must be proficient with core services like Azure Data Lake, Azure Active Directory for identity management, and Azure Networking concepts (VNet, NSGs, Private Endpoints).
Platform Expertise: Hands-on administrative experience with the core components of a modern data stack, including:
Snowflake: Administering security, cost management, and performance on Azure.
Kafka: Managing and operating the platform and its ecosystem.
Fivetran: Managing connectors, monitoring pipelines, and troubleshooting.
Airflow: Administering the environment, managing DAG deployments, and troubleshooting workflow executions.
Technical Integration: Good understanding of integration patterns and the technical requirements for connecting platforms within a secure enterprise environment, specifically on Azure (e.g., VNet peering, firewall rules, private endpoints, authentication protocols).
Technical Skills: Strong command of SQL for platform diagnostics and data analysis, with practical scripting skills (PowerShell, Python, Bash) for automation. Experience with orchestration tools like Airflow from an operational perspective is required.
Industry: Previous experience in the Insurance or Financial Services sector is a significant advantage.
Benefits

Competitive compensation
Permanent position
Benefits package, including health insurance and mental health support
Financial support for ongoing training
A relaxed dress code at the Marionete offices
An abundance of career paths and opportunities to advance
A flexible and hybrid work environment

Interview process: 3 interviews
OUR PROCESS:
At Marionete we try to make our recruitment process as fair and unbiased as possible. We use virtual, in-person screenings alongside online technical tests, as we like to let the candidate’s ability speak for itself. In applying to this role, you confirm you are willing to participate in the Marionete recruitment process.
We will endeavour to give you feedback at all stages of the process, either directly or through your appointed agent. We look forward to meeting you.
By applying to Marionete vacancies, you agree that your data and CV remain secure and confidential with your application. Marionete is a company in compliance with the General Data Protection Regulation (GDPR).