Experteer Overview
The information below details the position's requirements, the expected candidate experience, and the corresponding qualifications.
As part of JLL Technologies, you will design and build scalable data solutions to support capital markets initiatives. You will work in a global, fast-paced team to shape data platforms that enable better decision-making and operational efficiency. You’ll tackle data quality, pipeline reliability, and cross-cloud integration, contributing to the broader mission of leveraging technology to enhance the value of buildings. This role offers impact-driven work in a collaborative, multi-disciplinary environment.
Responsibilities
• Design and implement robust data pipelines using Databricks, Apache Spark, and Delta Lake, and integrate with BigQuery
• Create scalable data pipeline frameworks and ensure data flows from sources to data lakes, warehouses, and analytics platforms
• Troubleshoot data processing, quality, and pipeline performance issues
• Document data infrastructure, pipelines, and ETL processes; enable knowledge transfer
• Develop automated tests and integrate them into testing frameworks
• Configure and optimize Databricks workspaces, clusters, and job scheduling
• Operate in a multi-cloud environment (Azure, GCP, AWS)
• Enforce security best practices including access controls, encryption, and audit logging
• Build integrations with market data vendors, trading systems, and risk platforms
• Establish monitoring and performance tuning for data pipelines and overall health
• Collaborate with cross-functional teams to define data ingestion and meet stakeholder needs
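As a rough illustration of the automated data-quality checks described above, here is a minimal sketch in plain Python (the names `validate_rows` and `REQUIRED_FIELDS` are hypothetical, not from any JLL codebase; a production version would typically run as a PySpark job over Delta tables):

```python
# Minimal sketch of an automated data-quality check of the kind a pipeline
# might apply before loading records into a warehouse. All field names are
# illustrative assumptions.

REQUIRED_FIELDS = {"deal_id", "asset_value", "close_date"}

def validate_rows(rows):
    """Partition a batch of records into (valid, rejected) lists.

    A record is rejected if any required field is missing or if
    asset_value is negative; each rejection carries its reasons.
    """
    valid, rejected = [], []
    for row in rows:
        missing = REQUIRED_FIELDS - row.keys()
        if missing or row.get("asset_value", 0) < 0:
            reasons = sorted(missing) or ["negative asset_value"]
            rejected.append((row, reasons))
        else:
            valid.append(row)
    return valid, rejected
```

In practice a check like this would be wired into the pipeline's testing framework so that rejected records are quarantined and surfaced through monitoring rather than silently dropped.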
Qualifications
• Bachelor's degree in Computer Science, Data Engineering, or related field; Master’s preferred
• 3-5 years of experience in data engineering or full-stack development with a cloud focus
• Strong Python, SQL, PySpark, and Spark expertise for large-scale data projects
• Proven Databricks experience and data pipeline design/implementation
• Experience with data warehousing concepts and cloud-based platforms (BigQuery, Snowflake)
• Proficiency with cloud platforms (Azure, GCP, AWS) and DevOps practices
• Experience with data validation, testing (unit, functional, integration, security) and workflow automation