Overview

As part of JLL Technologies, you will design and build scalable data solutions to support capital markets initiatives. You will work in a global, fast-paced team to shape data platforms that enable better decision-making and operational efficiency. You'll tackle data quality, pipeline reliability, and cross-cloud integration, contributing to the broader mission of leveraging technology to enhance the value of buildings. This role offers impact-driven work in a collaborative, multi-disciplinary environment.

Responsibilities

- Design and implement robust data pipelines using Databricks, Apache Spark, and Delta Lake, and integrate with BigQuery
- Create scalable data pipeline frameworks and ensure data flows from sources to data lakes, warehouses, and analytics platforms
- Troubleshoot data processing, data quality, and pipeline performance issues
- Document data infrastructure, pipelines, and ETL processes; enable knowledge transfer
- Develop automated tests and integrate them into testing frameworks
- Configure and optimize Databricks workspaces, clusters, and job scheduling
- Operate in a multi-cloud environment (Azure, GCP, AWS)
- Enforce security best practices, including access controls, encryption, and audit logging
- Build integrations with market data vendors, trading systems, and risk platforms
- Establish monitoring and performance tuning for data pipelines and overall platform health
- Collaborate with cross-functional teams to define data ingestion requirements and meet stakeholder needs

Requirements

- Bachelor's degree in Computer Science, Data Engineering, or a related field; Master's preferred
- 3-5 years of experience in data engineering or full-stack development with a cloud focus
- Strong Python, SQL, PySpark, and Spark expertise for large-scale data projects
- Proven Databricks experience and data pipeline design/implementation
- Experience with data warehousing concepts and cloud-based platforms (BigQuery, Snowflake)
- Proficiency with cloud platforms (Azure, GCP, AWS) and DevOps practices
- Experience with data validation, testing (unit, functional, integration, security), and workflow automation