About Nova & The Role
Nova is the professional network that connects the most talented people globally.
What sets Nova apart from LinkedIn and other networks is merit-based access: every member goes through a rigorous selection process to join.
Nova is built on the belief that your professional success is determined by your network. Whether finding a business partner, finding a mentor, or getting peer support, members achieve their professional goals through Nova, one connection at a time.
We operate two main business lines:
* B2C: Through our paid membership, we give top professionals access to a high-calibre network. The Nova platform already connects more than 25,000 exceptionally talented individuals.
* B2B: Through our recruiting and headhunting services, we help companies find and hire Nova-level talent. We have more than 700 B2B clients, including Uber, Santander, and BCG.
We just raised our first VC round, and we are growing fast. Our goals for 2025 are ambitious: we want to double our membership base, expand internationally to more than 25 cities, and make Nova the global network that connects the world's most talented people.
To support this growth, we're expanding our Data & AI team and looking for an Analytics Engineer to take ownership of how we structure, model, and deliver data across the company. You'll play a key role in making Nova a truly data-driven organization.
What You'll Be Doing
* Own Nova's data models by designing clean, reliable dimensions and metrics that power dashboards, analytics, and decision-making across teams.
* Build and maintain pipelines that integrate data from multiple sources into our central data warehouse using Airbyte or similar ingestion tools.
* Implement orchestration with modern workflow tools (e.g., Prefect, Dagster, Airflow) to manage, monitor, and schedule data pipelines.
* Ensure data quality by implementing tests, validations, and documentation, making sure business teams trust the numbers they use.
* Enable the business by making data accessible in Metabase, creating dashboards, and supporting teams in their analytics needs.
* Collaborate closely with Business, Product, and Tech teams to translate their needs into well-structured data assets.
* Stay agile and adaptive, leveraging modern tools (including AI-assisted development environments like Cursor or Claude Code) to work faster, smarter, and keep Nova at the cutting edge.
What We're Looking For
* Strong SQL skills: You can write complex queries and optimize them when needed.
* Proficiency in Python: You're comfortable scripting and automating data workflows.
* Pipeline mindset: You have experience building or maintaining data pipelines and integrating third-party data sources into a warehouse.
* Familiarity with orchestration tools: At least some exposure to workflow managers such as Airflow, Prefect, or Dagster.
* Modeling & documentation: You understand the basics of data modeling (dimensions, metrics) and care about documentation and quality.
* BI experience: You've worked with at least one BI tool (Metabase, Looker, Tableau, Power BI, etc.).
* Cloud & Git: You're comfortable working in cloud environments (AWS/GCP/Azure) and using Git for version control and collaboration.
* Curiosity & ownership: You want to "own the data," ensuring it's reliable, well-modeled, and valuable for the business.
* Experience level: 2–4 years of experience in data-related roles (analytics, engineering, or similar).
Your compensation
* Yearly base salary revision and global career opportunities
* 10% yearly bonus
* Networking with high-potential individuals in your everyday job
* Be part of a team disrupting an industry with tech- and data-oriented solutions
* Flexible schedule to adapt to your life
* Talented global team of Novas
* Responsibility and autonomy from day 1
* Learning and development programs developed by Nova and tailored to top talent like you
* Office in the city center and flexible remote work
* Perks such as Payflow and free coffee
Languages
Required (fluent)
* English
* Spanish
Valuable skills
* Experience with dbt for modeling, testing, and documenting data.
* Exposure to Infrastructure as Code tools (Terraform, Pulumi) and automated deployments.
* Familiarity with data quality/observability tools.
* Previous experience with data warehouses (e.g., BigQuery, Snowflake, Redshift).
* Interest in AI, agents, and ML, and willingness to experiment with emerging tools to stay ahead.