
Transforming AI queries with precise search infrastructure
Exa is revolutionizing AI search infrastructure with its embeddings-based search engine, designed to enhance the accuracy of AI model responses by filtering the internet for precise knowledge. Headquartered in Lower Haight, San Francisco, Exa has raised $112.2 million in funding through several succ...
Exa offers competitive salaries, equity options, and a flexible remote work policy. Employees enjoy generous PTO, parental leave, and a budget for pro...
Exa fosters a culture focused on innovation and precision in AI search technology. The team values collaboration and is committed to building a unique...

Exa • San Francisco, California
Exa is building a search engine from scratch to serve every AI application. We build massive-scale infrastructure to crawl the web, train state-of-the-art embedding models to index it, and develop highly performant vector databases in Rust to search over it. We also own a $5M H200 GPU cluster that regularly lights up tens of thousands of machines.
On the ML team, we train foundational models for search. Our goal is to build systems that can instantly filter the world's knowledge to exactly what you want, no matter how complex your query. Basically, put the web into an extremely powerful database.
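For context, here is a minimal, illustrative sketch of the embeddings-based retrieval idea: embed documents and a query, then rank documents by cosine similarity. The `embed` function below is a toy stand-in for a trained encoder, and brute-force similarity stands in for a real vector index; none of this is Exa's actual stack.

```python
# Minimal sketch of embeddings-based retrieval: embed documents and a query,
# then rank documents by cosine similarity. `embed` is a toy placeholder for
# a trained transformer encoder; a production system would use an ANN index.
import torch
import torch.nn.functional as F

def embed(texts: list[str]) -> torch.Tensor:
    # Placeholder: hash tokens into a bag-of-words vector, then L2-normalize.
    dim = 256
    vecs = torch.zeros(len(texts), dim)
    for i, text in enumerate(texts):
        for tok in text.lower().split():
            vecs[i, hash(tok) % dim] += 1.0
    return F.normalize(vecs, dim=-1)

docs = ["rust vector database", "transformer embedding model", "web crawler infrastructure"]
doc_vecs = embed(docs)
query_vec = embed(["how do embedding models index the web?"])

scores = query_vec @ doc_vecs.T                    # cosine similarity (unit-norm vectors)
topk = torch.topk(scores, k=2, dim=-1).indices[0]
print([docs[i] for i in topk.tolist()])            # two closest documents to the query
```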
We're looking for an ML Research Engineer to train embedding models for perfect search over the web. The role involves dreaming up novel transformer-based search architectures, creating datasets and evals, beating our internal SOTA, and repeating.
Desired Experience
You have graduate-level ML experience (or are an exceptionally strong undergrad)
You can code up a transformer from scratch in PyTorch (see the sketch after this list)
You like creating large-scale datasets and diving deeply into the data
You care about the problem of finding high-quality knowledge and recognize how important this is for the world
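For the transformer item above, a minimal pre-norm encoder block in PyTorch gives a sense of the expected fluency. This is an illustrative sketch, not Exa's architecture:

```python
# Minimal pre-norm transformer encoder block in PyTorch -- a sketch of the
# "transformer from scratch" exercise, not a production model.
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    def __init__(self, dim: int = 512, n_heads: int = 8, mlp_ratio: int = 4):
        super().__init__()
        assert dim % n_heads == 0
        self.n_heads = n_heads
        self.head_dim = dim // n_heads
        self.norm1 = nn.LayerNorm(dim)
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, mlp_ratio * dim),
            nn.GELU(),
            nn.Linear(mlp_ratio * dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        b, t, d = x.shape
        q, k, v = self.qkv(self.norm1(x)).chunk(3, dim=-1)
        # Split heads: (batch, n_heads, seq_len, head_dim)
        q, k, v = (z.view(b, t, self.n_heads, self.head_dim).transpose(1, 2) for z in (q, k, v))
        attn = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, d)
        x = x + self.proj(out)            # residual around attention
        x = x + self.mlp(self.norm2(x))   # residual around MLP
        return x

x = torch.randn(2, 16, 512)
print(TransformerBlock()(x).shape)  # torch.Size([2, 16, 512])
```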
Example Projects
Pre-training: Train a hundred-billion-parameter model
Fine-tuning: Build an RLAIF pipeline for search
Dream up a novel architecture for search in the shower, then code it up and beat our best model's top score
Build an eval system that answers the question "How do we know we're advancing our search quality?" (this is an incredibly difficult question to answer; a minimal illustration follows this list)
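For the eval item above, one minimal way to make "are we advancing search quality?" measurable is to track Recall@k and MRR over a labeled set of (query, relevant document) pairs. The sketch below assumes a hypothetical `rank_docs` function standing in for the search system under test; it is an illustration, not our eval harness:

```python
# Minimal retrieval eval sketch: given labeled (query, relevant-doc-id) pairs and a
# ranking function, report Recall@k and MRR so model changes can be compared.
# `rank_docs` is a hypothetical stand-in for the search system being evaluated.
from typing import Callable

def evaluate(rank_docs: Callable[[str], list[str]],
             labeled_pairs: list[tuple[str, str]],
             k: int = 10) -> dict[str, float]:
    hits, reciprocal_ranks = 0, []
    for query, relevant_id in labeled_pairs:
        ranking = rank_docs(query)                   # ordered list of doc ids
        if relevant_id in ranking[:k]:
            hits += 1
        rank = ranking.index(relevant_id) + 1 if relevant_id in ranking else None
        reciprocal_ranks.append(1.0 / rank if rank else 0.0)
    n = len(labeled_pairs)
    return {f"recall@{k}": hits / n, "mrr": sum(reciprocal_ranks) / n}

# Usage: evaluate(my_search_fn, [("best rust vector db", "doc_42"), ...], k=10)
```

Tracking the same metrics on a fixed labeled set across model versions is the simplest way to tell whether a new architecture or dataset actually moved search quality forward.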
This is an in-person opportunity in San Francisco. We're happy to sponsor international candidates (e.g., STEM OPT, OPT, H1B, O1, E3).