auxerta · labs

Foundation models,
from the ground up.

Founders · Philip Abao · Soraya Johnson

Abstract

We pursue foundational research on novel pretraining paradigms — architectures and training objectives that depart from standard autoregressive next-token modeling. Our work focuses on representations that transfer cleanly across tasks and modalities.
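For reference, the "standard autoregressive next-token modeling" mentioned above is typically trained with a per-token cross-entropy objective. The sketch below is illustrative only and is not drawn from Auxerta's own work; the function name and the plain-list representation of logits are assumptions for the example.

```python
import math

def next_token_nll(logits, target_ids):
    """Average negative log-likelihood of target tokens under the model.

    logits: one list of vocab-sized raw scores per sequence position
    target_ids: the ground-truth next-token id at each position
    (Illustrative baseline objective; not Auxerta's method.)
    """
    total = 0.0
    for pos_logits, tgt in zip(logits, target_ids):
        # Stable log-sum-exp to normalize the softmax denominator.
        m = max(pos_logits)
        log_z = m + math.log(sum(math.exp(x - m) for x in pos_logits))
        # Cross-entropy at this position: log Z minus the target's score.
        total += log_z - pos_logits[tgt]
    return total / len(target_ids)
```

With uniform logits over a vocabulary of size V, the loss reduces to log V, the entropy of a uniform guess.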

1.  On the boards

What we're working on, what we've written, and how to get in touch.

Careers

No open positions right now.

2.  Products

Auxerta's release platform.

Neognathae.com

Have a question or want to work together?

We usually reply within a day.

3.  Timeline

Milestones
  1. 2025.Q3 · First domain annotation service
  2. 2026.Q1 · Accepted to NVIDIA Innovation Labs
  3. 2026.Q1 · 1M specialized datasets
  4. 2026.Q2 · Research in progress

Merit. Craft. Responsibility.