Research Engineer / Scientist, Memory
San Francisco, CA
✨ $200k–$350k / year (est.)
full-time · senior · ai-ml
💼 About This Role
You'll design the long-term memory architecture for LLMs, working with a world-class team toward self-improving superintelligence. You'll advance the field by openly publishing research and open-source code. This role requires being in person five days a week in downtown San Francisco.
🎯 What You'll Do
- Define key abstractions for the LLM memory layer
- Build memory architectures supporting multiple memory types
- Research memory sharing between multiple agents
- Improve context management to solve long-context derailment
📋 Requirements
- Deep expertise in LLMs and retrieval
- Track record of impactful research (publications or open source)
- Ability to balance execution speed with empirical rigor
✨ Nice to Have
- Experience with multi-agent systems
- Published memory-related AI research
🎁 Benefits & Perks
- 🚀 Work on frontier AI at a small, high-impact team
- 📝 Open publishing culture with papers and open-source code
- 🏢 In-person collaboration in downtown San Francisco
📨 Hiring Process
Initial screen (30 min), technical screen (1–1.5 hours), then a paid in-person work trial (2 days onsite in SF).
🚩 Heads Up
- Requires 5 days/week in-office with no hybrid option
- Explicitly states that a 9-to-5 work style guarantees failure in this role
- High ambiguity and pressure in a small-team environment