Production Sub-agents for LLM Post Training
Faye Zhang (Pinterest) Lightning Talk at the Coding Agents Conference at the Computer History Museum, March 3rd, 2026.
Abstract //
Training models used to take weeks. Faye Zhang cut it to days with sub-agents, but the catch is brutal: more agents mean more chaos — memory issues, drift, and broken workflows — so the real game isn't faster training, it's controlling the mess you just created.
Bio //
Faye Zhang is a staff AI engineer and tech lead at Pinterest, where she leads multimodal AI work for search, traffic discovery, and shopping, driving platform growth globally.
Watch on YouTube ↗