Notes on a Substack post covering Ken Goldberg's talk on the robot data gap.
Training a robot foundation model at language-model scale (roughly 2 trillion tokens) would require 70,000+ robot-years of data; combining scaled-up robot fleets, simulation, and human video data could make this feasible, but only with substantial investment.
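A back-of-envelope sketch of the arithmetic behind that claim. The 2-trillion-token target and the 70,000 robot-year figure come from the text; the derived per-robot token rate and the fleet sizes are illustrative assumptions, not numbers from the talk:

```python
# Sanity-check the data-gap arithmetic. TOKEN_TARGET and ROBOT_YEARS
# are from the text; everything derived below is illustrative.
TOKEN_TARGET = 2e12           # language-model-scale corpus, in tokens
ROBOT_YEARS = 70_000          # robot operating time needed
SECONDS_PER_YEAR = 365 * 24 * 3600

# Implied data-collection rate per robot, assuming one "token" of robot
# experience corresponds to one unit of logged interaction.
tokens_per_second = TOKEN_TARGET / (ROBOT_YEARS * SECONDS_PER_YEAR)
print(f"Implied rate: {tokens_per_second:.2f} tokens per robot-second")

# Fleet sizing: calendar time to collect the data with robots running
# around the clock, for a few hypothetical fleet sizes.
for fleet in (1_000, 10_000, 100_000):
    years = ROBOT_YEARS / fleet
    print(f"{fleet:>7,} robots -> {years:.1f} calendar years")
```

At these assumed rates, even a 100,000-robot fleet running continuously needs most of a year, which is why the post points to simulation and human video as supplements rather than fleets alone.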