Co-introduced VaultGemma, the most capable language model trained from scratch with differential privacy.
How media typically covers Amer Sinha
Amer Sinha as author
Scaling laws for differentially private language model training accurately model compute-privacy-utility trade-offs and identify optimal training configurations for privacy-preserving LLM development.
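A minimal sketch of what evaluating such a scaling law could look like, assuming a hypothetical Chinchilla-style loss fit extended with a privacy-noise penalty term; the coefficients, the noise term's form, and the compute-budget sweep are illustrative assumptions, not the paper's actual fit:

```python
# Hypothetical compute-privacy-utility scaling law: Chinchilla-style terms
# (coefficients from Hoffmann et al. 2022) plus an assumed penalty that grows
# with the per-step noise-to-batch ratio sigma / B. Not the VaultGemma fit.
def predicted_loss(n_params: float, n_tokens: float, noise_batch_ratio: float,
                   a: float = 406.4, b: float = 410.7, e: float = 1.69,
                   alpha: float = 0.34, beta: float = 0.28,
                   c: float = 1.0, gamma: float = 0.5) -> float:
    return (e
            + a / n_params ** alpha             # capacity-limited term
            + b / n_tokens ** beta              # data-limited term
            + c * noise_batch_ratio ** gamma)   # privacy-noise penalty (assumed form)

# Sweep model sizes at a fixed compute budget (tokens ~= FLOPs / 6N) to find
# the loss-minimizing configuration, mirroring the idea of identifying an
# optimal training configuration under a privacy constraint.
FLOPS_BUDGET = 1e21
best = min(
    ((n, predicted_loss(n, FLOPS_BUDGET / (6 * n), noise_batch_ratio=1e-3))
     for n in (1e8, 3e8, 1e9, 3e9)),
    key=lambda t: t[1],
)
print(f"best model size {best[0]:.0e} params, predicted loss {best[1]:.3f}")
```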
“Author of ‘Tool-space interference in the MCP era: Designing for agent compatibility at scale’ on arXiv”
Referenced in coverage
Google released VaultGemma, the largest open-weight language model (1B parameters) trained from scratch with differential privacy, alongside new scaling laws quantifying compute-privacy-utility trade-offs.
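Training "from scratch with differential privacy" is generally done with DP-SGD (Abadi et al., 2016): clip each per-example gradient, add calibrated Gaussian noise, then average. Below is a minimal numpy sketch of that core step on a toy least-squares problem; the hyperparameters are illustrative, privacy accounting is omitted, and this is not VaultGemma's training code:

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(w, X, y, clip_norm=1.0, noise_multiplier=1.1, lr=0.1):
    """One DP-SGD step for least squares: clip each per-example gradient to
    clip_norm, sum, add Gaussian noise with std noise_multiplier * clip_norm,
    then average over the batch and take a gradient step."""
    residuals = X @ w - y                        # shape (B,)
    per_example_grads = residuals[:, None] * X   # shape (B, d)
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        0.0, noise_multiplier * clip_norm, size=w.shape)
    return w - lr * noisy_sum / len(X)

# Toy usage: recover w ~= [2, -1] from synthetic data despite the added noise.
X = rng.normal(size=(256, 2))
y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=256)
w = np.zeros(2)
for _ in range(200):
    w = dp_sgd_step(w, X, y)
print("learned weights:", w)
```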