Hacker News
ksynwa | 8 days ago | on: Arcee Trinity Mini: US-Trained MoE Model
> Trinity Large is currently training on 2048 B300 GPUs and will arrive in January 2026.
How long does the training take?
arthurcolle | 8 days ago
A couple of days or weeks, usually. No one is doing 9-month training runs.
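A rough sanity check of that claim, using the common 6·N·D approximation for training FLOPs (N active parameters, D training tokens). Every number below is an illustrative assumption, not an Arcee or NVIDIA figure; in particular the per-GPU throughput and utilization are placeholders:

```python
# Back-of-envelope training-time estimate. All inputs are assumptions
# chosen for illustration; only the GPU count comes from the post.
active_params = 30e9        # assumed active parameters per token (MoE)
tokens = 10e12              # assumed training tokens
flops_needed = 6 * active_params * tokens  # standard 6*N*D approximation

gpus = 2048                 # from the post ("2048 B300 GPUs")
flops_per_gpu = 2e15        # assumed sustained FLOP/s per GPU (placeholder)
mfu = 0.4                   # assumed model FLOPs utilization

seconds = flops_needed / (gpus * flops_per_gpu * mfu)
days = seconds / 86400
print(f"~{days:.0f} days")
```

Under these assumptions the run lands in the low double digits of days, consistent with "a couple of weeks"; a dense model of the same total size, or lower utilization, would push it toward months.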