vidarh • 10mo ago
Frontier trained a ChatGPT-sized large language model with only 3,000 of its 37,888 Radeon GPUs
www.tomshardware.com
The world's fastest supercomputer blasts through one trillion parameter model with only 8 percent of its MI250X GPUs