V H@lemmy.stad.social to Technology@stad@lemmy.stad.social · 2 years ago

Frontier trained a ChatGPT-sized large language model with only 3,000 of its 37,888 Radeon GPUs

www.tomshardware.com
Now you're playing with AI power!

The world’s fastest supercomputer blasts through one trillion parameter model with only 8 percent of its MI250X GPUs
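As a quick sanity check on the headline figure, the fraction of Frontier's GPUs used for the run can be computed directly (the GPU counts are from the article; the rounding to "8 percent" is Tom's Hardware's):

```python
# Sanity check: what fraction of Frontier's 37,888 MI250X GPUs is 3,000?
total_gpus = 37_888
used_gpus = 3_000

fraction = used_gpus / total_gpus
print(f"{fraction:.1%}")  # about 7.9%, which the article rounds to 8 percent
```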

