DeepMind’s New AI with a Memory Outperforms Algorithms 25 Times Its Size

DeepMind’s model, with just 7 billion parameters, outperformed the 178-billion-parameter Jurassic-1 transformer on various language tasks.

Read in full here:

This thread was posted by one of our members via one of our news source trackers.



The end of the headline, “Outperforms Algorithms 25 Times Its Size”, sounds like one of those “well, no duh!” moments in programming! :slight_smile: I dunno about the rest of y’all, but when I see two algorithms that do the same thing, one 25x the size of the other, I’d bet on the little one to perform much faster (if not “better” in other terms too).
