What’s more important for training AI — data or compute?
A report from CNBC suggests training data is becoming more important, with the outlet claiming Google’s latest AI language model, PaLM 2, used five times as much training data as its predecessor (3.6 trillion tokens vs. 780 billion) while being a smaller model (340 billion parameters vs. 540 billion).
Any way you slice it, though, these are big numbers, and they show that creating cutting-edge AI models is still very much the domain of rich companies with a lot of resources.