Microsoft says its newest compact “small language model,” Phi-2, is bigger and better.
As part of its Phi series, the company has been training AI models on much smaller data sets composed only of “textbook-quality” data.
Microsoft says in a research blog post that Phi-2, which is about twice as big as its predecessor, Phi-1.5, performs on par with or better than certain larger open-source Llama 2 models, including one with 13 billion parameters.