> How far does one billion parameters take you? As it turns out, pretty far!!!
>
> Today we're releasing phi-1.5, a 1.3B parameter LLM exhibiting emergent behaviors surprisingly close to much larger LLMs.
>
> For warm-up, see an example completion w. comparison to Falcon 7B & Llama2-7B pic.twitter.com/x5qZGPjoSZ
>
> — Sebastien Bubeck (@SebastienBubeck) September 12, 2023
Key Details on phi-1.5:
- Architecture: Transformer-based model trained for next-word prediction.
- Training Data: 30 billion tokens drawn from sources such as StackOverflow, competitive programming problems, and synthetic textbooks.
- Training Tokens: 150 billion tokens total.
- Precision: fp16
- Hardware: 32x A100 40GB GPUs
- Training Time: 8 days
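Those figures are easy to sanity-check with a little arithmetic; the sketch below is purely illustrative and derives everything from the numbers listed above.

```python
# Back-of-the-envelope arithmetic from the key details above.
params = 1.3e9          # model parameters
bytes_per_param = 2     # fp16 stores each parameter in 2 bytes

weights_gb = params * bytes_per_param / 1e9
print(f"fp16 weights: ~{weights_gb:.1f} GB")   # ~2.6 GB -- fits comfortably on one A100 40GB

dataset_tokens = 30e9   # size of the training dataset
seen_tokens = 150e9     # tokens processed during training
print(f"passes over the data: ~{seen_tokens / dataset_tokens:.0f}")  # ~5 passes
```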
phi-1.5 is designed to handle question answering, conversational prompts, and code-related tasks. What sets it apart is its training mix, which blends technical material (code, programming Q&A, synthetic textbooks) with conversational data; the team credits this blend for its performance across these domains.
The Microsoft Research team reports that phi-1.5 achieves near state-of-the-art results among models with fewer than 10 billion parameters, with benchmarks showing it outperforming Llama 2 on common sense, language understanding, and reasoning. Notably, phi-1.5 exceeded Llama 2-7B on the AGIEval score and approached parity with it on the LM-Eval harness.
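For readers who want to reproduce this kind of comparison, the sketch below runs phi-1.5 through EleutherAI's lm-evaluation-harness, the tooling family behind LM-Eval numbers. The task list, batch size, and dtype here are assumptions for illustration, not the exact configuration used in the report.

```python
# Illustrative sketch: scoring phi-1.5 with EleutherAI's lm-evaluation-harness
# (pip install lm-eval). Task names and batch size are assumptions, not the
# report's exact setup.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                       # Hugging Face causal-LM backend
    model_args="pretrained=microsoft/phi-1_5,dtype=float16",
    tasks=["hellaswag", "winogrande", "arc_easy"],    # common-sense / reasoning tasks
    batch_size=8,
)

# Per-task metrics (accuracy, normalized accuracy, ...) live under "results".
for task, metrics in results["results"].items():
    print(task, metrics)
```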
phi-1.5 makes the case that bigger isn't always better: careful curation of training data yields broad versatility and capability with far fewer parameters than competing models.
Microsoft's latest model cements its position at the forefront of lean yet capable AI systems. Try out phi-1.5 on Hugging Face to experience its capabilities firsthand.
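A minimal way to do that locally is via the transformers library, loading the checkpoint in fp16 to match its training precision. The sketch below assumes the microsoft/phi-1_5 repository on Hugging Face and a recent transformers release; the prompt is only an example.

```python
# Minimal sketch: running phi-1.5 locally with Hugging Face transformers.
# Assumes the microsoft/phi-1_5 checkpoint; the prompt is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # fp16, matching the training precision above
    device_map="auto",           # needs the accelerate package; puts weights on GPU if available
    # Older transformers releases may additionally require trust_remote_code=True.
)

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```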
Read the paper: Textbooks Are All You Need II: phi-1.5 technical report