August 24, 2023
We release Code Llama, a family of large language models for code based on Llama 2, providing state-of-the-art performance among open models, infilling capabilities, support for large input contexts, and zero-shot instruction-following ability for programming tasks. We provide multiple flavors to cover a wide range of applications: foundation models (Code Llama), Python specializations (Code Llama - Python), and instruction-following models (Code Llama - Instruct), each with 7B, 13B, and 34B parameters.

All models are trained on sequences of 16k tokens and show improvements on inputs with up to 100k tokens. The 7B and 13B Code Llama and Code Llama - Instruct variants support infilling based on surrounding content.

Code Llama reaches state-of-the-art performance among open models on several code benchmarks, with scores of up to 53% on HumanEval and 55% on MBPP. Notably, Code Llama - Python 7B outperforms Llama 2 70B on both HumanEval and MBPP, and all our models outperform every other publicly available model on MultiPL-E. We release Code Llama under a permissive license that allows for both research and commercial use.
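As a rough illustration of what infilling means in practice, the sketch below assembles a fill-in-the-middle prompt from the code surrounding a gap, using the `<PRE>`/`<SUF>`/`<MID>` sentinel scheme described in the Code Llama paper. This is a minimal sketch of the prompt format only, not the official API; the exact sentinel spellings produced by a given tokenizer may differ, and in practice a library such as Hugging Face Transformers handles this formatting for you.

```python
def build_infilling_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt for an infilling-capable model.

    The model sees the code before the gap (prefix) and after the gap
    (suffix), and is expected to generate the missing middle section
    after the <MID> sentinel. Sentinel spellings here follow the
    Code Llama paper and are illustrative, not tokenizer-exact.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"


# Example: ask the model to fill in the body of a function.
prompt = build_infilling_prompt(
    prefix="def add(a, b):\n    ",
    suffix="\n    return result",
)
print(prompt)
```

The generated completion would then be spliced between the prefix and suffix to produce the final file, which is what makes infilling useful for editor-style "fill in this region" workflows rather than left-to-right completion alone.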
Written by
Jonas Gehring
Fabian Gloeckle
Sten Sootla
Itai Gat
Ellen Tan
Jingyu Liu
Tal Remez
Jérémy Rapin
Artyom Kozhevnikov
Ivan Evtimov
Manish Bhatt
Aaron Grattafiori
Wenhan Xiong
Alexandre Defossez
Jade Copet
Faisal Azhar
Louis Martin
Thomas Scialom
Publisher
Meta AI