SenseTime and Shanghai AI Laboratory, together with the Chinese University of Hong Kong, Fudan University and Shanghai Jiao Tong University, recently released "InternLM", a hundred-billion-parameter-scale large language model. With 104 billion parameters, "InternLM" is trained on a high-quality multilingual dataset containing 1.6 trillion tokens.
Comprehensive evaluation results show that "InternLM" not only performs well on individual test tasks such as knowledge mastery, reading comprehension, mathematical reasoning and multilingual translation, but also demonstrates strong overall capability. It therefore scores highly in the comprehensive evaluation and surpasses ChatGPT on a number of Chinese examinations, including