BBT-2 is a general-purpose large language model with 12 billion parameters. On top of BBT-2, specialized models for code, finance, and text-to-image generation have been trained. The BBT-2 series includes:

- BBT-2-12B-Text: Chinese base model with 12 billion parameters
- BBT-2.5-13B-Text: Chinese + English bilingual base model with 13 billion parameters
- BBT-2-12B-TC-001-SFT: a code model fine-tuned with instructions, capable of dialogue
- BBT-2-12B-TF-001: a financial model trained on the 12-billion-parameter base to solve tasks in the financial domain
- BBT-2-12B-Science: scientific papers…
#GeneralLargeLanguageModel #BBT2