MOSS: An Open-Source Tool-Augmented Conversational Language Model


[中文版] [English]

:spiral_notepad: Open-source List

Models

  • moss-moon-003-base: The base language model of MOSS-003, which was initialized with CodeGen and further pre-trained on 100B Chinese tokens and 20B English tokens. The model has seen 700B tokens during pre-training and consumed ~6.67×10²² FLOPs in total.
  • moss-moon-003-sft: We performed supervised fine-tuning on ~1.1M multi-turn conversations. The fine-tuned model can follow instructions in multi-turn dialogues and refuse inappropriate requests (see the loading sketch after this list).
  • moss-moon-003-sft-plugin: We performed supervised fine-tuning on ~1.1M multi-turn conversational data and additional ~300K plugin-augmented data. The fine-tuned model is capable of using several tools including search engine, text-to-image, calculator, and equation solver.
  • moss-moon-003-sft-int4: 4-bit version of moss-moon-003-sft, which requires 12GB GPU memory to perform inference.
  • moss-moon-003-sft-int8: 8-bit version of moss-moon-003-sft, which requires 24GB GPU memory to perform inference.
  • moss-moon-003-sft-plugin-int4: 4-bit version of moss-moon-003-sft-plugin, which requires 12GB GPU memory to perform inference.
  • moss-moon-003-sft-plugin-int8: 8-bit version of moss-moon-003-sft-plugin, which requires 24GB GPU memory to perform inference.
  • moss-moon-003-pm: The preference model (PM) trained on preference data collected using the responses of moss-moon-003-sft. Will be open-sourced in the near future.
  • moss-moon-003: The final MOSS-003 model trained using moss-moon-003-pm, which demonstrates better factuality and safety as well as more stable response quality. Will be open-sourced in the near future.
  • moss-moon-003-plugin: The final MOSS-003-plugin model trained using moss-moon-003-pm, which possesses stronger abilities in understanding user intents and using plugins. Will be open-sourced in the near future.
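
As a quick orientation for the checkpoints above, the snippet below shows one way to load moss-moon-003-sft with Hugging Face transformers. This is a minimal sketch, assuming the weights are published on the Hugging Face Hub under the fnlp organization and that your GPU holds the 16B model at FP16; it is not the repository's official quickstart.

```python
# Minimal loading sketch (assumed Hub ID: "fnlp/moss-moon-003-sft").
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "fnlp/moss-moon-003-sft"  # swap in an *-int4/*-int8 variant for smaller GPUs
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
# Cast to FP16 so the 16B model fits the memory budgets quoted above.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True).half().cuda()
model.eval()
```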

Data

  • moss-002-sft-data: The multi-turn conversational data used to train MOSS-002, covering helpfulness, honesty, and harmlessness. The data consists of 570K English and 590K Chinese conversations generated by text-davinci-003.
  • moss-003-sft-data: The multi-turn conversational data used to train moss-moon-003-sft. The data was generated by gpt-3.5-turbo from a seed set of user prompts collected through our early deployed MOSS-002 API. In contrast to moss-002-sft-data, moss-003-sft-data is well-aligned with the real-world distribution of user intents, covering finer-grained categories and more diverse harmlessness-related data. It consists of ~1.1M conversations; the full data is now available 🔥. A sketch of the turn format appears after this list.
  • moss-003-sft-plugin-data: The plugin-augmented multi-turn conversational data, consisting of ~300K conversations in which the AI assistant uses four plugins (search engine, text-to-image, calculator, and equation solver) to generate responses. All of this data has been open-sourced.
  • moss-003-pm-data: The preference data used to train moss-moon-003-pm, including ~180K additional dialogue contexts and their corresponding responses generated by moss-moon-003-sft. Will be publicly available in the near future.
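
To make the data descriptions above concrete, the sketch below assembles a MOSS-style multi-turn prompt. Treat it as illustrative: the <|Human|>/<|MOSS|> delimiters and end-of-turn markers follow the conversational format used by the moss-moon models, but the short meta instruction and the helper function are stand-ins, not the exact schema of the released data files.

```python
# Illustrative sketch of the multi-turn conversation format; the meta
# instruction and this helper are assumptions, not the released data schema.
META_INSTRUCTION = "You are an AI assistant whose name is MOSS.\n"

def build_prompt(turns: list[tuple[str, str]]) -> str:
    """turns: (human_utterance, moss_response) pairs; leave the last
    response empty ("") when prompting the model for its next turn."""
    prompt = META_INSTRUCTION
    for human, moss in turns:
        prompt += f"<|Human|>: {human}<eoh>\n<|MOSS|>: {moss}"
        if moss:  # close completed assistant turns with an end-of-message marker
            prompt += "<eom>\n"
    return prompt

print(build_prompt([("What tools can you use?", "")]))
```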

Engineering Solutions

:fountain_pen: Introduction

MOSS is an open-source plugin-augmented conversational language model. moss-moon models have 16B parameters, allowing users to perform inference on a single A100 GPU or two NVIDIA 3090 GPUs at FP16 precision, and on a single NVIDIA 3090 GPU at INT4/INT8 precision. The base language model of MOSS was pre-trained on ~700B English, Chinese, and code tokens, including the Pile, BigQuery, BigPython, and our private Chinese corpus. The base model was then fine-tuned on multi-turn plugin-augmented conversational data. Finally, we performed preference-aware training to further improve the model.
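With a checkpoint loaded as in the earlier sketch, single-GPU inference is an ordinary generate call. Again a hedged sketch: the prompt delimiters follow the MOSS conversational format, and the sampling hyperparameters are illustrative rather than the project's recommended defaults.

```python
import torch

# Assumes `tokenizer` and `model` come from the loading sketch above.
prompt = "<|Human|>: Hi! What can you do?<eoh>\n<|MOSS|>:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(
        **inputs,
        max_new_tokens=256,  # illustrative settings
        do_sample=True,
        temperature=0.7,
        top_p=0.8,
    )
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```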

Limitations: Due to the (relatively) small number of parameters and the autoregressive nature of the model, MOSS may still generate outputs that contain incorrect, misleading, or biased information. Please carefully check the content generated by MOSS before using it.
