
FFM-Mistral-7B、FFM-Mistral-7B-4K、FFM-Mixtral-8x7B

The FFM-Mistral series consists of Traditional Chinese-enhanced large language models based on Mistral AI's open-source models Mistral 7B and Mixtral 8x7B. By using the FFM-Mistral series models, you agree to comply with the terms of the Apache 2.0 license.

FFM-Mistral-7B
FFM-Mistral-7B is based on the Mistral AI open-source model Mistral-7B and further trained on high-quality Traditional Chinese data. FFM-Mistral-7B can handle large amounts of text with a native 32K context length, and its coding capability is comparable to Meta-CodeLlama-7B, meeting needs such as long-text processing and coding.
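
As an illustration of the long-context use case, the sketch below sends a long document to a deployed FFM-Mistral-7B endpoint for summarization. The endpoint URL, header name, payload fields, and model identifier are placeholders (assumptions), not the documented AFS API; refer to the AFS documents linked further down for the actual request format.

```python
# Hypothetical example: the URL, header, payload fields, and model name are
# placeholders, not the documented AFS API.
import requests

API_URL = "https://<your-afs-endpoint>/models/conversation"  # placeholder
API_KEY = "<your-api-key>"                                    # placeholder

with open("long_report.txt", "r", encoding="utf-8") as f:
    document = f.read()  # a long document, up to roughly the 32K-token context

payload = {
    "model": "ffm-mistral-7b",  # placeholder model identifier
    "messages": [
        {"role": "system", "content": "You are an assistant that summarizes long documents."},
        {"role": "user", "content": f"Summarize the following document in Traditional Chinese:\n\n{document}"},
    ],
    "max_new_tokens": 512,
    "temperature": 0.5,
}

resp = requests.post(
    API_URL,
    headers={"X-API-KEY": API_KEY, "Content-Type": "application/json"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```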

FFM-Mistral-7B-4K
FFM-Mistral-7B-4K is a variant of FFM-Mistral-7B with the context length limited to 4K, offering a more cost-effective option for users who do not need the full 32K context.
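
Whether the 4K variant is sufficient depends on prompt length. The sketch below counts tokens with the base Mistral-7B tokenizer from Hugging Face as a rough proxy (an assumption; FFM-Mistral-7B's actual tokenizer may differ) to check whether a prompt plus the expected reply fits within 4K tokens.

```python
# Rough check of whether a prompt fits the 4K context window.
# Assumes the base Mistral-7B tokenizer is a reasonable proxy for
# FFM-Mistral-7B's tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

def fits_in_4k(prompt: str, reserved_for_output: int = 512) -> bool:
    """Return True if the prompt leaves room for the reply within 4096 tokens."""
    n_tokens = len(tokenizer.encode(prompt))
    return n_tokens + reserved_for_output <= 4096

prompt = "Summarize the quarterly report ..."  # your actual prompt text
print(fits_in_4k(prompt))
```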

FFM-Mixtral-8x7B
FFM-Mixtral-8x7B is based on the Mistral AI open-source model Mixtral-8x7B and further trained on high-quality Traditional Chinese data. It can handle large amounts of text with a native 32K context length. The model uses a mixture-of-experts (MoE) architecture: for each request, the router activates only a subset of expert modules suited to the problem at hand, so a model with a large total parameter count can be run cost-effectively.
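
To make the MoE idea concrete, here is a minimal, generic top-2 routing sketch in NumPy. It only illustrates the routing-and-mixing mechanism; it is not Mixtral's actual implementation, dimensions, or weights.

```python
# Conceptual illustration of top-k mixture-of-experts routing (not Mixtral's code).
# Mixtral-8x7B routes each token to 2 of 8 experts; the sizes here are toy values.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# One feed-forward "expert" per slot (a single linear layer, for brevity).
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """x: (n_tokens, d_model) -> (n_tokens, d_model)."""
    logits = x @ router_w                           # router score per expert
    top = np.argsort(logits, axis=-1)[:, -top_k:]   # indices of the top-k experts
    out = np.zeros_like(x)
    for i, token in enumerate(x):
        chosen = top[i]
        gate = np.exp(logits[i, chosen])
        gate /= gate.sum()                          # softmax over the chosen experts only
        for w, e in zip(gate, chosen):
            out[i] += w * (token @ experts[e])      # weighted sum of expert outputs
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 16); only 2 of the 8 experts ran per token
```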

For the characteristics and specifications of the FFM-Mistral series models, please refer to this document.

The FFM-Mistral series models can be fine-tuned in AFS Platform and deployed in AFS Cloud, as well as in AFS ModelSpace public mode and private mode.

  • For instructions on AFS Platform, please refer to this document.
  • For instructions on AFS Cloud, please refer to this document.
  • For instructions on AFS ModelSpace public mode, please refer to this document.
  • For instructions on AFS ModelSpace private mode, please refer to this document.
  • For instructions on Chat and Playground Interfaces in AFS ModelSpace private mode, please refer to this document.
  • For instructions on Function Calling, please refer to this document.