Published: Apr 03, 2024
NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.
Apr 2024
Assess

Mixtral is part of the family of open-weight large language models released by Mistral that utilizes the sparse mixture-of-experts architecture. The family is available in both raw pretrained and fine-tuned forms in 7B and 8x7B parameter sizes. Its sizes, open-weight nature, performance on benchmarks and context length of 32,000 tokens make it a compelling option for self-hosted LLMs. Note that these open-weight models are not tuned for safety out of the box, so users need to refine moderation based on their own use cases. We have experience with Aalap, a fine-tuned Mistral 7B model trained on data related to specific Indian legal tasks, which has performed reasonably well at a modest cost.
