NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.
Published: May 19, 2020
Assess

In the previous edition of the Radar we featured BERT, a key milestone in NLP (Natural Language Processing). Last year, Baidu released ERNIE 2.0 (Enhanced Representation through kNowledge IntEgration), which outperformed BERT on seven GLUE (General Language Understanding Evaluation) language-understanding tasks and on all nine Chinese NLP tasks. Like BERT, ERNIE provides an unsupervised pretrained language model that can be fine-tuned by adding output layers, producing state-of-the-art models for a wide range of NLP tasks. Where ERNIE departs from traditional pretraining methods is that it is a continual pretraining framework: rather than training on a small, fixed set of pretraining objectives, it can continually introduce diverse pretraining tasks to help the model learn language representations effectively. We're excited about these advances in NLP and look forward to experimenting with ERNIE in our projects.
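The "add an output layer and fine-tune" pattern that ERNIE shares with BERT can be sketched as follows. This is a conceptual toy, not ERNIE's actual implementation: the encoder here is a random stand-in for a pretrained network, and the names (`PretrainedEncoder`, `ClassificationHead`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class PretrainedEncoder:
    """Stand-in for a pretrained ERNIE/BERT-style encoder.

    In real fine-tuning these weights come from unsupervised
    pretraining and are only lightly adjusted, if at all.
    """
    def __init__(self, hidden=8):
        self.W = rng.normal(size=(hidden, hidden))

    def encode(self, x):
        # Produce a pooled sentence representation.
        return np.tanh(x @ self.W)

class ClassificationHead:
    """The newly added output layer, trained for the downstream task."""
    def __init__(self, hidden=8, classes=2):
        self.W = rng.normal(size=(hidden, classes))

    def __call__(self, h):
        logits = h @ self.W
        # Softmax over the task's classes.
        e = np.exp(logits - logits.max())
        return e / e.sum()

encoder = PretrainedEncoder()
head = ClassificationHead()

x = rng.normal(size=8)            # stand-in for token embeddings
probs = head(encoder.encode(x))   # forward pass: pretrained encoder + new head
```

The key design point the blip describes: the encoder is reused across tasks, and only a small task-specific head (plus optionally the encoder weights) is trained per downstream task.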