Published: Nov 20, 2019
NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.
Nov 2019
Assess

Fairseq is a sequence-to-sequence (seq2seq) modeling toolkit from Facebook AI Research that lets researchers and developers train custom models for translation, summarization, language modeling and other natural language processing tasks. It is a good choice for PyTorch users. It provides reference implementations of several seq2seq models and supports distributed training across multiple GPUs and machines. It is highly extensible and ships with several pretrained models, including RoBERTa, an optimized version of BERT.
