Greater-than-word length text encoder for question answer retrieval.
Release date: 2020-03-11
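
The entry above describes a dual-encoder setup in which questions and candidate answers are embedded separately and ranked by similarity. A minimal usage sketch follows, assuming a TF Hub USE-QA style interface with `question_encoder` and `response_encoder` signatures; the module handle and signature names are assumptions, so check the model page for the exact interface.

```python
# Sketch of question-answer retrieval with a dual-encoder text encoder.
# The handle and signature names are assumptions; verify them on the model page.
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

module = hub.load("https://tfhub.dev/google/universal-sentence-encoder-qa/3")  # assumed handle

questions = ["What is the capital of France?"]
responses = ["Paris is the capital and largest city of France."]
contexts = [""]  # optional surrounding text for each candidate answer

q_emb = module.signatures["question_encoder"](tf.constant(questions))["outputs"]
r_emb = module.signatures["response_encoder"](
    input=tf.constant(responses), context=tf.constant(contexts))["outputs"]

# Rank candidate answers by dot-product similarity to the question.
print(np.inner(q_emb, r_emb))
```
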
Unlike previous cross-lingual tasks, LAReQA tests for "strong" cross-lingual alignment, requiring semantically related cross-language pairs to be closer in representation space than unrelated same-language pairs.
Release date: 2020-04-11
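
The alignment criterion above can be made concrete with a small check: a related question-answer pair in different languages should score higher than an unrelated pair in the same language. The sketch below uses random placeholder embeddings purely to show the comparison; real embeddings would come from a multilingual QA encoder.

```python
# Illustrative "strong alignment" check: related cross-language pair vs.
# unrelated same-language pair. Embeddings are random placeholders.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
question_en = rng.normal(size=512)          # placeholder: English question
answer_de_related = rng.normal(size=512)    # placeholder: its German answer
answer_en_unrelated = rng.normal(size=512)  # placeholder: unrelated English answer

strong = cosine(question_en, answer_de_related) > cosine(question_en, answer_en_unrelated)
print(strong)
```
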
Greater-than-word length multi-lingual text encoder for question answer retrieval.
Release date: 2019-07-01
When evaluated on a wide range of open-domain QA datasets, our dense retriever outperforms a strong Lucene-BM25 system by a large margin, 9%-19% absolute in terms of top-20 passage retrieval accuracy, and helps our end-to-end QA system establish new state of the art on multiple open-domain QA benchmarks.
Release date: 2020-10-04
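
The top-20 retrieval step mentioned above reduces to scoring every passage embedding against the question embedding and keeping the 20 highest scores. The sketch below uses random placeholder embeddings; a real dense retriever would produce them with its question and passage encoders.

```python
# Dense passage retrieval scoring sketch: inner-product scores, top-20 cut.
import numpy as np

rng = np.random.default_rng(0)
passage_embs = rng.normal(size=(10_000, 768))  # placeholder passage index
question_emb = rng.normal(size=768)            # placeholder question embedding

scores = passage_embs @ question_emb           # inner-product relevance scores
top20 = np.argsort(-scores)[:20]               # indices of the 20 best passages
print(top20)
```
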
This is a distilled RoBERTa QA model trained on the MS MARCO dataset, from sbert.net by UKPLab.
Release date: 2019-08-27
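
A minimal sketch of using such a model through the sentence-transformers package from sbert.net is below; the checkpoint name is an assumption, so substitute whichever distilled RoBERTa MS MARCO model is listed there.

```python
# Passage ranking with a sentence-transformers MS MARCO model.
# "msmarco-distilroberta-base-v2" is an assumed checkpoint name.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("msmarco-distilroberta-base-v2")

query = "how long do polar bears live"
passages = [
    "Polar bears live 20 to 30 years in the wild.",
    "The Eiffel Tower was completed in 1889.",
]

query_emb = model.encode(query)        # 1-D numpy array
passage_embs = model.encode(passages)  # 2-D numpy array, one row per passage

scores = passage_embs @ query_emb      # dot-product relevance scores
print(passages[int(np.argmax(scores))])
```
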
We propose a novel method to reduce the memory consumption of BERT, and show that it improves the scalability of BERT models.
Release date: 2019-09-26
Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.
Release date: 2018-10-11
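
The bidirectional conditioning described above can be seen with a masked-token prediction: the model uses context on both sides of the mask. The sketch below uses the Hugging Face transformers fill-mask pipeline and the bert-base-uncased checkpoint, neither of which is specified by the entry; they are only a convenient illustration.

```python
# Masked-token prediction as a view of bidirectional conditioning.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Predicting the mask requires the right-hand context ("of France"),
# which a strictly left-to-right model would not have seen yet.
for prediction in unmasker("Paris is the [MASK] of France."):
    print(prediction["token_str"], round(prediction["score"], 3))
```
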
For example, one can use a BERT model that was trained on text from a similar domain, or a BERT model that was trained for a similar task.
Release date: 2021-02-15
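
A sketch of this kind of checkpoint selection is below, assuming the Hugging Face transformers library as one common way to load a domain-matched BERT; the checkpoint name is only an example and should be replaced with a model close to the target domain or task.

```python
# Loading a domain-matched BERT checkpoint for fine-tuning.
# The checkpoint name is an example; pick one trained on text or a task
# similar to yours.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "dmis-lab/biobert-base-cased-v1.1"  # example: biomedical-domain BERT

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

inputs = tokenizer("The patient responded well to the treatment.", return_tensors="pt")
print(model(**inputs).logits)
```
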