
Lab Paper Accepted by NAACL, a Top NLP Conference
Source: Tianyong Hao / South China Normal University
2023-01-25

Our lab's paper has been accepted by NAACL, one of the five top conferences in the NLP field. The first author is Xiaozhi Zhu, a second-year master's student; both the first author's and the corresponding author's affiliation is South China Normal University.

Title: A Self-supervised Joint Training Framework for Document Reranking

Author: Xiaozhi Zhu, Tianyong Hao, Sijie Cheng, Fu Lee Wang, Hai Liu

Abstract: Pretrained language models such as BERT have been successfully applied to a wide range of natural language processing tasks and have also achieved impressive performance in document reranking. Recent work indicates that further pretraining a language model on a task-specific dataset before fine-tuning helps improve reranking performance. However, pre-training tasks such as masked language modeling and next sentence prediction are based on the context of documents rather than encouraging the model to understand the content of queries in the document reranking task. In this paper, we propose a new self-supervised joint training framework (SJTF) with a self-supervised method called Masked Query Prediction (MQP) to establish semantic relations between given queries and positive documents. The framework randomly masks a token of the query, encodes the masked query paired with positive documents, and uses a linear layer as a decoder to predict the masked token. In addition, MQP is used to jointly optimize the model with the supervised ranking objective during the fine-tuning stage, without an extra pre-training stage. Extensive experiments on the MS MARCO passage ranking and TREC Robust datasets show that models trained with our framework obtain significant improvements over the original models.
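For readers unfamiliar with the setup, the following is a minimal sketch of how the MQP objective described in the abstract could be combined with a ranking loss, assuming a BERT cross-encoder via Hugging Face Transformers. It is an illustrative reconstruction, not the authors' code: the helper names (mqp_decoder, rank_head, mqp_weight) and the pointwise ranking loss are assumptions made for the sketch.

```python
# Sketch of Masked Query Prediction (MQP) joined with a ranking objective.
# Assumptions: a BERT cross-encoder, a pointwise ranking loss, and the
# helper names below; the paper's actual objective may differ.
import random

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")

# Linear layer as decoder over the vocabulary, predicting the masked query token.
mqp_decoder = nn.Linear(encoder.config.hidden_size, tokenizer.vocab_size)
# Pointwise reranking head on the [CLS] vector (an assumption for this sketch).
rank_head = nn.Linear(encoder.config.hidden_size, 1)

def joint_loss(query: str, positive_doc: str, mqp_weight: float = 0.5) -> torch.Tensor:
    """Supervised ranking loss plus the self-supervised MQP loss."""
    enc = tokenizer(query, positive_doc, return_tensors="pt", truncation=True)
    input_ids = enc["input_ids"].clone()

    # Randomly mask one query token (positions between [CLS] and the first [SEP]).
    first_sep = (input_ids[0] == tokenizer.sep_token_id).nonzero()[0].item()
    pos = random.randint(1, first_sep - 1)
    target = input_ids[0, pos].item()
    input_ids[0, pos] = tokenizer.mask_token_id

    # Encode the masked query paired with the positive document.
    hidden = encoder(
        input_ids=input_ids,
        attention_mask=enc["attention_mask"],
        token_type_ids=enc["token_type_ids"],
    ).last_hidden_state  # shape: (1, seq_len, hidden_size)

    # Self-supervised objective: predict the masked query token.
    mqp_logits = mqp_decoder(hidden[0, pos])
    mqp_loss = nn.functional.cross_entropy(
        mqp_logits.unsqueeze(0), torch.tensor([target])
    )

    # Supervised objective: score the (query, positive document) pair; a real
    # setup would contrast this score against sampled negative documents.
    score = rank_head(hidden[0, 0]).squeeze()
    rank_loss = nn.functional.binary_cross_entropy_with_logits(
        score, torch.tensor(1.0)
    )

    # Joint optimization during fine-tuning, with no further pre-training stage.
    return rank_loss + mqp_weight * mqp_loss
```

The key design point the abstract describes is that MQP shares the encoder with the ranking head, so predicting masked query tokens shapes the same query-document representations that are used for scoring, without a separate further pre-training stage.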

Acceptance type: Long paper

