Existing search engines use keyword matching or TF-IDF-based matching to map a query to web documents and rank them. They also consider other factors, such as PageRank, hub and authority scores, and knowledge graphs, to make the results more meaningful. However, existing search engines fail to capture the meaning of a query when it becomes long and complex. BERT, introduced by Google in 2018, provides embeddings for words as well as sentences. In this paper, I develop a semantics-oriented search engine using neural networks and BERT embeddings that can process a query and rank documents from most meaningful to least meaningful. The results show an improvement over an existing search engine for complex queries on a given set of documents.
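The core ranking idea described above can be sketched as follows: embed the query and each document as vectors, then order documents by cosine similarity to the query. This is a minimal, self-contained illustration; in the paper's setting the vectors would come from BERT sentence embeddings, whereas here small hand-made vectors stand in for them, and the function names are hypothetical.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_documents(query_vec, doc_vecs):
    """Return document indices ordered from most to least similar to the query."""
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Toy stand-ins for BERT embeddings (real ones are ~768-dimensional).
query = np.array([1.0, 0.0, 1.0])
docs = [
    np.array([1.0, 0.1, 0.9]),  # nearly parallel to the query
    np.array([0.0, 1.0, 0.0]),  # orthogonal to the query
    np.array([0.5, 0.5, 0.5]),  # in between
]

print(rank_documents(query, docs))  # → [0, 2, 1]
```

In a real system, `query` and each entry of `docs` would be produced by running the text through a BERT encoder (e.g. pooling the token embeddings into one sentence vector); the ranking step itself stays the same.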
[1] Junaidah Mohamed Kassim et al., "Introduction to Semantic Search Engine," 2009 International Conference on Electrical Engineering and Informatics, 2009.
[2] Ming-Wei Chang et al., "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," NAACL, 2019.