Chinese Named Entity Recognition with BERT

As a basic task of NLP, named entity recognition has long been a focus of researchers. At the same time, word vector representations, a necessary component of many neural network models for named entity recognition, have become increasingly important. Recently, the emergence of a new type of word representation, BERT, has greatly advanced many NLP tasks. In this paper, we use BERT to train Chinese character embeddings, concatenate them with Chinese radical-level representations, and feed the result into a BGRU-CRF model. Through a series of experiments, we achieve good results on Chinese datasets.
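The core input construction described above — per-character BERT vectors concatenated with radical-level embeddings before the BGRU-CRF layer — can be sketched as follows. This is a minimal illustration with random stand-in vectors, not the paper's implementation; the dimensions (768 for BERT-base, 50 for radicals) and the sentence length are assumptions.

```python
import numpy as np

# Hypothetical dimensions: BERT-base hidden size and a small radical embedding.
BERT_DIM = 768      # per-character vector (assumption: bert-base-chinese)
RADICAL_DIM = 50    # hypothetical radical-level embedding size

rng = np.random.default_rng(0)
sentence_len = 6    # e.g. a 6-character Chinese sentence

# Stand-ins for the two representations described in the abstract.
bert_chars = rng.normal(size=(sentence_len, BERT_DIM))
radicals = rng.normal(size=(sentence_len, RADICAL_DIM))

# Concatenate along the feature axis; this combined matrix is what
# would be fed to the BGRU-CRF tagging layers.
combined = np.concatenate([bert_chars, radicals], axis=-1)
print(combined.shape)  # (6, 818)
```

The concatenation leaves each character's position intact, so the downstream BGRU still sees one vector per character, now carrying both contextual and sub-character (radical) information.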