贪心科技 (Greedy Tech) | Let everyone enjoy personalized education services

All about Named Entities (using Deep Learning)
• Recognition
• Resolution
• Linking
• Relation Extraction

Named Entity Recognition
More reading: Li et al. (2020), "A Survey on Deep Learning for Named Entity Recognition"

Named Entity Recognition: recap
Named entity recognition (NER), sometimes referred to as entity chunking, extraction, or identification, is the task of identifying and categorizing key information (entities) in text. It always involves two steps:
1. Detect a named entity;
2. Categorize the entity.
[Figure: example inputs and outputs]

Named Entity Recognition: open-source data and tools

Why Deep Learning for NER?
• NER benefits from non-linear transformations;
• Deep learning saves significant effort on designing NER features;
• Deep-learning-based models are trained in an end-to-end paradigm.

Typical Pipeline
BIO Tag Set:
• Beginning
• Inside
• Outside

Distributed Representations for Input (1/2)
• Word-level representation
• Character-level representation

Distributed Representations for Input (2/2)
• Hybrid representation

Context Encoder (1/5)
CNN Encoder:
• Encodes more global features;
• ID-CNN (iterated dilated CNN) can speed up the model.

Context Encoder (2/5)
RNN Encoder:
• Commonly used for sequence tagging;
• More straightforward for modeling sequential data.

Context Encoder (3/5)
Recursive NN: classify every node in a constituency structure for NER.

Context Encoder: Combining with LM (4/5)
• Additional LM objectives
• Bidirectional language models

Context Encoder: Combining with LM (5/5)

Decoder

Tricks
From Jay Lou (娄杰), WeChat public account: 夕小瑶的卖萌小屋
• The way to improve NER performance is usually not to simply stack a BERT+CRF: doing so is not guaranteed to improve performance, and inference speed suffers badly. Even when fine-tuning BERT+CRF directly, the learning rates of the BERT and CRF layers...
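The BIO tag set from the typical pipeline above can be made concrete with a small decoder that recovers entity spans from a BIO-tagged token sequence. This is a minimal pure-Python sketch; the function name `bio_to_spans` and its lenient handling of stray I- tags are my own choices, not from the slides.

```python
def bio_to_spans(tokens, tags):
    """Recover (entity_type, start, end, text) spans from BIO tags."""
    spans, start, ent_type = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the final span
        # An entity ends at an O tag, a new B- tag, or an I- tag of a different type.
        if start is not None and (tag == "O" or tag.startswith("B-") or tag[2:] != ent_type):
            spans.append((ent_type, start, i, " ".join(tokens[start:i])))
            start, ent_type = None, None
        if tag.startswith("B-"):
            start, ent_type = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            # Lenient decoding: treat a stray I- tag as the start of a new entity.
            start, ent_type = i, tag[2:]
    return spans

print(bio_to_spans(
    ["John", "Smith", "lives", "in", "New", "York"],
    ["B-PER", "I-PER", "O", "O", "B-LOC", "I-LOC"],
))
# [('PER', 0, 2, 'John Smith'), ('LOC', 4, 6, 'New York')]
```

Span extraction like this is the step that turns per-token tagger output into the detected-and-categorized entities described in the recap.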
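The decoder stage mentioned above (e.g. the CRF layer in BERT+CRF) is, at inference time, an argmax over tag paths, computed with the Viterbi algorithm. Below is a minimal pure-Python sketch assuming per-token emission scores and tag-to-tag transition scores are already given; `viterbi_decode` and its dict-based interface are illustrative, not an API from the slides.

```python
def viterbi_decode(emissions, transitions, tags):
    """Return the highest-scoring tag path for a linear-chain model.

    emissions:   list (one dict per token) mapping tag -> emission score
    transitions: dict mapping (prev_tag, tag) -> transition score
    tags:        list of all tag names
    """
    # Best path score ending in each tag at the first token.
    best = {t: emissions[0][t] for t in tags}
    back = []  # backpointers: one dict tag -> best previous tag per step
    for em in emissions[1:]:
        new_best, ptr = {}, {}
        for t in tags:
            prev = max(tags, key=lambda p: best[p] + transitions[(p, t)])
            new_best[t] = best[prev] + transitions[(prev, t)] + em[t]
            ptr[t] = prev
        back.append(ptr)
        best = new_best
    # Backtrack from the best final tag.
    path = [max(tags, key=lambda t: best[t])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

Because transition scores can make illegal moves (such as O followed by I-PER) very costly, a CRF decoder produces globally consistent tag sequences rather than independent per-token argmaxes.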