Temporal Relationship Extraction from Natural Language Texts Using a Deep Bidirectional Language Model








Abstract

In general, documents contain temporal information, and recognizing that information is crucial to understanding the overall content of documents written in natural language. Extracting temporal information involves three tasks: capturing the time representation itself, identifying the event associated with that time representation, and extracting the temporal relationship between times or events. Because of the inherent linguistic characteristics of different languages, it is hard to capture all temporal information in a given sentence without considering the context of temporal relationships. In this paper, we design an artificial neural network model for temporal relation extraction, one of the tasks of extracting temporal information from natural language sentences. Our proposed model is based on a deep bidirectional architecture that learns temporal relationships from given sentences. The model separates an input sentence into individual word tokens, converts them into embedding vectors, and then learns whether each token is a subject or an object of temporal relationship information in the given sentence. Before moving to models and datasets that target multiple languages, we first conduct our research on English and Korean.
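The sketch below illustrates the token-level pipeline described above: a sentence is split into tokens, each token is embedded by a deep bidirectional encoder, and a classification head predicts whether the token acts as a subject or an object of a temporal relationship. The concrete choices here are assumptions for illustration only: the encoder is instantiated as multilingual BERT ("bert-base-multilingual-cased", which covers both English and Korean), and the hypothetical label set {O, SUBJ, OBJ} stands in for the temporal-role tags; the paper's actual encoder, labels, and training data may differ.

```python
# Minimal sketch of a token-level temporal-role classifier.
# Assumptions (not from the original text): multilingual BERT as the
# bidirectional encoder and a hypothetical 3-label tag set.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

LABELS = ["O", "SUBJ", "OBJ"]                 # hypothetical temporal-role tags
MODEL_NAME = "bert-base-multilingual-cased"   # covers English and Korean

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

def tag_temporal_roles(sentence: str):
    """Split a sentence into subword tokens, embed them with the
    bidirectional encoder, and predict a temporal-role label per token."""
    encoded = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**encoded).logits              # (1, seq_len, num_labels)
    predictions = logits.argmax(dim=-1).squeeze(0)    # best label per token
    tokens = tokenizer.convert_ids_to_tokens(encoded["input_ids"].squeeze(0))
    return [(tok, LABELS[idx]) for tok, idx in zip(tokens, predictions.tolist())]

if __name__ == "__main__":
    # The classification head is untrained here, so labels are arbitrary
    # until the model is fine-tuned on temporal-relation annotations.
    for token, label in tag_temporal_roles("The meeting was postponed until Friday."):
        print(f"{token}\t{label}")
```

In this framing, fine-tuning on sentences annotated with subject/object temporal roles would adapt both the encoder and the classification head; the same architecture can then be applied to Korean input without structural changes, since the tokenizer and encoder are shared across languages.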


Modules


Algorithms


Software And Hardware

• Hardware: Processor: Intel i3 or i5; RAM: 4 GB; Hard disk: 16 GB
• Software: Operating system: Windows 2000/XP/7/8/10; Tools: Anaconda, Jupyter, Spyder, Flask; Frontend: Python; Backend: MySQL