Extracting sentences recommended for annotation to understand a writer's opinions in a document

When reading documents, people often encounter the writer's opinions, which are new information to them. Annotation is known to be a powerful method for identifying such information. Making annotations on documents, however, is hard work, because readers must read very carefully to decide which sentences are worth annotating. They spend much time choosing sentences and deciding what annotations to make on them. If readers were automatically presented with sentences recommended for annotation, they could annotate documents more easily and save time. Methods for recommending sentences to annotate are therefore required. This paper proposes a method for extracting sentences recommended for annotation so that readers can understand the writer's opinions in a document. The method extracts sentences containing the writer's opinions by evaluating the words they contain: if a sentence includes words used to express the writer's opinions, the method extracts that sentence and recommends that users annotate it. By reading a document and annotating the extracted sentences, users can understand the writer's opinions more deeply. We conducted an experiment and verified that the proposed method can extract the sentences that participants annotated in order to understand the writer's opinions deeply.
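The extraction step described above, selecting sentences that contain opinion-signaling words, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the opinion lexicon and the naive sentence splitter are assumptions made for the example.

```python
import re

# Hypothetical lexicon of words that often signal a writer's opinion
# (a stand-in for a real resource such as an opinion-word dictionary).
OPINION_WORDS = {
    "believe", "think", "argue", "should", "must",
    "important", "surprising", "unfortunately", "clearly",
}

def split_sentences(text):
    """Naive sentence splitter: break on whitespace after . ! or ?"""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def recommend_sentences(text, lexicon=OPINION_WORDS):
    """Return sentences containing at least one opinion-signaling word,
    as candidates to recommend for annotation."""
    recommended = []
    for sentence in split_sentences(text):
        words = {w.lower() for w in re.findall(r"[A-Za-z']+", sentence)}
        if words & lexicon:  # sentence shares a word with the lexicon
            recommended.append(sentence)
    return recommended
```

For example, given "The sky is blue. I believe this method is important.", only the second sentence would be recommended, since it contains "believe" and "important". A real system would use a curated opinion lexicon and a proper sentence tokenizer rather than these toy substitutes.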
