Text-world annotation and visualization for crime narrative reconstruction

Yufang Ho, Jane Lugea, Dan McIntyre, Zhijie Xu, Jing Wang

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

To assist legal professionals with more effective information processing and evaluation, we aim to develop software to identify and visualize the key information dispersed in the unstructured language data of a criminal case. A preliminary model of the software, Worldbuilder, is described in Wang et al. (2016a, b). The present article focuses on explaining the theory and vision behind the computational development of the software, which has involved establishing a means to annotate discourse for visualization purposes. The design of the annotation scheme is based on a cognitive model of discourse processing, Text World Theory (TWT), which describes and tracks how language users create a dynamic representation of events (i.e. text-worlds) in their minds as they communicate. As this is the first time TWT has informed the computational analysis of language, the model is augmented with Contextual Frame Theory, among other linguistic apparatus, to account for the complexities in the data and its translation from text to visualization. Using a statement from the Meredith Kercher murder trial as a case study, we illustrate the efficacy of the augmented TWT framework in the careful and purposeful preparation of linguistic data for computational visualization. Ultimately, this research bridges Cognitive and Computational Linguistics, improves the TWT model’s analytical accuracy, and yields a potentially useful tool for forensic work.

Original language: English
Pages (from-to): 310-334
Number of pages: 25
Journal: Digital Scholarship in the Humanities
Volume: 34
Issue number: 2
Early online date: 11 Oct 2018
Publication status: Published - 1 Jun 2019
