
Zeitschrift für Sprachwissenschaft



Annotation for and Robust Parsing of Discourse Structure on Unrestricted Texts

Jason Baldridge / Nicholas Asher / Julie Hunter
Citation Information: Zeitschrift für Sprachwissenschaft. Volume 26, Issue 2, Pages 213–239, ISSN (Online) 1613-3706, ISSN (Print) 0721-9067, DOI: 10.1515/ZFS.2007.018, December 2007

Publication History

Received:
2007-02-03
Revised:
2007-04-21
Published Online:
2007-12-04

Abstract

Predicting discourse structure for naturally occurring texts and dialogs is challenging and computationally intensive. Hand-built systems have run into problems both in specifying the required knowledge and in performing the necessary computations efficiently. Data-driven approaches have recently been shown to handle challenging aspects of discourse successfully without relying on fine-grained semantic detail, but they require annotated material for training. We describe our effort to annotate Segmented Discourse Representation Structures on Wall Street Journal texts, arguing that graph-based representations are necessary to adequately capture the dependencies found in the data. We then explore two data-driven parsing strategies for recovering discourse structures. We show that the generative PCFG model of Baldridge & Lascarides (2005b) is inherently limited by its inability to incorporate new features when learning from small data sets, and we show how recent developments in dependency parsing and discriminative learning can be used to overcome this limitation and thereby improve parsing accuracy. We report results from exploratory experiments on Verbmobil dialogs and our annotated newswire texts; these suggest that the methods do improve performance and that richer feature sets have the potential to yield significant further gains.
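The abstract's claim that graph-based representations are needed can be illustrated with a minimal sketch. In SDRT-style analyses, an elementary discourse unit may serve as the argument of more than one rhetorical relation, so the resulting structure is a labeled directed graph rather than a tree. The unit labels (`pi1`, `pi2`, `pi3`) and relation names below are illustrative, not the authors' annotation scheme:

```python
# Hypothetical sketch: discourse structure as a labeled directed graph.
# A tree allows at most one incoming relation per unit; SDRT-style
# structures do not have this restriction.

from collections import defaultdict

class DiscourseGraph:
    def __init__(self):
        self.edges = []                    # (head, dependent, relation)
        self.incoming = defaultdict(list)  # dependent -> [(head, relation)]

    def attach(self, head, dependent, relation):
        self.edges.append((head, dependent, relation))
        self.incoming[dependent].append((head, relation))

    def is_tree(self):
        # Tree-shaped iff no unit has more than one incoming relation.
        return all(len(heads) <= 1 for heads in self.incoming.values())

g = DiscourseGraph()
# pi1: "Max fell."  pi2: "John pushed him."  pi3: "He hit the floor."
g.attach("pi1", "pi2", "Explanation")
g.attach("pi1", "pi3", "Narration")
g.attach("pi2", "pi3", "Result")  # pi3 now has two incoming relations

print(g.is_tree())  # False: this dependency structure is not a tree
```

A tree-constrained annotation scheme would have to drop one of the two relations targeting `pi3`, losing a dependency that the graph representation records directly.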

Keywords: discourse structure; SDRT; probabilistic parsing; Verbmobil; rhetorical relations; dependency grammar

Citing Articles

Crossref-listed publications in which this article is cited:

[1]
Rashmi Prasad, Bonnie Webber, and Aravind Joshi
Computational Linguistics, 2014, Volume 40, Number 4, Page 921
[2]
Manfred Stede
Synthesis Lectures on Human Language Technologies, 2011, Volume 4, Number 3, Page 1
[3]
B. Webber, M. Egg, and V. Kordoni
Natural Language Engineering, 2012, Volume 18, Number 4, Page 437
