
Corpus Linguistics and Linguistic Theory


Coding coherence relations: Reliability and validity

Wilbert Spooren1 / Liesbeth Degand2


Citation Information: Corpus Linguistics and Linguistic Theory. Volume 6, Issue 2, Pages 241–266, ISSN (Online) 1613-7035, ISSN (Print) 1613-7027, DOI: 10.1515/cllt.2010.009, November 2010

Publication History

Published Online:
2010-11-04

Abstract

This paper tackles the issue of the validity and reliability of coding discourse phenomena in corpus-based analyses. On the basis of a sample analysis of coherence relation annotation that resulted in a poor kappa score, we describe the problem and put it into the context of recent literature from the field of computational linguistics on required intercoder agreement. We describe our view on the consequences of the current state of the art and suggest three routes to follow in the coding of coherence relations: double coding (including discussion of disagreements and explicitation of the coding decisions), single coding (including the risk of coder bias and a lack of generalizability), and enriched kappa statistics (including observed and specific agreement, and a discussion of the possible reasons for disagreement). We end with a plea for complementary automatic (text mining) techniques to test the robustness of our data.

Keywords: coherence relations; discourse; reliability; interrater agreement; corpus analysis
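The abstract refers to enriched kappa statistics, i.e. reporting not only Cohen's kappa but also observed agreement and category-specific agreement. The sketch below is a minimal, hypothetical illustration of how these three figures relate for two coders' parallel annotations; the category labels (causal, additive, etc.) are illustrative and do not reflect the annotation scheme used in the paper.

```python
from collections import Counter

def agreement_stats(coder_a, coder_b):
    """Observed agreement, Cohen's kappa, and per-category specific
    agreement for two coders' parallel annotations of the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    labels = sorted(set(coder_a) | set(coder_b))

    # Observed agreement: proportion of items both coders labelled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Chance agreement from each coder's marginal label distribution.
    marg_a, marg_b = Counter(coder_a), Counter(coder_b)
    p_e = sum((marg_a[l] / n) * (marg_b[l] / n) for l in labels)

    # Cohen's kappa: agreement corrected for chance.
    kappa = (p_o - p_e) / (1 - p_e) if p_e < 1 else 1.0

    # Specific agreement per category: 2 * joint uses / (uses by A + uses by B).
    specific = {}
    for l in labels:
        both = sum(a == b == l for a, b in zip(coder_a, coder_b))
        total = marg_a[l] + marg_b[l]
        specific[l] = 2 * both / total if total else 0.0

    return {"observed": p_o, "expected": p_e, "kappa": kappa, "specific": specific}

# Hypothetical example: two coders labelling ten coherence relations.
a = ["causal", "additive", "causal", "contrastive", "causal",
     "additive", "causal", "temporal", "additive", "causal"]
b = ["causal", "additive", "additive", "contrastive", "causal",
     "additive", "temporal", "temporal", "additive", "causal"]
print(agreement_stats(a, b))
```

Reporting the per-category figures alongside kappa makes it visible when overall agreement is carried by a few frequent relations while rarer categories (here, e.g. "temporal") are coded much less consistently.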
