Source Paper | Year | Line | Sentence
W05-1506 2005 18
So we might want to postpone some disambiguation by propagating k-best lists to subsequent phases, as in joint parsing and semantic role labeling (Gildea and Jurafsky, 2002; Sutton and McCallum, 2005), information extraction and coreference resolution (Wellner et al., 2004), and formal semantics of TAG (Joshi and Vijay-Shanker, 1999). Moreover, much recent work on discriminative training uses k-best lists; they are sometimes used to approximate the normalization constant or partition function (which would otherwise be intractable), or to train a model by optimizing some metric incompatible with the packed representation
W05-0620 2005 133
Probably, this is due to the versatility of the approaches and the availability of very good software toolkits. In particular, 8 teams used the Maximum Entropy (ME) statistical framework (Che et al., 2005; Haghighi et al., 2005; Park and Rim, 2005; Tjong Kim Sang et al., 2005; Sutton and McCallum, 2005; Tsai et al., 2005; Yi and Palmer, 2005; Venkatapathy et al., 2005)
W05-0620 2005 226
Finally, Haghighi et al. (2005) and Sutton and McCallum (2005) took a different approach by learning a re-ranking function as a global model on top of the base SRL models
W06-1673 2006 10
A common improvement on this architecture is to pass k-best lists between processing stages, for example (Sutton and McCallum, 2005; Wellner et al., 2004)
P07-1019 2007 177
These forest rescoring algorithms have potential applications to other computationally intensive tasks involving combinations of different models, for example, head-lexicalized parsing (Collins, 1997); joint parsing and semantic role labeling (Sutton and McCallum, 2005); or tagging and parsing with non-local features
D08-1070 2008 18
Recent negative results on the integration of syntactic parsing with SRL (Sutton and McCallum, 2005) provide additional evidence for the difficulty of this general approach
P10-1074 2010 13
While negative results are rarely published, this was not the first failed attempt at joint parsing and semantic role labeling (Sutton and McCallum, 2005)
P10-1113 2010 38
Sutton and McCallum (2005) adopted a probabilistic SRL system to re-rank the N-best results of a probabilistic syntactic parser
P10-1113 2010 123
The pipeline parsing approach employed in this paper is largely motivated by the general framework of re-ranking, as proposed in Sutton and McCallum (2005)
P10-1113 2010 243
It is worth noting that our experimental results in applying the re-ranking framework in Chinese pipeline parsing on N-best parse trees are very encouraging, considering the pessimistic results of Sutton and McCallum (2005), in which the re-ranking framework failed to improve the performance on English SRL
P10-1113 2010 244
It may be because, unlike Sutton and McCallum (2005), P(F, t|x) defined in this paper only considers those constituents which are identified as arguments
C10-1066 2010 231
(Sutton and McCallum, 2005; Wellner et al., 2004) maintain a beam of n-best interpretations in the pipeline architecture
P11-1048 2011 166
Such diverse topics as machine translation (Dyer et al., 2008; Dyer and Resnik, 2010; Mi et al., 2008), part-of-speech tagging (Jiang et al., 2008), named entity recognition (Finkel and Manning, 2009), semantic role labelling (Sutton and McCallum, 2005; Finkel et al., 2006), and others have also been improved by combined models. Our empirical comparison of BP and DD also complements the theoretically-oriented comparison of marginal- and margin-based variational approximations for parsing described by Martins et al. (2010)