Source Paper | Year | Line | Sentence
P07-1027 2007 190
type of auxiliary problems might hold some hope for further improvement in argument identification, if a larger number of auxiliary problems can be used. ASO has been demonstrated to be an effective semi-supervised learning algorithm (Ando and Zhang, 2005a; Ando and Zhang, 2005b; Ando, 2006)
P07-1027 2007 200
More recently, for the word sense disambiguation (WSD) task, (Ando, 2006) experimented with both supervised and semi-supervised auxiliary problems, although the auxiliary problems she used are different from ours
P07-1027 2007 24
ASO has been shown to be effective on the following natural language processing tasks: text categorization, named entity recognition, part-of-speech tagging, and word sense disambiguation (Ando and Zhang, 2005a; Ando and Zhang, 2005b; Ando, 2006)
D07-1108 2007 31
Similarly, (Ando, 2006) exploits data from related tasks, using all labeled examples irrespective of target words for learning each sense using the Alternating Structure Optimization (ASO) algorithm (Ando and Zhang, 2005a; Ando and Zhang, 2005b)
D07-1108 2007 161
Table 5: Results compared to previous best systems on Senseval-3 English lexical sample task: Bayes (Soft Tag) 73.6; SVM-topic 73.0; SVM baseline 72.4; NB baseline 69.8; ASO (Ando, 2006) 74.1; SVM-LSA (Strapparava et al., 2004) 73.3; Senseval-3 Best System (Grozea, 2004) 72.9
D07-1108 2007 148
Table 4: Results (best configuration) compared to previous best systems on Senseval-2 English lexical sample task: Bayes (Soft Tag) 68.9; SVM-Topic 66.0; SVM baseline 65.2; NB baseline 63.4; ASO (best configuration) (Ando, 2006) 68.1; Classifier Combination (Florian, 2002) 66.5; Polynomial KPCA (Wu et al., 2004) 65.8; SVM (Lee and Ng, 2002) 65.4; Senseval-2 Best System 64.2
C08-1003 2008 56
Our present paper differs from theirs in that we propose an additional method to use SVD (the OMT method, Section 4.2), and that we evaluate the contribution of unlabeled data and SVD in isolation, leaving combination for future work. Ando (2006) used Alternating Structure Optimization, which is closely related to Structural Learning (cited above)
C08-1133 2008 75
The feature set used herein is similar to several state-of-the-art WSD systems (Lee and Ng, 2002; Ando, 2006; Tratz et al., 2007; Cai et al., 2007; Agirre and Lopez de Lacalle, 2007; Specia et al., 2007), which is further integrated into a Naïve Bayes classifier (Lee and Ng, 2002; Mihalcea, 2007)
E09-3005 2009 120
In practice, there are more free parameters and model choices (Ando and Zhang, 2005; Ando, 2006; Blitzer et al, 2006; Blitzer, 2008) besides the ones discussed above
E09-3005 2009 128
Due to the positive results in Ando (2006), Blitzer et al. (2006) include this in their standard setting of SCL and report results using block SVDs only
E09-1006 2009 53
Ando (2006) used Alternating Structure Optimization
D09-1013 2009 45
With well-designed auxiliary problems, the method has been applied to text classification, text chunking, and word sense disambiguation (Ando, 2006)
P09-2065 2009 10
(Ando, 2006) (abbreviated as Ando[CoNLL'06]) has successfully applied the ASO (Alternating Structure Optimization) technique proposed by (Ando and Zhang, 2005), in its transfer learning configuration, to the problem of WSD by doing joint empirical risk minimization of a set of related problems (words in this case)
S12-1023 2012 224
The only exception, to our knowledge, is Ando (2006), who pools the labeled examples for all words from a dataset for learning, implicitly exploiting regularities in sense alternations