Standard phrase-based models condition the generation of an output phrase solely on its aligned input phrase. However, a wider source context may provide additional guidance.
Context features are the main subject of 26 publications; 13 are discussed here.
Phrase translation may be informed by additional context features, for instance by applying methods from word sense disambiguation (Carpuat et al., 2006). Such features may be integrated using a maximum entropy model (Bangalore et al., 2006), support vector machines (Giménez and Màrquez, 2007), or by directly integrating more complex word sense disambiguation components, such as an ensemble of different machine learning methods (Carpuat and Wu, 2007; Carpuat and Wu, 2007b). Ittycheriah and Roukos (2007) propose a maximum entropy model for phrase translation.
Syntactic context dependencies may be added to phrase translations in the phrase-based approach, for instance verb-argument relationships (Hwang and Sasaki, 2005) or the syntactic structure underlying each phrase translation (Sun et al., 2007). Gimpel and Smith (2008) add features over the context of a source phrase in a probabilistic back-off model.
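The back-off idea can be sketched as follows: use a context-conditioned phrase-translation estimate when the data supports it, and fall back to a discounted context-free estimate otherwise. This is only a minimal illustration of the general principle; the class, count tables, and discount value are assumptions, not Gimpel and Smith's actual model.

```python
from collections import Counter

class BackoffPhraseModel:
    """p(target | source, context word), backing off to a discounted
    p(target | source) when the context was never observed."""

    def __init__(self, alpha=0.4):
        self.alpha = alpha          # back-off discount (assumed value)
        self.ctx = Counter()        # (source, ctx, target) counts
        self.ctx_tot = Counter()    # (source, ctx) counts
        self.plain = Counter()      # (source, target) counts
        self.plain_tot = Counter()  # source counts

    def observe(self, source, ctx, target):
        self.ctx[(source, ctx, target)] += 1
        self.ctx_tot[(source, ctx)] += 1
        self.plain[(source, target)] += 1
        self.plain_tot[source] += 1

    def prob(self, target, source, ctx):
        # context-conditioned relative frequency, if seen
        if self.ctx[(source, ctx, target)] > 0:
            return self.ctx[(source, ctx, target)] / self.ctx_tot[(source, ctx)]
        # back off to the context-free estimate
        base = self.plain[(source, target)] / max(self.plain_tot[source], 1)
        return self.alpha * base
```

A real model would also smooth the context-conditioned estimate rather than switch abruptly between the two levels.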
Chan et al. (2007) use a support vector machine to include context features for rules in a hierarchical translation model. Maximum entropy models may be used for the same purpose (Xiong et al., 2008; He et al., 2008). Liu et al. (2010) use collocation information as a feature for each phrase pair, on the premise that collocated words make better phrases.
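Collocation strength is commonly measured with pointwise mutual information (PMI), which compares the observed co-occurrence rate of two words with the rate expected if they were independent. The sketch below is one simple instantiation over sentence-level co-occurrence, not the exact statistic used by Liu et al. (2010).

```python
import math
from collections import Counter

def pmi(corpus_sentences):
    """Build a PMI scorer over word pairs co-occurring in a sentence;
    high PMI marks collocations, -inf marks unseen pairs."""
    word_c = Counter()
    pair_c = Counter()
    n = 0
    for sent in corpus_sentences:
        n += 1
        words = set(sent)                 # count each word once per sentence
        for w in words:
            word_c[w] += 1
        for a in words:
            for b in words:
                if a < b:                 # each unordered pair once
                    pair_c[(a, b)] += 1

    def score(a, b):
        key = (a, b) if a < b else (b, a)
        if pair_c[key] == 0:
            return float("-inf")
        # log of observed joint probability over independence baseline
        return math.log((pair_c[key] / n) / ((word_c[a] / n) * (word_c[b] / n)))

    return score
```

A phrase pair whose words score high under such a statistic would receive a stronger collocation feature value during decoding.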
The remaining publications are listed here without discussion:
- Servan and Dymetman (2015)
- Stewart et al. (2014)
- Ha et al. (2014)
- Haque et al. (2011)
- Haque et al. (2010)
- Liu et al. (2008)
- Brown (2008)
- Shen et al. (2009)
- Huang and Xiang (2010)
- Andrade et al. (2010)
- Onishi et al. (2011)
- Subotin (2011)
- Gong et al. (2011)