Dependency Structure

While most current syntax models are based on phrase structure grammar, another appealing view on grammar is dependency structure, which links each word with a parent word in a dependency relationship.
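The linkage described above can be sketched concretely: a dependency parse is often stored as one head index per word, where each word points at its parent and the main verb points at an artificial root. The sentence and indices below are an illustrative toy example, not taken from any of the cited papers.

```python
# Toy dependency parse: each word stores the 1-based index of its parent
# word ("head"); 0 marks the artificial root. Example sentence is made up.
sentence = ["She", "saw", "the", "dog"]
heads = [2, 0, 4, 2]  # "She"->"saw", "saw"->ROOT, "the"->"dog", "dog"->"saw"

def children(parse_heads, i):
    """Return the 1-based indices of the words whose parent is word i."""
    return [j + 1 for j, h in enumerate(parse_heads) if h == i]

print(children(heads, 2))  # dependents of "saw" -> [1, 4]
```

Because every word has exactly one parent, the structure forms a tree over the words, which is what the models below exploit.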

Dependency Structure is the main subject of 39 publications. 18 are discussed here.

Publications

One example of a statistical machine translation model based on dependency structures is the treelet approach (Menezes and Richardson, 2001; Menezes and Quirk, 2005; Quirk et al., 2005), which uses the dependency structure on the source side. Other researchers found this a promising direction (Lin, 2004). Such models have been shown to be competitive with phrase-based models (Menezes and Quirk, 2005; Menezes et al., 2006).
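The core unit of the treelet approach is a connected subgraph of the source dependency tree. As a hedged sketch of that idea, the code below enumerates only the simplest treelets, single words and head-child pairs, from a toy parse; the sentence, data structures, and function name are illustrative assumptions, not the papers' actual extraction procedure.

```python
# Simplified treelet extraction: a treelet is a connected fragment of the
# dependency tree. Here we list only size-1 (single word) and size-2
# (head, child) treelets from a toy parse.
words = ["She", "saw", "the", "dog"]
heads = [2, 0, 4, 2]  # 1-based head indices, 0 = root

def small_treelets(words, heads):
    treelets = [(w,) for w in words]       # every single word is a treelet
    for i, h in enumerate(heads):
        if h > 0:                          # each dependency arc yields a
            treelets.append((words[h - 1], words[i]))  # (head, child) treelet
    return treelets

print(small_treelets(words, heads))
```

A full treelet model would enumerate larger connected fragments as well and pair source treelets with target-side translations learned from aligned data.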
An extension of this model also allows fragments with gaps that are filled by variables, as in the hierarchical phrase-based model (Xiong et al., 2007). Dependency structure may also be used in a string-to-tree model: Shen et al. (2008) use rules that map strings into dependency tree fragments of neighboring words, subject to additional restrictions, which allows the use of a dependency-based language model.
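A dependency-based language model differs from a standard n-gram model in what each word is conditioned on: its head in the tree rather than its linear predecessors. The sketch below illustrates this scoring idea with made-up toy probabilities; the table and function are illustrative assumptions, not the model of Shen et al. (2008).

```python
import math

# Toy dependency-based language model: score each word given its head word
# (ROOT for the tree root) instead of the preceding words in the string.
# All probabilities are invented for illustration.
p_given_head = {
    ("She", "saw"): 0.3,
    ("saw", "ROOT"): 0.1,
    ("the", "dog"): 0.5,
    ("dog", "saw"): 0.2,
}

def dep_lm_logprob(words, heads):
    total = 0.0
    for i, w in enumerate(words):
        head = "ROOT" if heads[i] == 0 else words[heads[i] - 1]
        # back off to a small floor probability for unseen (word, head) pairs
        total += math.log(p_given_head.get((w, head), 1e-6))
    return total

score = dep_lm_logprob(["She", "saw", "the", "dog"], [2, 0, 4, 2])
```

Conditioning on heads lets the model capture relations between words that are far apart in the string but adjacent in the tree.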
Translating dependency tree fragments is also the idea behind synchronous dependency insertion grammars. Ding et al. (2003); Ding and Palmer (2004) develop methods for aligning dependency trees, and then develop decoding algorithms (Ding and Palmer, 2005). More recent work integrates the use of an n-gram language model during decoding (Ding and Palmer, 2006).
The generation of an output string from a dependency structure requires the insertion of function words and the definition of a word order. Hall and Nemec (2007) present a generative model with a search algorithm that proceeds through several stages. Chang and Toutanova (2007) apply a discriminatively trained model to the word ordering problem. Mapping into dependency structure and ordering it jointly gives better results than separating the two steps (Menezes and Quirk, 2007).
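The ordering step above can be sketched as a linearization problem: once each node records which of its dependents go to its left and which to its right, a recursive traversal yields the surface string. The tree below is a toy example and the left/right annotation is an assumed simplification of the ordering models cited, which learn such decisions rather than store them.

```python
# Linearizing a dependency tree: each node lists its left and right
# dependents in order; a recursive traversal produces the word order.
# The tree is a made-up example.
tree = {
    "saw": {"left": ["She"], "right": ["dog"]},
    "dog": {"left": ["the"], "right": []},
    "She": {"left": [], "right": []},
    "the": {"left": [], "right": []},
}

def linearize(tree, node):
    spec = tree[node]
    out = []
    for child in spec["left"]:
        out += linearize(tree, child)   # left dependents precede the head
    out.append(node)
    for child in spec["right"]:
        out += linearize(tree, child)   # right dependents follow the head
    return out

print(linearize(tree, "saw"))  # -> ['She', 'saw', 'the', 'dog']
```

The hard part, which the cited models address, is predicting the left/right placement and relative order of dependents for a new tree rather than reading them off.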
Tectogrammatical models are likewise based on dependency trees, but additionally include morphological analysis and generation (Čmejrek et al., 2003; Eisner, 2003). By mapping arbitrary connected fragments of the dependency tree, the approach may be extended to apply the lessons of phrase-based models, translating larger fragments at a time (Bojar and Hajič, 2008).

Benchmarks

Discussion

Related Topics

New Publications

  • Chen et al. (2014)
  • Mareček (2016)
  • Seemann and Maletti (2015)
  • Tamura et al. (2013)
  • Gimpel and Smith (2014)
  • Na et al. (2010)
  • Shen et al. (2010)
  • Galley and Manning (2009)
  • Bach et al. (2009)
  • Xiong et al. (2009)
  • Tu et al. (2010)
  • Žabokrtský and Popel (2009)
  • Chen et al. (2010)
  • Mi and Liu (2010)
  • Žabokrtský et al. (2010)
  • Su et al. (2010)
  • Singh and Bandyopadhyay (2010)
  • Venkatapathy et al. (2010)
  • Attardi et al. (2011)
  • Chen et al. (2007)
  • Fox (2005)
