Tree-to-Tree Models

Models that use syntax on both the source and the target side can exploit the maximum of available syntactic annotation, but they are computationally challenging.

Tree-to-tree translation is the main subject of 12 publications, 6 of which are discussed here.

Publications

As with other models, rules for tree-to-tree models may be learned from word-aligned parallel corpora. To maximize the number of rules, alignment points from the intersection and the union of GIZA++ alignments may be treated differently (Imamura et al., 2005).
Probabilistic synchronous tree-insertion grammars (STIG) (Nesson et al., 2006) may be learned automatically without any provided syntactic annotation. Synchronous tree substitution grammars (STSG) map tree fragments in the source to tree fragments in the target (Zhang et al., 2007). Shieber (2007) argues for the use of synchronous tree adjoining grammars (S-TAG), as they follow the structure of printed bilingual dictionaries; this idea was also proposed by Shieber and Schabes (1990).
Instead of using the 1-best or n-best syntactic parses of the source sentence, a forest of parse trees may be used during translation (Mi et al., 2008).
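The core mechanism shared by these formalisms, pairing a source tree fragment with a target fragment and translating linked substitution sites recursively, can be sketched in a few lines. The toy grammar, the tuple encoding of trees, and the English-to-German rules below are illustrative assumptions, not taken from any of the cited systems:

```python
# Minimal sketch of synchronous tree substitution (STSG-style), under toy
# assumptions: trees are nested tuples (label, child, ...), leaves are
# strings, and variables "X1".."Xn" mark linked substitution sites that
# are translated recursively.

RULES = [
    # (source fragment, target fragment) -- hypothetical toy grammar:
    # reorder an English VP to verb-final order (as in a German subclause)
    (("S", "X1", ("VP", "X2", "X3")), ("S", "X1", ("VP", "X3", "X2"))),
    (("NP", ("DT", "the"), ("NN", "house")), ("NP", ("DT", "das"), ("NN", "Haus"))),
    (("V", "saw"), ("V", "sah")),
    (("PRP", "he"), ("PRP", "er")),
]

def match(pattern, tree, binding):
    """Match a source fragment against a tree, binding variables to subtrees."""
    if isinstance(pattern, str):
        if pattern.startswith("X"):   # variable: capture the whole subtree
            binding[pattern] = tree
            return True
        return pattern == tree        # label or terminal must match exactly
    if isinstance(tree, str) or len(pattern) != len(tree):
        return False
    return all(match(p, t, binding) for p, t in zip(pattern, tree))

def substitute(pattern, binding):
    """Build the target tree, filling variables from the binding."""
    if isinstance(pattern, str):
        return binding.get(pattern, pattern)
    return tuple(substitute(p, binding) for p in pattern)

def translate(tree):
    """Apply the first matching rule; translate bound subtrees recursively."""
    for src, tgt in RULES:
        binding = {}
        if match(src, tree, binding):
            binding = {v: translate(t) for v, t in binding.items()}
            return substitute(tgt, binding)
    if isinstance(tree, str):
        return tree
    return tuple(translate(c) for c in tree)  # no rule: recurse on children

def yield_leaves(tree):
    """Read off the terminal yield of a tree, left to right."""
    if isinstance(tree, str):
        return [tree]
    _label, *children = tree
    return [w for c in children for w in yield_leaves(c)]

source = ("S", ("NP", ("PRP", "he")),
               ("VP", ("V", "saw"), ("NP", ("DT", "the"), ("NN", "house"))))
print(" ".join(yield_leaves(translate(source))))  # er das Haus sah
```

A real system would of course score competing rule applications with learned probabilities and search over many derivations; the sketch only shows how paired fragments and linked substitution sites drive the tree-to-tree transfer.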

New Publications

  • Richardson et al. (2016)
  • Nakazawa et al. (2016)
  • Shen et al. (2016)
  • Zhai et al. (2011)
  • Liu et al. (2009)
  • Ambati et al. (2009)
