N-Gram Language Models

All competitive statistical machine translation systems use n-gram language models, which predict the probability of a word from a fixed-length window of preceding words.
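
As a concrete illustration of this idea (a minimal sketch, not taken from any of the publications below; the corpus, the function name, and the unsmoothed maximum-likelihood estimates are assumptions for the example), a trigram model can be estimated from relative counts as follows:

    from collections import defaultdict

    def train_trigram_lm(sentences):
        """Estimate unsmoothed maximum-likelihood trigram probabilities
        P(word | two preceding words) from tokenized sentences."""
        trigram_counts = defaultdict(int)
        history_counts = defaultdict(int)
        for sentence in sentences:
            tokens = ["<s>", "<s>"] + sentence + ["</s>"]
            for i in range(2, len(tokens)):
                history = (tokens[i - 2], tokens[i - 1])
                trigram_counts[(history, tokens[i])] += 1
                history_counts[history] += 1
        return {ngram: count / history_counts[ngram[0]]
                for ngram, count in trigram_counts.items()}

    # The model predicts each word from a fixed window of two preceding words.
    corpus = [["the", "house", "is", "small"], ["the", "house", "is", "big"]]
    lm = train_trigram_lm(corpus)
    print(lm[(("house", "is"), "small")])  # 0.5

Such unsmoothed estimates assign zero probability to unseen n-grams, which is why the discounting methods cited under Publications are needed in practice.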

N-gram language models are the main subject of 7 publications. Six of them are discussed here.

Topics in Language Models

N-Gram Language Models | Targeted Language Models | Morphological Language Models | Very Large Language Models

Publications

The most commonly used discounting methods for smoothing language models were proposed by Good (1953) (see also the description by Gale and Sampson (1995)), Witten and Bell (1991), and Kneser and Ney (1995). A good introduction to the topic of language modelling is given by Chen and Goodman (1998).
Instead of training language models, large corpora can also be exploited by checking whether potential translations occur in them as sentences (Soricut et al., 2002).
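
To make the role of discounting concrete, the sketch below (an illustration under assumed simplifications, not code from any of the cited papers) applies interpolated absolute discounting to a bigram model: a fixed discount is subtracted from every observed bigram count and the freed probability mass is redistributed over a unigram distribution. Kneser-Ney smoothing refines the same idea by building the lower-order distribution from the diversity of contexts a word appears in.

    from collections import Counter

    def absolute_discount_bigram(tokens, discount=0.75):
        """Bigram probabilities with interpolated absolute discounting:
        a fixed discount is taken from each observed bigram count and the
        freed mass is spread over a unigram back-off distribution."""
        unigrams = Counter(tokens)
        bigrams = Counter(zip(tokens, tokens[1:]))
        total = len(tokens)

        def prob(prev, word):
            # Freed mass for this history is proportional to the number of
            # distinct words observed after `prev`.
            followers = len({w for (ctx, w) in bigrams if ctx == prev})
            backoff_weight = discount * followers / unigrams[prev]
            discounted = max(bigrams[(prev, word)] - discount, 0) / unigrams[prev]
            return discounted + backoff_weight * unigrams[word] / total

        return prob

    tokens = "the house is small the house is big".split()
    p = absolute_discount_bigram(tokens)
    print(p("is", "small"))  # seen bigram: discounted count plus back-off mass
    print(p("is", "the"))    # unseen bigram: probability comes only from the back-off term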

Benchmarks

Discussion

Related Topics

New Publications

  • Freitag et al. (2013)
