There were two source languages (Arabic and Chinese) and one
target language (English) evaluated in the MT-05 evaluation.
Both the Arabic and Chinese MT-05 evaluation test sets include
100 newswire documents.
Translations were measured automatically using the BLEU metric
as originally defined by IBM and described in Papineni, Roukos,
Ward, and Zhu (2001), "BLEU: a Method for Automatic Evaluation
of Machine Translation" (IBM Research Report RC22176).
The BLEU metric measures performance of a task on a scale of 0
to 1 with 1 being the best.
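As a rough illustration of how BLEU arrives at a score in that range, the following sketch computes sentence-level BLEU in the spirit of the Papineni et al. definition: the geometric mean of modified (clipped) n-gram precisions for n = 1..4, multiplied by a brevity penalty. This is a simplified, hypothetical implementation for illustration only; the official evaluation used NIST's corpus-level scoring tool, which differs in details such as aggregation across segments.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, references, max_n=4):
    """Sentence-level BLEU sketch: geometric mean of modified n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    cand = candidate.split()
    refs = [r.split() for r in references]
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(cand, n)
        if not cand_counts:
            return 0.0
        # Clip each candidate n-gram count by its maximum count
        # in any single reference ("modified" precision).
        max_ref = Counter()
        for ref in refs:
            for gram, count in ngrams(ref, n).items():
                max_ref[gram] = max(max_ref[gram], count)
        clipped = sum(min(c, max_ref[g]) for g, c in cand_counts.items())
        if clipped == 0:
            return 0.0  # zero precision at any order zeroes the geometric mean
        log_precisions.append(math.log(clipped / sum(cand_counts.values())))
    # Brevity penalty: penalize candidates shorter than the closest reference.
    ref_len = min((abs(len(r) - len(cand)), len(r)) for r in refs)[1]
    bp = 1.0 if len(cand) > ref_len else math.exp(1 - ref_len / len(cand))
    return bp * math.exp(sum(log_precisions) / max_n)
```

A perfect match against a reference scores 1.0; any missing or spurious n-grams, or an overly short candidate, pull the score toward 0.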
Arabic-to-English: BLEU score = 0.5137
Chinese-to-English: BLEU score = 0.3531
The National Institute of Standards and Technology (NIST) is an
agency of the U.S. Commerce Department's Technology Administration.