Knowledge Base

Explore 5 core concepts in AI/ML research.

BLEU Score
Metric

A metric for evaluating the quality of machine-translated text.

Definition

BLEU (Bilingual Evaluation Understudy) compares n-gram overlaps between a generated translation and one or more reference translations, combining modified n-gram precision with a brevity penalty that penalizes overly short outputs. Scores range from 0 to 1 (often reported on a 0 to 100 scale), with higher scores indicating closer agreement with the references.
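The definition above can be sketched in code. This is a minimal, illustrative implementation of sentence-level BLEU against a single reference, not the full corpus-level metric: it computes clipped (modified) n-gram precisions for n = 1..4, takes their geometric mean, and applies the brevity penalty. The function names are hypothetical, and real toolkits add smoothing and multi-reference support.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU for one candidate against one reference (no smoothing)."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Modified precision: clip each candidate n-gram count by its reference count.
        overlap = sum(min(count, ref_counts[g]) for g, count in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    # Without smoothing, any zero precision drives the geometric mean to zero.
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: < 1 when the candidate is shorter than the reference.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0, while a degenerate candidate that just repeats a common word scores 0 once higher-order n-grams stop matching, which is why the clipping step matters.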

Related Concepts

ROUGE, METEOR, BERTScore

Key Papers

BLEU: a Method for Automatic Evaluation of Machine Translation (Papineni et al., 2002)

Examples: machine translation evaluation, summarization scoring