Entropy (Dec 2016)
Humans Outperform Machines at the Bilingual Shannon Game
Abstract
We provide an upper bound on the amount of information a human translator adds to an original text, i.e., how many bits of information we need to store a translation, given the original. We do this by creating a Bilingual Shannon Game that elicits character guesses from human subjects, and then developing models to estimate the entropy of those guess sequences.
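The core idea follows Shannon's monolingual guessing game: if a subject's guesses are deterministic given the preceding context, the sequence of guess ranks (how many attempts each character took) carries the same information as the text itself, so the entropy of the rank distribution upper-bounds the per-character entropy. The sketch below is a minimal illustration of that estimator, not the paper's actual model; the example rank sequence is invented.

```python
import math
from collections import Counter

def rank_entropy_upper_bound(guess_ranks):
    """Upper-bound bits/character from Shannon-game guess ranks.

    guess_ranks: list of ints, the attempt on which a subject got
    each character right (1 = first guess). Treating ranks as i.i.d.
    draws, the entropy of their empirical distribution upper-bounds
    the entropy rate of the rank sequence, and hence of the text.
    """
    counts = Counter(guess_ranks)
    n = len(guess_ranks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical guess ranks for a 10-character stretch:
# mostly first-guess hits, which drives the bound down.
ranks = [1, 1, 2, 1, 3, 1, 1, 2, 1, 1]
print(rank_entropy_upper_bound(ranks))
```

A uniform rank distribution would give the loosest bound; a subject who always guesses correctly on the first try yields a bound of zero bits per character.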
Keywords