The open/close/pause sequence of mouth movements corresponding to an animated character's dialogue. One of the reasons that dubbed dialogue for foreign animation does not always correspond to the literal translation of the original dialogue is that it must be adapted to match the mouth flaps. If it isn't, the illusion of animated speech is dispelled, resulting in a "bad Hercules movie" effect.

Through analysis by speech scientists and animators, the Disney studios developed a set of facial motion combinations that can represent all the sounds a human makes in English. These are called phonemes, grouped by sounds that look almost the same when spoken even though they sound different. Good information on the "Preston Blair" phoneme series is readily available online. Some software programs can now generate lip-sync animation data from input audio and scripts; examples include the professional animation package Magpie Pro, as well as the lip-sync systems built into the video games Half-Life 2 and The Movies.

Small sets of mouth flaps are useful in animation because they give flexibility in recording dialogue (depending on which method you prefer). Incidentally, they also make it much easier to dub into other languages. The higher budgets of movies, however, sometimes force ADR to add words that obviously shouldn't be coming out of a given mouth movement. (Indeed, some TV dubs do this as well.) See Lip Lock.

This is definitely an issue for translators: when the character has just been handed dinner and the most logical thing to say is "thanks", but the original line is "itadakimasu", the challenge is on. At best, it can lead to a Woolseyism; at worst, to a bunch of syllables that sound incredibly awkward.
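The grouping idea described above (many phonemes sharing one drawable mouth shape) can be sketched in a few lines of code. This is a minimal illustration only: the shape names and groupings below loosely follow the classic Preston Blair chart but are assumptions for the example, not the actual tables used by Magpie Pro or any game engine.

```python
# Illustrative sketch: map phonemes onto a small set of mouth shapes,
# loosely in the spirit of the Preston Blair chart. The groupings and
# shape names here are assumptions for illustration only.
PHONEME_TO_MOUTH_SHAPE = {
    # wide-open shapes
    "AA": "AI", "AE": "AI", "AY": "AI", "IH": "AI",
    # rounded shapes
    "OW": "O", "AO": "O", "UW": "U", "W": "U",
    # closed-lip consonants all share one drawing
    "M": "MBP", "B": "MBP", "P": "MBP",
    # lip-on-teeth consonants
    "F": "FV", "V": "FV",
    # tongue visible
    "L": "L",
    # relaxed/neutral
    "EH": "E", "IY": "E",
}

def mouth_flaps(phonemes):
    """Collapse a phoneme sequence into the sequence of mouth shapes an
    animator would draw, merging consecutive identical shapes."""
    shapes = []
    for p in phonemes:
        shape = PHONEME_TO_MOUTH_SHAPE.get(p, "E")  # default to neutral
        if not shapes or shapes[-1] != shape:
            shapes.append(shape)
    return shapes

# "Bob": closed lips, open mouth, closed lips again
print(mouth_flaps(["B", "AA", "B"]))  # ['MBP', 'AI', 'MBP']
```

Because many different sounds collapse onto the same shape, a dub line only has to hit the right shape at the right time, not the right sound, which is exactly why small mouth-flap sets make redubbing into other languages easier.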