Sensorimotor Learning Modulates Automatic Imitation in Visual Speech
People automatically imitate observed actions, including speech. Automatic imitation (AI) is linked to observation-execution associations in the mirror neuron system (MNS). AI is measured with interference tasks, in which prompts (say "ba" or "da") are paired with congruent or incongruent distracters (a video of someone saying "ba" or "da"). Faster responses for congruent than for incongruent prompt-distracter pairings signal AI. Observation-execution associations for speech actions are thought to be inflexible, unlike those for manual actions, which have been shown to be flexible. We trained participants to reinforce or abolish their AI response by giving them compatible training (say "ba" in response to a video of someone saying "ba") or incompatible training (say "ba" in response to a video of "da"). After training, the AI response was reduced for participants who received incompatible training, showing that the MNS for speech actions, like the MNS for manual actions, is flexible and shaped by experience.