
Emotion vs Speech Categorical Perception

articulation of phoneme

expression of emotion

Compare expressing an emotion by, say, smiling or frowning, with articulating a phoneme.
Both have a communicative function (on expressions of emotion, see for example \citealp{blair:2003_facial,sato:2007_spontaneous}) and both are categorically perceived, but the phonetic case has been more extensively investigated.

- communicative function

- communicative function

Variations due to coarticulation, rate of speech, dialect and many other factors mean that isolated acoustic signals are not generally diagnostic of phonemes: in different contexts, the same acoustic signal might be a consequence of the articulation of any of several phonemes.

- isolated acoustic signals not diagnostic

So here there is a parallel between speech and emotion. Much as isolated facial expressions are not diagnostic of emotions (as we saw a moment ago), isolated acoustic signals are plausibly not diagnostic of phonetic articulations.

- isolated facial expressions not diagnostic

Why then are isolated acoustic signals---which rarely even occur outside the lab---categorised by perceptual or motor processes at all? To answer this question we first need a rough idea of what it is to articulate a phoneme. Articulating a phoneme involves making coordinated movements of the lips, tongue, velum and larynx. How these should move depends in complex ways on numerous factors including phonetic context \citep{Browman:1992da,Goldstein:2003bn}. In preparing for such movements, it is plausible that the articulation of a particular phoneme is an outcome represented motorically, where this motor representation coordinates the movements and normally does so in such a way as to increase the probability that the outcome represented will occur.

- complex coordinated, goal-directed movements

This implies that the articulation of a particular phoneme, although probably not an intentional action, is a goal-directed action whose goal is the articulation of that phoneme.
(On the link between motor representation and goal-directed action, see \citealp{butterfill:2012_intention}.)
Now some hold that the things categorised in categorical perception of speech are not sounds or movements (say) but rather these outcomes---the very outcomes in terms of which speech actions are represented motorically (\citealp{Liberman:2000gr}; see also \citealp{Browman:1992da}).\footnote{Note that this claim does not entail commitment to other components of the motor theory of speech perception.} On this view, categorical perception of speech is a process which takes as input the bodily and acoustic effects of speech actions and attempts to identify which outcomes the actions are directed to bringing about, that is, which phonemes the speaker is attempting to articulate. That isolated acoustic signals can engage this process and thereby trigger categorical perception is merely a side-effect, albeit one with useful methodological consequences.

- complex coordinated, goal-directed movements

We can think of expressions of emotion as goal-directed in the same sense that articulations of phonemes are. They are actions whose goal is the expression of a particular emotional episode.
This may initially strike you as implausible given that such expressions of emotion can be spontaneous, unintentional and involuntary. But note that expressing an emotion by, say, smiling or frowning, whether intentionally or not, involves making coordinated movements of multiple muscles, where exactly what should move and how can depend in complex ways on contextual factors. That such an expression of emotion is a goal-directed action follows just from its involving motor expertise and being coordinated around an outcome (the goal) in virtue of that outcome being represented motorically.\footnote{To increase the plausibility of the conjecture under consideration, we should allow that some categorically perceived expressions of emotion are not goal-directed actions but events grounded by two or more goal-directed actions. For ease of exposition I shall ignore this complication.}
Recognising that some expressions of emotion are goal-directed actions in this sense makes it possible to explain what distinguishes a genuine expression of emotion of this sort, a smile say, from something unexpressive, like an exhalation of wind, which might in principle resemble the smile kinematically. Like any goal-directed action, genuine expressions of emotion of this sort are distinguished from their kinematically similar doppelgängers in being directed to outcomes by virtue of the coordinating role of motor representations and processes.
The wild conjecture under consideration, then, is that the things categorical perception is supposed to categorise, the ‘expressions of emotion’, are actions of a certain type, and that these are categorised by the outcomes to which they are directed.

What are the perceptual processes supposed to categorise?

Actions whose goals are to express certain emotions.

- The perceptual processes categorise events (not e.g. facial configurations).

Let me explain the increasingly bold commitments involved in accepting this conjecture.
First, the things categorised in categorical perception of expressions of emotion are events rather than configurations or anything static. (Note that this is consistent with the fact that static stimuli can trigger categorical perception; after all, static stimuli can also trigger motor representations of things like grasping \citep{borghi:2007_are}.)

- These events are not mere physiological reactions.

Second, these events are not mere physiological reactions (as we might intuitively take blushing to be) but things like frowning and smiling, whose performance involves motor expertise.\footnote{To emphasise, one consequence of this is that not everything which might intuitively be labelled as an expression of emotion is relevant to understanding what is categorised by perceptual processes. For example, in the right context a blush may signal emotion without requiring motor expertise.}

- These events are perceptually categorised by the outcomes to which they are directed.

Third, these events are perceptually categorised by the outcomes to which they are directed. That is, outcomes represented motorically in performing these actions are things by which these events are categorised in categorical perception.
Should we accept the wild conjecture? It goes well beyond the available evidence and currently lacks any reputable endorsement. In fact, we lack direct evidence for even the first of the increasingly bold commitments just mentioned (namely, the claim that the things categorically perceived are events). A further problem is that we know relatively little about the actions which, according to the wild conjecture, are the things categorical perception is supposed to categorise (\citealp[p.\ 47]{scherer:2013_understanding}; see also \citealp{scherer:2007_are} and \citealp{fernandez-dols:2013_advances}). However, the wild conjecture is less wild than the only published responses to the problems that motivate it.\footnote{See \citet[p.\ 15]{motley:1988_facial}: ‘particular emotions simply cannot be identified from psychophysiological responses’; and \citet[p.\ 289]{barrett:2011_context}: ‘scientists have created an artifact’.} And, as I shall now explain, several considerations make the wild conjecture seem at least worth testing.
Consider again the procedure used in testing for categorical perception. Each experiment begins with a system for categorising the stimuli (expressions). This initial system is either specified by the experimenters or, in some cases, generated by having the participants first divide the stimuli into categories using verbal labels or, occasionally, non-verbal decisions. The experiment then seeks to measure whether this initial system of categories predicts patterns in discrimination. But what determines which category each stimulus is assigned to in the initial system of categories? You might guess that it is a matter of how likely people think it is that each stimulus---a particular facial configuration, say---would be associated with a particular emotion. In fact this is wrong. Instead, each stimulus is categorised in the initial system according to how suitable people think such an expression would be to express a given emotion: this is true whether the stimuli are facial \citep{horstmann:2002_facial} or vocal \citep{laukka:2011_exploring} expressions of emotion (see also \citealp[pp.\ 98--9]{parkinson:2013_contextualizing}). To repeat, in explicitly assigning an expression to a category of emotion, people are not making a judgement about the probability of someone with that expression having that emotion: they are making a judgement about which category of emotion the expression is most suited to expressing.

Why is this relevant to understanding what perceptual processes categorise? The most straightforward way of interpreting the experiments on categorical perception is to suppose that they are testing whether perceptual processes categorise stimuli in the same way as the initial system of categories does. But we have just seen that the initial system categorises stimuli according to the emotions they would be best suited to expressing. So on the most straightforward interpretation, the experiments on categorical perception of expressions of emotion are testing whether there are perceptual processes whose function is to categorise actions of a certain type by the outcomes to which they are directed. The wild conjecture is thus needed for the most straightforward interpretation of these experiments. This doesn't make it true, but it does make it worth testing.
So far I have focussed on evidence for categorical perception from experiments using faces as stimuli. However, there is also evidence that perceptual processes categorise vocal and facial expressions alike (\citealp{grandjean:2005_voices,laukka:2005_categorical}; see also \citealp{jaywant:2012_categorical}). We also know that judgements about which emotion an observed person is expressing in a photograph can depend on the posture of the whole body and not only the face \citep{aviezer:2012_body}, and that various contextual factors can affect how even rapidly occurring perceptual processes discriminate expressions of emotion \citep{righart:2008_rapid}. There is even indirect evidence that categorical perception may concern whole bodies rather than just faces or voices \citep{aviezer:2008_angry,aviezer:2011_automaticity}. In short, categorical perception of expressions of emotion plausibly resembles categorical perception of speech in being a multimodal phenomenon which concerns the whole body and is affected by several types of contextual feature. This is consistent with the wild conjecture we are considering. The conjecture generates the further prediction that the effects of context on categorical perception of expressions of emotion will resemble the myriad effects of context on categorical perception of speech, so that `every potential cue ... is an actual cue' (\citealp[p.\ 11]{Liberman:1985bn}; for evidence of context effects in categorical perception of speech see, for example, \citealp{Repp:1987xo}; \citealp[pp.\ 72--5]{Nygaard:1995po}; \citealp[p.\ 44]{Jusczyk:1997lz}).

modest hypothesis about perceptual experience of emotion

Information about others’ emotions can facilitate categorical perception of their expressions of emotion,

which gives rise to phenomenal expectations concerning their bodily configurations, articulations and movements.