*** SUMMARY ***

Previous studies have shown that people tend to associate happiness with female faces and anger with male faces, and vice versa (i.e., to perceive happy faces as more feminine and angry faces as more masculine) (Aguado et al., 2009; Becker et al., 2007; Harris et al., 2016; Hess et al., 2009; Korb & Massaccesi, 2020). This was also recently observed with ratings and categorisation of highly controlled avatar faces, and extended to the voice domain (Korb et al., in press). In the current project, we will further investigate this emotion × sex interaction in the perception of dynamic (as opposed to the previously used static) face stimuli, based on a perceptual phenomenon known as Representational Momentum (RM). As an effect of RM, the perceived offset of a dynamic event is systematically displaced forward, into the immediate future (Hubbard, 2005, 2010). RM was initially described for simple motion (Freyd & Finke, 1984), where it is thought to reflect the extrapolation of a target’s position along its anticipated path of motion (Hubbard, 2005, 2010), and was later observed in complex social events (Hudson et al., 2009, 2016), such as dynamic emotional faces (Jellema et al., 2011; Palumbo & Jellema, 2013; Yoshikawa & Sato, 2008; Prigent et al., 2018; Dozolme et al., 2018). For instance, Jellema et al. (2011) showed that neutral faces are rated as somewhat angry when preceded by a video showing a gradual change in emotional expression from happy to neutral, compared to a change from angry to neutral. This suggests that the immediate perceptual history of a dynamic face can bias the evaluation of that stimulus toward its anticipated emotional state. However, to the best of our knowledge, no studies have yet investigated whether sex changes in faces (e.g. a woman gradually changing into a man) can also induce RM effects. Thus, by combining both emotion and sex changes, this study will not only extend the research on the associations of maleness-anger and femaleness-happiness to the dynamic domain, but also provide innovative evidence on RM effects in face perception.

We will conduct an online experiment in which 240 participants will be shown short videos displaying an avatar’s face dynamically changing in emotion (e.g. 100% angry to 50/50% angry/happy), sex characteristics (e.g. 100% male to 50/50% male/female), or both emotion and sex characteristics (e.g. 100% angry male to 50/50% angry/happy and 50/50% male/female). Participants will be randomly assigned to one of two tasks and asked to subjectively rate either the emotion (EmoRate task) or the gender (SexRate task) of the last frame of each dynamic face. Moreover, we will manipulate participants’ expectations by adopting a procedure similar to that used by Coles et al. (2020): at the beginning of the task, participants will be informed either that previous research has shown that sex differences in emotional facial expressions do affect participants’ face judgments (positive priming), or that they do not affect participants’ judgments (negative priming). This will be used to investigate whether the perception of emotion, sex, and their interaction in faces can be modulated through a between-subjects priming manipulation.
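As a concrete illustration of the resulting 2 (task: EmoRate, SexRate) × 2 (priming: positive, negative) between-subjects design, the Python sketch below shows one way the 240 participants could be assigned to balanced cells. The even split of 60 participants per cell and the shuffling scheme are assumptions made for illustration, not part of the preregistered procedure.

```python
import random

# Minimal sketch of a balanced 2 (task) x 2 (priming) between-subjects
# assignment for 240 participants. The 60-per-cell split and the shuffling
# scheme are illustrative assumptions, not the preregistered allocation.
N_PARTICIPANTS = 240
TASKS = ["EmoRate", "SexRate"]
PRIMINGS = ["positive", "negative"]

cells = [(task, priming) for task in TASKS for priming in PRIMINGS]
assignment = cells * (N_PARTICIPANTS // len(cells))  # 60 participants per cell
random.shuffle(assignment)

for participant_id, (task, priming) in enumerate(assignment[:5], start=1):
    print(f"Participant {participant_id:03d}: task={task}, priming={priming}")
```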
*** EXPERIMENTAL TASK ***

Participants will be randomly assigned to one of the two tasks: EmoRate, in which they will explicitly rate the emotion shown by the last frame of the dynamic face; and SexRate, in which they will explicitly rate the sex shown by the last frame of the dynamic face. To manipulate participants’ expectations, each task will start with a cover story. Using a procedure similar to that used by Coles et al. (2020), participants will be randomly assigned to a positive or negative priming condition, and will read the cover story matching the task they were allocated to (negative priming wording in brackets):

- EmoRate: “Men are generally more aggressive, and [but] research has shown that this has an [no] effect on how emotion is perceived in male and female faces. For example, more anger is seen in male faces and more happiness is seen in female faces [the same amount of anger and happiness is seen in male and female faces].”
- SexRate: “Aggression is more widespread in men, and [but] research has shown that this has an [no] effect on how gender is perceived in faces. For example, more masculine features are seen in angry faces, and more feminine features are seen in happy faces [the same amount of masculine and feminine features is seen in angry and happy faces].”

Each trial will start with a white fixation cross at the center of a black screen (2 sec), followed by a short video of a dynamic avatar face, gradually changing in emotion only, sex only, or both emotion and sex. To create the videos, we will use a total of 57 of the original 112 highly controlled pictures of avatar faces generated by Korb et al. (in press), as shown in Fig. 1. Each video will begin with a 100% angry/happy and 100% male/female avatar face, followed by five pictures of avatar faces with different levels of emotion and/or sex morph, resulting in a total of 12 dynamic stimuli:

- changing in emotion only: angry-to-ambiguous male, angry-to-ambiguous female, happy-to-ambiguous male, happy-to-ambiguous female;
- changing in sex only: angry male-to-androgynous, angry female-to-androgynous, happy male-to-androgynous, happy female-to-androgynous;
- changing in emotion and sex simultaneously: angry-to-ambiguous male-to-androgynous, angry-to-ambiguous female-to-androgynous, happy-to-ambiguous male-to-androgynous, happy-to-ambiguous female-to-androgynous.

Previous research has found that the strength of the RM effect is often modulated by the target’s velocity (De Sá Teixeira & Oliveira, 2011; Hubbard, 2005, 2010; Yoshikawa & Sato, 2008). To investigate whether the velocity of changes in emotion and/or sex of faces impacts the corresponding momentum effects, videos will be shown at three different speeds, with total durations of 600, 1200, and 1800 ms (frame durations of 100, 200, and 300 ms, respectively). After a retention interval of 250 ms, participants will be asked to rate either the emotion (EmoRate) or the gender (SexRate) of the last frame of the dynamic face, as intuitively as possible. Responses will be made with a Visual Analogue Scale (VAS) ranging from “Very angry” to “Ambiguous” to “Very happy” (EmoRate), or from “Very male” to “Ambiguous” to “Very female” (SexRate) (Fig. 2). For similar procedures, see Jellema et al. (2011) and Palumbo & Jellema (2013). Participants will be required to press “Continue” to proceed to the next trial; if a participant fails to respond within 5 sec, the task will automatically proceed to the next trial. Both tasks will contain 180 trials, randomly presented and equally distributed across InitialFeatures (angry-male, angry-female, happy-male, happy-female), Change (emotion, sex, both), and Velocity (600, 1200, 1800 ms).
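To make the trial structure concrete, the sketch below builds one possible 180-trial list from the three within-subject factors described above and derives the per-frame duration from the total video duration (six frames per video). The variable names, the shuffling scheme, and the five repetitions per condition (180 / 36, implied by the numbers above) are illustrative assumptions rather than the actual experiment code.

```python
import itertools
import random

# Factor levels as described above (names are illustrative).
INITIAL_FEATURES = ["angry-male", "angry-female", "happy-male", "happy-female"]
CHANGE = ["emotion", "sex", "both"]
VELOCITY_MS = [600, 1200, 1800]  # total video duration in ms
N_FRAMES = 6                     # initial 100% face + 5 morphed frames

# 4 x 3 x 3 = 36 unique conditions; 180 trials imply 5 repetitions per condition.
conditions = list(itertools.product(INITIAL_FEATURES, CHANGE, VELOCITY_MS))
n_repetitions = 180 // len(conditions)

trials = conditions * n_repetitions
random.shuffle(trials)  # random trial order within each task

for initial, change, velocity in trials[:3]:  # preview the first few trials
    frame_duration = velocity // N_FRAMES     # 100, 200, or 300 ms per frame
    print(f"{initial:12s}  change={change:7s}  "
          f"video={velocity} ms  frame={frame_duration} ms")
```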
After the completion of the task, and similar to Coles et al. (2020), participants will rate how much they believe in the idea that people tend to perceive male faces as angrier and female faces as happier (or, depending on their task, that people tend to perceive more male features in angry faces and more female features in happy faces), using a VAS with the anchors “Not at all” and “Very much”.