Wednesday, May 4, 2022

A large majority of the population agreed that certain people have better aesthetic taste than others; only 1.3pct considered good taste to be determined by the taste of experts

De Gustibus Est Disputandum: An Empirical Investigation of the Folk Concept of Aesthetic Taste. Constant Bonard, Florian Cova, Steve Humbert-Droz. Chapter in Perspectives on Taste, 1st ed, Routledge, 2022. ISBN 9781003184225. https://www.taylorfrancis.com/chapters/edit/10.4324/9781003184225-7/de-gustibus-est-disputandum-constant-bonard-florian-cova-steve-humbert-droz

Abstract: Past research on folk aesthetics has suggested that most people are subjectivists when it comes to aesthetic judgment. However, most people also make a distinction between good and bad aesthetic taste. To understand the extent to which these two observations conflict with one another, we need a better understanding of people’s everyday concept of aesthetic taste. In this chapter, we present the results of a study in which participants drawn from a representative sample of the US population were asked whether they usually distinguish between good and bad taste, how they define them, and whether aesthetic taste can be improved. Those who answered positively to the first question were asked to provide their definition of good and bad taste, while those who answered positively to the third question were asked to detail by what means taste can be improved. Our results suggest that most people distinguish between good and bad taste and think taste can be improved. People’s definitions of good and bad taste were varied and were torn between very subjectivist conceptions of taste and others that lent themselves to a more objectivist interpretation. Overall, our results suggest that the tension Hume observed in conceptions of aesthetic taste is still present today.



The relationship between facial width-to-height ratio and aggressiveness is strongest for males aged 27–33 and for females aged 34–61

Tracking sexual dimorphism of facial width-to-height ratio across the lifespan: implications for perceived aggressiveness. Stephanie Summersby, Bonnie Harris, Thomas F. Denson and David White. Royal Society Open Science, Vol 9 Iss 5, May 4 2022. https://doi.org/10.1098/rsos.211500

Abstract: The facial width-to-height ratio (FWHR) influences social judgements like perceived aggression. This may be because FWHR is a sexually dimorphic feature, with males having higher FWHR than females. However, evidence for sexual dimorphism is mixed, little is known about how it varies with age, and the relationship between sexual dimorphism and perceived aggressiveness is unclear. We addressed these gaps by measuring FWHR of 17 607 passport images of male and female faces across the lifespan. We found larger FWHR in males only in young adulthood, aligning with the stage most commonly associated with mate selection and intrasexual competition. However, the direction of dimorphism was reversed after 48 years of age, with females recording larger FWHRs than males. We then examined how natural variation in FWHR affected perceived aggressiveness. The relationship between FWHR and perceived aggressiveness was strongest for males at 27–33 and females at 34–61. Raters were most sensitive to differences in FWHR for young adult male faces, pointing to enhanced sensitivity to FWHR as a cue to aggressiveness. This may reflect a common mechanism for evaluating male aggressiveness from variability in structural (FWHR) and malleable (emotional expression) aspects of the face.
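The FWHR itself is a simple landmark ratio, conventionally computed as bizygomatic width divided by upper-face height. The sketch below is a rough, hedged illustration of that computation only; the paper's own landmarking pipeline is not reproduced here, and the landmark names and coordinates are placeholders.

```python
# Minimal sketch of the conventional FWHR computation from 2D facial landmarks.
# Landmark names and coordinates are illustrative placeholders, not the study's
# actual measurement procedure.

def fwhr(landmarks):
    """Facial width-to-height ratio: bizygomatic width / upper-face height."""
    (lx, ly) = landmarks["left_zygion"]    # left cheekbone extreme
    (rx, ry) = landmarks["right_zygion"]   # right cheekbone extreme
    (_, brow_y) = landmarks["mid_brow"]    # midpoint of the eyebrows
    (_, lip_y) = landmarks["upper_lip"]    # top of the upper lip
    width = abs(rx - lx)                   # bizygomatic width
    height = abs(lip_y - brow_y)           # upper-face height
    return width / height

# Example with made-up pixel coordinates (x, y):
example = {
    "left_zygion": (100, 260),
    "right_zygion": (300, 260),
    "mid_brow": (200, 200),
    "upper_lip": (200, 310),
}
print(round(fwhr(example), 2))  # -> 1.82
```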

4. General discussion

The present study was a large-scale investigation into sexual dimorphism in the FWHR across the lifespan. We found that sexual dimorphism was present in our sample of over 17 000 Australian citizens. In younger to middle adulthood, the FWHR in men was greater than the FWHR in women. From the age of 48 onwards, this pattern was reversed such that women had larger FWHRs than men. One qualification is that the differences between FWHRs in men and women were very small across the age groups. These small effect sizes are consistent with the effect size reported for sexual dimorphism in a meta-analysis (d = 0.11; [5]).

The predicted sexual dimorphism was therefore only observed in a surprisingly narrow age band in early adulthood. Yet this pattern in young adults is consistent with the view that FWHR is an evolutionarily important cue to physical formidability, as sexual dimorphism in this age band aligns with the period of life most commonly associated with mate selection and intrasexual competition. The reversal of this dimorphism in middle and late adulthood, with progressively larger female compared to male FWHR, is more difficult to explain. It is possible that there are broader physical changes in ageing that explain the pattern. For example, because body mass index (BMI) is moderately correlated with the FWHR (r = 0.31; [5]), one possibility is that age-related BMI changes are different for males and females. Other possibilities are that the reversal in dimorphism is connected to age-related structural changes to the faces, such as differences in the rate of face lengthening with age [42]. It is also possible that increasingly fewer males with higher FWHR apply for passports later in life—perhaps because many men with the largest FWHR may be removed from society via incarceration [43] or early mortality relative to women—or that the difference is affected by changes in head pose behaviour in males and females of different ages [44].

We also tested the relationships between the FWHR and perceived aggressiveness in men and women across the lifespan. In their meta-analysis, Geniole et al. [5] found that the relationship between the FWHR and perceived aggressiveness was stronger for younger faces than older faces, but there was no evidence of moderation by sex. By contrast, we found that the relationship between the FWHR and perceived aggressiveness was strongest for male faces in the youngest age group (27–33 years old), but from 34–61 years old, this relationship was strongest for female faces. These results suggest that the effect of FWHR on perceived aggressiveness ratings varies as a function of age and sex.

Moreover, the effect of FWHR on perceived aggressiveness was somewhat independent from physical variation in FWHR in these age groups. Aggressiveness ratings of faces in the 34 to 40 age range show greater modulation for female faces, despite there being more physical FWHR variation in male faces (see the electronic supplementary material, figure S4). This shows increased sensitivity to FWHR in the youngest male faces when people evaluate perceived aggressiveness, albeit restricted to a relatively narrow age band. This is consistent with results showing that people are more sensitive to threatening emotional expressions in male compared with female faces [45,46] and may point to a common mechanism responsible for processing FWHR-related and expression-related cues to threat. In face perception research more broadly, it has been proposed that our social impressions of structural aspects of faces are shaped by social learning of facial expressions [47,48], for example, that trustworthiness judgements from structural properties of faces are linked to transient changes such as smiling or warm expressions (e.g. [49]). Future work examining whether other threat cues are also modulated by face age can potentially help to resolve whether similar social learning mechanisms are involved in perceived aggressiveness.

Another potential explanation of these findings is that the apparent increase in sensitivity to FWHR cues in young males was owing to participants being mostly undergraduate students. For face identity processing at least, there is consistent evidence that people develop perceptual expertise specifically for faces fitting the viewer's demographic profile, including faces of the same age as the viewer [51,52]. This raises the possibility that the apparent perceptual sensitivity to FWHR in young faces that we observe may be specific to the younger participants in our study. However, we note that this ‘own age effect’ is reported mostly in identity memory-based recognition tasks and is not consistently found for other types of identity processing task formats [53], or for other types of face judgements [54]. Moreover, participants' age is not known to affect perceptions of aggression [56]. Nevertheless, this is an intriguing question that could be addressed in future work.

The enhanced effect of FWHR on aggressiveness for young men is consistent with evolutionary perspectives on the FWHR as a cue to physical formidability. However, the relationship between the FWHR and aggressiveness for women in middle and late adulthood is more difficult to explain. Physical variation in FWHR in these age groups was greater for male faces (electronic supplementary material, figure S4), but the effect this variation had on aggressiveness ratings was higher for female faces (figure 4). The differences in ratings of aggression for younger and older men and women may be related to age-specific facial adiposity (the perception of weight in the face). Higher facial adiposity has been associated with higher perceptions of male facial masculinity [57], and lower perceptions of female facial femininity [58]. Thus, one possibility is that age-related changes in facial adiposity are different for males and females, and could be contributing to the sex differences in FWHR and perceived aggressiveness.

Another plausible explanation is that differences in sensitivity to FWHR cues were mediated by widely held stereotypes of masculinity and femininity in younger and older men and women. Men with relatively larger FWHRs are considered masculine, unattractive and physically formidable [5]. As in previous research, we found that these men were also considered likely to become aggressive if provoked [5]. This perception was largest among young men and faded with time. This finding is consistent with the stereotype of older men becoming weaker and less formidable with age, while younger men with relatively large FWHRs were probably viewed as ‘fighting fit’. By contrast, younger women are stereotyped as more feminine, attractive and passive than older women. Thus, participants' judgements of aggressiveness may have been relatively unaffected by the FWHR of the young women. However, stereotypes of older women can be particularly harmful, as they lead to appearance-based discrimination [59]. In this case, the larger FWHRs may have elicited a global negative evaluation bias because the women were not stereotypically attractive, feminine women.

These proposals should be treated as speculative for now. Indeed, there should be some caution when interpreting associations between FWHR and social impressions more generally. The use of discrete anthropometric ratios can sometimes be misleading owing to their associations with other surrounding measures in the body and face (e.g. [60]). This can lead to effects emerging as a consequence of interactions with other interrelated traits (e.g. attractiveness, facial masculinity, facial maturity and anger resemblance, [56]) rather than a consequence of the ratio itself. Therefore, it is important for future research to explore potential traits that may be interacting with the FWHR to impact judgements such as perceived aggressiveness in younger and older low- and high-FWHR faces.

Notwithstanding, to our knowledge our results provide the most comprehensive analysis of FWHR across the lifespan to date. We show that the sexual dimorphism of this trait is consistent with a secondary sexual characteristic that signals formidability in young males. We also show that perceivers were particularly sensitive to FWHR variation in young faces when evaluating perceived aggressiveness. Understanding the causes of face age dependency on perceptions of aggressiveness would be a worthwhile focus of future work.

Greedy individuals more often self-selected themselves into business-related environments, which presumably allow them to fulfill their greed-related need to earn a lot of money

The development of trait greed during young adulthood: A simultaneous investigation of environmental effects and negative core beliefs. Patrick Mussel et al. European Journal of Personality, May 3, 2022. https://doi.org/10.1177/08902070221090101

Abstract: Recent models of personality development have emphasized the role of the environment in terms of selection and socialization effects and their interaction. Our study provides partial evidence for these models and, crucially, extends these models by adding a person variable: Core beliefs, which are defined as mental representations of experiences that individuals have while pursuing need-fulfilling goals. Specifically, we report results from a longitudinal investigation of the development of trait greed across time. Based on data from the German Personality Panel, we analyzed data on 1,965 young adults on up to 4 occasions, spanning a period of more than 3 years. According to our results, negative core beliefs that have so far been proposed only in the clinical literature (e.g., being unloved or being insecure) contributed to the development of trait greed, indicating that striving for material goals might be a substitute for unmet needs in the past. Additionally, greedy individuals more often self-selected themselves into business-related environments, which presumably allow them to fulfill their greed-related need to earn a lot of money. Our results expose important mechanisms for trait greed development. Regarding personality development in general, core beliefs were identified as an important variable for future theory building.

Keywords: personality development, trait greed, core beliefs, environment


Tuesday, May 3, 2022

COVID-19: people started to remember more dreams, & dream content changed as themes like sickness, confinement, and—in the English-speaking world—even bugs began to dominate

Dreams and nightmares during the pandemic. Severin Ableidinger, Franziska Nierwetberg & Brigitte Holzinger. Somnologie, May 3 2022. https://link.springer.com/article/10.1007/s11818-022-00351-x

Abstract: The pandemic caused by the coronavirus disease 2019 (COVID-19) had a huge impact on public mental health. This was also reflected in dreams. Not only did people start to remember more dreams, but dream content changed as themes like sickness, confinement, and—in the English-speaking world—even bugs began to dominate. This also led to an increase in nightmare frequency. There are various factors that contributed to this change in the dream landscape. Some people have started to sleep more and hereby spend more time in REM sleep, which is known to increase dream recall and further lead to bizarre and vivid dreams. On the other hand, stress and poor mental health had an impact on sleep, and sleep quality thus dropped in many individuals. Poor sleep quality can also lead to an increase in dream recall. Dreams are known to regulate mood, so the rise in dreams and the change in dream content could also reflect a reaction to the overall rise in stress and decline in mental health. Recent studies have shown that as the pandemic progresses, further changes in mental health, dream recall, and dream content arise, but data are still scarce. Further research could help understand the impact the pandemic still has on mental health and dreams, and how this impact is changing over the course of the pandemic.


Making sense of changed dreams

But why do people remember more dreams? There are various factors and theories on dream recall, but including all of them would go beyond the scope of this paper. Thus, only the factors identified in the cited studies will be mentioned here; for further factors and theories relevant to dream recall, see [34]. One possible explanation for people remembering more dreams is that people could be getting more sleep. As people started working from home, and sometimes had reduced workloads, they no longer needed to commute to work, school, or university. Having been spared this time, many people could now afford to sleep in. This translated into people reporting longer sleep times [27] and better sleep quality [13]. Interestingly, many did not change their time for going to bed, even though some went to bed later, but rather changed their time for waking up and getting out of bed, and in this manner increased their total bedtime [1]. This has the effect of enhancing REM sleep, as the proportion of REM sleep in one sleep cycle increases in the second half of the night and before waking up [24]. So the relative amount of REM sleep gets larger later in the night or, more precisely, later in the total time slept, as not everyone sleeps solely during the night. Even though the idea that REM sleep is equivalent to dreaming has since been rejected, and dream research has shown that dreams also occur in non-REM sleep, REM sleep dreams are characterized by being much more vivid, bizarre, and emotional than non-REM sleep dreams [20]. It is therefore plausible that longer bedtime and more sleep are responsible for higher dream recall. But not everybody is getting more sleep. In fact, many studies report insomnia, lower sleep quality, and more awakenings, which should reduce the total sleep time instead of increasing it [22, 25, 27].

In a multinational study, worse sleep quality was found in about 20%, while better sleep quality was found in only 5% [25]. A study comparing sleep quality during and after a lockdown found that 46.07% reported low sleep quality during the lockdown. After the lockdown, people had significantly greater ease falling asleep and fewer awakenings during the night [30]. A different study found similar results, as people experienced longer sleep latency and more difficulties falling asleep during lockdown [1]. However, other research found that sleep quality did not increase after the end of a lockdown [11]. Indeed, it has been found that sleep quality and dream recall are associated, as people with lower sleep quality remembered dreams more frequently [8]. These results received further support as a recent study found dream recall and nightmare frequency to be solely associated with bad sleep quality and not with daily worries about COVID-19 [36].

Many different factors contribute to sleep quality, ranging from stress, physical activity, and physical health to various psychological factors [3, 16]. Since many people experienced a lot of stress during the pandemic, it is logical to expect their sleep quality to decline. Additionally, it has been suggested that people neglected their sleep hygiene by drinking more alcohol [26] or having longer screen time, even before sleeping [29].

Another factor related to dreaming is mental health. Although the existing literature is not conclusive on the relationship between overall mental health and dream recall, it has been shown that stress is related to dream recall [35] and psychopathology to nightmare frequency [28]. As mentioned before, the stressors of the pandemic had a huge impact on public mental health. This translated into a rise in anxiety, depression, insomnia, and PTSD symptoms [5, 22]. Research has shown that heightened dream recall during the pandemic was especially pronounced in people with poorer mental health [8]. But not everyone was affected equally by the pandemic. Mental health problems and high dream recall were especially frequent in women and younger people [8, 22]. COVID-19 infections also increase the risk for anxiety, depression, and PTSD [31]. Additional pandemic-related factors, like financial burden, social isolation, and subjective risks, have also been identified as contributing to mental health and dream recall during the pandemic. It has been shown that those who were more severely affected also suffered worse mental health consequences, like a heightened risk for insomnia [22], and experienced the biggest changes in their dreams and dream content [33]. Further, anxiety and PTSD are known to have a connection to nightmares [21]: 80% of PTSD patients suffer from regular nightmares [21]. So the rise in symptoms of both anxiety and PTSD could also contribute to the rise in dream recall and nightmares. Nightmares themselves can also lead to various disturbances. Decline in sleep quality, mood disturbances, daytime sleepiness, and cognitive impairment are some of the harmful results nightmares can bring [21]. In the context of the pandemic, this could indicate a vicious circle, where daily stress leads to nightmares, which themselves further lead to various problems and more daily stress.

Dreams are often seen as a mechanism that regulates our mood [4, 37]. Therefore, increased dream recall could further indicate a natural way to deal with the overall stress during the pandemic. This would also be consistent with studies reporting changes in dream frequencies after other significant and threatening events (see [24]).

Conclusion

The pandemic has had a huge impact. It has affected mental health and caused various psychological symptoms. Further, it has influenced dream content and led to a rise in dream recall. As dreams became more negative and themes of sickness and pandemic arose, nightmare frequency also spiked. This demonstrates how our daily lives influence our dreams, and how dreams reflect our mental health and wellbeing. Further, the magnitude of these changes shows how much of an impact the pandemic has had on the general population. This also implies that sudden changes in dream recall and dream content can reflect significant events and changes in mental health and sleep quality. So assessing dreams and dream frequency could help detect changes in mental wellbeing in the population.

Although the cited studies seem to show that the overall dream landscape has changed, it is necessary to emphasize that participation in all of these studies was voluntary. Therefore, people who had already noticed a change in dream and nightmare frequency or dream content may have been especially likely to participate in studies focused on dreams and COVID-19. However, the study by Callagher and Incelli [9] specifically tested whether this sort of self-selection bias leads to an overestimation of the changes in dream and nightmare frequency and dream content, or even creates the described changes in the first place. They did not give their participants any information about the purpose of the study before participation, and yet they still found the described changes. However, all of the studies comparing pre-pandemic to pandemic dreams asked retrospectively, so there could be a memory bias.

In this article the continuity theory of dreaming has already been mentioned, as has emotion regulation, which is also central to various theories of dreaming. However, there are many more theories regarding the purpose of dreams. A complete listing of all theories and models would go beyond the scope of this review; for further reading, the reader is referred to [10, 19].

As of now, most studies have explored the impact at the beginning of the pandemic, but more recent research has shown that there are differences between lockdown and post-lockdown periods, and that both differ from pre-pandemic times. But data on how dreams change over the course of the pandemic are still lacking. It seems as if mental health problems worsen the longer the pandemic progresses, and this should also be reflected in dreams. So assessing dreams could help in finding trends regarding how mental health and wellbeing change throughout the course of the pandemic. Further, as dreams work as an emotional regulation mechanism, assessing nightmare and dream frequency could help assess whether this mechanism still functions properly and indicate heightened emotional load.

Clothing contributed to person misidentification just as much as the face at first sight but became less important over time, suggesting a dynamic shift in the cues used for person identification

Miura, Hiroshi, Daisuke Shimane, and Yuji Itoh. 2022. “Person Misidentification Occurs in One-half of Cases: Demonstration in a Field Experiment.” PsyArXiv. May 3. doi:10.31234/osf.io/2etpm

Abstract: Few studies have examined person misidentification. Due to the lack of experimental studies, the reason person misidentification occurs, and its occurrence rate, remain unclear. Therefore, we aimed to demonstrate in a field experiment that person misidentification occurs with a certain probability. We also wanted to examine whether the similarity between two people affects the occurrence of person misidentification. When 66 undergraduate participants made a rendezvous with an acquaintance, another person who wore similar clothes to the acquaintance or had a similar face appeared. The results showed that in both conditions, approximately half of the participants made the person misidentification error, and one-fourth even spoke to the person mistakenly. Moreover, the results indicated that clothing contributed to person misidentification just as much as the face at first sight but became less important over time. This suggests a dynamic shift in person identification over time.


Preliminary support for the hypothesis that female orgasm evolved as a mate-selection tool for females and promotes long-term pair bonding, but not for the hypothesis that it evolved as an indicator of male mate value

The Effect of Female Orgasm Frequency on Female Mate Selection: A Test of Two Hypotheses. Patrick J. Nebl, Anne K. Gordon. Evolutionary Psychology, March 9, 2022. https://doi.org/10.1177/14747049221083536

Abstract: Female orgasm has been a mystery that psychologists have been attempting to understand for decades. Many have contended that female orgasm is a functionless by-product of male orgasm, while others have argued that female orgasm may be an adaptation in its own right, offering several adaptationist accounts of female orgasm. In the current research, we tested predictions derived from two hypotheses regarding adaptive functions of female orgasm: female orgasm indicates partner mate value, or female orgasm promotes long-term pair bonding. 199 female undergraduates participated in an experiment where they imagined themselves as a member of a romantic relationship provided in a scenario. Within these scenarios, the relationships varied between short- and long-term, and the frequency with which the female experienced orgasm during intercourse varied among never, occasionally, and almost always. Participants answered questions regarding relationship satisfaction and perceptions of the fictional relationship. A series of analyses of variance (ANOVAs) indicated that females assigned to conditions of experiencing more frequent orgasms reported greater relationship satisfaction, across both short- and long-term relationships. The relationship between female orgasm frequency and relationship satisfaction was fully mediated by the female's perceived love for her hypothetical partner but not by perceptions of her hypothetical partner's commitment. Taken together, this study provides preliminary support for the hypothesis that female orgasm evolved as a mate-selection tool for females and promotes long-term pair bonding but does not provide support for the hypothesis that female orgasm evolved as an indicator of male value.

Keywords: orgasm, female orgasm, human sexuality, human mating, mate selection

According to the “Mr. Right” hypothesis of mate choice, the frequency of female orgasm acts as an indicator that helps females evaluate males during short-term mating contexts as potential long-term mates. If this hypothesis is valid, then when evaluating a male's care and commitment toward her, females should be sensitive to information about a man's ability to bring her to orgasm.

According to the long-term pair bonding hypothesis of mate choice, female orgasm promotes female attachment to a male with whom she experiences orgasm, thereby fostering long-term relationships. If this hypothesis is valid, then a female's feelings of attachment and closeness to a mate should be affected by experiencing orgasm with the mate in long-term contexts.

In this study, female participants read a scenario about a relationship between a male and a female and imagined themselves as the female in the relationship. Within these scenarios, two independent variables were manipulated: female orgasm frequency and relationship context. The female's frequency of orgasm varied among three levels (never, occasionally, and almost always). The relationship context varied between two levels (short-term or long-term). The main dependent measures included the female's perceptions of the relationship's quality, including her degree of relationship satisfaction and commitment.

  • Prediction 1: Based on the mate-choice hypothesis, we expected that females, regardless of context, who role-played experiencing higher frequencies of orgasm would report greater relationship satisfaction.

  • Prediction 2: If the Mr. Right hypothesis of mate choice is correct, we expected this relationship between female orgasm frequency and relationship satisfaction to be at least partially mediated by females’ perceptions of their partner's commitment.

  • Prediction 3: If the long-term pair bonding hypothesis of mate choice is correct, we expected this relationship between female orgasm frequency and relationship satisfaction to be at least partially mediated by love for her partner (see the illustrative mediation sketch below).
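For readers who want to see what "mediated by" means operationally, here is a minimal sketch of a bootstrap test of an indirect effect on simulated data. It is not the authors' analysis pipeline: the paper reports ANOVAs and mediation analyses, while the variable names, simulated values, and the specific bootstrap procedure below are illustrative assumptions only.

```python
# Illustrative bootstrap mediation test (indirect effect a*b) on simulated data.
# Hypothetical variables: X = orgasm-frequency condition (0/1/2),
# M = perceived partner love, Y = relationship satisfaction.
import numpy as np

rng = np.random.default_rng(0)
n = 199
X = rng.integers(0, 3, n).astype(float)      # never / occasionally / almost always
M = 0.6 * X + rng.normal(size=n)             # mediator (e.g., perceived love)
Y = 0.5 * M + 0.1 * X + rng.normal(size=n)   # outcome (relationship satisfaction)

def ols(y, *cols):
    """OLS coefficients for y regressed on an intercept plus the given columns."""
    Xmat = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(Xmat, y, rcond=None)
    return beta

def indirect_effect(x, m, y):
    a = ols(m, x)[1]      # path a: X -> M
    b = ols(y, x, m)[2]   # path b: M -> Y, controlling for X
    return a * b

# Percentile bootstrap confidence interval for the indirect effect.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(X[idx], M[idx], Y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(X, M, Y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

If the confidence interval for the indirect effect excludes zero, the mediator (here, perceived love) carries part of the effect of orgasm frequency on satisfaction, which is the pattern Prediction 3 anticipates.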

Girls score higher in tested emotional intelligence, but boys tend to overestimate their performance while girls underestimate theirs; these opposite tendencies may amplify the distance between the emotional worlds of boys and girls, probably increasing gender conflict

Sex differences in emotional and meta-emotional intelligence in pre-adolescents and adolescents. Antonella D'Amico, Alessandro Geraci. Acta Psychologica, Volume 227, July 2022, 103594. https://doi.org/10.1016/j.actpsy.2022.103594

Highlights

• Girls show higher levels of ability EI than boys.

• Boys report higher levels of self-reported EI than girls.

• Boys tend to overestimate their emotional abilities compared to girls.

• Girls tend to underestimate their emotional abilities compared to boys.

• Girls report higher scores than boys in meta-emotional beliefs.

Abstract: The study focuses on sex differences in emotional and meta-emotional intelligence in a sample of 355 pre-adolescents and 164 adolescents. Emotional and meta-emotional intelligence were measured using the multi-trait multi-method IE-ACCME test, which allows individuals' profiles of ability EI, emotional self-concept, meta-emotional knowledge, meta-emotional ability in self-evaluation, and meta-emotional beliefs to be defined. Meta-emotional dimensions refer to the awareness of individuals about their emotional abilities and to their beliefs about the functioning of emotions in everyday life. Results demonstrated that girls scored better than boys in ability EI, particularly in the adolescent group, whereas boys reported higher scores than girls in emotional self-concept in both age groups. Results for meta-emotional knowledge and meta-emotional ability in self-evaluation revealed that boys systematically overestimate their emotional abilities whereas girls, particularly in the adolescent group, tend to underestimate them. Finally, in both age groups, girls scored higher than boys in meta-emotional beliefs. The adoption of the meta-emotional intelligence framework may help to explain the discordances about sex differences found in previous studies using self-report vs. performance measures of EI. Moreover, it may contribute to shedding light on the nature-nurture debate and on the role of meta-emotional variables in explaining sex differences in EI.

Keywords: Emotional intelligence, Meta-emotional intelligence, Sex, Age, Pre-adolescence, Adolescence

4. Discussion

Our results demonstrate that there are many differences in girls' and boys' emotional and meta-emotional intelligence and that these are also influenced by age. Even if effect sizes are small, these differences are quite systematic, and we consider them noteworthy. Consistent with previous literature (Curci & D'Amico, 2010; Cabello et al., 2016; Day & Carroll, 2004; Fernández-Berrocal et al., 2012; Gutiérrez-Cobo et al., 2016; Mayer et al., 1999; Mayer et al., 2002; Rivers et al., 2012) and with results already obtained by D'Amico (2013), girls score better than boys in ability EI, and this is particularly evident in the adolescent group. Results for emotional self-concept present an opposite pattern, with boys reporting higher scores than girls in both age groups and in particular in the adolescent group, which is consistent with some studies (Khan & Bat, 2013; Petrides & Furnham, 2000). In contrast to what was found by D'Amico (2013), there are no sex differences in the self-rating about performance scale, indicating that when specific emotional tasks are considered, boys and girls are equally accurate in evaluating their own performance. However, the age-group differences in self-rating about performance indicate that, independently of sex, adolescents are more parsimonious and critical in evaluating their own performance than pre-adolescents. This is consistent with results found by D'Amico (2013) in the standardization sample, indicating that scores in the self-rating about performance scale were negatively related to age (r = −0.152, p < .005). Girls show higher levels of meta-emotional knowledge than boys, and boys systematically overestimate their emotional ability in everyday situations. This result is particularly evident among adolescents, where boys overestimate and girls underestimate their emotional abilities. In the pre-adolescent group, both sexes overestimate their abilities, but again the overestimation is higher for boys than for girls.

Concerning meta-emotional self-evaluation, independently of sex, pre-adolescents overestimate their performance in the ability test more than adolescents. However, there are sex and age differences in the direction of estimation: in the pre-adolescent group, both boys and girls tend to overestimate their performance, whereas in the adolescent group, boys overestimate and girls underestimate their performance in the ability test. Finally, consistent with D'Amico (2013), we found that girls, in both age groups, scored higher than boys in meta-emotional beliefs.

These results revive the famous debate on nature and culture: by showing that the gap between the sexes is larger in the older than in the younger group, they seem to give more weight to the culture pole. As previously argued, culture may increasingly influence sex differences over the lifespan through the different styles that boys and girls adopt in sharing their emotions with others: typically, women, compared to men, feel the need to share their problems with others. These differences in what we could call coping strategies (coping/isolation vs. coping/sharing) are likely to be at the basis of EI differences, especially in the case of measurement tools, like the MSCEIT or the IE-ACCME ability test, that are scored using the consensus criterion. As is well known, under the consensus criterion the score assigned to each answer in the ability test corresponds to the percentage of subjects who consider that answer valid. In other words, the “best” answers are those chosen by the largest number of subjects (i.e., the statistical mode). This procedure implies that people who obtain higher scores in emotional intelligence are not better than others but, rather, more similar to the rest of the population in the way they feel emotions. In this perspective, D'Amico (2018) defined ability EI as the ability to tune in with others or, in other words, to process emotional experience in the same way as others. Thus, the tendency of girls, teenagers and then women to share their emotions with peers, and to listen to others' emotional issues, could generate their higher ability to tune in with others, and it could therefore be the basis for this form of intelligence, which consists in feeling emotions like others and not differently from others. On the contrary, boys and men are probably less inclined to share emotions with others and to participate in the personal and social building of emotional consensus. This could be an obstacle to developing adequate perspective-taking abilities and the ability to tune in with others.
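A minimal sketch of the consensus-scoring logic described above, under the assumption that each answer is simply credited with the share of respondents who chose it (the IE-ACCME's actual scoring rules may differ in detail; items and responses are made up for illustration):

```python
# Consensus scoring: each response is credited with the proportion of the
# sample that gave the same response, so the modal ("most tuned-in") answer
# scores highest. Items and responses below are made up for illustration.
from collections import Counter

def consensus_weights(responses_per_item):
    """For each item, map every option to the share of respondents choosing it."""
    weights = []
    for responses in responses_per_item:
        counts = Counter(responses)
        total = len(responses)
        weights.append({opt: c / total for opt, c in counts.items()})
    return weights

def consensus_score(person_answers, weights):
    """Average consensus weight of a person's answers across items."""
    return sum(w.get(ans, 0.0) for ans, w in zip(person_answers, weights)) / len(weights)

# Example: 3 items, 10 respondents each (options 'a'-'d').
sample = [
    list("aaaaaabbcd"),   # item 1: 'a' is modal (0.6)
    list("bbbbccccaa"),   # item 2: 'b' and 'c' tie (0.4 each)
    list("ddddddddca"),   # item 3: 'd' is modal (0.8)
]
w = consensus_weights(sample)
print(round(consensus_score(["a", "b", "d"], w), 3))  # -> 0.6 (mostly modal answers)
print(round(consensus_score(["c", "a", "c"], w), 3))  # -> 0.133 (mostly non-modal answers)
```

The example makes the point in the text concrete: a high scorer is not answering "better" in any absolute sense, only more like the rest of the sample.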

As already said, the reason many boys and men are less inclined to share their emotions with others probably stems from education and culture. Indeed, this style becomes more pronounced with age: in almost all cultures, boys are expected, even as children, to be less sensitive than girls. Boys are expected not to cry, not to show or share emotions, and to follow reason. On the contrary, girls are expected to follow feelings and to talk about emotions, and this expectation probably turns into a real lifelong “exercise of emotions”. However, our results seem to demonstrate that girls are not aware of their high emotional ability. Indeed, consistent with claims by Ciarrochi and colleagues, adolescent girls show an underestimation bias, since their emotional self-concept is lower than the abilities that they show in the ability test. A similar, though opposite, pattern is shown by boys in the adolescent group, who display a stable overestimation bias in meta-emotional knowledge, with an emotional self-concept higher than the abilities that they show in the ability test. The same biases are also observed for meta-emotional self-evaluation, with boys tending to overestimate and girls to underestimate their performance in the ability test.

In general, it seems that neither boys nor girls, with only a very slight difference between the pre-adolescent and adolescent groups, show a proper awareness of their emotional abilities. Indeed, both overestimation and underestimation reflect poor awareness of one's own emotions and may have negative effects on an individual's personal life. Overestimating one's own emotional abilities might lead adolescents to confront situations they are not able to manage; underestimating them might lead adolescents to avoid situations that they would actually be able to handle, reducing their experiences of success and, in general, their self-efficacy.

In our previous study on the relationship between emotional and meta-emotional intelligence and sociometric status (D'Amico & Geraci, 2021), we demonstrated that pre-adolescents with higher levels of ability EI, meta-emotional knowledge, and meta-emotional self-evaluation are more accepted by others, while those who overestimate their emotional abilities are more often rejected by peers. For this reason, we claimed that, for social relationships, the most “dangerous bias” in evaluating one's own emotional abilities is overestimation. In this sense, based on our results, boys might be statistically more at risk of social rejection by peers than girls.

On the other hand, the tendency to underestimate may be just as harmful for girls. Indeed, we know from the literature (Miao et al., 2016) that people who perceive themselves as emotionally intelligent tend to experience general positive affect, such as feeling active, alert, and energetic at any given moment in time, whereas people who perceive themselves as poor in emotional intelligence tend to experience negative affect. Thus, on the basis of our results, girls might be statistically more at risk of negative affect, and this could also help to explain, along with other neurobiological factors, the higher prevalence of anxiety (Jalnapurkar et al., 2018) and depression (Labaka et al., 2018) in females compared to males. Overestimation and underestimation errors might also be a side effect of the different degrees of importance that boys and girls give to emotions in everyday life. Our results on meta-emotional beliefs seem to corroborate this view. Indeed, girls show higher scores than boys, demonstrating that they hold a belief system about emotions that is consistent with current scientific knowledge on emotional intelligence. In other words, girls believe more than boys that emotions count, and the importance given to emotions could lead girls to never consider themselves good enough in the emotional field.

4.1. Limitations and future directions

We are aware that our study presents some limitations that could be overcome in future studies. Firstly, it could be useful to examine sex differences in a wider range of age groups, and in particular in younger children, in order to see whether sex differences are less evident in children than in pre-adolescents. Longitudinal studies in which emotional and meta-emotional intelligence are measured during the transition from childhood to adulthood may give very important insights into their developmental trend. We are also aware that this study is about sex differences and that sex does not always correspond to gender identity and sexual orientation. For instance, a study on emotional intelligence and sexual orientation by Mîndru and Năstasă (2017) found higher levels of both self-reported EI and ability EI in adults with a homosexual orientation compared to those with a heterosexual orientation. It is not clear to what extent this could be related to results about sex differences, and in any case a single study is not enough to draw firm conclusions. Moreover, due to the novelty of the paradigm, there is currently no information about meta-emotional intelligence in people of different sexual orientations. Thus, future studies should focus on differences in the emotional and meta-emotional intelligence of adolescents and adults who differ not only in biological sex but also in gender identity and sexual orientation. We wonder whether gender identity and sexual orientation, more than biological sex, may also predict the size and direction of meta-emotional knowledge and meta-emotional self-evaluation.

A 10 percent increase in the binding minimum wage reduces vacancies by 2.4 pct contemporaneously, and by as much as 4.5 pct a year later; the effect is bigger for occupations with lower educational attainment (high school or less) and in counties with more poverty

Kudlyak, Marianna and Tasci, Murat and Tuzemen, Didem, Minimum Wage Increases and Vacancies (April 21, 2022). FRB of Cleveland Working Paper No. 19-30R, SSRN: http://dx.doi.org/10.2139/ssrn.4089901

Abstract: Using a unique data set and a novel identification strategy we explore the effect of the state-level minimum wage increases on firms' existing and new vacancy postings. Utilizing occupation-specific county-level vacancy data from the Conference Board's Help Wanted Online for 2005-2018, we show that the state-level minimum wage increases lead to substantial declines in existing and new vacancy postings for occupations with a larger share of workers earning around the prevailing minimum wage. Our estimate implies that a 10 percent increase in the binding minimum wage reduces vacancies by 2.4 percent contemporaneously, and could be as large as 4.5 percent a year later. The negative effect on vacancies is more pronounced for occupations where workers generally have lower educational attainment (high school or less) and those in counties with higher poverty rates.
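As a back-of-the-envelope illustration of the headline estimate (assuming, as a simplification, that the percentage effect scales linearly with the size of the minimum wage increase, which is not the paper's exact regression specification; the baseline vacancy count is hypothetical):

```python
# Back-of-the-envelope: implied vacancy decline from the paper's estimates
# (-2.4% contemporaneous, up to -4.5% after a year, per 10% increase in the
# binding minimum wage). Linear scaling is a simplification for illustration.

def implied_vacancies(baseline_vacancies, mw_increase_pct, effect_per_10pct):
    decline_pct = (mw_increase_pct / 10.0) * effect_per_10pct
    return baseline_vacancies * (1 - decline_pct / 100.0)

baseline = 1000  # hypothetical vacancy postings in an affected occupation-county cell
print(implied_vacancies(baseline, 10, 2.4))  # contemporaneous -> 976.0
print(implied_vacancies(baseline, 10, 4.5))  # a year later    -> 955.0
```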

Keywords: Minimum Wage; Vacancies; Hiring; Search and Matching

JEL Classification: E24, E32, J30, J41, J63, J64


3.8 percent of deaths in Mexico are caused by suboptimal temperature (26,000 every year): 92pct of weather-related deaths are induced by cold (below 12 C) or mildly cold (12–20 C) days & only 2 percent by outstandingly hot days (above 32 C)

Cohen, François, and Antoine Dechezleprêtre. 2022. "Mortality, Temperature, and Public Health Provision: Evidence from Mexico." American Economic Journal: Economic Policy, 14 (2): 161-92. DOI: 10.1257/pol.20180594

Abstract: We examine the impact of temperature on mortality in Mexico using daily data over the period 1998–2017 and find that 3.8 percent of deaths in Mexico are caused by suboptimal temperature (26,000 every year). However, 92 percent of weather-related deaths are induced by cold (less than 12 degrees C) or mildly cold (12–20 degrees C) days and only 2 percent by outstandingly hot days (more than 32 degrees C). Furthermore, temperatures are twice as likely to kill people in the bottom half of the income distribution. Finally, we show causal evidence that the Seguro Popular, a universal health care policy, has saved at least 1,600 lives per year from cold weather since 2004.


We hypothesized that restraining the urge to void facilitates honest behavior

Urge to void and dishonest behavior: Evidence from a field experiment. Erez Siniver, Yossef Tobol, Gideon Yaniv. Economics Letters, April 27 2022, 110544. https://doi.org/10.1016/j.econlet.2022.110544

Highlights

• We hypothesized that restraining the urge to void facilitates honest behavior.

• A die-under-the-cup experiment was run in a shopping center with WC entrants and exiters.

• WC entrants cheated significantly less than WC exiters.

• Cheating decreased with the intensity of the urge to void.

Abstract: The present paper reports the results of a study designed to investigate whether restraining the urge to void simultaneously facilitates self-control in the unrelated domain of dishonest behavior. We conducted a field experiment in a big shopping center with passersby who entered or exited the public WC. Participants were recruited by asking WC entrants and exiters if they could spare a few minutes of their time in return for a monetary reward. WC entrants indicated on a short questionnaire the intensity of their urge to void as well as its source (bladder/colon), whereas WC exiters indicated just the latter. All subjects were then invited to perform the Fischbacher and Föllmi-Heusi (2013) die-under-the-cup task, which incentivizes cheating. The results reveal that WC entrants cheated significantly less than WC exiters, supporting the hypothesis that inhibiting visceral responses may spill over to inhibiting simultaneous dishonest behavior, and that cheating decreased with the intensity of the urge to void. No significant connection emerged between cheating and the voiding source.
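For readers unfamiliar with the die-under-the-cup paradigm, the sketch below illustrates how cheating is typically inferred at the group level: individual lies are unobservable because the roll is private, but the excess of payoff-maximizing reports over the 1/6 expected from honest rolling identifies the share of cheaters. The payoff rule and all numbers are hypothetical, not the paper's actual design or data.

```python
# Group-level cheating inference in a die-under-the-cup style task.
# Participants roll privately and self-report, so individual lies are
# unobservable; the excess of payoff-maximizing reports (here, "6") over
# the 1/6 honest benchmark estimates the share of cheaters.
# All numbers and the payoff rule are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n, true_cheater_share = 300, 0.3
honest = rng.integers(1, 7, int((1 - true_cheater_share) * n))  # truthful rolls
liars = np.full(n - len(honest), 6)                             # always report the top payoff
reports = np.concatenate([honest, liars])

observed_top = np.mean(reports == 6)              # share reporting a 6
estimated = (observed_top - 1 / 6) / (1 - 1 / 6)  # excess over the honest benchmark
print(f"share reporting 6: {observed_top:.2f}, estimated cheater share: {estimated:.2f}")
# With these simulated data the estimate recovers roughly the true 0.30.
```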

JEL: C92, K42

Keywords: Urge to void, Self-control, Dishonest behavior, Die-under-the-cup task


Primitive communism: Adam Smith's & Karl Marx’s idea that societies were naturally egalitarian and communal before farming is widely influential and quite wrong

Primitive communism: Marx’s idea that societies were naturally egalitarian and communal before farming is widely influential and quite wrong. Manvir Singh, AEON, Apr 2022. https://aeon.co/essays/the-idea-of-primitive-communism-is-as-seductive-as-it-is-wrong

[Engels's The Origin of the Family, Private Property and the State] is like Yuval Noah Harari’s blockbuster Sapiens (2014) but written by a 19th-century socialist: a sweeping take on the dawn of property, patriarchy, monogamy and materialism. Like many of its contemporaries, it arranged societies on an evolutionary ladder from savagery to barbarism to civilisation. Although wrong in most ways, The Origin was described by a recent historian as ‘among the more important and politically applicable texts in the Marxist canon’, shaping everything from feminist ideology to the divorce policies of Maoist China.

Of the text’s legacies, the most popular is primitive communism. The idea goes like this. Once upon a time, private property was unknown. Food went to those in need. Everyone was cared for. Then agriculture arose and, with it, ownership over land, labour and wild resources. The organic community splintered under the weight of competition. The story predates Marx and Engels. The patron saint of capitalism, Adam Smith, proposed something similar, as did the 19th-century American anthropologist Lewis Henry Morgan. Even ancient Buddhist texts described a pre-state society free of property. But The Origin is the idea’s most important codification. It argued for primitive communism, circulated it widely, and welded it to Marxist principles.

Today, many writers and academics still treat primitive communism as a historical fact. To take an influential example, the economists Samuel Bowles and Jung-Kyoo Choi have argued for 20 years that property rights coevolved with farming. For them, the question is less whether private property predated farming than why it appeared at that time. In 2017, an article in The Atlantic covering their work asserted plainly: ‘For most of human history, there was no such thing as private property.’ A leading anthropology textbook captures the supposed consensus when it states: ‘The concept of private property is far from universal and tends to occur only in complex societies with social inequality.’

Historical narratives matter. In his bestseller Humankind (2019), Rutger Bregman took the fact that ‘our ancestors had scarcely any notion of private property’ as evidence of fundamental human goodness. In Civilized to Death (2019), Christopher Ryan wrote that pre-agricultural societies were defined by ‘obligatory sharing of minimal property, open access to the necessities of life, and a sense of gratitude toward an environment that provided what was needed.’ As a result, he concluded: ‘The future I imagine (on a good day) looks a lot like the world inhabited by our ancestors…’

Primitive communism is appealing. It endorses an Edenic image of humanity, one in which modernity has corrupted our natural goodness. But this is precisely why we should question it. If a century and a half of research on humanity has taught us anything, it is to be sceptical of the seductive. From race science to the noble savage, the history of anthropology is cluttered with the corpses of convenient stories, of narratives that misrepresent human diversity to advance ideological aims. Is primitive communism any different?

[...]

Is primitive communism another seductive but incorrect anthropological myth? On the one hand, no hunter-gatherer society lacked private property. And although they all shared food, most balanced sharing with special rights. On the other hand, living in a society like the Aché’s was a masterclass in reallocation. It’s hard to imagine farmers engaging in need-based redistribution on that scale.

Whatever we call it, the sharing economy that Hill observed with the Aché does not reflect some lost Edenic goodness. Rather, it sprang from a simpler source: interdependence. Aché families relied on each other for survival. We share with you today so that you can share with us next week, or when we get sick, or when we are pregnant. Hill once saw a man fall from a tree and break his hip. ‘He couldn’t walk for three months, and in those three months, he produced zero food,’ Hill said. ‘And you would think that he would have starved to death and his family would have starved to death. But, of course, nothing happened like that, because everybody provisioned him the whole time.’

This is partly about reciprocity. But it’s also about something deeper. When people are locked in networks of interdependence, they become invested in each other’s welfare. If I rely on three other families to keep me alive and get me food when I cannot, then not only do I want to maintain bonds with them – I also want them to be healthy and strong and capable.

Interdependence might seem enviable. Yet it begets a cruelty often overlooked in talk about primitive communism. When a person goes from a lifeline to a long-term burden, reasons to keep them alive can vanish. In their book Aché Life History (1996), Hill and the anthropologist Ana Magdalena Hurtado listed many Aché people who were killed, abandoned or buried alive: widows, sick people, a blind woman, an infant born too soon, a boy with a paralysed hand, a child who was ‘funny looking’, a girl with bad haemorrhoids. Such opportunism suffuses all social interactions. But it is acute for foragers living at the edge of subsistence, for whom cooperation is essential and wasted efforts can be fatal.

Once that need to survive dissipated, even friends could become disposable

Consider, for example, how the Aché treated orphans. ‘We really hate orphans,’ said an Aché person in 1978. Another Aché person was recorded after seeing jaguar tracks:

Don’t cry now. Are you crying because you want your mother to die? Do you want to be buried with your dead mother? Do you want to be thrown in the grave with your mother and stepped on until your excrement comes out? Your mother is going to die if you keep crying. When you are an orphan nobody will ever take care of you again.

The Aché had among the highest infanticide and child homicide rates ever reported. Of children born in the forest, 14 per cent of boys and 23 per cent of girls were killed before the age of 10, nearly all of them orphans. An infant who lost their mother during the first year of life was always killed.

(Since acculturation, many Aché have regretted killing children and infants. In Aché Life History, Hill and Hurtado reported an interview with a man who strangled a 13-year-old girl nearly 20 years earlier. He ‘asked for our forgiveness’, they wrote, ‘and acknowledged that he never should have carried out the task and simply “wasn’t thinking”.’)

Hunter-gatherers shared because they had to. They put food into their bandmates’ stomachs because their survival depended on it. But once that need dissipated, even friends could become disposable.

The popularity of the idea of primitive communism, especially in the face of contradictory evidence, tells us something important about why narratives succeed. Primitive communism may misrepresent forager societies. But it is simple, and it accords with widespread beliefs about the arc of human history. If we assume that societies went from small to big, or from egalitarian to despotic, then it makes sense that they transitioned from property-less harmony to selfish competition, too. Even if the facts of primitive communism are off, the story feels right.

More important than its simplicity and narrative resonance, however, is primitive communism’s political expediency. For anyone hoping to critique existing institutions, primitive communism conveniently casts modern society as a perversion of a more prosocial human nature. Yet this storytelling is counterproductive. By drawing a contrast between an angelic past and our greedy present, primitive communism blinds us to the true determinants of trust, freedom and equity. If we want to build better societies, the way forward is neither to live as hunter-gatherers nor to bang the drum of a make-believe state of nature. Rather, it is to work with humans as they are, warts and all.