Thursday, March 25, 2021

People with schizophrenia, and possibly people with personality disorders in forensic settings, demonstrate a range of reading skills deficits

Reading skills deficits in people with mental illness: A systematic review and meta-analysis. Martina Vanova et al. European Psychiatry, Volume 64 Issue 1, November 3 2020. https://www.cambridge.org/core/journals/european-psychiatry/article/reading-skills-deficits-in-people-with-mental-illness-a-systematic-review-and-metaanalysis/2EE4FD903FF4FA7FED80B8DC62E18E62


Abstract

Background: Good reading skills are important for appropriate functioning in everyday life, scholastic performance, and acquiring a higher socioeconomic status. We conducted the first systematic review and meta-analysis to quantify possible deficits in specific reading skills in people with a variety of mental illnesses, including personality disorders (PDs).

Methods: We performed a systematic search of multiple databases from inception until February 2020 and conducted random-effects meta-analyses.

Results: The search yielded 34 studies with standardized assessments of reading skills in people with one or more mental illnesses. Of these, 19 studies provided data for the meta-analysis. Most studies (k = 27; meta-analysis, k = 17) were in people with schizophrenia and revealed large deficits in phonological processing (Hedges' g = −0.88, p < 0.00001), comprehension (Hedges' g = −0.96, p < 0.00001), and reading rate (Hedges' g = −1.22, p = 0.002), relative to healthy controls; single-word reading was less affected (Hedges' g = −0.70, p < 0.00001). A few studies in affective disorders and nonforensic PDs suggested weaker deficits (for all, Hedges' g < −0.60). In forensic populations with PDs, there was evidence of marked phonological processing (Hedges' g = −0.85, p < 0.0001) and comprehension deficits (Hedges' g = −0.95, p = 0.0003).
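As a reading aid for the effect sizes above: Hedges' g is the standard bias-corrected standardized mean difference between patient and control groups (this is the general definition of the statistic, not a detail specific to this paper's analysis code):

\[
g = J \cdot \frac{\bar{x}_{\mathrm{MI}} - \bar{x}_{\mathrm{HC}}}{s_{p}}, \qquad
s_{p} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}, \qquad
J = 1 - \frac{3}{4(n_1 + n_2) - 9}
\]

Negative values indicate poorer performance in the clinical group; by common convention, |g| of roughly 0.2, 0.5, and 0.8 correspond to small, medium, and large effects, so the pooled deficits reported for phonological processing, comprehension, and reading rate are large.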

Conclusions: Future studies are needed to establish how these deficits directly compare to those seen in developmental or acquired dyslexia and to explore the potential of dyslexia interventions to improve reading skills in these populations.

Discussion

This systematic review and meta-analysis evaluated existing evidence to identify the type and degree of reading impairments in different MIs, the reading assessment tools that might most consistently detect them, and possible differences in the pattern of reading skills deficits in people with different MIs in forensic and nonforensic settings. Most of the reviewed studies (27/34) included people with SZ. There were seven studies of reading skills deficits in people with different MIs (PD or general MI) in forensic settings. Our findings are discussed below.

Effect of diagnosis in nonforensic samples

We observed significant deficits in multiple reading skills in SZ, resembling the pattern typically seen in dyslexia [6], and consistent with previous evidence for shared genetic and psychophysiological traits in SZ and dyslexia [7]. In our meta-analysis, both phonological processing and comprehension were greatly impaired. These impairments may be associated with ineffective use of contextual information [91] and contribute to poor speech in SZ, especially in close association with thought disorder [92]. Reading rate was markedly reduced, but the deficit in reading accuracy was smaller. This indicates relatively preserved single-word reading skills, most likely because they are usually acquired before illness onset and remain intact [47]. In contrast, there was evidence for impairments in vocabulary and spelling, presumably as a result of disrupted scholastic experience. Disrupted scholastic experience during adolescence can affect complex skills such as comprehension [44,45,47], which could precipitate difficulties with processing complex written information in SZ. People with SZ showed reading skills well below their achieved education level (see Education). Reading skills deficits in SZ also do not seem to be explained by other aspects of cognition (see Cognitive Function), although more comprehensive investigations are needed to substantiate this. Our findings (Symptoms and Medication) further indicated that while symptoms and high antipsychotic doses may worsen reading skills, they do not fully explain the profile of reading skills deficits in SZ. Impairment in comprehension and vocabulary was present even before the onset of symptoms [44,45], together with deficient phonological processing, which has been related to disrupted visual processing in SZ from an early age [21]. The symptoms can, however, aggravate deficits in reading skills, such as comprehension, which are acquired with experience and also depend on earlier acquired skills [93]. Recent data [94] suggest that some aspects of language production (e.g., slower articulation) that can affect reading skills assessments are particularly sensitive to dopamine-D2 receptor blocking antipsychotics. Furthermore, most studies in SZ included more men than women, or men only, and also included people with schizoaffective disorder. Further studies need to comprehensively examine specific reading skills in both men and women with schizophrenia and schizoaffective disorder (separately) while taking medication, symptoms, cognition, education, and socioeconomic status into account.

Unlike people with SZ and psychosis [51,58,65], those with nonpsychotic bipolar disorder and affective disorders seemed to have comprehension and single-word reading skills comparable to HC [30,47]. Although not all studies specified the type of PD, it seems that reading skill deficits may not be as prominent in nonforensic psychopathy as in SZ.

Effect of diagnosis in forensic samples

Our findings suggest only a weak or no deficit in nonforensic psychopathy but indicate a marked phonological processing and comprehension deficit in the incarcerated group. It is possible that PD/psychopathic individuals with good phonological processing and comprehension are more able to evade incarceration [30,95]. Nonetheless, marked reading deficits in the incarcerated group may have contributed to their poor adjustment within the community [27], which, in turn, increased the risk of incarceration. Men with MIs within forensic settings had significantly lower general reading abilities and spelling than women with MIs [27], consistent with the pattern seen in healthy samples [22].

Clinical implications

Comprehension has a significant influence on decision-making capacity in SZ [96], and this is likely to be true also for people with other MIs, especially within forensic populations. Dyslexia is often underdiagnosed in people with MIs, and this might explain their inability to complete higher education and obtain jobs [15], or the expression of socially unacceptable behaviors [27]. Furthermore, progression and engagement in therapeutic activities within mental health services often depend on good reading and language skills. This highlights a need to accurately identify reading deficits and develop specific programs to improve reading skills of people in psychiatric services. It may be possible to target reading deficits in SZ and other MIs by building on the less affected aspects, such as lexical knowledge (access to words) [97,98], and access to familiar information that can compensate for some of the reading deficits [99], while implementing interventions to ameliorate reading skills [100].

Effect of assessments

Significant between-test differences were found only in tests detecting deficits in comprehension, accuracy, and rate in SZ. For comprehension and rate, the NDRT and GORT-4 consistently detected large deficits, and for accuracy only the GORT did, while the Alouette (French) test detected no deficits (Figure 2). It is conceivable that certain deficits emerge more often, or more strongly, in English than in some other languages, as is the case in developmental dyslexia [101]. This possibility requires further study.

Promoting a message that highlights threat to life may be effective in raising levels of adherence to measures of infection control, but may also lead to a reduction in health-promoting behaviours

Brown, Richard D., Lynne Coventry, and Gillian V. Pepper. 2020. “COVID-19: The Relationship Between Perceptions of Risk and Behaviours During Lockdown.” OSF Preprints. July 30. doi:10.31219/osf.io/dwjvy.

Abstract

Background: Understanding COVID-19 risk perceptions and their impact on behaviour can improve the effectiveness of public health strategies in the future. Prior evidence suggests that, when people perceive uncontrollable risks to their health, they are less likely to make efforts to protect their health in those ways which they can control (e.g. through diet, exercise, and limiting alcohol intake). It is therefore important to understand the extent to which the threat of COVID-19 is perceived to be an uncontrollable risk, and to assess whether this perceived risk is associated with differences in health behaviour.

Methods: We surveyed a nationally representative sample of 496 participants, shortly after the peak of the pandemic in the UK. We collected data to assess people’s perceptions of COVID-19-related risk, and how these perceptions were associated with behaviours. We examined self-reported adherence to behaviours recommended by the UK Government and National Health Service to prevent the spread of the virus, as well as more general health behaviours. We predicted that increased perceived extrinsic mortality risk (the portion of a person’s mortality risk which they perceive to be uncontrollable) would disincentivise healthy behaviour.

Results: Perceived threat to life was found to be the most consistent predictor of reported adherence to measures designed to prevent the spread of infection. Perceived extrinsic mortality risk was found to have increased due to the pandemic, and was also associated with lower reported adherence to Government advice on diet and physical activity, as well as smoking.

Conclusions: Our findings suggest that promoting a message that highlights threat to life may be effective in raising levels of adherence to measures of infection control, but may also have unintended consequences, leading to a reduction in health-promoting behaviours. We suggest that messages that highlight threat to life should be accompanied by statements of efficacy, and that messages evoking feelings of concern for others may also be effective in promoting compliance with anti-infection measures.


No convincing evidence outgroups are denied uniquely human characteristics: Distinguishing intergroup preference from trait-based dehumanization

No convincing evidence outgroups are denied uniquely human characteristics: Distinguishing intergroup preference from trait-based dehumanization. Florence E. Enock et al. Cognition, Volume 212, July 2021, 104682, https://doi.org/10.1016/j.cognition.2021.104682

Highlights

• The dual model predicts outgroups are attributed human traits to a lesser extent.

• To date, predominantly desirable traits have been investigated, creating a confound.

• We test attributions of desirable and undesirable human traits to social groups.

• Attributions of undesirable human traits were stronger for outgroups than ingroups.

• We find no support for the predictions of the dual model of dehumanization.

Abstract: According to the dual model, outgroup members can be dehumanized by being thought to possess uniquely and characteristically human traits to a lesser extent than ingroup members. However, previous research on this topic has tended to investigate the attribution of human traits that are socially desirable in nature such as warmth, civility and rationality. As a result, it has not yet been possible to determine whether this form of dehumanization is distinct from intergroup preference and stereotyping. We first establish that participants associate undesirable (e.g., corrupt, jealous) as well as desirable (e.g., open-minded, generous) traits with humans. We then go on to show that participants tend to attribute desirable human traits more strongly to ingroup members but undesirable human traits more strongly to outgroup members. This pattern holds across three different intergroup contexts for which dehumanization effects have previously been reported: political opponents, immigrants and criminals. Taken together, these studies cast doubt on the claim that a trait-based account of representing others as ‘less human’ holds value in the study of intergroup bias.

Keywords: Dehumanization; Intergroup bias; Prejudice; Social cognition

7. General discussion

In this paper, we question the central claims of one of the most prominent psychological accounts of dehumanization - the dual model - which holds that outgroup members are perceived as lesser humans than ingroup members by being denied human specific traits (Haslam, 2006). We first revisited work relating to how the lay concept of ‘human’ is best characterised. We then tested its predictions about outgroup dehumanization in a series of seven experiments. Our results present a serious empirical challenge to the dual model.

The dual model argues that there are two senses of humanness: human uniqueness and human nature. Uniquely human traits can be summarised as civility, refinement, moral sensibility, rationality, and maturity. Human nature traits can be summarised as emotional responsiveness, interpersonal warmth, cognitive openness, agency, and depth (Haslam, 2006). However, the traits that supposedly characterise 'humanness' within this model are broadly socially desirable (Over, 2020a; Over, 2020b). We showed that people also associate some undesirable traits with the concept 'human'. As well as considering humans to be refined and cultured, people also consider humans to be corrupt, selfish and cruel.

Results from our pretest provided us with grounds for re-examining predictions made by the dual model of dehumanization about the nature of intergroup bias in trait attributions. The dual model account holds that lesser attribution of human specific traits to outgroup members represents a psychological process of dehumanization that is separable from ingroup preference. However, as the human specific attributes summarised by the model are positive and socially desirable, it is possible that previous findings are better explained in terms of ingroup preference, the process of attributing positive qualities to ingroup members to a greater extent than to outgroup members.

In seven highly powered experiments, we tested the predictions of the dual model against this alternative. We pitted the two hypotheses against each other by comparing attributions of uniquely human traits that varied in whether they were socially desirable or undesirable to ingroup and outgroup members. The dual model holds that subtle dehumanization is evidenced by denying outgroup members uniquely human traits relative to ingroup members. We reasoned that whereas outgroup members may be denied desirable human traits, they are likely to be attributed undesirable human traits to a greater extent than ingroup members.

Across three distinct intergroup contexts, we found no evidence for either animalistic or mechanistic dehumanization of outgroup members. Instead, we found strong and reliable intergroup preference effects. Desirable traits were ascribed more strongly to ingroup members than outgroup members and undesirable traits more strongly to outgroup members than ingroup members, irrespective of perceived humanness.

A possible defence of the dual model account could be to argue that we chose three intergroup contexts in which animalistic and mechanistic dehumanization does not occur. However, we chose to investigate judgements of political opponents, immigrants and criminals specifically because previous research has suggested that they are dehumanized on a range of measures (Banton et al., 2020; Bastian et al., 2013; Markowitz & Slovic, 2020; Pacilli et al., 2016; Viki et al., 2013). In addition, we also showed in every experiment that outgroup members were explicitly rated as less human than were the ingroup on the blatant dehumanization scale (Kteily et al., 2015). Prior work shows that measures of animalistic and mechanistic dehumanization correlate positively with blatant dehumanization scores (Kteily et al., 2015). Though they are not claimed to measure the same construct, they have been shown to reliably co-occur. These findings confirm that these are the sorts of intergroup contexts in which we would expect to see trait-based dehumanization should the process occur.

We acknowledge that without testing all possible intergroup contexts, it remains a possibility that some outgroups could be denied human specific attributes relative to ingroups even when valence is appropriately controlled for. In other words, it could be the case that trait-based dehumanization occurs independently of ingroup preference in some social settings. It may be particularly interesting for future research to investigate intergroup contexts that are not so strongly associated with competition, threat and animosity.

However, the possibility that some, as yet untested, groups may be denied human unique attributes does not detract from the importance of our critique. To accurately measure trait-based dehumanization in future research, studies must consider the central role of valence. Prior work utilising the dual model framework has reported dehumanization to be extremely widespread in society, affecting not just marginalised groups but doctors, patients and even cyclists (Delbosc, Naznin, Haslam, & Haworth, 2019; Haslam & Stratemeyer, 2016). Rigorous measurement and tighter experimental control may change some or all of the conclusions from previous research.

Across our experiments, we observed strong intergroup preference effects, with desirable traits more strongly ascribed to the ingroup and undesirable traits more strongly to the outgroup. Our results demonstrate both ingroup favouritism (assigning greater positivity to the ingroup) and outgroup derogation (assigning greater negativity to the outgroup) (Brewer, 1999; Hewstone et al., 2002). However, we also suggest that group specific stereotypes are likely to play an important role in these processes. In many social contexts, trait attributions may reflect social stereotyping as well as intergroup preferences (Fiske et al., 2002). For example, previous work suggested that Anglo-Australians were 'animalistically' dehumanized both by themselves and by Ethnic-Chinese participants, whilst Ethnic-Chinese people were 'mechanistically' dehumanized both by themselves and by Anglo-Australians (Bain et al., 2009). These effects may be more compatible with stereotype content than with trait-based dehumanization. Future work would benefit from addressing the distinction between stereotyping and trait-based dehumanization.

An outstanding question relates to whether other psychological models of dehumanization more accurately capture the ways in which different social groups are perceived. For example, infrahumanisation theory predicts that people tend to believe ingroup members experience uniquely human emotions more strongly than do outgroup members (Leyens et al., 2000; Leyens et al., 2001). It would be valuable for future research to examine the utility of this theory by testing whether participants perceive ingroup members to experience human emotions more strongly overall or whether they perceive ingroup members to experience prosocial emotions more strongly but outgroup members to experience antisocial emotions more strongly. Further work could helpfully investigate how these findings bear on the claim that outgroups are sometimes dehumanized by being denied mental states (Harris & Fiske, 2006).

Taken together, our studies suggest that the dual model does not accurately characterise the ways in which outgroups are perceived in at least the social contexts examined here – political groups, immigrants and criminals. Prejudice and discrimination are pressing social problems. If psychological research is to contribute to the interdisciplinary mission to reduce prejudice and encourage more egalitarian behaviour, then it must start by accurately characterising the psychological biases underlying discriminatory behaviour. We suggest that the dual model of dehumanization conflates apparent evidence for dehumanization with ingroup preference. As a result, it may obscure more than it reveals about the psychology of intergroup bias.

Women were much more likely to report long-term mate attraction to the benevolent than to the hostile sexist; the hostile sexist generated greater distrust and dislike in both men & women, compared to the benevolent one

Grab 'em by the...: Hostile and benevolent sexism as signals of mating strategies. Adair, L. & Ferenczi, N. European Human Behaviour and Evolution Association, 15th Conference, Mar 2021. https://ehbea2021.com/

Abstract

Objective: Men’s hostile sexism is associated with many negative interpersonal consequences (e.g., lower relationship quality). Men’s benevolent sexism does not seem to similarly promote negative behaviours and interpersonal interactions but is still associated with harmful attitudes and stereotypes (e.g., endorsement of rape myths). Recent work finds that disclosures of benevolent sexism are associated with perceived attractiveness and provisioning qualities, even though they are perceived as patronising. Our work is designed to apply evolutionary and feminist perspectives to investigate this further – is sexism used to infer mating-relevant qualities, mate attractiveness, and mating strategy?

Methods: Using a nationally representative sample (N = 317; Female = 50%), participants were randomly assigned to evaluate descriptions of men varying along two criteria, sexism (hostile vs. benevolent) and social prestige (with vs. without prestige criteria).

Results: 2×2 between-subjects ANOVAs highlighted several main effects: benevolent sexists were rated higher on long-term mate qualities, hostile sexists were rated higher on short-term mate qualities, and hostile sexists were rated as "more sexist" overall. Interaction effects indicated that having prestige decreased how "sexist" hostile sexists were perceived; the same was not true for benevolent sexists.

Conclusions: As predicted, disclosures of benevolent sexism were used to infer long-term mating qualities, while disclosures of hostile sexism were used to infer short-term mating qualities. We found that women were much more likely to report long-term mate attraction to the benevolent, compared to the hostile, sexist. The hostile sexist generated greater distrust and dislike in both men and women, compared to the benevolent sexist.


The trophic level of the Homo lineage that most probably led to modern humans evolved from a low base to a high, carnivorous position during the Pleistocene; the trend partially reversed later & with the advent of agriculture

The evolution of the human trophic level during the Pleistocene. Miki Ben‐Dor, Raphael Sirtoli, Ran Barkai. American Journal of Physical Anthropology, March 5, 2021. https://doi.org/10.1002/ajpa.24247

Abstract: The human trophic level (HTL) during the Pleistocene and its degree of variability serve, explicitly or tacitly, as the basis of many explanations for human evolution, behavior, and culture. Previous attempts to reconstruct the HTL have relied heavily on an analogy with recent hunter‐gatherer groups' diets. In addition to technological differences, recent findings of substantial ecological differences between the Pleistocene and the Anthropocene cast doubt regarding that analogy's validity. Surprisingly little systematic, evolution‐guided evidence has served to reconstruct the HTL. Here, we reconstruct the HTL during the Pleistocene by reviewing evidence for the impact of the HTL on the biological, ecological, and behavioral systems derived from various existing studies. We adapted a paleobiological and paleoecological approach, including evidence from human physiology and genetics, archaeology, paleontology, and zoology, and identified 25 sources of evidence in total. The evidence shows that the trophic level of the Homo lineage that most probably led to modern humans evolved from a low base to a high, carnivorous position during the Pleistocene, beginning with Homo habilis and peaking in Homo erectus. A reversal of that trend appears in the Upper Paleolithic, strengthening in the Mesolithic/Epipaleolithic and Neolithic, and culminating with the advent of agriculture. We conclude that it is possible to reach a credible reconstruction of the HTL without relying on a simple analogy with recent hunter‐gatherers' diets. The memory of an adaptation to a trophic level that is embedded in modern humans' biology in the form of genetics, metabolism, and morphology is a fruitful line of investigation of past HTLs, whose potential we have only started to explore.


7 DISCUSSION

7.1 Summary of the evidence

The actual HTL during the Pleistocene is unobservable; therefore, we looked for evidence of the impact of the HTL on the studied humans' biological, behavioral, and ecological systems. Observing only its reflection and not the HTL itself, we are left employing varying degrees of interpretation in forming an opinion about what HTL caused a specific impact and whether it denotes a step toward specialization or generalization. We reviewed 25 different evidence sources, 15 of which are biological and the remainder archaeological, paleontological, zoological, and ethnographic. With a list of 25 evidence items, by necessity, the descriptions of the findings and the justification of the various interpretations cannot be thorough, leaving much room for further review and debate. One of the main aims of this article was to present the power of paleobiology in particular and to cast a wide net of scientific fields in general in investigating HTL. The evidence here is by no means comprehensive. There is additional genetic and physiological evidence for biological adaptations to a higher and lower trophic level in the past and additional evidence for Paleolithic HTLs in other fields of science, which we chose not to include, and others that we probably missed. Thus, although we do not shy away from presenting a clear hypothesis based on the evidence, the article should be looked at as a first iteration in what hopefully will be a continuous scientific endeavor.

The presented evidence can be organized into three epistemological groups with potentially increasing levels of validity regarding a specific trophic level (see text or Table 2 for a description of the evidence):

TABLE 2. Paleolithic human trophic level: Summary of results

Biology

Bioenergetics: Humans had high energetic requirements for a given body mass and had a shorter time during the day to acquire and consume food. Hunting provides a tenfold higher energetic return per hour compared to plants, leaving little room for flexibility (plasticity) between the two dietary components. Animals tend to specialize in the highest-return items in their niche. (Specialization)

Diet quality: In primates, a larger brain is associated with high energy density food. With the largest brain among primates, humans are likely to have targeted the highest density food, animal fats and proteins. Brain size declined during the terminal Pleistocene and subsequent Holocene; diet quality declined at the same time with the increased consumption of plants. (Specialization)

Higher fat reserves: With much higher body fat reserves than primates, humans are uniquely adapted to lengthy fasting. This adaptation may have helped with overcoming the erratic encountering of large prey. (Change in trophic level)

Genetic and metabolic adaptation to a high‐fat diet: Humans adapted to higher fat diets, presumably from animals. One study (Swain‐Lenz et al., 2019) "suggests that humans shut down regions of the genome to accommodate a high‐fat diet while chimpanzees open regions of the genome to accommodate a high sugar diet". (Change in trophic level)

FADS‐Omega3 oils metabolism: Genetic changes in the FADS gene in African humans 85 Kya allowed for a slight increase in the conversion of the plant DHA precursor to DHA, signaling adaptation to a higher plant diet. (Change in trophic level)

Late adaptations to tubers' consumption: Tubers were assumed to be a mainstay of Paleolithic diets that cooking could prepare for consumption. Recent groups that consume high quantities of tubers have specific genetic adaptations to deal with toxins and antinutrients in tubers. Other humans are not well adapted to consume large quantities of tubers. (Change in trophic level)

Stomach acidity: Higher stomach acidity is found in carnivores to fight meat‐borne pathogens. Humans' stomach acidity is even higher than in carnivores, equaling that of scavengers. The adaptation may have evolved to allow large animals' consumption in a central place over days and weeks with pathogen build‐up. (Categorization to a trophic group)

Insulin resistance: Humans, like carnivores, have a low physiological (non‐pathological) insulin sensitivity.

Gut morphology: Humans' gut morphology and relative size are radically different from chimpanzees' gut. Longer small intestines and shorter large intestines are typical of carnivores' gut morphology and limit humans' ability to extract energy from plants' fiber. (Specialization)

Mastication: Reduction of the masticatory system size already in Homo erectus, compared to early hominins, who relied on terrestrial vegetation as a food source. The reduced size is compatible with meat and fat consumption (Aiello & Wheeler, 1995; Zink & Lieberman, 2016). (Change in trophic level)

Cooking: Cooking was hypothesized to have enabled high consumption of plants despite the need for a high‐quality diet, starting with H. erectus. Other researchers argue that habitual use of fire is evident only around 0.4 Mya. Also, fire has other uses and is costly to maintain. (Change in trophic level)

Postcranial morphology: A set of adaptations for endurance running, useful in hunting, is found already in H. erectus, as are shoulders adapted to spear‐throwing, but tree-climbing capability is limited. (Specialization)

Adipocyte morphology: Similar to the morphology in carnivores. "These figures suggest that the energy metabolism of humans is adapted to a diet in which lipids and proteins rather than carbohydrates, make a major contribution to the energy supply" (Pond & Mattacks, 1985). (Categorization to a trophic group)

Age at weaning: Carnivores wean at a younger age, as do humans. Early weaning "highlight[s] the emergence of carnivory as a process fundamentally determining human evolution" (Psouni et al., 2012). (Categorization to a trophic group)

Longevity: Kaplan et al. (2007) hypothesized that a large part of the group depended on experienced hunters due to long childhood. Extended longevity in humans evolved to allow utilization of hunting proficiency, which peaks by age 40. The grandmother hypothesis claims that women's longevity allowed additional gathering. (Change in trophic level)

Vitamins: The hypothesis that nutritional diversity is required to supply vitamins is contested. It appears that all vitamins, including vitamin C, are supplied in adequate quantities on a carnivorous diet. (Neutral)

Multicopy AMY1 genes: Multiple copies of the AMY1 gene have been hypothesized as adaptive to high starch diets. However, findings of its possible lack of functionality and the unclear timing of its appearance limit the use of this evidence to support a change in trophic level. (Change in trophic level)

Archaeology

Plants: Plants were consumed throughout the Paleolithic, but their relative dietary contribution is difficult to assess. Recent advances in the identification of plant residues in dental plaque provide non‐quantitative evidence of widespread plant consumption. Division of labor may point to a background level of plant supply, but the evidence is based largely on ethnography, which may not be analogous to the Pleistocene. (Inconclusive)

Stone tools: Stone tools specific to plant food utilization appear only some 40 Kya, and their prevalence increases just before the appearance of agriculture, signaling increased plant consumption toward the end of the Paleolithic. (Change in trophic level)

Zooarchaeology: First access to large prey, denoting hunting, appears already in H. erectus archaeological sites 1.5 Mya. Humans also hunted large carnivores. (Change in trophic level)

Targeting fat: Humans concentrated on hunting fatty animals at substantial energetic costs. They preferentially brought fatty parts to base camps, hunted fattier prime adults, and exploited bone fat. That behavior may indicate that plants could not have been easily obtained to complete constrained protein consumption. (Specialization)

Stable isotopes: Nitrogen (15N) isotope measurement of human fossil collagen residues is the most extensively used method for determining trophic level in the last 50 thousand years. All studies show that humans were highly carnivorous until very late, before the appearance of agriculture. The method has some shortcomings but was able to identify variation in trophic level among present‐day traditional groups. (Categorization to a trophic group)

Dental pathology: Dental caries, evidence of substantial consumption of carbohydrates, appeared only some 15 Kya in groups with evidence of high plant food consumption. (Change in trophic level)

Dental wear: Differential wear on fossilized teeth as a result of different diets has the potential to reconstruct diets. However, the claim for the reconstruction of variable diets in Paleolithic humans could not be verified, as the diets of the compared groups were unclear. (Inconclusive)

Behavioral adaptations: A comparison of humans' behavior patterns with chimpanzees and social carnivores found that humans have carnivore‐like behavior patterns. Food sharing, alloparenting, division of labor, and social flexibility are among the shared carnivorous behaviors. (Categorization to a trophic group)

Others

Paleontological evidence: Evidence for hunting by H. erectus 1.5 Mya was associated with the extinction of several large carnivores, but not smaller carnivores, suggesting that H. erectus became a member of the large carnivores' guild. The extinction of large carnivores in Europe also coincided with the arrival of humans. Extinctions of large herbivores were associated with humans' presence in Africa and with their arrival on continents and islands, such as Australia and America, suggesting preferential hunting of large prey. (Categorization to a trophic group)

Zoological analogy: Humans were predators of large prey. In carnivores, predation on large prey is exclusively associated with hypercarnivory, i.e., consuming over 70% of the diet from animals. (Categorization to a trophic group)

Ethnography: Variability in trophic level in the ethnographic context is frequently mentioned as proof of HTL variability during the Paleolithic. However, ecological and technological considerations limit the analogy to the terminal Paleolithic. (Change in trophic level)

7.2 Type 1. Change in trophic level

  • Higher fat reserves
  • Adaptations to a high‐fat diet
  • Late adaptation to USO
  • FADS—omega 3 metabolism
  • Amy1 multicopy
  • Longevity—Fathers
  • Longevity—Mothers
  • Dental pathology
  • Cooking
  • Zooarchaeology
  • Stone tools
  • Ethnography

This group includes evidence for physiological or behavioral adaptations to acquire and consume higher or lower amounts of either animal- or plant-sourced food. There is no argument that the evolution of the genus Homo was associated with increasing HTLs in H. habilis and further in H. erectus; therefore, it is not surprising that most of this type of evidence in the Lower Paleolithic is evidence for adaptation to carnivory. Detailing these pieces of evidence may thus appear to be superfluous; however, identifying a trend in the number and timing of the acquired adaptations may supply important indications of a change in trophic level. The accumulation of Type 1 evidence in the Late UP supports a significant change to a lower HTL in that period. Also, evidence present in one period and not in another can be interpreted as evidence for a different HTL in the latter. For example, if we accept that the appearance of the AMY1 gene multicopy sometime in H. sapiens evolution suggests a higher consumption of starch, we have to accept that there was no pressure in prior species to adapt to high consumption of starch. The same logic applies to the appearance of grain-handling stone tools and of dental pathology, both of which appear only toward the end of the Paleolithic.

7.3 Type 2. Specialization—Reduced flexibility

  • Bioenergetics
  • Diet quality
  • Gut morphology
  • Mastication
  • Postcranial morphology
  • Targeting fat

Since we cannot observe our subjects, evidence for specialization is defined here as evidence that is similar to Type 1 but that, at the same time, potentially reduces the phenotypic plasticity of humans by hindering the acquisition or assimilation of the other food group (plant or animal).

Specialization and generalization must be defined with reference to particular axes such as temperature, habitat, and feeding (Futuyma & Moreno, 1988). Pineda‐Munoz and Alroy (2014) defined feeding specialization as selecting 50% or more of the food from a certain food group (fruits, seeds, green plants, insects, or vertebrates). By this definition, humans could be called specialists if they selected to consume 50% or more of their diet from vertebrates or another group of plants or invertebrates.

Another axis on which specialization can be defined is prey size. Large carnivores specialize in large prey. Evidence for humans' specialization in large herbivores can contribute to the continuing debate regarding humans' role in megafauna extinction (Faith et al., 2020; Faurby et al., 2020; Sandom et al., 2014; F. A. Smith et al., 2018; Werdelin & Lewis, 2013) and the implications of megafauna extinction on humans. Potts et al. (2018) identified an association between prey size decline during the Middle Pleistocene and the appearance of the MSA, and Ben‐Dor et al. (2011) further proposed that the extinction of elephants in the Southern Levant led to the appearance of the Acheulo‐Yabrudian cultural complex 420 Kya. Ben‐Dor and Barkai (2020a) have argued that humans preferred to acquire large prey and that large prey is underrepresented in zooarchaeological assemblages (Ben‐Dor & Barkai, 2020b). Listed in Table 3 is evidence, among the ones collected here, that can be interpreted as supporting humans' specialization in acquiring large prey.

TABLE 3. Specialization in the acquisition of large prey

Bioenergetics: Large prey provides higher energetic returns than smaller prey. Replacing large prey with smaller prey is energetically costly.

Higher fat reserves: Large prey is less abundant than smaller prey. Fat reserves may have evolved to allow extended fasting of several weeks, thereby bridging an erratic encounter rate with large prey. Humans have adapted to easily synthesize ketones to replace glucose as an energy source for the brain.

Stomach acidity: Stronger acidity than carnivores' can be interpreted as an adaptation to a large prey's protracted consumption over days and weeks, whereby humans act as scavengers of their own prey.

Targeting fat: The recognition of targeting fat as a driver of human behavior supports the importance of large, higher-fat-bearing animals to humans' survival.

Stable isotopes: Higher levels of the nitrogen isotope 15N than in carnivores were interpreted by researchers as testifying to a higher consumption of large prey than by other carnivores.

Paleontology: A decline in the guild of large-prey carnivores 1.5 Mya is interpreted as resulting from humans' entrance into the guild. Also, the extinction of large prey throughout the Pleistocene is interpreted by some researchers as anthropogenic, testifying to humans' preference for large prey.

Zoological analogy: Large social carnivores hunt large prey.

Ethnography: Interpreting ethnographic and Upper Paleolithic technologies as an adaptation to the acquisition of smaller prey implies that humans were less adapted to the acquisition of smaller prey in earlier periods.

Dietary plasticity is a function of phenotypic plasticity (Fox et al., 2019) and of technological and ecological factors. In many cases, evolution is a process of solving problems with trade‐offs (Garland, 2014). Identifying features that were traded off in specific adaptations could inform us of changing dietary phenotypic plasticity levels. Relative energetic returns on primary (plant) and secondary (animal) producers are key to assessing plasticity's ecological potential. In humans, technology can expand plasticity by enabling and increasing the energetic return on the acquisition of certain food items. Bows for the hunting of smaller, faster prey and grinding stones are two examples of such technologies.

We mainly listed specialization adaptations that affected phenotypic plasticity, but there is also technological and ecological evidence of changes in dietary plasticity, such as the invention of bows, which increased plasticity regarding prey size, and the appearance and disappearance of savannas, with the accompanying change in primary-to-secondary production ratios, which swayed plasticity toward primary or secondary producers.

7.4 Type 3. Categorization to a trophic group

  • Stomach acidity
  • Adipocyte morphology
  • Age at weaning
  • Stable isotopes
  • Behavioral adaptations
  • Paleontologic evidence
  • Zoological analogy
  • Insulin resistance

All eight pieces of evidence bearing on membership in a trophic group indicated that humans were carnivores. Assigning humans to a specific dietary trophic group has the highest potential validity, as it answers the research question with minimal interpretation.

In some cases, interpretation is required to assign a phenomenon to an HTL. Belonging to the carnivores' trophic group still does not tell us whether humans were 90% or 50% carnivorous. It does tell us, however, that humans were carnivorous enough, and for long enough, to justify physiological and behavioral adaptations unique to carnivores. Following the zoological analogy with large social carnivores that acquire large prey, we hypothesized that humans were hypercarnivores, defined as consuming more than 70% of the diet from animal sources.

When Tonight Is Not the Night: Sexual Rejection Behaviors and Satisfaction in Romantic Relationships

When Tonight Is Not the Night: Sexual Rejection Behaviors and Satisfaction in Romantic Relationships. James J. Kim et al. Personality and Social Psychology Bulletin, March 11, 2020. https://doi.org/10.1177/0146167220907469

Abstract: In most long-term romantic relationships, partners experience sexual conflicts of interest in which one partner declines the other partner’s sexual advances. We investigated the distinct ways people reject a partner’s advances (i.e., with reassuring, hostile, assertive, and deflecting behaviors) in Studies 1 and 2. Using cross-sectional (Study 3) and daily experience methods (Study 4), we investigated how perceptions of a partner’s rejection behaviors are linked with the rejected partner’s relationship and sexual satisfaction. We found robust evidence that perceived partner reassuring behaviors were associated with greater satisfaction, whereas perceived partner hostile behaviors were associated with lower levels of satisfaction. Perceived partner responsiveness was a key mechanism underlying the effects. Findings for assertive and deflecting behaviors were limited, but the effect of deflecting behaviors was qualified by levels of hostile behaviors for sexual satisfaction. Findings provide the first empirical investigation of the specific ways partners can decline one another’s advances to preserve satisfaction.

Keywords: sexual rejection, satisfaction, close relationships, sexual communication, responsiveness


Individuals in the top quartile of neuroticism had higher risk of all-cause dementia, but not fronto-temporal dementia

Is Neuroticism Differentially Associated with Risk of Alzheimer’s Disease, Vascular Dementia, and Frontotemporal Dementia? Antonio Terracciano et al. Journal of Psychiatric Research, March 25 2021. https://doi.org/10.1016/j.jpsychires.2021.03.039

Rolf Degen's take: Neurotics are at greater risk of succumbing to dementia

Abstract: This study examines whether neuroticism is differentially associated with risk of incident Alzheimer’s disease (AD), vascular dementia (VD), and frontotemporal dementia (FTD) using a prospective study design. Participants from the UK Biobank (N = 401,422) completed a self-report neuroticism scale in 2006-2010, and incident all-cause dementia, AD, VD, and FTD were ascertained using electronic health records or death records up to 2018. During an average follow-up of 8.8 years (3,566,123 person-years), there were 1,798 incident cases of all-cause dementia, 675 of AD, 376 of VD, and 81 of FTD. Accounting for age and sex, compared to individuals in the low quartile, individuals in the top quartile of neuroticism had higher risk of all-cause dementia (HR = 1.70; 95% CI: 1.49-1.93), AD (HR = 1.42; 1.15-1.75), and VD (HR = 1.73; 1.30-2.29), but not FTD (HR = 0.89; 0.49-1.63). The associations with AD and VD were attenuated but remained significant after further accounting for education, household income, deprivation index, diabetes, hypertension, stroke, heart attack, ever smoker, physical activity, obesity, hemoglobin A1c, C-reactive protein, and low-density lipoprotein. The associations were not moderated by socioeconomic status. The findings were consistent in analyses that excluded cases that occurred within the first 5 years of follow-up. In conclusion, neuroticism is a robust predictor of incident AD and VD, but not FTD. This pattern suggests that the affective symptoms that distinguish dementia types may partly reflect premorbid differences in trait neuroticism.
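As a reading aid (the abstract does not name the model, but hazard ratios of this kind are conventionally estimated with Cox proportional-hazards regression), the reported HRs compare the instantaneous risk of dementia between neuroticism quartiles:

\[
\mathrm{HR} = \frac{h_{\text{top quartile}}(t)}{h_{\text{low quartile}}(t)} = e^{\beta},
\]

so HR = 1.70 for all-cause dementia means that, at any given follow-up time and conditional on the covariates in the model, individuals in the top neuroticism quartile had an estimated 70% higher hazard of dementia than those in the low quartile (95% CI: 49% to 93% higher).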

Keywords: Personality; Alzheimer’s disease; Pick’s disease; Emotional distress; Neurodegenerative disease


Wednesday, March 24, 2021

Cross-cultural differences in strategic risky decisions in baseball among professional baseball teams in North America and Japan: European American participants preferred high risk-high payoff strategies

Swinging for the Fences Versus Advancing the Runner: Culture, Motivation, and Strategic Decision Making. Roxie Chuang et al. Social Psychological and Personality Science, March 19, 2021. https://doi.org/10.1177/1948550621999273

Abstract: This research investigated cross-cultural differences in strategic risky decisions in baseball—among professional baseball teams in North America and Japan (Study 1) and among baseball fans in the United States and Japan (Study 2—preregistered). Study 1 analyzed archival data from professional baseball leagues and demonstrated that outcomes reflecting high risk-high payoff strategies were more prevalent in North America, whereas outcomes reflecting low risk-low payoff strategies were more prevalent in Japan. Study 2 investigated fans’ strategic decision making with a wider range of baseball strategies as well as an underlying reason for the difference: approach/avoidance motivational orientation. European American participants preferred high risk-high payoff strategies, Japanese participants preferred low risk-low payoff strategies, and this cultural variation was explained by cultural differences in motivational orientation. Baseball, which exemplifies a domain where strategic decision making has observable consequences, can demonstrate the power of culture through the actions and preferences of players and fans alike.

Keywords: decision making, culture, motivation, risk-taking


No clear support for sweet liking as a major risk factor for obesity: Extreme sweet likers may have greater awareness of internal appetite regulation

Understanding sweet-liking phenotypes and their implications for obesity: narrative review and future directions. Rhiannon M Armitage, Vasiliki Iatridi, Martin R Yeomans. Physiology & Behavior, March 23 2021, 113398. https://doi.org/10.1016/j.physbeh.2021.113398

Highlights

• Sweet liking can be separated into three identifiable phenotypes

• Differences may reflect sensitivity to homeostatic appetite signalling

• Extreme sweet likers may have greater awareness of internal appetite regulation

• No clear support for sweet liking as a major risk factor for obesity

• Future research should consider genetic, neural and broader behavioural differences

Abstract: Building on a series of recent studies that challenge the universality of sweet liking, here we review the evidence for multiple sweet-liking phenotypes, which strongly suggests that humans fall into three hedonic response patterns: extreme sweet likers (ESL), whose liking increases with sweetness; moderate sweet likers (MSL), who like moderate but not intense sweetness; and sweet dislikers (SD), who show increasing aversion as sweetness increases. This review contrasts how these phenotypes differ in body size and composition, dietary intake, and behavioural measures to test the widely held view that sweet liking may be a key driver of obesity. Apart from increased consumption of sugar-sweetened beverages in ESL, we found no clear evidence that sweet liking was associated with obesity and actually found some evidence that SD, rather than ESL, may have slightly higher body fat. We conclude that ESL may have heightened awareness of internal appetite cues that could protect against overconsumption and increased sensitivity to wider reward. We note many gaps in knowledge and the need for future studies to contrast these phenotypes in terms of genetics, neural processing of reward and broader measures of behaviour. There is also the need for more extensive longitudinal studies to determine the extent to which these phenotypes are modified by exposure to sweet stimuli in the context of the obesogenic environment.

Keywords: Sweet taste; liking; hedonics; individual differences; obesity; sweet-liking phenotypes


Sex-Dependent Shared and Non-Shared Genetic Architecture, Across Mood and Psychotic Disorders, using 85k cases with 109k controls

Sex-Dependent Shared and Non-Shared Genetic Architecture, Across Mood and Psychotic Disorders. Gabriëlla A.M. Blokland et al. Biological Psychiatry, March 22, 2021. https://doi.org/10.1016/j.biopsych.2021.02.972

Popular version: https://www.eurekalert.org/pub_releases/2021-03/mgh-lga032321.php

Abstract

Background: Sex differences in incidence and/or presentation of schizophrenia (SCZ), major depressive disorder (MDD), and bipolar disorder (BIP) are pervasive. Previous evidence for shared genetic risk and sex differences in brain abnormalities across disorders suggest possible shared sex-dependent genetic risk.

Methods: We conducted the largest genome-wide genotype–by–sex (GxS) interaction analysis of risk for these disorders to date, using 85,735 cases (33,403 SCZ, 19,924 BIP, 32,408 MDD) and 109,946 controls from the Psychiatric Genomics Consortium (PGC) and iPSYCH.

Results: Across disorders, genome-wide significant SNP-by-sex interaction was detected for a locus encompassing NKAIN2 (rs117780815; p=3.2×10−8), that interacts with sodium/potassium-transporting ATPase enzymes implicating neuronal excitability. Three additional loci showed evidence (p<1×10−6) for cross-disorder GxS interaction (rs7302529, p=1.6×10−7; rs73033497, p=8.8×10−7; rs7914279, p=6.4×10−7) implicating various functions. Gene-based analyses identified GxS interaction across disorders (p=8.97×10−7) with transcriptional inhibitor SLTM. Most significant in SCZ was a MOCOS gene locus (rs11665282; p=1.5×10−7), implicating vascular endothelial cells. Secondary analysis of the PGC-SCZ dataset detected an interaction (rs13265509; p=1.1×10−7) in a locus containing IDO2, a kynurenine pathway enzyme with immunoregulatory functions implicated in SCZ, BIP, and MDD. Pathway enrichment analysis detected significant GxS of genes regulating vascular endothelial growth factor (VEGF) receptor signaling in MDD (pFDR<0.05).
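For context, a genome-wide genotype-by-sex interaction is typically tested SNP by SNP by adding a product term to the association model; the formulation below is the generic one (the authors' exact covariates and software are not specified here), and the conventional genome-wide significance threshold of p < 5×10−8 is why rs117780815 (p = 3.2×10−8) is reported as significant while the remaining loci (p < 1×10−6) are reported as suggestive:

\[
\operatorname{logit} \Pr(\text{case}) = \beta_0 + \beta_G G + \beta_S S + \beta_{GS}\,(G \times S) + \boldsymbol{\gamma}^{\top}\mathbf{C},
\]

where G is the SNP allele count, S codes sex, C is a vector of covariates (e.g., ancestry principal components), and the GxS test is of the null hypothesis βGS = 0.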

Conclusions: In the largest genome-wide GxS analysis of mood and psychotic disorders to date, there was substantial genetic overlap between the sexes. However, significant sex-dependent effects were enriched for genes related to neuronal development, immune and vascular functions across and within SCZ, BIP, and MDD at the variant, gene, and pathway enrichment levels.

Key words: sex differences; schizophrenia; bipolar disorder; major depressive disorder; genome-wide association study; genotype-by-sex interaction


Tuesday, March 23, 2021

High income men are more likely to marry, less likely to divorce, if divorced are more likely to remarry, & are less likely to be childless; income is not linked with the probability of marriage for women

High income men have high value as long-term mates in the U.S.: personal income and the probability of marriage, divorce, and childbearing in the U.S. Rosemary L. Hopcroft. Evolution and Human Behavior, March 23 2021. https://doi.org/10.1016/j.evolhumbehav.2021.03.004

Abstract: Using data from the first Census data set that includes complete measures of male biological fertility for a large-scale probability sample of the U.S. population (the 2014 wave of the Study of Income and Program Participation-N = 55,281), this study shows that high income men are more likely to marry, are less likely to divorce, if divorced are more likely to remarry, and are less likely to be childless than low income men. Men who remarry marry relatively younger women than other men, on average, although this does not vary by personal income. For men who divorce who have children, high income is not associated with an increased probability of having children with new partners. Income is not associated with the probability of marriage for women and is positively associated with the probability of divorce. High income women are less likely to remarry after divorce and more likely to be childless than low income women. For women who divorce who have children, high income is associated with a lower chance of having children with new partners, although the relationship is curvilinear. These results are behavioral evidence that women are more likely than men to prioritize earning capabilities in a long-term mate and suggest that high income men have high value as long-term mates in the U.S.

Keywords: Evolutionary psychology; Fertility; Marriage; Childlessness; Divorce; Sex differences