Thursday, March 25, 2021

The trophic level of the Homo lineage that most probably led to modern humans evolved from a low base to a high, carnivorous position during the Pleistocene; this trend was partially reversed in later periods and with the advent of agriculture

The evolution of the human trophic level during the Pleistocene. Miki Ben-Dor, Raphael Sirtoli, Ran Barkai. American Journal of Physical Anthropology, March 5, 2021. https://doi.org/10.1002/ajpa.24247

Abstract: The human trophic level (HTL) during the Pleistocene and its degree of variability serve, explicitly or tacitly, as the basis of many explanations for human evolution, behavior, and culture. Previous attempts to reconstruct the HTL have relied heavily on an analogy with recent hunter‐gatherer groups' diets. In addition to technological differences, recent findings of substantial ecological differences between the Pleistocene and the Anthropocene cast doubt regarding that analogy's validity. Surprisingly little systematic evolution‐guided evidence served to reconstruct HTL. Here, we reconstruct the HTL during the Pleistocene by reviewing evidence for the impact of the HTL on the biological, ecological, and behavioral systems derived from various existing studies. We adapt a paleobiological and paleoecological approach, including evidence from human physiology and genetics, archaeology, paleontology, and zoology, and identified 25 sources of evidence in total. The evidence shows that the trophic level of the Homo lineage that most probably led to modern humans evolved from a low base to a high, carnivorous position during the Pleistocene, beginning with Homo habilis and peaking in Homo erectus. A reversal of that trend appears in the Upper Paleolithic, strengthening in the Mesolithic/Epipaleolithic and Neolithic, and culminating with the advent of agriculture. We conclude that it is possible to reach a credible reconstruction of the HTL without relying on a simple analogy with recent hunter‐gatherers' diets. The memory of an adaptation to a trophic level that is embedded in modern humans' biology in the form of genetics, metabolism, and morphology is a fruitful line of investigation of past HTLs, whose potential we have only started to explore.


7 DISCUSSION

7.1 Summary of the evidence

The actual HTL during the Pleistocene is unobservable; therefore, we looked for evidence of the impact of the HTL on the studied humans' biological, behavioral, and ecological systems. Observing only its reflection and not the HTL itself, we are left employing varying degrees of interpretation in forming an opinion about what HTL caused a specific impact and whether it denotes a step toward specialization or generalization. We reviewed 25 different evidence sources, 15 of which are biological and the remainder archaeological, paleontological, zoological, and ethnographic. With a list of 25 evidence items, by necessity, the descriptions of the findings and the justification of the various interpretations cannot be thorough, leaving much room for further review and debate. One of the main aims of this article was to present the power of paleobiology in particular and to cast a wide net of scientific fields in general in investigating HTL. The evidence here is by no means comprehensive. There is additional genetic and physiological evidence for biological adaptations to a higher and lower trophic level in the past and additional evidence for Paleolithic HTLs in other fields of science, which we chose not to include, and others that we probably missed. Thus, although we do not shy away from presenting a clear hypothesis based on the evidence, the article should be looked at as a first iteration in what hopefully will be a continuous scientific endeavor.

The presented evidence can be organized into three epistemological groups with potentially increasing levels of validity regarding a specific trophic level (see text or Table 2 for a description of the evidence):

TABLE 2. Paleolithic human trophic level: summary of the evidence

Biology

  • Bioenergetics: Humans had high energetic requirements for a given body mass and a shorter time during the day to acquire and consume food. Hunting provides a tenfold higher energetic return per hour compared with plants, leaving little room for flexibility (plasticity) between the two dietary components. Animals tend to specialize in the highest-return items in their niche. (Specialization)

  • Diet quality: In primates, a larger brain is associated with higher energy density food. With the largest brain among primates, humans are likely to have targeted the highest-density foods: animal fats and proteins. Brain size declined during the terminal Pleistocene and subsequent Holocene; diet quality declined at the same time with the increased consumption of plants. (Specialization)

  • Higher fat reserves: With much higher body fat reserves than other primates, humans are uniquely adapted to lengthy fasting. This adaptation may have helped with overcoming the erratic encounter rate with large prey. (Change in trophic level)

  • Genetic and metabolic adaptation to a high-fat diet: Humans adapted to higher-fat diets, presumably from animals. One study (Swain-Lenz et al., 2019) “suggests that humans shut down regions of the genome to accommodate a high-fat diet while chimpanzees open regions of the genome to accommodate a high sugar diet.” (Change in trophic level)

  • FADS, omega-3 oils metabolism: Genetic changes in the FADS gene in African humans around 85 Kya allowed a slight increase in the conversion of the plant DHA precursor to DHA, signaling adaptation to a higher plant diet. (Change in trophic level)

  • Late adaptations to tuber consumption: Tubers were assumed to be a mainstay of Paleolithic diets that cooking could prepare for consumption. Recent groups that consume high quantities of tubers have specific genetic adaptations to deal with toxins and antinutrients in tubers; other humans are not well adapted to consuming large quantities of tubers. (Change in trophic level)

  • Stomach acidity: Higher stomach acidity is found in carnivores to fight meat-borne pathogens. Humans' stomach acidity is even higher than in carnivores, equaling that of scavengers. The adaptation may have evolved to allow consumption of large animals at a central place over days and weeks, despite pathogen build-up. (Categorization to a trophic group)

  • Insulin resistance: Humans, like carnivores, have low physiological (non-pathological) insulin sensitivity. (Categorization to a trophic group)

  • Gut morphology: Humans' gut morphology and relative gut size are radically different from those of chimpanzees. Longer small intestines and shorter large intestines are typical of carnivore gut morphology and limit humans' ability to extract energy from plant fiber. (Specialization)

  • Mastication: The masticatory system was already reduced in size in Homo erectus compared with earlier hominins, who relied on terrestrial vegetation as a food source. The reduced size is compatible with meat and fat consumption (Aiello & Wheeler, 1995; Zink & Lieberman, 2016). (Change in trophic level)

  • Cooking: Cooking was hypothesized to have enabled high consumption of plants despite the need for a high-quality diet, starting with H. erectus. Other researchers argue that habitual use of fire is evident only around 0.4 Mya. Also, fire has other uses and is costly to maintain. (Change in trophic level)

  • Postcranial morphology: A set of adaptations for endurance running, useful in hunting, is found already in H. erectus, and shoulders were adapted to spear-throwing; tree-climbing capability, however, was limited. (Specialization)

  • Adipocyte morphology: Similar to the morphology in carnivores. “These figures suggest that the energy metabolism of humans is adapted to a diet in which lipids and proteins rather than carbohydrates, make a major contribution to the energy supply” (Pond & Mattacks, 1985). (Categorization to a trophic group)

  • Age at weaning: Carnivores wean at a younger age, as do humans. Early weaning “highlight the emergence of carnivory as a process fundamentally determining human evolution” (Psouni et al., 2012). (Categorization to a trophic group)

  • Longevity: Kaplan et al. (2007) hypothesized that, because of long childhoods, a large part of the group depended on experienced hunters, and that extended longevity in humans evolved to allow utilization of hunting proficiency, which peaks by age 40. The grandmother hypothesis claims that women's longevity allowed additional gathering. (Change in trophic level)

  • Vitamins: The hypothesis that nutritional diversity is required to supply vitamins is contested. It appears that all vitamins, including vitamin C, are supplied in adequate quantities on a carnivorous diet. (Neutral)

  • Multicopy AMY1 genes: Multiple copies of the AMY1 gene have been hypothesized as adaptive to high-starch diets. However, findings of its possible lack of functionality and the unclear timing of its appearance limit the use of this evidence to support a change in trophic level. (Change in trophic level)

Archaeology

  • Plants: Plants were consumed throughout the Paleolithic, but their relative dietary contribution is difficult to assess. Recent advances in the identification of plant residues in dental plaque provide non-quantitative evidence of widespread plant consumption. Division of labor may point to a background level of plant supply, but the evidence is based largely on ethnography, which may not be analogous to the Pleistocene. (Inconclusive)

  • Stone tools: Stone tools specific to plant food utilization appear only some 40 Kya, and their prevalence increases just before the appearance of agriculture, signaling increased plant consumption toward the end of the Paleolithic. (Change in trophic level)

  • Zooarchaeology: First access to large prey, denoting hunting, appears already in H. erectus archaeological sites 1.5 Mya. Humans also hunted large carnivores. (Change in trophic level)

  • Targeting fat: Humans concentrated on hunting fatty animals at substantial energetic cost. They preferentially brought fatty parts to base camps, hunted fattier prime adults, and exploited bone fat. That behavior may indicate that plants could not easily have been obtained to supplement constrained protein consumption. (Specialization)

  • Stable isotopes: Measurement of the nitrogen-15 (15N) isotope in human fossil collagen residues is the most extensively used method for determining trophic level in the last 50 thousand years. All studies show that humans were highly carnivorous until very late, shortly before the appearance of agriculture. The method has some shortcomings but was able to identify variation in trophic level among present-day traditional groups. (Categorization to a trophic group)

  • Dental pathology: Dental caries, evidence of substantial consumption of carbohydrates, appeared only some 15 Kya, in groups with evidence of high plant food consumption. (Change in trophic level)

  • Dental wear: Differences in wear on fossilized teeth as a result of different diets have the potential to reconstruct diets. However, the claim for the reconstruction of variable diets in Paleolithic humans could not be verified, as the diets of the comparison groups were unclear. (Inconclusive)

  • Behavioral adaptations: A comparison of human behavior patterns with those of chimpanzees and social carnivores found that humans have carnivore-like behavior patterns. Food sharing, alloparenting, division of labor, and social flexibility are among the shared carnivorous behaviors. (Categorization to a trophic group)

Others

  • Paleontological evidence: Evidence for hunting by H. erectus 1.5 Mya was associated with the extinction of several large carnivores, but not smaller carnivores, suggesting that H. erectus became a member of the large-carnivore guild. The extinction of large carnivores in Europe also coincided with the arrival of humans. Extinctions of large herbivores were associated with humans' presence in Africa and arrival in continents and islands such as Australia and the Americas, suggesting preferential hunting of large prey. (Categorization to a trophic group)

  • Zoological analogy: Humans were predators of large prey. In carnivores, predation on large prey is exclusively associated with hypercarnivory, i.e., consuming over 70% of the diet from animals. (Categorization to a trophic group)

  • Ethnography: Variability in trophic level in the ethnographic context is frequently mentioned as proof of HTL variability during the Paleolithic. However, ecological and technological considerations limit the analogy to the terminal Paleolithic. (Change in trophic level)

7.2 Type 1. Change in trophic level

  • Higher fat reserves
  • Adaptations to a high‐fat diet
  • Late adaptation to USO
  • FADS—omega 3 metabolism
  • AMY1 multicopy
  • Longevity—Fathers
  • Longevity—Mothers
  • Dental pathology
  • Cooking
  • Zooarchaeology
  • Stone tools
  • Ethnography

This group includes evidence for physiological or behavioral adaptations to acquiring and consuming higher or lower amounts of either animal- or plant-sourced food. There is no argument that the evolution of the genus Homo was associated with increasing HTLs in H. habilis and further in H. erectus; it is therefore not surprising that most of this type of evidence in the Lower Paleolithic consists of evidence for adaptation to carnivory. Detailing these pieces of evidence may thus appear superfluous; however, identifying a trend in the number and timing of the acquisition of the adaptations may supply important indications of a change in trophic level. The accumulation of Type 1 evidence in the Late Upper Paleolithic supports a significant change to a lower HTL in that period. Also, evidence present in one period and not in another can be interpreted as evidence for a different HTL in the latter. For example, if we accept that the appearance of the AMY1 gene multicopy sometime in H. sapiens evolution suggests a higher consumption of starch, we have to accept that there was no pressure in prior species to adapt to high consumption of starch. The same logic applies to grain-handling stone tools and dental pathology, which appear only toward the end of the Paleolithic.

7.3 Type 2. Specialization—Reduced flexibility

  • Bioenergetics
  • Diet quality
  • Gut morphology
  • Mastication
  • Postcranial morphology
  • Targeting fat

Since we cannot observe our subjects, evidence for specialization is defined here as evidence that is similar to Type 1 but that, at the same time, potentially reduces the phenotypic plasticity of humans by hindering the acquisition or assimilation of the other food group (plant or animal).

Specialization and generalization must be defined with reference to particular axes such as temperature, habitat, and feeding (Futuyma & Moreno, 1988). Pineda‐Munoz and Alroy (2014) defined feeding specialization as selecting 50% or more of the food from a certain food group (fruits, seeds, green plants, insects, or vertebrates). By this definition, humans could be called specialists if they selected to consume 50% or more of their diet from vertebrates or another group of plants or invertebrates.

Another axis on which specialization can be defined is prey size. Large carnivores specialize in large prey. Evidence for humans' specialization in large herbivores can contribute to the continuing debate regarding humans' role in megafauna extinction (Faith et al., 2020; Faurby et al., 2020; Sandom et al., 2014; F. A. Smith et al., 2018; Werdelin & Lewis, 2013) and the implications of megafauna extinction for humans. Potts et al. (2018) identified an association between prey size decline during the Middle Pleistocene and the appearance of the MSA, and Ben-Dor et al. (2011) further proposed that the extinction of elephants in the Southern Levant led to the appearance of the Acheulo-Yabrudian cultural complex 420 Kya. Ben-Dor and Barkai (2020a) have argued that humans preferred to acquire large prey and that large prey is underrepresented in zooarchaeological assemblages (Ben-Dor & Barkai, 2020b). Table 3 lists the evidence, among the items collected here, that can be interpreted as supporting humans' specialization in acquiring large prey.

TABLE 3. Specialization in the acquisition of large prey

  • Bioenergetics: Large prey provides higher energetic returns than smaller prey. Replacing large prey with smaller prey is energetically costly.
  • Higher fat reserves: Large prey is less abundant than smaller prey. Fat reserves may have evolved to allow extended fasting of several weeks, thereby bridging erratic encounter rates with large prey. Humans have adapted to easily synthesize ketones to replace glucose as an energy source for the brain.
  • Stomach acidity: Acidity stronger than that of carnivores can be interpreted as an adaptation to the protracted consumption of large prey over days and weeks, whereby humans act as scavengers of their own prey.
  • Targeting fat: The recognition of targeting fat as a driver of human behavior supports the importance of large, fattier animals to humans' survival.
  • Stable isotopes: Nitrogen-15 levels higher than in carnivores were interpreted by researchers as testifying to a higher consumption of large prey than in other carnivores.
  • Paleontology: A decline in the guild of large-prey carnivores 1.5 Mya is interpreted as resulting from humans' entry into the guild. Also, the extinction of large prey throughout the Pleistocene is interpreted by some researchers as anthropogenic, testifying to humans' preference for large prey.
  • Zoological analogy: Large social carnivores hunt large prey.
  • Ethnography: Interpreting ethnographic and Upper Paleolithic technologies as adaptations to the acquisition of smaller prey implies that humans were less adapted to acquiring smaller prey in earlier periods.

Dietary plasticity is a function of phenotypic plasticity (Fox et al., 2019) and of technological and ecological factors. In many cases, evolution is a process of solving problems with trade-offs (Garland, 2014). Identifying features that were traded off in specific adaptations could inform us of changing levels of dietary phenotypic plasticity. Relative energetic returns on primary (plant) and secondary (animal) producers are key to assessing plasticity's ecological potential. In humans, technology can expand plasticity by enabling and increasing the energetic return on the acquisition of certain food items. Bows for the hunting of smaller, faster prey and grinding stones are two examples of such technologies.

We mainly listed specialization adaptations that affected phenotypic plasticity, but there is also technological and ecological evidence of changes in dietary plasticity: the invention of bows increased plasticity with respect to prey size, and the appearance and disappearance of savannas, with the accompanying change in the ratio of primary to secondary production, swayed plasticity toward primary or secondary producers.

7.4 Type 3. Categorization to a trophic group

  • Stomach acidity
  • Adipocyte morphology
  • Age at weaning
  • Stable isotopes
  • Behavioral adaptations
  • Paleontologic evidence
  • Zoological analogy
  • Insulin resistance

All eight pieces of evidence bearing on membership in a trophic group indicate that humans were carnivores. Assigning humans to a specific dietary trophic group has the highest potential validity, as it answers the research question with minimal interpretation.

In some cases, interpretation is still required to assign a phenomenon to an HTL. Belonging to the carnivore trophic group does not tell us whether humans were 90% or 50% carnivorous. It does tell us, however, that humans were carnivorous enough, and for long enough, to justify physiological and behavioral adaptations unique to carnivores. Following the zoological analogy with large social carnivores that acquire large prey, we hypothesized that humans were hypercarnivores, defined as consuming more than 70% of the diet from animal sources.

When Tonight Is Not the Night: Sexual Rejection Behaviors and Satisfaction in Romantic Relationships

When Tonight Is Not the Night: Sexual Rejection Behaviors and Satisfaction in Romantic Relationships. James J. Kim et al. Personality and Social Psychology Bulletin, March 11, 2020. https://doi.org/10.1177/0146167220907469

Abstract: In most long-term romantic relationships, partners experience sexual conflicts of interest in which one partner declines the other partner’s sexual advances. We investigated the distinct ways people reject a partner’s advances (i.e., with reassuring, hostile, assertive, and deflecting behaviors) in Studies 1 and 2. Using cross-sectional (Study 3) and daily experience methods (Study 4), we investigated how perceptions of a partner’s rejection behaviors are linked with the rejected partner’s relationship and sexual satisfaction. We found robust evidence that perceived partner reassuring behaviors were associated with greater satisfaction, whereas perceived partner hostile behaviors were associated with lower levels of satisfaction. Perceived partner responsiveness was a key mechanism underlying the effects. Findings for assertive and deflecting behaviors were limited, but the effect of deflecting behaviors was qualified by levels of hostile behaviors for sexual satisfaction. Findings provide the first empirical investigation of the specific ways partners can decline one another’s advances to preserve satisfaction.

Keywords: sexual rejection, satisfaction, close relationships, sexual communication, responsiveness


Individuals in the top quartile of neuroticism had higher risk of all-cause dementia, but not fronto-temporal dementia

Is Neuroticism Differentially Associated with Risk of Alzheimer’s Disease, Vascular Dementia, and Frontotemporal Dementia? Antonio Terracciano et al. Journal of Psychiatric Research, March 25 2021. https://doi.org/10.1016/j.jpsychires.2021.03.039

Rolf Degen's take: Neurotics are at greater risk of succumbing to dementia

Abstract: This study examines whether neuroticism is differentially associated with risk of incident Alzheimer’s disease (AD), vascular dementia (VD), and frontotemporal dementia (FTD) using a prospective study design. Participants from the UK Biobank (N = 401,422) completed a self-report neuroticism scale in 2006-2010 and incident all-cause dementia, AD, VD, and FTD were ascertained using electronic health records or death records up to 2018. During an average follow-up of 8.8 years (3,566,123 person-years), there were 1,798 incident of all-cause dementia, 675 AD, 376 VD, and 81 FTD. Accounting for age and sex, compared to individuals in the low quartile, individuals in the top quartile of neuroticism had higher risk of all-cause dementia (HR = 1.70; 95% CI: 1.49-1.93), AD (HR = 1.42; 1.15-1.75), VD (HR = 1.73; 1.30-2.29), but not FTD (HR = 0.89; 0.49-1.63). The associations with AD and VD were attenuated but remained significant after further accounting for education, household income, deprivation index, diabetes, hypertension, stroke, heart attack, ever smoker, physical activity, obesity, hemoglobin A1c, C-reactive protein, and low-density lipoprotein. The associations were not moderated by socioeconomic status. The findings were consistent in analyses that excluded cases that occurred within the first 5 years of follow-up. In conclusion, neuroticism is a robust predictor of incident AD and VD, but not FTD. This pattern suggests that the affective symptoms that distinguish dementia types may partly reflect premorbid differences in trait neuroticism.

Keywords: Personality, Alzheimer's disease, Pick's disease, Emotional distress, Neurodegenerative disease


Wednesday, March 24, 2021

Cross-cultural differences in strategic risky decisions in baseball among professional baseball teams in North America and Japan: European American participants preferred high risk-high payoff strategies

Swinging for the Fences Versus Advancing the Runner: Culture, Motivation, and Strategic Decision Making. Roxie Chuang et al. Social Psychological and Personality Science, March 19, 2021. https://doi.org/10.1177/1948550621999273

Abstract: This research investigated cross-cultural differences in strategic risky decisions in baseball—among professional baseball teams in North America and Japan (Study 1) and among baseball fans in the United States and Japan (Study 2—preregistered). Study 1 analyzed archival data from professional baseball leagues and demonstrated that outcomes reflecting high risk-high payoff strategies were more prevalent in North America, whereas outcomes reflecting low risk-low payoff strategies were more prevalent in Japan. Study 2 investigated fans’ strategic decision making with a wider range of baseball strategies as well as an underlying reason for the difference: approach/avoidance motivational orientation. European American participants preferred high risk-high payoff strategies, Japanese participants preferred low risk-low payoff strategies, and this cultural variation was explained by cultural differences in motivational orientation. Baseball, which exemplifies a domain where strategic decision making has observable consequences, can demonstrate the power of culture through the actions and preferences of players and fans alike.

Keywords: decision making, culture, motivation, risk-taking


No clear support for sweet liking as a major risk factor for obesity: Extreme sweet likers may have greater awareness of internal appetite regulation

Understanding sweet-liking phenotypes and their implications for obesity: narrative review and future directions. Rhiannon M Armitage, Vasiliki Iatridi, Martin R Yeomans. Physiology & Behavior, March 23 2021, 113398. https://doi.org/10.1016/j.physbeh.2021.113398

Highlights

• Sweet liking can be separated into three identifiable phenotypes

• Differences may reflect sensitivity to homeostatic appetite signalling

• Extreme sweet likers may have greater awareness of internal appetite regulation

• No clear support for sweet liking as a major risk factor for obesity

• Future research should consider genetic, neural and broader behavioural differences

Abstract: Building on a series of recent studies that challenge the universality of sweet liking, here we review the evidence for multiple sweet-liking phenotypes which strongly suggest, humans fall into three hedonic response patterns: extreme sweet likers (ESL), where liking increases with sweetness, moderate sweet likers (MSL), who like moderate but not intense sweetness, and sweet dislikers (SD), who show increasing aversion as sweetness increases. This review contrasts how these phenotypes differ in body size and composition, dietary intake and behavioural measures to test the widely held view that sweet liking may be a key driver of obesity. Apart from increased consumption of sugar-sweetened beverages in ESL, we found no clear evidence that sweet liking was associated with obesity and actually found some evidence that SD, rather than ESL, may have slightly higher body fat. We conclude that ESL may have heightened awareness of internal appetite cues that could protect against overconsumption and increased sensitivity to wider reward. We note many gaps in knowledge and the need for future studies to contrast these phenotypes in terms of genetics, neural processing of reward and broader measures of behaviour. There is also the need for more extensive longitudinal studies to determine the extent to which these phenotypes are modified by exposure to sweet stimuli in the context of the obesogenic environment.

Keywords: Sweet taste, liking, hedonics, individual differences, obesity, sweet-liking phenotypes


Sex-Dependent Shared and Non-Shared Genetic Architecture, Across Mood and Psychotic Disorders, using 85k cases with 109k controls

Sex-Dependent Shared and Non-Shared Genetic Architecture, Across Mood and Psychotic Disorders. Gabriëlla A.M. Blokland et al. Biological Psychiatry, March 22, 2021. https://doi.org/10.1016/j.biopsych.2021.02.972

Popular version: https://www.eurekalert.org/pub_releases/2021-03/mgh-lga032321.php

Abstract

Background: Sex differences in incidence and/or presentation of schizophrenia (SCZ), major depressive disorder (MDD), and bipolar disorder (BIP) are pervasive. Previous evidence for shared genetic risk and sex differences in brain abnormalities across disorders suggest possible shared sex-dependent genetic risk.

Methods: We conducted the largest to date genome-wide genotype–by–sex (GxS) interaction of risk for these disorders, using 85,735 cases (33,403 SCZ, 19,924 BIP, 32,408 MDD) and 109,946 controls from the Psychiatric Genomics Consortium (PGC) and iPSYCH.

Results: Across disorders, genome-wide significant SNP-by-sex interaction was detected for a locus encompassing NKAIN2 (rs117780815; p=3.2×10^−8), that interacts with sodium/potassium-transporting ATPase enzymes implicating neuronal excitability. Three additional loci showed evidence (p<1×10^−6) for cross-disorder GxS interaction (rs7302529, p=1.6×10^−7; rs73033497, p=8.8×10^−7; rs7914279, p=6.4×10^−7) implicating various functions. Gene-based analyses identified GxS interaction across disorders (p=8.97×10^−7) with transcriptional inhibitor SLTM. Most significant in SCZ was a MOCOS gene locus (rs11665282; p=1.5×10^−7), implicating vascular endothelial cells. Secondary analysis of the PGC-SCZ dataset detected an interaction (rs13265509; p=1.1×10^−7) in a locus containing IDO2, a kynurenine pathway enzyme with immunoregulatory functions implicated in SCZ, BIP, and MDD. Pathway enrichment analysis detected significant GxS of genes regulating vascular endothelial growth factor (VEGF) receptor signaling in MDD (p_FDR<0.05).

Conclusions: In the largest genome-wide GxS analysis of mood and psychotic disorders to date, there was substantial genetic overlap between the sexes. However, significant sex-dependent effects were enriched for genes related to neuronal development, immune and vascular functions across and within SCZ, BIP, and MDD at the variant, gene, and pathway enrichment levels.

Key words: sex differences, schizophrenia, bipolar disorder, major depressive disorder, genome-wide association study, genotype-by-sex interaction


Tuesday, March 23, 2021

High income men are more likely to marry, less likely to divorce, if divorced are more likely to remarry, & are less likely to be childless; income is not linked with the probability of marriage for women

High income men have high value as long-term mates in the U.S.: personal income and the probability of marriage, divorce, and childbearing in the U.S. Rosemary L. Hopcroft. Evolution and Human Behavior, March 23 2021. https://doi.org/10.1016/j.evolhumbehav.2021.03.004

Abstract: Using data from the first Census data set that includes complete measures of male biological fertility for a large-scale probability sample of the U.S. population (the 2014 wave of the Study of Income and Program Participation-N = 55,281), this study shows that high income men are more likely to marry, are less likely to divorce, if divorced are more likely to remarry, and are less likely to be childless than low income men. Men who remarry marry relatively younger women than other men, on average, although this does not vary by personal income. For men who divorce who have children, high income is not associated with an increased probability of having children with new partners. Income is not associated with the probability of marriage for women and is positively associated with the probability of divorce. High income women are less likely to remarry after divorce and more likely to be childless than low income women. For women who divorce who have children, high income is associated with a lower chance of having children with new partners, although the relationship is curvilinear. These results are behavioral evidence that women are more likely than men to prioritize earning capabilities in a long-term mate and suggest that high income men have high value as long-term mates in the U.S.

Keywords: Evolutionary psychology, Fertility, Marriage, Childlessness, Divorce, Sex differences



Stronger negative feelings (i.e., more disturbed, upset, & frustrated) when we encounter others who, in our view, hold false beliefs than when we think that others' beliefs are merely different from our own

Molnar, Andras and Loewenstein, George F., Thoughts and Players: An Introduction to Old and New Economic Perspectives on Beliefs (March 16, 2021). The Science of Beliefs: A multidisciplinary Approach (provisional title, to be published in October 2021). Cambridge University Press. Edited by Julien Musolino, Joseph Sommer, and Pernille Hemmer. SSRN: http://dx.doi.org/10.2139/ssrn.3806135

Abstract: In this chapter we summarize how economists conceptualize beliefs. Moving both backward and forward in time, we review the way that mainstream economics currently deals with beliefs, as well as, briefly, the history of economists’ thinking about beliefs. Most importantly, we introduce the reader to a recent, transformational movement in economics that focuses on belief-based utility. This approach challenges the standard economic assumption that beliefs are only an input to decision making and examines implications of the intuitive idea that people derive pleasure and pain directly from their beliefs. We also address the question of when and why people care about what other people believe. We close with a discussion of the implications of these insights for contemporary social issues such as political polarization and fake news.

Keywords: Anticipatory Emotions, Belief-based Utility, Cognitive Dissonance, False Beliefs, Homophily, Motivated Reasoning, Polarization, Self-esteem, Theory of Mind

JEL Classification: A11, B12, B21, D83, D91

---

In our own recent research, however (Molnar & Loewenstein, 2020), we have been advancing a subtly, but we believe crucially, different perspective. Our own view is that it is not awareness that other people have different beliefs than our own which causes discomfort. Rather, it is the belief that others hold, and act on, beliefs that we perceive to be wrong. This idea is also captured by “Cunningham’s Law”—named after Ward Cunningham, the developer of the first wiki—which states that “the best way to get the right answer on the internet is not to ask a question; it’s to post the wrong answer.”5 At the heart of this rather witty “law” lies the intuition that people have a strong desire to correct others’ beliefs when they deem those beliefs to be false. Aligned with the above anecdotal evidence, our own research demonstrates that participants express stronger negative feelings (i.e., are more disturbed, upset, and frustrated) when they encounter others who—from the participant’s point of view—hold false beliefs, compared to when participants think that others’ beliefs are merely different from their own (Molnar & Loewenstein, 2020). These strong negative emotions can then, based on the situation and the type of relationship, either trigger approach (e.g., confronting the other person, attempting to persuade them) or avoidance behaviors (e.g., blocking the other person online).

The subject of these false beliefs can be anything: beliefs about the individual (e.g., misunderstanding one’s intentions), about relationships (incorrectly believing that someone’s partner had been cheating on them), economic outcomes (tax cuts on the rich ultimately “trickle down” to help the poor), or even global phenomena (climate change is unrelated to human activity). What matters more is not the domain of belief, rather, the conviction that someone else holds an incorrect view of the individual, relationships, outcomes, or the world. The more convinced people are that others hold false beliefs, the more upset they will be (Molnar & Loewenstein, 2020), and the more likely they will take some action (either confront these others, or making extra effort to avoid them).

We find that men remain active in the dating and sexual marketplace longer than women

Dating and sexualities across the life course: The interactive effects of aging and gender. Lisa R. Miller, Justin R. Garcia, Amanda N. Gesselman. Journal of Aging Studies, Volume 57, June 2021, 100921. https://doi.org/10.1016/j.jaging.2021.100921

Highlights

• The authors investigate whether aging differentially impacts single women's and men's dating and sexual attitudes and behaviors

• The authors find that aging has greater effects on women's than men's dating and sexual attitudes and behaviors

• Gender differences in dating and sexuality are only found in specific stages of the life course

• Age is central to understandings of gendered heterosexuality

Abstract: Research on aging and sexuality has proliferated in recent years. However, little is known about the gender-specific effects of aging on dating and sexuality. Using survey data from the 2014 wave of Singles in America (SIA), a comprehensive survey on adult singles' experiences with dating and sexuality, we examine whether age differentially affects heterosexual women's and men's dating and sexual attitudes and behaviors and whether gender differences persist across the life course. We find that men remain active in the dating and sexual marketplace longer than women. Although main effects of gender differences are documented in dating and sexual attitudes and behaviors, the results show that gender does not operate the same across the life course. Notably, gender differences shrink or are eliminated in attitudes and behaviors surrounding partnering in midlife and late adulthood, suggesting that age is integral to understanding gendered heterosexuality.

Keywords: Gender, Sexualities, Aging, Intimate relationships, Life course


Over and above other Big Five personality domains, both conscientiousness & agreeableness were negatively correlated with alcohol consumption, risky/hazardous drinking, & negative drinking-related consequences

Lui, P. P., Michael Chmielewski, Mayson Trujillo, Joseph Morris, and Terri Pigott. 2021. “Linking Big Five Personality Domains and Facets to Alcohol (mis)use: A Systematic Review and Meta-analysis.” PsyArXiv. March 17. doi:10.31234/osf.io/x2yta

Abstract

Aims: The goal of this investigation was to synthesize (un)published studies linking Big Five personality domains and facets to a range of alcohol use outcomes. Meta-analyses were conducted to quantify the unique associations between alcohol use outcomes and each Big Five personality domains over and above other domains. Within each domain, meta-analyses also were conducted to examine the unique contribution of each personality facet in alcohol use outcomes.

Methods: Systematic literature reviews were performed in PsycINFO and PubMed using keywords related to alcohol use and personality. Peer-reviewed and unpublished studies were screened and coded for the meta-analyses. Eighty independent samples were subjected to correlated effects meta-regressions.

Results: Over and above other Big Five personality domains, both conscientiousness and agreeableness were negatively correlated with alcohol consumption, risky/hazardous drinking, and negative drinking-related consequences. Facet-level analyses indicated that deliberation and dutifulness were uniquely associated with alcohol (mis)use over and above other conscientiousness facets, and compliance and straightforwardness were uniquely associated with alcohol (mis)use over and above other agreeableness facets. Extraversion—namely excitement seeking—was correlated with alcohol consumption, whereas neuroticism—namely impulsiveness and angry hostility—was correlated with negative drinking-related consequences.

Conclusions: Personality characteristics are robust correlates of alcohol (mis)use. Examining relevant narrowband traits can inform mechanisms by which personality affects drinking behaviors and related problems, and ways to enhance clinical interventions for alcohol use disorder. Gaps in this literature and future research directions are discussed.


From 2019... Educational attainment impacts drinking behaviors and risk for alcohol dependence: results from a two-sample Mendelian randomization study with ~780,000 participants

Educational attainment impacts drinking behaviors and risk for alcohol dependence: results from a two-sample Mendelian randomization study with ~780,000 participants. Daniel B. Rosoff, Toni-Kim Clarke, Mark J. Adams, Andrew M. McIntosh, George Davey Smith, Jeesun Jung & Falk W. Lohoff. Molecular Psychiatry, volume 26, pages 1119–1132. Oct 2019. https://www.nature.com/articles/s41380-019-0535-9

Rolf Degen's take: The genetic instruments associated with higher educational attainment are linked to reduced binge drinking, reduced amount of alcohol consumed per occasion, but to increased frequency of alcohol intake, especially of wine

Abstract: Observational studies suggest that lower educational attainment (EA) may be associated with risky alcohol use behaviors; however, these findings may be biased by confounding and reverse causality. We performed two-sample Mendelian randomization (MR) using summary statistics from recent genome-wide association studies (GWAS) with >780,000 participants to assess the causal effects of EA on alcohol use behaviors and alcohol dependence (AD). Fifty-three independent genome-wide significant SNPs previously associated with EA were tested for association with alcohol use behaviors. We show that while genetic instruments associated with increased EA are not associated with total amount of weekly drinks, they are associated with reduced frequency of binge drinking ≥6 drinks (β_IVW = −0.198, 95% CI, −0.297 to −0.099, P_IVW = 9.14 × 10^−5), reduced total drinks consumed per drinking day (β_IVW = −0.207, 95% CI, −0.293 to −0.120, P_IVW = 2.87 × 10^−6), as well as lower weekly distilled spirits intake (β_IVW = −0.148, 95% CI, −0.188 to −0.107, P_IVW = 6.24 × 10^−13). Conversely, genetic instruments for increased EA were associated with increased alcohol intake frequency (β_IVW = 0.331, 95% CI, 0.267–0.396, P_IVW = 4.62 × 10^−24), and increased weekly white wine (β_IVW = 0.199, 95% CI, 0.159–0.238, P_IVW = 7.96 × 10^−23) and red wine intake (β_IVW = 0.204, 95% CI, 0.161–0.248, P_IVW = 6.67 × 10^−20). Genetic instruments associated with increased EA reduced AD risk: an additional 3.61 years schooling reduced the risk by ~50% (OR_IVW = 0.508, 95% CI, 0.315–0.819, P_IVW = 5.52 × 10^−3). Consistency of results across complementary MR methods accommodating different assumptions about genetic pleiotropy strengthened causal inference. Our findings suggest EA may have important effects on alcohol consumption patterns and may provide potential mechanisms explaining reported associations between EA and adverse health outcomes.
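[Notation note: the β_IVW and OR_IVW figures above are inverse-variance weighted (IVW) estimates that pool per-SNP Wald ratios across the genetic instruments. The following is a generic, textbook sketch of the fixed-effect IVW estimator with first-order weights, not a reproduction of the authors' exact specification:]

\[
\hat{\beta}_{\mathrm{IVW}} \;=\; \frac{\sum_{j} w_j \,\hat{\theta}_j}{\sum_{j} w_j},
\qquad
\hat{\theta}_j \;=\; \frac{\hat{\beta}_{Y_j}}{\hat{\beta}_{X_j}},
\qquad
w_j \;=\; \frac{\hat{\beta}_{X_j}^{2}}{\mathrm{se}(\hat{\beta}_{Y_j})^{2}},
\qquad
\mathrm{se}(\hat{\beta}_{\mathrm{IVW}}) \;=\; \frac{1}{\sqrt{\sum_{j} w_j}},
\]

where \(\hat{\beta}_{X_j}\) and \(\hat{\beta}_{Y_j}\) are SNP j's summary associations with the exposure (educational attainment) and the outcome (an alcohol-use trait), and \(\hat{\theta}_j\) is the per-SNP Wald ratio. For the binary AD outcome, OR_IVW is exp(β_IVW) on the log-odds scale.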

Discussion

Using large summary-level GWAS data and complementary two-sample MR methods, we show that EA has a likely causal relationship with alcohol consumption behaviors and alcohol dependence risk in individuals of European ancestry. More specifically, higher EA reduced binge drinking (six or more units of alcohol), the amount of alcohol consumed per occasion, frequency of memory loss due to drinking, distilled spirits intake, and AD risk. EA increased the frequency of alcohol intake, whether alcohol is consumed with meals, and wine consumption. We found evidence that our results may be driven by genetic pleiotropy in only two of the eight alcohol consumption behaviors (average weekly beer plus cider intake and alcohol usually taken with meals), and significance remained after additional analyses using EA instruments with SNPs nominally associated with either cognition or income removed, suggesting that EA may be an important factor responsible for variation in alcohol use behaviors. Consistency of our results across MR methods also strengthens our inference of causality.

Educated persons generally have healthier lifestyle habits, fewer comorbidities, and live longer than their less educated counterparts [52], and our results suggest EA is causally associated with different likelihoods of belonging to variegated alcohol consumer typologies. We found that an additional 3.61 years of education reduced the risk of alcohol dependence by ~50%, which is consistent with results from small community samples [53] and with the two most recent alcohol dependence GWASs finding strong inverse genetic correlations with educational attainment [27, 54]. Notably, binge drinking significantly increases alcohol dependence risk [55], and distilled spirits and beer consumption account for the majority of hazardous alcohol use [56]. Furthermore, compared to wine drinkers, beer and spirits drinkers are at increased risk of becoming heavy or excessive drinkers [57], of alcohol-related problems and illicit drug use [58, 59], and of AD [57]. Our findings related to alcoholic drink preferences, when combined with our results showing increased binge drinking, memory loss due to alcohol, and a suggestive relationship with remorse after drinking, imply a pattern of alcohol consumption motivated to reduce negative emotions or become intoxicated [14].

In contrast to the often-reported positive association between EA and total amount of alcohol consumption reported from observational studies [18, 60], we found little evidence of a causal relationship. This null finding may be reconciled by the opposing influences on alcohol intake frequency and total alcohol consumed per occasion, which, while not leading to an overall change in total consumption, nonetheless significantly affect the pattern. Our null finding regarding total consumption does support similar results from Davies et al. [52], who used the 1972 mandated increase in school-leaving age in the UK as a natural experiment instrumental variable design to investigate the causal effects of staying in school on total alcohol consumption (from individuals in the UKB sample who turned 15 in the first year before and after the schooling age increased). Davies et al. may have found a significant effect of staying in school had they included the disaggregated behavioral dimensions of alcohol consumption behaviors. Nevertheless, even if no EA-total alcohol consumption relationship exists, studies have reported that both the specific alcoholic beverage and the pattern with which it is consumed, controlling for total consumption, independently contribute to risky health behaviors [61, 62].

Natural experiments [52, 63] and twin studies have found that differences in EA, even after controlling for shared environmental factors, still significantly impact mortality risk [64,65,66], and recent large Mendelian randomization studies demonstrating inverse relationships of EA with smoking behaviors [35] and coronary heart disease (CHD) risk [34] add to the growing body of literature suggesting a causal effect of increased EA on health and mortality. Other observational studies have linked alcohol consumption patterns to health, disease, and mortality risk [67,68,69]. In particular, binge drinking may have dramatic short-term consequences, including motor vehicle accidents, alcoholic coma, cerebral dysfunction, and violent behavior [70], as well as long-term effects such as hypertension, stroke, and other cardiovascular outcomes [71]. A recent MR study showed that smoking mediates, in part, the effect of education on cardiovascular disease [72], and our results suggest that differences in alcohol consumption patterns may be another mediator. Health consequences incur significant costs, with binge drinking accounting for ~77% of the $249 billion alcohol-related costs (lost workplace productivity, health care expenses, law enforcement, and criminal justice expenses, etc.) in the United States in 2010 [55].

While we do not fully understand the underlying biological mechanisms through which the instrument SNPs influence EA, they are primarily found in genomic regions regulating brain development and expressed in neural tissue. These SNPs demonstrate significant expression throughout the life course, but exhibit the highest expression during development [36]. For example, rs4500960, which was associated with reduced EA, is an intronic variant in the transcription factor protein, T-box, Brain 1 (TBR1), that is important for differentiation and migration of neurons during development [36], while rs10061788 is associated with cerebral cortex and hippocampal mossy fiber morphology [36]. It is, however, important to note that interpreting these SNPs as representing “genes for education” may be “overly simplistic” since EA is strongly affected by environmental factors [36]. Our results remained when using an EA instrument with SNPs nominally associated with income removed, suggesting that an individual’s genetics may impact behavior development, which then increases EA [73]. Conversely, genetic estimates of EA and its correlations with other complex social phenotypes using population-based samples may be susceptible to biases, such as assortative mating and dynastic effects that provide pathways alternate to direct biological effects [40]. For example, EA-associated genetic influence on parental behavior could causally affect the child’s environment [73]. Using polygenic scores for EA, Belsky et al. [73] recently found the mothers’ EA-linked genetics actually predicted their children’s social attainment better than the child’s own EA-linked genetics, suggesting an effect mediated by environmental effects. While policies are not able to change children’s genes, or their inherited social status, they can provide resources [73], and our results suggest that interventions to increase education may help improve health outcomes through changing alcohol consumption patterns.

Notably, there was evidence for some causal effects of alcohol consumption patterns on EA, and the divergent effects again demonstrate the importance of separating drinking variables. However, we failed to find evidence that total alcohol consumed, binge drinking, or AD impacts EA, which is in line with observational studies finding no or small effects [21], and suggests that other studies finding a negative effect [21] may be due to confounding. Alternatively, EA may not be sensitive enough to detect changes in schooling, e.g., grade point average [21], falling behind in homework, and other academic difficulties that have also been reported to be associated with heavy drinking [74]. Further, there are currently no adolescent drinking behavior GWAS, so the temporal sequence of these analyses should be considered during their interpretation. Our findings, therefore, need replication when GWAS on adolescent alcohol consumption patterns becomes available.

Exploratory sex-specific analyses revealed differences in certain aspects of the relationship between EA and alcohol consumption. For men, the relationship between their consumption of red wine, beer, and whether they drink with meals was more sensitive to changes in EA than for women. Conversely, the reduction in binge drinking with increased EA may be driven by its effect in women, since its effect in men was not significant. In addition, in women the negative effect of EA on spirits consumption was more than double its effect in men. We found no differences among the AUDIT questions.

There are noted gender gaps in alcohol use and associated outcomes due to a combination of physiological and social factors [39]. Notably, Huerta et al. [75] found sex-specific effects of EA and academic performance on the odds of belonging to different alcohol consumption typologies (ranging from “Abstainer” to “Regular Heavy Drinker with Problems”). The absence of any association in males may be due to their inability to model binge drinking [75]; however, our results suggest otherwise. Additionally, the recent Clarke et al. [28] total weekly alcohol GWAS found sex-specific genetic correlation differences with an rg = 0.1 in men and 0.33 in women. Taken together, our findings suggest EA may partially account for some of these observed gender gaps in alcohol consumption, but not others. We should note that the only available sex-specific EA GWAS had significant overlap ( ≥18.9%) with the outcome datasets, so our exploratory sex-specific analysis used the same EA GWAS combining men and women. The lack of available sex-specific AD GWAS also meant we were unable to examine differences in AD risk. Notably, the sex-specific EA GWAS demonstrated nearly identical effect sizes between men and women, which support the validity of the estimates derived from the combined-sex EA GWAS, but future studies using sex-specific instruments are required.

Strengths and limitations

We note several strengths. We have analyzed multiple alcohol-related behavioral phenotypes, which support the consistency of our results. We have implemented multiple complementary MR methods (IVW, Egger, weighted median, and weighted mode MR) and diagnostics. Consistency of results across MR methods (accommodating different assumptions about genetic pleiotropy) strengthens our causal interpretation of the estimates [76]. We also used the largest publicly available GWASs for both exposure and outcome samples; large summary datasets are important for MR and other genetic analyses investigating small effect sizes [77]. We also note limitations and future directions. There is minimal sample overlap between the exposure SSGAC GWAS and the outcome PGC GWAS (AD), but there may still be individuals participating in multiple surveys, which we cannot ascertain with the available summary-level GWAS statistics. Further, the GWAS cohorts are from Anglophone countries, where beer is the preferred drink [78]; therefore, applicability to other countries with different alcohol preferences may be limited. Further still, it has been reported that the UKB sample is more educated, with healthier lifestyles and fewer health problems than the UK population [79], which may limit the generalizability to other populations. Replication of these findings using alcohol use information from different ethnicities is necessary. EA only measured years of completed schooling; determining how various aspects of education differentially impact alcohol consumption was not possible but should be a topic of future work. Finally, alcohol consumption is not stable over time [15]; however, the alcohol consumption outcomes correspond to current drinking behavior, which may have led to the misclassification of some individuals. Current drinking also impacts the temporal relationship in our bidirectional analyses, since current alcohol intake likely occurred after maximum educational attainment for most of the participants. Future GWAS that evaluate drinking behavior during adolescence, or other longitudinal studies, are necessary to confirm these findings and better elucidate the impact of alcohol intake on EA.
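[To make the IVW calculation named above concrete, here is a minimal, self-contained Python sketch. The summary statistics below are hypothetical placeholders, not the study's data, and the sketch implements only the basic fixed-effect IVW estimator with first-order weights, not the MR-Egger, weighted-median, or weighted-mode sensitivity analyses the authors also ran.]

import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    # Fixed-effect inverse-variance weighted (IVW) two-sample MR estimate.
    # beta_exp: per-SNP associations with the exposure (e.g., years of schooling)
    # beta_out: per-SNP associations with the outcome (e.g., an alcohol-use trait)
    # se_out:   standard errors of beta_out
    beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
    wald = beta_out / beta_exp               # per-SNP Wald ratio estimates
    weights = beta_exp**2 / se_out**2        # first-order inverse-variance weights
    estimate = np.sum(weights * wald) / np.sum(weights)
    std_err = 1.0 / np.sqrt(np.sum(weights))
    return estimate, std_err

# Hypothetical summary statistics for three SNPs (illustration only).
beta_exp = [0.021, 0.018, 0.025]       # SNP effects on educational attainment
beta_out = [-0.004, -0.003, -0.006]    # SNP effects on an alcohol-use outcome
se_out = [0.0015, 0.0012, 0.0018]

est, se = ivw_estimate(beta_exp, beta_out, se_out)
print(f"IVW estimate = {est:.3f} (SE = {se:.3f})")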