Thursday, September 26, 2019

Male juvenile rats and laughter: There was evidence that tickling showed rebound and emotional contagion effects

Relationships between play and responses to tickling in male juvenile rats. Tayla Hammond et al. Applied Animal Behaviour Science, September 25 2019, 104879. https://doi.org/10.1016/j.applanim.2019.104879

Highlights
•    Solitary but not social play increased prior to, and potentially in anticipation of, tickling sessions.
•    There were substantial differences between cohorts in their tickling responses and play behaviour.
•    Taking account of cohort, there was evidence that tickling showed rebound and emotional contagion effects.
•    Cohort effects may be explained by differences in physical condition prior to tickling.

Abstract: Play is a putatively positive experience and of key interest to the study of affective state in animals. Rats produce 50 kHz ultrasonic vocalisations (USVs) during positive experiences, including social play and tickling. The tickling paradigm is intended to mimic social play, resulting in positively valenced USV production. We tested two hypotheses on the relationship between tickling and play: that tickling would increase play behaviour or that play behaviour would increase in anticipation of tickling, and that tickling would share some specific properties of play (rebound and emotional contagion to unexposed cage mates). Male Wistar rats (N = 64, 32 rats/cohort), 28 days of age, were housed in pairs, with one rat assigned to be tickled and one as the non-tickled control. Production of 50 kHz USVs and hand-following behaviour were measured. Prior to handling, solitary and social play were recorded for 5 minutes in the home cage. A two-day break in tickling was used to assess a potential rebound increase in responses to tickling. Only one rat within each cage was handled, to assess emotional contagion through changes in the behaviour of the cage-mate. Solitary but not social play increased prior to tickling relative to controls (p = 0.01). There were marked differences between cohorts: tickled rats in cohort 2 (C2) produced fewer 50 kHz USVs than those in cohort 1 (C1) (p = 0.04), and overall C2 rats played less than C1 rats (social p = 0.04, solitary p < 0.001) and had a lighter start weight on arrival (p = 0.009). In C1, there was evidence of rebound in USV production (p < 0.001) and a contagious effect of tickling, reflected by increased hand-following in cage mates (p = 0.02). We found a positive relationship between start weight and USV responses to tickling (Rs = 0.43, p < 0.001), suggesting that the divergence in USV production may be due to developmental differences between cohorts. The results suggest that the relationship between tickling and play is complex, in that tickling affected only solitary and not social play, and that tickling responses showed rebound and contagion effects on cage-mates that were specific to cohort responses to tickling.

People in creative occupations and the entertainment industry – artists (both genders), musicians (males) and actors (males) – were at increased risk of suicide

Occupation-specific suicide risk in England: 2011–2015. Ben Windsor-Shellard and David Gunnell. The British Journal of Psychiatry, Volume 215, Issue 4, October 2019 , pp. 594-599. https://doi.org/10.1192/bjp.2019.69

Abstract
Background: Previous research has documented marked occupational differences in suicide risk, but these estimates are 10 years old and based on potentially biased risk assessments.

Aims: To investigate occupation-specific suicide mortality in England, 2011–2015.

Method: Estimation of indirectly standardised mortality rates for occupations/occupational groups based on national data.

Results: Among males the highest risks were seen in low-skilled occupations, particularly construction workers (standardised mortality ratio [SMR] 369, 95% CI 333–409); low-skilled workers comprised 17% (1784/10 688) of all male suicides (SMR 144, 95% CI 137–151). High risks were also seen among skilled trade occupations (SMR 135, 95% CI 130–139; 29% of male suicides). There was no evidence of increased risk among some occupations previously causing concern: male healthcare professionals and farmers. Among females the highest risks were seen in artists (SMR 399, 95% CI 244–616) and bar staff (SMR 182, 95% CI 123–260); nurses also had an increased risk (SMR 123, 95% CI 104–145). People in creative occupations and the entertainment industry – artists (both genders), musicians (males) and actors (males) – were at increased risk, although the absolute numbers of deaths in these occupations were low. In males (SMR 192, 95% CI 165–221) and females (SMR 170, 95% CI 149–194), care workers were at increased risk and had a considerable number of suicide deaths.

Conclusions: Specific contributors to suicide in high-risk occupations should be identified and measures – such as workplace-based interventions – put in place to mitigate this risk. The construction industry seems to be an important target for preventive interventions.
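
For readers who want to see how the SMRs quoted above are constructed, here is a minimal sketch (Python, not the authors' code) of indirect standardisation: observed deaths in an occupation are compared with the deaths expected if that occupation experienced the reference population's age-specific rates, with a 95% CI from Byar's approximation for the observed Poisson count. The expected-death figure used below is back-calculated purely for illustration and does not come from the paper.

from math import sqrt

def expected_deaths(reference_rates, person_years):
    # reference_rates: age-specific suicide rates in the reference population (per person-year)
    # person_years: person-years at risk in the same age bands for the occupation of interest
    return sum(r * py for r, py in zip(reference_rates, person_years))

def smr_with_ci(observed, expected, z=1.96):
    # SMR as a percentage (100 = same risk as the reference population),
    # with an approximate 95% CI via Byar's method for the observed count
    o_low = observed * (1 - 1/(9*observed) - z/(3*sqrt(observed)))**3
    o_up = (observed + 1) * (1 - 1/(9*(observed + 1)) + z/(3*sqrt(observed + 1)))**3
    return 100*observed/expected, 100*o_low/expected, 100*o_up/expected

# Illustration only: 1784 observed male suicides among low-skilled workers against a
# hypothetical 1239 expected deaths reproduces roughly SMR 144 (95% CI 137-151).
print(smr_with_ci(1784, 1239))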

Internet use resulted in better answers, but also in significant and persistent overestimation of information problem-solving ability and performance, even in more accurate postdictive metacognitive judgments

Will using the Internet to answer knowledge questions increase users’ overestimation of their own ability or performance? Stephanie Pieschl. Media Psychology, Sep 24 2019. https://doi.org/10.1080/15213269.2019.1668810

Abstract: Using the Internet is ubiquitous, but not all of the consequences of this habitual technology use are known. Theoretical models and related research suggest that the act of searching for information on the Internet itself may bias users toward overestimating themselves. The current study is the first direct empirical test of this assumption. In a two-by-two design, n = 184 participants were randomly assigned to between-subject Internet or NoInternet conditions in each of two phases: In the Induction Phase 1, participants made predictive metacognitive judgments about their ability to answer a first set of explanatory knowledge questions. In the Target Phase 2, they made predictive and postdictive metacognitive judgments about their ability to answer a second, entirely unrelated set of explanatory knowledge questions and answered two of these questions. Results show that Internet use affected tasks only in the same phase, but not in subsequent unrelated phases. Internet use resulted in better answers, but also in significant and persistent overestimation of information problem-solving ability and performance, even in more accurate postdictive metacognitive judgments. Potential consequences of this side effect of Internet use are discussed, such as premature termination of information problem-solving and suboptimal performance.

Check also: Illusion of Knowledge through Facebook News? Effects of Snack News in a News Feed on Perceived Knowledge, Attitude Strength, and Willingness for Discussions. Svenja Schäfer. Computers in Human Behavior, September 4 2019. https://www.bipartisanalliance.com/2019/09/illusion-of-knowledge-through-facebook.html

Laypeople Can Predict Which Social Science Studies Replicate

Hoogeveen, Suzanne, Alexandra Sarafoglou, and Eric-Jan Wagenmakers. 2019. “Laypeople Can Predict Which Social Science Studies Replicate.” PsyArXiv. September 25. doi:10.31234/osf.io/egw9d

Abstract: Large-scale collaborative projects recently demonstrated that several key findings from the social science literature could not be replicated successfully. Here we assess the extent to which a finding’s replication success relates to its intuitive plausibility. Each of 27 high-profile social science findings was evaluated by 233 people without a PhD in psychology. Results showed that these laypeople predicted replication success with above-chance performance (i.e., 58%). In addition, when laypeople were informed about the strength of evidence from the original studies, this boosted their prediction performance to 67%. We discuss the prediction patterns and apply signal detection theory to disentangle detection ability from response bias. Our study suggests that laypeople’s predictions contain useful information for assessing the probability that a given finding will replicate successfully.
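
The signal detection step mentioned in the abstract separates how well laypeople discriminate replicable from non-replicable findings (d') from their overall tendency to answer "will replicate" (criterion). A minimal sketch, with invented hit and false-alarm rates rather than the study's data:

from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    # "hit" = predicting replication for a study that did replicate;
    # "false alarm" = predicting replication for a study that did not
    z = NormalDist().inv_cdf                       # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)             # detection ability
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias (c > 0 = conservative)
    return d_prime, criterion

print(dprime_and_criterion(0.70, 0.45))            # hypothetical rates for illustration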

Wednesday, September 25, 2019

Just 4% of participants appeared to use prior research to make probability estimates—most seemed to focus on the latest study, ignoring/discounting prior ones, even when they had more statistics classes

Is One Study as Good as Three? College Graduates Seem to Think So, Even if They Took Statistics Classes. W. Burt Thompson et al. Psychology Learning & Teaching, September 25, 2019. https://doi.org/10.1177/1475725719877590

Abstract: When people interpret the outcome of a research study, do they consider other relevant information such as prior research? In the current study, 251 college graduates read a single brief fictitious news article. The article summarized the findings of a study that found positive results for a new drug. Three versions of the article varied the amount and type of previous research: (a) two prior studies that found the drug did not work, (b) no prior studies of the drug, or (c) two prior studies that found the drug had a positive effect. After reading the article, participants estimated the probability the drug is effective. Average estimates were similar for the three articles, even for participants who reported more statistics experience. Overall, just 4% of participants appeared to use prior research to make probability estimates—most seemed to focus on the latest study, while ignoring or discounting prior studies. Implications for statistics education and reporting are discussed.

Keywords: Statistics education, statistics misconception, base rate neglect
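
The normative point behind the abstract is that prior studies should move the probability estimate, not just the latest result. A minimal Bayesian sketch (hypothetical sensitivity/specificity values, not the article's materials) of how two earlier null studies should temper one new positive study:

def update(prior, positive_result, p_pos_if_effective=0.8, p_pos_if_ineffective=0.2):
    # Treat each study as an independent, noisy binary signal about effectiveness
    like_eff = p_pos_if_effective if positive_result else 1 - p_pos_if_effective
    like_ineff = p_pos_if_ineffective if positive_result else 1 - p_pos_if_ineffective
    return like_eff * prior / (like_eff * prior + like_ineff * (1 - prior))

p = 0.5                               # agnostic starting point
for outcome in [False, False, True]:  # two prior null studies, then the new positive one
    p = update(p, outcome)
print(round(p, 2))   # ~0.20, far below the ~0.80 implied by the positive study alone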

Check also: Political partisans disagreed about the importance of conditional probabilities; highly numerate partisans were more polarized than less numerate partisans
It depends: Partisan evaluation of conditional probability importance. Leaf Van Boven et al. Cognition, Mar 2 2019, https://www.bipartisanalliance.com/2019/03/political-partisans-disagreed-about.html

And: Biased Policy Professionals. Sheheryar Banuri, Stefan Dercon, and Varun Gauri. World Bank Policy Research Working Paper 8113. https://www.bipartisanalliance.com/2017/08/biased-policy-professionals-world-bank.html

And: Dispelling the Myth: Training in Education or Neuroscience Decreases but Does Not Eliminate Beliefs in Neuromyths. Kelly Macdonald et al. Frontiers in Psychology, Aug 10 2017. https://www.bipartisanalliance.com/2017/08/training-in-education-or-neuroscience.html

And: Wisdom and how to cultivate it: Review of emerging evidence for a constructivist model of wise thinking. Igor Grossmann. European Psychologist, in press. Pre-print: https://www.bipartisanalliance.com/2017/08/wisdom-and-how-to-cultivate-it-review.html

And: Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Caitlin Drummond and Baruch Fischhoff. Proceedings of the National Academy of Sciences, vol. 114 no. 36, pp 9587–9592, https://www.bipartisanalliance.com/2017/09/individuals-with-greater-science.html

And: Expert ability can actually impair the accuracy of expert perception when judging others' performance: Adaptation and fallibility in experts' judgments of novice performers. By Larson, J. S., & Billeter, D. M. (2017). Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(2), 271–288. https://www.bipartisanalliance.com/2017/06/expert-ability-can-actually-impair.html

And: Collective Intelligence for Clinical Diagnosis—Are 2 (or 3) Heads Better Than 1? Stephan D. Fihn. JAMA Network Open. 2019;2(3):e191071, https://www.bipartisanalliance.com/2019/03/one-conclusion-that-can-be-drawn-from.html

Quashing the hopes of personalized antidepressants: "Our findings did not provide empirical support for individual differences in response to antidepressants"

Munkholm, Klaus, Stephanie Winkelbeiner, and Philipp Homan. 2019. “Individual Response to Antidepressants for Depression in Adults – a Simulation Study and Meta-analysis.” PsyArXiv. September 25. doi:10.31234/osf.io/m4aqc

Abstract
Background. The observation that some patients appear to respond better to antidepressants for depression than others encourages the assumption that the effect of antidepressants differs between individuals and that treatment can be personalized. To test this assumption, we compared the outcome variance in the group of patients receiving antidepressants with the outcome variance of the group of patients receiving placebo in randomized controlled trials (RCTs) of adults with major depressive disorder (MDD). An increased variance in the antidepressant group would indicate individual differences in response to antidepressants. In addition, we illustrate in a simulation study why attempts to personalize antidepressant treatment using RCTs might be misguided.

Methods. We first illustrated the variance components of trials by simulating RCTs and crossover trials of antidepressants versus placebo. Second, we analyzed data of a large meta-analysis of antidepressants for depression, including a total of 222 placebo-controlled studies from the dataset that reported outcomes on the 17- or 21-item Hamilton Depression Rating Scale or the Montgomery-Åsberg Depression Rating Scale. We performed inverse variance, random-effects meta-analyses of the variability ratio (VR) between the antidepressant and placebo groups.

Outcomes. The meta-analyses of the VR comprised 345 comparisons of 19 different antidepressants with placebo in a total of 61,144 adults with an MDD diagnosis. Across all comparisons, we found no evidence for a larger variance in the antidepressant group compared with placebo overall (VR = 1.00, 95% CI: 0.98; 1.01, I² = 0%) or for any individual antidepressant.

Interpretation. Our findings did not provide empirical support for individual differences in response to antidepressants.
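
A minimal sketch of the variability-ratio analysis described in the Methods, using the standard log variability ratio (lnVR) and DerSimonian-Laird random-effects pooling; the three drug/placebo comparisons below are invented, not drawn from the meta-analysis:

import math

def ln_vr(sd_t, n_t, sd_c, n_c):
    # log variability ratio with small-sample correction, plus its sampling variance
    y = math.log(sd_t / sd_c) + 1/(2*(n_t - 1)) - 1/(2*(n_c - 1))
    v = 1/(2*(n_t - 1)) + 1/(2*(n_c - 1))
    return y, v

def random_effects(effects):
    # DerSimonian-Laird pooling of (estimate, variance) pairs
    w = [1/v for _, v in effects]
    y_fixed = sum(wi*yi for wi, (yi, _) in zip(w, effects)) / sum(w)
    q = sum(wi*(yi - y_fixed)**2 for wi, (yi, _) in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w)/sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_star = [1/(v + tau2) for _, v in effects]
    pooled = sum(wi*yi for wi, (yi, _) in zip(w_star, effects)) / sum(w_star)
    return pooled, math.sqrt(1/sum(w_star))

studies = [ln_vr(8.1, 150, 8.0, 148), ln_vr(7.6, 90, 7.9, 92), ln_vr(9.2, 210, 9.0, 205)]
pooled, se = random_effects(studies)
# Pooled VR and its 95% CI; VR = 1 means equal outcome variance in drug and placebo arms
print(math.exp(pooled), math.exp(pooled - 1.96*se), math.exp(pooled + 1.96*se))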

Adolescent drinking has declined across many developed countries from the turn of the century; we aim to explore existing evidence examining possible reasons for this decline; the main reason could be shifts in parental practices

Why is adolescent drinking declining? A systematic review and narrative synthesis. Rakhi Vashishtha et al. Addiction Research & Theory, Sep 23 2019. https://doi.org/10.1080/16066359.2019.1663831

Abstract
Background: Adolescent drinking has declined across many developed countries from the turn of the century. The aim of this review is to explore existing evidence examining possible reasons for this decline.

Methods: We conducted systematic searches across five databases: Medline, PsycINFO, CINAHL, Informit Health and Scopus. Studies were included if associations between declining alcohol consumption and potential explanatory factors were measured over time. Narrative synthesis was undertaken due to substantial methodological heterogeneity in these studies.

Results: 17 studies met the inclusion criteria. Five studies found moderate evidence for changes in parental practices as a potential cause for the decline. Five studies that examined whether alcohol policy changes influenced the decline found weak evidence of association. Three studies explored whether alcohol use has been substituted by illicit substances, but no evidence was found. Two studies examined the effect of a weaker economy; both identified an increase in adolescent alcohol use during times of economic crisis. One study indicated that changes in exposure to alcohol advertising were positively associated with the decline, and another examined the role of immigration of non-drinking populations but found no evidence of association. One study tested participation in organised sports and party lifestyle as a potential cause but did not use robust analytical methods and therefore did not provide strong evidence of association for the decline.

Conclusions: The most robust and consistent evidence was identified for shifts in parental practices. Further research is required using robust analytical methods such as ARIMA modelling techniques and utilising cross-national data.

Keywords: Drinking, review, decline, downward trend, adolescents
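
The conclusion calls for ARIMA-type analyses; a minimal interrupted-time-series sketch of that idea follows (invented annual prevalence figures and a hypothetical policy indicator, not data from the review):

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

years = np.arange(1998, 2018)
prevalence = np.array([61, 60, 62, 59, 58, 57, 55, 52, 50, 47,
                       45, 42, 40, 38, 36, 35, 33, 32, 31, 30], dtype=float)
policy = (years >= 2008).astype(float)    # hypothetical policy change in 2008

# Regression with ARIMA(1,1,0) errors: the exog coefficient indexes the level shift
# associated with the candidate explanation while modelling trend and autocorrelation
model = ARIMA(prevalence, exog=policy, order=(1, 1, 0))
print(model.fit().summary())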

Hedonic responses to music are the result of connectivity between structures involved in auditory perception as a predictive process, & those involved in the brain's dopaminergic reward system


Musical anhedonia and rewards of music listening: current advances and a proposed model. Amy M. Belfi, Psyche Loui. Annals of the New York Academy of Sciences, September 23 2019. https://doi.org/10.1111/nyas.14241

Abstract: Music frequently elicits intense emotional responses, a phenomenon that has been scrutinized from multiple disciplines that span the sciences and arts. While most people enjoy music and find it rewarding, there is substantial individual variability in the experience and degree of music‐induced reward. Here, we review current work on the neural substrates of hedonic responses to music. In particular, we focus the present review on specific musical anhedonia, a selective lack of pleasure from music. Based on evidence from neuroimaging, neuropsychology, and brain stimulation studies, we derive a neuroanatomical model of the experience of pleasure during music listening. Our model posits that hedonic responses to music are the result of connectivity between structures involved in auditory perception as a predictive process, and those involved in the brain's dopaminergic reward system. We conclude with open questions and implications of this model for future research on why humans appreciate music.

Introduction

The capacity to perceive, produce, and appreciate music, together termed musicality,1 has been a growing topic of interest in the past 20 years of cognitive neuroscience. While most cognitive neuroscience studies on musicality focus on music perception and production skills, there has been a recent explosion of interest in the appreciation of music.2, 3 Multiple research programs in the cognitive neuroscience of music have involved comparing participants with different types and levels of musical training.4-7 However, to cognitive neuroscientists who are not particularly concerned with music, these studies may appear to be highly specialized and of limited interest as they seem to focus on a special population—highly trained musicians. In contrast, studies on the appreciation of music can be thought of as more general and inclusive, encompassing the vast majority of humans regardless of formal musical training.

Humans show knowledge of fundamental musical building blocks, such as rhythm and beat, from as early as 1 day old,8 and as shown from the success of the multibillion‐dollar music industry, humans around the world enjoy music. One of the most frequently reported reasons for listening to music is the overwhelming influence it has on feelings and emotions.9 Music has been deemed an ultimate group bonding activity;10, 11 this is supported by structural features of melody, harmony, and scales that are observed across many cultures,12 as well as the ubiquity of songs that serve social functions, such as lullabies, dance songs, healing songs, and love songs, across cultures.13 Singing and making music together enhance social interactions and group bonding14, 15 and elicit physiological effects that are observable from infancy.16 Even in the few cultures where music is not produced in groups, members of these cultures nevertheless enjoy singing for each other,17 suggesting that the capacity for music enjoyment, that is, the rewarding aspects of music, may be intrinsic to humans as a social species. Together, these lines of research suggest that understanding why humans love music may offer a window into how humans interact in a social environment.

The rapid growth of research on musical enjoyment, specifically in cognitive neuroscience, may also be facilitated in part by recent findings on the role of dopamine in coding for prediction and reward. Since the classic observations that stimulating dopaminergic neurons elicits motivated behavior,18 and that dopaminergic neurons signal changes in the predictability of rewards,19 thousands of studies have identified a set of regions within the human brain that are especially sensitive to reward. These regions center around the midbrain (the ventral tegmental area and substantia nigra (SN)), the dorsal and ventral striatum (VS) (the caudate, putamen, nucleus accumbens (NAcc), and globus pallidus), and the medial prefrontal cortex (mPFC).20 These areas, which we refer to throughout this article as the reward system, are reliably activated during the experience of unconditionally rewarding, evolutionarily salient stimuli, such as food and sex, as well as stimuli that are strongly associated with such rewards, such as money. Findings from the monetary incentive delay task show that cues that predict monetary rewards reliably activate the striatum and mPFC,21 core areas of the reward network. Interestingly, activity in the striatum is also observed when social stimuli (faces) are substituted for monetary rewards,22, 23 suggesting that social and monetary cues tap into “common neural currency” of the reward system.22

The findings that food and sex activate the same reward system can be readily explained as being evolutionarily adaptive: being motivated to seek out these stimuli improves our chances of survival. In contrast, the adaptive value of music—and aesthetic stimuli more generally—is less obvious. Nevertheless, much recent work has shown that music engages the reward system (as reviewed below; see also Ref. 24). While music ranks highly among the pleasures in life,25, 26 recent work has identified a unique condition of people with specific musical anhedonia:27, 28 people who are insensitive to the rewarding aspects of music despite normal hedonic responses to other sensory and aesthetic stimuli, and normal auditory perceptual abilities.29 The existence of this unique population raises many important questions. Some of these questions include:

•    The nature versus nurture of musical reward sensitivity: Does musical anhedonia run in families? When and how did it develop? What, if any, genetic underpinnings might predispose an individual toward musical anhedonia? What is its developmental trajectory?

•    Domain‐specificity versus domain‐generality of reward sensitivity: Are there specific neural pathways for music reward that are separate from general reward? What are the neural pathways through which specific stimuli (such as music) come to have privileged access to the reward system? What endows a certain stimulus with privileged access to the reward system?

•    Psychological associations and clinical comorbidity: What are the associations between musical anhedonia and psychological traits, both in the normal range (e.g., big five personality traits) and clinical populations? What is the comorbidity between musical anhedonia and personality, mood, and communication disorders?

•    The evolution of music: To what extent do nonhuman animals also show reward sensitivity to music? Have there been people with musical anhedonia for as long as there has been music? By extension, if people with musical anhedonia have survived for generations with no apparent disadvantage alongside the rest of the population who have normal reward sensitivity to music, then the lack of reward sensitivity to music seems not to affect their survival. If this is the case, then why do we seek out music?

Here, we review the recent cognitive neuroscience evidence for musical engagement of the reward system, as well as an extreme end of the spectrum of individual differences in sensitivity to music reward in specific musical anhedonia. Based on our review of the literature, we propose a model that accounts for the nature of the auditory access to the human reward system, and its disruption in musical anhedonia.

The effect of close elections on the life expectancy of politicians: Winners outlive losers by over a year, on average

Run for your life? The effect of close elections on the life expectancy of politicians. Mark Borgschulte, Jacob Vogler. Journal of Economic Behavior & Organization, September 24 2019. https://doi.org/10.1016/j.jebo.2019.09.003

Highlights
•    Examine effect of winning or losing a close election on the life expectancy of candidates.
•    Regression discontinuity design estimated using newly-collected data on winning and losing candidates for governor, senator, and representative in the United States.
•    Winners outlive losers by over a year, on average.
•    Largest effects for governors and candidates who run later in US history.
•    No discernible effect of stress on life expectancy.

Abstract: We estimate the causal effect of election to political office on natural lifespan using a regression discontinuity design and a novel dataset of winning and losing candidates for US governor, senator, and House representative. We find that candidates gain over a year of life from winning a close election. The effect is strongest for governors, and has grown larger over the course of US history. We also examine the effect of stress experienced in office, finding that serving in more challenging situations is not associated with reduced lifespan.
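
A minimal sketch of the regression discontinuity logic used in the paper, on simulated data: lifespan after the election is regressed on a win indicator, the vote-share margin, and their interaction within a narrow bandwidth around the 50% cutoff. The bandwidth, functional form, and data here are illustrative assumptions, not the authors' specification.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
margin = rng.uniform(-0.20, 0.20, 2000)          # candidate's vote-share margin
won = (margin > 0).astype(float)
years_lived = 20 + 5*margin + 1.0*won + rng.normal(0, 6, 2000)   # built-in jump of ~1 year

bw = 0.05                                        # keep only close elections
close = np.abs(margin) < bw
X = sm.add_constant(np.column_stack([won[close], margin[close], won[close]*margin[close]]))
fit = sm.OLS(years_lived[close], X).fit()
print(fit.params[1], fit.bse[1])                 # estimated discontinuity at the cutoff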


Tuesday, September 24, 2019

German middle-aged men having ≥2 children, higher frequency of solo-masturbation, perceived importance of sexuality, & higher sexual self-esteem were less likely to have low sexual desire

Meissner VH, Schroeter L, Köhn F-M, et al. Factors Associated with Low Sexual Desire in 45-Year-Old Men: Findings from the German Male Sex-Study. J Sex Med Volume 16, Issue 7, July 2019, Pages 981-991. https://www.sciencedirect.com/science/article/pii/S1743609519311622

Abstract
Introduction: Although low sexual desire is one of the most common sexual dysfunctions in men, there is a lack of studies investigating associated factors in large, population-based samples of middle-aged men.

Aim: To survey the prevalence of low sexual desire in a population-based sample of 45-year-old German men and to evaluate associations with a broad set of factors.

Methods: Data were collected between April 2014 and April 2016 within the German Male Sex-Study. Participants were asked to fill out questionnaires about 6 sociodemographic, 5 lifestyle, and 8 psychosocial factors, as well as 6 comorbidities and 4 factors of sexual behavior. Simple and multiple logistic regressions were used to assess potential explanatory factors.

Main Outcome Measures: We found a notable prevalence of low sexual desire in middle-aged men and detected associations with various factors.

Results: 12,646 men were included in the analysis, and the prevalence of low sexual desire was 4.7%. In the multiple logistic regression with backward elimination, 8 of 29 factors were left in the final model. Men having ≥2 children, higher frequency of solo-masturbation, perceived importance of sexuality, and higher sexual self-esteem were less likely to have low sexual desire. Premature ejaculation, erectile dysfunction, and lower urinary tract symptoms were associated with low sexual desire.

Clinical Implications: Low sexual desire is common in middle-aged men, and associated factors that can potentially be modified should be considered during assessment and treatment of sexual desire disorders.

Strengths & Limitations: The strength of our study is the large, population-based sample of middle-aged men and the broad set of assessed factors. However, because the study was part of a prostate cancer screening trial, a recruiting bias is arguable.

Conclusion: Our study revealed that low sexual desire among 45-year-old men is a common sexual dysfunction, with a prevalence of nearly 5%, and might be affected by various factors, including sociodemographic and lifestyle factors, as well as comorbidities and sexual behavior.

Key Words: Sexual Desire, Sexual Dysfunction, Sexual Behavior, Lifestyle, Comorbidity, Representative Sample
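
A minimal sketch of the backward-elimination multiple logistic regression named in the Methods: fit the full model, repeatedly drop the predictor with the largest p-value, and stop once all remaining predictors clear the threshold. Variable names and data below are invented stand-ins, not the study's dataset.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "two_plus_children": rng.integers(0, 2, n),
    "masturbation_freq": rng.integers(0, 5, n),
    "sexual_self_esteem": rng.normal(0, 1, n),
    "erectile_dysfunction": rng.integers(0, 2, n),
})
logit = -3 + 0.8*df["erectile_dysfunction"] - 0.4*df["sexual_self_esteem"]
df["low_desire"] = (rng.random(n) < 1/(1 + np.exp(-logit))).astype(float)

predictors = list(df.columns[:-1])
while predictors:
    model = sm.Logit(df["low_desire"], sm.add_constant(df[predictors])).fit(disp=0)
    worst = model.pvalues.drop("const").idxmax()
    if model.pvalues[worst] < 0.05:
        break
    predictors.remove(worst)              # eliminate the weakest predictor and refit
print(model.summary())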

Theories link threat with right-wing political beliefs; our findings show that political beliefs and perceptions of threat are linked, but that the relationship is nuanced

Brandt, Mark J., Felicity M. Turner-Zwinkels, Beste Karapirinler, Florian van Leeuwen, Michael Bender, Yvette van Osch, and Byron G. Adams. 2019. “The Association Between Threat and Politics Depends on the Type of Threat, the Political Domain, and the Country.” PsyArXiv. September 24. doi:10.31234/osf.io/e9uk7

Abstract: Theories link threat with right-wing political beliefs. We use the World Values Survey (72,836 participants) to test how different types of threat (economic, violence, and surveillance) are associated with different types of political beliefs (social, economic, and political identification) across 42 different countries. The association between threat and political beliefs depends on the type of threat, the type of political beliefs, and the country. Economic threats tended to be associated with more left-wing economic beliefs, violence threats tended to be associated with more general right-wing beliefs, and surveillance threats tended to be associated with more right-wing economic beliefs and more left-wing social beliefs. Additional analyses explored how 24 country characteristics might help explain variation in the threat-political beliefs association; however, these analyses identified few cross-country characteristics that consistently helped. Our findings show that political beliefs and perceptions of threat are linked, but that the relationship is nuanced.
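
A minimal sketch of the kind of random-slope multilevel model such a cross-country analysis implies, letting the threat-belief association vary by country; the variable names and simulated data are placeholders, not the authors' World Values Survey specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n, n_countries = 4000, 40
data = pd.DataFrame({
    "country": rng.integers(0, n_countries, n),
    "econ_threat": rng.normal(0, 1, n),
})
country_slope = rng.normal(-0.2, 0.15, n_countries)    # association varies by country
data["econ_beliefs"] = country_slope[data["country"]] * data["econ_threat"] + rng.normal(0, 1, n)

# Fixed effect = average threat-belief association; random part = country-level variation in it
model = smf.mixedlm("econ_beliefs ~ econ_threat", data,
                    groups=data["country"], re_formula="~econ_threat").fit()
print(model.summary())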


There is no number sense as traditionally conceived; neural substrates of number sense are more widely distributed than common consensus says, complicating the neurobiological evidence linking number sense to numerical abilities

Challenging the neurobiological link between number sense and symbolic numerical abilities. Eric D. Wilkey, Daniel Ansari. Annals of the New York Academy of Sciences, September 23 2019. https://doi.org/10.1111/nyas.14225

Abstract: A significant body of research links individual differences in symbolic numerical abilities, such as arithmetic, to number sense, the neurobiological system used to approximate and manipulate quantities without language or symbols. However, recent findings from cognitive neuroscience challenge this influential theory. Our current review presents an overview of evidence for the number sense account of symbolic numerical abilities and then reviews recent studies that challenge this account, organized around the following four assertions. (1) There is no number sense as traditionally conceived. (2) Neural substrates of number sense are more widely distributed than common consensus asserts, complicating the neurobiological evidence linking number sense to numerical abilities. (3) The most common measures of number sense are confounded by other cognitive demands, which drive key correlations. (4) Number sense and symbolic number systems (Arabic digits, number words, and so on) rely on distinct neural mechanisms and follow independent developmental trajectories. The review follows each assertion with comments on future directions that may bring resolution to these issues.

How Stress Affects Performance and Competitiveness Across Gender

How Stress Affects Performance and Competitiveness Across Gender. Jana Cahlíková, Lubomír Cingl, Ian Levely. Management Science, Jul 16 2019. https://doi.org/10.1287/mnsc.2019.3400

Abstract: Because many key career events, such as examinations and interviews, involve competition and stress, gender differences in response to these factors could help to explain the labor market gender gap. In a laboratory experiment, we manipulate psychosocial stress using the Trier Social Stress Test and confirm that this is effective by measuring salivary cortisol level and heart rate. Subjects perform in a real-effort task under both tournament and piece-rate incentives, and we elicit willingness to compete. We find that women under heightened stress perform worse than women in the control group when compensated with tournament incentives, whereas there is no treatment difference under piece-rate incentives. For men, stress does not affect output under competition or under piece rate. The gender gap in willingness to compete is not affected by stress, but stress decreases competitiveness overall, which is related to performance for women. Our results could explain gender differences in performance under competition, with implications for hiring practices and incentive structures in firms.

Disagreeable men produce higher-quality ejaculates: A Preliminary but Methodologically Improved Investigation of the Relationships Between Major Personality Dimensions and Human Ejaculate Quality

A Preliminary but Methodologically Improved Investigation of the Relationships Between Major Personality Dimensions and Human Ejaculate Quality. Tara DeLecce et al. [in press, Personality and Individual Differences, September 2019]. toddkshackelford.com/downloads/Delecce-et-al-PAID.pdf

Abstract: Some research has reported relationships between personality dimensions and ejaculate quality, but this research has methodological limitations. In the current study, we investigated the relationships between six major personality dimensions and ejaculate quality in a design that offered several methodological improvements over previous research. Forty-five fertile men provided two masturbatory ejaculates and completed a measure of personality (HEXACO-60) assessing honesty-humility, emotionality, extraversion, agreeableness, conscientiousness, and openness to experience. Agreeableness was the only personality dimension associated with ejaculate quality, after controlling statistically for participant age, Body Mass Index, and abstinence duration, and this association was negative. However, once the covariates of BMI, age, and abstinence duration were included in a hierarchical regression (along with the six personality dimensions), agreeableness was no longer a statistically significant predictor of ejaculate quality, although the direction of the relationship remained negative. The current study adds to previous research documenting that psychological attributes—including major dimensions of personality—may be associated with ejaculate quality. We highlight limitations of the current research and identify directions for future study.

Keywords: personality; agreeableness; ejaculate quality; semen analysis; HEXACO

Greta Thunberg's zeal, as the press summarized her speech at the UN Climate Summit, Sep 23, 2019 — updated Jan 2020

Greta Thunberg's character, as the press summarized her speech at the UN Climate Summit, Sep 23, 2019:
"How dare you. You have stolen my dreams and my childhood with your empty words."

"This is all wrong. I shouldn’t be up here. I should be back in school on the other side of the ocean. Yet you all come to us young people for hope. How dare you."

"People are suffering. People are dying and dying ecosystems are collapsing. We are in the beginning of a mass extinction, and all you can talk about is the money and fairy tales of eternal economic growth," she said Monday, as she fought back tears. "How dare you! For more than 30 years the science has been crystal clear."

"You are failing us but young people are starting to understand your betrayal."
“The eyes of all future generations are upon you and if you choose to fail us, I say we will never forgive you.”

At the World Economic Forum annual meeting in Davos, Jan 2020:
“Our house is still on fire. Your inaction is fueling the flames by the hour, and we are telling you to act as if you loved your children above all else.”

Greetings in wild chimpanzees: Signals of submission (greetings are initiated by the lower-ranking individual)

Social relationships and greetings in wild chimpanzees (Pan troglodytes): use of signal combinations. Eva Maria Luef, Simone Pika. Primates, September 24 2019. https://link.springer.com/article/10.1007/s10329-019-00758-5

Abstract: Signals of submission, so-called ‘greetings’, represent an important tool for the regulation of social life in primates. In chimpanzees, vocalizations and gestures are commonly employed to communicate greetings; however, the topic of signal complexity (i.e., combinations of signals) during greeting instances has been neglected by research to date. Here, we investigate combinatorial possibilities in vocal greetings in a free-ranging group of chimpanzees (Pan troglodytes) and study how greeter sex, rank relationship between an interacting pair, and strength of the social bond of a greeting dyad influence signal complexity. Results show that the social bond and the dominance distance between individuals engaged in a greeting bout are important determiners for vocal combinations. The findings indicate that greeting signals in chimpanzees, like other vocal signals of the species, can become subject to social influences.

Keywords: Chimpanzees, Pan troglodytes, Ngogo, Greeting, Pant-grunt, Combinations, Repetitions

While praise from a manager has no effect, criticism negatively impacts workers' job satisfaction & perception of the task's importance; when female managers deliver this feedback, the negative effects double for both male & female workers

Do Workers Discriminate against Female Bosses? Martin Abel. IZA Discussion Papers No. 12611, September 2019. https://www.iza.org/publications/dp/12611/do-workers-discriminate-against-female-bosses

Abstract: I hire 2,700 workers for a transcription job, randomly assigning the gender of their (fictitious) manager and provision of performance feedback. While praise from a manager has no effect, criticism negatively impacts workers' job satisfaction and perception of the task's importance. When female managers, rather than male, deliver this feedback, the negative effects double in magnitude. Having a critical female manager does not affect effort provision but it does lower workers' interest in working for the firm in the future. These findings hold for both female and male workers. I show that results are consistent with gendered expectations of feedback among workers. By contrast, I find no evidence for the role of either attention discrimination or implicit gender bias.

Keywords: gender discrimination, gig economy, female leadership



Rolf Degen summarizing: Body and facial attractiveness were more important to men, whereas personality attractiveness was more important to women in real-life dating interactions

Sidari M, Lee A, Murphy S, Sherlock J, Dixson B & Zietsch B (2019) Preferences for sexually dimorphic body characteristics revealed in a large sample of speed daters. Social Psychological and Personality Science, forthcoming. https://dspace.stir.ac.uk/handle/1893/30130

Abstract: While hundreds of studies have investigated the indices that make up attractive body shapes, these studies were based on preferences measured in the lab using pictorial stimuli. Whether these preferences translate into real-time, face-to-face evaluations of potential partners is unclear. Here 539 (275 female) participants in 75 lab-based sessions had their body dimensions measured before engaging in round-robin speed dates. After each date they rated each other’s body, face, personality, and overall attractiveness, and noted whether they would go on a date with the partner. Women with smaller waists and lower waist-to-hip ratios were found most attractive, and men with broader shoulders and higher shoulder-to-waist (or hips) ratios were found most attractive. Taller individuals were preferred by both sexes. Our results show that body dimensions associated with greater health, fertility, and (in men) formidability influence face-to-face evaluations of attractiveness, consistent with a role of intersexual selection in shaping human bodies.

While journalists may indeed be biased toward telling certain types of stories, audience judgements may be biased as well: Rival partisans thought media attention was unfair to their views

Biased Gatekeepers? Partisan Perceptions of Media Attention in the 2016 U.S. Presidential Election. Mallory R. Perryman. Journalism Studies, Mar 27 2019. https://doi.org/10.1080/1461670X.2019.1598888

ABSTRACT: Deciding which stories to cover is an essential function of the press, and pundits and citizens commonly criticize journalists for these so-called “gatekeeping” choices. While journalists may indeed be biased toward telling certain types of stories, research on the hostile media perception (HMP) suggests that audience judgments about how journalists divvy up attention may be biased as well, shaped at least in part by partisan preferences. This study explores how partisanship impacted perceptions of media coverage among news consumers (N = 657) shortly before the 2016 U.S. presidential election. Results show that, across a variety of news stories involving the candidates, polling, and key election issues, rival partisans had diverging impressions of media attention that were not explained by differing news habits. A relative HMP pattern is evident when partisans evaluate how media allocate attention across news topics.

KEYWORDS: Audience perceptions, election news, gatekeeping, hostile media perception, partisanship, perceived bias, U.S. elections

Chimpanzees do not spontaneously share the spoils with cooperators; the cooperators need to beg to get their reward

How chimpanzees (Pan troglodytes) share the spoils with collaborators and bystanders. Maria John et al. PLOS One, September 23, 2019. https://doi.org/10.1371/journal.pone.0222795

Abstract: Chimpanzees hunt cooperatively in the wild, but the factors influencing food sharing after the hunt are not well understood. In an experimental study, groups of three captive chimpanzees obtained a monopolizable food resource, either via two individuals cooperating (with the third as bystander) or via one individual acting alone alongside two bystanders. The individual that obtained the resource first retained most of the food but the other two individuals attempted to obtain food from the "captor" by begging. We found the main predictor of the overall amount of food obtained by bystanders was proximity to the food at the moment it was obtained by the captor. Whether or not an individual had cooperated to obtain the food had no effect. Interestingly, however, cooperators begged more from captors than did bystanders, suggesting that they were more motivated or had a greater expectation to obtain food. These results suggest that while chimpanzee captors in cooperative hunting may not reward cooperative participation directly, cooperators may influence sharing behavior through increased begging.

Monday, September 23, 2019

We Are Upright-Walking Cats: Human Limbs as Sensory Antennae During Locomotion

We Are Upright-Walking Cats: Human Limbs as Sensory Antennae During Locomotion. Gregory E. P. Pearcey and E. Paul Zehr. Physiology, Aug 7 2019. https://doi.org/10.1152/physiol.00008.2019

Abstract: Humans and cats share many characteristics pertaining to the neural control of locomotion, which has enabled the comprehensive study of cutaneous feedback during locomotion. Feedback from discrete skin regions on both surfaces of the human foot has revealed that neuromechanical responses are highly topographically organized and contribute to “sensory guidance” of our limbs during locomotion.

Chimpanzees, like human children, do not rely solely on their own actions to make use of novel causal relations, but they can learn causal sequences based on observation alone

Chimpanzees use observed temporal directionality to learn novel causal relations. Claudio Tennie. Primates, September 23 2019. https://link.springer.com/article/10.1007/s10329-019-00754-9

Abstract: We investigated whether chimpanzees use the temporal sequence of external events to determine causation. Seventeen chimpanzees (Pan troglodytes) witnessed a human experimenter press a button in two different conditions. When she pressed the “causal button” the delivery of juice and a sound immediately followed (cause-then-effect). In contrast, she pressed the “non-causal button” only after the delivery of juice and sound (effect-then-cause). When given the opportunity to produce the desired juice delivery themselves, the chimpanzees preferentially pressed the causal button, i.e., the one that preceded the effect. Importantly, they did so in their first test trial and even though both buttons were equally associated with juice delivery. This outcome suggests that chimpanzees, like human children, do not rely solely on their own actions to make use of novel causal relations, but they can learn causal sequences based on observation alone. We discuss these findings in relation to the literature on causal inferences as well as associative learning.

Keywords: Causal cognition, Social learning, Chimpanzees, Action representation, Simultaneous conditioning, Primate cognition

Consumers view time-donations as morally better than money-donations because they perceive time-donations as signaling greater emotional investment in the cause & therefore better moral character

Johnson, Samuel G. B., and Seo Y. Park. 2019. “Moral Signaling Through Donations of Money and Time.” PsyArXiv. September 23. doi:10.31234/osf.io/tg9xs

Abstract: Prosocial acts typically take the form of time- or money-donations. Do third-parties differ in how they evaluate these different kinds of donations? Here, we show that consumers view time-donations as more morally praiseworthy than money-donations, even when the resource investment is comparable. This moral preference occurs because consumers perceive time-donations as signaling greater emotional investment in the cause and therefore better moral character; this occurs despite consumers’ belief that time-donations are less effective than money-donations (Study 1). The greater signaling power of time-donations has downstream implications for interpersonal attractiveness in a dating context (Study 2) and for donor decision-making (Study 3). Moreover, donors who are prompted with an affiliation (versus dominance) goal are likelier to favor time-donations (Study 3). However, reframing money-donations in terms of time (e.g., donating a week’s salary) reduced and even reversed these effects (Study 4). These results support theories of prosociality that place reputation-signaling as a key motivator of consumers’ moral behavior. We discuss implications for the charity market and for social movements, such as effective altruism, that seek to maximize the social benefit of consumers’ altruistic acts.

Humans may have evolved to experience far greater pain, malaise & suffering than the rest of the animal kingdom, due to their intense sociality giving them a reasonable chance of receiving help

The neuroscience of vision and pain: evolution of two disciplines. Barbara L. Finlay. Philosophical Transactions of the Royal Society B, Volume 374, Issue 1785, September 23 2019. https://doi.org/10.1098/rstb.2019.0292

Abstract: Research in the neuroscience of pain perception and visual perception has taken contrasting paths. The contextual and the social aspects of pain judgements predisposed pain researchers to develop computational and functional accounts early, while vision researchers tended to simple localizationist or descriptive approaches first. Evolutionary thought was applied to distinct domains, such as game-theoretic approaches to cheater detection in pain research, versus vision scientists' studies of comparative visual ecologies. Both fields now contemplate current motor or decision-based accounts of perception, particularly predictive coding. Vision researchers do so without the benefit of earlier attention to social and motivational aspects of vision, while pain researchers lack a comparative behavioural ecology of pain, the normal incidence and utility of responses to tissue damage. Hybrid hypotheses arising from predictive coding as used in both domains are applied to some perplexing phenomena in pain perception to suggest future directions. The contingent and predictive interpretation of complex sensations, in such domains as ‘runner's high’, multiple cosmetic procedures, self-harm and circadian rhythms in pain sensitivity is one example. The second, in an evolutionary time frame, considers enhancement of primary perception and expression of pain in social species, when expressions of pain might reliably elicit useful help.

---
My comments: Could it be that those who experienced no pain died much more frequently (hemorrhages, internal damage, etc.) than those who experienced pain and asked for support?

---
From 2015... The unique pain of being human. Barbara Finlay. New Scientist, Volume 226, Issue 3020, May 9 2015, Pages 28-29. https://doi.org/10.1016/S0262-4079(15)30311-0

We seem to experience pain differently to other animals. Why, wonders neuroscientist Barbara Finlay

FOR years I worked in Brazil on the evolution of vision in primates and was often stationed near monkeys who had undergone caesarean sections. Their recovery was impressive – in stark contrast with my own after two C-sections. Within hours, the monkeys would be sitting, climbing and annoying each other.

Looking at these unbothered monkeys, I began to think that some causes of the sensation of pain in humans might be fundamentally different to those in other animals.

The basic function of pain is the same for all vertebrates: it alerts an animal to potential damage and reduces activity after trauma. It is often argued that pain must be different in humans because of our ability to anticipate it or imagine its effects on us. But independent of whether cognition and culture can modify pain, I am suggesting a more basic difference in humans compared with animals: that some varieties, such as labour pain, appear only in humans, and others such as post-trauma pain are magnified.

These forms of pain appear in tandem with the ability to recruit help, to elicit an altruistic response in others. By “help” I mean the simple protection and provisioning that parents supply to children, not medical intervention – although our medical interventions probably first grew from this basis. This view arises from work carried out nearly 50 years ago by pain researcher Patrick Wall. He was the first person to suggest a functional view of pain – that it should be understood as a mixture of sensation and the motivation to make it stop, not sensation alone. His starting point was the now well-researched placebo effect. His account explained how rituals or procedures offered by a doctor or shaman, regardless of the efficacy or even existence of an actual treatment, could reduce pain.

But even with this early advocate for a functional view of it, studies of pain have mainly concentrated on receptors and specific clinical manifestations, while neglecting its purpose. Pain is a motivational signal to get an animal to do something – escape from a source of damage, protect a wound or devote energy to recovery. Wall argued that one of its roles in humans is as a motivation to seek help from some trusted source. When that goal is satisfied, pain is relieved.

I want to extend this view. I think that, over evolutionary time, several stimuli and situations that are not painful in other animals have come to be experienced as painful for humans. This is because our obvious distress elicits help from others and hence offers a survival advantage. This is distinct from the numerous demonstrations that context and culture can alter our sensation of pain. I argue that the primary circuitry of pain and malaise has been changed in human evolution in cases where getting help from others would be useful.

The pain of altruism

There is much indirect evidence in support of this “pain of altruism”. Take, for instance, the fact that certain types of pain are not associated with any physiological damage, and studies that show the presence of others can affect reported sensations of pain. Labour pain is another good example.

Across all human cultures, there are nearly always helpers, from relatives to medical professionals, who attend births. Giving birth is risky and help at any level improves survival. The cliché scenario of a mother from some exotic tribe going off to give birth alone is not unheard of, but is exceedingly rare. By contrast, among our primate relatives, solitary birth is the norm.

Human childbirth appears to be uniquely painful among members of the animal kingdom. Typically, scientists have accounted for this in terms of the size mismatch between the infant’s head and the mother’s pelvis, and not in terms of differences in social structure.

Human birth is dangerous, but we are not the only primates at risk – the smallest-bodied, large-brained monkeys, like marmosets, have similar head to pelvis disproportionality and birth-related mortality. Yet compared with humans, primates appear to give birth with little pain. Ungulates such as horses and cattle produce large, long-limbed offspring with a substantial chance of complications, but with little evidence of distress. Any such evidence, in fact, could prove fatal by alerting predators. So why is childbirth so painful for women? The source of labour pain is the contraction of the uterus and dilation of the cervix, but these are not damaging or pathological in themselves. Rather they predict a risky and potentially lethal event: the actual birth, to occur hours later. I suggest that protracted labour pains make us show distress and recruit help from others well in advance of the birth – a strategy that offers a survival advantage, as the offspring of those who seek help are more likely to survive.

But if the pain of labour is not linked to tissue damage and is primarily a call to enlist help, why does it have to be so excruciating? Helping someone in pain comes at a cost to the helper, and societies can’t afford to tolerate “cheating” in the form of excessive malingering or fake pain. I think that the pain of altruism may be connected to the concept of honest signalling in behavioural biology, whereby producing a signal has to be sufficiently costly to deter cheaters and freeloaders. Pain could be the obligatory cost of an honest signal, in the same way that a peacock’s tail or stag’s antlers are hard-to-fake signs of their owner’s underlying fitness.

However, since pain has no concrete physical manifestation that others can verify, cheating is difficult to eliminate – there is probably not one person reading this article who has never exaggerated pain or illness for their own benefit. If feeling pain to recruit the help of others is an evolutionarily assembled neural construct, this could be triggered in error. Perhaps this is what happens in the case of mysterious but distressing illnesses for which a direct physical cause cannot be found.

The pain of altruism also explains why malaise after trauma and infection is prolonged and exaggerated in humans compared with laboratory mice. Mice, like most non-human animals, cannot provide the high level of social support needed to nurse an individual with an illness or a broken leg. Such injured animals must confine their energetically expensive immune response to the minimum time needed to survive without help.

It is also possible that this pain of altruism has been extended to domesticated livestock and pets, as they too can enlist human help. In contrast, most adult animals in the wild try to avoid showing disability or distress, which might attract rivals or predators.

Periods of extended illness might only be feasible in species where individuals protect and provide for others for such lengths of time. If help from others is the root cause of some types of pain, then this needs to be factored into our understanding of pain and disease. An evolutionary calculation that we cannot be aware of, rather than a specific physical cause, could be the source of much real agony.

Other Unexplained Agony
Pain exists to get an animal to change its behaviour. This functional account of pain may explain some ongoing mysteries, such as the cause of the muscle soreness that follows a day of intense exercise, which has eluded physiological explanation. The popular idea that it is due to the build-up of lactic acid has been discounted, as have other proposed theories. Bodybuilders have found that the optimal way to build muscle is to take a rest day after a strenuous workout. Perhaps nature has converged on the same idea, and muscle soreness is simply a signal to rest, to enable optimal muscle building.

Persistence of pain in humans and other mammals

Persistence of pain in humans and other mammals. Amanda C. de C. Williams. Philosophical Transactions of the Royal Society B, Volume 374, Issue 1785, September 23 2019. https://doi.org/10.1098/rstb.2019.0276

Abstract: Evolutionary models of chronic pain are relatively undeveloped, but mainly concern dysregulation of an efficient acute defence, or false alarm. Here, a third possibility, mismatch with the modern environment, is examined. In ancestral human and free-living animal environments, survival needs urge a return to activity during recovery, despite pain, but modern environments allow humans and domesticated animals prolonged inactivity after injury. This review uses the research literature to compare humans and other mammals, who share pain neurophysiology, on risk factors for pain persistence, behaviours associated with pain, and responses of conspecifics to behaviours. The mammal populations studied are mainly laboratory rodents in pain research, and farm and companion animals in veterinary research, with observations of captive and free-living primates. Beyond farm animals and rodent models, there is virtually no evidence of chronic pain in other mammals. Since evidence is sparse, it is hard to conclude that it does not occur, but its apparent absence is compatible with the mismatch hypothesis.

Humans’ left cheek portrait bias extends to chimpanzees: Depictions of chimps on Instagram

Humans’ left cheek portrait bias extends to chimpanzees: Depictions of chimps on Instagram. Annukka K. Lindell. Laterality: Asymmetries of Body, Brain and Cognition, Sep 22 2019. https://doi.org/10.1080/1357650X.2019.1669631

ABSTRACT: When posing for portraits, humans favour the left cheek. This preference is argued to stem from the left cheek’s greater expressivity: as the left hemiface is predominantly controlled by the emotion-dominant right hemisphere, it expresses emotion more intensely than the right hemiface. Whether this left cheek bias extends to our closest primate relatives, chimpanzees, has yet to be determined. Given that humans and chimpanzees share the same oro-facial musculature and contralateral cortical innervation of the face, it appears probable that humans would also choose to depict chimps showing the more emotional left cheek. This paper thus examined portrait biases in images of chimpanzees. Two thousand photographs were sourced from Instagram’s “Most Recent” feed using the hashtag #chimpanzee, and coded for pose orientation (left, right) and portrait type (head and torso, full body). As anticipated, there were significantly more left cheek (57.2%) than right cheek (42.8%) images, with the bias observed across both head and torso and full body portraits. Thus humans choose to depict chimpanzees just as we depict ourselves: offering the left cheek. As such, these data confirm that the left cheek bias is observed in both human and non-human primates, consistent with an emotion-based account of the orientation preference.

KEYWORDS: Left, right, emotion, photo, primate
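
The headline 57.2% vs 42.8% split can be checked against chance with a simple binomial test. The sketch below, in Python, reconstructs the counts from the rounded percentages in the abstract (about 1,144 left-cheek images out of 2,000 – an assumption, not the raw data) and is not the authors' own analysis.

```python
# Minimal sketch: two-sided binomial test of the reported left-cheek bias.
# Counts reconstructed from the abstract's rounded percentages (SciPy >= 1.7).
from scipy.stats import binomtest

n_total = 2000                     # photographs coded for pose orientation
n_left = round(0.572 * n_total)    # ~1144 left-cheek images

result = binomtest(n_left, n=n_total, p=0.5, alternative="two-sided")
print(f"left-cheek proportion: {n_left / n_total:.3f}")
print(f"two-sided p-value vs. a 50/50 split: {result.pvalue:.2e}")
```

With 2,000 images, even this modest asymmetry lies many standard errors away from a 50/50 split, which is why the bias is statistically reliable despite being numerically small.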


In face perception, reducing visual input greatly increases perceived attractiveness; left/right half faces look far more attractive than bilaterally symmetric whole faces

Face perception loves a challenge: Less information sparks more attraction. Javid Sadr, Lauren Krowicki. Vision Research, Volume 157, April 2019, Pages 61-83. https://doi.org/10.1016/j.visres.2019.01.009

Highlights
•    In face perception, reducing visual input greatly increases perceived attractiveness.
•    This “partial information effect” occurs with blur, contrast reduction, and occlusion.
•    Left/right half faces look far more attractive than bilaterally symmetric whole faces.
•    There are no male/female differences in this “less is more” enhancement effect.

Abstract: Examining hedonic questions of processing fluency, objective stimulus clarity, and goodness-of-fit in face perception, across three experiments (blur, contrast, occlusion) in which subjects performed the simple, natural task of rank-ordering faces by attractiveness, we find a very consistent and powerful effect of reduced visual input increasing perceived attractiveness. As images of faces are blurred (i.e., as higher spatial frequencies are lost, mimicking at-a-distance, eccentric, or otherwise unaccommodated viewing, tested down to roughly 6 cycles across the face), reduced in contrast (linearly, down to 33% of the original image’s), and even half-occluded, the viewer’s impression of the faces’ attractiveness, relative to non- or less-degraded faces, is greatly enhanced. In this regard, the blur manipulation exhibits a classic exponential profile, while the contrast manipulation follows a simple linear trend. Given the far superior attractiveness of half-occluded faces, which have no symmetry whatsoever, we also see that it may be incorrect to claim that facial symmetry is attractive, and perhaps more accurate to say that asymmetry may be unattractive. As tested with a total of 200 novel female faces over three experiments, we find absolutely no male/female differences in this “partial information effect” of enhanced subjective attraction, nor do we find differences across the repetition of the task through to a second block of trials in which the faces are re-encountered and no longer novel. Finally, whereas objective stimulus quality is reduced, we suggest that a positive hedonic experience arises as a subjective phenomenological index of enhanced perceptual goodness-of-fit, counter-intuitively facilitated by what may be stimulus-distilling image-level manipulations.
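
For readers who want to see what these manipulations amount to in practice, the sketch below reproduces the three degradations described in the abstract (low-pass blur, linear contrast reduction to 33%, and half-occlusion) using Pillow. The file name and blur radius are placeholders; the blur needed to leave roughly 6 cycles across a face depends on the face's width in pixels, so treat the numbers as illustrative rather than the authors' exact stimulus pipeline.

```python
# Illustrative sketch of the three stimulus degradations (not the authors' code).
from PIL import Image, ImageFilter, ImageEnhance

face = Image.open("face.jpg").convert("RGB")   # hypothetical input image
w, h = face.size

# 1) Blur: remove high spatial frequencies. The radius that leaves ~6 cycles
#    across the face depends on face width in pixels; this value is a placeholder.
blurred = face.filter(ImageFilter.GaussianBlur(radius=w / 24))

# 2) Contrast: linear reduction to 33% of the original image's contrast.
low_contrast = ImageEnhance.Contrast(face).enhance(0.33)

# 3) Half-occlusion: cover one lateral half with a uniform grey field.
half_occluded = face.copy()
half_occluded.paste((128, 128, 128), box=(w // 2, 0, w, h))

for name, img in [("blurred", blurred), ("low_contrast", low_contrast),
                  ("half_occluded", half_occluded)]:
    img.save(f"{name}.jpg")
```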

Humans have likely spent the vast majority of our history as a species in relatively egalitarian, small-scale societies; this does not mean humans are by nature egalitarian, but that ecological & demographic conditions suppressed dominance

Making and unmaking egalitarianism in small-scale human societies. Chris von Rueden. Current Opinion in Psychology, Volume 33, June 2020, Pages 167-171. https://doi.org/10.1016/j.copsyc.2019.07.037

Highlights
•    Modal political organization of ancestral human societies is egalitarianism.
•    Role of prestige in human hierarchy is a contributing factor to egalitarianism.
•    Historical shifts to greater inequality include coercive and non-coercive forces.

Abstract: Humans have likely spent the vast majority of our history as a species in relatively egalitarian, small-scale societies. This does not mean humans are by nature egalitarian. Rather, the ecological and demographic conditions common to small-scale societies favored the suppression of steep, dominance-based hierarchy and incentivized relatively shallow, prestige-based hierarchy. Shifts in ecological and demographic conditions, particularly with the spread of agriculture, weakened constraints on coercion.

Check also: Romanticizing the Hunter-Gatherer, Despite the Girl Infanticide, the Homicide Rate, etc.
Romanticizing the Hunter-Gatherer. William Buckner. Quillette, December 16, 2017.  https://www.bipartisanalliance.com/2017/12/romanticizing-hunter-gatherer-despite.html

Zimmermann's World Resources And Industries, 1st edition, 1933

Zimmermann's World Resources And Industries, 1st edition, 1933
https://drive.google.com/open?id=10USDzBnuR0GxZS30wvuWBlZ7DyZ1oAiB

Erich Walter Zimmermann, resource economist, was born in Mainz, Germany, on July 31, 1888 and died in Austin, United States of America, on February 16, 1961. He was an economist at the University of North Carolina and later the University of Texas.
Zimmermann, of the institutional school of economics, called his real-world theory the functional theory of mineral resources. His followers have coined the term "resourceship" to describe it. Unlike traditional descriptive inventories, Zimmermann's method offered a synthetic assessment of the human, cultural, and natural factors that determine resource availability.
Zimmermann rejected the assumption of fixity. Resources are not known, fixed things; they are what humans employ to service wants at a given time. To Zimmermann (1933, 3; 1951, 14), only human "appraisal" turns the "neutral stuff" of the earth into resources. What are resources today may not be tomorrow, and vice versa. According to Zimmermann, "resources are not, they become." In his definition, the word "resource" does not refer to a thing but to a function which a thing may perform, or to an operation in which it may take part, namely the function or operation of attaining a given end, such as satisfying a want.

Bibliography

  • “Resources of the South”, The South-Atlantic Quarterly (July 1933)
  • World Resources and Industries: A Functional Appraisal of the Availability of Agricultural and Industrial Resources (1933) New York: Harper & Brothers
  • World Resources and Industries, 2nd revised ed. (1951) New York: Harper & Brothers

Sunday, September 22, 2019

More than 40% of our participants experienced difficulties in starting or keeping an intimate relationship; poor flirting skills, poor mate signal-detection ability, and high shyness were associated with poor performance in mating

Mating Performance: Assessing Flirting Skills, Mate Signal-Detection Ability, and Shyness Effects. Menelaos Apostolou et al. Evolutionary Psychology, September 22, 2019. https://doi.org/10.1177/1474704919872416

Abstract: Many people today experience poor mating performance, that is, they face difficulties in starting and/or keeping an intimate relationship. On the basis of an evolutionary theoretical framework, it was hypothesized that poor mating performance would be predicted by poor flirting skills, poor mate signal-detection ability, and high shyness. By employing a sample of 587 Greek-speaking men and women, we found that more than 40% of our participants experienced difficulties in starting and/or keeping an intimate relationship. We also found that poor flirting skills, poor mate signal-detection ability, and high shyness were associated with poor performance in mating, especially with respect to starting an intimate relationship. The effect sizes and the odds ratios indicated that flirting skills had the largest effect on mating performance, followed by mate signal-detection ability and shyness.

Keywords: mating performance, mating, mismatch, flirting, shyness

Is There a Relationship Between Cyber-Dependent Crime, Autistic-Like Traits and Autism?

Is There a Relationship Between Cyber-Dependent Crime, Autistic-Like Traits and Autism? Katy-Louise Payne et al. Journal of Autism and Developmental Disorders, October 2019, Volume 49, Issue 10, pp 4159–4169. https://link.springer.com/article/10.1007/s10803-019-04119-5

Abstract: International law enforcement agencies have reported an apparent preponderance of autistic individuals amongst perpetrators of cyber-dependent crimes, such as hacking or spreading malware (Ledingham and Mills in Adv Autism 1:1–10, 2015). However, no empirical evidence exists to support such a relationship. This is the first study to empirically explore potential relationships between cyber-dependent crime and autism, autistic-like traits, explicit social cognition and perceived interpersonal support. Participants were 290 internet users, 23 of whom self-reported being autistic, who completed an anonymous online survey. Increased risk of committing cyber-dependent crime was associated with higher autistic-like traits. A diagnosis of autism was associated with a decreased risk of committing cyber-dependent crime. Around 40% of the association between autistic-like traits and cyber-dependent crime was mediated by advanced digital skills.

Keywords: Cyber-dependent crime, Digital skills, Autism, Autistic-like traits, Explicit social cognition, Interpersonal support

Ledingham and Mills (2015) define cybercrime as “The illegal use of computers and the internet, or crime committed by means of computers and the internet.” Within the legal context (e.g. in the USA, UK, Australia, New Zealand, Germany, the Netherlands and Denmark; Ledingham and Mills 2015), there are two distinct types of cybercrime: (1) cyber-dependent crime, which can only be committed using computers, computer networks or other forms of information communication technology (ICT). These include the creation and spread of malware for financial gain, hacking to steal important personal or industry data and distributed denial of service (DDoS) attacks to cause reputational damage; and (2) cyber-enabled crime such as fraud, which can be conducted online or offline, but online may take place at unprecedented scale and speed (McGuire and Dowling 2013; The National Crime Agency: NCA 2016). In England and Wales, all forms of cybercrime were included in the Office for National Statistics crime estimates for the first time in 2016, which resulted in a near doubling of the crime rate. Cyber-dependent crime specifically represented 20% of UK crime (Office for National Statistics 2017) and in England and Wales in 2018, 976,000 cyber-dependent computer misuse incidents were reported (computer viruses and unauthorised access, including hacking: Office for National Statistics 2019). Furnell et al. (2015) propose that it is more important to understand the factors leading to cyber-dependent incidents and how to prevent them, than to focus on metrics such as specific costs to the global economy. Having interviewed cyber-dependent criminals, the NCA’s intelligence assessment (2017) identified that perpetrators are likely to be teenage males who are unlikely to be involved in traditional crime and also that autism spectrum disorder (ASD, hereafter autism) appears to be more prevalent amongst cyber-dependent criminals than the general populace—though this remains unproven. No socio-demographic bias has yet been identified amongst cyber-dependent offenders or those on the periphery of criminality.

This apparent relationship between cyber-dependent crime and autism is echoed in a survey of six international law enforcement agencies’ (UK; USA; Australia; New Zealand; Germany; the Netherlands; Denmark) experiences and contact with autistic cybercriminals (Ledingham and Mills 2015), which indicated that some autistic individuals commit cyber-dependent offences. Offences committed included: hacking; creating coding to enable a crime to be committed; creating, deploying or managing a bot or bot-net; and malware (Ledingham and Mills 2015). This was a small-scale study, limiting the generalisability of findings, but it does indicate a presence of autistic offenders within cyber-dependent crime populations, although the link between autism and cyber-dependent crime remains largely speculative as cyber-dependent criminality may be evidenced within a wide range of populations. Further clarification of any relationship between autism and cyber-dependent crime is required before any conclusions can be inferred.

Studies in Asia, Europe, and North America have identified an average prevalence of autism of between 1% and 2% (CDC 2018). Autism is a long-term condition predominately diagnosed in males, characterised by persistent deficits in social communication and interaction coupled with restricted and repetitive patterns of behaviour, interests or activities (American Psychiatric Association 2013; CDC 2018). One possibility is that the anecdotal evidence of apparent autism-like behaviour in cyber-dependent criminals may actually be reflecting people with high levels of autistic-like traits who do not have a diagnosis of autism (Brosnan in press). Autistic-like traits refer to behavioural traits such as social imperviousness, directness in conversation, lack of imagination, affinity for solitude, and difficulty displaying emotions (Gernsbacher et al. 2017). Autistic-like traits are argued to vary continuously across the general population, with studies reporting that autistic groups typically have higher levels of autistic-like traits than non-autistic comparison groups (Baron-Cohen et al. 2001, 2006; Constantino and Todd 2003; Kanne et al. 2012; Plomin et al. 2009; Posserud et al. 2006; Skuse et al. 2009; see also Bölte et al. 2011; Gernsbacher et al. 2017; Ronald and Hoekstra 2011; Ruzich et al. 2015a for meta-analysis). Autistic-like traits are typically assessed through self-report measures such as the 50-item Autism Spectrum Quotient (AQ: Baron-Cohen et al. 2001; see also Baghdadli et al. 2017). Ruzich et al.’s (2015a) meta-analysis of responses to the AQ from almost 7000 non-autistic and 2000 autistic respondents identified that non-autistic males had significantly higher levels of autistic-like traits than non-autistic females, and that autistic people had significantly higher levels of autistic-like traits compared to the non-autistic males (with no sex difference within the autistic sample). A clinical cut-off of a score of 26 on the AQ has been proposed to be suggestive of autism (Woodbury-Smith et al. 2005b), and whilst there are similarities between those with and without a diagnosis of autism who score above the cut-off on the AQ, the AQ is not diagnostic. Importantly, there are also differences between those with and without a diagnosis of autism who scored above the cut-off (Ashwood et al. 2016; Bralton et al. 2018; Focquaert and Vanneste 2015; Lundqvist and Lindner 2017; see also Frith 2014).
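
As a concrete illustration of how a cut-off like this is applied in practice, the sketch below scores a hypothetical AQ response set. It simplifies the instrument: each of the 50 items is assumed to have already been binarised (1 = response in the autistic-like direction, 0 otherwise), which is the scoring scheme behind the 0–50 total referred to above; the example responses are invented and, as the paragraph above stresses, the AQ is not a diagnostic tool.

```python
# Toy AQ scoring sketch (not a diagnostic tool). Assumes each of the 50 items
# has already been binarised: 1 = autistic-like response, 0 = otherwise.
import random

CUT_OFF = 26  # threshold proposed by Woodbury-Smith et al. (2005b), as cited above

def aq_total(item_scores):
    """Sum of 50 binarised AQ items (possible range 0-50)."""
    assert len(item_scores) == 50, "the AQ has 50 items"
    return sum(item_scores)

# Invented example respondent: 30 autistic-like responses out of 50.
responses = [1] * 30 + [0] * 20
random.shuffle(responses)

total = aq_total(responses)
print(f"AQ total = {total}; at or above cut-off ({CUT_OFF})? {total >= CUT_OFF}")
```

The point of the example is simply that the AQ yields a dimensional score: respondents above the cut-off include both autistic and non-autistic people, which is why the study treats autistic-like traits and an autism diagnosis as related but distinct predictors.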

With respect to cyber-dependent crime, some members of both autistic and high autistic-like trait groups will have developed advanced digital skills that are likely to be required to commit cyber-dependent crime. Indeed a specific relationship between ‘autism and the technical mind’ has been previously speculated by Baron-Cohen (2012; see also Wei et al. 2013). Moreover, computer science students and those employed in technology are two of the groups who typically possess higher levels of autistic-like traits (Baron-Cohen et al. 2001; Billington et al. 2007; Ruzich et al. 2015b). These relationships are potentially significant, as cyber-dependent criminal activity requires an advanced level of cyber-related skills (such as proficiency in programming in Java, C/C++, disassemblers, and assembly language and programming knowledge of scripting languages [PHP, Python, Perl, or Shell]; Insights 2018). Thus, there may be an association between autistic-like traits and the potential to develop the advanced digital skills required for cyber-dependent crime.

Assessing the relationship between autistic-like traits and cyber deviancy in a sample of college students, Seigfried-Spellar et al. (2015) found that of 296 university students, 179 (60%) engaged in some form of cyber-deviant behaviour (such as hacking, cyberbullying, identity theft, and virus writing) and the AQ distinguished between those who did and those who did not self-report cyber-deviant behaviour, with higher AQ scores among those reporting cyber-deviant behaviours. The authors also reported that if they used a cut-off score on the AQ of 26 to indicate high levels of autistic-like traits associated with autism, then 7% of the computer non-deviants and 6% of the computer deviants scored in this range. The authors concluded that ‘based on these findings alone, there is no evidence of a significant link between clinical levels of [autism] and computer deviance in the current sample. Nevertheless, the current study did find evidence for computer deviants reporting more autistic-like traits, according to the AQ, compared to computer non-deviants’. However, ‘cyber-deviant’ behaviour in Seigfried-Spellar et al.’s study included both cyber-enabled crimes such as cyberbullying and identity theft, as well as cyber-dependent crimes such as hacking and virus writing. This requires a more nuanced examination as there may be important differences in the relationship between autistic-like traits and cyber-dependent crime compared with cyber-enabled crime.

Cyber-enabled crime is an online variant of traditional crimes (such as fraud) and shares common motivations such as financial gain, whereas the motivations for cyber-dependent crime can be based around a sense of challenge in hacking into a system or enhanced reputation and credibility within hacker communities (NCA 2017). This may be pertinent for the relationship between cyber-dependent crime specifically and autism or autistic-like traits, since cyber-dependent criminals typically have not engaged in traditional crime (NCA 2017) and autism has been associated with generally being law abiding and low rates of criminality (Blackmore et al. 2017; Ghaziuddin et al. 1991; Heeramun et al. 2017; Howlin 2007; Murrie et al. 2002; Wing 1981; Woodbury-Smith et al. 2005a, 2006). In addition, several studies have suggested that autistic internet-users can demonstrate a preference for mediating social processes online, such as preferring to use social media over face-to-face interaction to share interests (Brosnan and Gavin 2015; Gillespie-Lynch et al. 2014; van der Aa et al. 2016). This may be significant, as it has been suggested that social relationships developed online are key to progressing into cyber-dependent crime, with forum interaction and reputation development being key drivers of cyber-dependent criminality (NCA 2017).

Finally, failing to appreciate the impact of crime upon others may be a relevant factor, as autism has been argued to reflect a diminished social cognition (e.g., theory of mind, Baron-Cohen et al. 1985). It has been suggested that there are two levels of social cognition; namely, a quicker and less conscious implicit social cognition, and a more conscious, slower and controlled explicit social cognition (Frith and Frith 2008; see also Heyes 2014). Autistic individuals are often not impaired in explicit social cognition, but are reportedly impaired on implicit social cognition (Callenmark et al. 2014; see also Dewey 1991; Frith and Happé 1999). This profile is also reflected in non-social cognition such as reasoning (Brosnan et al. 2016, 2017; Lewton et al. 2018) which may be better characterised as impaired processing of automatic, cognitively efficient heuristics (Brosnan and Ashwin 2018; Happé et al. 2017). Explicit social cognition is therefore a more pertinent measure of the potential to consider the impact of crime upon others.

The aim of the present study was to explore the apparent relationship identified by international law enforcement agencies between autistic-like traits and cyber-dependent crime. To do this, we conducted an online survey exploring autistic-like traits, cyber-related activities (legal and illegal) as well as perceived interpersonal support and explicit theory of mind. Our research question addressed whether higher autistic-like traits, lower explicit theory of mind and lower perceived interpersonal support would increase the risk of committing cyber-dependent crime. We also addressed whether autistic-like traits would be associated with cyber-dependent crime and whether this relationship would be mediated by advanced digital skills. Given the findings associating higher levels of law-abiding behaviour with autism, we also speculated that autism may represent a group of individuals with higher levels of autistic-like traits, but without a higher risk of committing cyber-dependent crime.


---

Discussion

International law enforcement agencies report an apparent relationship between autism and cyber-dependent crime, although any such link remains unproven (Ledingham and Mills 2015; NCA 2017). This was the first study to empirically explore whether autism, autistic-like traits, explicit social cognition, interpersonal support and digital skills were predictors of cyber-dependent criminality. Whilst higher levels of autistic-like traits were associated with a greater risk of committing cyber-dependent crime, a self-reported diagnosis of autism was associated with a decreased risk of committing cyber-dependent crime. Around 40% of the association between autistic-like traits and cyber-dependent crime was attributable to greater levels of advanced digital skills. Basic digital skills were also found to mediate between autistic-like traits and cyber-dependent crime, although they accounted for a smaller proportion of the association than advanced digital skills.
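
The "around 40% of the association" statement refers to a proportion-mediated estimate. The toy sketch below shows how such a figure is computed in the simplest case, using simulated continuous variables and ordinary least squares rather than the authors' actual models and data; the variable names and effect sizes are invented for illustration.

```python
# Toy proportion-mediated calculation (simulated data, OLS; not the authors' analysis).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 290  # same order of magnitude as the study's sample

aq = rng.normal(size=n)                               # autistic-like traits (standardised)
skills = 0.5 * aq + rng.normal(size=n)                # advanced digital skills (mediator)
crime = 0.4 * skills + 0.3 * aq + rng.normal(size=n)  # toy continuous "crime risk" score

df = pd.DataFrame({"aq": aq, "skills": skills, "crime": crime})

total = smf.ols("crime ~ aq", data=df).fit().params["aq"]            # total effect c
direct = smf.ols("crime ~ aq + skills", data=df).fit().params["aq"]  # direct effect c'
indirect = total - direct                                            # path via skills

print(f"total effect:        {total:.3f}")
print(f"direct effect:       {direct:.3f}")
print(f"proportion mediated: {indirect / total:.1%}")
```

With these invented coefficients the indirect path accounts for roughly 40% of the total effect, mirroring the shape (though not the method) of the reported result; the paper's outcome is binary self-reported offending, so the real analysis would rest on a logistic or similar model with an appropriate test of the indirect effect.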
These findings are consistent with the proposal that the apparent association between autism and cyber-dependent crime identified by law enforcement agencies may reflect higher levels of autistic-like traits amongst cybercriminals, but that this does not necessarily equate to autism being a risk factor for cybercrime. This confusion may well arise because, typically, autistic people do report higher levels of autistic-like traits than the general population (Ruzich et al. 2015a). Cyber-dependent crime may therefore represent an area that distinguishes high autistic-trait non-autistic groups from autistic groups, consistent with the proposal that people with autism differ qualitatively from non-autistic people who are nevertheless high in autistic-like traits (see Ashwood et al. 2016; Frith 2014). The finding that autistic respondents were less likely to commit cyber-dependent crime is also consistent with literature suggesting that autistic people are generally as law abiding, if not more so, than the general population, with lower levels of criminality shown at least for certain types of crime (Blackmore et al. 2017; Cheely et al.; Ghaziuddin et al. 1991; Heeramun et al. 2017; Howlin 2007; King and Murphy; Murrie et al. 2002; Wing 1981; Woodbury-Smith et al. 2005a, 2006; but see Rava et al.; Tint et al.).
Thus, there is evidence that higher AQ scores are associated with higher levels of cyber-dependent crime regardless of an autism diagnosis. As this association was independent of the autism diagnosis, there may be something about autistic-like traits beyond the diagnostic criteria for autism that relates to cyber-dependent criminal activity. The mediation analysis suggests that an association between autistic-like traits and advanced digital skills may represent a key factor. We cautiously state above that those reporting an autism diagnosis were less likely to report cyber-dependent crime. We say "cautiously" because this could reflect various factors beyond high AQ scores and autism being different things, including a diagnosis of autism conferring some protection (e.g., more support leading to less potential criminal behaviour; see Heeramun et al. 2017). Importantly, however, there are potential selection issues in relation to individuals who respond to an invitation to complete an online survey on this topic, thus the possibility of selection bias cannot be ruled out. We do not know how many did not respond to the invitations (and therefore could not identify a response rate, for example), and the apparent protective effect could be a chance finding due to small numbers. Future research using larger samples can address such concerns, and until then the suggestion that autism may be protective should be considered speculative, especially as the data are self-reported and diagnostic status could not be independently verified in the present study.
Previous research has identified higher levels of autistic-like traits within scientific disciplines that include computer science students and employees (Baron-Cohen et al. 2001; Billington et al. 2007; Ruzich et al. 2015b). This study is the first to identify a direct relationship between higher levels of autistic-like traits and advanced digital skills. In addition to being a prerequisite for committing cyber-dependent crimes, these skills are essential for the cyber security industry, which is projected to have an estimated 3.5 million unfilled jobs by 2021 (Morgan). This study suggests that targeting groups high in autistic-like traits would be a beneficial strategy to meet this employment need. Given the employment difficulties that can be faced by members of the autistic community (Buescher et al.; Knapp et al.; see also Gotham et al.; Hendricks; Howlin; Levy and Perry; National Autistic Society; Taylor et al.; Shattuck et al.), and that around 46% of autistic adults who are employed are either over-educated or exceed the skill level needed for their roles (Baldwin et al.), targeting the autistic community for cyber security employment may be particularly beneficial.
Notwithstanding the limitations described above, this may be particularly pertinent as this study found that a diagnosis of autism was associated with reduced cyber-dependent criminality. This would be consistent with perceptions of autistic strengths of honesty and loyalty (de Schipper et al.)—ideal attributes within employment settings. Importantly, this is not to suggest that all autistic people are good with technology, or that all autistic people should seek employment within cyber security industries (see Milton). Rather, this study highlights that in a particularly challenging employment context, some members of the autistic community may be ideally suited to such employment opportunities, and emphasises the need for employers to ensure that their recruitment methods and working environments are autism-friendly and inclusive (see Hedley et al. for a review).
The direct link between autistic-like traits and cyber-dependent crime is also consistent with previous research (Seigfried-Spellar et al. 2015) and may extend to a relationship with cyber-enabled crime (such as online fraud). Seigfried-Spellar et al. (2015) explored relationships between autistic-like traits and cyber-deviancy defined more broadly than cyber-dependent crime. Future research could explore whether the level of autistic-like traits, mediated by advanced digital skills, also relates to cyber-enabled crime, and whether there are any direct effects that are specific to cyber-dependent crime. Both Seigfried-Spellar et al. (2015) and the present study were cross-sectional. The mediation of advanced digital skills between autistic-like traits and cyber-dependent crime has been assumed in the present study, but would be best established in longitudinal research. A study of prison populations examining whether 'traditional' crime was related to autistic-like traits found no differences between prisoners and the general population (Underwood et al.), which may suggest that autistic-like traits are associated with cybercrime specifically (that is, cyber-dependent crime and potentially cyber-enabled crime).
Sex, age, non-verbal IQ, explicit social cognition and perceived interpersonal support did not significantly relate to cyber-dependent criminal activity, which serves to highlight the salience of autistic-like traits. A potential limitation is that explicit social cognition was assessed, but not implicit social cognition. Based on the autism literature (Callenmark et al. 2014; Dewey 1991; Frith and Happé 1999), we would not necessarily expect difficulties with explicit social cognition in groups with high autistic-like traits. Implicit social cognition was also assessed by Callenmark et al. using interviews after the IToSK. Such interviews, however, do not readily extend to the online context, and future research could explore any role of implicit social cognition in cyber-dependent crime. However, recent accounts of implicit social cognition have questioned whether such a system exists, and findings from such measures can be better attributed to general attentional processes (Conway et al.; Heyes 2014; Santiesteban et al.).
Future research should also focus on autistic communities as well as those convicted of cyber-dependent and cyber-enabled crimes to further develop our understanding of this area, an important aspect of which is the potential strengths some members of the autistic community can bring to cyber security employment.

Generalizable and Robust TV Advertising Effects: Substantially smaller advertising elasticities compared to the results documented in the literature

Shapiro, Bradley and Hitsch, Guenter J. and Tuchman, Anna, Generalizable and Robust TV Advertising Effects (September 17, 2019). SSRN: http://dx.doi.org/10.2139/ssrn.3273476

Abstract: We provide generalizable and robust results on the causal sales effect of TV advertising for a large number of products in many categories. Such generalizable results provide a prior distribution that can improve the advertising decisions made by firms and the analysis and recommendations of policy makers. A single case study cannot provide generalizable results, and hence the literature provides several meta-analyses based on published case studies of advertising effects. However, publication bias results if the research or review process systematically rejects estimates of small, statistically insignificant, or “unexpected” advertising elasticities. Consequently, if there is publication bias, the results of a meta-analysis will not reflect the true population distribution of advertising effects. To provide generalizable results, we base our analysis on a large number of products and clearly lay out the research protocol used to select the products. We characterize the distribution of all estimates, irrespective of sign, size, or statistical significance. To ensure generalizability, we document the robustness of the estimates. First, we examine the sensitivity of the results to the assumptions made when constructing the data used in estimation. Second, we document whether the estimated effects are sensitive to the identification strategies that we use to claim causality based on observational data. Our results reveal substantially smaller advertising elasticities compared to the results documented in the extant literature, as well as a sizable percentage of statistically insignificant or negative estimates. If we only select products with statistically significant and positive estimates, the mean and median of the advertising effect distribution increase by a factor of about five. The results are robust to various identifying assumptions, and are consistent with both publication bias and bias due to non-robust identification strategies to obtain causal estimates in the literature.

Keywords: Advertising, Publication Bias, Generalizability
JEL Classification: L00, L15, L81, M31, M37, B41, C55, C52, C81, C18
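
The abstract's point that conditioning on positive, statistically significant estimates raises the mean effect by a factor of about five is easy to reproduce qualitatively. The simulation below is a generic illustration of that selection mechanism, with an invented distribution of true elasticities and standard errors; it is not calibrated to the paper's data or models.

```python
# Generic illustration of publication-bias inflation (invented parameters).
import numpy as np

rng = np.random.default_rng(1)
n_products = 10_000

# Invented "true" advertising elasticities: small and centred near zero.
true_elasticity = rng.normal(loc=0.01, scale=0.02, size=n_products)

# Each product's estimate is noisy; the standard error is also invented.
std_err = 0.02
estimate = true_elasticity + rng.normal(scale=std_err, size=n_products)
t_stat = estimate / std_err

# "Publication" rule: keep only positive, statistically significant estimates.
published = (estimate > 0) & (t_stat > 1.96)

print(f"mean of all estimates:      {estimate.mean():.4f}")
print(f"mean of 'published' subset: {estimate[published].mean():.4f}")
print(f"inflation factor:           {estimate[published].mean() / estimate.mean():.1f}x")
```

With these invented numbers the inflation happens to come out around five-fold, but the exact factor depends entirely on the assumed distributions; the sketch only shows why selecting on sign and significance mechanically shifts a meta-analytic mean upward.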