Tuesday, October 15, 2019

Guppies: Brain size affects responsiveness in mating behavior to variation in predation pressure and sex‐ratio

Brain size affects responsiveness in mating behavior to variation in predation pressure and sex‐ratio. Alberto Corral‐López, Maksym Romensky, Alexander Kotrschal, Severine D. Buechel, Niclas Kolm. Journal of Evolutionary Biology, October 14, 2019. https://doi.org/10.1111/jeb.13556

Abstract: Despite ongoing advances in sexual selection theory, the evolution of mating decisions remains enigmatic. Cognitive processes often require simultaneous processing of multiple sources of information from environmental and social cues. However, little experimental data exist on how cognitive ability affects such fitness‐associated aspects of behavior. Using advanced tracking techniques, we studied mating behaviors of guppies artificially selected for divergence in relative brain size, with known differences in cognitive ability, when predation threat and sex ratio were varied. In females, we found a general increase in copulation behavior when the sex ratio was female‐biased, but only large‐brained females responded with greater willingness to copulate under a low predation threat. In males, we found that small‐brained individuals courted more intensively and displayed more aggressive behaviors than large‐brained individuals. However, there were no differences in female response to males with different brain size. These results provide further evidence of a role for female brain size in optimal decision‐making in a mating context. In addition, our results indicate that brain size may affect mating display skill in male guppies. We suggest that it is important to consider the association between brain size, cognitive ability and sexual behavior when studying how morphological and behavioral traits evolve in wild populations.

7 types of sugaring: Sugar prostitution, compensated dating, compensated companionship, sugar dating, sugar friendships, sugar friendships with benefits, & pragmatic love

“It’s Its Own Thing”: A Typology of Interpersonal Sugar Relationship Scripts. Maren T. Scull. Sociological Perspectives, September 16, 2019. https://doi.org/10.1177/0731121419875115

Abstract: Although academics have focused on sugaring in various parts of the globe, sugar relationships in the United States have largely been ignored. The few studies that address these arrangements in the United States often frame them as a form of prostitution. Drawing from 48 in-depth interviews with women in the United States who have been in sugar relationships, I adopt a connected lives approach to explore the structure of these arrangements and to assess the extent to which they are a form of prostitution. Overall, I found that, although there is a dominant, subcultural relationship script that serves as a blueprint for sugar arrangements, they comprise their own unique relational package and take a variety of forms when enacted on an interpersonal level. Specifically, I identified seven types of sugar relationships, only one of which can be considered prostitution. These included sugar prostitution, compensated dating, compensated companionship, sugar dating, sugar friendships, sugar friendships with benefits, and pragmatic love.

Keywords: script theory, prostitution, transactional sex, qualitative methods

Popular version: The 7 types of sugar daddy relationships. Sarah Erickson. Univ of Colorado at Denver, Oct 15 2019. https://www.eurekalert.org/pub_releases/2019-10/uocd-t7t101219.php

Red-colored male sticklebacks carry more oxidative DNA damage in muscle, testis & sperm in the peak breeding season, but the females find them more attractive

Attractive male sticklebacks carry more oxidative DNA damage in the soma and germline. Sin‐Yeon Kim, Alberto Velando. Journal of Evolutionary Biology, October 14, 2019. https://doi.org/10.1111/jeb.13552

Abstract: Trade‐offs between the expression of sexual signals and the maintenance of somatic and germline tissues are expected when these depend upon the same resources. Despite the importance of sperm DNA integrity, its trade‐off with sexual signalling has rarely been explored. We experimentally tested the trade‐off between carotenoid‐based sexual colouration and oxidative DNA damage in skeletal muscle, testis and sperm by manipulating reproductive schedule (early vs. late onset of breeding) in male three‐spined sticklebacks. Oxidative DNA damage was measured as the amount of 8‐hydroxy‐2′‐deoxyguanosine in genomic DNA. Irrespective of the experimentally manipulated reproductive schedule, individuals investing more in red colouration showed higher levels of oxidative DNA damage in muscle, testis and sperm during the peak breeding season. Our results show that the expression of red colouration traded off against the level of oxidative DNA damage possibly due to the competing functions of carotenoids as colorants and antioxidants. Thus, female sticklebacks may risk fertility and viability of offspring by choosing redder, more deteriorated partners with decreased sperm DNA integrity. The evolution of sexual signals may be constrained by oxidative DNA damage in the soma and germline.

The Negative Intelligence–Religiosity Relation: New and Confirming Evidence

The Negative Intelligence–Religiosity Relation: New and Confirming Evidence. Miron Zuckerman et al. Personality and Social Psychology Bulletin, October 15, 2019. https://doi.org/10.1177/0146167219879122

Abstract: Zuckerman et al. (2013) conducted a meta-analysis of 63 studies that showed a negative intelligence–religiosity relation (IRR). As more studies have become available and because some of Zuckerman et al.’s (2013) conclusions have been challenged, we conducted a new meta-analysis with an updated data set of 83 studies. Confirming previous conclusions, the new analysis showed that the correlation between intelligence and religious beliefs in college and noncollege samples ranged from −.20 to −.23. There was no support for mediation of the IRR by education but there was support for partial mediation by analytic cognitive style. Thus, one possible interpretation for the IRR is that intelligent people are more likely to use analytic style (i.e., approach problems more rationally). An alternative (and less interesting) reason for the mediation is that tests of both intelligence and analytic style assess cognitive ability. Additional empirical and theoretical work is needed to resolve this issue.

Keywords: intelligence, religiosity, meta-analysis, analytic thinking

Check also The Myth of the Stupid Believer: The Negative Religiousness–IQ Nexus is Not on General Intelligence (g) and is Likely a Product of the Relations Between IQ and Autism Spectrum. Edward Dutton et al. Journal of Religion and Health, Oct 5 2019. https://www.bipartisanalliance.com/2019/10/the-myth-of-stupid-believer-negative.html
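
For readers curious about the mechanics behind the pooled correlations of −.20 to −.23: meta-analyses of correlations are typically pooled on the Fisher z scale with a random-effects model. Below is a minimal sketch of the DerSimonian–Laird estimator run on invented study correlations and sample sizes; only the method, not the numbers, reflects the paper.

```python
import numpy as np

def random_effects_meta(r, n):
    """Pool correlations on the Fisher z scale with a
    DerSimonian-Laird random-effects model."""
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                 # Fisher z transform
    v = 1.0 / (n - 3.0)               # within-study variance of z
    w = 1.0 / v
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)        # between-study variance
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    z_re = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = (np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se))
    return np.tanh(z_re), ci

# Hypothetical study-level correlations and sample sizes, for illustration only
r_pooled, ci = random_effects_meta([-0.25, -0.18, -0.22, -0.15],
                                   [200, 350, 120, 500])
print(f"pooled r = {r_pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```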

The size of the stereotype threat effect that can be experienced on tests of cognitive ability in operational (real-world) scenarios such as college admissions tests and employment testing may range from negligible to small

Shewach, O. R., Sackett, P. R., & Quint, S. (2019). Stereotype threat effects in settings with features likely versus unlikely in operational test settings: A meta-analysis. Journal of Applied Psychology, Oct 2019. http://dx.doi.org/10.1037/apl0000420

Abstract: The stereotype threat literature primarily comprises lab studies, many of which involve features that would not be present in high-stakes testing settings. We meta-analyze the effect of stereotype threat on cognitive ability tests, focusing on both laboratory and operational studies with features likely to be present in high stakes settings. First, we examine the features of cognitive ability test metric, stereotype threat cue activation strength, and type of nonthreat control group, and conduct a focal analysis removing conditions that would not be present in high stakes settings. We also take into account a previously unrecognized methodological error in how data are analyzed in studies that control for scores on a prior cognitive ability test, which resulted in a biased estimate of stereotype threat. The focal sample, restricting the database to samples utilizing operational testing-relevant conditions, displayed a threat effect of d = −.14 (k = 45, N = 3,532, SDδ = .31). Second, we present a comprehensive meta-analysis of stereotype threat. Third, we examine a small subset of studies in operational test settings and studies utilizing motivational incentives, which yielded d-values ranging from .00 to −.14. Fourth, the meta-analytic database is subjected to tests of publication bias, finding nontrivial evidence for publication bias. Overall, results indicate that the size of the stereotype threat effect that can be experienced on tests of cognitive ability in operational scenarios such as college admissions tests and employment testing may range from negligible to small.
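
The effect size here is Cohen's d: the mean score difference between threat and control conditions in pooled standard deviation units. A minimal sketch, with assumed group means and SDs rather than any study's data, shows what a d of about −.14 amounts to in practice.

```python
import numpy as np

def cohens_d(threat, control):
    """Standardized mean difference between threat and control scores,
    using the pooled standard deviation."""
    t, c = np.asarray(threat, float), np.asarray(control, float)
    n1, n2 = len(t), len(c)
    pooled_var = ((n1 - 1) * t.var(ddof=1) + (n2 - 1) * c.var(ddof=1)) / (n1 + n2 - 2)
    return (t.mean() - c.mean()) / np.sqrt(pooled_var)

# Hypothetical scores: a d near -0.14 means the threat group scored about
# 0.14 pooled standard deviations below the control group on the test.
rng = np.random.default_rng(0)
threat = rng.normal(99.3, 15, 400)    # assumed mean/SD, not the paper's data
control = rng.normal(101.4, 15, 400)
print(f"d = {cohens_d(threat, control):.2f}")
```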

Lifespan Cognitive Reserve—A Secret to Coping With Neurodegenerative Pathology

Lifespan Cognitive Reserve—A Secret to Coping With Neurodegenerative Pathology. Sylvia Villeneuve. JAMA Neurol. 2019;76(10):1145-1146. October 2019, doi:10.1001/jamaneurol.2019.2899

Given the limited success of therapeutic interventions for Alzheimer disease, there is increased interest in understanding whether modifiable factors can help cope with or postpone the appearance of brain pathology. It is estimated that about 35% of Alzheimer risk is modifiable.1,2 Epidemiologic studies have shown that lifetime exposures to higher education, higher occupational attainment, and cognitively stimulating activities are associated with reduced risk of Alzheimer dementia.3 Autopsy studies have shown interindividual differences in the amount of brain pathology people can tolerate before manifesting cognitive impairments, and autopsied brains of about one-third of individuals who are cognitively normal meet neuropathological criteria for Alzheimer disease.4 About a decade ago, the concept of cognitive reserve was proposed to account for the discrepancy between brain pathology and cognitive status.5 The broad hypothesis was that individuals with enriched lifelong exposures would be able to better tolerate brain pathology in late life. Many studies have investigated how one can cope with brain damage using proxies of neurodegeneration or synaptic integrity. However, the gold standard for testing the reserve hypothesis is the direct measurement of brain pathology at autopsy.

In this issue of JAMA Neurology, Xu and colleagues6 used the resources of the Rush Memory and Aging Project for an empirical test of the reserve hypothesis. They first assessed whether individuals with lifelong protective factors have a reduced risk of dementia. More importantly, they examined the influence of Alzheimer pathology on this association, thereby providing a direct test of the reserve hypothesis. The Rush Memory and Aging Project is among the largest longitudinal studies that include both comprehensive in vivo and autopsy data. The study had 1602 participants without dementia, of whom 611 had died during the study follow-up period and had autopsy data available. Of interest, the authors6 derived the reserve score by combining weighted measures of education, lifelong cognitive activity, and late-life social activity. In theory, the use of a composite score as a proxy for reserve instead of a single factor score (eg, education) should be more accurate if reserve is built on diverse lifelong experiences, as is commonly thought. Over a mean 6-year study follow-up (range, 1-20 years), one-quarter of these participants developed dementia. As expected, a higher reserve composite was associated with reduced incidence of dementia. Thus, participants in the highest tertile of reserve had about a 40% reduction in occurrence of dementia compared with those in the lowest tertile. A dose-effect association was also evident, in that individuals in the middle tertile showed about a 20% reduction in risk of dementia. Stated another way, individuals with high reserve scores experienced a delay in dementia onset of more than 7 years. Targeting lifestyle factors that enhance reserve could therefore reduce the incidence of new cases, while providing more than half a decade of cognitive health to individuals who would ultimately develop dementia.
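
As a quick check on the arithmetic, the hazard ratios reported by Xu and colleagues (0.61 for the highest reserve tertile and 0.77 for the middle tertile, relative to the lowest; see the abstract below) translate into the approximate reductions quoted above:

```python
# Hazard ratios from Xu et al. (abstract below): percent reduction in
# hazard is (1 - HR) * 100, relative to the lowest reserve tertile.
for label, hr in [("highest tertile", 0.61), ("middle tertile", 0.77)]:
    print(f"{label}: HR = {hr} -> about {(1 - hr) * 100:.0f}% lower hazard of dementia")
# highest tertile: HR = 0.61 -> about 39% lower hazard of dementia
# middle tertile: HR = 0.77 -> about 23% lower hazard of dementia
```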

The contribution of individual factors to dementia risk was assessed in a secondary set of analyses. Here it was interesting to note that lifestyle behaviors in midlife and late life were the main protective factors against dementia, suggesting that preventive interventions should probably be started in midlife but may also be successful if started in late life. In 2015, the results of the Finnish Geriatric Intervention Study to Prevent Cognitive Impairment and Disability (FINGER),7 which involved physical exercise, nutrition, cognitive stimulation, and self-monitoring of heart health, suggested the possibility of preventing cognitive decline using a multidomain intervention among older individuals who were at risk. The FINGER model is now being adapted and implemented in North America (the Alzheimer's Association US Study to Protect Brain Health Through Lifestyle Intervention to Reduce Risk [US POINTER]), Asia (the Singapore Intervention Study to Prevent Cognitive Impairment and Disability [SINGER] and the Multimodal Interventions to Delay Dementia and Disability in Rural China [MIND-CHINA]), Australia (the Maintain Your Brain [MYB] trial), and Europe (the Multimodal Preventive Trial for Alzheimer’s Disease [MIND-AD]). It is estimated that a multidomain lifestyle intervention that achieves a 25% reduction in Alzheimer risk could prevent more than 3 million cases worldwide.1

Focusing on the autopsy findings, the positive association of reserve with reducing the risk of dementia onset was present even in individuals with high degrees of brain pathology. This was true for Alzheimer disease pathology, but also for gross infarcts and microscopic infarcts. No sex difference was reported for this association. A recent in vivo positron emission tomography (PET) study found that women exhibit higher tau burden than men in the preclinical phase of the disease,8 which could suggest that women can tolerate more pathology before developing cognitive impairments. Since women are at increased risk of developing Alzheimer dementia, particularly in late life, the role of sex differences in lifestyle-protective factors and associated resilience to the pathology will need further investigation.

The protective effect of lifespan cognitive reserve on the incidence of dementia in individuals with high pathology was also found when removing individuals with baseline mild cognitive impairments from the analysis. This last point is important, because individuals who are cognitively normal are the targets of preventive trials and therefore increasing effort is devoted toward a better understanding of the factors that could protect this group from developing cognitive impairment. Additionally, it is possible that some reserve-associated phenomena are only quantifiable at a certain stage of the disease, stressing the need to explore reserve-associated phenomena at different disease stages. For instance, a new study9 has shown that, while high reserve was associated with a slower cognitive decline in individuals without dementia, it was associated with a faster decline in individuals with dementia. One interpretation for this unexpected finding is that when individuals with high reserve can no longer cope with the pathology, their clinical decline is faster than that of individuals with lower reserve because the disease stage is more advanced.5

Evidence from PET research further suggests that some lifestyle factors that can enhance cognitive reserve might also postpone manifestation of brain pathology. This new concept has been called resistance.10 Thus, while the term reserve refers to the ability to cope with AD pathology, resistance refers to the ability to avoid the pathology in the first place. For instance, enriched cognitive and physical activities have been associated with a reduced amyloid burden in individuals who are cognitively normal.11-13 It has even been proposed that protective lifestyle factors could partially offset the negative influence of the APOE ε4 allele on the risk of developing Alzheimer pathology.14,15 In their study, Xu and colleagues6 found no evidence supporting the resistance hypothesis, with individuals with high and low reserve scores exhibiting similar amounts of pathology. The results were similar when removing individuals with baseline mild cognitive impairment from the analysis or using cognitive status as an interactive term. The association between protective lifestyle factors and brain pathology, if it exists, might not be linear but rather closer to a U-shaped distribution and therefore extremely difficult to detect in cross-sectional studies. Plausibly, as soon as individuals with high reserve develop Alzheimer pathology, they are no longer resisting the pathology but are limited to coping with it.

This last point may serve as a reminder that cognitive reserve is a dynamic and therefore complex concept for study. Amyloid and tau PET imaging, even if they only give estimates of pathology, should be extremely powerful tools to quantify reserve-associated processes in real time, since they allow tracking of evolution of pathology in vivo. They can also contribute to simultaneous measurement of protective lifestyle factors, Alzheimer pathology, and cognitive performance. Even more interestingly, PET imaging can permit concurrent quantitation of the effects of preventive lifestyle interventions on Alzheimer pathology and associated cognitive decline.

The article by Dr Xu and colleagues6 provides strong evidence that individuals with higher reserve can tolerate more brain pathology and therefore experience a reduced risk of dementia compared with others. While autopsy studies will probably remain the gold standard for testing the reserve hypothesis, their limitations in testing the dynamics of this phenomenon are obvious. New fluid, PET, and magnetic resonance imaging techniques for tracking Alzheimer and vascular pathology in vivo will certainly add to comprehension of reserve phenomena in the years to come.

Check also Association of Lifespan Cognitive Reserve Indicator With Dementia Risk in the Presence of Brain Pathologies. Hui Xu et al. JAMA Neurol. 2019;76(10):1184-1191, July 14, 2019, doi:10.1001/jamaneurol.2019.2455
Key Points

Question  Is a high lifespan cognitive reserve (CR) indicator associated with a reduction in dementia risk, and how strong is this association in the presence of high brain pathologies?

Findings  In this cohort study including 1602 dementia-free older adults, high lifespan CR was associated with a decreased risk of dementia. This association was present in people with high Alzheimer disease and vascular pathologies.

Meaning  Accumulative educational and mentally stimulating activities enhancing CR throughout life might be a feasible strategy to prevent dementia, even for people with high brain pathologies.


Importance  Evidence on the association of lifespan cognitive reserve (CR) with dementia is limited, and the strength of this association in the presence of brain pathologies is unknown.

Objective  To examine the association of lifespan CR with dementia risk, taking brain pathologies into account.

Design, Setting, and Participants  This study used data from 2022 participants in the Rush Memory and Aging Project, an ongoing community-based cohort study with annual follow-up from 1997 to 2018 (mean follow-up, 6 years; maximum follow-up, 20 years). After excluding 420 individuals who had prevalent dementia, missing data on CR, or dropped out, 1602 dementia-free adults were identified at baseline and evaluated to detect incident dementia. During follow-up, 611 died and underwent autopsies. Data were analyzed from May to September 2018.

Exposures  Information on CR factors (education; early-life, midlife, and late-life cognitive activities; and social activities in late life) was obtained at baseline. Based on these factors, lifespan CR scores were captured using a latent variable from a structural equation model and were divided into tertiles (lowest, middle, and highest).

Main Outcomes and Measures  Dementia was diagnosed following international criteria. Neuropathologic evaluations for Alzheimer disease and other brain pathologies were performed in autopsied participants. The association of lifespan CR with dementia or brain pathologies was estimated using Cox regression models or logistic regression.

Results  Of the 1602 included participants, 1216 (75.9%) were women, and the mean (SD) age was 79.6 (7.5) years. During follow-up, 386 participants developed dementia (24.1%), including 357 participants with Alzheimer disease–related dementia (22.3%). The multiadjusted hazard ratios (HRs) of dementia were 0.77 (95% CI, 0.59-0.99) for participants in the middle CR score tertile and 0.61 (95% CI, 0.47-0.81) for those in the highest CR score tertile compared with those in the lowest CR score tertile. In autopsied participants, CR was not associated with most brain pathologies, and the association of CR with dementia remained significant after additional adjustment for brain pathologies (HR, 0.60; 95% CI, 0.42-0.86). The highest CR score tertile was associated with a reduction in dementia risk, even among participants with high Alzheimer disease pathology (HR, 0.57; 95% CI, 0.37-0.87) and any gross infarcts (HR, 0.34; 95% CI, 0.18-0.62).

Conclusions and Relevance  High lifespan CR is associated with a reduction in dementia risk, even in the presence of high brain pathologies. Our findings highlight the importance of lifespan CR accumulation in dementia prevention.
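
The abstract names Cox regression as the main survival model for the dementia outcome. Here is a minimal sketch of that kind of fit using the lifelines library on simulated follow-up data; all variable names and effect sizes below are invented for illustration and are not taken from the study.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated cohort: higher reserve tertile lowers the dementia hazard
# (effect sizes are assumptions chosen only to mimic the reported direction).
rng = np.random.default_rng(42)
n = 300
cr_tertile = rng.integers(0, 3, n)          # 0 = lowest, 2 = highest reserve
age = rng.normal(80, 7, n)
hazard = 0.08 * np.exp(-0.25 * cr_tertile + 0.03 * (age - 80))
event_time = rng.exponential(1.0 / hazard)  # years until dementia onset
follow_up = np.minimum(event_time, 20.0)    # administrative censoring at 20 y
dementia = (event_time <= 20.0).astype(int)

df = pd.DataFrame({"years": follow_up, "dementia": dementia,
                   "cr_tertile": cr_tertile, "age": age})
cph = CoxPHFitter().fit(df, duration_col="years", event_col="dementia")
# exp(coef) is the hazard ratio per unit increase in each covariate
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```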

Security agents judged the dolichofacial individual as the most prone to commit crimes and saw him as less trustworthy

Does the facial pattern give individuals a profile of a crime suspect? Nathalia de Lima Santos et al. Biosci. J., Uberlândia, v. 35, n. 5, p. 1614-1621, Sep./Oct. 2019. http://dx.doi.org/10.14393/BJ-v35n5a2019-46326

ABSTRACT: To evaluate the influence of mesofacial, brachyfacial and dolichofacial facial patterns on giving an individual the profile of a crime suspect in the eyes of public security agents. This study had a cross-sectional design, conducted with public security agents of both sexes (n=100), using images of facial composites (police sketches) of individuals with different facial patterns (mesofacial, brachyfacial and dolichofacial). With these images in hand, a questionnaire was created, divided into three parts: the first, in which all the images were presented together, allowing comparison among them; the second, in which each image was evaluated separately, followed by questions; and the third, which consisted of a visual analog scale presenting a bar with marks from 0 to 100, where 0 represented an untrustworthy individual, 50 an individual who could be trusted, and 100 a very trustworthy individual. Once all the data had been obtained, statistical analyses were performed using the Chi-square and Friedman tests. The level of significance adopted was 5% (α=0.05). Security agents regarded the dolichofacial individual as the most prone to commit crimes and perceived him as less secure and less trustworthy than the mesofacial and brachyfacial individuals (p < 0.001). The dolichofacial profile had a negative influence on the judgment of security agents, who attributed to it the character of a crime suspect and a low level of trustworthiness.

KEYWORDS: Face. Social perception. Judgment. Crime.
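
Both tests named in the abstract are available in scipy.stats. The sketch below runs them on invented counts and trust ratings standing in for the study's data; it shows the shape of the analysis, not the paper's results.

```python
import numpy as np
from scipy.stats import chi2_contingency, friedmanchisquare

# Chi-square test on a hypothetical 2x3 contingency table: counts of agents
# naming each facial pattern (meso, brachy, dolicho) as most crime-prone.
counts = np.array([[12, 18, 70],     # chosen as most prone to commit crimes
                   [88, 82, 30]])    # not chosen
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p:.4f}")

# Friedman test: each of 100 agents rates all three faces on the 0-100
# trust scale, so the ratings are repeated measures on the same rater.
rng = np.random.default_rng(1)
meso = rng.normal(60, 10, 100)       # invented trust ratings
brachy = rng.normal(55, 10, 100)
dolicho = rng.normal(35, 10, 100)
stat, p = friedmanchisquare(meso, brachy, dolicho)
print(f"Friedman chi-square = {stat:.1f}, p = {p:.4f}")
```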