(C) PLOS One
This story was originally published by PLOS One and is unaltered.
Reservoir host immunology and life history shape virulence evolution in zoonotic viruses [1]
Authors: Cara E. Brook (Department of Ecology and Evolution, University of Chicago, Chicago, Illinois, United States of America); Carly Rozins (Department of Science, Technology, …)
Date: 2023-09
The management of future pandemic risk requires a better understanding of the mechanisms that determine the virulence of emerging zoonotic viruses. Meta-analyses suggest that the virulence of emerging zoonoses is correlated with but not completely predictable from reservoir host phylogeny, indicating that specific characteristics of reservoir host immunology and life history may drive the evolution of viral traits responsible for cross-species virulence. In particular, bats host viruses that cause higher case fatality rates upon spillover to humans than those derived from any other mammal, a phenomenon that cannot be explained by phylogenetic distance alone. In order to disentangle the fundamental drivers of these patterns, we develop a nested modeling framework that highlights mechanisms that underpin the evolution of viral traits in reservoir hosts that cause virulence following cross-species emergence. We apply this framework to generate virulence predictions for viral zoonoses derived from diverse mammalian reservoirs, recapturing trends in virus-induced human mortality rates reported in the literature. Notably, our work offers a mechanistic hypothesis to explain the extreme virulence of bat-borne zoonoses and, more generally, demonstrates how key differences in reservoir host longevity, viral tolerance, and constitutive immunity impact the evolution of viral traits that cause virulence following spillover to humans. Our theoretical framework offers a series of testable questions and predictions designed to stimulate future work comparing cross-species virulence evolution in zoonotic viruses derived from diverse mammalian hosts.
Funding: This work was funded by two National Institutes of Health grants (1R01AI129822-01 and 5DP2AI171120-02) to CEB, an Adolph C. and Mary Sprague Miller Institute for Basic Research fellowship to CEB, a Branco Weiss Society in Science fellowship to CEB, an American Association for the Advancement of Science/L'Oréal-USA for Women in Science Fellowship to CEB, and a Defense Advanced Research Projects Agency PREEMPT Program subgrant (D18AC00031) to CEB. This work was further supported by two National Science Foundation-Division of Environmental Biology grants to MB (#2011109 and #2109860). Additionally, CR was supported by the One Health Modelling Network for Emerging Infections (OMNI-RÉUNIS) with the support of Natural Sciences and Engineering Research Council of Canada and the Public Health Agency of Canada. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
The devastating impact of the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) pandemic highlights the extreme public health outcomes that can result upon cross-species emergence of zoonotic viruses. Estimating the relative threats posed by potential future zoonoses is an important but challenging public health undertaking. In particular, efforts to predict the virulence of emerging viruses can be complicated since chance will always play a role in dictating the initial spillover that precedes selection [1], virulence upon emergence may be maladaptive in novel hosts [2,3], and patterns in available data may be muddled by ascertainment bias if avirulent infections go underreported [1]. Nonetheless, a growing body of recent work highlights clear associations between reservoir and spillover host phylogeny and the virulence of a corresponding cross-species infection [4–7]. In many cases, increasing phylogenetic distance between reservoir and spillover hosts is correlated with higher-virulence infections [4–6], suggesting that spillover host immune systems may be poorly equipped to tolerate viral traits optimized in more distantly related reservoirs. Still, the effect of phylogeny on spillover virulence appears to supersede that of simple phylogenetic distance [7,8], indicating that taxon-specific reservoir host immunological and life history traits may be important drivers of cross-species virus virulence. Notably, bats host viruses that cause higher human case fatality rates than zoonoses derived from other mammals and birds, a phenomenon that cannot be explained by phylogenetic distance alone [8]. Understanding the mechanisms that select for the evolution of unique viral traits in bats compared to those selected in other mammalian reservoirs should enable us to better predict the virulence of future zoonotic threats.
Although the disproportionate frequency with which the Chiropteran order may source viral zoonoses remains debated [9,10], the extraordinary human pathology induced by many bat-borne zoonoses—including Ebola and Marburg filoviruses, Hendra and Nipah henipaviruses, and SARS, MERS, and SARS-CoV-2 coronaviruses [11]—is not contested. Remarkably, bats demonstrate limited clinical pathology from infection with viruses that cause extreme morbidity and mortality in other hosts [12]. Bats avoid pathological outcomes from viral infection via a combination of unique resistance and tolerance mechanisms, which, respectively, limit the viral load accrued during infection (“resistance”) and reduce the disease consequences of a given viral load (“tolerance”) [13–16]. Viral resistance mechanisms vary across bat species; those described to date include: receptor incompatibilities that limit the extent of infection for certain viruses in certain bats [17–20], constitutive expression of antiviral cytokines in some bat species [21], and enhanced autophagy [22] and heat-shock protein expression [23] in others. Expansion of antiviral APOBEC3 genes has also been documented in a few well-studied bat genomes [24,25]. While such robust antiviral immunity would result in widespread immunopathology in most mammals, bats—as the only mammals capable of powered flight—have evolved numerous unique mechanisms of mitigating the inflammation incurred during the intensive physiological process of flying [26–28]. These anti-inflammatory adaptations include loss of PYHIN [29–31] and down-regulation of NLRP3 [32] inflammasome-forming gene families, loss of pro-inflammatory genes in the NF-κB pathway [24], dampened interferon activation in the STING pathway [33], and diminished caspase-1 inflammatory signaling [34].
In addition to facilitating flight, this resilience to inflammation has yielded the apparent by-products of extraordinarily long bat lifespans [35] and tolerance of the immunopathology that typically results from viral infection [11]. Moreover, recent work demonstrates how high virus growth rates easily tolerated in constitutively antiviral bat cells cause significant pathology in cells lacking these unique antiviral defenses [36]. The extent to which inflammatory tolerance may modulate the evolution of the viruses that bats host, however, remains largely unexplored.
Modern theory on the evolution of virulence typically assumes, either explicitly or implicitly, that high pathogen growth rates should both enhance between-host transmission and elevate infection-induced morbidity or mortality, resulting in a trade-off between virulence and transmission [37–39]. Theory further suggests that because viral “tolerance” mitigates virulence without reducing viral load, most host strategies of tolerance should select for higher growth rate pathogens that achieve gains in between-host transmission without causing damage to the original host [39–41]. The widely touted viral tolerance of bats [11,16,42–44] should therefore be expected to support the evolution of enhanced virus growth rates, which—though avirulent to bats—may cause significant pathology upon spillover to hosts lacking unique features of bat immunology and physiology. Beyond tolerance, other life history characteristics unique to diverse reservoir hosts should also impact the evolution of traits in the viruses they host—with important consequences for cross-species virulence following spillover. However, to date, we lack a specific theory that examines the relative impact of reservoir host life history on spillover virulence. Here, we explore the extent to which the immunological and life history traits of mammalian reservoirs can explain variation in the virulence of zoonotic viruses emerging into human hosts.
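The classic virulence-transmission trade-off can be sketched with a minimal toy model (this is an illustration, not the authors' framework): transmission saturates with within-host growth rate, infection-induced mortality accelerates with growth rate but is dampened by host tolerance, and selection favors the growth rate that maximizes the pathogen's basic reproduction number R0. All functional forms and parameter values below are illustrative assumptions.

```python
# Toy trade-off model: host tolerance rescales virulence, shifting the
# pathogen's optimal within-host growth rate upward. All forms/values
# are illustrative assumptions, not parameters from the paper.

def beta(r, b=1.0, k=1.0):
    """Transmission rate: saturating in within-host growth rate r."""
    return b * r / (r + k)

def alpha(r, tolerance, c=0.05):
    """Infection-induced mortality: accelerates with r, reduced by tolerance."""
    return c * r**2 / tolerance

def r0(r, tolerance, mu=0.02, gamma=0.5):
    """Basic reproduction number: transmission over total loss rate."""
    return beta(r) / (alpha(r, tolerance) + mu + gamma)

def optimal_growth_rate(tolerance, grid=None):
    """Grid-search the growth rate that maximizes R0 at a given tolerance."""
    grid = grid or [i * 0.01 for i in range(1, 2001)]
    return max(grid, key=lambda r: r0(r, tolerance))

# A more tolerant host (e.g., a bat-like reservoir) selects for a
# faster-growing virus than a less tolerant host:
r_low = optimal_growth_rate(tolerance=1.0)
r_high = optimal_growth_rate(tolerance=4.0)
assert r_high > r_low
```

The interior optimum arises because transmission gains saturate while mortality costs keep accelerating; raising tolerance deflates those costs, so the balance point moves to a higher growth rate, which is the quantity hypothesized to be dangerous after spillover.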
Results
Estimating zoonotic virus virulence in spillover human hosts
After establishing optimal growth rates for viruses evolved in diverse reservoir host orders, we subsequently model the corresponding “spillover virulence” (αS) of these viruses following emergence into a human host. Zoonotic spillovers are modeled as acute infections in the human, and virulence is calculated while varying only the growth rate of the spillover virus and the human tolerance of direct virus pathology (TvS) between viruses evolved in differing reservoir orders. We vary this last parameter, TvS, to account for any differences in virus adaptation to reservoir host immune systems that are not already captured in estimation of reservoir host TwR and g0R. TvS is thus computed as the inverse of the scaled time to the most recent common ancestor (MRCA) for each mammalian reservoir host order from Primates (Figs 3E and S4; S1 and S2 Tables), such that we estimate low human tolerance to viruses evolved in phylogenetically distant orders (e.g., monotreme and marsupial orders) and high human tolerance to viruses evolved in Primate and Primate-adjacent orders. In general, the modulating effects of TvS do little to alter virulence rankings of zoonotic viruses from those predicted by raw viral growth rate alone: the highest spillover virulence is predicted from viruses evolved in orders Monotremata and Chiroptera, and the lowest from viruses evolved in orders Cetartiodactyla and Scandentia (Fig 3D and 3G). Notably, modulating TvS enhances predictions of spillover virulence for some marsupial clades (Peramelemorphia, Dasyuromorphia, Diprotodontia) relative to eutherian orders with similar predicted growth rates. This results in dampened predicted spillover virulence for the eutherian order Perissodactyla as compared with the marsupial orders Dasyuromorphia and Diprotodontia, despite lower predicted growth rates for the latter 2 clades (Figs 3D, 3G and S5). Similarly, this elevates spillover virulence predictions for Peramelemorphia above Eulipotyphla, Afrosoricida, and Hyracoidea, despite higher predicted growth rates in these 3 eutherian orders (Figs 3D, 3G and S5).
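The phylogenetic-tolerance adjustment described above can be sketched numerically. In this illustration (not the paper's code), divergence times from Primates are rough TimeTree-style values in millions of years, and the optimal growth rates are hypothetical numbers chosen only to mirror the qualitative rankings reported in the text.

```python
# Illustrative sketch: human tolerance TvS as the inverse of scaled time
# to MRCA with Primates, modulating predicted spillover virulence.
# Divergence times (Myr) are rough approximations; r_opt values are
# hypothetical and chosen only for illustration.

mrca_myr = {
    "Monotremata": 180.0,    # most distant from Primates
    "Diprotodontia": 160.0,  # marsupial
    "Chiroptera": 94.0,
    "Perissodactyla": 94.0,
    "Cetartiodactyla": 94.0,
}

# Inverse of scaled time to MRCA: closer relatives get higher tolerance.
t_max = max(mrca_myr.values())
T_vS = {order: t_max / t for order, t in mrca_myr.items()}

# Hypothetical optimal within-host growth rates evolved in each reservoir:
r_opt = {
    "Monotremata": 2.8,
    "Diprotodontia": 1.5,
    "Chiroptera": 3.0,
    "Perissodactyla": 1.6,
    "Cetartiodactyla": 1.2,
}

def spillover_virulence(r, tolerance):
    """Human virulence rises with viral growth rate, falls with tolerance."""
    return r / tolerance

alpha_S = {o: spillover_virulence(r_opt[o], T_vS[o]) for o in r_opt}
ranked = sorted(alpha_S, key=alpha_S.get, reverse=True)
```

With these toy inputs, low human tolerance lifts the marsupial order Diprotodontia above Perissodactyla despite its lower growth rate, reproducing the qualitative reordering the text attributes to TvS.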
---
[1] URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002268
License: Creative Commons Attribution 4.0 (CC BY 4.0).