Background: Multidrug-resistant organisms (MDROs), such as carbapenem-resistant Enterobacterales (CRE), can spread rapidly in a region. Facilities that care for high-acuity patients with long average lengths of stay (eg, long-term acute-care hospitals or LTACHs and ventilator-capable skilled nursing facilities or vSNFs) may amplify this spread. We assessed the impact of interventions on regional CRE spread when implemented individually, as bundles, and at different facility types. Methods: We developed a deterministic compartmental model, parametrized using CRE data reported to the NHSN and patient transfer data from the CMS specific to a US state. The model includes the community and the healthcare facilities within the state. Individuals may be either susceptible or infected and infectious. Infected patients determined to have CRE through admission screening or point-prevalence surveys at a facility are placed in a state of lower transmissibility if enhanced infection prevention and control (IPC) practices are in place. Results: Intervention bundles that included periodic point-prevalence surveys and enhanced IPC at high-acuity postacute-care facilities had the greatest impact on regional prevalence 10 years into an outbreak; the benefits of including admission screening and improved interfacility communication were more modest (Fig. 1A). Delaying interventions by 3 years is predicted to result in smaller reductions in prevalence (Fig. 1B). Increasing the frequency of point-prevalence surveys from biannual to quarterly increased the relative reduction in prevalence substantially (from 25% to 44%) if conducted from the start of an outbreak. IPC improvements in vSNFs resulted in greater relative reductions than in LTACHs. Admission screening at LTACHs and vSNFs was predicted to have a greater impact on prevalence if in place prior to CRE introduction (~20% reduction), and the impact decreased by approximately half if implementation was delayed until 3 years after CRE introduction. In contrast, the effect of admission screening at acute-care hospitals (ACHs) was smaller (~10% reduction in prevalence) and did not change with implementation delays. Conclusions: Our model predicts that interventions that limit unrecognized MDRO introduction to, or dispersal from, LTACHs and vSNFs through screening will slow regional spread. Interventions to detect colonization and improve IPC practices within LTACHs and vSNFs may substantially reduce the regional burden. Prevention strategies are predicted to have the greatest impact when interventions are bundled and implemented before an MDRO is identified in a region, but reduction in overall prevalence is still possible if implemented after initial MDRO spread.
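As a rough illustration of the model class described above (not the study's actual parametrization), a minimal susceptible–infected compartmental sketch for a single facility, with a detected compartment placed under enhanced IPC at reduced transmissibility, might look like the following; all rates are assumed placeholders:

```python
# Minimal sketch of a deterministic compartmental model in the spirit of the
# abstract: susceptible (S), undetected infected (I), and detected (D) patients,
# where detection (eg, via point-prevalence surveys) moves patients into a
# lower-transmissibility state under enhanced IPC. All values are illustrative.
import numpy as np
from scipy.integrate import odeint

beta = 0.10       # transmission rate from undetected infected patients (assumed)
ipc_factor = 0.4  # relative transmissibility once detected under enhanced IPC (assumed)
sigma = 1 / 180   # detection rate from periodic surveys (assumed)
gamma = 1 / 365   # clearance rate (assumed)

def model(y, t):
    S, I, D = y
    N = S + I + D
    infection = beta * S * (I + ipc_factor * D) / N
    return [-infection + gamma * (I + D),      # S: cleared patients return
            infection - (sigma + gamma) * I,   # I: undetected infected
            sigma * I - gamma * D]             # D: detected, under enhanced IPC

t = np.linspace(0, 3650, 3651)                 # 10-year horizon, as in the abstract
traj = odeint(model, [990, 10, 0], t)
print(f"prevalence at 10 years: {traj[-1, 1:].sum() / traj[-1].sum():.1%}")
```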
The opioid epidemic in the United States continues to worsen: in 2020, opioid overdose deaths hit an all-time high of 92,183. This underscores the need for more effective and readily available treatments for patients with opioid use disorder (OUD). Prescription digital therapeutics (PDTs) are FDA-authorized treatments delivered via mobile devices (eg, smartphones). A real-world pilot study was conducted in an outpatient addiction treatment program to evaluate patient engagement with and use of a PDT for patients with OUD. The objective was to assess the ability of the PDT to improve engagement and care for patients receiving buprenorphine medication for opioid use disorder (MOUD).
Methods
Patients with OUD treated at an ambulatory addiction treatment clinic were invited to participate in the pilot. The reSET-O PDT comprises 31 core therapy lessons plus 36 supplementary lessons, along with contingency-management rewards. Patients were asked to complete at least 4 lessons per week for 12 weeks. Engagement and use data were collected via the PDT, and rates of emergency room visits were obtained from patient medical records. Data were compared with those of a similar group of 158 patients with OUD treated at the same clinic who did not use the PDT. Abstinence data were obtained from deidentified medical records.
Results
Pilot participants (N = 40) completed a median of 24 lessons: 73.2% completed at least 8 lessons and 42.5% completed all 31 core lessons. Pilot participants had significantly higher rates of abstinence from opioids in the 30 days prior to discharge from the program than the comparison group: 77.5% vs 51.9% (P < .01). Clinician-reported treatment retention for pilot participants vs the comparison group was 100% vs 70.9% 30 days after treatment initiation (P < .01), 87.5% vs 55.1% at 90 days post-initiation (P < .01), and 45.0% vs 38.6% at 180 days post-initiation (P = .46). Emergency room visits within 90 days of discharge from the addiction program were significantly reduced in pilot participants compared to the comparison group (17.3% vs 31.7%, P < .01).
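For readers who want to reproduce the headline abstinence comparison above, a minimal sketch of a two-proportion test follows; the abstract does not state which test was actually used, so this is an illustrative re-computation only:

```python
# Re-compute the abstinence comparison (77.5% of 40 pilot participants vs
# 51.9% of 158 comparison patients) with a two-proportion chi-square test.
from scipy.stats import chi2_contingency

abstinent = [round(0.775 * 40), round(0.519 * 158)]        # 31 and 82 patients
not_abstinent = [40 - abstinent[0], 158 - abstinent[1]]    # 9 and 76 patients
chi2, p, dof, _ = chi2_contingency([abstinent, not_abstinent])
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p < .01, consistent with the abstract
```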
Conclusions
These results demonstrate substantial engagement with a PDT in a real-world population of patients with OUD being treated with buprenorphine. Abstinence and retention outcomes were high compared with those of patients not using the PDT. These results demonstrate the potential value of PDTs to improve outcomes among patients with OUD, a population with a significant need for improved treatments.
Funding
Trinity Health Innovation and Pear Therapeutics Inc.
Objective: To evaluate coronavirus disease 2019 (COVID-19) vaccine hesitancy among healthcare personnel (HCP) with significant clinical exposure to COVID-19 at 2 large, academic hospitals in Philadelphia, Pennsylvania.
Design, setting, and participants:
HCP were surveyed in November–December 2020 about their intention to receive the COVID-19 vaccine.
Methods:
The survey measured intent among HCP to receive a COVID-19 vaccine, the timing of vaccination, and reasons for or against vaccination. Among patient-facing HCP, multivariable logistic regression evaluated the associations between healthcare position (medical doctor, nurse practitioner or physician assistant, and registered nurse) and vaccine hesitancy (intending to decline, delay, or being unsure about vaccination), adjusting for demographic characteristics, reasons for or against receiving the vaccine, and prior receipt of routine vaccines.
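A minimal sketch of this kind of adjusted analysis is shown below; the data file, column names, and covariate set are hypothetical placeholders, not the study's actual variables:

```python
# Illustrative sketch: logistic regression of hesitancy (decline/delay/unsure
# = 1) on healthcare position with demographic adjustment, reported as
# adjusted odds ratios. All names here are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hcp_survey.csv")  # hypothetical survey extract
fit = smf.logit(
    "hesitant ~ C(position, Treatment(reference='MD_DO'))"
    " + C(race) + C(sex) + age + prior_flu_vaccine",
    data=df,
).fit()
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```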
Results:
Among 5,929 HCP (2,253 medical doctors [MDs] and doctors of osteopathy [DOs], 582 nurse practitioners [NPs], 158 physician assistants [PAs], and 2,936 nurses), a higher proportion of nurses (47.3%) were COVID-19 vaccine hesitant than PAs and NPs (30.0%) or MDs and DOs (13.1%). The most common reasons for vaccine hesitancy included concerns about side effects, the newness of the vaccines, and lack of vaccine knowledge. Regardless of position, Black HCP were more hesitant than White HCP (odds ratio [OR], ∼5) and females were more hesitant than males (OR, ∼2).
Conclusions:
Although most clinical HCP intended to receive a COVID-19 vaccine, intention varied by healthcare position. Consistent with other studies, hesitancy was also significantly associated with race or ethnicity across all positions. These results highlight the importance of understanding and effectively addressing reasons for hesitancy, especially among frontline HCP who are at increased risk of COVID exposure and play a critical role in recommending vaccines to patients.
Vision and hearing impairments are highly prevalent in adults 65 years of age and older. There is a need to understand their association with multiple health-related outcomes. We analyzed data from the Resident Assessment Instrument for Home Care (RAI-HC). Home care clients were followed for up to 5 years and categorized into seven unique cohorts based on whether or not they developed new vision and/or hearing impairments. An absolute standardized difference (stdiff) of at least 0.2 was considered statistically meaningful. Most clients (at least 60%) were female, and 34.9 per cent developed a new sensory impairment. Those with a new concurrent vision and hearing impairment were more likely than those with no sensory impairments to experience a deterioration in receptive communication (stdiff = 0.68) and in cognitive performance (stdiff = 0.49). After multivariate adjustment, they had a twofold increased odds (adjusted odds ratio [OR] = 2.1; 95% confidence interval [CI]: 1.87, 2.35) of deterioration in cognitive performance. Changes in sensory functioning are common and have important effects on multiple health-related outcomes.
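For reference, the absolute standardized difference used as the 0.2 threshold above can be computed as in this minimal sketch, shown here for a binary outcome; the input rates are hypothetical:

```python
# Absolute standardized difference (stdiff) between two cohorts for a
# binary outcome, using the pooled-variance form.
import math

def stdiff_proportions(p1: float, p2: float) -> float:
    """Absolute standardized difference for two proportions."""
    pooled_var = (p1 * (1 - p1) + p2 * (1 - p2)) / 2
    return abs(p1 - p2) / math.sqrt(pooled_var)

# e.g. deterioration rates of 45% vs 20% in two cohorts (hypothetical numbers)
print(f"stdiff = {stdiff_proportions(0.45, 0.20):.2f}")  # > 0.2: meaningful
```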
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that now sits within the eleventh version of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature, and it marks the first time that the clinical nomenclature and the administrative nomenclature for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 expanded from 29 congenital cardiac codes in ICD-9 and 73 in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than ISNPCHD had originally considered acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Regionalizing pre-colonial Africa aids in the collection and interpretation of primary sources as data for further analysis. This article includes a map with six broad regions and 34 sub-regions, which form a controlled vocabulary within which researchers may geographically organize and classify disparate pieces of information related to Africa’s past. In computational terms, the proposed African regions serve as data containers to consolidate, link, and disseminate research across a growing number of digital humanities projects related to the history of the African diasporas before c. 1900. Our naming of regions aims to avoid terminologies derived from European slave traders, colonialism, and modern-day countries.
Rigorous scientific review of research protocols is critical to making funding decisions, and to the protection of both human and non-human research participants. Given the increasing complexity of research designs and data analysis methods, quantitative experts, such as biostatisticians, play an essential role in evaluating the rigor and reproducibility of proposed methods. However, there is a common misconception that a statistician’s input is relevant only to the sample size/power and statistical analysis sections of a protocol. The comprehensive nature of a biostatistical review, coupled with limited guidance on key components of protocol review, motivated this work. Members of the Biostatistics, Epidemiology, and Research Design Special Interest Group of the Association for Clinical and Translational Science used a consensus approach to identify the elements of research protocols that a biostatistician should consider in a review, and to provide specific guidance on how each element should be reviewed. We present the resulting review framework as an educational tool and guideline for biostatisticians navigating review boards and panels. We briefly describe the approach to developing the framework, and we provide a comprehensive checklist and guidance on review of each protocol element. We posit that the biostatistical reviewer, through their breadth of engagement across multiple disciplines and experience with a range of research designs, can and should contribute significantly beyond review of the statistical analysis plan and sample size justification. Through careful scientific review, we hope to prevent excess resource expenditure and risk to humans and animals on poorly planned studies.
Background: Successful containment of regional outbreaks of emerging multidrug-resistant organisms (MDROs) relies on early outbreak detection. However, deploying regional containment is resource intensive; understanding the distribution of different outbreak types might aid in classifying the types of responses needed. Objective: We used a stochastic model of disease transmission in a region where healthcare facilities are linked by patient sharing to explore optimal strategies for early outbreak detection. Methods: We simulated the introduction and spread of Candida auris in a region using a lumped-parameter stochastic adaptation of a previously described deterministic model (Clin Infect Dis 2019 Mar 28. doi:10.1093/cid/ciz248). Stochasticity was incorporated to capture the early-stage behavior of outbreaks with greater accuracy than is possible with a deterministic model. The model includes the real patient-sharing network among healthcare facilities in an exemplary US state, built from hospital claims data and the Minimum Data Set from the CMS for 2015. Disease progression rates for C. auris were estimated from surveillance data and the literature. Each simulated outbreak was initiated with an importation into a Dartmouth Atlas of Health Care hospital referral region. To estimate the potential burden, we quantified the “facility-time” during which infectious patients presented a risk of subsequent transmission within each healthcare facility. Results: Of the 28,000 simulated outbreaks initiated with an importation to the community, 2,534 resulted in patients entering the healthcare facility network. Among those, 2,480 (98%) initiated a short outbreak that died out or quickly attenuated within 2 years without additional intervention. In the simulations, if containment responses were initiated for each of those short outbreaks, facility time at risk decreased by only 3%. If containment responses were instead initiated for the 54 (2%) outbreaks lasting 2 years or longer, facility time at risk decreased by 79%. Sentinel surveillance through point-prevalence surveys (PPSs) at the 23 skilled-nursing facilities caring for ventilated patients (vSNFs) in the network detected 50 (93%) of the 54 longer outbreaks (median, 235 days to detection). Quarterly PPSs at the 23 largest acute-care hospitals (ie, those with the most discharges) detected 48 longer outbreaks (89%), but the time to detection was longer (median, 716 days). Quarterly PPSs at hospitals also identified 76 short outbreaks (vs only 14 via vSNF PPSs) that would have self-terminated without intervention. Conclusions: A vSNF-based sentinel surveillance system likely provides better information for guiding regional intervention for the containment of emerging MDROs than a similarly sized acute-care hospital–based system.
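A minimal chain-binomial sketch of the stochastic approach described above (assumed parameters, not the study's estimates) illustrates why many simulated importations die out quickly by chance, a behavior a deterministic model cannot capture:

```python
# Minimal stochastic (chain-binomial) outbreak sketch: each importation starts
# with one infectious patient; daily infections and clearances are binomial
# draws. Parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
beta, gamma, N, days = 0.11, 0.10, 500, 730  # assumed rates; 2-year horizon

def outbreak_persists() -> bool:
    """Return True if the outbreak is still active after `days` days."""
    I = 1
    for _ in range(days):
        new_inf = rng.binomial(N - I, 1 - np.exp(-beta * I / N))
        cleared = rng.binomial(I, 1 - np.exp(-gamma))
        I = I + new_inf - cleared
        if I == 0:
            return False
    return True

runs = 2000
persistent = sum(outbreak_persists() for _ in range(runs))
print(f"{persistent / runs:.1%} of simulated outbreaks persist past 2 years")
```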
The purpose of this study was to describe the prevalence of hearing loss (HL), vision loss (VL), and dual sensory loss (DSL) in Canadians 45–85 years of age. Audiometry and visual acuity were measured. Various levels of impairment severity were described. Results were extrapolated to the 2016 Canadian population. In 2016, 1,500,000 Canadian males 45–85 years of age had at least mild HL, 1,800,000 had at least mild VL, and 570,000 had DSL. Among females, 1,200,000 had at least mild HL, 2,200,000 had at least mild VL, and 450,000 had DSL. Among Canadians 45–85 years of age, mild, moderate, and severe HL was prevalent among 13.4 per cent, 3.7 per cent, and 0.4 per cent of males, and among 11.3 per cent, 2.3 per cent, and 0.2 per cent of females, respectively. Mild VL and moderate or severe VL were prevalent among 19.8 per cent and 2.4 per cent of males, respectively, and among 23.9 per cent and 2.6 per cent of females. At least mild DSL was prevalent among 6.4 per cent of males and 6.1 per cent of females.
Copy number variants (CNVs) play a significant role in disease pathogenesis in a small subset of individuals with schizophrenia (~2.5%). Chromosomal microarray testing is a first-tier genetic test for many neurodevelopmental disorders. Similar testing could be useful in schizophrenia.
Aims
To determine whether clinically identifiable phenotypic features could be used to successfully model schizophrenia-associated (SCZ-associated) CNV carrier status in a large schizophrenia cohort.
Method
Logistic regression and receiver operating characteristic (ROC) curves tested the accuracy of readily identifiable phenotypic features in modelling SCZ-associated CNV status in a discovery data-set of 1215 individuals with psychosis. A replication analysis was undertaken in a second psychosis data-set (n = 479).
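A minimal sketch of the discovery/replication workflow described above follows; the data files are hypothetical, and the feature names mirror the variables named in the Results:

```python
# Illustrative sketch: fit a logistic model of CNV carrier status on clinical
# features in a discovery set, then evaluate by AUROC on discovery and
# replication sets. File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

features = ["specific_learning_disorder", "developmental_delay",
            "comorbid_neurodev_disorder"]
disc = pd.read_csv("discovery.csv")    # hypothetical, n = 1215
repl = pd.read_csv("replication.csv")  # hypothetical, n = 479

clf = LogisticRegression().fit(disc[features], disc["cnv_carrier"])
print("discovery AUROC:",
      roc_auc_score(disc["cnv_carrier"], clf.predict_proba(disc[features])[:, 1]))
print("replication AUROC:",
      roc_auc_score(repl["cnv_carrier"], clf.predict_proba(repl[features])[:, 1]))
```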
Results
In the discovery cohort, specific learning disorder (OR = 8.12; 95% CI 1.16–34.88, P = 0.012), developmental delay (OR = 5.19; 95% CI 1.58–14.76, P = 0.003) and comorbid neurodevelopmental disorder (OR = 5.87; 95% CI 1.28–19.69, P = 0.009) were significant independent variables in modelling positive carrier status for a SCZ-associated CNV, with an area under the ROC curve (AUROC) of 74.2% (95% CI 61.9–86.4%). A model constructed from the discovery cohort including the developmental delay and comorbid neurodevelopmental disorder variables resulted in an AUROC of 83% (95% CI 52.0–100.0%) for the replication cohort.
Conclusions
These findings suggest that careful clinical history taking to document specific neurodevelopmental features may be informative in screening for individuals with schizophrenia who are at higher risk of carrying known SCZ-associated CNVs. Identification of genomic disorders in these individuals is likely to have clinical benefits similar to those demonstrated for other neurodevelopmental disorders.
Childhood maltreatment (CM) plays an important role in the development of major depressive disorder (MDD). The aim of this study was to examine whether CM severity and type are associated with MDD-related brain alterations, and how they interact with sex and age.
Methods
Within the ENIGMA-MDD network, CM severity and subtypes were assessed using the Childhood Trauma Questionnaire, and structural magnetic resonance imaging data from patients with MDD and healthy controls were analyzed in a mega-analysis comprising a total of 3872 participants aged between 13 and 89 years. Cortical thickness and surface area were extracted at each site using FreeSurfer.
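A minimal sketch of one region's model in such a mega-analysis, with a CM × age interaction and site adjustment, is shown below; variable names and the data file are hypothetical placeholders, not the ENIGMA-MDD pipeline itself:

```python
# Illustrative sketch: cortical thickness in one region regressed on CM
# severity, with a CM x age interaction and adjustment for sex, diagnosis,
# and site. All names here are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("enigma_mdd_region.csv")  # hypothetical pooled extract
fit = smf.ols(
    "thickness ~ cm_severity * age + sex + diagnosis + C(site)", data=df
).fit()
print(fit.summary().tables[1])  # inspect the cm_severity:age interaction term
```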
Results
CM severity was associated with reduced cortical thickness in the banks of the superior temporal sulcus and supramarginal gyrus as well as with reduced surface area of the middle temporal lobe. Participants reporting both childhood neglect and abuse had a lower cortical thickness in the inferior parietal lobe, middle temporal lobe, and precuneus compared to participants not exposed to CM. In males only, regardless of diagnosis, CM severity was associated with higher cortical thickness of the rostral anterior cingulate cortex. Finally, a significant interaction between CM and age in predicting thickness was seen across several prefrontal, temporal, and temporo-parietal regions.
Conclusions
Severity and type of CM may impact cortical thickness and surface area. Importantly, CM may influence age-dependent brain maturation, particularly in regions related to the default mode network, perception, and theory of mind.
In recent years, an increasing number of online archival databases of primary sources related to the history of the African diaspora and slavery have become freely and readily accessible for scholarly and public consumption. This proliferation of digital projects and databases presents a number of challenges related to aggregating data geographically according to the movement of people in and out of Africa across time and space. As a prerequisite for linking data across open-source digital projects, it has become necessary to divide the entire continent of precolonial Africa during the era of the slave trade into broad regions and sub-regions that allow data to be grouped effectively and meaningfully.
Since the 2000s, Greenland ice sheet mass loss has been accelerating, accompanied by increasing numbers of glacial earthquakes (GEs) at near-grounded glaciers. GEs are caused by the calving of km-scale icebergs that capsize against the terminus. Inversion of seismic records allows reconstruction of the GE source history, which captures capsize dynamics through iceberg-to-terminus contact. When compared with a catalog of contact forces from an iceberg capsize model, the seismic force history accurately recovers calving volumes, whereas earthquake magnitude fails to uniquely characterize iceberg size, giving errors of up to 1 km³. Calving determined from GEs recorded at eight glaciers in 1993–2013 accounts for up to 21% of the associated discharge and 6% of the Greenland mass loss. The proportion of discharge attributed to capsizing calving may be underestimated by at least 10%, as numerous events could not be identified by standard seismic detections (Olsen and Nettles, 2018). While calving production tends to stabilize in East Greenland, western glaciers have released more and larger icebergs since 2010 and have become major contributors to Greenland dynamic discharge. The production of GEs and calving behavior are controlled by glacier geometry, with bigger icebergs being produced when the terminus advances into deepening water. We illustrate how GEs can help in partitioning and monitoring Greenland mass loss and in characterizing capsize dynamics.
Though not often discussed explicitly in the literature, sample handling and preparation for advanced characterization techniques is a significant challenge for radiological materials. In this contribution, a detailed description is given of method development associated with characterization of highly radioactive and, in some cases, hygroscopic oxides of technetium. Details are given on the protocols, fixtures, and tooling developed for x-ray and neutron diffraction, x-ray absorption, Raman spectroscopy, magic angle spinning nuclear magnetic resonance, and electron paramagnetic resonance. In some cases, multiple iterations of improved sample holder design are described. Lessons learned in handling Tc compounds for these and similar characterization methods are discussed.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Electrochemical sensing systems are advancing into a wide range of new applications, moving from the traditional lab environment into disposable devices and systems, enabling real-time continuous monitoring of complex media. This transition presents numerous challenges ranging from issues such as sensitivity and dynamic range, to autocalibration and antifouling, to enabling multiparameter analyte and biomarker detection from an array of nanosensors within a miniaturized form factor. New materials are required not only to address these challenges, but also to facilitate new manufacturing processes for integrated electrochemical systems. This paper examines the recent advances in the instrumentation, sensor architectures, and sensor materials in the context of developing the next generation of nanoenabled electrochemical sensors for life sciences applications, and identifies the most promising solutions based on selected well established application exemplars.
The local chemistry of technetium-99 (99Tc) in oxide glasses is important for understanding the incorporation and long-term release of Tc from nuclear waste glasses, both those for legacy defense wastes and fuel reprocessing wastes. Tc preferentially forms Tc(VII), Tc(IV), or Tc(0) in glass, depending on the degree of reduction of the melt. Tc(VII) in oxide glasses is normally assumed to be present as isolated pertechnetate TcO4^- anions surrounded by alkali cations, but it can occasionally precipitate as alkali pertechnetate salts such as KTcO4 and NaTcO4 when the Tc concentration is high. In these cases, Tc(VII) is 4-coordinated by oxygen. A reinvestigation of the chemistry of alkali-technetium-oxides formed under oxidizing conditions and at temperatures used to prepare nuclear waste glasses showed that more highly coordinated alkali Tc(VII) oxide species had been reported, including those containing the TcO5^3- and TcO6^5- anions. The chemistry of alkali Tc(VII) and other alkali-Tc-oxides is reviewed, along with the relevant synthesis conditions.
Additionally, we report attempts to make 5- and 6-coordinate Tc(VII) compounds of K, Na, and Li, i.e. those containing the TcO5^3- and TcO6^5- anions. It was found that the higher coordinated species are very sensitive to water and easily decompose into their respective pertechnetates. It was difficult to obtain pure compounds; mixtures of the pertechnetate and other phase(s) were frequently found, as evidenced by x-ray absorption spectroscopy (XAS), neutron diffraction (ND), and Raman spectroscopy. Low-temperature electron paramagnetic resonance (EPR) measurements showed the possibility of Tc(IV) and Tc(VI) in the Na3TcO5 and Na5TcO6 compounds.
It was hypothesized that a smaller counter-cation would result in more stable pertechnetates. To validate the synthesis method, LiReO4 and Li5ReO6 were prepared, and their Raman spectra matched those in the literature. Subsequently, the Tc analogues LiTcO4 and Li5TcO6 were synthesized and characterized by ND, Raman spectroscopy, XANES, and EXAFS. Li5TcO6 was a marginally stable compound that appears to have the same structure as that known for Li5ReO6. Implications of the experimental work for the stability of alkali technetate compounds and their possible role in the volatilization of Tc are discussed.
Large floods bringing significant sediment into the coastal ocean have not previously been observed in Antarctica. We report evidence of a large flood event that deposited over 50 cm of sediment onto the nearshore benthic habitat at Salmon Bay, Antarctica, between 1990 and 2010. Besides direct observations of the sedimentation, the evidence includes a debris flow covering old tyre tracks from the early 1960s, as well as evidence of a considerable amount of sediment transported onto the Salmon Creek delta. We believe that the flood was sourced from the Salmon Glacier and possibly the smaller Blackwelder Glacier. Such floods are likely to become more common in the future, and it is important to better understand their ecological impacts through good monitoring programmes.
We estimate a number of macroeconomic variables as logistic smooth transition autoregressive (LSTAR) processes with uncertainty as the transition variable. The idea is that the effects of increases in uncertainty need not be symmetric with the effects of decreases. Nonlinear estimation allows us to answer several interesting questions left unanswered by a linear model. For a number of important macroeconomic variables, we show that (i) a positive shock to uncertainty has a greater effect than a negative shock and (ii) the effect of an uncertainty shock is highly dependent on the state of the economy. Hence, the usual linear estimates understate the consequences of uncertainty in circumstances such as the recent financial crisis.
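For concreteness, a standard two-regime LSTAR specification consistent with this description, with uncertainty as the transition variable, is shown below; the notation is generic and not necessarily the authors' exact model:

```latex
y_t = \phi_0 + \sum_{i=1}^{p} \phi_i\, y_{t-i}
      + G(u_{t-d};\gamma,c)\Big(\theta_0 + \sum_{i=1}^{p} \theta_i\, y_{t-i}\Big)
      + \varepsilon_t,
\qquad
G(u_{t-d};\gamma,c) = \frac{1}{1 + \exp\{-\gamma\,(u_{t-d}-c)\}},
```

where \(u_{t-d}\) is lagged uncertainty, \(c\) is the threshold, and \(\gamma\) governs how sharply the economy moves between the two regimes; asymmetric responses to positive and negative uncertainty shocks arise because \(G(\cdot)\) weights the two autoregressive regimes differently on either side of \(c\).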
By Paul K. Kleinman, Department of Radiology, Boston Children’s Hospital, and Harvard Medical School, Boston, Massachusetts, USA, and Michele M. Walters, Staff Pediatric Radiologist at Boston Children’s Hospital and Instructor in Radiology at Harvard Medical School, Boston, Massachusetts, USA
Accurate dating of fractures is critical in cases of suspected child abuse (1–8). The ability of medical professionals to assess the veracity of the history provided depends on the clinical and radiologic assessment of the presenting injury or injuries. If, for example, a single injury to a localized portion of an extremity is alleged to have occurred but multiple sites of subperiosteal new bone formation (SPNBF) and/or callus are seen on radiographs, medical providers should become suspicious and initiate further investigation. However, if the severity of the alleged injury correlates with the clinical and radiographic findings and all evidence suggests that the fracture is acute, the suspicion of abuse may never arise. It is clear that the ability of the radiologist and the clinician to assess the age of a bony injury is critical to a determination of suspected child abuse. The forensic requirements for establishing responsibility and determining the need for intervention by child protection agencies rest strongly on the assessments of the clinician and the radiologist regarding the nature and timing of injury. Accurate fracture dating can aid in the identification and exclusion of potential abusive perpetrators. In criminal proceedings, the requirement to assign age estimates to fractures and to determine whether there have been multiple episodes of abuse may have important implications. The presence of prior injuries may influence critical decisions regarding how defendants may be charged, how a prosecution may proceed, the jury verdict, and the penalties imposed on a convicted abuser (Fig. 6.1) (9).