Law plays a key role in determining the level of entrepreneurial action in society. Legal rules seek to define property rights, facilitate private ordering, and impose liability for legal wrongs, thereby attempting to establish conditions under which individuals may act. These rules also channel the development of technology, regulate information flows, and determine parameters of competition. Depending on their structure and implementation, legal rules can also discourage individuals from acting. It is thus crucial to determine which legal rules and institutions best enable entrepreneurs, whose core function is to challenge incumbency. This volume assembles legal experts from diverse fields to examine the role of law in facilitating or impeding entrepreneurial action. Contributors explore issues arising in current policy debates, including the incentive effect of legal rules on startup activity; the role of law in promoting or foreclosing market entry; and the effect of entrepreneurial action on legal doctrine.
The Holzman archaeological site, located along Shaw Creek in interior Alaska, contained two mammoth ivory rods, one of them bi-beveled, within a stratigraphically sealed cultural context. Dated to 13,600–13,300 cal BP, these are the earliest known examples of osseous rod technology in the Americas. Beveled ivory, antler, and bone rods and points share technological similarities across Upper Paleolithic Europe, Asia, eastern Beringia, and the Clovis tradition of North America, and they are important tool types for understanding the late Pleistocene dispersal of modern humans. The Holzman finds are comparable to well-known Clovis tradition artifacts from Anzick (Montana), Blackwater Draw (New Mexico), East Wenatchee (Washington), and Sheriden Cave (Ohio). We describe these tools in the broader context of late Pleistocene osseous technology, with implications for the acquisition and use of mammoth ivory in eastern Beringia and beyond.
Throughout the 1690s there were several high-profile parliamentary debates about lowering interest rates from 6 to 4 percent. Locke's involvement in these policy debates is significant: in this period, he circulated at least one important pamphlet on the issue to various Members of Parliament. The purpose of this article is to illuminate the links between Locke's arguments against interest rate reduction and immigration policy. Locke's essay “For a General Naturalization” (1693) employs some of the same pro-naturalization formulations that Josiah Child uses in A New Discourse of Trade (1693), a pamphlet that was ostensibly published in support of the parliamentary proposal for lower interest rates. Even though Locke had a long history of engagement with pro-naturalization arguments, the framework of his essay on naturalization is very likely an extension of his debates with Child about interest rates from 1691/2.
The optimal treatment modality for third nerve palsy (TNP) associated with intracranial aneurysms remains controversial. While treatment varies with the location of the aneurysm, microsurgical clipping has generally been the traditional choice for posterior communicating artery (PComm) aneurysms, with endovascular coiling emerging as a reasonable alternative.
Patients with TNP due to an intracranial aneurysm who subsequently underwent treatment at a mid-sized Canadian neurosurgical center over a 15-year period (2003–2018) were examined.
A total of 616 intracranial aneurysms in 538 patients were treated; the majority underwent endovascular coiling, with only 24 patients treated with surgical clipping. Only 37 patients (6.9%) presented with either a partial or complete TNP and underwent endovascular embolization; of these, 17 presented with a subarachnoid hemorrhage (SAH) secondary to intracranial aneurysm rupture. Aneurysms associated with TNP included PComm (64.9%), terminal internal carotid artery (29.7%), proximal middle cerebral artery (2.7%), and basilar tip (2.7%) aneurysms. In general, patients with ruptured aneurysms had smaller aneurysms, were treated earlier, and had a shorter mean interval to TNP recovery. In the endovascularly treated cohort initially presenting with TNP, seven presented with a complete TNP and the remainder with partial TNPs. TNP resolved completely in 20 patients (55.1%) and partially in 10 patients (27.0%). Neither time to coiling nor SAH at presentation was significantly associated with the recovery status of TNP.
Endovascular coil embolization is a viable treatment modality for patients with intracranial aneurysms presenting with an associated cranial nerve palsy.
To determine the impact of electronic health record (EHR)–based interventions and test restriction on Clostridioides difficile tests (CDTs) and hospital-onset C. difficile infection (HO-CDI).
Quasi-experimental study in 3 hospitals.
A 957-bed academic hospital (hospital A) and 2 academic-affiliated community hospitals of 354 beds (hospital B) and 175 beds (hospital C).
Three EHR-based interventions were sequentially implemented: (1) an alert when ordering a CDT if laxatives had been administered within the prior 24 hours (January 2018); (2) cancellation of CDT orders after 24 hours (October 2018); (3) contextual rule-driven order questions requiring justification when a laxative had been administered or EHR documentation of diarrhea was lacking (July 2019). In February 2019, hospital C implemented a gatekeeper intervention requiring approval for all CDTs after hospital day 3. The impact of the interventions on C. difficile testing and HO-CDI rates was estimated using an interrupted time-series analysis.
C. difficile testing was already declining in the preintervention period (annual change in incidence rate [IR], 0.79; 95% CI, 0.72–0.87) and did not decrease further with the EHR interventions. The laxative alert was temporally associated with a trend reduction in HO-CDI (annual change in IR from baseline, 0.85; 95% CI, 0.75–0.96) at hospitals A and B. The gatekeeper intervention at hospital C was associated with level (IRR, 0.50; 95% CI, 0.42–0.60) and trend reductions in C. difficile testing (annual change in IR, 0.91; 95% CI, 0.85–0.98) and with level (IRR, 0.42; 95% CI, 0.22–0.81) and trend reductions in HO-CDI (annual change in IR, 0.68; 95% CI, 0.50–0.92) relative to the baseline period.
Test restriction was more effective than EHR-based clinical decision support to reduce C. difficile testing in our 3-hospital system.
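As an illustration of the interrupted time-series approach described in this abstract, the sketch below fits a segmented Poisson regression to monthly test counts with a patient-day offset. All data, the month-24 intervention point, and the variable names are invented for illustration; this is a minimal sketch of the general technique, not the study's actual model.

```python
# Segmented (interrupted time-series) Poisson regression sketch for monthly
# C. difficile test counts. All data below are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(48)                       # 4 years of monthly observations
intervention = (months >= 24).astype(int)    # hypothetical go-live at month 24
patient_days = rng.integers(25_000, 30_000, size=48)
rate = 0.004 * 0.97 ** (months / 12) * np.where(intervention, 0.85, 1.0)
tests = rng.poisson(rate * patient_days)

X = pd.DataFrame({
    "const": 1.0,
    "time": months,                          # secular (baseline) trend
    "level": intervention,                   # step change at the intervention
    "trend": intervention * (months - 24),   # post-intervention slope change
})
model = sm.GLM(tests, X, family=sm.families.Poisson(),
               offset=np.log(patient_days)).fit()

# exp(coef) gives incidence rate ratios; multiplying the "time" or "trend"
# coefficient by 12 before exponentiating yields an annual change in IR,
# the quantity reported in the results above.
print(np.exp(model.params))
```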
The National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA) are studying how samples might be brought back to Earth from Mars safely. Backward planetary protection is key in this complex endeavour, as it is required to prevent potential adverse effects of returning materials to Earth's biosphere. As the question of whether life exists on Mars today, or ever did in the past, is still unanswered, the effort to return samples from Mars is expected to be categorized as a ‘Restricted Earth Return’ mission, for which NASA policy requires the containment of any unsterilized material returned to Earth. NASA is investigating several solutions to contain Mars samples and sterilize any uncontained Martian particles. This effort has significant implications for both NASA's scientific mission and the Earth's environment, so special care and vigilance are needed in planning and execution to assure acceptance of safety to Earth's biosphere. To arrive at a sterilization process that is technically acceptable to a wide array of scientific and other stakeholders, NASA informally convened a Sterilization Working Group (SWG) on 30–31 January 2019, 10–11 June 2019 and 19–20 February 2020, composed of experts from industry, academia and government, to assess methods for sterilization and inactivation, to identify future work needed to verify these methods against biological challenges, and to determine their feasibility for implementation on robotic spacecraft in deep space. The goals of the SWG were:
(1) Understand what it means to sterilize and/or inactivate Martian materials and how that understanding can be applied to the Mars Sample Return (MSR) mission.
(2) Assess methods for sterilization and inactivation, and identify future work needed to verify these methods.
(3) Provide an effective plan for communicating with other agencies and the public.
This paper provides a summary of the discussions and conclusions of the SWG over these three workshops. It reflects a consensus position based on qualitative discussion of how agencies might approach the problem of sterilizing Mars material. The SWG reached a consensus that sterilization options can be considered on the basis of biology as we know it, and that sterilization modalities that are effective on terrestrial materials and organisms should be part of the MSR planetary protection strategy. Conclusions pointed to several industry-standard sterilization methods, including heat, chemical treatment, UV radiation and low-heat plasma. Technical trade-offs for each sterilization modality were discussed while simultaneously considering the engineering challenges and limitations of spaceflight. Future work includes more in-depth discussion of the technical trade-offs of each sterilization modality, identifying and testing Earth-analogue challenge organisms and proteinaceous molecules against the chosen modalities, and executing collaborative agreements between NASA and external working group partners to help close data gaps and to establish strong, scientifically grounded sterilization and inactivation standards for MSR.
The role of private health insurance in the Irish health system can be assessed from different angles, and from all angles it appears complex. Despite universal entitlement to public hospital services, private cover – predominantly for hospital services – is purchased by nearly half of the population. This high level of demand has remained buoyant over time in the face of premium increases, adverse economic conditions, reductions in public subsidies and controversy within the market. Also, while private health insurance accounts for less than 15% of total spending on health, it commands a high profile in media and policy discussions and has substantial leverage over how public and private resources are allocated within the health system, particularly in the acute care sector.
Background: Chlorhexidine bathing reduces bacterial skin colonization and prevents infections in specific patient populations. As chlorhexidine use becomes more widespread, concerns about bacterial tolerance to chlorhexidine have increased; however, testing for chlorhexidine minimum inhibitory concentrations (MICs) is challenging. We adapted a broth microdilution (BMD) method to determine whether chlorhexidine MICs changed over time among 4 important healthcare-associated pathogens.

Methods: Antibiotic-resistant bacterial isolates (Staphylococcus aureus from 2005 to 2019 and Escherichia coli, Klebsiella pneumoniae, and Enterobacter cloacae complex from 2011 to 2019) were collected through Emerging Infections Program surveillance in 2 sites (Georgia and Tennessee) or through public health reporting in 1 site (Orange County, California). A convenience sample of isolates was collected from facilities with varying amounts of chlorhexidine use. We performed BMD testing using laboratory-developed panels with chlorhexidine digluconate concentrations ranging from 0.125 to 64 μg/mL. After successfully establishing reproducibility with quality control organisms, 3 laboratories performed MIC testing. For each organism, epidemiological cutoff values (ECVs) were established using ECOFFinder.

Results: Among 538 isolates tested (129 S. aureus, 158 E. coli, 142 K. pneumoniae, and 109 E. cloacae complex), the S. aureus, E. coli, K. pneumoniae, and E. cloacae complex ECVs were 8, 4, 64, and 64 µg/mL, respectively (Table 1). Fourteen isolates had an MIC above the ECV (12 E. coli and 2 E. cloacae complex). The MIC50 of each species is reported over time (Table 2).

Conclusions: Using an adapted BMD method, we found that chlorhexidine MICs did not increase over time among a limited sample of S. aureus, E. coli, K. pneumoniae, and E. cloacae complex isolates. Although these results are reassuring, continued surveillance for elevated chlorhexidine MICs in isolates from patients with well-characterized chlorhexidine exposure is needed as chlorhexidine use increases.
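As a rough illustration of how BMD readouts of the kind described above can be reduced to MICs and compared against ECVs, here is a minimal Python sketch. The growth data, function names, and flagged example are hypothetical; the dilution series and ECVs simply restate the values reported in the abstract. This is not the ECOFFinder procedure, which fits the wild-type MIC distribution rather than applying fixed cutoffs.

```python
# Reading a broth microdilution (BMD) series: the MIC is the lowest tested
# concentration that inhibits visible growth. Data here are hypothetical.
CONCENTRATIONS = [0.125 * 2 ** i for i in range(10)]   # 0.125 to 64 ug/mL

def read_mic(growth_at_conc):
    """growth_at_conc: dict mapping concentration -> True if visible growth."""
    for c in CONCENTRATIONS:
        if not growth_at_conc[c]:
            return c
    return float("inf")   # grew at all tested concentrations (MIC > 64)

def mic50(mics):
    """MIC at or below which at least 50% of isolates fall (lower median)."""
    ordered = sorted(mics)
    return ordered[(len(ordered) - 1) // 2]

# ECVs as reported above; an isolate with an MIC above the ECV would be
# flagged as a candidate non-wild-type isolate for further review.
ecv = {"E. coli": 4, "S. aureus": 8, "K. pneumoniae": 64, "E. cloacae": 64}
example_mic = 8   # hypothetical E. coli isolate inhibited only at 8 ug/mL
print(example_mic > ecv["E. coli"])   # True -> above the ECV
```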
Understanding differences in social-emotional behavior can help identify atypical development. This study examined differences in social-emotional development in children at increased risk of an autism spectrum disorder (ASD) diagnosis (infant siblings of children diagnosed with the disorder). Parents completed the Brief Infant-Toddler Social-Emotional Assessment (BITSEA), and we evaluated its ability to flag children with later-diagnosed ASD in a high-risk (HR) sibling population. Parents of HR (n = 311) and low-risk (LR; no family history of ASD; n = 127) children completed the BITSEA when their children were 18 months old, and all children underwent a diagnostic assessment for ASD at age 3 years. All six subscales of the BITSEA (Problems, Competence, ASD Problems, ASD Competence, Total ASD Score, and Red Flags) distinguished children in the HR group who were diagnosed with ASD (n = 84) from non-ASD-diagnosed children (both HR and LR). One subscale (BITSEA Competence) differentiated between the HR children not diagnosed with ASD and the LR group. The results suggest that tracking early social-emotional development may have implications for all HR children, as they are at increased risk not only of ASD but also of other developmental or mental health conditions.
A set of durum wheat genotypes from New South Wales (NSW; Durum Breeding Australia (DBA) Northern Program), South Australia (SA; DBA Southern Program and Australian Grain Technology), ICARDA (International Center for Agricultural Research in the Dry Areas) and CIMMYT (International Maize and Wheat Improvement Center) was evaluated over 3 years (2012–2014) in field trials containing rainfed and watered blocks in Narrabri, NSW, Australia. Data on yield and other agronomic traits were analysed using a multi-environment trial approach that accommodated the factorial treatment structure (genotype by irrigation regime) within individual trials. Considerable variation was observed in the durum germplasm for productivity and grain quality traits. DBA Bindaroi (NSW) and 101042 (ICARDA) were the top yielders in watered and rainfed blocks, respectively. Yield was positively and strongly related to both harvest index and grains/m2, but grains/m2 was negatively related to thousand grain weight (TGW) and positively related to screenings. TGW and screenings were strongly negatively related, and TGW and grains/m2 showed a weak positive relationship. Promising genotypes were identified, with traits superior to both the bread wheat check, EGA Gregory, and the durum check, Caparoi. Overall, lines from SA and ICARDA were superior for yield, but those from NSW were superior for quality parameters including TGW and screenings. These results suggested the possibility of developing high yielding, high-quality durum varieties by crossing NSW lines with SA, CIMMYT and ICARDA lines through simultaneous selection for yield, TGW and low screenings. The results also suggested that productivity in rainfed conditions was positively related to productivity under watering, but further research is required to establish this.
This chapter offers a new explanation for mandatory fiduciary protections in certain business relationships: the preservation of trust that might otherwise be eroded through the bargaining process. Any contract a hypothetical entrepreneur and an investor might enter would inevitably be incomplete and give rise to potential opportunistic behavior. While the parties could draft a more detailed agreement prohibiting various forms of opportunism, the very act of bargaining over these protections could undermine whatever trust existed between the parties at the outset of their relationship. By contrast, state-imposed fiduciary obligations limit opportunism without requiring either party to the agreement to invoke distrust. Fiduciary protections, however, do not provide a perfect solution in all business relationships. Although fiduciary duties can usefully constrain opportunism and preserve trust in vertical business relationships, such as a simple principal-agent arrangement, other situations involve complexities that pose challenges for fiduciary law. We illustrate this observation with examples of various horizontal conflicts, or diverging interests, in the venture capital-backed startup context. To the extent that contract and fiduciary law are each incomplete, a residual domain remains for trust and other mechanisms for risk reduction or self-help.
Drive-through clinics (DTCs) are a novel type of point of dispensing where participants drive to a designated location and receive prophylaxis while remaining inside their vehicle. The objective of this review was to identify effective practices and recommendations for implementing DTCs for mass prophylaxis dispensing during emergency events.
A systematic review was conducted for articles covering DTCs published between 1990 and 2019. To be included, articles had to be peer-reviewed, written in English, and address DTCs in sufficient depth. Effective practices and recommendations identified in the literature were presented by theme.
A total of 13 articles met inclusion criteria. The themes identified were (1) optimal DTC design and planning via decision support systems and decision support tools; (2) clinic layouts, locations, and design aspects; (3) staffing, training, and DTC communication; (4) throughput time; (5) community outreach methods; (6) DTC equipment; (7) infection prevention and personal protective equipment; and (8) adverse events prevention and traffic management.
DTCs are an essential component of emergency preparedness and must be optimally designed and implemented to successfully dispense mass prophylaxis to a community within 48 hours. The effective practices and recommendations presented can be used for the development, implementation, and improvement of DTCs for their target populations.
Lifeboats are essential life-saving equipment for all types of water-going vessels and offshore platforms. Lifeboat simulators have been created specifically for offshore personnel to practice in conditions that are normally too risky for live training. As simulation training is a relatively new alternative, there is a need to assess how training performed with a simulator compares with conventional training. This study was performed to evaluate how skills acquired with different training approaches transferred to an emergency scenario. Over a period of one year, participants received quarterly training in one of three ways: live-boat training, computer-based training, or simulator training. Following training, participants were evaluated on their ability to launch and manoeuvre a lifeboat in a plausible emergency. The study results suggest a benefit to training with realistic lifeboat controls and to practicing with representative emergency scenarios. Insights are provided on how training can be modified to increase competence.
Weed management is a major challenge in organic crop production, and organic farms generally harbor larger weed populations and more diverse communities compared with conventional farms. However, little research has been conducted on the effects of different organic management practices on weed communities and crop yields. In 2014 and 2015, we measured weed community structure and soybean [Glycine max (L.) Merr.] yield in a long-term experiment that compared four organic cropping systems that differed in nutrient inputs, tillage, and weed management intensity: (1) high fertility (HF), (2) low fertility (LF), (3) enhanced weed management (EWM), and (4) reduced tillage (RT). In addition, we created weed-free subplots within each system to assess the impact of weeds on soybean yield. Weed density was greater in the LF and RT systems compared with the EWM system, but weed biomass did not differ among systems. Weed species richness was greater in the RT system compared with the EWM system, and weed community composition differed between RT and other systems. Our results show that differences in weed community structure were primarily related to differences in tillage intensity, rather than nutrient inputs. Soybean yield was lower in the EWM system compared with the HF and RT systems. When averaged across all four cropping systems and both years, soybean yield in weed-free subplots was 10% greater than soybean yield in the ambient weed subplots that received standard management practices for the systems in which they were located. Although weed competition limited soybean yield across all systems, the EWM system, which had the lowest weed density, also had the lowest soybean yield. Future research should aim to overcome such trade-offs between weed control and yield potential, while conserving weed species richness and the ecosystem services associated with increased weed diversity.
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) queries about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of the PHQ-8 and PHQ-9 with respect to total score correlations and diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
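The study pooled study-level accuracy estimates with bivariate random-effects models; the sketch below shows only the underlying per-study step, computing sensitivity and specificity of PHQ-8 and PHQ-9 totals at each cutoff against a reference-standard diagnosis. All data are simulated and the prevalence and score distributions are invented for illustration.

```python
# Sensitivity/specificity of PHQ-8 vs PHQ-9 totals across cutoffs.
# All item scores and diagnoses below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
major_depression = rng.random(n) < 0.125          # reference-standard diagnosis
items = rng.integers(0, 4, size=(n, 9))           # item scores 0-3
items[major_depression] += rng.integers(0, 2, size=(major_depression.sum(), 9))
items = np.clip(items, 0, 3)
phq9 = items.sum(axis=1)
phq8 = items[:, :8].sum(axis=1)                   # PHQ-8 omits item 9

def sens_spec(score, cutoff, truth):
    positive = score >= cutoff
    sens = (positive & truth).sum() / truth.sum()
    spec = (~positive & ~truth).sum() / (~truth).sum()
    return sens, spec

# Scan cutoffs around the standard cutoff of 10 used above.
for cutoff in range(5, 16):
    print(cutoff, sens_spec(phq9, cutoff, major_depression),
          sens_spec(phq8, cutoff, major_depression))
```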
We sought to retrospectively report outcomes of post-operative stereotactic radiosurgery (SRS)/stereotactic radiotherapy (SRT) used in place of whole-brain radiation therapy (WBRT) following resection of brain metastases in our hospital-based community practice.
Materials and Methods:
A retrospective review of 23 patients who underwent post-operative SRS at our single institution from 2013 to 2017 was undertaken. Patient records, treatment plans and diagnostic images were reviewed. Local failure, distant intracranial failure and overall survival were studied. Categorical variables were analyzed using Fisher’s exact tests. Continuous variables were analyzed using Mann–Whitney tests. The Kaplan–Meier method was used to estimate survival times.
Sixteen patients (70%) received single-fraction SRS, whereas the remaining 7 patients received a five-fraction treatment course. The median single-fraction dose was 16 Gy (range, 16–18 Gy). The median total dose for fractionated treatments was 25 Gy (range, 25–35 Gy). Overall survival at 6 and 12 months was 95% and 67%, respectively. Comparison of SRS versus SRT local control rates at 6 and 12 months revealed control rates of 92% and 78% versus 29% and 14%, respectively. Every patient with dural/pial involvement at the time of surgery had distant intracranial failure at the 12-month follow-up.
Single-fraction frameless SRS proved to be an effective modality with excellent local control rates. However, the five-fraction SRT course was associated with an increased rate of local recurrence. Dural/pial involvement may portend a high risk for distant intracranial disease; therefore, it may be prudent to consider alternative approaches in these cases.
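The Methods above name the Kaplan-Meier method for survival estimation; the sketch below shows that step with the lifelines library. The follow-up times and event indicators are hypothetical, not the study's data.

```python
# Kaplan-Meier survival estimation sketch using lifelines.
# Follow-up times (months) and event indicators below are invented.
from lifelines import KaplanMeierFitter

months = [2, 4, 6, 6, 7, 9, 11, 12, 12, 14, 15, 18, 20, 24]
observed = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]  # 1 = death, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(months, event_observed=observed, label="post-operative SRS/SRT")

# Estimated survival probabilities at 6 and 12 months, the time points
# reported in the results above.
print(kmf.predict([6, 12]))
```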
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of the clinical and environmental isolates suspected in those clusters. Recent advances in genomic sequencing and cloud computing now allow rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both for the identification of cross-transmission events and for the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise to reduce healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
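The abstract above describes single-nucleotide variant (SNV) analysis for cluster identification. One common way to turn pairwise SNV distances into candidate transmission clusters is single-linkage clustering under an SNV threshold; the sketch below illustrates that idea. The distance matrix, isolate names, and the 20-SNV threshold are hypothetical, not the study's pipeline.

```python
# Candidate transmission clusters from pairwise SNV distances via
# single-linkage clustering. All values below are hypothetical.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

isolates = ["A1", "A2", "A3", "B1", "B2"]
snv = np.array([                # symmetric pairwise SNV distance matrix
    [0,   5,   8,   400, 410],
    [5,   0,   6,   395, 405],
    [8,   6,   0,   402, 412],
    [400, 395, 402, 0,   3],
    [410, 405, 412, 3,   0],
])

# squareform condenses the matrix; single linkage merges closest pairs first.
tree = linkage(squareform(snv), method="single")
labels = fcluster(tree, t=20, criterion="distance")   # cut at <= 20 SNVs

for name, cluster in zip(isolates, labels):
    print(name, "-> cluster", cluster)
# A1-A3 group together and B1-B2 form a second group, flagging two candidate
# transmission clusters for epidemiologic review.
```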
Syndromic surveillance is a form of surveillance that generates information for public health action by collecting, analysing and interpreting routine health-related data on symptoms and clinical signs reported by patients and clinicians, rather than relying on microbiologically or clinically confirmed cases. In England, a suite of national real-time syndromic surveillance systems (SSS) has been developed over the last 20 years, utilising data from a variety of health care settings (a telehealth triage system, general practice and emergency departments). The real-time systems in England have been used for early detection (e.g. of seasonal influenza), for situational awareness (e.g. describing the size and demographics of the impact of a heatwave) and for reassurance that mass gatherings have had no impact on population health (e.g. the London 2012 Olympic and Paralympic Games).

We highlight the lessons learnt from running SSS for nearly two decades, and propose questions and issues still to be addressed. We feel that syndromic surveillance is an example of the use of ‘big data’, but contend that the focus for sustainable and useful systems should be on the added value of such systems and the importance of people working together to maximise the public health value of syndromic surveillance services.