Developing and practicing ethics requires an active and mindful approach that continues from graduate school throughout our careers. Because life in the real world tends to be messy, with gray areas, contradictions, surprises, and rough edges, we must stay alert, distrust quick answers, and keep questioning. Knowing the ethics codes, laws, and professional guidelines is important; however, it is not enough. It is important not to let ethics codes, laws, and standards replace critical thinking, professional judgment, and personal responsibility. Our ability to think creatively and respond ethically to even the most daunting challenges seems mirrored by our shared human ability to rationalize even the most unethical approaches. This chapter discusses the importance of learning to recognize and avoid the classic ethical fallacies. Attention is paid to the importance of knowing our weaknesses, ethical blind spots, and biases, and to ways of addressing these fallibilities in our careers.
Robert Heizer excavated Leonard Rockshelter (26Pe14) in western Nevada more than 70 years ago. He described stratified cultural deposits spanning the Holocene. He also reported obsidian flakes purportedly associated with late Pleistocene sediments, suggesting that human use extended even farther back in time. Because Heizer never produced a final report, Leonard Rockshelter faded into obscurity despite the possibility that it might contain a Clovis-era or older occupation. That possibility prompted our team of researchers from the University of Nevada, Reno, and the Desert Research Institute to return to the site in 2018 and 2019. We relocated the excavation block from which Heizer both recovered the flakes and obtained a late Pleistocene date on nearby sediments. We minimally excavated undisturbed deposits to rerecord and redate the strata. As an independent means of evaluating Heizer's findings, we also directly dated 12 organic artifacts housed at the Phoebe A. Hearst Museum of Anthropology. Our work demonstrates that people did not visit Leonard Rockshelter during the late Pleistocene. Rather, they first visited the site immediately following the Younger Dryas (12,900–11,700 cal BP) and sporadically used the shelter, mostly to store gear, throughout the Holocene.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a "Swiss Cheese" risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1,435 people testing positive, for a positivity rate of 2.28%. A total of 1,670 COVID-19 cases were identified, 235 of them through self-report. The mean number of tests per week was 3,500, with approximately 80 of these positive (~11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, a ratio of roughly 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
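The headline rates above follow directly from the reported counts; a minimal arithmetic check (using only the figures quoted in the abstract, not any official dataset):

```python
# Figures quoted in the abstract above (illustrative check only).
tests = 62_970
positives = 1_435

# Positivity rate: positives as a share of all tests conducted.
positivity = positives / tests
print(f"positivity rate: {positivity:.2%}")  # ~2.28%

# ~80 positives per week works out to roughly 11 per day.
print(round(80 / 7, 1))  # 11.4
```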
Although apps are increasingly being used to support the diagnosis, treatment and management of mental illness, there is no single means through which the costs associated with mental health apps are reimbursed. Furthermore, different apps are amenable to different means of reimbursement, as not all apps generate value in the same way.
Aims
To provide insights into how apps are currently generating value and being reimbursed across the world, with a particular focus on the situation in the USA.
Method
An international team performed secondary research on how apps are being used and on common pathways to remuneration.
Results
The uses of apps today and in the future are reviewed, the nature of the value delivered by apps is summarised and an overview of app reimbursement in the USA and other countries is provided. Recommendations regarding how payments might be made for apps in the future are discussed.
Conclusions
Currently, apps are being reimbursed through channels with other original purposes. There may be a need to develop an app-specific channel for reimbursement which is analogous to the channels used for devices, drugs and laboratory tests.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
Methods
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
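A polygenic risk score of the kind used here is, at its core, a weighted sum of a person's allele dosages, with per-variant weights taken from a discovery GWAS (here, the AUDIT-C/AUDIT-P GWAS in UKB). A minimal sketch with random data (the matrix sizes and weights are illustrative assumptions, not the study's data):

```python
import numpy as np

# Minimal PRS sketch: score_i = sum_j (dosage_ij * beta_j), where
# dosages are genotype counts (0/1/2) and betas are discovery-GWAS
# effect sizes. All values below are randomly generated placeholders.
rng = np.random.default_rng(0)
n_people, n_snps = 5, 8
dosages = rng.integers(0, 3, size=(n_people, n_snps))  # genotype matrix
weights = rng.normal(0.0, 0.05, size=n_snps)           # per-SNP betas

prs = dosages @ weights  # one score per person
print(prs.shape)  # (5,)
```

In practice these scores are then entered as predictors in regression or survival models of the target phenotype, as described in the Methods.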
Results
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, maximum drinks (R² = 0.47–0.68%, p = 2.0 × 10⁻⁸–1.0 × 10⁻¹⁰), and increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10⁻⁸); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R² = 0.96%, p = 4.8 × 10⁻⁶). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R² = 0.27%, p = 5.5 × 10⁻¹¹), while AUDIT-P PRS was more associated with problem drinking (R² = 0.40%, p = 9.0 × 10⁻⁷). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R² = 0.18%, p < 2.0 × 10⁻¹⁶).
Conclusions
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
Aspirin-use disorder is an underreported condition. Identification of the signs and symptoms of aspirin misuse is important in light of prevalent non-prescribed medicine/over-the-counter (NPM/OTC) medication misuse. We discuss here the case of a patient with a history of chronic aspirin misuse who presented to the emergency department with salicylate intoxication and described elation secondary to deliberate aspirin consumption. This case highlights the importance of screening for NPM/OTC medication misuse in at-risk populations.
The late Holocene histories of Walker Lake and the Carson Sink were reconstructed by synthesizing existing data in both basins along with new age constraints from key sites, supplemented with paleohydrologic modeling. The repeated diversions of the Walker River to the Carson Sink and then back to Walker Lake caused Walker Lake–level fluctuations spanning ± 50 m. Low lake levels at about 1000, 750, and 300 cal yr BP are time correlative to the ages of fluvial deposits along the Walker River paleochannel, when flow was directed toward the Carson Sink. The timing and duration of large lakes in the Carson Sink were further refined using moisture-sensitive tree-ring chronologies. The largest lakes required a fourfold to fivefold increase in discharge spanning decades. Addition of Walker River flow to the Carson Sink by itself is inadequate to account for the required discharge. Instead, increases in the runoff coefficient and larger areas of the drainage basin contributing surface runoff may explain the enhanced discharge required to create these large lakes.
A new lake-level curve for Pyramid and Winnemucca lakes, Nevada, is presented that indicates that after the ~15,500 cal yr BP Lake Lahontan high stand (1338 m), lake level fell to an elevation below 1200 m, before rising to 1230 m at the 12,000 cal yr BP Younger Dryas high stand. Lake level then fell to 1155 m by ~10,500 cal yr BP followed by a rise to 1200 m around 8000 cal yr BP. During the mid-Holocene, levels were relatively low (~1155 m) before rising to moderate levels (1190–1195 m) during the Neopluvial period (~4800–3400 cal yr BP). Lake level again plunged to about 1155 m during the late Holocene dry period (~2800–1900 cal yr BP) before rising to about 1190 m by ~1200 cal yr BP. Levels have since fluctuated within the elevation range of about 1170–1182 m except for the last 100 yr of managed river discharge when they dropped to as low as 1153 m. Late Holocene lake-level changes correspond to volume changes between 25 and 55 km3 and surface area changes between 450 and 900 km2. These lake state changes probably encompass the hydrologic variability possible under current climate boundary conditions.
The Pueblo population of Chaco Canyon during the Bonito Phase (AD 800–1130) employed agricultural strategies and water-management systems to enhance food cultivation in this unpredictable environment. Scepticism concerning the timing and effectiveness of this system, however, remains common. Using optically stimulated luminescence dating of sediments and LiDAR imaging, the authors located Bonito Phase canal features at the far west end of the canyon. Additional ED-XRF and strontium isotope (87Sr/86Sr) analyses confirm the diversion of waters from multiple sources during Chaco’s occupation. The extent of this water-management system raises new questions about social organisation and the role of ritual in facilitating responses to environmental unpredictability.
Delusional parasitosis is infrequently seen in hospital-based consultation–liaison psychiatry.
Aims
Although there are many publications on delusional parasitosis, this report reviews a unique case that was diagnosed during a hospital admission and treated over the next 36 months.
Method
Case report and literature review.
Results
This case report describes a 65-year-old man who was diagnosed with delusional parasitosis during a hospital admission for congestive heart failure and acute kidney injury. A longitudinal description of the patient's condition during the hospital stay and in the 36 months following discharge, during which time he was treated by a consultation psychiatrist, is provided.
Conclusions
In discussing the treatment of a challenging presentation, this case demonstrates the opportunity for consultation psychiatrists to initiate care in patients who might not otherwise seek psychiatric services. Patients with somatic delusions represent one group of patients who are unlikely to independently seek psychiatric treatment.
The Wono and Trego Hot Springs (THS) tephras are widespread in the Lahontan basin and have been identified in a variety of sedimentary environments at different elevations. Davis (1983) reported lake level to be at about 1256 m when the THS tephra was deposited, an interpretation questioned by Benson et al. (1997) who interpreted lake level to be ≤1177 m at that time. This is a significant difference in lake size with important implications for interpreting the climate that prevailed at that time. Based on new interpretations of depositional settings of the THS bed at multiple sites, the larger lake size is correct. Additional sites containing the Wono tephra indicate that it was deposited when lake level was at about 1217 m in the western subbasins and at about 1205 m in the Carson Sink. Sedimentary features associated with progressively deeper paleowater depths follow a predictable pattern that is modulated by proximity to sediment sources and local slope. Fine to coarse sands with wave-formed features are commonly associated with relatively shallow water. Silty clay or clay dominates in paleowater depths >25 m, with thin laminae of sand and ostracods at sites located adjacent to or downslope from steep mountain fronts.
Shoreline geomorphology, shoreline stratigraphy, and radiocarbon dates of organic material incorporated in constructional beach ridges record large lakes during the late Pleistocene and late Holocene in the Pyramid Lake subbasin of Lake Lahontan, Nevada, USA. During the late Holocene, a transgression began at or after 3595 ± 35 14C yr B.P. and continued, perhaps in pulses, through 2635 ± 40 14C yr B.P., resulting in a lake as high as 1199 m. During the latest Pleistocene and overlapping with the earliest part of the Younger Dryas interval, a lake stood at approximately 1212 m at 10,820 ± 35 14C yr B.P. and a geomorphically and stratigraphically distinct suite of constructional shorelines associated with this lake can be traced to 1230 m. These two lake highstands correspond to periods of elevated regional wetness in the western Basin and Range that are not clearly represented in existing northern Sierra Nevada climate proxy records.
New dating in the Carson Sink at the termini of the Humboldt and Carson rivers in the Great Basin of the western United States indicates that lakes reached elevations of 1204 and 1198 m between 915 and 652 and between 1519 and 1308 cal yr B.P., respectively. These dates confirm Morrison's original interpretation (Lake Lahontan: Geology of the Southern Carson Desert, Professional Paper 40, U.S. Geol. Survey, 1964) that these shorelines are late Holocene features, rather than late Pleistocene as interpreted by later researchers. Paleohydrologic modeling suggests that discharge into the Carson Sink must have been increased by a factor of about four, and maintained for decades, to account for the 1204-m lake stand. The hydrologic effects of diversions of the Walker River to the Carson Sink were probably not sufficient, by themselves, to account for the late Holocene lake-level rises. The decadal-long period of increased runoff represented by the 1204-m lake is also reflected in other lake records and in tree ring records from the western United States.
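The paleohydrologic logic behind the "factor of about four" can be seen in a toy closed-basin water balance: at steady state, river inflow must balance evaporation times lake surface area, so sustaining a lake with several times the modern area requires proportionally larger discharge. The numbers below are illustrative assumptions, not calibrated values for the Carson Sink:

```python
# Toy steady-state water balance for a closed basin: Q_in = E * A.
# All values are assumed for illustration, not measured.
E = 1.2            # net evaporation rate, m/yr (assumed)
A_modern = 500e6   # modern lake/playa area, m^2 (assumed)
A_high = 2000e6    # area at a highstand like the 1204-m lake, m^2 (assumed)

Q_modern = E * A_modern  # inflow (m^3/yr) sustaining the modern lake
Q_high = E * A_high      # inflow (m^3/yr) sustaining the highstand

print(Q_high / Q_modern)  # 4.0 -> a ~fourfold discharge increase
```

Because the balance is per year, the enlarged discharge must persist for as long as the highstand does, which is why the modeling calls for decades of sustained increase.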
This study evaluates obsidian-hydration dating in postglacial fluvial terraces cut into an outwash plain near West Yellowstone, Montana. Fluvial transport fractures obsidian grains. However, some old hydration rinds may be preserved; thus, a grain may record several fracturing events. The most recent fracturing event at West Yellowstone is recorded in surface sediments from all of the terraces, which were cut in a shorter period of time than the technique can discern. They formed about 19,000 ± 1000 yr ago, using published hydration-rate estimates and a mean rind thickness of 6.34 ± 0.14 μm (1 SE). Alternatively, the application of published hydration-rate constants for the Obsidian Cliff flow with an estimated effective hydration temperature of 1.4°C yields an age of 24,400 ± 1100 yr (1 SE). Thicker rinds record fracturing during Bull Lake glaciation and cooling cracks from the emplacement of several source flows. Much of the observed spread in rind thicknesses (6.34 ± 1.69 μm: 1 SD) is probably the result of chemically induced variations in hydration rate. Terrace ages based on a single rind would range from 13,000 to 39,000 yr (±1 SD). Therefore, it is inappropriate to (1) use a set of hydration-rate constants determined from a single sample to calculate ages for multiple artifacts or geological samples, (2) date an archaeological or geological event on the basis of a single artifact, or (3) generate a chronostratigraphy on the basis of individual dates as a function of depth. Multiple evaluations of source chemistry and hydration rates and multiple rind measurements are required to date fracturing events.
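Obsidian-hydration ages rest on the standard diffusion relation x² = k·t, so age is rind thickness squared divided by the hydration rate; the chemically induced rate variation emphasized above enters the age quadratically in x but linearly in k. A sketch of the age equation, with the rate back-calculated from the abstract's mean rind and ~19,000-yr age (an illustrative value, not a published constant):

```python
# Standard obsidian-hydration age equation: t = x^2 / k,
# with x the rind thickness (μm) and k the hydration rate (μm^2/ka).
def hydration_age_yr(rind_um: float, rate_um2_per_ka: float) -> float:
    return (rind_um ** 2) / rate_um2_per_ka * 1000.0

# Rate implied by the mean rind (6.34 μm) and ~19,000-yr terrace age;
# back-calculated for illustration, not a published constant.
k = 6.34 ** 2 / 19.0  # ~2.12 μm^2/ka

print(round(hydration_age_yr(6.34, k)))  # 19000
```

Because t scales with x², the ±1 SD spread in rind thickness translates into the wide single-rind age range the authors warn about.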
Evidence from shoreline and deep-lake sediments shows that Laguna Cari-Laufquén, located at 41°S in central Argentina, rose and fell repeatedly during the late Quaternary. Our results show that a deep (>38 m above modern lake level) lake persisted from no later than 28 ka to 19 ka, with the deepest lake phase from 27 to 22 ka. No evidence of highstands is found after 19 ka until the lake rose briefly in recent millennia to 12 m above the modern lake, before regressing to present levels. Laguna Cari-Laufquén broadly matches other regional records in showing last glacial maximum (LGM) highstands, but contrasts with subtropical lake records in South America, where the hydrologic maximum occurred during deglaciation (17–10 ka). Our lake record from Cari-Laufquén mimics that of high-latitude records from the Northern Hemisphere. This points to a common cause for lake expansions, likely involving some combination of temperature depression and intensification of storminess in the westerlies belt of both hemispheres during the LGM.
Natalgrass is an invasive species that has become increasingly problematic in natural areas in Florida and other subtropical and tropical regions around the world. Natalgrass is a prolific seed producer, but little information is available regarding its seed biology and ecology. Research was conducted to determine levels of seed dormancy and to examine the effects of light, temperature, pH, water stress, and depth of burial on natalgrass seed germination. In addition, seed persistence under field conditions was examined both on the soil surface and while buried. Seeds appeared to undergo afterripening. Seed germination was not light dependent and occurred from 15 to 35 C, with optimum germination occurring at 20 to 35 C. Germination occurred at pH levels of 6 and 8 and was affected by water stress; no germination was observed at osmotic potentials less than −0.2 MPa. Seeds emerged from depths of at least 5 cm. Under field conditions, germination was reduced after burial; however, burial lengths of 3 to 15 mo did not result in differences in germination levels. Seedling numbers from seed deposits on the soil surface were greatly reduced after 1 mo, and no seedling emergence was observed after 4 mo.
The user-managed inventory (UMI) is an emerging idea for enhancing the current distribution and maintenance system for emergency medical countermeasures (MCMs). It increases current capabilities for the dispensing and distribution of MCMs and enhances local/regional preparedness and resilience. In the UMI, critical MCMs, especially those in routine medical use (“dual utility”) and those that must be administered soon after an incident before outside supplies can arrive, are stored at multiple medical facilities (including medical supply or distribution networks) across the United States. The medical facilities store a sufficient cache to meet part of the surge needs but not so much that the resources expire before they would be used in the normal course of business. In an emergency, these extra supplies can be used locally to treat casualties, including evacuees from incidents in other localities. This system, which is at the interface of local/regional and federal response, provides response capacity before the arrival of supplies from the Strategic National Stockpile (SNS) and thus enhances the local/regional medical responders' ability to provide life-saving MCMs that otherwise would be delayed. The UMI can be more cost-effective than stockpiling by avoiding costs due to drug expiration, disposal of expired stockpiled supplies, and repurchase for replacement.
(Disaster Med Public Health Preparedness. 2012;6:408-414)