Wesley Hohfeld is known the world over as the legal theorist who famously developed a taxonomy of legal concepts. His contributions to legal thinking have stood the test of time, remaining relevant nearly a century after they were first published. Yet, little systematic attention has been devoted to exploring the full significance of his work. Beginning with a lucid, annotated version of Hohfeld's most important article, this volume is the first to offer a comprehensive look at the scope, significance, reach, intricacies, and shortcomings of Hohfeld's work. Featuring insights from leading legal thinkers, the book also contains many of Hohfeld's previously unseen personal papers, shedding new light on the complex motivations behind Hohfeld's projects. Together, these selected papers and original essays reveal a portrait of a multifaceted and ambitious intellectual who did not live long enough to see the impact of his ideas on the study of law.
Water stress and weed competition are critical stressors during corn (Zea mays L.) development. Genetic improvements in corn have resulted in hybrids with greater tolerance to abiotic and biotic stressors; however, drought stress remains problematic. Therefore, with the expected change in precipitation throughout the Great Lakes Region, greenhouse experiments were conducted to evaluate water stress and weed competition effects on drought-tolerant corn performance. The study followed a completely randomized block design with four replications. Factorial treatment combinations consisted of drought-tolerant corn competition (presence or absence), water stress (100% or 50% volumetric water content [VWC]), and nine corn:common lambsquarters (Chenopodium album L.) (CHEAL) densities. Corn and C. album growth parameters were measured 14 and 21 days after water stress initiation. To address the impact of reduced soil moisture and weed competition on corn and C. album growth parameters, photosynthetic response, and biomass, linear mixed-effects and non-linear regression models were constructed in R. Chenopodium album biomass was reduced by 46 and 50% under corn competition at two and four weeds pot⁻¹ (p = 0.0003, 0.0004). However, introducing crop competition at six and nine weeds pot⁻¹ did not reduce C. album biomass (p = 0.90, 1.00). Averaged across weed pressures, corn biomass was 22% less when grown under 50% compared to 100% VWC (p = 0.0003). However, averaged across VWC, increasing weed competition from zero to two (p = 0.04), four (p < 0.0001), six (p = 0.0002), or nine (p = 0.0002) weeds pot⁻¹ reduced biomass by 22, 38, 35, and 36%, respectively. Overall, water stress and C. album competition negatively affected the parameters measured in this study; however, the magnitude of reduction was greater under drought stress than under increasing weed competition when water was not limiting. Therefore, field crop growers must modify current integrated weed management programs to maintain yield under future climate stress.
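The models described above were fit in R; as a rough, illustrative sketch of that kind of analysis (not the authors' code), an analogous linear mixed-effects model can be specified with statsmodels in Python. The file, column, and factor names below are hypothetical.

```python
# Illustrative sketch only: a linear mixed-effects model analogous to the
# analysis described above (biomass ~ water stress x weed density x crop
# competition, with a random intercept for replicate block).
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data file: one row per pot with biomass, vwc (50 or 100),
# weed_density (0, 2, 4, 6, or 9 weeds per pot), competition, and block.
df = pd.read_csv("corn_greenhouse.csv")

model = smf.mixedlm(
    "biomass ~ C(vwc) * C(weed_density) * C(competition)",  # fixed effects
    data=df,
    groups=df["block"],  # random intercept for replication block
)
result = model.fit()
print(result.summary())
```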
Background: Measuring outcomes that matter to patients is a key component of ensuring patient-centred care. In chronic inflammatory neuropathies (CINs), where immunomodulatory treatments carry risks and high costs, systematic evaluation of disease progression is needed to ensure patients are achieving outcomes that reflect their values and goals. The aim of this project was to evaluate the feasibility of objective outcome measure (OOM) use in the clinical setting. Methods: Prospective data were collected from 27 participants with chronic inflammatory demyelinating polyneuropathy (CIDP) or multifocal motor neuropathy (MMN). Participants completed and provided feedback on patient-reported outcome measures covering quality of life, activity and participation, pain, and fatigue, as well as grip strength, the 9-hole peg test, the 10-meter walk, muscle strength, and sensation. Focus groups were conducted to collect qualitative data. Results: The majority of OOMs were considered relevant by 90% of participants. The top three ranked measures were muscle strength testing, the daily activities questionnaire, and the quality of life questionnaire. Fifty-two percent of participants identified balance and/or detailed gait assessment as an important factor that was not part of the collected OOMs. Conclusions: OOMs allow for appropriate monitoring of patients and optimization of immunotherapy treatment. By tracking longitudinal results that matter to patients, patients can better participate in shared decision-making. Clinicians should adopt OOMs going forward.
Background: Delayed cerebral ischemia (DCI) is a complication of aneurysmal subarachnoid hemorrhage (aSAH) and is associated with significant morbidity and mortality. A paucity of high-quality evidence is available to guide the management of DCI. As such, our objective was to evaluate the practice patterns of Canadian physicians in the management of aSAH and DCI. Methods: The Canadian Neurosurgery Research Collaborative (CNRC) performed a cross-sectional survey of Canadian neurosurgeons, intensivists, and neurologists who manage aSAH. The survey was distributed to members of the Canadian Neurosurgical and Neurocritical Care Societies. Responses were analyzed using quantitative and qualitative methods. Results: The response rate was 129/340 (38%). Respondents agreed on the need for intensive care unit admission, the use of clinical and radiographic monitoring, and prophylaxis for the prevention of DCI. Indications for starting hyperdynamic therapy varied. There was discrepancy in the proportion of patients felt to require intravenous milrinone, intra-arterial vasodilators, or physical angioplasty for the treatment of DCI. Most respondents reported that their facility does not use a standardized definition of DCI. Conclusions: DCI is an important clinical entity for which no consensus exists in management among Canadian practitioners. The CNRC calls for the development of national standards for the diagnosis and management of DCI.
Background: Extracranial traumatic vertebral artery injury (eTVAI) is common following non-penetrating head and neck trauma. Most cases are initially asymptomatic but carry an increased risk of stroke. Consensus is lacking regarding screening, treatment, and follow-up of asymptomatic patients with eTVAI. Our objective was to investigate national practice patterns across these domains. Methods: An electronic survey was distributed via the Canadian Neurological Sciences Federation and the Canadian Spine Society. Two case-based scenarios featured asymptomatic patients with eTVAI. Case 1: non-displaced cervical lateral mass fracture; angiography stratified by degree of luminal diameter reduction. Case 2: complex C2 fracture; angiography featuring pseudoaneurysm dissection. Analysis: descriptive statistics. Results: The response rate was 108 of 182 participants (59%), representing 20 academic institutions.
Case 1: 78% of respondents would screen, most using CTA (97%) and screening immediately (88%). Most respondents (97%) would initiate treatment, typically aspirin (89%) for 3-6 months (46%).
Case 2: 73% of respondents would screen, most using CTA (96%) and screening immediately (88%). The majority of respondents (94%) would initiate treatment, using aspirin (50%) for 3-6 months (35%). Thirty-six percent of respondents would utilize endovascular therapy.
In both cases, the majority of respondents would follow up clinically or radiographically every 1-3 months. Conclusions: This study highlights consensus in Canadian practice patterns for the workup and management of asymptomatic eTVAI.
Background: Sensory ganglionopathy (SG) is a rare form of neuropathy affecting the dorsal root ganglia and leading to non-length-dependent sensory abnormalities. Although balance problems are frequently reported by patients, a comprehensive balance assessment in SG has not been conducted. This study quantifies balance deficits in SG and examines their relation to patient-reported outcome measures (PROMs). Methods: Prospective data were collected from five participants with SG. Balance assessments included the Fullerton Advanced Balance scale, the Berg Balance scale, and the 360-degree turn. Participants completed PROMs assessing balance confidence (ABC scale), pain, fatigue, quality of life (QoL), and daily activity and participation. Assessment also included a neurological exam, nerve conduction studies (NCS), and posturography. Results: All participants had severe SG on NCS, with normal strength and significant sensory abnormalities. Balance scores indicated severe balance deficits in all participants and aligned with posturography and truncal sway measures. PROMs revealed low confidence in balance, high levels of pain and fatigue, difficulties with daily activities, and reduced QoL. Conclusions: Although balance testing is not part of routine clinical practice, PROMs and targeted assessment may help monitor patients with SG and their response to treatment. Larger sample sizes are needed to understand the impact of balance on PROMs and to optimize bedside balance testing.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
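As a quick illustration of the quoted fit (not part of the RNO-G analysis code), the average attenuation length at any frequency in the measured band follows directly from the linear relation; the sketch below evaluates it with naive error propagation that treats the fitted intercept and slope as uncorrelated.

```python
# Evaluate the reported fit <L_alpha> = (1154 +/- 121) - (0.81 +/- 0.14) * (nu / MHz) metres.
# Illustration only; uncertainties are combined as if uncorrelated, ignoring
# any covariance between the fitted intercept and slope.
import math

A, sigma_A = 1154.0, 121.0   # intercept (m)
B, sigma_B = 0.81, 0.14      # slope (m per MHz)

def attenuation_length(nu_mhz: float) -> tuple[float, float]:
    """Return (<L_alpha>, uncertainty) in metres for a frequency in MHz."""
    value = A - B * nu_mhz
    sigma = math.sqrt(sigma_A**2 + (nu_mhz * sigma_B)**2)
    return value, sigma

for nu in (145, 250, 350):
    L, dL = attenuation_length(nu)
    print(f"{nu} MHz: {L:.0f} +/- {dL:.0f} m")
```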
Lower-income older adults with multiple chronic conditions (MCC) are highly vulnerable to food insecurity. However, few studies have considered how health care access is related to food insecurity among older adults with MCC. The aims of this study were to examine associations between MCC and food insecurity, and, among older adults with MCC, between health care access and food insecurity.
Design:
Cross-sectional study using data from the 2019 Behavioral Risk Factor Surveillance System survey.
Setting:
Washington State, USA.
Participants:
Lower-income adults aged 50 years or older (n = 2118). MCC was defined as having ≥ 2 of 11 possible conditions. Health care access comprised three variables: inability to afford seeing a doctor, no health care coverage, and not having a primary care provider (PCP). Food insecurity was defined as buying food that did not last and not having money to get more.
Results:
The overall prevalence of food insecurity was 26.0% and was 1.50 times greater (95% CI 1.16, 1.95) among participants with MCC compared to those without MCC. Among those with MCC (n = 1580), inability to afford seeing a doctor was associated with food insecurity (prevalence ratio (PR) 1.83; 95% CI 1.46, 2.28), whereas not having health insurance (PR 1.49; 95% CI 0.98, 2.24) and not having a PCP (PR 1.10; 95% CI 0.77, 1.57) were not.
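For readers less familiar with the effect measure reported above, a prevalence ratio simply compares the prevalence of the outcome between two groups; in generic notation (not tied to the study's adjusted models),

$$\mathrm{PR} = \frac{\Pr(\text{food insecure} \mid \text{exposed})}{\Pr(\text{food insecure} \mid \text{unexposed})},$$

so PR 1.83 indicates that the prevalence of food insecurity was 83% higher among participants who could not afford to see a doctor than among those who could.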
Conclusions:
Inability to afford healthcare is related to food insecurity among older adults with MCC. Future work should focus on collecting longitudinal data that can clarify the temporal relationship between MCC and food insecurity.
Robert Heizer excavated Leonard Rockshelter (26Pe14) in western Nevada more than 70 years ago. He described stratified cultural deposits spanning the Holocene. He also reported obsidian flakes purportedly associated with late Pleistocene sediments, suggesting that human use extended even farther back in time. Because Heizer never produced a final report, Leonard Rockshelter faded into obscurity despite the possibility that it might contain a Clovis Era or older occupation. That possibility prompted our team of researchers from the University of Nevada, Reno and Desert Research Institute to return to the site in 2018 and 2019. We relocated the excavation block from which Heizer both recovered the flakes and obtained a late Pleistocene date on nearby sediments. We minimally excavated undisturbed deposits to rerecord and redate the strata. As an independent means of evaluating Heizer's findings, we also directly dated 12 organic artifacts housed at the Phoebe A. Hearst Museum of Anthropology. Our work demonstrates that people did not visit Leonard Rockshelter during the late Pleistocene. Rather, they first visited the site immediately following the Younger Dryas (12,900–11,700 cal BP) and sporadically used the shelter, mostly to store gear, throughout the Holocene.
Animal and human data demonstrate independent relationships between fetal growth, hypothalamic-pituitary-adrenal axis (HPA-A) function, and adult cardiometabolic outcomes. While the association between fetal growth and adult cardiometabolic outcomes is well established, the role of the HPA-A in these relationships is unclear. This study aims to determine whether HPA-A function mediates or moderates this relationship. Approximately 2900 pregnant women were recruited between 1989 and 1991 in the Raine Study. Detailed anthropometric data were collected at birth (per cent optimal birthweight [POBW]). The Trier Social Stress Test was administered to the offspring (Generation 2; Gen2) at 18 years, and HPA-A responses were classified as reactive responders (RR), anticipatory responders (AR), or non-responders (NR). Cardiometabolic parameters (BMI, systolic BP [sBP], and LDL cholesterol) were measured at 20 years. Regression modelling demonstrated linear associations between POBW and both BMI and sBP; quadratic associations were observed for LDL cholesterol. For every 10% increase in POBW, there was a 0.54 unit increase in BMI (standard error [SE] 0.15) and a 0.65 unit decrease in sBP (SE 0.34). The interaction between participants' fetal growth and HPA-A phenotype was strongest for sBP in young adulthood. Interactions for BMI and LDL cholesterol were non-significant. Decomposition of the total effect revealed no causal evidence of mediation or moderation.
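As an illustration of the regression-with-interaction approach described above (not the Raine Study analysis code; file and variable names are hypothetical), a model allowing the POBW effect on sBP to vary by HPA-A phenotype, plus a quadratic POBW term for LDL cholesterol, could be specified as follows.

```python
# Illustrative sketch only: linear POBW terms with an interaction for HPA-A
# response phenotype (RR / AR / NR), and a quadratic POBW term for LDL.
# File and variable names are hypothetical, not those of the Raine Study.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("raine_gen2.csv")  # hypothetical file

# Systolic blood pressure at 20 years modelled on fetal growth (POBW)
# and its interaction with HPA-A phenotype.
sbp_model = smf.ols("sbp ~ pobw * C(hpa_phenotype)", data=df).fit()

# LDL cholesterol showed a quadratic association with POBW.
ldl_model = smf.ols("ldl ~ pobw + I(pobw**2)", data=df).fit()

print(sbp_model.summary())
print(ldl_model.summary())
```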
Fluting is a technological and morphological hallmark of some of the most iconic North American Paleoindian stone points. Through decades of detailed artifact analyses and replication experiments, archaeologists have spent considerable effort reconstructing how flute removals were achieved, and they have explored possible explanations of why fluting was such an important aspect of early point technologies. However, the end of fluting has been less thoroughly researched. In southern North America, fluting is recognized as a diagnostic characteristic of Clovis points dating to approximately 13,000 cal yr BP, the earliest widespread use of fluting. One thousand years later, fluting occurs more variably in Dalton and is no longer useful as a diagnostic indicator. How did fluting change, and why did point makers eventually abandon fluting? In this article, we use traditional 2D measurements, geometric morphometric (GM) analysis of 3D models, and 2D GM of flute cross sections to compare Clovis and Dalton point flute and basal morphologies. The significant differences observed show that fluting in Clovis was highly standardized, suggesting that fluting may have functioned to improve projectile durability. Because Dalton points were used increasingly as knives and other types of tools, maximizing projectile functionality became less important. We propose that fluting in Dalton is a vestigial technological trait retained beyond its original functional usefulness.
Bubiyan Island, presently a vast sabkha and salt flat in the westernmost part of the Shatt Al-Arab delta, originated ca. 4000 cal yr BP as prodelta deposits from a paleochannel of the Euphrates River that flowed into a shallow sea. Southeastern Bubiyan Island first surfaced when spits and barrier islands formed on a 1–2 m forebulge caused by heavy sediment load to the northwest; the spits and barriers delineated an incipient shoreline and sheltered a shallow lagoon. Progradation of southeastern Bubiyan Island began when the spits and barriers were gradually stranded as beach ridges during minor sea-level fluctuations and continued marginal uplift. AMS dating of the beach ridges, which are ~1–5 km from the present shoreline, implies that Late Holocene relative sea level fell in three phases: ca. 3700–3400 cal yr BP, ca. 2600–1000 cal yr BP, and ca. 600–500 cal yr BP. Prior to each phase, relative sea level apparently stabilized to near stillstands, allowing spits and barriers to accrete. Torpedo-jar pottery sherds scattered on some of the most prominent beach ridges indicate Sasanian (AD ca. 300–650; 1650–1300 cal yr BP) to early Islamic (AD ca. 650–800; 1300–1150 cal yr BP) periods of human presence, concurrent with the second phase of beach-ridge formation.
Field studies were conducted to determine the effects of synthetic auxin herbicides at simulated exposure rates applied to ‘Covington’ sweetpotato propagation beds on the quality of nonrooted stem cuttings (slips). Treatments included the DGA salt of dicamba, 2,4-D choline plus nonionic surfactant (NIS), or 2,4-D choline plus glyphosate at 1/10, 1/33, or 1/66 of a 1× application rate (560 g ae ha⁻¹ dicamba, 1065 g ae ha⁻¹ 2,4-D choline, 1130 g ae ha⁻¹ glyphosate) applied at 2 or 4 wk after first slip harvest (WASH). Injury to sweetpotato 2 wk after treatment was greatest when herbicides were applied 2 WASH (21%) compared to 4 WASH (16%). More slip injury was caused by 2,4-D choline than by dicamba, and the addition of glyphosate did not increase injury over 2,4-D choline alone. Two wk after the second application, sweetpotato slips were cut 2 cm above the soil surface and transplanted into production fields. In 2019, sweetpotato ground coverage 8 wk after transplanting was reduced 37 and 26% by the 1/10× rates of dicamba and 2,4-D choline plus NIS, respectively. Though dicamba caused less injury to propagation beds than 2,4-D choline with or without glyphosate, after transplanting, slips treated with 1/10× dicamba did not recover as quickly as those treated with 2,4-D choline. In 2020, sweetpotato ground coverage was 90% or greater for all treatments. Dicamba applied 2 WASH decreased marketable sweetpotato storage root yield by 59% compared to the nontreated check, whereas treatments including 2,4-D choline reduced marketable yield 22 to 29%. All herbicides applied at 4 WASH reduced marketable yield 31 to 36%. The addition of glyphosate to 2,4-D choline did not increase sweetpotato yield. Results indicate that caution should be taken when deciding whether to transplant sweetpotato slips suspected to have been exposed to dicamba or 2,4-D choline.
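For reference, the nominal doses implied by the fractional exposure rates follow from simple arithmetic on the 1× rates quoted above; the short sketch below is illustrative only.

```python
# Simple arithmetic: nominal doses (g ae ha^-1) implied by the fractional
# exposure rates of the 1x application rates quoted above.
full_rates = {"dicamba": 560, "2,4-D choline": 1065, "glyphosate": 1130}
fractions = {"1/10": 1 / 10, "1/33": 1 / 33, "1/66": 1 / 66}

for herbicide, rate in full_rates.items():
    doses = {label: round(rate * f, 1) for label, f in fractions.items()}
    print(herbicide, doses)
# e.g. dicamba -> 56.0, 17.0, 8.5 g ae ha^-1
```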
Children born very preterm (VP) are susceptible to a range of cognitive impairments, yet the effects of VP birth on long-term, episodic, and prospective memory remain unclear. This study examined episodic and prospective memory functioning in children born VP compared with their term-born counterparts at 13 years.
Method:
VP (n = 81; born <30 weeks’ gestation) and term (n = 26) groups were aged between 12 and 14 years. Children completed (i) standardized verbal and visuospatial episodic memory tests and (ii) an experimental time- and event-based prospective memory test that included short-term (within assessment session) and long-term (up to 1 week post-session) tasks. Parents completed a questionnaire assessing memory functions in everyday life.
Results:
The VP group performed worse on all measures of verbal and visuospatial episodic memory than the term group. While there were no group differences in event-based or long-term prospective memory, the VP group performed worse than term-born counterparts on time-based and short-term prospective memory tasks. Parents of children born VP reported significantly more everyday memory difficulties than parents of children born at term.
Conclusions:
Children born VP warrant long-term surveillance, as challenges associated with VP birth include memory difficulties at 13 years. This study highlights the need for greater research and clinical attention to childhood functional memory outcomes.
Healthcare facilities are a well-known high-risk environment for transmission of M. tuberculosis, the etiologic agent of tuberculosis (TB) disease. However, the contribution of M. tuberculosis transmission within healthcare facilities to the general TB epidemic is unknown. We estimated the proportion of overall TB transmission in the general population attributable to healthcare facilities.
Methods:
We combined data from a prospective, population-based molecular epidemiologic study with a universal electronic medical record (EMR) covering all healthcare facilities in Botswana to identify biologically plausible transmission events occurring at healthcare facilities. Concurrent visits to the same facility by patients whose M. tuberculosis isolates shared a genotype were considered overlapping events. We then used TB diagnosis and treatment data to categorize overlapping events according to biologically plausible transmission definitions. Finally, we calculated the proportion of overall TB cases in the cohort that could be attributable to healthcare facilities.
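The matching step described above — flagging patients whose isolates share a genotype and whose visits to the same facility coincide — can be sketched as a self-join over the EMR encounter table. This is an illustrative reconstruction with hypothetical file and column names, not the study's actual pipeline.

```python
# Illustrative sketch (hypothetical column names, not the study's pipeline):
# flag pairs of patients whose M. tuberculosis isolates share a genotype and
# who visited the same healthcare facility within a given window of each other.
from datetime import timedelta
import pandas as pd

encounters = pd.read_csv("emr_encounters.csv", parse_dates=["visit_date"])
# expected columns: patient_id, genotype, facility_id, visit_date

window = timedelta(days=0)  # same-day overlap; widen to allow near-concurrent visits

# Self-join on genotype and facility, keeping each distinct patient pair once.
pairs = encounters.merge(
    encounters, on=["genotype", "facility_id"], suffixes=("_a", "_b")
)
pairs = pairs[pairs["patient_id_a"] < pairs["patient_id_b"]]

# Keep pairs whose visits fall within the overlap window.
concurrent = pairs[(pairs["visit_date_a"] - pairs["visit_date_b"]).abs() <= window]

overlapping_events = concurrent[
    ["genotype", "facility_id", "patient_id_a", "patient_id_b"]
].drop_duplicates()
print(len(overlapping_events), "candidate overlapping events")
```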
Results:
In total, 1,881 participants had TB genotypic and EMR data suitable for analysis, resulting in 46,853 clinical encounters at 338 healthcare facilities. We identified 326 unique overlapping events involving 370 individual patients; 91 (5%) had biologic plausibility for transmission occurring at a healthcare facility. A sensitivity analysis estimated that 3%–8% of transmission may be attributable to healthcare facilities.
Conclusions:
Although effective interventions remain critical for reducing individual risk among healthcare workers and patients at healthcare facilities, our findings suggest that developing targeted interventions aimed at community transmission may have a larger impact on reducing TB.