Origami, the art of paper folding, has a rich mathematical theory. Early investigations go back to at least the 1930s, but the twenty-first century has seen a remarkable blossoming of the mathematics of folding. Besides its use in describing origami and designing new models, it is also finding real-world applications from building nano-scale robots to deploying large solar arrays in space. Written by a world expert on the subject, Origametry is the first complete reference on the mathematics of origami. It brings together historical results, modern developments, and future directions into a cohesive whole. Over 180 figures illustrate the constructions described while numerous 'diversions' provide jumping-off points for readers to deepen their understanding. This book is an essential reference for researchers of origami mathematics and its applications in physics, engineering, and design. Educators, students, and enthusiasts will also find much to enjoy in this fascinating account of the mathematics of folding.
Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and non-drug addictive behaviors attributed to a dopamine deficiency (“hypodopaminergia”). There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a treatment system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate treatment of RDS via precision, genetically guided KB220 variants, called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in the future DSM-6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
We implemented universal SARS-CoV-2 testing of patients undergoing surgical procedures as a means to conserve personal protective equipment (PPE). The rate of asymptomatic SARS-CoV-2 infection was <0.5%, suggesting that early local public health interventions were successful. While our protocol was resource-intensive, it prevented exposures to healthcare team members.
The Khao Wong Prachan Valley of central Thailand is one of four known prehistoric loci of copper mining, smelting and casting in Southeast Asia. Many radiocarbon determinations from bronze-consumption sites in north-east Thailand date the earliest copper-base metallurgy there in the late second millennium BC. By applying kernel density estimation analysis to approximately 100 new AMS radiocarbon dates, the authors conclude that the valley's first Neolithic millet farmers had settled there by c. 2000 BC, and initial copper mining and rudimentary smelting began in the late second millennium BC. This overlaps with the established dates for Southeast Asian metal-consumption sites, and provides an important new insight into the development of metallurgy in central Thailand and beyond.
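The kernel density estimation approach described above can be sketched in a few lines of Python. The dates and bandwidth below are purely illustrative stand-ins, not the study's data; real analyses would first calibrate each AMS determination against a calibration curve:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical calibrated radiocarbon dates (years BC) -- illustrative only
dates_bc = np.array([2050, 1980, 1900, 1450, 1380, 1320, 1250, 1200, 1150, 1100])

# KDE pools the individual dates into one continuous density,
# smoothing over the scatter of single determinations
kde = gaussian_kde(dates_bc, bw_method=0.3)

grid = np.linspace(2200, 1000, 500)
density = kde(grid)

# The density peak indicates the period of most intense dated activity
peak_bc = grid[np.argmax(density)]
print(f"Activity peaks around {peak_bc:.0f} BC")
```

The attraction of KDE for chronology building is that it summarizes many noisy dates without imposing phase boundaries in advance.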
In prehistoric coastal and western-central Thailand, rice was the dominant cultivar. In eastern-central Thailand, however, the first known farmers cultivated millet. Using one of the largest collections of archaeobotanical material in Southeast Asia, this article examines how cropping systems were adapted as domesticates were introduced into eastern-central Thailand. The authors argue that millet reached the region first, to be progressively replaced by rice, possibly due to climatic pressures. But despite the increasing importance of rice, dryland, rain-fed cultivation persisted throughout ancient central Thailand, a result that contributes to refining understanding of the development of farming in Southeast Asia.
For many of us who have studied, researched, written, and taught about the influenza pandemic of 1918–19, the current period of the global viral pandemic is eerily and unpleasantly familiar. Today, the rapid global spread of a virus has prompted policies calling for widespread closures, social distancing, constant handwashing, and public mask wearing, in addition to other non-pharmaceutical interventions (NPIs). We have also seen pushback and resistance to these directives as well as substantial mismanagement of resources and a flood of misinformation. Much health policy has been inconsistently set at the local rather than federal level. These responses to our current pandemic closely mirror those to the pandemic 102 years ago.
During the Early Neolithic in the Near East, particularly from the mid ninth millennium cal BC onwards, human iconography became more widespread. Explanations for this development, however, remain elusive. This article presents a unique assemblage of flint artefacts from the Middle Pre-Pottery Neolithic B (eighth millennium BC) site of Kharaysin in Jordan. Contextual, morphological, statistical and use-wear analyses of these artefacts suggest that they are not tools but rather human figurines. Their close association with burial contexts suggests that they were manufactured and discarded during mortuary rituals and remembrance ceremonies that included the extraction, manipulation and redeposition of human remains.
Catatonia is a psychomotor dysregulation syndrome of diverse aetiology, increasingly recognised as a prominent feature of N-methyl-d-aspartate receptor antibody encephalitis (NMDARE) in adults. No study to date has systematically assessed the prevalence and symptomatology of catatonia in children with NMDARE. We analysed 57 paediatric patients with NMDARE from the literature using the Bush-Francis Catatonia Rating Scale. Catatonia was common (occurring in 86% of patients), manifesting as complex clusters of positive and negative features within individual patients. It was both underrecognised and undertreated. Immunotherapy was the only effective intervention, highlighting the importance of prompt recognition and treatment of the underlying cause of catatonia.
Prescribing metrics, cost, and surrogate markers are often used to describe the value of antimicrobial stewardship (AMS) programs. However, process measures are only indirectly related to clinical outcomes and may not represent the total effect of an intervention. We determined the global impact of a multifaceted AMS initiative for hospitalized adults with common infections.
Single center, quasi-experimental study.
Hospitalized adults with urinary, skin, and respiratory tract infections discharged from family medicine and internal medicine wards before (January 2017–June 2017) and after (January 2018–June 2018) an AMS initiative on a family medicine ward were included. A series of AMS-focused initiatives comprised the development and dissemination of: handheld prescribing tools, AMS positive feedback cases, and academic modules. We compared the effect on an ordinal end point consisting of clinical resolution, adverse drug events, and antimicrobial optimization between the preintervention and postintervention periods.
In total, 256 subjects were included before and after an AMS intervention. Excessive durations of therapy were reduced from 40.3% to 22% (P < .001). Patients without an optimized antimicrobial course were more likely to experience clinical failure (OR, 2.35; 95% CI, 1.17–4.72). The likelihood of a better global outcome was greater in the family medicine intervention arm (62.0%, 95% CI, 59.6–67.1) than in the preintervention family medicine arm.
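An odds ratio with a Wald 95% confidence interval, of the kind reported above, can be reproduced from a 2×2 table. The counts below are hypothetical, chosen only to illustrate the arithmetic; they are not the study's data:

```python
import math

# Hypothetical 2x2 table (illustrative counts, not the study's data):
#                       clinical failure    no failure
# course not optimized        a = 20           b = 60
# course optimized            c = 15           d = 105
a, b, c, d = 20, 60, 15, 105

# Odds ratio: odds of failure without optimization vs. with it
odds_ratio = (a * d) / (b * c)

# Wald 95% CI, computed on the log-odds scale
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```

Because the interval is symmetric on the log scale, it is asymmetric around the odds ratio itself, which is why published CIs like 1.17–4.72 are not centered on the point estimate.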
Collaborative, targeted feedback with prescribing metrics, AMS cases, and education improved global outcomes for hospitalized adults on a family medicine ward.
In response to advancing clinical practice guidelines regarding concussion management, service members, like athletes, complete a baseline assessment prior to participating in high-risk activities. While several studies have established test stability in athletes, no investigation to date has examined the stability of baseline assessment scores in military cadets. The objective of this study was to assess the test–retest reliability of a baseline concussion test battery in cadets at U.S. Service Academies.
All cadets participating in the Concussion Assessment, Research, and Education (CARE) Consortium investigation completed a standard baseline battery that included memory, balance, symptom, and neurocognitive assessments. Annual baseline testing was completed during the first 3 years of the study. A two-way mixed-model analysis of variance (intraclass correlation coefficient, ICC(3,1)) and Kappa statistics were used to assess the stability of the metrics at 1-year and 2-year time intervals.
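The ICC(3,1) statistic used above comes from a two-way mixed-model ANOVA and can be sketched directly from its mean squares. The scores below are hypothetical, for illustration only:

```python
import numpy as np

def icc_3_1(scores):
    """ICC(3,1): two-way mixed model, consistency, single measurement.

    scores: array of shape (n_subjects, k_sessions).
    """
    n, k = scores.shape
    grand = scores.mean()
    # Two-way ANOVA sum-of-squares decomposition
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()   # sessions
    ss_total = ((scores - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)                                # between-subjects MS
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical baseline scores for 5 cadets at two annual sessions
scores = np.array([[9, 10], [6, 7], [8, 8], [4, 6], [7, 9]])
print(round(icc_3_1(scores), 2))
```

Values near 1 indicate that a cadet's relative standing is preserved across sessions; the 0.15–0.67 range reported above therefore signals substantial year-to-year drift.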
ICC values for the 1-year test interval ranged from 0.28 to 0.67 and from 0.15 to 0.57 for the 2-year interval. Kappa values ranged from 0.16 to 0.21 for the 1-year interval and from 0.29 to 0.31 for the 2-year test interval. Across all measures, the observed effects were small, ranging from 0.01 to 0.44.
This investigation noted less than optimal reliability for the most common concussion baseline assessments. While none of the assessments met or exceeded the accepted clinical threshold, the effect sizes were relatively small suggesting an overlap in performance from year-to-year. As such, baseline assessments beyond the initial evaluation in cadets are not essential but could aid concussion diagnosis.
Obtaining objective, dietary exposure information from individuals is challenging because of the complexity of food consumption patterns and the limitations of self-reporting tools (e.g., FFQ and diet diaries). This hinders research efforts to associate intakes of specific foods or eating patterns with population health outcomes.
Dietary exposure can be assessed by the measurement of food-derived chemicals in urine samples. We aimed to develop methodologies for urine collection that minimised impact on the day-to-day activities of participants but also yielded samples that were data-rich in terms of targeted biomarker measurements.
Urine collection methodologies were developed within home settings.
Different cohorts of free-living volunteers.
Home collection of urine samples using vacuum transfer technology was deemed highly acceptable by volunteers. Statistical analysis of both metabolome and selected dietary exposure biomarkers in spot urine collected and stored using this method showed that they were compositionally similar to urine collected using a standard method with immediate sample freezing. Even without chemical preservatives, samples can be stored under different temperature regimes without any significant impact on the overall urine composition or concentration of forty-six exemplar dietary exposure biomarkers. Importantly, the samples could be posted directly to analytical facilities, without the need for refrigerated transport and involvement of clinical professionals.
This urine sampling methodology appears to be suitable for routine use and may provide a scalable, cost-effective means to collect urine samples and to assess diet in epidemiological studies.
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, all these treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury in the combination is attributable mostly to 2,4-D. Crop injury increased quadratically, and crop yield (for most yield grades) decreased quadratically, with increasing rate of 2,4-D applied alone or in combination with glyphosate at storage root development. With a few exceptions, however, neither this rate response nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also concern sweetpotato producers. In some cases, yield reduction of U.S. no. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
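A quadratic rate–response relationship of the kind described above can be sketched with an ordinary least-squares polynomial fit. The injury percentages below are hypothetical placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical dose-response data (illustrative only):
# fraction of anticipated field use rate vs. % visual crop injury
rate = np.array([1/1000, 1/750, 1/500, 1/250, 1/100, 1/10])
injury = np.array([2.0, 3.0, 5.0, 9.0, 20.0, 55.0])

# Least-squares quadratic fit: injury = b2*rate^2 + b1*rate + b0
b2, b1, b0 = np.polyfit(rate, injury, 2)

# Fitted values at the applied rates
fitted = np.polyval([b2, b1, b0], rate)
print(fitted.round(1))
```

Because the applied rates span two orders of magnitude, the highest rate dominates such a fit; agronomic analyses often model the log of the rate instead for this reason.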
OBJECTIVES/GOALS: This study’s goal is to examine the feasibility and acceptability of using VRM to impact the APP of adults in the inpatient setting. Aims include examining the: 1) feasibility of VRM for APP management; 2) acceptability of using VRM for APP management; and 3) experience of VRM for APP management. METHODS/STUDY POPULATION: To comprehensively examine participants’ experience of using VRM for APP, this study will employ a convergent mixed-methods design in which living kidney donors (N = 45) will be recruited to serially use VRM during their hospital stay. Feasibility and acceptability will be evaluated using descriptive and inferential statistics evaluating patient-reported outcome (PRO) measures taken pre-, post- and 1-hour post-VRM, PRO measures extracted from the participant’s electronic health record and data on VRM use. Semi-structured interviews will allow formulation of inferences based on participants’ experience of VRM for APP management and their insights on content, deployment, and clinical use of VRM. RESULTS/ANTICIPATED RESULTS: This in-process study expects: 1) an adequate sample of participants undergoing living kidney donor surgery who agree to enroll with retention of >90% of participants (Aim 1); 2) participants to report VRM as an acceptable and suitable treatment, feel “present” and interested in the VR environment, and feel comfortable using VRM in the hospital (Aim 2); and 3) to provide insight into participants’ experience of VRM for APP, understanding of extended VRM use for APP analgesia, examination of key variables affecting participants’ experience of VRM for APP and feedback about VRM procedures and protocol to inform future VRM use for APP management (Aim 3). DISCUSSION/SIGNIFICANCE OF IMPACT: Results of the proposed study will inform future clinical testing and deployment of VRM, guide future use of VRM as an adjunct for inpatient APP management, and provide insight into inpatients’ experience of VRM for APP analgesia.
Seismic-reflection surveys of the Isle Royale sub-basin, central Lake Superior, reveal two large end moraines and associated glacial sediments deposited during the last cycle of the Laurentide Ice Sheet in the basin. The Isle Royale moraines directly overlie bedrock and are cored with dense, acoustically massive till intercalated down-ice with acoustically stratified outwash. Till and outwash are overlain by glacial varves, a lower red unit and an upper gray unit.
The maximum extent of late Younger Dryas-age readvance into the western Lake Superior basin is uncertain, but it was probably controlled by both ice dynamics and climate. Our data indicate that during retreat from the maximum, the ice paused just long enough to construct the outer of the two moraines, >100 m high, and then retreated to the inner moraine, during which time most of the lower glacial-lacustrine sequence (red varves) was deposited. Retreat from the inner moraine coincided with a marked flux of icebergs at the calving margin and a change to gray varves. Rapid retreat may be related to both an influx of meltwater from Glacial Lake Agassiz about 10,500 cal yr BP and retreat of the calving margin down an adverse slope into the Isle Royale sub-basin.
Metabolites are small molecules involved in cellular metabolism where they act as reaction substrates or products. The term ‘metabolomics’ refers to the comprehensive study of these molecules. The concentrations of metabolites in biological tissues are under genetic control, but this control is modulated by environmental factors such as diet. In adult mono- and dizygotic twin pairs, we estimated the contribution of genetic and shared environmental influences on metabolite levels by structural equation modeling and tested whether the familial resemblance for metabolite levels is mainly explained by genetic or by environmental factors that are shared by family members. Metabolites were measured across three platforms: two based on proton nuclear magnetic resonance techniques and one employing mass spectrometry. These three platforms comprised 237 single metabolic traits of several chemical classes. For the three platforms, metabolites were assessed in 1407, 1037 and 1116 twin pairs, respectively. We carried out power calculations to establish what percentage of shared environmental variance could be detected given these sample sizes. Our study did not find evidence for a systematic contribution of shared environment, defined as the influence of growing up together in the same household, on metabolites assessed in adulthood. Significant heritability was observed for nearly all 237 metabolites; a significant contribution of the shared environment was limited to 6 metabolites. The top quartile of the heritability distribution was populated by 5 of the 11 investigated chemical classes. In this quartile, metabolites of the class lipoprotein were significantly overrepresented, whereas metabolites of the classes glycerophospholipids and glycerolipids were significantly underrepresented.
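The logic of twin-based variance decomposition can be illustrated with Falconer's formulas, a simple approximation to the full structural equation modeling used above. The twin-pair correlations below are hypothetical, for illustration only:

```python
# Hypothetical twin-pair correlations for one metabolite (illustrative)
r_mz = 0.60   # monozygotic twins share ~100% of segregating genes
r_dz = 0.32   # dizygotic twins share ~50% on average

# Falconer's approximation to the ACE variance decomposition
h2 = 2 * (r_mz - r_dz)   # A: additive genetic variance (heritability)
c2 = 2 * r_dz - r_mz     # C: shared (household) environment
e2 = 1 - r_mz            # E: unique environment plus measurement error

print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")
```

The intuition matches the study's design: if MZ pairs resemble each other much more than DZ pairs, genes dominate; if DZ correlations approach MZ correlations, the shared household environment (C) must be doing the work.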
Motivated by the occurrence of a moderately nearby supernova near the beginning of the Pleistocene, possibly as part of a long-term series beginning in the Miocene, we investigated whether nitrate rainout resulting from the atmospheric ionization of enhanced cosmic ray flux could have, through its fertilizer effect, initiated carbon dioxide drawdown. Such a drawdown could possibly reduce the greenhouse effect and induce the climate change that led to the Pleistocene glaciations. We estimate that the nitrogen flux enhancement onto the surface from an event at 50 pc would be of order 10%, probably too small for dramatic changes. We estimate deposition of iron (another potential fertilizer) and find it is also too small to be significant. There are also competing effects of opposite sign, including muon irradiation and reduction in photosynthetic yield caused by UV increase from stratospheric ozone layer depletion, leading to an ambiguous result. However, if the atmospheric ionization induces a large increase in the frequency of lightning, as argued elsewhere, the amount of nitrate synthesis should be much larger, dominating the other effects and inducing the climate change. More work needs to be done to clarify the effects on lightning frequency.