We have previously shown that higher intake of cruciferous vegetables is inversely associated with carotid artery intima-media thickness. To further test the hypothesis that an increased consumption of cruciferous vegetables is associated with reduced indicators of structural vascular disease in other areas of the vascular tree, we aimed to investigate the cross-sectional association between cruciferous vegetable intake and extensive calcification in the abdominal aorta. Dietary intake was assessed, using an FFQ, in 684 older women from the Calcium Intake Fracture Outcome Study. Cruciferous vegetables included cabbage, Brussels sprouts, cauliflower and broccoli. Abdominal aortic calcification (AAC) was scored using the Kauppila AAC24 scale on dual-energy X-ray absorptiometry lateral spine images and was categorised as ‘not extensive’ (0–5) or ‘extensive’ (≥6). Mean age was 74·9 (sd 2·6) years, median cruciferous vegetable intake was 28·2 (interquartile range 15·0–44·7) g/d and 128/684 (18·7 %) women had extensive AAC scores. Women with higher intakes of cruciferous vegetables (>44·6 g/d) had 46 % lower odds of extensive AAC than those with lower intakes (<15·0 g/d) after adjustment for lifestyle, dietary and CVD risk factors (OR Q4 v. Q1 0·54, 95 % CI 0·30, 0·97, P = 0·036). Total vegetable intake and each of the other vegetable types were not related to extensive AAC (P > 0·05 for all). This study strengthens the hypothesis that higher intake of cruciferous vegetables may protect against vascular calcification.
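The adjusted odds ratio above comes from a multivariable logistic model. As a minimal sketch, using hypothetical quartile counts rather than the study data, the following shows how a top- versus bottom-quartile odds ratio is computed and how an OR of about 0·54 reads as roughly 46 % lower odds.

```python
# Illustrative sketch only: hypothetical quartile counts, not the study data.
# Shows how a Q4 vs. Q1 odds ratio (OR) is computed and how an OR of about
# 0.54 reads as roughly "46 % lower odds".

def odds(events, non_events):
    """Odds of having extensive AAC within one intake quartile."""
    return events / non_events

# Hypothetical counts of extensive / not-extensive AAC per quartile (171 women each)
q4_extensive, q4_not_extensive = 23, 148   # highest cruciferous intake (Q4)
q1_extensive, q1_not_extensive = 38, 133   # lowest cruciferous intake (Q1)

odds_ratio = odds(q4_extensive, q4_not_extensive) / odds(q1_extensive, q1_not_extensive)
print(f"Unadjusted OR (Q4 vs Q1): {odds_ratio:.2f}")
print(f"Interpretation: about {100 * (1 - odds_ratio):.0f} % lower odds of extensive AAC")
```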
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but actually increased for subjects who had activation (48% versus 58%; 95% CI, 4.6% to 15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
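As a minimal sketch with hypothetical activation counts (not the study data), the snippet below shows how a positive predictive value of prehospital activation and a crude odds ratio between study periods are calculated; the study's reported OR of 1.4 need not match this crude figure.

```python
# Minimal sketch with hypothetical counts (not the study data): how a positive
# predictive value (PPV) of cardiac catheterization laboratory activation and
# a crude odds ratio (OR) between study periods are calculated.

def ppv(confirmed, activations):
    """PPV = activations confirmed by PCI or CABG / all activations."""
    return confirmed / activations

# Hypothetical activation counts, chosen only to illustrate the arithmetic
pre_confirmed, pre_activations = 379, 1000     # pre-intervention period
post_confirmed, post_activations = 486, 1000   # post-intervention period

ppv_pre = ppv(pre_confirmed, pre_activations)
ppv_post = ppv(post_confirmed, post_activations)

# Crude OR for an activation being a true positive, post vs. pre
odds_pre = ppv_pre / (1 - ppv_pre)
odds_post = ppv_post / (1 - ppv_post)
print(f"PPV pre {ppv_pre:.1%}, post {ppv_post:.1%}, crude OR {odds_post / odds_pre:.2f}")
```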
N95 respirators are personal protective equipment most often used to control exposures to infections transmitted via the airborne route. Supplies of N95 respirators can become depleted during pandemics or when otherwise in high demand. In this paper, we offer strategies for optimizing supplies of N95 respirators in health care settings while maximizing the level of protection offered to health care personnel when supply is limited in the United States during the coronavirus disease 2019 (COVID-19) pandemic. The strategies are intended for use by professionals who manage respiratory protection programs, occupational health services, and infection prevention programs in health care facilities to protect health care personnel from job-related risks of exposure to infectious respiratory illnesses. Consultation with federal, state, and local public health officials is also important. We use the framework of surge capacity and the occupational health and safety hierarchy of controls approach to discuss specific engineering control, administrative control, and personal protective equipment measures that may help in optimizing N95 respirator supplies.
Introduction: Determining fluid status prior to resuscitation provides a more accurate guide for appropriate fluid administration in the setting of undifferentiated hypotension. Emergency Department (ED) point of care ultrasound (PoCUS) has been proposed as a potential non-invasive, rapid, repeatable investigation to ascertain inferior vena cava (IVC) characteristics. Our goal was to determine the feasibility of using PoCUS to measure IVC size and collapsibility. Methods: This was a planned secondary analysis of data from a prospective multicentre international study investigating PoCUS in ED patients with undifferentiated hypotension. We prospectively collected data on IVC size and collapsibility using a standard data collection form in 6 centres. The primary outcome was the proportion of patients with a clinically useful (determinate) scan, defined as a clearly visible intrahepatic IVC measurable for size and collapse. Descriptive statistics are provided. Results: A total of 138 scans were attempted on 138 patients; 45.7% were women and the median age was 58 years. Overall, 129 scans (93.5%; 95% CI 87.9 to 96.7%) were determinate; 131 (94.9%; 89.7 to 97.7%) were determinate for IVC size, and 131 (94.9%; 89.7 to 97.7%) for collapsibility. Conclusion: In this analysis of 138 ED patients with undifferentiated hypotension, the vast majority of PoCUS scans to investigate IVC characteristics were determinate. Future work should include analysis of the value of IVC size and collapsibility in determining fluid status in this group.
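The abstract does not state which binomial method was used for the confidence intervals; as a sketch under that assumption, a Wilson score interval for the overall determinate-scan proportion gives bounds close to those reported.

```python
# Sketch only: a Wilson score interval for the proportion of determinate scans
# (129/138). The exact method used in the study is not stated, so the bounds
# may differ slightly from those reported in the abstract.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    centre = p + z**2 / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    denom = 1 + z**2 / n
    return (centre - margin) / denom, (centre + margin) / denom

lo, hi = wilson_ci(129, 138)
print(f"Determinate scans: 129/138 = {129/138:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```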
Introduction: Patients presenting to the emergency department (ED) with hypotension have a high mortality rate and require careful yet rapid resuscitation. The use of cardiac point of care ultrasound (PoCUS) in the ED has progressed beyond the basic indications of detecting pericardial fluid and activity in cardiac arrest. We examine whether finding left ventricular dysfunction (LVD) on emergency physician performed PoCUS reliably predicts the presence of cardiogenic shock in hypotensive ED patients. Methods: We prospectively collected PoCUS findings in 135 ED patients with undifferentiated hypotension as part of an international study. Patients with clearly identified etiologies for hypotension, or with other specific presumptive diagnoses, were excluded. LVD was defined as identification of a generally hypodynamic LV in the setting of shock. PoCUS findings were collected using a standardized protocol and data collection form. All scans were performed by PoCUS-trained emergency physicians. Final shock type was defined as cardiogenic or non-cardiogenic by independent specialist blinded chart review. Results: All 135 patients had complete follow up. Median age was 56 years, and 53% of patients were male. Disease prevalence for cardiogenic shock was 12% and the mortality rate was 24%. The presence of LVD on PoCUS had a sensitivity of 62.50% (95% CI 35.43% to 84.80%), specificity of 94.12% (88.26% to 97.60%), positive LR of 10.62 (4.71 to 23.95), negative LR of 0.40 (0.21 to 0.75) and accuracy of 90.37% (84.10% to 94.77%) for detecting cardiogenic shock. Conclusion: Detecting left ventricular dysfunction on PoCUS in the ED may be useful in confirming the underlying shock type as cardiogenic in otherwise undifferentiated hypotensive patients.
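The reported diagnostic metrics can be reproduced from a 2x2 table reconstructed from the abstract's percentages (a reconstruction, not published patient-level data): roughly 16 cardiogenic and 119 non-cardiogenic patients among the 135 analysed. The sketch below shows the standard calculations.

```python
# Sketch: diagnostic accuracy metrics from a 2x2 table. Counts below are
# reconstructed from the reported percentages (an assumption), not taken from
# published patient-level data.
tp, fn = 10, 6     # LVD seen / not seen on PoCUS among cardiogenic shock patients
fp, tn = 7, 112    # LVD seen / not seen among non-cardiogenic shock patients

sens = tp / (tp + fn)
spec = tn / (tn + fp)
lr_pos = sens / (1 - spec)
lr_neg = (1 - sens) / spec
accuracy = (tp + tn) / (tp + fn + fp + tn)

print(f"Sensitivity {sens:.2%}, specificity {spec:.2%}")
print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}, accuracy {accuracy:.2%}")
```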
Early detection and indicated early intervention in the initial prodromal phase should considerably improve the course of psychoses. Yet the benefits of such programmes still require an evidence-based evaluation on the basis of a sufficient sample size.
This report presents an overview of the concept and design of the European Prediction of Psychosis Study (EPOS), a European four-country naturalistic field study of the initial prodrome.
Materials and Methods
Across six participating centres (Germany: Cologne, Berlin; Finland: Turku; The Netherlands: Amsterdam; United Kingdom: Birmingham, Manchester), 16- to 40-year-old putatively prodromal persons attending specialized services or general psychiatric services underwent multi-level baseline, 9-month follow-up, and 18-month follow-up examinations. Inclusion criteria were the presence of at least one of the following: attenuated positive symptoms (APS), brief limited intermittent psychotic symptoms (BLIPS), at least 2 of 9 basic symptoms (BS), or familial risk or schizotypal personality disorder plus reduced functioning (FR+RF). In addition, psychopathological, neurocognitive, neurobiological, psychosocial, and service and treatment-related assessments were carried out.
A substantial proportion of the more than 250 subjects included in the study participated in their respective baseline, first follow-up, and second follow-up examinations. A high percentage presented with BS and/or APS, a smaller percentage with BLIPS or FR+RF. The rates of transition into psychosis and the levels of psychopathology, distress and functional decline found among this patient group underline the need for indicated early recognition and intervention.
EPOS provides for the first time a sound database allowing an evaluation of the applicability and cost-benefit ratio of early detection and intervention programmes in Europe.
A main objective of EPOS is to provide a valid multifactorial model for the prediction of psychosis. One major element of such a model should be the clinical state.
In a European multicentre study, persons fulfilling clinical criteria thought to indicate an increased risk for psychosis (PAR) were assessed, among other measures, with different psychopathological instruments covering the whole spectrum from basic symptoms to frank psychotic symptoms. Inclusion criteria comprised attenuated positive symptoms (APS), brief limited intermittent psychotic symptoms (BLIPS), cognitive basic symptoms (CogDis) and a combination of family risk and reduced functioning (S&T).
246 PAR were included in the study, mostly on the basis of APS or CogDis. Analysis of demographic data showed a high degree of functional impairment, reflected, for example, in low mean GAF scores (51.0 ± 11.8 SD), and a high rate of non-psychotic axis-I disorders. In September 2006, the hazard rate for conversion to psychosis was 15.3 at 12 months and 20.0 at 18 months after baseline assessment. According to the inclusion criteria, the highest rate of conversion was observed among PAR with BLIPS. On a dimensional level, a low GAF score was among the best predictors of conversion.
The transition rates of EPOS were in line with recent studies. A first analysis of clinical data supports the notion that the functional state should be an inherent part of any set of clinical risk criteria. Further analysis will consider the contribution of single symptoms or symptom combinations and the impact of symptom duration.
The aim of this presentation is to summarize the results of the published systematic reviews and meta-analyses of randomized controlled trials that have investigated the effectiveness of medications for the treatment of obsessive-compulsive disorder (OCD) in children/adolescents and adults, and to present preliminary results from a new review study using network meta-analytic techniques.
Medline, Cochrane database, and the register of controlled trials maintained by the Cochrane Collaboration Depression, Anxiety & Neurosis Group (CCDAN) were searched for relevant trials, systematic reviews and/or meta-analyses.
For the new review study, we were able to extract 103 pharmacological arms (preliminary results: 9 for fluoxetine, 16 for fluvoxamine, 11 for paroxetine, 10 for sertraline, 16 for clomipramine, 7 for citalopram/escitalopram, 2 for venlafaxine, and 32 for placebo) with a total of 6572 patients randomized. The previous meta-analyses have confirmed the efficacy of all SSRIs and clomipramine in all age groups, while for other drugs further evidence is required. In the context of this presentation, preliminary results on the relative efficacy of each drug, obtained using network meta-analysis techniques, will also be reported.
Several antidepressants have established efficacy and acceptability for the management of non-resistant OCD. The major weakness of the literature so far is that head-to-head comparisons between antidepressants are few, and it is therefore difficult to establish a clear hierarchy of the efficacy and acceptability of the various agents. This gap will be addressed by the present review.
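The core idea that lets a network meta-analysis rank drugs despite scarce head-to-head trials is indirect comparison through a common comparator. As a minimal sketch with made-up numbers (not results from this review), the snippet below shows the Bucher adjusted indirect comparison, the simplest building block of such a network.

```python
# Illustrative sketch only (made-up effect estimates): two drugs compared
# indirectly through a shared placebo arm, as in the Bucher adjusted indirect
# comparison. Effects are standardized mean differences vs. placebo.
from math import sqrt

# Hypothetical direct estimates (effect, standard error) vs. placebo
drug_a_vs_placebo = (-0.45, 0.10)   # e.g. a hypothetical SSRI
drug_b_vs_placebo = (-0.60, 0.12)   # e.g. a hypothetical comparator drug

effect_a, se_a = drug_a_vs_placebo
effect_b, se_b = drug_b_vs_placebo

# Indirect comparison of A vs. B through placebo
indirect_effect = effect_a - effect_b
indirect_se = sqrt(se_a**2 + se_b**2)
ci = (indirect_effect - 1.96 * indirect_se, indirect_effect + 1.96 * indirect_se)
print(f"A vs B (indirect): {indirect_effect:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```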
We enumerate factorizations of a Coxeter element in a well-generated complex reflection group into arbitrary factors, keeping track of the fixed space dimension of each factor. In the infinite families of generalized permutations, our approach is fully combinatorial. It gives results analogous to those of Jackson in the symmetric group and can be refined to encode a notion of cycle type. As one application of our results, we give a previously overlooked characterization of the poset of W-noncrossing partitions.
Broadmoor is a high-secure psychiatric hospital divided into personality disorder (PD) and mental illness (MI) pathways. Whenever an incident occurs, it should be recorded. To better understand which factors influence the rate of incidents, such as diagnosis or intervention by medical and psychological staff, we examined the difference in the number of incidents recorded on weekdays versus weekends, on ward round (WR) versus non-WR days, and in the PD versus MI pathways.
All incidents recorded over a one-year period (3.11.2014–2.11.2015) were examined. Extraneous incidents were excluded, leaving subgroups of “aggressive” (physical and verbal) and “physical” (excluding verbal) incidents which were analysed. Data were adjusted for the difference in number of beds in each pathway.
Of the 2369 incident reports included, more were recorded per day on weekdays than weekends, with little difference on WR versus non-WR days. The rates of both types of incidents were similar on both PD and MI admission wards, although the rate of “physical” incidents was 2.6 times higher and “aggressive” incidents 3.3 times higher in PD compared to MI rehabilitation wards.
The findings suggest that the presence of medical and psychological staff during the week, and possibly the requirements they place on patients, may increase the rate of incidents within the hospital. Despite comparable rates on admission wards, MI rehabilitation wards have far fewer incidents than PD rehabilitation wards, which may reflect the more intractable nature of PD versus MI. More work is required to confirm these findings.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
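As a toy illustration of the bed-number adjustment described in the methods above, the sketch below uses made-up counts (not the hospital's data) to show how incident rates per bed-day are compared between pathways.

```python
# Sketch with hypothetical numbers (not the study data): adjusting incident
# counts for the differing number of beds in each pathway before comparing rates.
pd_incidents, pd_beds = 130, 60    # hypothetical PD rehabilitation-ward figures
mi_incidents, mi_beds = 75, 90     # hypothetical MI rehabilitation-ward figures

days = 365
pd_rate = pd_incidents / (pd_beds * days)   # incidents per bed-day
mi_rate = mi_incidents / (mi_beds * days)
print(f"Rate ratio (PD vs MI rehabilitation wards): {pd_rate / mi_rate:.1f}")
```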
The objective of this study was to describe, in the words of child-rearing parents with incurable cancer, what they had gained or thought about as a result of participating in a five-session, scripted, telephone-delivered psycho-educational parenting intervention, the Enhancing Connections Program in Palliative Care.
A total of 26 parents completed the program. Parents’ responses were audio-recorded and transcribed verbatim and verified for accuracy. The analysis proceeded through four steps: unitizing, coding into categories, defining categories, and formation of a core construct that explained parents’ attributed gains. Trustworthiness of study results was protected by coding to consensus, formal peer debriefing, and maintaining an audit trail.
Although 50% reached or exceeded clinical cutoff scores on anxiety and 42% reached or exceeded clinical cutoff scores on depressed mood, parents extensively elaborated on what they had gained. Results revealed six categories of competencies they attributed to their participation in the program: (1) being ready for a conversation about my cancer, (2) bringing things out in the open, (3) listening better to my child, (4) getting my child to open up, (5) not getting in my child's way, and (6) changing my parenting.
Despite an extensive symptom burden, parents with incurable cancer attributed major gains to a brief, fully scripted, cancer parenting communication intervention. A manualized, telephone-delivered educational counseling program for symptomatic parents with incurable cancer has the potential to augment competencies for parents as they help their children manage the cancer experience.
Background: Biallelic variants in POLR1C are associated with POLR3-related leukodystrophy (POLR3-HLD), or 4H leukodystrophy (Hypomyelination, Hypodontia, Hypogonadotropic Hypogonadism), and Treacher Collins syndrome (TCS). The clinical spectrum of POLR3-HLD caused by variants in this gene has not been described. Methods: A cross-sectional observational study involving 25 centers worldwide was conducted between 2016 and 2018. The clinical, radiologic and molecular features of 23 unreported and previously reported cases of POLR3-HLD caused by POLR1C variants were reviewed. Results: Most participants presented between birth and age 6 years with motor difficulties. Neurological deterioration was seen during childhood, suggesting a more severe phenotype than previously described. The dental, ocular and endocrine features often seen in POLR3-HLD were not invariably present. Five patients (22%) had a combination of hypomyelinating leukodystrophy and abnormal craniofacial development, including one individual with clear TCS features. Several cases did not exhibit all the typical radiologic characteristics of POLR3-HLD. A total of 29 different pathogenic variants in POLR1C were identified, including 13 new disease-causing variants. Conclusions: Based on the largest cohort of patients to date, these results suggest novel characteristics of POLR1C-related disorder, with a spectrum of clinical involvement characterized by hypomyelinating leukodystrophy with or without abnormal craniofacial development reminiscent of TCS.
Rapid increases in herbicide resistance have highlighted the ability of weeds to undergo genetic change within a short period of time. That change, in turn, has resulted in an increasing emphasis in weed science on the evolutionary ecology and potential adaptation of weeds to herbicide selection. Here we argue that a similar emphasis would also be invaluable for understanding another challenge that will profoundly alter weed biology: the rapid rise in atmospheric carbon dioxide (CO2) and the associated changes in climate. Our review of the literature suggests that elevated CO2 and climate change will impose strong selection pressures on weeds and that weeds will often have the capacity to respond with rapid adaptive evolution. Based on current data, climate change and rising CO2 levels are likely to alter the evolution of agronomic and invasive weeds, with consequences for distribution, community composition, and herbicide efficacy. In addition, we identify four key areas that represent clear knowledge gaps in weed evolution: (1) differential herbicide resistance in response to a rapidly changing CO2/climate confluence; (2) shifts in the efficacy of biological constraints (e.g., pathogens) and resultant selection shifts in affected weed species; (3) climate-induced phenological shifts in weed distribution, demography, and fitness relative to crop systems; and (4) understanding and characterization of epigenetics and the differential expression of phenotypic plasticity versus evolutionary adaptation. These consequences, in turn, should be of fundamental interest to the weed science community.
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP <100 mmHg or shock index >1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use, and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550 – 2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458 – 1916 mL); and cardiogenic control 768 mL (194 – 1341 mL) vs. cardiogenic PoCUS 981 mL (341 – 1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive. Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
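As a back-of-the-envelope sketch using only the means and 95% CIs reported above, and assuming approximate normality (the study itself may have used a different test), one can check that the non-cardiogenic control versus PoCUS difference in bolus volume is compatible with no effect.

```python
# Sketch: rough check of "no difference in fluid bolus volume" from the reported
# means and 95% CIs, under a normal approximation (an assumption; the study may
# have used a different statistical test).
from math import sqrt

def se_from_ci(lower, upper, z=1.96):
    """Standard error recovered from a reported 95% CI."""
    return (upper - lower) / (2 * z)

# Non-cardiogenic subgroup, volumes in mL (figures from the abstract)
control_mean, control_se = 1878, se_from_ci(1550, 2206)
pocus_mean, pocus_se = 1687, se_from_ci(1458, 1916)

diff = control_mean - pocus_mean
diff_se = sqrt(control_se**2 + pocus_se**2)
lo, hi = diff - 1.96 * diff_se, diff + 1.96 * diff_se
print(f"Difference {diff:.0f} mL (95% CI {lo:.0f} to {hi:.0f} mL) -> interval crosses zero")
```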
Introduction: Point of care ultrasound (PoCUS) has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses, including shock category, were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity of 95.5% (90.0 to 98.1%), LR+ of 17.9 (7.34 to 43.8), LR− of 0.21 (0.08 to 0.58), diagnostic OR of 85.6 (18.2 to 403.6) and accuracy of 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity of 93.8% (87.8 to 97.0%), LR+ of 14.8 (7.1 to 30.9), LR− of 0.09 (0.01 to 0.58), diagnostic OR of 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)). Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Positive genetic correlation was observed between MD and AD (rg(MD–AD) = +0.47, P = 6.6 × 10⁻¹⁰). AC-quantity showed positive genetic correlation with both AD (rg(AD–AC-quantity) = +0.75, P = 1.8 × 10⁻¹⁴) and MD (rg(MD–AC-quantity) = +0.14, P = 2.9 × 10⁻⁷), while there was negative correlation of AC-frequency with MD (rg(MD–AC-frequency) = −0.17, P = 1.5 × 10⁻¹⁰) and a non-significant result with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD–AD results reflect a mediated-pleiotropy mechanism (i.e. a causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10⁻⁶). There was no evidence for reverse causation.
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
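The abstract does not specify which MR estimator produced the causal estimate above; as a sketch under that caveat, the snippet below shows the inverse-variance weighted (IVW) estimator, one common MR approach, applied to synthetic per-SNP summary statistics.

```python
# Illustrative sketch using synthetic summary statistics (not PGC or UK Biobank
# data): the inverse-variance weighted (IVW) Mendelian randomization estimator,
# one common way to estimate a causal effect of an exposure (e.g. MD) on an
# outcome (e.g. AD) from per-SNP association estimates. The study does not state
# that this exact estimator was used.
beta_exposure = [0.021, 0.035, 0.018, 0.042, 0.027]   # SNP effects on the exposure
beta_outcome  = [0.006, 0.010, 0.004, 0.013, 0.008]   # SNP effects on the outcome
se_outcome    = [0.003, 0.004, 0.003, 0.005, 0.004]   # SEs of the outcome effects

weights = [1 / se**2 for se in se_outcome]
num = sum(bx * by * w for bx, by, w in zip(beta_exposure, beta_outcome, weights))
den = sum(bx**2 * w for bx, w in zip(beta_exposure, weights))
ivw_beta = num / den          # weighted regression through the origin
ivw_se = (1 / den) ** 0.5
print(f"IVW causal estimate: {ivw_beta:.2f} (SE {ivw_se:.2f})")
```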
Measuring diet choice in grazing animals is challenging, complicating the assessment of feed efficiency in pasture-based systems. Furthermore, animals may modify their intake of a forage species depending on its nutritive value and on their own physiological status. Various fecal markers have been used to estimate feed intake in grazing animals. However, plant-wax markers such as n-alkanes (ALK) and long-chain alcohols may provide reliable estimates of both dietary choices and intakes. Still, their use in beef cattle has been relatively limited. The present study was designed to test the reliability of the ALK technique to estimate diet choices in beef heifers. Twenty-two Angus-cross heifers were evaluated at both post-weaning and yearling age. At each age, they were offered both red clover and fescue hay as cubes. Following 3-week acclimation periods, intake of each forage species was measured daily for 10 days. During the final 5 days, fecal grab samples were collected twice daily. The ALK fecal concentrations were adjusted using recovery fractions compiled from the literature. Diet composition was estimated using two statistical methods. Post-weaning, dietary choices were reliably estimated, with low residual error, regardless of the statistical approach adopted. The regression of observed on estimated red clover proportion ranged from 0.85±0.08 for fecal samples collected in the p.m. to 1.01±0.09 for daily proportions once averaged. However, at yearling age, the estimates were less reliable. There was a tendency to overestimate the red clover proportion in diets of heifers preferring fescue, and vice versa. This was due to greater variability in ALK fecal concentrations in the yearling heifers. Overall, the ALK technique provided a reliable tool for estimating diet choice in animals fed a simple forage diet. Although further refinements in the application of this methodology are needed, plant-wax markers provide opportunities for evaluating diet composition in grazing systems in cattle.
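As a simplified sketch of the mixing-model idea behind the ALK technique, using made-up alkane concentrations and assumed recovery fractions (the study used its own alkane set, literature recoveries, and two statistical methods), the snippet below estimates the red clover proportion of a two-forage diet by least squares.

```python
# Simplified sketch with made-up n-alkane concentrations (mg/kg DM) and assumed
# recovery fractions: estimating the red clover proportion of a red clover +
# fescue diet by least squares after correcting fecal concentrations for
# incomplete recovery. Differences in digestibility and intake between forages
# are ignored here, so this only illustrates the mixing-model idea.
import numpy as np

# Herbage and fecal alkane profiles for C27, C29, C31, C33 (hypothetical values)
clover = np.array([45.0, 210.0, 160.0, 30.0])   # red clover herbage
fescue = np.array([30.0, 120.0, 310.0, 95.0])   # fescue herbage
fecal  = np.array([28.0, 132.0, 200.0, 56.0])   # observed fecal concentrations
recovery = np.array([0.75, 0.80, 0.85, 0.90])   # assumed fecal recovery fractions

fecal_corrected = fecal / recovery  # undo incomplete fecal recovery

# Grid-search the clover proportion that best fits the corrected fecal profile
candidates = np.linspace(0, 1, 1001)
errors = [np.sum((fecal_corrected - (p * clover + (1 - p) * fescue))**2)
          for p in candidates]
best = candidates[int(np.argmin(errors))]
print(f"Estimated red clover proportion of the diet: {best:.2f}")
```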
Night-migratory songbirds appear to sense the direction of the Earth's magnetic field via radical pair intermediates formed photochemically in cryptochrome flavoproteins contained in photoreceptor cells in their retinas. It is an open question whether this light-dependent mechanism could be sufficiently sensitive given the low-light levels experienced by nocturnal migrants. The scarcity of available photons results in significant uncertainty in the signal generated by the magnetoreceptors distributed around the retina. Here we use results from Information Theory to obtain a lower bound estimate of the precision with which a bird could orient itself using only geomagnetic cues. Our approach bypasses the current lack of knowledge about magnetic signal transduction and processing in vivo by computing the best-case compass precision under conditions where photons are in short supply. We use this method to assess the performance of three plausible cryptochrome-derived flavin-containing radical pairs as potential magnetoreceptors.
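As a heavily simplified illustration of how a photon-limited lower bound on compass precision can be computed (the functional form, parameter values, and use of a Cramér-Rao-style bound below are all assumptions for illustration, not the study's actual model), consider receptors whose Poisson-distributed photon signal is weakly modulated by the angle between the receptor axis and the geomagnetic field.

```python
# Heavily simplified sketch (all numbers and the modulation model are assumptions,
# not taken from the study): a Cramér-Rao-style lower bound on heading error when
# each magnetoreceptor's photon-limited signal is Poisson distributed and weakly
# modulated by the angle between the receptor axis and the geomagnetic field.
from math import cos, sin, pi, sqrt, degrees

n_receptors = 10_000       # receptors contributing per integration time (assumed)
mean_photons = 50.0        # mean photons absorbed per receptor (assumed)
modulation = 0.02          # fractional modulation of yield with field angle (assumed)

def heading_error_bound(heading=0.3):
    """Cramér-Rao lower bound on the SD of a heading estimate (radians)."""
    fisher = 0.0
    for k in range(n_receptors):
        axis = pi * k / n_receptors                    # receptor axis orientation
        rate = mean_photons * (1 + modulation * cos(2 * (heading - axis)))
        d_rate = -2 * mean_photons * modulation * sin(2 * (heading - axis))
        fisher += d_rate**2 / rate                     # Fisher information (Poisson)
    return 1 / sqrt(fisher)

print(f"Best-case heading SD: {degrees(heading_error_bound()):.1f} degrees")
```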
An insect trap constructed using three-dimensional (3D) printing technology was tested in potato (Solanum tuberosum Linnaeus; Solanaceae) fields to determine whether it could substitute for the standard yellow sticky card used to monitor Bactericera cockerelli (Šulc) (Hemiptera: Psylloidea: Triozidae). Sticky cards have shortcomings that prompted a search for a replacement: cards are messy, require weekly replacement, are expensive to purchase, and accumulate large numbers of nontarget insects. Bactericera cockerelli on sticky cards also deteriorate enough that specimens cannot be tested reliably for the presence of vectored plant pathogens. A prototype trap constructed using 3D printing technology for monitoring Diaphorina citri Kuwayama (Hemiptera: Psylloidea: Liviidae) was tested for monitoring B. cockerelli. The trap was designed to attract B. cockerelli visually and then funnel specimens into preservative-filled vials at the trap bottom. Prototype traps were paired against yellow sticky cards at multiple fields to compare the captures of B. cockerelli between cards and traps. The prototype trap was competitive with sticky cards early in the growing season when B. cockerelli numbers were low. We estimated that two or three prototype traps would collect as many B. cockerelli as one sticky card under these conditions. Efficacy of the prototype declined as B. cockerelli numbers increased seasonally. The prototype trap accumulated nontarget taxa that are common on sticky cards (especially Thysanoptera and Diptera), and was also found to capture taxa of possible interest in integrated pest management research, including predatory insects, parasitic Hymenoptera, and winged Aphididae (Hemiptera), suggesting that the traps could be useful beyond the purpose targeted here. We believe that 3D printing technology has substantial promise for developing monitoring tools that exploit behavioural traits of the targeted insect. Ongoing work includes the use of this technology to modify the prototype, with a focus on making it more effective at capturing psyllids and less susceptible to capture of nontarget species.