At present, the analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. Examining food combinations provides a way to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. To derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants (BLEND) study, which includes data from 18 case-control studies and 1 nested case-cohort study, comprising 8,320 BC cases among 31,551 participants. Dietary data on the 11 main food groups of the Eurocode 2 Core classification codebook and relevant non-diet data (i.e. sex, age and smoking status) were available. Five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond with previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
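As an illustration of the supervised workflow described above, the hedged sketch below uses scikit-learn's DecisionTreeClassifier with 10-fold cross-validation as a stand-in for C5.0 (which has no scikit-learn implementation); the file path and column names are hypothetical and not taken from the BLEND data.

```python
# Illustrative sketch only: the BLEND analysis used the C5.0 algorithm; scikit-learn
# has no C5.0 implementation, so a CART decision tree stands in here. The file path
# and column names are hypothetical, not taken from the study.
import pandas as pd
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("blend_food_groups.csv")           # hypothetical input file
food_groups = ["beverages_non_milk", "grains", "vegetables",
               "fats_oils", "meats"]                 # hypothetical Eurocode 2 columns
covariates = ["sex", "age", "smoking_status"]
X = pd.get_dummies(df[food_groups + covariates], drop_first=True)
y = df["bc_case"]                                    # 1 = bladder cancer case, 0 = control

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)   # 10-fold CV as in the abstract
scores = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"Mean cross-validated AUC: {scores.mean():.3f}")

# Rank predictors by importance on a full-data fit (proxy for the "key food groups")
clf.fit(X, y)
ranking = sorted(zip(X.columns, clf.feature_importances_), key=lambda t: -t[1])
for name, importance in ranking[:5]:
    print(f"{name}: {importance:.3f}")
```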
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Most older adults perceive themselves as good drivers; however, their perception may not be accurate and could negatively affect their driving safety. This study examined the accuracy of older drivers’ self-awareness of driving ability in their everyday driving environment by determining the concordance between perceived driving performance (assessed by the Perceived Driving Ability [PDA] questionnaire) and actual driving performance (assessed by the electronic Driving Observation Schedule [eDOS]). One hundred and eight older drivers (male: 67.6%; age: mean = 80.6 years, standard deviation [SD] = 4.9 years) who participated in the study were classified into three groups: underestimation (19%), accurate estimation (29%), and overestimation (53%). Using the demographic and clinical functioning information collected in the Candrive annual assessments, an ordinal regression showed that two factors were related to the accuracy of self-awareness: older drivers with better visuo-motor processing speed, measured by the Trail Making Test Part A (TMT-A), and fewer self-reported comorbid conditions tended to overestimate their driving ability, and vice versa.
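The ordinal regression reported above could, in principle, be fitted with a proportional-odds model such as the one sketched below; the variable names and input file are hypothetical, and the Candrive data themselves are not reproduced here.

```python
# Hedged illustration of an ordinal (proportional-odds) regression like the one described
# in the abstract. Variable names and the input file are hypothetical assumptions.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("candrive_awareness.csv")           # hypothetical input file

# Outcome ordered from underestimation to overestimation
awareness_dtype = pd.CategoricalDtype(categories=["under", "accurate", "over"], ordered=True)
y = df["awareness_group"].astype(awareness_dtype)
X = df[["tmt_a_seconds", "n_comorbid_conditions"]]    # TMT-A completion time and comorbidity count

model = OrderedModel(y, X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
# A negative coefficient on tmt_a_seconds (more seconds = slower processing) would indicate
# that faster processing speed shifts drivers toward overestimation, consistent with the
# reported finding.
```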
Campylobacteriosis is the most common notifiable disease in New Zealand. While the risk of campylobacteriosis has been found to be strongly associated with the consumption of undercooked poultry, other risk factors include rainwater-sourced drinking water, contact with animals and consumption of raw dairy products. Despite this, there has been little investigation of raw milk as a risk factor for campylobacteriosis. Recent increases in demand for untreated or ‘raw’ milk have also raised concerns that this exposure may become a more important source of disease in the future. This study describes the cases of notified campylobacteriosis from a sentinel surveillance site. Previously collected data from notified cases of raw milk-associated campylobacteriosis were examined and compared with campylobacteriosis cases who did not report raw milk consumption. Raw milk campylobacteriosis cases differed from non-raw milk cases on comparison of age and occupation demographics, with raw milk cases more likely to be younger and categorised as children or students for occupation. Raw milk cases were more likely to be associated with outbreaks than non-raw milk cases. Study-suggested motivations for raw milk consumption (health reasons, natural product, produced on farm, inexpensive or to support locals) were not strongly supported by cases. More information about the raw milk consumption habits of New Zealanders would be helpful to better understand the risks of this disease, especially with respect to increased disease risk observed in younger people. Further discussion with raw milk consumers around their motivations may also be useful to find common ground between public health concerns and consumer preferences as efforts continue to manage this ongoing public health issue.
Electrochemical capacitors featuring a modified acetonitrile (AN) electrolyte and a binder-free, activated carbon fabric electrode material were assembled and tested at <−40 °C. The melting point of the electrolyte was depressed relative to the standard pure AN solvent through the use of a methyl formate cosolvent, to enable operation at temperatures lower than the rated limit of typical commercial cells (−40 °C). Based on earlier electrolyte formulation studies, a 1:1 ratio of methyl formate to AN (by volume) was selected, to maximize freezing point depression while maintaining a sufficient salt solubility. The salt spiro-(1,1′)-bipyrrolidinium tetrafluoroborate was used, based on its improved conductivity at low temperatures, relative to linear alkyl ammonium salts. The carbon fabric electrode supported a relatively high rate capability at temperatures as low as −65 °C with a modest increase in cell resistance at this reduced temperature. The capacitance was only weakly dependent on temperature, with a specific capacitance of ∼110 F/g.
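For readers unfamiliar with how a gravimetric figure such as ∼110 F/g is typically obtained, the following minimal sketch applies the standard galvanostatic-discharge relation for a symmetric two-electrode cell; the discharge numbers are placeholders, not measurements from this study.

```python
# Minimal sketch of how a gravimetric specific capacitance (F/g) is commonly derived
# from a galvanostatic discharge, shown only to make the ~110 F/g figure concrete.
# Input numbers are placeholders, not data from the study.
def specific_capacitance(current_a, discharge_time_s, voltage_window_v, total_electrode_mass_g):
    """Per-electrode specific capacitance for a symmetric two-electrode cell.

    C_cell = I * dt / dV, and for two identical electrodes in series
    C_sp = 4 * C_cell / m_total.
    """
    c_cell = current_a * discharge_time_s / voltage_window_v
    return 4.0 * c_cell / total_electrode_mass_g

# Placeholder example: 10 mA discharge over 55 s across a 2.0 V window, 10 mg of carbon
print(f"{specific_capacitance(0.010, 55.0, 2.0, 0.010):.0f} F/g")   # -> 110 F/g
```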
The purpose of this study was to examine whether vehicle type based on size (car vs. other = truck/van/SUV) had an impact on the speeding, acceleration, and braking patterns of older male and female drivers (70 years and older) from a Canadian longitudinal study. The primary hypothesis was that older adults driving larger vehicles (e.g., trucks, SUVs, or vans) would be more likely to speed than those driving cars. Participants (n = 493) had a device installed in their vehicles that recorded their everyday driving. The findings suggest that the type of vehicle driven had little or no impact on per cent of time speeding or on the braking and accelerating patterns of older drivers. Given that the propensity for exceeding the speed limit was high among these older drivers, regardless of vehicle type, future research should examine what effect this behaviour has on older-driver road safety.
Epistaxis is the most common ENT emergency. This study aimed to assess one-year mortality rates in patients admitted to a large teaching hospital.
This study was a retrospective case note analysis of all patients admitted to the Queen Elizabeth University Hospital in Glasgow with epistaxis over a 12-month period.
The one-year overall mortality for a patient admitted with epistaxis was 9.8 per cent. The patients who died were older (mean age 77.2 vs 68.8 years; p = 0.002), had a higher Cumulative Illness Rating Scale-Geriatric score (9.9 vs 6.7; p < 0.001) and had a higher performance status score (2 or higher vs less than 2; p < 0.001). Other risk factors were a low admission haemoglobin level (less than 128 g/l vs 128 g/l or higher; p = 0.025), abnormal coagulation (p = 0.004), low albumin (less than 36 g/l vs 36 g/l or higher; p < 0.001) and a longer length of stay (p = 0.046).
There are a number of risk factors associated with increased mortality after admission with epistaxis. This information could help with risk stratification of patients at admission and enable the appropriate patient support to be arranged.
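A hedged sketch of the kind of admission risk-stratification model these findings could inform is given below; it is a hypothetical logistic regression with assumed variable names, not a model fitted or reported in the study.

```python
# Hypothetical illustration of risk stratification for one-year mortality after an
# epistaxis admission, using the admission variables reported above. Column names and
# the data file are assumptions; no such model is described in the study itself.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("epistaxis_admissions.csv")         # hypothetical input file
formula = ("died_within_1yr ~ age + cirs_g_score + performance_status"
           " + admission_hb_g_l + abnormal_coagulation + albumin_g_l + length_of_stay")
model = smf.logit(formula, data=df).fit(disp=False)
print(model.summary())

# Predicted probabilities could then be banded (e.g. low/medium/high risk) at admission.
df["predicted_risk"] = model.predict(df)
```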
We sought to address the limitations of prior symptom checker accuracy studies by analysing the diagnostic and triage performance of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, focusing on a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers against an emergency physician-determined diagnosis. A retrospective ED analysis was performed on 8363 consecutive adult patients. Eligible patients included 90 with HIV, 67 with hepatitis C and 11 with both HIV and hepatitis C. Five online symptom checkers were used for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which had triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%) and Listed at All (<45%). Significant variations existed between individual symptom checkers: some were more accurate at listing the diagnosis near the top of the differential, whereas others were more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) than HIV patients (35.6%; 32/90) had an initial diagnosis meeting emergent criteria. Symptom checker diagnostic capabilities are markedly inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers to have diagnostic algorithms that account for such complexity. Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and the subsequent symptom checker diagnoses could allow health officials to track illnesses in specific patient populations and geographic regions. To do this, accurate and reliable symptom checkers are needed.
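To make the Top1/Top3/Top10/"Listed at All" metrics concrete, the short sketch below shows one way such agreement rates can be computed from ranked differentials; the data structures and example diagnoses are hypothetical, not records from the study.

```python
# Small sketch of computing Top-N agreement between a symptom checker's ranked
# differential and the ED physician diagnosis. All inputs here are hypothetical.
def top_n_accuracy(cases, n=None):
    """cases: list of (physician_diagnosis, ranked_differential) pairs.
    n=None counts a match anywhere in the list ("Listed at All")."""
    hits = 0
    for physician_dx, differential in cases:
        window = differential if n is None else differential[:n]
        if physician_dx.lower() in (dx.lower() for dx in window):
            hits += 1
    return hits / len(cases)

# Hypothetical example
cases = [
    ("community-acquired pneumonia", ["bronchitis", "community-acquired pneumonia", "influenza"]),
    ("cellulitis", ["deep vein thrombosis", "gout", "cellulitis"]),
]
for label, n in [("Top1", 1), ("Top3", 3), ("Top10", 10), ("Listed at All", None)]:
    print(f"{label}: {top_n_accuracy(cases, n):.0%}")
```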
The north-west European population of Bewick’s Swan Cygnus columbianus bewickii declined by 38% between 1995 and 2010 and is listed as ‘Endangered’ on the European Red List of birds. Here, we combined information on food resources within the landscape with long-term data on swan numbers, habitat use, behaviour and two complementary measures of body condition, to examine whether changes in food type and availability have influenced the Bewick’s Swan’s use of their main wintering site in the UK, the Ouse Washes and surrounding fens. The maximum number of Bewick’s Swans rose from 620 in winter 1958/59 to a high of 7,491 in winter 2004/05, before falling to 1,073 birds in winter 2013/14. Between winters 1958/59 and 2014/15 the Ouse Washes supported between 0.5% and 37.9% of the total population wintering in north-west Europe (mean ± 95% CI = 18.1 ± 2.4%). Swans fed on agricultural crops, shifting from post-harvest remains of root crops (e.g. sugar beet and potatoes) in November and December to winter-sown cereals (e.g. wheat) in January and February. Inter-annual variation in the area cultivated for these crops did not result in changes in the peak numbers of swans occurring on the Ouse Washes. Behavioural and body condition data indicated that food supplies on the Ouse Washes and surrounding fens remain adequate to allow the birds to gain and maintain good body condition throughout winter with no increase in foraging effort. Our findings suggest that the recent decline in numbers of Bewick’s Swans at this internationally important site was not linked to inadequate food resources.
Dengue is the fastest-spreading mosquito-transmitted disease in the world. In China, Guangzhou City is believed to be the most important epicenter of dengue outbreaks, although the transmission patterns are still poorly understood. We developed an autoregressive integrated moving average model incorporating external regressors to examine the association between the monthly number of locally acquired dengue infections and imported cases, mosquito densities, temperature and precipitation in Guangzhou. In multivariate analysis, imported cases and minimum temperature (both at lag 0) were associated with the number of locally acquired infections (P < 0.05). This multivariate model performed best, with the lowest fitting root mean squared error (RMSE) (0.7520), AIC (393.7854) and test RMSE (0.6445), and it showed the best performance in outbreak detection during model validation, with a sensitivity of 1.0000, a specificity of 0.7368 and a consistency rate of 0.7917. Our findings suggest that imported cases and minimum temperature are two key determinants of local dengue transmission in Guangzhou. The modelling method can be used to predict dengue transmission in non-endemic countries and to inform dengue prevention and control strategies.
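A hedged sketch of an ARIMA model with external regressors of this kind, using statsmodels' SARIMAX, is shown below; the model order, input file and column names are assumptions for illustration, not the fitted model from the study.

```python
# Illustrative sketch of an ARIMA model with external regressors (imported cases and
# minimum temperature), in the spirit of the model described above. The order (1, 0, 1),
# the input file and the column names are assumptions, not the study's fitted model.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

df = pd.read_csv("guangzhou_dengue_monthly.csv",      # hypothetical input file
                 parse_dates=["month"], index_col="month")
y = df["local_cases"]
exog = df[["imported_cases", "min_temperature"]]       # both at lag 0, as in the abstract

train, test = y[:-12], y[-12:]
exog_train, exog_test = exog[:-12], exog[-12:]

model = SARIMAX(train, exog=exog_train, order=(1, 0, 1)).fit(disp=False)
print(f"AIC: {model.aic:.2f}")

forecast = model.forecast(steps=len(test), exog=exog_test)
rmse = np.sqrt(np.mean((test.to_numpy() - forecast.to_numpy()) ** 2))
print(f"Test RMSE: {rmse:.4f}")
```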
Introduction: To describe the dosing, duration, and pre- and post-infusion analgesic administration of continuous intravenous sub-dissociative dose ketamine (SDK) infusion for managing a variety of painful conditions in the emergency department (ED). Methods: Retrospective chart review of patients aged 18 and older presenting to the ED with acute and chronic painful conditions who received a continuous SDK infusion in the ED over a 6-year period (2010-2016). Primary data analyses included dosing and duration of infusion, rates of pre- and post-infusion analgesic administration, and final diagnoses. Secondary data included pre- and post-infusion pain scores and rates of side effects. Results: 104 patients were enrolled in the study. The average ketamine infusion rate was 11.26 mg/hr, the mean duration of infusion was 135.87 minutes, and there was a 38% increase in the proportion of patients not requiring post-infusion analgesia. The average decrease in pain score was 5.04. There were 12 reported adverse effects, with nausea being the most prevalent. Conclusion: Continuous intravenous SDK infusion has a role in controlling pain of various etiologies in the ED, with the potential to reduce the need for co-analgesics or rescue analgesic administration. There is a need for more robust, prospective, randomized trials to further evaluate the analgesic efficacy and safety of this modality across a wide range of pain syndromes and different age groups in the ED.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Morelli, Potosky, Arthur, and Tippins (2017) make a timely and appropriate call for authors to create conceptual models of technology in industrial-organizational (I-O) psychology. We agree with their call, but we believe that Morelli et al. overlooked the contributions of related fields that conduct research on technology in the workplace that are already consistent with their call. For this reason, we briefly detail other fields that commonly study the dynamics of technology and its influence on the workplace, followed by a discussion regarding the place of I-O psychology in the broader scheme of technology research. This discussion can aid future authors in conceptualizing appropriate contributions to the study of technology in I-O psychology as well as identifying whether these contributions benefit other fields. Perhaps more importantly, this discussion can help identify where I-O psychology fits in the broader scheme of technology research and which associated fields may be most readily available to aid in the creation of new models—two questions that currently seem unanswered.