Walk or Run to Quit was a national program targeting smoking cessation through group-based running clinics. Increasing physical activity may facilitate smoking cessation as well as lead to additional health benefits beyond cessation.
To evaluate the impact of Walk or Run to Quit over 3 years.
Adult male and female participants (N = 745) looking to quit smoking took part in 156 running-based cessation clinics in 79 locations across Canada. Using a pre-post design, participants completed questionnaires assessing physical activity, running frequency and smoking at the beginning and end of the 10-week program and at 6-month follow-up. Carbon monoxide testing pre- and post-program provided an objective indicator of smoking status, and coach logs assessed implementation.
55.0% of program completers achieved 7-day point prevalence abstinence (intent-to-treat = 22.1%), and carbon monoxide decreased significantly from week 1 to week 10 (P < 0.001). Physical activity and running both increased from baseline to end-of-program (Ps < 0.001). At 6-month follow-up, 28.9% of participants contacted self-reported prolonged 6-month abstinence (intent-to-treat = 11.4%) and 35.6% were still running regularly.
Although attrition was a concern, Walk or Run to Quit demonstrated potential as a scalable behaviour change intervention that targets both cessation and physical activity.
Ecosystem engineers such as the Antarctic scallop (Adamussium colbecki) shape marine communities. Thus, changes to their lifespan and growth could have far-reaching effects on other organisms. Sea ice is critical to polar marine ecosystem function, attenuating light and thereby affecting nutrient availability. Sea ice could therefore impact longevity and growth in polar bivalves unless temperature is the overriding factor. Here, we compare the longevity and growth of A. colbecki from two Antarctic sites: Explorers Cove and Bay of Sails, which differ by sea-ice cover, but share similar seawater temperatures, the coldest on Earth (-1.97°C). We hypothesize that scallops from the multiannual sea-ice site will have slower growth and greater longevity. We found maximum ages to be similar at both sites (18–19 years). Growth was slower, with higher inter-individual variability, under multiannual sea ice than under annual sea ice, which we attribute to patchier nutrient availability under multiannual sea ice. Contrary to expectations, A. colbecki growth, but not longevity, is affected by sea-ice duration when temperatures are comparable. Recent dramatic reductions in Antarctic sea ice and predicted temperature increases may irrevocably alter the life histories of this ecosystem engineer and other polar organisms.
The COVID-19 pandemic and subsequent state of public emergency have significantly affected older adults in Canada and worldwide. It is imperative that the gerontological response be efficient and effective. In this statement, the board members of the Canadian Association on Gerontology/L’Association canadienne de gérontologie (CAG/ACG) and the Canadian Journal on Aging/La revue canadienne du vieillissement (CJA/RCV) acknowledge the contributions of CAG/ACG members and CJA/RCV readers. We also profile the complex ways that COVID-19 is affecting older adults, from individual to population levels, and advocate for the adoption of multidisciplinary collaborative teams to bring together different perspectives, areas of expertise, and methods of evaluation in the COVID-19 response.
This paper investigates cost-share program attributes that would affect producers' willingness to enroll in a cost-share program to fund the adoption of best management practices to improve water quality and decrease water use. Through a survey administered to Florida agricultural producers, we conducted choice experiments to assess farmers' preferences for cost-share programs using five attributes: contracting agency, length of contract, annual verification process, costs included, and percent of costs covered. Results suggest that producers prefer cost-share programs with shorter contract lengths, self-monitoring, and administration by agricultural (as opposed to environmental) agencies. Our findings suggest that existing trust between local communities and contracting agencies is important for achieving higher enrollment rates in cost-share programs. Our results can inform policymakers on ways to increase enrollment rates in pursuit of long-term environmental goals.
Introduction: An increasing number of Canadian paramedic services are creating Community Paramedic programs targeting treatment of long-term care (LTC) patients on-site. We explored the characteristics, clinical course and disposition of LTC patients cared for by paramedics during an emergency call, and the possible impact of Community Paramedic programs. Methods: We completed a health records review of paramedic call reports and emergency department (ED) records between April 1, 2016 and March 31, 2017. We utilized paramedic dispatch data to identify emergency calls originating from LTC centers resulting in transport to one of the two EDs of the Ottawa Hospital. We excluded patients with absent vital signs, a Canadian Triage and Acuity Scale (CTAS) score of 1, and those whose transfer to hospital was deferrable or scheduled. We stratified the remaining cases by month and selected cases using a random number generator to meet our a priori sample size. We collected data using a piloted standardized form. We used descriptive statistics and categorized patients into groups based on the ED care received and whether that treatment fit within current paramedic medical directives. Results: The 381 included patients had a mean age of 82.5 years; 58.5% were female, 59.7% had hypertension, 52.6% had dementia and 52.1% had cardiovascular disease. On arrival at hospital, 57.7% of patients waited in offload delay for a median time of 45 minutes (IQR 33.5-78.0). We identified 4 groups: 1) patients requiring no treatment or diagnostics in the ED (7.9%); 2) patients receiving ED treatment within current paramedic medical directives and no diagnostics (3.2%); 3) patients requiring diagnostics or ED care outside current paramedic directives (54.9%); and 4) patients requiring admission (34.1%). Most patients were discharged from the ED (65.6%), and 1.1% died. The main ED diagnoses were infection (18.6%) and musculoskeletal injury (17.9%).
Of the patients who required ED care but were discharged, 64.1% required x-rays, 42.1% CT, and 3.4% ultrasound. ED care included intravenous fluids (35.7%), medication (67.5%), antibiotics (29.4%), non-opioid analgesics (29.4%) and opioids (20.7%). Overall, 11.1% of patients did not need management beyond current paramedic capabilities. Conclusion: Many LTC patients could receive care by paramedics on-site within current medical directives and avoid a transfer to the ED. This group could potentially grow using Community Paramedics with an expanded scope of practice.
Introduction: Emergency department (ED) crowding, long waits for care, and paramedic offload delay are of increasing concern. Older adults living in long-term care (LTC) are more likely to utilize the ED and are vulnerable to adverse events. We sought to identify existing programs that seek to avoid ED visits from LTC facilities in which allied health professionals are the primary providers of the intervention, and to evaluate their efficacy and safety. Methods: We completed this systematic review based on a protocol we published a priori and following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. We systematically searched Medline, CINAHL and EMBASE with terms relating to long-term care, emergency services, hospitalization and allied health personnel. Two investigators independently selected studies, extracted data using a piloted standardized form, and evaluated the risk of bias of included studies. We report a narrative synthesis grouped by intervention categories. Results: We reviewed 11,176 abstracts and included 22 studies. Most studies were observational and few assessed patient safety. We found five categories of interventions: 1) use of advanced practice nursing; 2) a program called Interventions to Reduce Acute Care Transfers (INTERACT); 3) end-of-life care; 4) condition-specific interventions; and 5) use of extended care paramedics. Of the 13 studies that reported ED visits, all (100%) reported a decrease, and of the 17 that reported hospitalization, 16 (94.1%) reported a decrease. Patient safety outcomes such as functional status and relapse were seldom reported (6/22), as were measures of emergency system function such as crowding or paramedics' inability to transfer care to the ED (1/22). Only 4/22 studies evaluated patient mortality, and 3 of those 4 found a non-statistically-significant worsening.
When measured, studies reported decreased hospital length of stay, more time spent with patients by allied health professionals and cost savings. Conclusion: We found five types of programs/interventions which all demonstrated a decrease in ED visits or hospitalization. Many identified programs focused on improved primary care for patients. Interventions addressing acute care issues such as those provided by community paramedics, patient preferences, and quality of life indicators all deserve more study.
Implementation of genome-scale sequencing in clinical care has significant challenges: the technology is highly dimensional with many kinds of potential results, results interpretation and delivery require expertise and coordination across multiple medical specialties, clinical utility may be uncertain, and there may be broader familial or societal implications beyond the individual participant. Transdisciplinary consortia and collaborative team science are well poised to address these challenges. However, understanding the complex web of organizational, institutional, physical, environmental, technologic, and other political and societal factors that influence the effectiveness of consortia is understudied. We describe our experience working in the Clinical Sequencing Evidence-Generating Research (CSER) consortium, a multi-institutional translational genomics consortium.
A key aspect of the CSER consortium was the juxtaposition of site-specific measures with the need to identify consensus measures related to clinical utility and to create a core set of harmonized measures. During this harmonization process, we sought to minimize participant burden, accommodate project-specific choices, and use validated measures that allow data sharing.
Identifying platforms to ensure swift communication between teams and to manage materials and data was essential to our harmonization efforts. Funding agencies can help consortia by clarifying key study design elements across projects during the proposal preparation phase and by providing a framework for sharing data across participating projects.
In summary, time and resources must be devoted to developing and implementing collaborative practices as preparatory work at the beginning of project timelines to improve the effectiveness of research consortia.
Having a diagnosis of schizophrenia is a risk factor for involuntary admission to psychiatric inpatient care, but we have a limited understanding of why some patients and not others require involuntary admission. We aimed to identify the predictors of involuntary admission in first episode schizophrenia. We used validated instruments to assess clinical and socio-demographic variables in all patients (n = 78) with first episode schizophrenia from a defined geographical area admitted to a Dublin psychiatric hospital over a 4-year period. Involuntary patients (n = 17) could not be distinguished from voluntary patients (n = 61) on the basis of age, gender, living status, marital status, drug abuse or duration of untreated psychosis. Neither positive nor negative symptoms were useful predictors of admission status. Lack of insight was a strong predictor of involuntary status.
The aim of this study was to identify the features of first episode schizophrenia that predict adherence to antipsychotic medication at six-month follow-up. We used validated instruments to assess clinical and socio-demographic variables in all patients with first episode schizophrenia from a defined geographical area admitted to a Dublin psychiatric hospital over a four-year period (N = 100). At six-month follow-up (N = 60) we assessed adherence to medication using the Compliance Interview. One third of patients with schizophrenia were non-adherent with medication within six months of their first episode of illness. High levels of positive symptoms at baseline, lack of insight at baseline, alcohol misuse at baseline and previous drug misuse predicted non-adherence. These results indicate that an identifiable subgroup of patients with first episode schizophrenia is at high risk of early non-adherence to medication. While high positive symptom scores pre-date and predict non-adherence in most patients, reduced insight is the best predictor of non-adherence in patients who do not misuse alcohol or other drugs.
Poor school connectedness (SC), defined as students' feelings of belonging, safety, and fairness at school, is a risk factor for negative psychosocial outcomes. Few studies have examined the specific relationship between SC and anxiety. This study examined the relation between SC and anxiety within a group of 114 clinically anxious youth (mean age = 10.82; SD = 2.93; 48.2% female; 70.2% White, non-Hispanic); age differences were also examined. Results indicated that SC was significantly negatively associated with age but unrelated to gender, race/ethnicity, socio-economic status, parent education, or presence of a comorbid disorder. Findings generally revealed that low SC was associated with greater total and domain-specific anxiety. SC may play a unique role in the maintenance of global and domain-specific anxiety symptoms.
This is a cross-sectional study aiming to understand the early characteristics and background of bone health impairment in clinically well children with Fontan circulation.
We enrolled 10 clinically well children with Fontan palliation (operated >5 years before study entrance, Tanner stage ≤3, age 12.1 ± 1.77 years, 7 males) and 11 healthy controls (age 12.0 ± 1.45 years, 9 males) at two children's hospitals. All participants underwent peripheral quantitative CT. For the Fontan group, we obtained clinical characteristics, NYHA class, cardiac index by MRI, dual-energy x-ray absorptiometry, and biochemical studies. Linear regression was used to compare radius and tibia peripheral quantitative CT measures between Fontan patients and controls.
All Fontan patients were clinically well (NYHA class 1 or 2, cardiac index 4.85 ± 1.51 L/min/m2) and without significant comorbidities. Adjusted trabecular bone mineral density, cortical thickness, and bone strength index at the radius were significantly decreased in Fontan patients compared to controls with mean differences −30.13 mg/cm3 (p = 0.041), −0.31 mm (p = 0.043), and −6.65 mg2/mm4 (p = 0.036), respectively. No differences were found for tibial measures. In Fontan patients, the mean height-adjusted lumbar bone mineral density and total body less head z scores were −0.46 ± 1.1 and −0.63 ± 1.1, respectively, which are below the average, but within normal range for age and sex.
In a clinically well Fontan cohort, we found significant bone deficits by peripheral quantitative CT in the radius but not the tibia, suggesting non-weight-bearing bones may be more vulnerable to the unique haemodynamics of the Fontan circulation.
Following an outbreak of highly pathogenic avian influenza virus (HPAIV) in a poultry house, control measures are put in place to prevent further spread. An essential part of the control measures based on the European Commission Avian Influenza Directive 2005/94/EC is the cleansing and disinfection (C&D) of infected premises. Cleansing and disinfection includes both preliminary and secondary C&D; the dismantling of complex equipment during secondary C&D is also required, which is costly to the owner and delays the secondary cleansing process, thereby increasing the risk of onward spread. In this study, a quantitative risk assessment is presented to assess the risk of re-infection (recrudescence) occurring in an enriched colony-caged layer poultry house on restocking with chickens after different C&D scenarios. The risk is expressed as the number of restocked poultry houses expected before recrudescence occurs. Three C&D scenarios were considered, namely (i) preliminary C&D alone, (ii) preliminary C&D plus secondary C&D without dismantling and (iii) preliminary C&D plus secondary C&D with dismantling. The source-pathway-receptor framework was used to construct the model, and parameterisation was based on the three C&D scenarios. Two key operational variables in the model are (i) the time between depopulation of infected birds and restocking with new birds (TbDR) and (ii) the proportion of infected material that bypasses C&D, enabling virus to survive the process. Probability distributions were used to describe these two parameters, capturing recognised between-premises variability in TbDR and uncertainty, due to lack of information, in the fraction of bypass. The risk assessment estimates that the median (95% credible intervals) number of repopulated poultry houses before recrudescence is 1.2 × 10⁴ (50 to 2.8 × 10⁶), 1.9 × 10⁵ (780 to 5.7 × 10⁷) and 1.1 × 10⁶ (4.2 × 10³ to 2.9 × 10⁸) under C&D scenarios (i), (ii) and (iii), respectively.
Thus, for HPAIV in caged layers, undertaking secondary C&D without dismantling reduces the risk by 16-fold compared to preliminary C&D alone. Dismantling has an additional, although smaller, impact, reducing the risk by a further 6-fold, and thus around 90-fold compared to preliminary C&D alone. On the basis of the 95% credible intervals, the model demonstrates the importance of secondary C&D (with or without dismantling) over preliminary C&D alone. However, the extra protection afforded by dismantling may not be cost-beneficial in the context of the reduced risk of onward spread.
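The risk metric described in this abstract — the number of restocked houses expected before recrudescence, summarised as a median with 95% credible intervals — can be illustrated with a minimal Monte Carlo sketch. This is not the study's source-pathway-receptor model; the distribution and parameter values below are hypothetical placeholders chosen only to show how uncertainty in a per-restock recrudescence probability propagates to a median and credible interval for the risk metric.

```python
import random
import statistics

def simulate_houses_before_recrudescence(n_draws=10_000, seed=42):
    """Illustrative sketch: draw an uncertain per-restock recrudescence
    probability p, then report the expected number of restocked houses
    before recrudescence (1/p, the mean of a geometric distribution)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_draws):
        # Hypothetical uncertainty distribution for the probability that
        # infectious material bypasses C&D and causes recrudescence on
        # restocking (placeholder values, NOT the study's parameters):
        log10_p = rng.gauss(-4.0, 0.7)
        p = 10 ** log10_p
        # Given p, restocks before the first recrudescence are
        # geometric with mean 1/p.
        results.append(1 / p)
    results.sort()
    median = statistics.median(results)
    lo = results[int(0.025 * n_draws)]   # 2.5th percentile
    hi = results[int(0.975 * n_draws)]   # 97.5th percentile
    return median, (lo, hi)

median, ci = simulate_houses_before_recrudescence()
```

Under this sketch, the output has the same shape as the abstract's results (a median flanked by a wide 95% credible interval spanning several orders of magnitude), which is the hallmark of propagating log-scale uncertainty in the bypass fraction through the model.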
Capitalist development has always, and everywhere, been characterized by the restless mobility of both capital and labour. While these two forms of mobility are fundamentally related, it is unusual to combine the study of both or seek connections between them. In an effort to make these connections more than three decades ago, Saskia Sassen commented that the two processes of capital and labour mobility ‘have been constructed into unrelated categories’ (1988: 12). This assessment still largely holds true. The objective of this book is to explore the links between these forms of mobility with a particular focus on Asia.
While the imperative to be mobile is well established as a systemic feature of capital, it is usually studied through frameworks that try to understand the behaviour of firms, conglomerates, production networks, or investors. An extensive body of literature addresses corporate structures and strategies of capital accumulation. For example, in the field of international business, attention has traditionally focused on the mobility of capital, primarily through foreign direct investment (FDI) (for example, Dunning, 1988). The underlying assumption is the immobility of labour. The multinational corporation, with its proprietary capital and know-how (ownership advantage) and governance within a hierarchical organization (internalization advantage), facilitates the mobility of capital in order to take advantage of location-bound factors of production (including labour). Other approaches have addressed the networks and supply chains in which firms are situated. There have been, for example, significant efforts at understanding the spatial structures of production through the lenses of global commodity chains and global production networks. These bodies of literature point out that significant levels of spatial flexibility and mobility in production capital have been created through non-ownership modes of control such as subcontracting (Gereffi and Korzeniewicz, 1994; Coe and Yeung, 2015). Complementing this work are studies that focus on corporate international expansion trajectories and governance structures to manage globally dispersed investments (for example, Cuervo-Cazurra and Ramamurti, 2014; Ramamurti and Singh, 2009). Labour seldom features centrally in such accounts, except as an in situ characteristic of a particular place, valued for its skills, affordability, or docility. 
At the human scale, it is usually the investor or manager who is assumed to be mobile, but mostly it is the spatial configuration of capital itself (through FDI, corporate structures, commodity trade, debt, and so on) that receives attention.