Scientific endeavors are increasingly carried out by teams of scientists. While there is growing literature on factors associated with effective science teams, little is known about processes that facilitate the success of dissemination and implementation (D&I) teams studying the uptake of healthcare innovations. This study aimed to identify strategies used by D&I scientists to promote team science.
Using a nominal group technique, a sample of 27 D&I scholars responded to the question, “What strategies have you or others used to promote team science?” Participants were asked to individually respond and then discuss within a small group to determine the group’s top three strategies. Through a facilitated consensus discussion with the full sample, a rank-ordered list of three strategies was determined.
A total of 126 individual responses (M = 9; SD = 4.88) were submitted. Through small group discussion, six groups ranked their top three strategies to promote team science. The final ranked list of strategies determined by the full sample included: (1) developing and maintaining clear expectations, (2) promoting and modeling effective communication, and (3) establishing shared goals and a mission of the work to be accomplished.
Because of its goal of translating knowledge to practice, D&I research necessitates the use of team science. The top strategies are in line with those found to be effective for teams in other fields and hold promise for improving D&I team cohesion and innovation, which may ultimately accelerate the translation of health innovations and the improvement of care quality and outcomes.
Objectives: This study aimed to evaluate the influence of lower limb loss (LL) on mental workload by assessing neurocognitive measures in individuals with unilateral transtibial (TT) LL versus those with transfemoral (TF) LL during dual-task walking under varying cognitive demand. Methods: Electroencephalography (EEG) was recorded as participants performed a task of varying cognitive demand while seated or walking (i.e., varying physical demand). Results: Both groups (TT LL vs. TF LL) exhibited a similar EEG theta synchrony response as either the cognitive or the physical demand increased. While individuals with TT LL maintained similar performance on the cognitive task across seated and walking conditions, those with TF LL exhibited performance decrements (slower response times) on the cognitive task during the walking condition compared with the seated condition. Furthermore, those with TF LL exhibited neither regional differences in EEG low-alpha power while walking nor EEG high-alpha desynchrony as a function of cognitive task difficulty while walking. This lack of alpha modulation coincided with no elevation of theta/alpha ratio power as a function of cognitive task difficulty in the TF LL group. Conclusions: This work suggests that the two groups share some neurocognitive features during dual-task walking but differ in others. Although all participants recruited neural mechanisms critical for maintaining cognitive-motor performance under elevated cognitive or physical demands, the observed differences indicate that walking with a prosthesis while concurrently performing a cognitive task imposes additional cognitive demand on individuals with more proximal levels of amputation.
Vascular surgery patients are nutritionally vulnerable. Various malnutrition screening and assessment tools are available; however, none has been developed or validated in vascular patients. The present study aimed to: (1) investigate the validity of four commonly administered malnutrition screening tools (the Malnutrition Screening Tool (MST), the Malnutrition Universal Screening Tool (MUST), the Nutrition Risk Screen-2002 (NRS-2002) and the Mini-Nutritional Assessment – Short Form (MNA-SF)) and an assessment tool (the Patient-Generated Subjective Global Assessment (PG-SGA)) compared against a comprehensive dietitian’s assessment and (2) evaluate the ability of the instruments to predict outcomes. Vascular inpatients were screened using the four malnutrition screening tools and assessed using the PG-SGA. Each patient was also assessed by a dietitian using a comprehensive assessment incorporating nutritional biochemistry, anthropometry and changes in dietary intake. Diagnostic accuracy, consistency and predictive ability were determined. A total of 322 (69·3 % male) patients participated, with 75 % having at least one parameter indicating nutritional deficits. No instrument achieved the a priori levels for sensitivity (14·9–52·5 %). No tool predicted the EuroQoL 5-dimension 5-level score. All tools except the MNA-SF were associated with length of stay (LOS); however, the direction varied: increased risk of malnutrition on the MUST and NRS-2002 was associated with shorter LOS (P=0·029 and 0·045), and on the MST and PG-SGA with longer LOS (P=0·005 and <0·001). The NRS-2002 was associated with increased risk of complications (P=0·039). The MST, NRS-2002 and PG-SGA were predictive of discharge to an institution (P=0·004, 0·005 and 0·003). The tools studied were unable to identify the high prevalence of undernutrition; hence, vascular disease-specific screening and/or assessment tools are warranted.
Species distribution models (SDMs) are statistical tools used to develop continuous predictions of species occurrence. ‘Integrated SDMs’ (ISDMs) are an elaboration of this approach with potential advantages that allow for the dual use of opportunistically collected presence-only data and site-occupancy data from planned surveys. These models also account for survey bias and imperfect detection through the use of a hierarchical modelling framework that separately estimates the species–environment response and detection process. This is particularly helpful for conservation applications and predictions for rare species, where data are often limited and prediction errors may have significant management consequences. Despite this potential importance, ISDMs remain largely untested under a variety of scenarios. We performed an exploration of key modelling decisions and assumptions on an ISDM using the endangered Baird’s tapir (Tapirus bairdii) as a test species. We found that site area had the strongest effect on the magnitude of population estimates and underlying intensity surface and was driven by estimates of model intercepts. Selecting a site area that accounted for the individual movements of the species within an average home range led to population estimates that coincided with expert estimates. ISDMs that do not account for the individual movements of species will likely lead to less accurate estimates of species intensity (number of individuals per unit area) and thus overall population estimates. This bias could be severe and highly detrimental to conservation actions if uninformed ISDMs are used to estimate global populations of threatened and data-deficient species, particularly those that lack natural history and movement information. 
However, the ISDM was consistently the most accurate model compared to other approaches, which demonstrates the importance of this new modelling framework and the ability to combine opportunistic data with systematic survey data. Thus, we recommend researchers use ISDMs with conservative movement information when estimating population sizes of rare and data-deficient species. ISDMs could be improved by using a similar parameterization to spatial capture–recapture models that explicitly incorporate animal movement as a model parameter, which would further remove the need for spatial subsampling prior to implementation.
Atrial fibrillation (AFIB) with rapid ventricular response (RVR) is a common tachydysrhythmia encountered by Emergency Medical Services (EMS). Current guidelines suggest rate control in stable, symptomatic patients.
Little is known about the safety or efficacy of rate-controlling medications given by prehospital providers. This study assessed a protocol for prehospital administration of diltiazem in the setting of AFIB with RVR for provider protocol compliance, patient clinical improvement, and associated adverse events.
This was a retrospective cohort study of patients who were administered diltiazem by providers in the Orange County EMS System (Florida USA) over a two-year period. The protocol directed a 0.25mg/kg dose of diltiazem (maximum of 20mg) for stable, symptomatic patients in AFIB with RVR at a rate of >150 beats per minute (bpm) with a narrow complex. Data collected included patient characteristics, vital signs, electrocardiogram (ECG) rhythm before and after diltiazem, and need for rescue or additional medications. Adverse events were defined as systolic blood pressure <90mmHg or administration of intravenous fluid after diltiazem administration. Clinical improvement was defined as a heart rate decrease of 20% or more, or a heart rate below 100bpm. Original prehospital ECG rhythm interpretations were compared to physician interpretations performed retrospectively.
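The protocol's two quantitative rules — the weight-based dose with a cap, and the improvement criterion — can be expressed as a short sketch (the function names are illustrative, not from the study):

```python
def diltiazem_dose_mg(weight_kg, mg_per_kg=0.25, max_mg=20.0):
    """Weight-based dose, capped at the protocol maximum of 20 mg."""
    return min(weight_kg * mg_per_kg, max_mg)

def clinically_improved(hr_before_bpm, hr_after_bpm):
    """Improvement: heart rate fell by at least 20%, or to below 100 bpm."""
    return hr_after_bpm <= 0.8 * hr_before_bpm or hr_after_bpm < 100
```

For example, a 60 kg patient would receive 15 mg, while any patient over 80 kg hits the 20 mg cap.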
Over the study period, 197 patients received diltiazem, with 131 adhering to the protocol. The initial rhythm was AFIB with RVR in 93% of the patients (five percent atrial flutter, two percent supraventricular tachycardia, and one percent sinus tachycardia). The agreement between prehospital and physician rhythm interpretation was 92%, with a Kappa value of 0.454 (P <.001). Overall, there were 22 (11%) adverse events, and 112 (57%) patients showed clinical improvement. When diltiazem was given outside of the existing protocol, the patients had higher rates of adverse events (18% versus eight percent; P = .033). Patients who received diltiazem in adherence with protocols were more likely to show clinical improvement (63% versus 46%; P = .031).
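The agreement statistic quoted above is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch with illustrative rhythm labels (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over paired labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of exact agreements.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater labelled independently
    # according to their own marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)
```

A kappa near 0 means agreement no better than chance; values in the 0.4–0.6 range (like the 0.454 reported) are conventionally read as moderate agreement.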
This study suggests that prehospital diltiazem administration for AFIB with RVR is safe and effective when strict protocols are followed.
Rodriguez A, Hunter CL, Premuroso C, Silvestri S, Stone A, Miller S, Zuver C, Papa L. Safety and efficacy of prehospital diltiazem for atrial fibrillation with rapid ventricular response. Prehosp Disaster Med. 2019;34(3):297–302.
We compared elastic moduli in polar firn derived from diving wave refraction seismic velocity analysis, firn-core density measurements and microstructure modelling based on firn-core data. The seismic data were obtained with a small electrodynamic vibrator source near Kohnen Station, East Antarctica. The analysis of diving waves resulted in velocity–depth profiles for different wave types (P-, SH- and SV-waves). Dynamic elastic moduli of firn were derived by combining P- and S-wave velocities and densities obtained from firn-core measurements. The structural finite-element method (FEM) was used to calculate the components of the elastic tensor from firn microstructure derived from X-ray tomography of firn-core samples at depths of 10, 42, 71 and 99 m, providing static elastic moduli. Shear and bulk moduli range from 0.39 to 2.42 GPa and 0.68 to 2.42 GPa, respectively. The elastic moduli from seismic observations and the structural FEM agree within 8.5% for the deepest achieved values at a depth of 71 m, and are within the uncertainty range. Our observations demonstrate that the elastic moduli of the firn can be consistently obtained from two independent methods which are based on dynamic (seismic) and static (tomography and FEM) observations, respectively, for deeper layers in the firn below ~10 m depth.
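The dynamic moduli described above follow from the standard isotropic elastic relations linking seismic velocities and density: the shear modulus is mu = rho * Vs^2 and the bulk modulus is K = rho * (Vp^2 - (4/3) * Vs^2). A minimal sketch (the input values below are illustrative, not the measured firn data):

```python
def dynamic_moduli(vp, vs, rho):
    """Dynamic shear and bulk moduli (Pa) of an isotropic elastic medium
    from P-wave velocity vp (m/s), S-wave velocity vs (m/s) and
    density rho (kg/m^3):
        mu = rho * vs**2
        K  = rho * (vp**2 - (4/3) * vs**2)
    """
    mu = rho * vs**2
    bulk = rho * (vp**2 - 4.0 * vs**2 / 3.0)
    return mu, bulk
```

Because firn density increases with depth while velocities also rise, both moduli grow rapidly through the firn column, consistent with the ranges reported above.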
The impacts of dementia-related stressors and strains have been examined for their potential to threaten the well-being of either the person with dementia or the family care partner, but studies have rarely considered the dyadic nature of well-being in dementia. The purpose of this study was to examine the dyadic effects of multiple dimensions of strain on the well-being of dementia care dyads.
Using multilevel modeling to account for the inter-relatedness of individual well-being within dementia care dyads, we examined cross-sectional responses collected from 42 dyads comprised of a hospitalized patient diagnosed with a primary progressive dementia (PWD) and their family care partner (CP). Both PWDs and CPs self-reported on their own well-being using measures of quality of life (QOL-Alzheimer’s Disease scale) and depressive symptoms (Center for Epidemiological Studies Depression Scale).
In adjusted models, the PWD’s well-being (higher QOL and lower depressive symptoms) was associated with significantly less strain in the dyad’s relationship. The CP’s well-being was associated with significantly less care-related strain and (for the QOL scale) less relationship strain.
Understanding the impact of dementia on the well-being of PWDs or CPs may require an assessment of both members of the dementia care dyad in order to gain a complete picture of how dementia-related stressors and strains impact individual well-being. These results underscore the need to assess and manage dementia-related strain as a multi-dimensional construct that may include strain related to the progression of the disease, strain from providing care, and strain on the dyad’s relationship quality.
Informal (unpaid) care-givers of older people with dementia experience stress and isolation, causing physical and psychiatric morbidity. Comprehensive geriatric assessment clinics represent an important geriatrician-led model of dementia care. Our qualitative study examined the educational and support needs of care-givers of people diagnosed with dementia at a geriatric assessment clinic, the resources used to address those needs and the challenges experienced in doing so. We conducted structured thematic analysis of interviews with 18 informal care-givers. Participants’ narratives reflected four themes. First, care-givers sought information from varied sources, including the Alzheimer Society, the internet and clinic staff. Responsive behaviours, the expected progression of dementia and system navigation were topics of particular interest. Second, care-givers obtained assistance from public, for-profit and voluntary sources. Third, care-givers received little assistance. Two-thirds received fewer than four hours of help weekly from all sources combined, and none received more than 15; several received no assistance whatsoever. Publicly funded support workers’ tasks, and their timing, were often unhelpful. Finally, while numerous care-givers felt physical and emotional strain, and worried about how poor health impaired their care-giving, many hesitated to seek help. The needs of this unique population of informal care-givers can be met by improved home-care service flexibility, and by access to trustworthy information about the expected progression of dementia and skills for managing behavioural and psychological symptoms.
Infants with prenatally diagnosed CHD are at high risk for adverse outcomes owing to multiple physiologic and psychosocial factors. Lack of immediate physical postnatal contact because of rapid initiation of medical therapy impairs maternal–infant bonding. On the basis of expected physiology, maternal–infant bonding may be safe for select cardiac diagnoses.
This is a single-centre study to assess safety of maternal–infant bonding in prenatal CHD.
In total, 157 fetuses with prenatally diagnosed CHD were reviewed. On the basis of cardiac diagnosis, 91 fetuses (58%) were prenatally approved for bonding and successfully bonded, 38 fetuses (24%) were prenatally approved but deemed not suitable for bonding at delivery, and 28 (18%) were not prenatally approved to bond. There were no complications attributable to bonding. Infants who successfully bonded were heavier (3.26 versus 2.6 kg, p<0.001) and delivered at later gestation (39 versus 38 weeks, p<0.001). Those unsuccessful at bonding were more likely to have been delivered via Caesarean section (74 versus 49%, p=0.011) and to have additional non-cardiac diagnoses (53 versus 29%, p=0.014). There was no significant difference regarding the need for cardiac intervention before hospital discharge. Infants who bonded had shorter hospital (7 versus 26 days, p=0.02) and ICU lengths of stay (5 versus 23 days, p=0.002) and higher survival (98 versus 76%, p<0.001).
Fetal echocardiography combined with a structured bonding programme can permit mothers and infants with select types of CHD to successfully bond before ICU admission and intervention.
Good education requires student experiences that deliver lessons about practice as well as theory and that encourage students to work for the public good—especially in the operation of democratic institutions (Dewey 1923; Dewey 1938). We report on an evaluation of the pedagogical value of a research project involving 23 colleges and universities across the country. Faculty trained and supervised students who observed polling places in the 2016 General Election. Our findings indicate that this was a valuable learning experience in both the short and long terms. Students found their experiences to be valuable and reported learning both generally and specifically in relation to course material. Postelection, they also felt more knowledgeable about election science topics, voting behavior, and research methods. Students reported interest in participating in similar research in the future, would recommend that other students do so, and expressed interest in further learning and research about the topics central to their experience. Our results suggest that participants appreciated the importance of elections and their study. Collectively, the participating students are engaged and efficacious—essential qualities of citizens in a democracy.
In truth, apart from 1919 and 1920, when order books were full to replace vessels lost in the war, the period up to 1935 was mostly a struggle for the private shipbuilding industry. The experience of the naval race with Germany before the Great War had created a set of circumstances that could not last forever and certainly could not be matched in the post-war world. The Scottish shipyards on the River Clyde alone launched one-quarter of the world's tonnage in 1913, a figure which owed much to its strong warship sector. Thus, after the war, the Royal Navy possessed a large, young, and expensive fleet, but faced no obvious enemies following the collapse of the Imperial German Navy. Moreover, the British public had been shocked by the horrors of the Great War and now demanded that elected representatives turn their attention to arms limitation and international treaties to ensure there would be no repeat. As such, continuing the high levels of expenditure on the Royal Navy was not a priority for successive British (or other) governments. Industry was left to face the 1920s with only a fraction of the orders it had previously enjoyed.
In private, the government's thinking was guided by the so-called “Ten Year Rule.” Adopted secretly in 1919, the rule assumed that because no major conflict was likely within the next decade there would be no need for a major construction programme. This was renewed annually until 1932, when it was revoked following events in the Far East. But within just five years of its repeal, events in China and later in Germany and Italy set in motion the largest armaments drive in British history, an enterprise in which the Royal Navy played a major part. The problems by 1937 were very different for the British government and industry. After a period of uncertainty when the government feared the political and financial consequences of rearmament, it soon found that it could not move quickly enough: shortages of skills and plant were holding up the effort to put the country's defences in a state of readiness, while severe bottlenecks – large guns were a problem, armour plate another – persisted.
The three politically tumultuous years that followed the formation of the National Government in 1931 fundamentally transformed the nature of defence planning and the relationship between government and industry. Between 1922 and 1931, the debate between Admiralty and Treasury had been comprehensively won by the latter's view on spending limitations. The Committee of Imperial Defence (CID) and its subcommittees had not been called upon to work on pressing matters of national security, but spurred by the changes in the political situation after the Japanese invasion of Manchuria, it adopted a far more central planning role and contributed a great deal towards understanding British defensive deficiencies by 1934. Thus, it was the CID, comprised of all three fighting services, and not just the Admiralty, which became the vehicle for articulating Britain's defence needs to the Cabinet.
This resulted in industrialists playing a key role in advising, and then shaping, defence policy for the first time. On the other hand, by 1934 the Admiralty also began competing more intensely with the developing Air Force for a share of the defence budget. Thus, while the narrative to this point has stressed the private industry's existence outside of the state planning framework and consequently has focussed on industry's responses to the crises in the 1920s and the Admiralty-Treasury disputes, from 1931 onwards the increasing importance of the CID warrants a more detailed examination of the process that gave new meaning to defence planning and that subsequently led to approaching industrialists for assistance.
Overview: Constraints and Pressures
Between Washington and Manchuria, support for the navy had been occasionally vocal but rarely consistent. Between 1918 and 1931, Winston Churchill argued in favour of increased naval expenditure almost as often as he argued against it, first supporting expansion of the fleet before slashing naval estimates and finally implementing the Ten Year Rule in perpetuity. This sort of behaviour was not unique to him: past, present and future Prime Ministers Stanley Baldwin, David Lloyd George and Ramsay MacDonald had periods on both sides of the divide, at times supporting the centrality of the navy to British defence, at others being angrily described by Admiralty figures as “dangerously imperilling everything for which the Royal Navy stands.”
Naval arms manufacturers were in a depressing position at the beginning of 1926, and this suffering prompted the radical action that followed. It was against the backdrop of the cruiser crisis and the naval treaties in the mid-1920s that the WSBC was formed. The committee grew out of other unsuccessful attempts to assist private industry in 1925 and 1926 which drove shipbuilding firms towards collaboration rather than competition. Unable to count on the Admiralty's ability to win increases from the Treasury or Cabinet in the medium term, the individual yards soon realised that the process of competitive tendering was unsustainable and could even drive most of them out of business.
By 1926, it was time for action. The situation was becoming acute: the Coventry Ordnance Works had closed altogether, while Beardmore, Scott and Yarrow had completely run out of profitable work, naval or otherwise, and Palmer and others had fared little better. Sir Alexander Kennedy, the Chairman of Fairfield, summed up the mood of firms in a similar position to his own when he noted despondently that “today private firms [find] themselves burdened with resources and equipment capable of meeting naval requirements far beyond any programme that might for some years to come – if not for ever – likely to be laid down.” His words could have come as easily from the First Lord, who after all was providing the hard evidence which supported Beatty's and D'Eyncourt's worst fears.
Admiralty Responses II: Bending Rules
The Admiralty for its part had long been a champion of state funding to provide a minimum level of orders to ensure firms’ survival, even if this would still be a long way from providing an opportunity for the private industry to maintain the world primacy it had enjoyed before Washington. On top of the heated discussions with the Treasury, the Admiralty had also devised several schemes to preserve capacity of key items where few alternative sources of supply existed, but these plans were often poorly conceived and for the most part did not work well.
It has been asserted elsewhere that the Admiralty's concerns about naval gun capacity forced them to give Vickers a “virtual monopoly of contracts” for the rest of the 1920s.