Discovered at the beginning of the twentieth century, the Abri Casserole (Dordogne, France) was the subject of salvage excavations in the early 1990s. The fieldwork revealed a sequence of 13 archaeological levels that document human occupations from the Gravettian to the Magdalenian, including very rare and poorly known assemblages (e.g. Early Badegoulian, Protosolutrean) that give this sequence particular importance. Results of a previous dating program focused on the Badegoulian levels were obtained in 1994 but were neither extensively published nor discussed. Five AMS ¹⁴C ages obtained for the Gravettian and Solutrean assemblages in the early 2010s served to complement the site’s chronology. However, since the beta-counting ages for the Badegoulian levels conflicted with the accepted AMS chronology for the region’s late Pleniglacial archaeological record, a new AMS dating program was implemented to renew the radiometric framework for this portion of the sequence. Compared to the previous beta-counting measurements, the seven newly obtained AMS ages are about 1000 years older (23.3–20.5 cal ka BP) and congruent with other AMS-dated Badegoulian sequences. These results thereby restore the inter-site chronological coherence of the Solutrean–Badegoulian and Badegoulian–Magdalenian transitions.
Smoking was one of the biggest preventable killers of the 20th century, and it continues to cause the death of millions across the globe. The rapid growth of the e-cigarette market in the last 10 years, and the claims that e-cigarettes are a safer form of smoking and can help with smoking cessation, have led to questions about their possible impact on society, the health of the population and the insurance industry. Recent media attention around the possible health implications of e-cigarette use has also ensured that this topic remains in the public eye. The e-cigarette working party was initiated by the Institute and Faculty of Actuaries’ Health and Care Research Sub-Committee in July 2016, with the primary objective of understanding the impact of e-cigarettes on life and health insurance. In this paper, we have looked at all areas of e-cigarette usage and how it relates to insurance in the UK market. In particular, we have covered the potential risks and benefits of switching to e-cigarettes, the results of studies that have been published, the potential impact on underwriting and claims processes, and the potential impact on pricing (based on what modelling is possible with the data available). Research in this area is still in its infancy and data are not yet mature, which makes predicting the long-term impact of e-cigarette smoking extremely challenging. For example, there are no studies that directly measure the mortality or morbidity impact of long-term e-cigarette use, so we have had to consider studies of more immediate health impacts, or more simply compare the constituents of e-cigarette output with those of cigarette smoke. The data issue is further compounded by the fact that the findings of studies and the advice of national health authorities often conflict.
For example, while National Health Service England has publicly stated that it supports the growth of e-cigarette usage as an aid to reduce traditional smoking behaviour, the US Food and Drug Administration has been much more vocal in highlighting the perceived dangers of this new form of smoking. Users’ behaviour also adds complexity, as dual use (using both e-cigarettes and cigarettes) is seen in a high percentage of users and relapse rates back to cigarette smoking are currently unknown. Having talked to a number of experts in the field, we have discovered that there is certainly no common view on risk: we have heard from experts who have significant concerns, and from experts who believe that e-cigarettes are far safer than tobacco. We have purposefully considered conflicting evidence and have consulted with various parties so we can present differing points of view, thereby ensuring a balanced, unbiased and fair picture of our findings. The evidence we have reviewed does suggest that e-cigarettes are a safer alternative to traditional smoking, but not as safe as not smoking at all. There are no large, peer-reviewed, long-term studies yet available to understand the true impact of a switch to e-cigarette use, so currently we are unable to say where e-cigarette use lies on the risk spectrum between cigarette smoking and lifetime non-smoking. We do not yet understand whether the benefits seen in the studies completed so far will reduce risk in the long term or whether other health risks will come to light following more prolonged use and study. This, coupled with concerns about the high proportion of dual use of cigarettes and e-cigarettes, relapse rates and the recent growth in medical problems linked with e-cigarette use, means that we need to wait for experience to emerge fully before firm conclusions can be drawn.
Although we have presented a view, it is vitally important that our industry continues to monitor developments in this area and fully considers what next steps and future actions may be required to ensure our position reflects the potential benefits and risks that e-cigarette use may bring. We feel that the time is right for a body such as the IFoA to analyse the feasibility of collecting the necessary data through the Continuous Mortality Investigation that would allow us to better analyse the experience that is emerging.
Current adolescent substance use risk models have inadequately predicted use for African Americans, offering limited knowledge about differential predictability as a function of developmental period. Among a sample of 500 African American youth (ages 11–21), four risk indices (i.e., social risk, attitudinal risk, intrapersonal risk, and racial discrimination risk) were examined in the prediction of alcohol, marijuana, and cigarette initiation during early (ages 11–13), mid (ages 16–18), and late (ages 19–21) adolescence. Results showed that when developmental periods were combined, racial discrimination was the only index that predicted initiation for all three substances. However, when risk models were stratified based on developmental period, variation was found within and across substance types. Results highlight the importance of racial discrimination in understanding substance use initiation among African American youth and the need for tailored interventions based on developmental stage.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
To test the feasibility of using telehealth to support antimicrobial stewardship at Veterans Affairs medical centers (VAMCs) that have limited access to infectious disease-trained specialists.
A prospective quasi-experimental pilot study.
Two rural VAMCs with acute-care and long-term care units.
At each intervention site, medical providers, pharmacists, infection preventionists, staff nurses, and off-site infectious disease physicians formed a videoconference antimicrobial stewardship team (VAST) that met weekly to discuss cases and antimicrobial stewardship-related education.
Descriptive measures included fidelity of implementation, number of cases discussed, infectious syndromes, types of recommendations, and acceptance rate of recommendations made by the VAST. Qualitative results stemmed from semi-structured interviews with VAST participants at the intervention sites.
Each site adapted the VAST to suit their local needs. On average, sites A and B discussed 3.5 and 3.1 cases per session, respectively. At site A, 98 of 140 cases (70%) were from the acute-care units; at site B, 59 of 119 cases (50%) were from the acute-care units. The most common clinical syndrome discussed was pneumonia or respiratory syndrome (41% and 35% for sites A and B, respectively). Providers implemented most VAST recommendations, with an acceptance rate of 73% (186 of 256 recommendations) and 65% (99 of 153 recommendations) at sites A and B, respectively. Qualitative results based on 24 interviews revealed that participants valued the multidisciplinary aspects of the VAST sessions and felt that it improved their antimicrobial stewardship efforts and patient care.
This pilot study has successfully demonstrated the feasibility of using telehealth to support antimicrobial stewardship at rural VAMCs with limited access to local infectious disease expertise.
Is the shortest path from A to B the straight line between them? Your first response might be to think it's obviously so. But in fact you know that it's not quite that straightforward. Your sat-nav knows it's not that straightforward. It asks whether you would like it to find the shortest route or the fastest route, because finding the best path depends on knowing what exactly you mean by ‘long’. Likewise, if you're on a walk in the mountains, there's a good chance you'd rather follow the path around the head of the valley, rather than heading down the steep slope and up the other side.
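The sat-nav distinction above — shortest versus fastest — can be made concrete with a toy graph search: the same algorithm returns different routes depending on which edge weight it is asked to minimise. A minimal sketch using Dijkstra's algorithm; the road network, distances and travel times here are invented for illustration.

```python
import heapq

def dijkstra(graph, start, goal, weight):
    """Shortest path from start to goal, minimising the given edge attribute.

    graph: {node: [(neighbour, {"km": ..., "minutes": ...}), ...]}
    weight: which attribute to minimise ("km" or "minutes")
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, attrs in graph[node]:
            nd = d + attrs[weight]
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Reconstruct the path by walking predecessors back from the goal.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

# A motorway detour (A -> C -> B) is longer in km but faster in minutes
# than the direct country road (A -> B).
graph = {
    "A": [("B", {"km": 10, "minutes": 30}), ("C", {"km": 8, "minutes": 8})],
    "B": [],
    "C": [("B", {"km": 7, "minutes": 7})],
}

print(dijkstra(graph, "A", "B", weight="km"))       # direct road wins on distance
print(dijkstra(graph, "A", "B", weight="minutes"))  # detour wins on time
```

The algorithm is unchanged between the two calls; only the notion of "long" differs, which is exactly the point being made above.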
The same sorts of considerations apply in mathematical worlds. I use the mountainside image because it is my preferred way of thinking of a Riemannian metric. Pick an abstract surface S. A Riemannian metric on S gives a well-behaved distance function. By force of habit I tend to picture S as sitting somehow within the physical world. Probably, I'm looking at it from the outside. But if I change viewpoint, so that I am walking around on S, I can picture how the topography affects the idea of the ‘shortest path’.
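The "well-behaved distance function" can be written down explicitly: the metric assigns a length to every smooth curve on S, and the distance between two points is the infimum of those lengths. A standard formulation (not spelled out in the passage above):

```latex
% Length of a smooth curve \gamma : [0,1] \to S under a Riemannian metric g
L(\gamma) = \int_0^1 \sqrt{\, g_{\gamma(t)}\bigl(\dot{\gamma}(t), \dot{\gamma}(t)\bigr) \,}\; dt

% Distance between p, q \in S: infimum over all curves joining them
d(p, q) = \inf \bigl\{\, L(\gamma) : \gamma(0) = p,\ \gamma(1) = q \,\bigr\}
```

On the mountainside picture, g encodes how "steep" each direction is at each point, so a path around the head of the valley can genuinely be shorter, in this metric, than the straight-looking plunge across it.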
In the foothills of the Cumberland Mountains, in central Appalachia (a region that spans 13 states in the US), sits an economically distressed and rural community of the United States. Once a thriving coal-mining area, this region now is reported as one of the hardest places to live in the US. Southeastern Kentucky, located in a remote, rocky, mountainous area surrounded by rivers and valleys and prone to flooding, experienced a major flood in Spring 2013 causing significant damage to homes and critical infrastructure.
Aims of the study were to: (1) identify and better understand the contextual variables compounding the impact of a disaster event that occurred in Spring 2013; (2) identify ways participants managed antecedent circumstances, risk, and protective factors to cope with disaster up to 12 months post-event; and (3) further determine implications for community-focused interventions that may enhance recovery for vulnerable populations to promote greater outcomes of adaptation, wellness, and readiness.
Using an ethnographic mixed-methods approach, an inter-collaborative team conducted face-to-face interviews with Appalachian residents (N = 12) about their disaster experience, documented observations and visual assessments of need on an observation tool, and used photography depicting structural and environmental conditions. A Health and Emergency Preparedness Assessment Survey Tool was used to collect demographic, health, housing, environment, and disaster-readiness assessment data. Community stakeholders facilitated purposeful sampling through coordination of scheduled home visits.
Triangulation of all data sources provided evidence that the community had unique coping strategies related to faith and spirituality, cultural values and heritage, and social support to manage antecedent circumstances, risk, and protective factors during times of adversity that, in turn, enhanced resilience up to 12 months post-disaster. The community was found to have an innate capacity to persevere and utilize resources to manage and transcend adversity and restore equilibrium, which reflected components of resilience that deserve greater recognition and appreciation.
Resilience is a foundational concept for disaster science. A model of resilience for the rural Appalachia community was developed to visually depict the encompassing element of community-based interventions that may enhance coping strategies, mitigate risk factors, integrate protective factors, and strengthen access. Community-based interventions are recommended to strengthen resilience, yielding improved outcomes of adaptation, health and wellness, and disaster readiness.
Banks LH, Davenport LA, Hayes MH, McArthur MA, Toro SN, King CE, Vazirani HM. Disaster Impact on Impoverished Area of US: An Inter-Professional Mixed Method Study. Prehosp Disaster Med. 2016;31(6):583–592.
Studying the emergence of teaching in our lineage entails identifying learning strategies among human and non-human groups, understanding the situations in which they occur, evaluating their performance, recognizing their expression in the archaeological record, identifying trends in the way knowledge transmission changed through time, and detecting the key moments in which members of our lineage complemented pre-existing transmission strategies with those that led our species to develop cumulative culture and eventually ‘teaching’ as we know it. Here we explore how learning processes function in spatial, temporal, and social dimensions and use the resulting situations to build a tentative framework, which may guide our interpretation of the archaeological record and ultimately aid our identification of the learning processes at work in animal and past hominin societies. We test the pertinence of this heuristic approach by applying it to a handful of archaeological case studies.
We analysed data from a prospective cohort of 255,024 adults aged ⩾45 years recruited from 2006 to 2009 to identify characteristics associated with a zoster diagnosis. Diagnoses were identified by linkage to pharmaceutical treatment and hospitalization records specific for zoster, and hazard ratios were estimated. Over 940,583 person-years, 7771 participants had a zoster diagnosis; 253 (3·3%) were hospitalized. After adjusting for age and other factors, characteristics associated with zoster diagnoses included: a recent immunosuppressive condition [adjusted hazard ratio (aHR) 1·58, 95% confidence interval (CI) 1·32–1·88], female sex (aHR 1·36, 95% CI 1·30–1·43), recent cancer diagnosis (aHR 1·35, 95% CI 1·24–1·46), and severe physical limitation vs. none (aHR 1·33, 95% CI 1·23–1·43). The relative risk of hospitalization for zoster was higher for those with an immunosuppressive condition (aHR 3·78, 95% CI 2·18–6·55), those with cancer (aHR 1·78, 95% CI 1·24–2·56), and those with severe physical limitations (aHR 2·50, 95% CI 1·56–4·01). The novel finding of an increased risk of zoster diagnoses and hospitalizations in those with physical limitations should prompt evaluation of the use of zoster vaccine in this population.
We report on a preliminary analysis of a 5600 sec per point survey of 32 square degrees in Centaurus, carried out with the Parkes 13-beam system. The signal-to-noise ratio is found to improve as √t (the square root of the integration time) over the whole integration. We have detected 102 HI sources between +250 and +12,700 km s⁻¹, either by eye or by using the new galaxy-finding algorithm PICASSO. Over half of these are new HI detections. Around a dozen are not associated with catalogued galaxies and, in two of these cases, we have not identified an optical counterpart on the Digitized Sky Survey. Arguments are put forward to explain why deep integrations are needed to find low surface brightness objects.
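The square-root improvement of signal-to-noise with integration time is the standard radiometer expectation: uncorrelated noise averages down as one over the square root of the number of samples. A quick simulation of that scaling (illustrative only — this is generic noise averaging, not the survey's actual reduction pipeline):

```python
import random
import statistics

random.seed(42)  # make the simulation repeatable

def noise_std_after_stacking(n_samples, n_trials=2000):
    """Standard deviation of the mean of n_samples unit-variance noise values,
    estimated over n_trials independent realisations."""
    means = []
    for _ in range(n_trials):
        samples = [random.gauss(0.0, 1.0) for _ in range(n_samples)]
        means.append(sum(samples) / n_samples)
    return statistics.pstdev(means)

# Quadrupling the integration should roughly halve the residual noise,
# i.e. double the signal-to-noise ratio for a fixed signal.
ratio = noise_std_after_stacking(25) / noise_std_after_stacking(100)
print(f"noise reduction from 4x integration: {ratio:.2f} (expect ~2)")
```

With 4× the samples the noise level drops by about a factor of 2, which is why doubling the signal-to-noise of a survey costs four times the telescope time.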
Few emergency medical services (EMS) interventions in New Mexico have been assessed for efficacy, potential harm, or potential benefit. There is concern that many interventions added over the years may be outdated, harmful, or ineffective in the EMS setting. A formal process for reviewing the state EMS scope of practice using literature review and expert consensus is discussed. In Phase One of the project, interventions in the New Mexico EMS scope of practice were prioritized for further review by surveying a national cadre of EMS experts to evaluate EMS interventions using a utilitarian harm/benefit metric.
An electronic survey based on the 2010 New Mexico EMS Scope of Practice statute was administered from March through June, 2011. A national cadre of 104 respondents was identified. Respondents were either State EMS medical directors or EMS fellowship directors. Respondents were asked to rate the potential harm and the potential benefit of specific EMS interventions on a 5-point ordinal scale. Median harm and benefit scores were calculated.
A total of 88 completed surveys were received following 208 emailed invitations to 104 respondents (43% response rate). Twenty-two highest-priority interventions (those with a harm/benefit median score ratio of >1) were identified. Seven additional second-priority interventions were also identified. These interventions will be advanced for formal literature review and expert consensus.
The New Mexico EMS Interventions Project offers a novel model for assessing a prehospital scope of practice.
Munk MD, Fullerton L, Banks L, Morley S, McDaniels R, Castle S, Thornton K, Richards ME. Assessing EMS Scope of Practice for Utility and Risk: The New Mexico EMS Interventions Assessment Project, Phase One Results. Prehosp Disaster Med. 2012;27(5):1-6.
We give new bounds on sums of the form ∑_{n≤N} Λ(n) exp(2πi a g^n/m) and ∑_{n≤N} Λ(n) χ(g^n + a), where Λ is the von Mangoldt function, m is a natural number, a and g are integers coprime to m, and χ is a multiplicative character modulo m. In particular, our results yield bounds on the sums ∑_{p≤N} exp(2πi a M_p/m) and ∑_{p≤N} χ(M_p) with Mersenne numbers M_p = 2^p − 1, where p is prime.
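The second pair of sums can be evaluated numerically for small parameters, which makes the object the bounds control easy to see: each term lies on the unit circle, so the trivial bound is the number of primes up to N, and anything smaller reflects cancellation. A sketch computing ∑_{p≤N} exp(2πi a M_p/m), with a and m chosen arbitrarily for illustration (m odd, so g = 2 is coprime to m as the statement requires):

```python
import cmath

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [i for i, is_p in enumerate(sieve) if is_p]

def mersenne_sum(N, a, m):
    """Sum over primes p <= N of exp(2*pi*i * a * M_p / m), M_p = 2^p - 1.

    M_p is reduced mod m via three-argument pow, so the huge Mersenne
    numbers themselves are never formed."""
    total = 0j
    for p in primes_up_to(N):
        Mp_mod = (pow(2, p, m) - 1) % m  # M_p mod m
        total += cmath.exp(2j * cmath.pi * a * Mp_mod / m)
    return total

# Illustrative parameters, not taken from the paper.
N = 10_000
s = mersenne_sum(N, a=1, m=7)
n_primes = len(primes_up_to(N))
print(f"|S| = {abs(s):.1f} versus trivial bound of {n_primes} primes")
```

Since M_p mod m is periodic in p, the size of the sum is governed by how the primes distribute among the residue classes driving that period — which is exactly the kind of equidistribution question the von Mangoldt-weighted bounds address.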
To determine the gross motor skills of school-aged children after the Fontan procedure and compare the locomotor and object control skills with normative data.
This study followed a cross-sectional design.
This study was based on hospital outpatient visit, with accelerometry conducted at home.
This study included 55 patients (22 girls) aged 6–10 years, studied 5.1 years after the Fontan procedure.
Main outcome measures: Test of Gross Motor Development – Version 2, daily activity by accelerometer, medical history review, and child and parent perceptions of activity.
Being involved in active team sports increased locomotor percentile score by 10.3 points (CI: 4.4, 16.1). Preference for weekend outdoor activities (6.9, CI: 2.0, 11.8), performing at least 30 minutes of moderate-to-vigorous physical activity daily (24.5, CI: 7.3, 41.8), and reporting that parents seldom criticise the child's physical activity (21.8, CI: 8.9, 34.8) were also associated with higher locomotor percentile scores (p < 0.01). Object control percentile scores were higher (p < 0.03) with involvement in formal instruction (5.9, CI: 1.1, 10.6) and being restricted to “activities within comfortable limits” (27.6, CI: 7.7, 47.5). Older chronological age (r = 0.28), a more complicated medical history (r = 0.36), and older age at Fontan (r = 0.28) were associated with greater skill delay (p < 0.04).
Children after Fontan attain basic motor skills at a later age than their peers, and deficits continue for more complex skills as age increases, suggesting a need for longitudinal monitoring of gross motor skill development through the elementary school years. Future research might investigate whether a gross motor skill rehabilitation programme can provide these children with the motor skills needed to successfully participate in a physically active lifestyle with peers.
Background: As populations age, psychological distress in late life will become of increasing public health and social importance. This study seeks to bridge the gap in information that exists about psychological distress in late life, by exploring the prevalence of psychological distress among a very large sample of older adults to determine the impact of age and gender, and the modifying effect of these factors on the associations between measures of psychological distress and sociodemographic and comorbid conditions.
Methods: We analyzed self-reported data from 236,508 men and women in the New South Wales 45 and Up Study, to determine the impact of age and gender, and the modifying effects of these factors on associations between psychological distress and sociodemographic and comorbid conditions.
Results: Higher education, married status, and higher income were associated with lower risk of psychological distress. Although the overall prevalence of psychological distress is lower at older ages, it increases after age 80 and is particularly associated with physical disabilities. Some older people (such as those requiring help because of disability and those with multiple comorbid health conditions) are at increased risk of psychological distress.
Conclusion: These findings have implications for both healthcare providers and policy-makers in identifying and responding to the needs of older people in our aging society.