The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence and types of drug use in crash-involved drivers in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following an MVC. Blood samples were analyzed using a broad-spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids, as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from other provinces (n = 301) suggest regional variation in the prevalence of drivers testing positive for THC (10%–27%), alcohol (17%–29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, pre- and post-cannabis legalization.
The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported according to province, driver sex, age, single- vs. multi-vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will describe patterns of drug and alcohol impairment in 4 Canadian provinces pre- and post-cannabis legalization. The significance of these findings and their implications for impaired driving policy and prevention programs in Canada will be discussed.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
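As an illustration of the effect measures reported above, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a simple 2×2 exposure-by-outcome table. This is a minimal stdlib Python sketch with made-up counts, not the study's data; the adjusted ORs (aOR) reported above come from matched multivariable models, not from this crude calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) for the Wald interval
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only
or_, lo, hi = odds_ratio_ci(30, 169, 10, 371)
```

A wide interval (as with the male–male sexual behaviour estimate above, CI 5.8–362.0) typically reflects small cell counts, which inflate the standard error of ln(OR).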
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are priorities in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized educational training or complex motor skill learning, where it has the potential to make a significant impact. The purpose of this study was to determine if a resuscitation course taught in a spaced format, compared to the usual massed instruction, results in improved retention of procedural skills. Methods: EMS providers (paramedics and emergency medical technicians (EMTs)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after, and 3 months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, and infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course.
Three months after course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p = 0.012), with no statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p = 0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p = 0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p = 0.831) or adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p = 0.728). Conclusion: Procedural skills taught in a spaced format are learned at least as well as with the traditional massed format. More complex skills taught in a spaced format may be retained better over the long term, as there was a clear difference in BVMV scores and a trend toward a difference in IO insertion.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when on-farm resource allocation is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performances of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility in the southwest United Kingdom, this paper proposes a novel, information-driven approach to carry out comprehensive assessments of economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and fewer nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for adopting scientifically sound and biologically informative metrics for agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
Postglacial emergence curves are used to infer mantle rheology, delimit ice extent, and test models of the solid Earth response to changing ice and water loads. Such curves are rarely produced by direct dating of land emergence; rather, most rely on the presence of radiocarbon-datable organic material and inferences made between the age of sedimentary deposits and landforms indicative of former sea level. Here, we demonstrate a new approach, ¹⁰Be dating, to determine rates of postglacial land emergence in two different settings. In southern Greenland (Narsarsuaq/Igaliku), we directly date the exposure, as relative sea level fell, of gravel beaches and rocky outcrops, allowing determination of rapid, post–Younger Dryas emergence. In western Greenland (Kangerlussuaq), we constrain Holocene isostatic response by dating the sequential stripping of terrace sediment driven by land-surface uplift, relative sea-level fall, and resulting fluvial incision. The technique we employ provides the high temporal and elevation resolution important for quantifying rapid emergence immediately after deglaciation and less rapid uplift during the middle Holocene. ¹⁰Be-constrained emergence curves can improve knowledge of relative sea-level change by dating land emergence along rocky coasts, at elevations and locations where radiocarbon-datable sediments are not present, and without the lag time needed for organic material to accumulate.
Simulation models are used widely in pharmacology, epidemiology and health economics (HE). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD)–dynamic transmission model simulating the impact of pandemic influenza with low virulence and low transmissibility, and with high virulence and high transmissibility. The cost-utility analysis was from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) to no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared to no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction of influenza episodes and influenza-related deaths, translating to substantial savings of QALYs. Overall drug costs were offset by the reduction of both direct and indirect costs, making these two interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentation at the emergency visit and patients’ quality of life. Integrating PK/PD–EPI/HE models is achievable. Whilst further refinement of this novel linkage model to more closely mimic reality is needed, the current study has generated useful insights to support influenza pandemic planning.
Over the past several years, we have seen many attacks on publicly funded and mandated archaeology in the United States. These attacks occur at the state level, where governors and state legislatures try to defund or outright eliminate state archaeological programs and institutions. We have also seen several attacks at the federal level. Some members of Congress showcase archaeology as a waste of public tax dollars, and others propose legislation to move federally funded or permitted projects forward without consideration of impacts on archaeological resources. These attacks continue to occur, and we expect them to increase in the future. In the past, a vigilant network of historic preservation and archaeological organizations was able to thwart such attacks. The public, however, largely remains an untapped ally. As a discipline, we have not built a strong public support network. We have not demonstrated the value of archaeology to the public, beyond a scattering of educational and informational programs. In this article, we—a group of archaeologists whose work has focused on public engagement—provide a number of specific recommendations on how to build a strong public constituency for the preservation of our nation's archaeological heritage.
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12-month mental disorders in 138 801 participants aged 18–100, derived from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity level. Satisfaction with conventional care was also compared with CAM contact satisfaction.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact, a rate two times higher in high-income countries (4.6%; standard error 0.3%) than in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable across disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% for severe mood disorders, 16.2% for severe anxiety disorders and 22.5% for severe behavioural disorders. Satisfaction with care was comparable with respect to CAM contacts (78.3%) and conventional care (75.6%) in persons who received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary, and contrast with suggestions that CAM contacts concern persons with only mild, transient complaints. There was no indication that persons were less satisfied with CAM visits than with conventional care. We encourage health care professionals in conventional settings to openly discuss the care patients are receiving, whether conventional or not, and their reasons for doing so.
This study evaluated the annual prevalence of anogenital warts (AGW) caused by human papillomavirus (HPV) and analysed the trend in annual per cent changes (APC) using national claims data from the Health Insurance Review and Assessment of Korea, 2007–2015. We also estimated the socio-economic burden and co-morbidities of AGW. All analyses were performed based on data for primary A63.0, the specific diagnosis code for AGW. The socio-economic cost of AGW was calculated based on the direct medical cost, direct non-medical cost and indirect cost. The overall AGW prevalence and socio-economic burden increased during the last 9 years. However, the prevalence of AGW differed significantly by sex. The female prevalence increased until 2012 and decreased thereafter (APC +3.6%); it may fall further following the introduction of routine HPV vaccination, principally for females, in Korea. The male prevalence increased continuously over time (APC +11.6%), especially in those aged 20–49 years. Given the increasing AGW prevalence and its disease burden, active surveillance and prevention of HPV infection in males are worth consideration.
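For readers unfamiliar with the APC metric used above, an annual per cent change is conventionally derived from a log-linear trend fitted to yearly counts or rates. The sketch below is a minimal stdlib Python illustration using a hypothetical series, not the Korean claims data:

```python
import math

def annual_percent_change(years, counts):
    """APC from a log-linear trend: fit ln(count) = a + b * year
    by ordinary least squares, then APC = (exp(b) - 1) * 100."""
    n = len(years)
    logs = [math.log(c) for c in counts]
    mean_y = sum(years) / n
    mean_l = sum(logs) / n
    # OLS slope of ln(count) on year
    slope = sum((y - mean_y) * (l - mean_l) for y, l in zip(years, logs))
    slope /= sum((y - mean_y) ** 2 for y in years)
    return (math.exp(slope) - 1) * 100

# Hypothetical series growing 10% per year over 2007-2015
years = list(range(2007, 2016))
counts = [100 * 1.1 ** (y - 2007) for y in years]
apc = annual_percent_change(years, counts)  # ~10.0
```

Because the fit is on the log scale, the APC describes a constant multiplicative (per cent) change per year rather than a constant absolute change.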
The treatment gap between the number of people with mental disorders and the number treated represents a major public health challenge. We examine this gap by socio-economic status (SES; indicated by family income and respondent education) and service sector in a cross-national analysis of community epidemiological survey data.
Data come from 16 753 respondents with 12-month DSM-IV disorders from community surveys in 25 countries in the WHO World Mental Health Survey Initiative. DSM-IV anxiety, mood, or substance disorders and treatment of these disorders were assessed with the WHO Composite International Diagnostic Interview (CIDI).
Only 13.7% of 12-month DSM-IV/CIDI cases in lower-middle-income countries, 22.0% in upper-middle-income countries, and 36.8% in high-income countries received treatment. Highest-SES respondents were somewhat more likely to receive treatment, but this was true mostly for specialty mental health treatment, where the association was positive with education (highest treatment among respondents with the highest education and a weak association of education with treatment among other respondents) but non-monotonic with income (somewhat lower treatment rates among middle-income respondents and equivalent among those with high and low incomes).
The modest but nonetheless stronger association of treatment with education than with income raises questions about a financial-barriers interpretation of the inverse association of SES with treatment, although future within-country analyses that consider contextual factors might document other important specifications. While beyond the scope of this report, such an expanded analysis could have important implications for designing interventions aimed at increasing mental disorder treatment among socio-economically disadvantaged people.
Modern datasets provide the context necessary for accurate interpretations of isotopic data from archaeological faunal assemblages. In this study, we use the oxygen isotope ratios (δ18O) of modern small mammals from Chaco Canyon, New Mexico, to quantify expected isotopic variation in a local population. The δ18O values of local, modern small mammals encompass a broad range (−6.0‰ to 4.8‰ VPDB), which is expected given the extreme seasonal variation in the δ18O of precipitation on the Colorado Plateau (−11‰ to −3‰ VPDB). Isotopic ratios of small mammals obtained from excavated archaeological sites in Chaco Canyon (ca. AD 800 to 1200) show no significant differences with their modern counterparts, suggesting that there is no difference in the origins of the archaeological small-mammal collection and the modern, local Chaco Canyon small-mammal collection. In contrast, δ18O values of large mammals from Chaco archaeological sites are significantly different from those of modern specimens, reflecting a nonlocal, but also nonspecific, source in the past.
Young adults who are not in employment, education, or training (NEET) are at risk of long-term economic disadvantage and social exclusion. Knowledge about risk factors for being NEET largely comes from cross-sectional studies of vulnerable individuals. Using data collected over a 10-year period, we examined adolescent predictors of being NEET in young adulthood.
We used data on 1938 participants from the Victorian Adolescent Health Cohort Study, a community-based longitudinal study of adolescents in Victoria, Australia. Associations between common mental disorders, disruptive behaviour, cannabis use and drinking behaviour in adolescence, and NEET status at two waves of follow-up in young adulthood (mean ages of 20.7 and 24.1 years) were investigated using logistic regression, with generalised estimating equations used to account for the repeated outcome measure.
Overall, 8.5% of the participants were NEET at age 20.7 years and 8.2% at 24.1 years. After adjusting for potential confounders, we found evidence of increased risk of being NEET among frequent adolescent cannabis users [adjusted odds ratio (ORadj) = 1.74; 95% confidence interval (CI) 1.10–2.75] and those who reported repeated disruptive behaviours (ORadj = 1.71; 95% CI 1.15–2.55) or persistent common mental disorders in adolescence (ORadj = 1.60; 95% CI 1.07–2.40). Similar associations were present when participants with children were included in the same category as those in employment, education, or training.
Young people with an early onset of mental health and behavioural problems are at risk of failing to make the transition from school to employment. This finding reinforces the importance of integrated employment and mental health support programmes.
We estimated the heritabilities (h²) and genetic and phenotypic correlations among individual and groups of fatty acids, as well as their correlations with six important carcass and meat-quality traits in Korean Hanwoo cattle. Meat samples were collected from the longissimus dorsi muscles of 1000 30-month-old Hanwoo steers (progeny of 85 proven Hanwoo bulls) to determine intramuscular fatty acid profiles. Phenotypic data on carcass weight (CWT), eye muscle area (EMA), back fat thickness (BFT), marbling score (MS), Warner–Bratzler shear force (WBSF) and intramuscular fat content (IMF) were also investigated using this half-sib population. Variance and covariance components were estimated using restricted maximum likelihood procedures under univariate and pairwise bivariate animal models. Oleic acid (C18:1n-9) was the most abundant fatty acid, accounting for 50.69% of all investigated fatty acids, followed by palmitic (C16:0; 27.33%) and stearic acid (C18:0; 10.96%). The contents of saturated fatty acids (SFAs), monounsaturated fatty acids (MUFAs) and polyunsaturated fatty acids (PUFAs) were 41.64%, 56.24% and 2.10%, respectively, and the MUFA/SFA ratio, PUFA/SFA ratio, desaturation index (DI) and elongation index (EI) were 1.36, 0.05, 0.59 and 0.66, respectively. The h² estimates for individual fatty acids ranged from very low to high (0.03±0.14 to 0.63±0.14). The h² estimates for SFAs, MUFAs, PUFAs, DI and EI were 0.53±0.14, 0.49±0.14, 0.23±0.10, 0.51±0.13 and 0.53±0.13, respectively. The genetic and phenotypic correlations among individual fatty acids and fatty acid classes varied widely (−0.99 to 0.99). Notably, C18:1n-9 had favourable (negative) genetic correlations with two detrimental fatty acids, C14:0 (−0.76) and C16:0 (−0.92). Genetic correlations of individual and group fatty acids with CWT, EMA, BFT, MS, WBSF and IMF ranged from low to moderate (both positive and negative), with the exception of low-concentration PUFAs.
Low or near-zero phenotypic correlations reflected potential non-genetic contributions. This study provides insights on genetic variability and correlations among intramuscular fatty acids as well as correlations between fatty acids and carcass and meat-quality traits, which could be used in Hanwoo breeding programmes to improve fatty acid compositions in meat.
Research on the course of post-traumatic stress disorder (PTSD) finds that a substantial proportion of cases remit within 6 months, a majority within 2 years, and a substantial minority persist for many years. Results concerning pre-trauma predictors are inconsistent.
The WHO World Mental Health surveys assessed the presence and course of lifetime DSM-IV PTSD after one randomly selected trauma, allowing retrospective estimates of PTSD duration. Prior traumas, childhood adversities (CAs), and other lifetime DSM-IV mental disorders were examined as predictors using discrete-time person-month survival analysis among the 1575 respondents with lifetime PTSD.
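As background on the method named above, discrete-time person-month survival analysis expands each respondent's duration into one record per month at risk, with an event indicator in the month of recovery; a logistic model is then fitted to the pooled person-month records. A minimal sketch of that expansion step, using a hypothetical record format rather than the survey data:

```python
def person_months(duration, recovered, max_months=120):
    """Expand one respondent's episode into person-month records.

    duration  -- months until recovery (or censoring if not recovered)
    recovered -- whether recovery was observed
    Each month at risk yields one record; 'event' is 1 only in the
    month recovery occurred. Hypothetical format, not the WMH data.
    """
    months = min(duration, max_months)
    return [
        {"month": m, "event": int(recovered and m == months)}
        for m in range(1, months + 1)
    ]

# A respondent who recovered in month 6 contributes 6 records,
# with the event flagged only in the final one; a censored
# respondent contributes records with no event at all.
records = person_months(duration=6, recovered=True)
```

The `month` variable (or functions of it) then enters the logistic model as the baseline hazard, which is what allows recovery odds to vary with time since the trauma.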
Of cases, 20%, 27%, and 50% recovered within 3, 6, and 24 months, respectively, and 77% within 10 years (the longest duration allowing stable estimates). Time-related recall bias was found largely for recoveries after 24 months. Recovery was weakly related to most trauma types; exceptions were very low odds [odds ratio (OR) 0.2–0.3] of early recovery (within 24 months) associated with purposefully injuring/torturing/killing and witnessing atrocities, and very low odds of later recovery (25+ months) associated with being kidnapped. The significant ORs for prior traumas, CAs, and mental disorders were generally inconsistent between early- and later-recovery models. Cross-validated versions of the final models nonetheless discriminated significantly between the 50% of respondents with the highest and lowest predicted probabilities of both early recovery (66–55% v. 43%) and later recovery (75–68% v. 39%).
We found PTSD recovery trajectories similar to those in previous studies. The weak associations of pre-trauma factors with recovery, also consistent with previous studies, presumably are due to stronger influences of post-trauma factors.
Background: Cerebral palsy (CP) is a debilitating disorder (1). Based on neuromotor impairments, it is divided into spastic, dyskinetic and ataxic types (2). Inborn errors of metabolism (IEMs), monogenic and chromosomal disorders can mimic CP (3). We aimed to identify causal genetic variants in patients with atypical dyskinetic CP in whom known IEMs had been ruled out. Timely diagnosis is essential for proper management, especially in conditions that mimic CP and are treatable. Methods: We enrolled 23 patients with unexplained atypical dyskinetic CP for whole exome sequencing. Variants were filtered against public and in-house databases to identify variants predicted as damaging (in silico tools and ACMG criteria). We applied a virtual gene panel of known and suspected CP and movement disorder genes and investigated each sample. Results: The participants presented with symptoms including spasticity, dystonia, choreoathetosis, ataxia and cognitive delays. We identified 23 diagnoses: 13 dominant, 6 recessive and 4 X-linked. Twelve patients had movement disorders. In 4, the diagnoses enabled targeted treatment (neurotransmitter supplements in Unverricht–Lundborg disease (CSTB) and PAK3 deficiency, deep brain stimulation in GNAO1 deficiency, and a medical diet in glutaric aciduria (GCDH)). Conclusions: Whole exome sequencing contributes to establishing a diagnosis in patients with atypical dyskinetic CP, resulting in precision medicine and improved health outcomes.
Introduction: Medical conditions that impair perception, cognition or motor skills may make people unfit to drive. Reporting unfit drivers to licensing authorities is seen by many as a public health obligation. This study investigates physician knowledge, attitudes and practice around the management of medically unfit drivers. Methods: We used an online survey to explore physician knowledge of fitness-to-drive issues and their attitudes and practice with regard to counselling and reporting unfit drivers. Email invitations to participate in the survey were sent to all physicians in BC through DoctorsofBC and to all emergency physicians (EPs) in the UBC Department of Emergency Medicine. Results: We received responses from 242 physicians (47% EPs, 40% GPs, 13% others). The majority (78%) reported little/no knowledge on determining driver fitness and 94% had little/no training around guidelines, reporting, and laws involving fitness to drive. Most (88%) agreed that physicians should be obligated to advise medically unfit patients not to drive, and 74% reported that they often warn patients not to drive. The majority of physicians also chart their opinion of patients’ fitness to drive (67% do so more than twice per year). Most respondents (70%) indicated that it is “always appropriate” to report definitely unfit drivers, whereas only 25% indicated that it is “always appropriate” to report potentially unfit drivers. However, in practice physicians see far more unfit drivers than they report to licensing authorities: 67% of physicians encounter definitely unfit drivers more than twice per year, but only 19% report definitely unfit drivers more than twice per year and 34% never report definitely unfit drivers.
Compared to other physicians, EPs reported less knowledge and training about criteria for determining fitness to drive, were more likely to feel that reporting unfit drivers was not their responsibility, and were less likely to report unfit drivers to licensing authorities. Conclusion: Our findings indicate a need for more education and information resources to help physicians, particularly EPs, identify and manage medically unfit drivers. Although most physicians warn unfit drivers not to drive and document this in medical records, many medically unfit drivers are not reported to licensing authorities, a potential public health problem that should be further investigated.
Introduction: Most medically unfit drivers are not reported to licensing authorities. In BC, physicians are only obligated to report unfit drivers who continue to drive after being warned to stop. This study investigates barriers to and incentives for physician reporting of medically unfit drivers. Methods: We used an online survey to study physician-reported barriers to reporting medically unfit drivers and incentives that might improve reporting. Email invitations to participate in the survey were sent to all physicians in BC through DoctorsofBC and to all emergency physicians (EPs) in the UBC Department of Emergency Medicine. Results: We received responses from 242 physicians (47% EPs, 40% GPs, 13% others). The most common barrier to reporting was not knowing which unfit drivers continue to drive (79% of respondents). Other barriers included lack of time (51%), lack of knowledge of the process, guidelines, or legal requirement for reporting (51%, 50%, and 45%, respectively), fearing loss of rapport with patients (48%), pressure from patients not to report (34%), lack of remuneration (27%), and pressure from family members not to report (25%). EPs were significantly less likely than other physicians to cite loss of rapport, pressure from patients, or pressure from family as barriers, but more likely to cite not being aware of drivers who continue to drive after being warned, lack of knowledge (regarding legal requirements to report, guidelines for determining fitness, and the reporting process), and lack of time. Factors that would increase reporting of unfit drivers included better understanding of criteria for fitness to drive (70%), more information regarding how to report (67%), more information on when to report (65%), and compensation (43%). Free-text comments from respondents identified other barriers and incentives. Reporting might be simplified by telephone hotlines or by allowing physician designates to report.
Physicians feared legal liability and suggested the need for better medico-legal protection. Loss of patient rapport might be minimized by public education. Failure of response from licensing authorities to a report (long wait times, lack of feedback to physician) was seen as a barrier to reporting. Conclusion: We identified barriers to physician reporting of medically unfit drivers and incentives that might increase reporting. This information could inform programs aiming to improve reporting of unfit drivers.