Matrix positivity is a central topic in matrix theory: properties that generalize the notion of positivity to matrices arose from a wide variety of applications, and many have also taken on notable theoretical significance because they are natural or unifying. This is the first book to provide a comprehensive and up-to-date reference on important material about matrix positivity classes, their properties, and their relations. The matrix classes emphasized in this book include the classes of semipositive matrices, P-matrices, inverse M-matrices, and copositive matrices. This self-contained reference will be useful to a wide variety of mathematicians, engineers, and social scientists, as well as graduate students. The generalizations of positivity and the connections observed provide a unique perspective, along with theoretical insight into applications and future challenges. Direct applications can be found in data analysis, differential equations, mathematical programming, computational complexity, models of the economy, population biology, dynamical systems and control theory.
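P-matrices, one of the classes emphasized above, are defined by the condition that every principal minor is positive. A minimal brute-force check can be sketched as follows (the tolerance value is an illustrative choice, and the enumeration of principal submatrices is exponential in the matrix size, so this is only practical for small matrices):

```python
import numpy as np
from itertools import combinations

def is_p_matrix(A, tol=1e-12):
    """Check whether a square matrix is a P-matrix,
    i.e. whether all of its principal minors are positive."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            # Principal minor on rows/columns idx.
            minor = np.linalg.det(A[np.ix_(idx, idx)])
            if minor <= tol:
                return False
    return True

# A symmetric positive definite matrix is always a P-matrix.
print(is_p_matrix([[2.0, 1.0], [1.0, 2.0]]))   # True
# A matrix with a negative diagonal entry cannot be one.
print(is_p_matrix([[-1.0, 0.0], [0.0, 1.0]]))  # False
```

For larger matrices, structure-specific tests (e.g. for M-matrices or symmetric matrices) avoid the combinatorial cost.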
To evaluate the effect of the burden of Staphylococcus aureus colonization of nursing home residents on the risk of S. aureus transmission to healthcare worker (HCW) gowns and gloves.
Multicenter prospective cohort study.
Setting and participants:
Residents and HCWs from 13 community-based nursing homes in Maryland and Michigan.
Residents were cultured for S. aureus at the anterior nares and perianal skin. The S. aureus burden was estimated by quantitative polymerase chain reaction detecting the nuc gene. HCWs wore gowns and gloves during usual care activities; gowns and gloves were swabbed and then cultured for the presence of S. aureus.
In total, 403 residents were enrolled; 169 were colonized with methicillin-resistant S. aureus (MRSA) or methicillin-sensitive S. aureus (MSSA) and comprised the study population; 232 were not colonized and thus were excluded from this analysis; and 2 were withdrawn prior to being swabbed. After multivariable analysis, perianal colonization with S. aureus conferred the greatest odds for transmission to HCW gowns and gloves, and the odds increased with increasing burden of colonization: adjusted odds ratio (aOR), 2.1 (95% CI, 1.3–3.5) for low-level colonization and aOR, 5.2 (95% CI, 3.1–8.7) for high-level colonization.
Among nursing home patients colonized with S. aureus, the risk of transmission to HCW gowns and gloves was greater from those colonized with greater quantities of S. aureus on the perianal skin. Our findings inform future infection control practices for both MRSA and MSSA in nursing homes.
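Odds ratios like those reported above come from multivariable logistic regression; the unadjusted version and its Wald confidence interval can be computed directly from a 2×2 table. A minimal sketch with hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts (NOT the study's data): 40/100 gown-glove
# transmissions among high-burden residents vs 15/100 among low-burden.
print(odds_ratio_ci(40, 60, 15, 85))
```

Adjusted odds ratios additionally condition on covariates (here, other colonization sites and resident characteristics), which requires fitting the full regression model rather than tabulating counts.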
Passive acoustic monitoring is rapidly gaining recognition as a practical, affordable and robust tool for measuring gun hunting levels within protected areas, and consequently for its potential to evaluate anti-poaching patrols’ effectiveness based on outcome (i.e., change in hunting pressure) rather than effort (e.g., kilometres patrolled) or output (e.g., arrests). However, there has been no report to date of a protected area successfully using an acoustic grid to explore baseline levels of gun hunting activity, adapting its patrols in response to the evidence extracted from the acoustic data and then evaluating the effectiveness of the new patrol strategy. We report here such a case in Cameroon’s Korup National Park, where anti-poaching patrol effort was markedly increased in the 2015–2016 Christmas/New Year holiday season to curb the annual peak in gunshots recorded by a 12-sensor acoustic grid in the same period during the previous 2 years. Despite a three- to five-fold increase in patrol days, distance and area covered, the desired outcome – lower gun hunting activity – was not achieved under the new patrol scheme. The findings emphasize the need for adaptive wildlife law enforcement and how passive acoustic monitoring can help attain this goal, and they warn about the risks of using effort-based metrics of anti-poaching strategies as a surrogate for desired outcomes. We propose ways of increasing protected areas’ capacity to adopt acoustic grids as a law enforcement monitoring tool.
Biodiversity loss may increase the risk of infectious disease in a phenomenon known as the dilution effect. Circumstances that increase the likelihood of disease dilution are: (i) when hosts vary in their competence, and (ii) when communities disassemble predictably, such that the least competent hosts are the most likely to go extinct. Despite the central role of competence in diversity–disease theory, we lack a clear understanding of the factors underlying competence, as well as the drivers and extent of its variation. Our perspective piece encourages a mechanistic understanding of competence and a deeper consideration of its role in diversity–disease relationships. We outline current evidence, emerging questions and future directions regarding the basis of competence, its definition and measurement, the roots of its variation and its role in the community ecology of infectious disease.
Seismic-reflection surveys of the Isle Royale sub-basin, central Lake Superior, reveal two large end moraines and associated glacial sediments deposited during the last cycle of the Laurentide Ice Sheet in the basin. The Isle Royale moraines directly overlie bedrock and are cored with dense, acoustically massive till intercalated down-ice with acoustically stratified outwash. Till and outwash are overlain by glacial varves, a lower red unit and an upper gray unit.
The maximum extent of late Younger Dryas-age readvance into the western Lake Superior basin is uncertain, but it was probably controlled by both ice dynamics and climate. Our data indicate that during retreat from the maximum, the ice paused just long enough to construct the outer of the two moraines, >100 m high, and then retreated to the inner moraine, during which time most of the lower glacial-lacustrine sequence (red varves) was deposited. Retreat from the inner moraine coincided with a marked flux of icebergs at the calving margin and a change to gray varves. Rapid retreat may be related to both an influx of meltwater from Glacial Lake Agassiz about 10,500 cal yr BP and retreat of the calving margin down an adverse slope into the Isle Royale sub-basin.
OBJECTIVES/GOALS: We examined how individual characteristics and characteristics of the socioeconomic and built environment were associated with care coordination’s effect on cardiovascular disease (CVD) risks to identify geographic areas that may benefit from supplementary clinic-community linkages. METHODS/STUDY POPULATION: We analyzed data with geocoded residential addresses and data from electronic health records for 9946 adults from a Centers for Medicare & Medicaid Services funded innovation project from 7/1/2013 to 3/30/2015. Variables included patient-level demographics, Elixhauser comorbidity index, total time with a nurse care manager, and neighborhood factors such as poverty indicators, walkability, and social capital index. Outcomes were change in CVD risk factors: hemoglobin A1c (HbA1c), blood pressure (BP), and low-density lipoprotein (LDL). Generalized linear models were used to assess the effect of the nurse care management program on outcomes after controlling for confounding factors. RESULTS/ANTICIPATED RESULTS: We report preliminary models that include patient demographics (age, sex, race), health care utilization, nurse care manager contact time, Elixhauser comorbidity index, neighborhood education status, percent of population below 200% federal poverty level, median home value, walkability score of the residential address, and social capital index. After adjusting for all mentioned variables, in adults with HbA1c above 7.5% at baseline, females had worsening HbA1c by 0.53% over the study period. Additionally, LDL values in females worsened over the study period by 4.8 mg/dL after adjusting for all variables. No clinically significant changes were noted for BP. DISCUSSION/SIGNIFICANCE OF IMPACT: Women’s HbA1c and LDL worsened despite nurse care management; these patients may benefit from additional community-based interventions or interventionists.
In future analyses, we anticipate that CVD risk will worsen for patients with higher fast food proximity and with greater geographic distance from their PCP.
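A generalized linear model with a Gaussian family and identity link, as used for continuous outcomes like change in HbA1c above, reduces to ordinary least squares. A minimal sketch on synthetic data (all covariates and coefficients are made up for illustration, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical design matrix: intercept, age, nurse-care-manager
# contact hours, and a neighborhood walkability score.
X = np.column_stack([
    np.ones(n),
    rng.normal(60, 10, n),      # age
    rng.gamma(2.0, 3.0, n),     # contact hours
    rng.uniform(0, 100, n),     # walkability score
])
beta_true = np.array([8.0, 0.01, -0.05, -0.002])
y = X @ beta_true + rng.normal(0, 0.5, n)  # e.g. change in HbA1c

# Gaussian GLM with identity link == ordinary least squares.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 3))
```

In practice a statistics package would also report standard errors and p-values for each adjusted coefficient; the point of the sketch is only the model structure.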
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data, on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status) were available. Primarily, five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond to previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
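The 10-fold cross-validation applied with C5.0 partitions participants into ten folds, training on nine and validating on the held-out tenth. A minimal, library-free sketch of the fold construction (illustrative only; C5.0 itself is a decision-tree learner not reproduced here):

```python
import random

def k_fold_indices(n, k=10, seed=42):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]  # k roughly equal folds
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

n = 31551  # participants in the pooled study
sizes = [len(test) for _, test in k_fold_indices(n)]
print(sizes)            # ten folds of ~3155 observations each
assert sum(sizes) == n  # every observation is validated exactly once
```

Each fold's model is scored on its held-out participants, and the ten scores are averaged to estimate out-of-sample performance.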
The development of human brain imaging has resulted in a number of techniques that allow unprecedented insights into the in vivo metabolic and neurochemical processes of the brain. Single-photon emission computed tomography (SPECT) is a nuclear medicine technique that can be used for measuring perfusion and blood flow in patients affected by psychopathology. The aim of the study was to compare patients with depression alone and those with comorbid alcohol dependence in terms of the functional alterations detected by SPECT. For this, 27 SPECT imaging studies of selected patients, performed at Hospital Clínico Pontificia Universidad Católica, were collected and categorized by group: the first group comprised depressed patients, and the second group comprised patients with alcohol dependence in addition to depression. Selected studies were coregistered, normalized and smoothed for standardization before statistical analysis was performed using MATLAB 7.1 with the SPM5 module. Mean blood flow in brain areas was compared between groups, with statistical significance set at p<0.01.
Results show significantly less blood flow in the group with alcohol dependence in Brodmann areas 4, 6, 8, 9, 45 and 46 of the frontal lobe and Brodmann areas 2, 3, 4, 5, 7 and 40 of the parietal lobe (p<0.01). Furthermore, the group with alcohol dependence showed increased blood flow in Brodmann area 10 of the frontal lobe, Brodmann areas 13, 20 and 22 of the temporal lobe, and the cerebellum, uncus and thalamus (p<0.01). We conclude that alcohol dependence as a comorbid condition in depressed patients determines an additional decrease in the mean blood flow of the prefrontal and temporal lobes.
Optimal management of schizophrenia in adolescents is limited by the lack of available therapies. The efficacy and tolerability of aripiprazole was investigated in this patient population.
This 6-week, randomized, double-blind, placebo-controlled trial was conducted at 101 international centers, with a safety monitoring board. Patients aged 13–17 years with a DSM-IV diagnosis of schizophrenia were randomized to placebo or a fixed dose of aripiprazole (10 mg or 30 mg), reached after a 5- or 11-day titration, respectively. The primary endpoint was mean change from baseline on the PANSS Total score at week 6. Secondary endpoints included the PANSS Positive and Negative subscales and the CGI-Improvement score. Tolerability assessments included the frequency and severity of adverse events, as well as blood chemistries, metabolic parameters and weight gain.
Over 85% of the 302 patients completed this study. Both the 10 mg and 30 mg doses were superior to placebo on the primary endpoint (PANSS Total), with significant differences observed as early as week 1 (30 mg). Both doses showed significant improvement on the PANSS Positive and CGI-I scales, and the 10 mg dose group was superior on the PANSS Negative score. Approximately 5% of aripiprazole patients discontinued due to AEs. Weight gain and changes in prolactin were minimal.
The 10 mg and 30 mg doses of aripiprazole were superior to placebo in the treatment of adolescents with schizophrenia. Aripiprazole was generally well tolerated, with few discontinuations due to AEs. EPS was the most common AE. Change in body weight was similar to placebo.
To advance the quality of mental healthcare in Europe by developing guidance on implementing quality assurance.
We performed a systematic literature search on quality assurance in mental healthcare, and the 522 retrieved documents were evaluated by two independent reviewers (B.J. and J.Z.). Based on these evaluations, evidence tables were generated. As it was found that these did not cover all areas of mental healthcare, supplementary hand searches were performed for selected additional areas. Based on these findings, fifteen graded recommendations were developed and agreed by consensus among the authors. Review by the EPA Guidance Committee and EPA Board led to two additional recommendations (on immigrant mental healthcare and parity of mental and physical healthcare funding).
Although quality assurance (measures to keep a certain degree of quality), quality control and monitoring (applying quality indicators to the current degree of quality), and quality management (coordinated measures and activities with regard to quality) are conceptually distinct, in practice they are frequently used as if identical and hardly separable. There is a dearth of controlled trials addressing ways to optimize quality assurance in mental healthcare. Altogether, seventeen recommendations were developed addressing a range of aspects of quality assurance in mental healthcare, which appear usable across Europe. These were divided into recommendations about structures, processes and outcomes. Each recommendation was assigned to a hierarchical level of analysis (macro-, meso- and micro-level).
There was a lack of evidence retrievable by a systematic literature search about quality assurance of mental healthcare. Therefore, recommendations with mostly medium evidence levels could be developed only after additional topics and hand searches had been included.
Evidence-based graded recommendations for quality assurance in mental healthcare were developed which should next be implemented and evaluated for feasibility and validity in some European countries. Due to the small evidence base identified corresponding to the practical obscurity of the concept and methods, a European research initiative is called for by the stakeholders represented in this Guidance to improve the educational, methodological and empirical basis for a future broad implementation of measures for quality assurance in European mental healthcare.
There are limited published data from long-term pediatric bipolar clinical trials with which to guide appropriate treatment decisions. The long-term efficacy and safety of aripiprazole were investigated in this patient population.
In total, 296 youths aged 10–17 years with a DSM-IV diagnosis of bipolar I disorder were randomized to receive placebo or aripiprazole (10 mg or 30 mg) in a 4-week double-blind trial. Completers continued their assigned treatments for an additional 26 weeks (double-blind). Efficacy endpoints included mean change from baseline to week 4 and week 30 on the Young Mania Rating Scale, Children's Global Assessment Scale, Clinical Global Impressions-Bipolar version severity scale, General Behavior Inventory and Attention Deficit Hyperactivity Disorder Rating Scale, and time to discontinuation. Tolerability/safety assessments included the incidence and severity of AEs, blood chemistries and metabolic parameters.
Over the 30-week course of double-blind treatment, aripiprazole (10 mg and 30 mg) was superior to placebo as early as week 1 (p<0.002) and at all scheduled visits from week 2 through week 30 on mean change from baseline in the Y-MRS total score (p<0.0001; all visits). Significant improvements were observed on multiple endpoints including the CGAS, GBI, CGI-BP, ADHD-RS-IV total score, time to discontinuation, and response and remission rates. The 3 most common AEs were somnolence, extrapyramidal disorder, and fatigue. Mean change in body weight z-scores over 30 weeks was not clinically significant.
Over 30 weeks of treatment, both doses of aripiprazole were superior to placebo in the long-term treatment of pediatric bipolar patients. Aripiprazole was generally well tolerated.
The Health Utilities Index-Mark 2 (HUI2), a generic instrument for assessing health status, is an important effectiveness input for pharmacoeconomic modelling. It has not previously been used in patients with attention deficit/hyperactivity disorder (ADHD).
To use HUI2 to assess health utility in patients aged 6–17 years with ADHD receiving the prodrug stimulant lisdexamfetamine dimesylate (LDX).
SPD489-325 was a 7-week, randomized, double-blind, placebo-controlled trial of LDX, with osmotic-release oral system methylphenidate (OROS-MPH) as a reference treatment. Patients’ parents or guardians completed HUI2 questionnaires at baseline and at weeks 4 and 7. Utilities were estimated for treatment responders and non-responders, with response defined as a Clinical Global Impressions-Improvement (CGI-I) score of 1 or 2, or a ≥25% or ≥30% reduction in ADHD Rating Scale IV (ADHD-RS-IV) total score.
Of 336 patients randomized, 317 were included in the full analysis set (LDX, n=104; OROS-MPH, n=107; placebo, n=106) and 196 completed the study. At endpoint, mean HUI2 utility scores across all treatment groups were higher for responders than non-responders when response was based on CGI-I score (responders: 0.896 [SD, 0.0990]; non-responders: 0.838 [0.1421]), on a ≥25% reduction in ADHD-RS-IV score from baseline (responders, 0.899 [0.0969]; non-responders, 0.809 [0.1474]), or on a ≥30% reduction in ADHD-RS-IV score from baseline (responders, 0.902 [0.0938]; non-responders 0.814 [0.1477]).
The HUI2 instrument is sensitive to treatment response in the child and adolescent ADHD patient population. Health utilities generated using HUI2 are therefore suitable for cost effectiveness evaluations of ADHD medications.
GXR, a selective α2A-adrenergic agonist, is a non-stimulant treatment for ADHD (approved in the USA for children and adolescents and in Canada for children).
To assess the efficacy (symptoms and function) and safety of dose-optimized GXR compared with placebo in children and adolescents with ADHD.
To evaluate the efficacy (symptom and function) and safety of GXR for the treatment of ADHD. An atomoxetine (ATX) arm was included to provide reference data against placebo (NCT01244490).
Patients (6–17 years) were randomly assigned at baseline to dose-optimized GXR (6–12 years, 1–4 mg/day; 13–17 years, 1–7 mg/day), ATX (10–100 mg/day) or placebo for 4 or 7 weeks. The primary efficacy measure was change from baseline in the ADHD Rating Scale-version IV (ADHD-RS-IV). Key secondary measures were the Clinical Global Impressions-Improvement (CGI-I) and the Weiss Functional Impairment Rating Scale-Parent (WFIRS-P). Safety assessments included treatment-emergent adverse events (TEAEs), electrocardiograms, and vital signs.
Of 338 patients randomized, 272 (80.5%) completed the study. Placebo-adjusted differences in least squares (LS) mean in ADHD-RS-IV total score, percent improvement versus placebo for CGI-I, placebo-adjusted differences in LS mean change from baseline in WFIRS-P score (family and learning and school domains) are shown in the Table. The most common TEAEs for GXR were somnolence, headache, and fatigue; 8 (7%) TEAEs were severe.
GXR was effective and well tolerated in children and adolescents with ADHD.
Table. Efficacy outcomes versus placebo.
Placebo-adjusted difference in LS mean change from baseline in ADHD-RS-IV total score (95% CI, p-value; effect size): GXR, −8.9 (−11.9, −5.8; p<0.001; 0.76); ATX, −3.8 (−6.8, −0.7; p<0.05; 0.32).
Difference in improvement from placebo for CGI-I (95% CI, p-value): GXR, 23.7% (11.1, 36.4; p<0.001); ATX, 12.1% (−0.9, 25.1; p<0.05).
Placebo-adjusted difference in LS mean change from baseline in WFIRS-P learning and school domain score (95% CI, p-value; effect size): GXR, −0.22 (−0.36, −0.08; p<0.01; 0.42); ATX, −0.16 (−0.31, −0.02; p<0.05; 0.32).
Placebo-adjusted difference in LS mean change from baseline in WFIRS-P family domain score (95% CI, p-value; effect size)
To evaluate the efficacy and long-term safety of investigational aripiprazole once-monthly (ARI-OM) for maintenance treatment in schizophrenia.
Patients requiring chronic treatment for schizophrenia, not on aripiprazole monotherapy, were cross-titrated from other antipsychotic(s) to aripiprazole in an oral conversion phase (Phase 1). All patients entered an oral aripiprazole stabilization phase (Phase 2). Patients meeting stability criteria entered an ARI-OM stabilization phase (Phase 3), with coadministration of oral aripiprazole for the first 2 weeks. Patients meeting stability criteria were randomized to ARI-OM or placebo once-monthly (placebo-OM) during a 52-week, double-blind maintenance phase (Phase 4). Primary endpoint was time-to-impending relapse. Safety and tolerability were also assessed.
In total, 710 patients entered Phase 2, 576 entered Phase 3 and 403 entered Phase 4 (ARI-OM, n=269; placebo-OM, n=134). The study was terminated early because efficacy was demonstrated in a pre-planned interim analysis. Time-to-impending relapse was significantly delayed with ARI-OM vs placebo-OM (p<0.0001, log-rank test). Discontinuations due to treatment-emergent adverse events (AEs) were: Phase 1, 3.8% (n=24/632); Phase 2, 3.0% (n=21/709); Phase 3, 4.9% (n=28/576); Phase 4, 7.1% (n=19/269). Most AEs were mild or moderate. Insomnia was the only AE with >5% incidence in any phase. Headache, somnolence, and nausea had a peak first onset within the first 4 weeks of treatment. There were no unusual shifts in any phase in laboratory values, fasting metabolic parameters, weight, or objective scales of movement disorders.
ARI-OM significantly delayed time-to-impending relapse compared with placebo-OM and was well tolerated as maintenance treatment in schizophrenia.
Schizophrenia is a serious mental illness that carries a significant burden for families providing care.
The ADHES carers' survey canvassed opinions of families/friends of patients with schizophrenia across Europe.
To ascertain carer attitudes towards schizophrenia, its treatment and treatment adherence.
The survey was conducted from January to April 2011 in 16 European countries, comprising 10 questions relating to the respondents' understanding of schizophrenia, attitudes towards schizophrenia treatments, and perception of the family's/friend's role in supporting patients with schizophrenia.
Results were obtained from 138 respondents. Of these carers, 76% recognized the importance of medication to help patients get better, improve their quality of life (77%) and improve relationships (74%), while 67% believed that schizophrenia treatment damages patients' general health. Two-thirds of the carers reported that treatment adherence was a burden for the patient, and over a third indicated that it was a daily struggle to get patients to take their medication. Half of the carers considered the benefits offered by long-acting injectable antipsychotics very/quite important, suggesting these formulations could provide a valuable tool for improving treatment adherence. Overall, 92% of carers agreed on the importance of family support to boost treatment adherence, with education/information deemed important for families and patients alike.
Carers recognize the issues they face in caring for patients with schizophrenia and their role in improving partial/non-adherence to medication, especially to avoid suboptimal treatment outcomes. The important role of family carers should be considered by healthcare professionals when treating patients with schizophrenia.
GXR, a selective α2A-adrenergic agonist, is a non-stimulant ADHD treatment approved in the USA for children and adolescents, and in Canada for children.
To evaluate long-term maintenance of efficacy of GXR in children and adolescents with ADHD who respond to an initial open-label, short-term trial.
To determine if there is a higher rate of treatment failure for placebo vs GXR during the double-blind randomised-withdrawal phase (RWP) (NCT01081145).
Patients (6–17 years) meeting DSM-IV-TR criteria for ADHD, baseline ADHD Rating Scale-IV (ADHD-RS-IV) ≥32 and Clinical Global Impressions-Severity (CGI-S) ratings ≥4 were enrolled. Following 7-week dose optimization and 6-week maintenance periods on open-label GXR (1–7 mg/day), eligible patients entered a 26-week, double-blind, RWP with GXR or placebo. The primary endpoint was rate of treatment failure (≥50% increase in ADHD-RS-IV total score and ≥2-point increase in CGI-S at two consecutive visits, compared to the RWP baseline). The key secondary endpoint was time-to-treatment failure. Safety assessments included treatment-emergent adverse events (TEAEs), electrocardiograms and vital signs.
Of 528 patients enrolled, 316 (60.0%) entered the RWP. At study end, 49.3% (GXR) and 64.9% (placebo) of patients had relapsed (95% CI, −26.6 to −4.5; p<0.01) (Figure). Time-to-treatment failure was 56 days (placebo) versus 218 days (GXR), p=0.003. During the RWP, the most common GXR TEAEs (≥5% of patients) were headache, somnolence and nasopharyngitis.
GXR demonstrated long-term maintenance of efficacy versus placebo in children and adolescents with ADHD.
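Time-to-treatment-failure endpoints such as the 56 vs 218 days above are conventionally summarized with Kaplan-Meier curves and compared by a log-rank test. A minimal Kaplan-Meier estimator, sketched on made-up failure/censoring times (not study data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: failure or censoring time per subject;
    events: 1 = treatment failure observed, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        failures = sum(e for tt, e in data if tt == t)
        removed = sum(1 for tt, _ in data if tt == t)
        if failures:
            surv *= 1 - failures / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

# Made-up failure/censoring times in weeks (NOT study data).
times = [4, 8, 8, 12, 16, 20, 26, 26, 26, 26]
events = [1, 1, 0, 1, 1, 0, 0, 0, 0, 0]
print(kaplan_meier(times, events))
```

Each observed failure time multiplies the running survival probability by the fraction of at-risk subjects who did not fail at that time; censored subjects leave the risk set without reducing survival.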
There is strong evidence that foods containing dietary fibre protect against colorectal cancer, resulting at least in part from its anti-proliferative properties. This study aimed to investigate the effects of supplementation with two non-digestible carbohydrates, resistant starch (RS) and polydextrose (PD), on crypt cell proliferative state (CCPS) in the macroscopically normal rectal mucosa of healthy individuals. We also investigated relationships between expression of regulators of apoptosis and of the cell cycle on markers of CCPS. Seventy-five healthy participants were supplemented with RS and/or PD or placebo for 50 d in a 2 × 2 factorial design in a randomised, double-blind, placebo-controlled trial (the Dietary Intervention, Stem cells and Colorectal Cancer (DISC) Study). CCPS was assessed, and the expression of regulators of the cell cycle and of apoptosis was measured by quantitative PCR in rectal mucosal biopsies. SCFA concentrations were quantified in faecal samples collected pre- and post-intervention. Supplementation with RS increased the total number of mitotic cells within the crypt by 60 % (P = 0·001) compared with placebo. This effect was limited to older participants (aged ≥50 years). No other differences were observed for the treatments with PD or RS as compared with their respective controls. PD did not influence any of the measured variables. RS, however, increased cell proliferation in the crypts of the macroscopically normal rectum of older adults. Our findings suggest that the effects of RS on CCPS are specific not only to the dose, the type of RS and health status, but are also influenced by age.
Wild oat (Avena fatua L.) is one of the most problematic weed species in western Canada due to widespread populations, herbicide resistance, and seed dormancy. In wheat (Triticum aestivum L.), and especially in shorter crops such as lentil (Lens culinaris Medik.), A. fatua seed panicles elongate above the crop canopy, which can facilitate physical cutting of the panicles (clipping) to reduce viable seed return to the seedbank. However, the viability of A. fatua seed at the time of panicle elongation is not known. The objective of this study was to determine the viability of A. fatua seed at successive time intervals after elongation above a wheat or lentil crop canopy. A 2-yr panicle clipping and removal study in wheat and lentil was conducted in Lacombe, AB, and Saskatoon, SK, in 2015 and 2016 to determine the onset of viability in A. fatua seeds at successive clipping intervals. Manual panicle clipping of A. fatua panicles above each crop canopy began when the majority of panicles were visible above respective crop canopies and continued weekly until seed shed began. At the initiation of panicle clipping, A. fatua seed viability was between 0% and 10%. By the last clipping treatment (approximately 6 to 7 wk after elongation), 95% of the A. fatua seeds were viable. Seed moisture and awn angle were not good predictors of A. fatua viability, and therefore were unlikely to provide effective tools to estimate appropriate timing for implementation of A. fatua clipping as a management technique. Based on A. fatua seed viability, earlier clipping of A. fatua is likely to be more effective in terms of population management and easier to implement in shorter crops such as lentil. Investigations into long-term effects of clipping on A. fatua populations are needed to evaluate the efficacy of this management strategy on A. fatua.
Coexistence of people and large carnivores depends on a complex combination of factors that vary geographically. Both the number and range of the Asiatic lion Panthera leo leo in the Greater Gir landscape, India, have increased since the 1990s. The challenge has been managing the success of conservation, with a particular focus on the spillover population ranging extensively in human-dominated landscapes. To understand the factors conducive to lion survival in this landscape, we undertook an interview-based survey. Overall, people expressed positive, tolerant attitudes towards lions. There was a distinct contrast between people's liking for lions (76.9% of respondents) compared to leopards (27.7%) in spite of greater depredation of livestock by lions (82.6%) than by leopards (17.4%). Younger people and respondents having greater awareness regarding lions expressed positive attitudes. Although community discussions on lions had a positive effect, there was no evidence that land-holding, management interventions, personal encounters with lions, or association of lions with religion affected attitudes. Respondents who had experienced livestock depredation tended to express negative attitudes. Respondents with positive attitudes towards lions favoured non-interventionist strategies for managing lions in the village areas. We advocate consideration of varied factors influencing tolerance of wildlife in conservation planning. We emphasize that site-specific human–wildlife conflict issues such as crop-foraging by wild ungulates and variation in attitudes towards different species should also be considered. Specifically, improved livestock management, motivation of local youth and their participation in awareness campaigns could all further strengthen the prevalent positive attitudes towards lions.