Introduction: Although acute gastroenteritis is an extremely common childhood illness, there is a paucity of literature characterizing the associated pain and its management. Our primary objective was to quantify the pain experienced by children with acute gastroenteritis in the 24 hours prior to emergency department (ED) presentation. Secondary objectives included describing maximum pain, analgesic use, discharge recommendations, and factors that influenced analgesic use in the ED. Methods: Study participants were recruited into this prospective cohort study by the Alberta Provincial Pediatric EnTeric Infection TEam between January 2014 and September 2017. The study was conducted at two Canadian pediatric EDs: the Alberta Children's Hospital (Calgary) and the Stollery Children's Hospital (Edmonton). Eligibility criteria included age < 18 years, acute gastroenteritis (≥ 3 episodes of diarrhea or vomiting in the previous 24 hours), and symptom duration ≤ 7 days. The primary study outcome, caregiver-reported maximum pain in the 24 hours prior to presentation, was assessed using the 11-point Verbal Numerical Rating Scale. Results: We recruited 2136 patients; the median age was 20.8 months (IQR 10.4, 47.4) and 45.8% (979/2136) were female. In the 24 hours prior to enrolment, 28.6% (610/2136) of caregivers reported that their child experienced moderate (4-6) pain and 46.2% (986/2136) reported severe (7-10) pain. During the emergency visit, 31.1% (664/2136) described pain as moderate and 26.7% (571/2136) as severe. In the ED, analgesia was provided to 21.2% (452/2131) of children. The most commonly administered analgesics in the ED were ibuprofen (68.1%, 308/452) and acetaminophen (43.4%, 196/452); at home, acetaminophen was most commonly administered (77.7%, 700/901), followed by ibuprofen (37.5%, 338/901).
Factors associated with analgesic use in the ED were greater pain scores during the visit, having a primary-care physician, shorter illness duration, fewer diarrheal episodes, presence of fever, and hospitalization. Conclusion: Although children presenting to the ED with acute gastroenteritis experience moderate to severe pain, both prior to and during their emergency visit, analgesic use is limited. Future research should focus on appropriate pain management through the development of effective and safe pain treatment plans.
Introduction: Intravenous insertion (IVI) is identified by children as extremely painful, and the resultant distress can have lasting negative consequences. There is an urgent need to effectively manage such procedures. Our primary objective was to compare the pain and distress of IVI with the addition of humanoid robot-based distraction to standard care, versus standard care alone. Methods: This two-armed randomized controlled trial (RCT) was conducted from April 2017 to May 2018 at the Stollery Children's Hospital emergency department (ED). Children aged 6 to 11 years who required IVI were included. Exclusion criteria included hearing or visual impairments, neurocognitive delays, sensory impairment to pain, previous enrolment, and discretion of the ED clinical staff. Primary outcomes were measured using the Observational Scale of Behavioural Distress-Revised (OSBD-R) (distress) and the Faces Pain Scale-Revised (FPS-R) (pain). A total of 426 pediatric patients were screened and 340 were excluded. Results: We recruited 86 children, of whom 55% (47/86) were male; 9% (7/82) were premature at birth; 82% (67/82) had a previous ED visit; 30% (25/82) required previous hospitalization; 78% (64/82) had previous IV placement and 96% (78/81) received topical anesthesia. The mean total OSBD-R score was 1.49 ± 2.36 (standard care) compared to 0.78 ± 1.32 (robot group) (p = 0.047). The median FPS-R score during the IV procedure was 4 (IQR 2,6) in the standard care group, compared to 2 (IQR 0,4) with the addition of humanoid robot-based distraction (p = 0.10). Change in parental state anxiety pre-procedure versus post-procedure was not significantly different between groups (p = 0.49). Parental satisfaction with the IV start was 93% (39/42) in the robot arm compared to 74% (29/39) in the standard care arm (p = 0.03).
Parents were also more satisfied with the management of their child's pain in the robot group (95% very satisfied) compared with standard care (72% very satisfied) (p = 0.002). Conclusion: A statistically significant reduction in distress was observed with the addition of robot-based distraction to standard care. Humanoid robot-based distraction therapy reduces distress and, to a lesser extent, pain in children undergoing IVI in the ED. Further trials are required to confirm utility in other age groups and settings.
Introduction: Inadequate pain management in children is ubiquitous in the emergency department (ED). As the current national opioid crisis has highlighted, physicians must balance pain management against the risk of long-term opioid dependence. This study aimed to describe the willingness of pediatric emergency physicians (PEPs) to prescribe opioids to children in the ED and at discharge. Methods: A unique survey tool was created using published methodology guidelines. Information regarding practices, knowledge, attitudes, perceived barriers, facilitators, and demographics was collected. The survey was distributed to all physician members of Pediatric Emergency Research Canada (PERC), using a modified Dillman's Tailored Design method, from October to December 2017. Results: The response rate was 49.7% (124/242); 53% (57/107) were female, the mean age was 43.6 years (+/− 8.7), and 58% (72/124) had pediatric emergency subspecialty training. The most common first-line ED pain medication was ibuprofen for mild, moderate, and severe musculoskeletal injury (MSK-I)-related pain (94.4% (117/124), 89.5% (111/124), and 62.9% (78/124), respectively). For moderate and severe MSK-I, intranasal fentanyl was the most common opioid for first-line (35.5% (44/124) and 61.3% (76/124), respectively) and second-line pain management (41.1% (51/124) and 20.2% (25/124), respectively). An opioid protocol was reported as helpful by 74.8% (89/119) of PEPs, specifically for morphine, fentanyl, and hydromorphone. Using a 0-100 scale, physicians minimally worried about physical dependence (13.3 +/−19.3), addiction (16.6 +/−19.8), and diversion of opioids (32.8 +/−26.4) when prescribing short-term opioids to children. They reported that the current opioid crisis minimally influenced their willingness to prescribe opioids (30.0 +/−26.2). Physicians reported rarely (36%; 45/125) or never (28%; 35/125) completing a screening risk assessment prior to prescribing opioids.
Conclusion: Ibuprofen remains the most common medication recommended for MSK-I pain in the ED and at discharge. Intranasal fentanyl was the top opioid for all pain intensities. PEPs are minimally concerned regarding dependence, addiction, and the current opioid crisis when prescribing short-term opioids to children. There is an urgent need for robust evidence regarding the dependence and addiction risk for children receiving short-term opioids in order to create knowledge translation tools for ED physicians. Opioid-specific protocols for use both in the ED and at discharge would likely improve physician comfort in responsible and adequate pain management for children.
Individual increase in inbreeding coefficients (ΔFi) has been recommended as an alternative measure of inbreeding: it accounts for differences in the pedigree knowledge of individual animals and avoids overestimation due to an increased number of known generations. The effects of inbreeding (F) and of equivalent inbreeding (EF), calculated from ΔFi, on growth traits were studied in the Nilagiri and Sandyno flocks of sheep. The study was based on data maintained at the Sheep Breeding Research Station, Sandynallah. Pedigree information and the equivalent number of generations were lower in Sandyno than in Nilagiri sheep. The average F and EF for the Nilagiri population were 2.17 and 2.44, respectively; the corresponding values for Sandyno sheep were 0.83 and 0.84. The trend of inbreeding over the years in both populations indicated that EF was higher during earlier generations, when pedigree information was shallow. Among the significant effects of inbreeding, the depression in growth per 1 percent increase in inbreeding ranged from 0.04 kg in weaning weight to 0.10 kg in yearling weight. In general, more traits were affected by inbreeding in Nilagiri sheep, in which greater regression of growth traits was noticed with F than with EF. Higher values of EF than F in earlier generations in both populations indicate that EF avoided the potential overestimation of the inbreeding coefficient in recent generations. In the Sandyno population, the magnitude of depression noticed among growth traits with significant effects of inbreeding was higher. The differences in response to F and EF in the two populations, and the possible causes of the trait-wise differences in response, are discussed.
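The quantities used in this abstract and the related Nilagiri pedigree study can be illustrated numerically. The following is a minimal sketch, not taken from the papers themselves: it assumes the commonly used convention in which an individual's increase in inbreeding is the per-generation rate that, compounded over its equivalent complete generations of known pedigree, yields its observed inbreeding coefficient, and that a realized effective population size is derived from the mean of these increases. Function names and the worked numbers are illustrative.

```python
def delta_f(f_i, t_i):
    """Individual increase in inbreeding.

    f_i: inbreeding coefficient as a fraction (e.g. 0.0217 for 2.17%).
    t_i: equivalent complete generations known for the individual.
    Returns the per-generation rate d such that
    1 - (1 - d) ** (t_i - 1) == f_i.
    """
    if t_i <= 1:
        return 0.0  # no pedigree depth: no measurable increase
    return 1.0 - (1.0 - f_i) ** (1.0 / (t_i - 1.0))


def effective_population_size(mean_delta_f):
    """Realized Ne from the mean individual increase in inbreeding."""
    return 1.0 / (2.0 * mean_delta_f)


# An animal with F = 2.17% and 7.12 equivalent generations of pedigree
d = delta_f(0.0217, 7.12)          # roughly 0.0036 per generation
ne = effective_population_size(d)  # roughly 140
```

Because ΔFi shrinks as pedigree depth grows, averaging it over animals with unequal pedigree knowledge avoids the overestimation of inbreeding in deeply recorded recent generations that the abstract describes.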
To compare the outcomes of two types of tracheostomy tubes used in major head and neck surgery.
A retrospective study was conducted of prospectively collected data. The post-operative safety and adequacy of a single cannula tracheostomy tube was compared to a double cannula tracheostomy tube in patients undergoing tracheostomy during major oral and oropharyngeal resections.
Out of 46 patients with the single cannula tube, 7 (15 per cent) experienced significant obstruction warranting immediate tube removal, while another 9 (20 per cent) needed a change of tube or tube re-insertion for continued airway protection. In contrast, out of 50 patients with the double cannula tube, the corresponding numbers were 0 (p = 0.004) and 1 (2 per cent; p = 0.007) respectively.
Insertion of a double cannula (instead of a single cannula) tracheostomy tube in the course of major oral and oropharyngeal resections offers better airway protection during the post-operative period.
Single crystals of urea-oxalic acid (UOA) have been grown from aqueous solution by the slow evaporation technique. Single-crystal X-ray diffraction analysis confirmed that the grown crystals belong to the monoclinic system with space group P21/c. The presence of functional groups was confirmed using Fourier transform infrared (FTIR) spectroscopy. Optical absorption studies show very low absorption over the entire visible region, and the UV cut-off is found to be around 240 nm. Thermal analysis (TG/DTA) shows that the grown crystal is thermally stable up to 180 °C. Dielectric studies confirm the ferroelectric property of the material, and the very low dielectric loss indicates the high purity of the crystal.
The Nilagiri sheep is a dual-utility breed (fine wool and meat) native to the Nilagiri hills of Tamil Nadu, known for its adaptability to high altitude and to low-input rearing systems. At present the breed is endangered, with fewer than a thousand animals remaining, of which about 50 percent are maintained at the Sheep Breeding Research Station, Sandynallah. Efforts are under way to conserve the breed in situ. Generation interval (GI), pedigree completeness level, inbreeding coefficient (F), average relatedness (AR), effective population size (Ne), and effective number of founders (fe) and ancestors (fa) were studied for the breed. Pedigree analysis was carried out using data available at the research station on 5 051 animals from 1965 onwards, using ENDOG ver. 4.8. High pedigree completeness (more than 80 percent for the 5th generation), balance in the percentage of ancestors between sire and dam pathways, and a high number of equivalent complete generations (7.12) for the reference population were indicative of the depth of the pedigree. The GI, F, and AR were 3.36 years, 2.17 percent, and 3.45 percent, respectively. Ne based on the maximum number of generations and on the individual increase in inbreeding was 298.83 and 97.25, respectively. fe and fa were 59 and 41, respectively, for the reference population. F was far from critical values of inbreeding, and the fe/fa ratio indicated the absence of stringent bottlenecks. The effective population size was at the higher end of the range reported for endangered sheep breeds. Knowledge of genetic diversity and effective population size would support the conservation of the breed.
The risk of stroke is elevated in the days to weeks after a transient ischemic attack (TIA). A variety of rules to predict stroke risk have been suggested. In Alberta, a triage algorithm to facilitate urgent access based on risk level was agreed upon for the province. Patients with an ABCD2 score ≥ 4, or motor or speech symptoms lasting greater than five minutes, or with atrial fibrillation were considered high risk (the ASPIRE approach). We assessed the ability of the ASPIRE approach to identify patients at risk for stroke.
We retrospectively reviewed charts from 573 consecutive patients diagnosed with TIA in Foothills Hospital emergency room from 2002 through 2005. We recorded clinical and event details and identified the risk of stroke at three months.
Among the 573 patients, the 90-day risk of stroke was 4.7% (95% CI 3.0%, 6.4%), and 78% of patients were identified as high risk using this approach. In patients defined as high risk by the ASPIRE approach there was a 6.3% (95% CI 4.2%, 8.9%) risk of stroke. In patients defined as low risk by the ASPIRE approach there were no recurrent strokes (100% negative predictive value). In contrast, two patients with low ABCD2 scores (ABCD2 score < 4) suffered recurrent strokes.
The ASPIRE approach had a perfect negative predictive value for stroke in this population. However, this high sensitivity comes at the cost of identifying most patients as high risk.
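The ASPIRE triage rule evaluated above is a simple disjunction of three criteria. As a minimal sketch (the function and parameter names are illustrative, not from the study), it can be expressed as:

```python
def aspire_high_risk(abcd2_score, motor_or_speech_over_5min, atrial_fibrillation):
    """ASPIRE approach: a TIA patient is triaged as high risk if the
    ABCD2 score is >= 4, OR motor/speech symptoms lasted more than
    five minutes, OR the patient has atrial fibrillation."""
    return (abcd2_score >= 4
            or motor_or_speech_over_5min
            or atrial_fibrillation)


# A patient with a low ABCD2 score but atrial fibrillation
# is still classified as high risk under this rule.
print(aspire_high_risk(3, False, True))  # True
```

Because any one criterion suffices, the rule trades specificity for sensitivity, which is consistent with the finding that 78% of patients were classified as high risk.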
Sixty per cent of the global elderly population live in low- and middle-income countries, and this proportion was expected to rise to 70% by 2010 (International Institute of Ageing, 2001; Ferri et al., 2005). The 2001 Indian census found over 70 million people aged 60 years or more (considered senior citizens according to the Indian National Policy on Older Persons). Most of those senior citizens live with younger family members and are dependent on them for financial and social support. Hence, any physiological and psychological changes in the older family members affect the younger supporting members as well.
Greater yam (Dioscorea alata), a popular crop in India, is cultivated widely in Orissa state. In spite of the availability of several improved varieties, farmers prefer the local landraces. An investigation was carried out to identify whether the varietal preferences of yam farmers differed between two production systems, subsistence and commercial. While the subsistence farmers demanded yam varieties adaptable to a wide range of soils, the commercial farmers preferred anthracnose-resistant cultivars. This study demonstrated that the farmers' varietal preferences were strongly influenced by the production system. Identifying the convergence or divergence of varietal preferences across production systems can help breeders develop high-impact varieties.
Decision-making in agricultural production is a complex process in which many risks need to be considered for an informed decision to be made. In many parts of the world, weather and climate are among the biggest production risks and uncertainty factors affecting the performance and management of agricultural systems. Farmers around the world, especially those in developing countries, have been trying to adapt to variable weather and climate conditions through various risk-management strategies. Improved weather and climate information, supplied to the farming community in a timely manner, can greatly assist farmers in their operational decisions.
During an Inter-Regional Workshop on Improving Agrometeorological Bulletins in Barbados in October 2001, participants recommended that a dedicated web server be developed for distributing agrometeorological products from WMO members. Subsequently, an Expert Group Meeting on Internet Applications for Agrometeorological Products was held in Washington, DC, during May 2002, to discuss the practical steps needed to develop this web server. Discussions from this meeting led to the development of the World AgroMeteorological Information Service (WAMIS). WAMIS is a dedicated web server on which countries and organisations can place their agrometeorological bulletins and advisories. Providing such a central location for agrometeorological information enables users to quickly and easily evaluate various bulletins and gain insight into improving their own. These bulletins also represent the expert knowledge of the individual countries and, especially when an archive of bulletins is present, make it possible to assess extreme events and disasters in a historical perspective. Placement of agrometeorological bulletins on WAMIS also increases the visibility of the National Meteorological/Hydrological Services (NMHS). In March 2005, a tools and resources section was added to WAMIS to provide users with additional papers, links to software tools, Internet links, and other resources to help them improve their agrometeorological bulletins, advisories, and services.
Agrometeorological information and services from the National Meteorological and Hydrological Services (NMHSs) are increasingly being demanded by the farming community to cope more efficiently with climate variability and the increasing incidence of extreme meteorological events such as droughts, floods, frosts and wind erosion. While considerable advances have been made in the collection, archiving and analysis of weather and climate data, their transformation into information that can be readily used by the farm sector has lagged behind, especially in developing countries where such information needs are the greatest. One of the important reasons is the lack of adequate interaction with the user community in assessing the appropriate dissemination and communication procedures that can enhance the value of the agrometeorological information and services. A brief review of the present status of dissemination and communication of agrometeorological information by the NMHSs and associated agencies in different regions around the world is presented. A description of the user communities for agrometeorological information and their varying needs is also presented. Opportunities and challenges in the dissemination and communication of agrometeorological information by the NMHSs are described with suitable examples which emphasize that continued improvements are necessary to make agrometeorological information more accessible and useful to the user community.
Background. We carried out a large randomized trial of a brief form of cognitive therapy, manual-assisted cognitive behaviour therapy (MACT) versus treatment as usual (TAU) for deliberate self-harm.
Method. Patients presenting with recurrent deliberate self-harm in five centres were randomized to either MACT or TAU and followed up over 1 year. MACT patients received a booklet based on cognitive behaviour therapy (CBT) principles and were offered up to five plus two booster sessions of CBT from a therapist in the first 3 months of the study. Ratings of parasuicide risk, anxiety, depression, social functioning and global function, positive and negative thinking, and quality of life were measured at baseline and after 6 and 12 months.
Results. Four hundred and eighty patients were randomized. Sixty per cent of the MACT group had both the booklet and CBT sessions. There were seven suicides, five in the TAU group. The main outcome measure, the proportion of those repeating deliberate self-harm in the 12 months of the study, showed no significant difference between those treated with MACT (39%) and treatment as usual (46%) (OR 0·78, 95% CI 0·53 to 1·14, P=0·20).
Conclusion. Brief cognitive behaviour therapy is of limited efficacy in reducing self-harm repetition, but the findings, taken in conjunction with the economic evaluation (Byford et al. 2003), indicate superiority of MACT over TAU in terms of cost and effectiveness combined.
Fortification of salt with iron has been developed by the National Institute of Nutrition (NIN) as a strategy for the control of iron deficiency anaemia (IDA) in India, similar to the iodization of salt for the control of iodine deficiency disorders (IDD). The stability of the iron fortified salt (IFS), its bioavailability, and the organoleptic acceptability of food items containing IFS have been demonstrated, as have its acceptability and effectiveness in school children and in multicentric community trials. With the introduction of universal iodization of salt as a national policy in 1988, NIN developed a formulation for double fortification of salt (DFS) with iodine and iron. The stability of the nutrients under laboratory conditions and their bioavailability were found to be good, but varied with the quality of salt used. The DFS has been evaluated in controlled trials in tribal communities and in residential school children, and the findings of these studies are discussed. Overall, in these trials, DFS effectively controlled iodine deficiency, but a clear impact on reducing anaemia was not demonstrated. In residential schoolchildren, increased urinary excretion of iodine as well as reduced anaemia were observed. The quality of salt has been found to be an important determinant of the stability of iodine in DFS. Further evaluation of this potentially important intervention is in progress.
The regional cerebral blood flow (rCBF) response to the Wisconsin Card Sort Test (WCST) has been used to assess the functional integrity of the prefrontal cortex in patients with schizophrenia.
In this study, patients were divided into two groups according to whether they had made few or many perseverative errors on a modified version of the WCST. A control group consisted of normal volunteers. The groups were then compared with respect to rCBF response to WCST activation.
rCBF was measured during administration of a modified version of the WCST and during a card sorting control task, using single photon emission computerised tomography (SPECT).
Performance of the modified WCST was associated with a widespread and substantial increase in rCBF, particularly in the frontal region. The poorly performing group of patients with schizophrenia showed only a modest increase in rCBF in the left anterior cingulate region.
Subjects with schizophrenia are able to respond to specific neuropsychological challenge with activation of the frontal regions.
A comprehensive study was conducted over a 4-year period (1984–87) to evaluate the water use, growth, and yield responses of pearl millet (Pennisetum glaucum (L.) R. Br.) cv. CIVT grown with and without fertilizer (30 kg P2O5 and 45 kg N ha−1) at the ICRISAT Sahelian Centre, Sadoré, Niger. Our study showed significant year and fertilizer effects on the growth and yield of millet at the study site. The observed year effects were primarily due to variations in the amount and distribution of rainfall in relation to the potential demand for water. During 1984, 1985 and 1987, total rainfall was below the long-term average, while in 1986 it was above average. While the onset of the rains (relative to the average date of onset) was early from 1984 to 1986, in 1987 the sowings were delayed by as much as 33 days. Of the four years, the separation between the treatments in cumulative evaporation was most evident in 1984, a drought year with below-average rainfall in all the months from June to September. Cumulative evaporation patterns in 1985 and 1986 were similar because of regular rains and high average rainfall per rainy day from June to October. In 1987, sowings were delayed until 15 July and only 6·9 mm of rainfall was received per rainy day in July; hence cumulative evaporation was initially low and increased significantly only after two substantial rain events in early August. There was a large response to fertilizer in all years, as small additions of fertilizer phosphate increased the soluble phosphate in the soil. Fertilizer application resulted in a small increase in water use (7–14%) in all years except 1987. The increased yield due to the application of fertilizer was accompanied by an increase in water-use efficiency (WUE) in all four years, with the largest increase in 1985. The beneficial effect of fertilizers could be attributed to the rapid early growth of leaves, which can reduce soil evaporative losses and increase WUE. Over the four seasons, the average increase in WUE due to the addition of fertilizer was 84%.
The response of four cowpea (Vigna unguiculata (L.) Walp.) cultivars to the warm, semi-arid tropical environment at the ICRISAT Sahelian Center at Sadore, Niger, was studied during 1985 and 1986 in terms of leaf area index (LAI), dry matter (DM) accumulation, net photosynthesis, stomatal conductance, total water use, and yield. Among the three improved cultivars, IT82D–716 is early and erect, cv. IT83S–947 is early and spreading, and cv. TVX4659–03E is a medium-duration, high-yielding, dual-purpose type. The local cv. Sadore Local is a long-duration, photosensitive, spreading type used mainly for fodder. In both years, Sadore Local recorded the highest LAI. IT82D–716 and IT83S–947 produced < 1·3 t/ha of DM in both years, whereas TVX 4659–03E produced > 2 t/ha of DM and proved superior to Sadore Local in partitioning DM into pods. The four cultivars did not differ significantly either in stomatal conductance or in net photosynthetic rates. The observed maximum photosynthetic rates of c. 20 μmol/m2/s lie at the bottom of the range of 21–38 μmol/m2/s reported for 31 cowpea genotypes in an earlier study. Photosynthetic rates increased with increasing photon flux density. TVX4659–03E had an advantage in total seed plus fodder yields, while the local cultivar gave significantly greater fodder yields in both years. Seed and fodder yields, as well as water-use efficiency, confirmed the advantages offered by the dual-purpose cultivar TVX4659–03E. Future breeding efforts in the Sahel should focus on dual-purpose (grain/fodder) cowpea types.
Root/shoot relations of two cultivars of pearl millet (Pennisetum glaucum) were studied on a sandy soil at Sadore in Niger using a wet excavation method. For the first 10 days after emergence (DAE), the length of the seminal root showed an exponential growth rate, while plant height increased more or less linearly. The maximum rooting depth for millet was 168 cm and the maximum number of root axes and primary laterals was 172 per plant. Root length continued to increase up to 75 DAE, the maximum length exceeding 5000 cm per plant. The proportion of total dry matter accumulated in the roots decreased from 30% in the early stages to less than 20% by maturity. The wet excavation method is a promising technique for the rapid removal of intact root systems of pearl millet from the sandy soils of the Sahel.
Field trials conducted previously in Niger have shown that in years when the onset of the rains is 15–20 days earlier than average, the long growing season can be exploited by growing a relay crop of millet (Pennisetum glaucum) and cowpea (Vigna unguiculata). In the trials reported here, the advantages of relay cropping were compared with intercropping with improved management and intercropping under traditional management during the 1989, 1990 and 1991 rainy seasons at the ICRISAT Sahelian Center, Sadore, Niger. The length of the growing season varied from 139 to 150 days over the three seasons. The relay crop produced more dry matter and leaf area and yielded more than the intercrops in all three years, confirming that in years when the onset of the rains is early, relay cropping with millet and cowpea is a better option than growing the same two species as an intercrop. Relay cropping avoids the competitive effects inherent in intercropping systems, while offering the additional advantages of rotating cereals with legumes.
Millet and cowpea in relay and simultaneous cropping systems