To establish the adherence of the Assertive Outreach Service to the local protocol for the use of high dose antipsychotic (HDAT) medication. This audit cycle was started with the intention of minimising metabolic syndrome due to antipsychotic medication.
‘Guidelines for the use of high dose antipsychotic (HDAT) medication’ (February 2008). The following standards from the guidelines were measured in the sample:
1. An HDAT monitoring sheet should be completed for all HDAT patients.
2. An ECG should be done for all HDAT patients.
3. Blood tests (LFTs and U&Es) should be done for all HDAT patients.
4. A physical examination should be done for all HDAT patients.
5. Consent should be obtained from all HDAT patients and recorded on the monitoring sheet.
6. Physical health risk factors should be recorded for all HDAT patients.
Data sample and collection
Out of 179 patients under the care of the team, 17 were prescribed HDAT. Data were collected from the case notes of all HDAT patients and entered into HDAT monitoring sheets in January 2009 as a baseline assessment by Dr. Praveen Kumar, and re-audited in June 2009 by Dr. Gaurav Mehta.
As evidenced by the results of the audit in January 2009 and the re-audit in June 2009, all parameters met the standards in June 2009 except physical examination, which had not been performed on all 17 HDAT patients, as revealed in the re-audit.
Traumatic brain injuries (TBI) may lead to persistent depression symptoms. We conducted several pilot studies to examine the efficacy of mindfulness-based interventions to deal with this issue; all showed strong effect sizes. The logical next step was to conduct a randomized controlled trial (RCT).
We sought to determine the efficacy of mindfulness-based cognitive therapy for people with depression symptoms post-TBI (MBCT-TBI).
Using a multi-site RCT design, participants (mean age = 47) were randomized to intervention or control arms. Treatment participants received a group-based, 10-week intervention; control participants waited. Outcome measures, administered pre- and post-intervention, and after three months, included: Beck Depression Inventory-II (BDI-II), Patient Health Questionnaire-9 (PHQ-9), and Symptom Checklist-90-Revised (SCL-90-R). The Philadelphia Mindfulness Scale (PHLMS) captured present moment awareness and acceptance.
BDI-II scores decreased from 25.47 to 18.84 in the treatment groups, while they stayed relatively stable in the control groups (27.13 to 25.00, respectively; p = .029). We did not find statistically significant differences on the PHQ-9 and SCL-90-R post-treatment. However, after three months, all scores were statistically significantly lower than at baseline (ps < .01). Increases in mindfulness were associated with decreases in BDI-II scores (r = -.401, p = .025).
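The mindfulness–depression association reported above is a Pearson correlation; as a minimal illustrative sketch (the paired change scores below are hypothetical, not data from the trial), the coefficient can be computed as:

```python
import math

def pearson_r(x, y):
    """Pearson correlation: covariance normalised by the two standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired change scores: increase in mindfulness (PHLMS)
# against change in depression score (BDI-II) for six participants.
mindfulness_change = [5, 2, 8, 1, 6, 3]
bdi_change = [-7, -2, -9, -1, -5, -4]
r = pearson_r(mindfulness_change, bdi_change)  # strongly negative here
```

A negative r, as in the trial, means larger gains in mindfulness tend to accompany larger drops in depression scores; it does not by itself establish causality.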
MBCT-TBI may alleviate depression symptoms up to three months post-intervention. Greater mindfulness may have contributed to the reduction in depression symptoms although the association does not confirm causality. More work is required to replicate these findings, identify subgroups that may better respond to the intervention, and refine the intervention to maximize its effectiveness.
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency, together with the establishment of the CBRNE medical operations science support expert, informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Multi-sire mating of a mob of ewes is commonly used in commercial sheep production systems. However, ram mating success (defined as the number of lambs sired by an individual) can vary between rams in the mating group. If this trait were repeatable and heritable, selection of rams capable of siring larger numbers of lambs could reduce the number of rams required for mating and ultimately lead to increased genetic gain. However, genetic correlations with other productive traits, such as growth and female fertility, could influence the potential for ram mating success to be used as a selection trait. To investigate this trait, parentage records (including accuracy of sire assignment) from 15 commercial ram breeding flocks of various breeds were used to examine the repeatability and heritability of ram mating success in multi-sire mating groups. In addition, genetic and phenotypic correlations with growth and female fertility traits were estimated using ASReml. The final model used for the ram mating success traits included age of the ram and mating group as fixed effects. Older rams (3+ years old) had 15% to 20% greater mating success than younger rams (1 or 2 years of age). Increasing the stringency of the criteria for inclusion of either an individual lamb, based on accuracy of sire assignment, or a whole mating group, based on how many lambs had an assigned sire, increased the repeatability and heritability estimates of the ram mating success traits examined. With the most stringent criteria, where sire assignment accuracy was >0.95 and the proportion of lambs in the progeny group that failed to have a sire assigned was <0.05, repeatability and heritability for loge(number of lambs) were 0.40±0.09 and 0.26±0.12, respectively. For proportion of lambs sired, repeatability and heritability were both 0.30±0.09.
The two ram mating success traits (loge(nlamb) and proportion of lambs sired) were highly correlated, both phenotypically and genetically (0.88±0.01 and 0.94±0.06, respectively). Both phenotypic and genetic correlations between ram mating success and growth and female fertility traits were low and non-significant. In conclusion, there is scope to select rams capable of producing high numbers of progeny and thus increase selection pressure on rams to increase genetic gain.
Introduction: Patient-reported outcome measures (PROMs) are questionnaires that can be used to elicit care outcome information from patients. We sought to develop and validate the first PROM for adult patients receiving emergency department (ED) care without a primary mental health or addictions presentation and who were not hospitalized. Methods: PROM development used a multi-phase process based on national and international guidance (FDA, NQF, ISPOR). Phase 1 (ED outcome conceptual framework): qualitative interviews with ED patients post-discharge informed four core domains (previously published). Phase 2 (item generation): a scoping review of the literature and existing instruments identified candidate questions relevant to each domain for inclusion in the tool. Phase 3 (cognitive debriefing): existing and newly written questions were tested with ED patients post-discharge for comprehension and wording preference. Phase 4 (field and validity testing): the revised tool was pilot tested on a national online survey panel and then again at 2 weeks (test-retest). Phase 5 (final item reduction): a Delphi process involving ED clinicians, researchers, patients and system administrators. Phase 6 (validation): psychometric testing of PROM-ED 1.0. Results: Four core outcome domains were defined in Phase 1: (1) understanding; (2) symptom relief; (3) reassurance; and (4) having a plan. The domains informed a review of existing relevant questionnaires and instruments and the writing of additional questions, creating an initial long-form questionnaire. Eight patients participated in cognitive debriefing of the long-form questionnaire. Expert clinicians, researchers and patient partners provided input on item refinement and reduction. Four hundred forty-four patients completed a second version of the long-form questionnaire, which informed the final item reduction process by a modified Delphi method involving 21 diverse contributors.
The questionnaire was validated and underwent final revisions to create the 21 questions that constitute PROM-ED 1.0. Conclusion: Using accepted PROM instrument development methodology, we developed the first outcome questionnaire for use with adult ED patients who are not hospitalized. This questionnaire can be used to systematically gather patient-reported outcome information that could support and inform improvement work in ED care.
Alcohol consumption around the time of conception is highly prevalent in Western countries. Exposure to ethanol during gestation has been associated with altered development of the mesolimbic reward pathway in rats and an increased propensity to addiction; however, the effect of exposure only around the time of conception is unknown. The current study investigated the effects of periconceptional alcohol exposure (PC:EtOH) on alcohol and palatable food preferences and on gene expression in the ventral tegmental area (VTA) and the nucleus accumbens of the adult offspring. Rats were exposed to a liquid diet containing ethanol (EtOH) (12.5% vol/vol) or a control diet from 4 days before mating until 4 days after mating. PC:EtOH had no effect on alcohol preference in either sex. At 15 months of age, however, male PC:EtOH offspring consumed more high-fat food than male control offspring, but this preference was not observed in females. Expression of the dopamine receptor type 1 (Drd1a) was lower in the VTA of male PC:EtOH offspring compared with their control counterparts. There was no effect of PC:EtOH on mRNA expression of the µ-opioid receptor, tyrosine hydroxylase (Th), dopamine receptor type 2 (Drd2) or dopamine active transporter (Slc6a3). These data support the hypothesis that periconceptional alcohol exposure can alter expression of key components of the mesolimbic reward pathway and heighten the preference of offspring for palatable foods, and may therefore increase their propensity towards diet-induced obesity. These results highlight the importance of alcohol avoidance when planning a pregnancy.
Conventional bedside tests of visuospatial function, such as the clock drawing test (CDT) and the intersecting pentagons test (IPT), are subject to considerable inconsistency in their delivery and interpretation. We compared performance on a novel test – the letter and shape drawing (LSD) test – with these conventional tests in hospitalised elderly patients.
The LSD, IPT, CDT and the Montreal Cognitive Assessment (MoCA) were performed in 40 acute elderly medical inpatients at University Hospital Limerick. The correlation between these tests was examined, as well as the accuracy of the visuospatial tests in identifying significant cognitive impairment on the MoCA.
The patients (mean age 81.0±7.71; 21 female) had a median MoCA score of 15.5 (range=1–29). There was a strong, positive correlation between the LSD and both the CDT (r=0.56) and IPT (r=0.71). The correlation between the LSD and MoCA (r=0.91) was greater than for the CDT and IPT (both r=0.67). The LSD correlated highly with all MoCA domains (r ranging from 0.54 to 0.86), especially orientation (r=0.86), attention (r=0.81) and visuospatial function (r=0.73). Two or more errors on the LSD identified 90% (26/29) of those patients with MoCA scores of ⩽20, which was substantially higher than for the CDT (59%) and IPT (55%).
The LSD is a novel test of visuospatial function that is brief, readily administered and easily interpreted. Performance correlates strongly with other tests of visuospatial ability, with favourable ability to identify patients with significant impairment of general cognition.
Infectious bovine keratoconjunctivitis (IBK) is a common and important disease of calves. Without effective vaccines, antibiotic therapy is often implemented to minimize the impact of IBK. This review updates a previously published systematic review regarding comparative efficacy of antibiotic treatments for IBK. Available years of the Centre for Agriculture and Bioscience International and MEDLINE databases were searched, including non-English results. Also searched were the American Association of Bovine Practitioners and World Buiatrics Congress conference proceedings from 1996 to 2016, reviews since 2013, reference lists from relevant trials, and U.S. Food and Drug Administration New Animal Drug Application summaries. Eligible studies assessed antibiotic treatment of naturally occurring IBK in calves randomly allocated to group at the individual level. Outcomes of interest were clinical score, healing time, unhealed ulcer risk, and ulcer surface area. A mixed-effects model comparing active drug with placebo was employed for all outcomes. Heterogeneity was assessed visually and using Cochran's Q-test. Thirteen trials assessing nine treatments were included. Compared with placebo, most antibiotic treatments were effective. There was evidence that the treatment effect differed by day of outcome measurement. Visually, the largest differences were observed 7–14 days post-treatment. These results indicate improved IBK healing with many antibiotics and suggest the need for randomized trials comparing different antibiotic treatments.
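Cochran's Q-test used to assess heterogeneity above has a simple closed form: the weighted sum of squared deviations of each trial's effect from the fixed-effect pooled estimate, with weights equal to inverse variances. A minimal sketch (the effect estimates and variances below are invented for illustration, not taken from the review):

```python
def cochrans_q(effects, variances):
    """Cochran's Q: weighted squared deviation of study effects from the
    fixed-effect pooled mean; under homogeneity Q ~ chi-square on k-1 df."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    return q, len(effects) - 1

# Hypothetical log-risk-ratio estimates and their variances from k = 4 trials
effects = [-0.8, -0.5, -0.9, -0.4]
variances = [0.04, 0.09, 0.05, 0.10]
q, df = cochrans_q(effects, variances)
```

A Q well above its degrees of freedom suggests the trial effects differ by more than sampling error alone, which is when inspecting moderators such as day of outcome measurement becomes informative.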
The subsurface exploration of other planetary bodies can be used to unravel their geological history and assess their habitability. On Mars in particular, present-day habitable conditions may be restricted to the subsurface. Using a deep subsurface mine, we carried out a program of extraterrestrial analog research – MINe Analog Research (MINAR). MINAR aims to carry out the scientific study of the deep subsurface and test instrumentation designed for planetary surface exploration by investigating deep subsurface geology, whilst establishing the potential for this technology to be transferred to the mining industry. An integrated multi-instrument suite was used to investigate samples of representative evaporite minerals from a subsurface Permian evaporite sequence, in particular to assess the mineral and elemental variations that provide small-scale regions of enhanced habitability. The instruments used were the Panoramic Camera emulator, Close-Up Imager, Raman spectrometer, Small Planetary Linear Impulse Tool, Ultrasonic drill and handheld X-ray diffraction (XRD). We present science results from the analog research and show that these instruments can be used to investigate in situ the geological context and mineralogical variations of a deep subsurface environment, and thus its habitability, from millimetre to metre scales. We also show that these instruments are complementary. For example, the identification of primary evaporite minerals such as NaCl and KCl, which are difficult to detect by portable Raman spectrometers, can be accomplished with XRD. By contrast, Raman is highly effective at locating and detecting mineral inclusions in primary evaporite minerals. MINAR demonstrates the effective use of a deep subsurface environment for planetary instrument development, for understanding the habitability of extreme deep subsurface environments on Earth and other planetary bodies, and for advancing the use of space technology in economic mining.
The ability to perform microbial detection and characterization in-field at extreme environments, rather than on returned samples, has the potential to improve the efficiency, relevance and quantity of data from field campaigns. To date, few examples of this approach have been reported. Here, we demonstrate that the approach is feasible in subglacial environments by deploying four techniques for microbial detection: real-time polymerase chain reaction, microscopic fluorescence cell counts, adenosine triphosphate bioluminescence assay and recombinant Factor C assay (to detect lipopolysaccharide). Each technique was applied to 12 subglacial ice samples, 12 meltwater samples and two snow samples from Engabreen, Northern Norway. Using this multi-technique approach, the detected biomarker levels were as expected: highest in debris-rich subglacial ice, moderate in glacial meltwater, and low in clean (debris-poor) ice and snow. Principal component analysis was applied to the resulting dataset and could be performed in-field to rapidly aid the allocation of resources for further sample analysis. We anticipate that in-field data collection will allow for multiple rounds of sampling, analysis, interpretation and refinement within a single field campaign, resulting in the collection of larger and more appropriate datasets and, ultimately, more efficient science return.
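The principal component analysis step described above reduces the multi-assay biomarker table to a few axes of variation; as an illustrative sketch (the sample matrix, values and labels below are hypothetical, and the abstract does not specify the authors' actual pipeline), a centred SVD-based PCA could look like:

```python
import numpy as np

def pca(data, n_components=2):
    """PCA via SVD: centre each column, then project samples onto the
    leading right-singular vectors (principal axes)."""
    centered = data - data.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ vt[:n_components].T        # sample coordinates
    explained = (s ** 2) / np.sum(s ** 2)          # variance ratio per axis
    return scores, explained[:n_components]

# Hypothetical biomarker matrix: rows = samples, columns = assays
# (qPCR, cell counts, ATP, rFC); values are illustrative signal levels.
data = np.array([
    [9.1, 8.7, 9.4, 8.9],   # debris-rich subglacial ice (high biomass)
    [5.2, 4.9, 5.5, 5.1],   # glacial meltwater (moderate)
    [1.1, 0.9, 1.3, 1.0],   # clean, debris-poor ice (low)
    [0.8, 0.7, 1.0, 0.9],   # snow (low)
])
scores, explained = pca(data)
```

Because the four assays here co-vary with overall biomass, the first component captures almost all the variance and orders the samples along the debris-rich-to-snow gradient, which is the kind of quick triage that could guide in-field resource allocation.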
Pituitary volume enlargements have been observed among individuals with first-episode psychosis. These abnormalities are suggestive of hypothalamic–pituitary–adrenal (HPA) axis hyperactivity, which may contribute to the development of psychosis. However, the extent to which these abnormalities characterize individuals at elevated risk for schizophrenia prior to illness onset is currently unclear, as volume increases, decreases and no volume differences have all been reported relative to controls. The current study aimed to determine whether antipsychotic-naive, putatively at-risk children who present multiple antecedents of schizophrenia (ASz) or a family history of illness (FHx) show pituitary volume abnormalities relative to typically developing (TD) children. An additional aim was to explore the association between pituitary volume and experiences of psychosocial stress.
ASz (n = 30), FHx (n = 22) and TD (n = 32) children were identified at age 9–12 years using a novel community-screening procedure or as relatives of individuals with schizophrenia. Measures of pituitary volume and psychosocial stress were obtained at age 11–14 years.
Neither ASz nor FHx children showed differences in pituitary volume relative to TD children. Among FHx children only, pituitary volume was negatively associated with current distress relating to negative life events and exposure to physical punishment.
The lack of pituitary volume abnormalities among ASz and FHx children is consistent with our previous work demonstrating that these children are not characterized by elevated diurnal cortisol levels. The findings imply that these biological markers of HPA axis hyperactivity, observed in some older samples of high-risk individuals, may emerge later, more proximally to disease onset.
With the prevention and treatment of mental disorders a challenge for primary care, and with the increasing capability of electronic medical records (EMRs) to facilitate research in practice, we aimed to determine the prevalence and treatment of mental disorders using routinely collected clinical data contained in EMRs.
We reviewed EMRs of patients randomly sampled from seven general practices, by piloting a study instrument and extracting data on mental disorders and their treatment.
Data were collected on 690 patients (age range 18–95, 52% male, 52% GMS-eligible). A mental disorder (most commonly anxiety/stress, depression and problem alcohol use) was recorded in the clinical records of 139 (20%) during the 2-year study period. While most patients with the common disorders had been prescribed medication (i.e. antidepressants or benzodiazepines), a minority had been referred to other agencies or received psychological interventions. Most patients with disorders were identified through ‘free text’ consultation notes and prescriptions; diagnostic coding alone would have failed to identify 92% of patients with a disorder.
Although mental disorders are common in general practice, this study suggests their formal diagnosis, disease coding and access to psychological treatments are priorities for future research efforts.
Social context has a major influence on the detection and treatment of youth mental and substance use disorders in socioeconomically disadvantaged urban areas, particularly where gang culture, community violence, normalisation of drug use and repetitive maladaptive family structures prevail. This paper aims to examine how social context influences the development, identification and treatment of youth mental and substance use disorders in socioeconomically disadvantaged urban areas from the perspectives of health care workers.
Semi-structured interviews were conducted with health care workers (n=37) from clinical settings including primary care, secondary care and community agencies, and were analysed thematically, with Bronfenbrenner’s Ecological Theory guiding the analysis.
Health care workers’ engagement with young people was influenced by the multilevel ecological systems within the individual’s social context which included: the young person’s immediate environment/‘microsystem’ (e.g., family relationships), personal relationships in the ‘mesosystem’ (e.g., peer and school relationships), external factors in the young person’s local area context/‘exosystem’ (e.g., drug culture and criminality) and wider societal aspects in the ‘macrosystem’ (e.g., mental health policy, health care inequalities and stigma).
In socioeconomically disadvantaged urban areas, social context, specifically the micro-, meso-, exo- and macro-systems, impacts both the young person’s experience of mental health or substance use problems and the services which endeavour to address these problems. Interventions that effectively identify and treat these problems should reflect the additional challenges posed by such settings.
Seagrass meadows are soft sediment intertidal to subtidal benthic habitats comprising a group of plants adapted to life in the sea (den Hartog, 1970; Hemminga & Duarte, 2000). Seagrasses form one of the world’s most widespread habitats in shallow coastal waters; they are found on all of the world’s continents except Antarctica. Seagrass habitat can be patchy, but more commonly consists of continuous vegetation, which can be thousands of square kilometers in size. It is these large swaths that are referred to as seagrass beds or meadows (the terms are interchangeable). Seagrass meadows occur in sheltered intertidal and shallow subtidal areas on sand or mud substratum (and occasionally among boulders). Current documented distributions include 125 000 km2 of seagrass meadows; however, recent estimates suggest that these meadows could cover up to 600 000 km2 of the coastal ocean (Duarte et al., 2010).
Seagrasses are marine angiosperms belonging to the order Helobiae and comprising two families – Potamogetonaceae and Hydrocharitaceae (den Hartog, 1970). Seagrass plants are rhizomatous (they have stems extending horizontally below the sediment surface) and modular, composed of repeating units (ramets) that exhibit clonal growth (Hemminga & Duarte, 2000). In contrast to other submerged marine plants (e.g. seaweeds or algae), seagrasses flower, develop fruit, and produce seeds (Ackerman, 2006). They have true roots and internal gaseous and nutrient transport systems (Kuo & den Hartog, 2006). The functional definition for seagrass plants encompasses only 72 species. Three seagrass species are considered endangered and 10 are at elevated risk of extinction; however, the great majority of species are considered common (Short et al., 2011). It is the common abundance of these species, rather than their rarity, that makes them important. Seagrasses provide habitat, meaning they have a major functional role in supporting various stages in the life cycles of other organisms. For this reason, and with their extensive root–rhizome system and well-developed canopy, seagrasses, like reef-building organisms, are termed foundation species (Hughes et al., 2009).
This chapter describes the arrangements in Australia for regulating the quality of long-term care services delivered in the community or in a residential setting. Its focus is on the long-term care of ‘older people’ – ‘aged care’ in Australian parlance. The chapter begins with an overview of Australia’s aged care system and its quality framework, including its place within the broader health and welfare system. It then discusses the arrangements for regulating the quality of residential care, which have been a major focus in recent decades, and the arrangements for regulating the quality of community care, which have a shorter history and are less developed. The chapter then discusses current reforms, which are aimed at better integrating these arrangements within and across programmes, and concludes with some reflections on the key challenges currently facing Australian public policy in this area.
Overview of Australia’s aged care system and its quality framework
Australia’s aged care system is funded and regulated through a complex set of arrangements, involving different levels of government and a diverse range of stakeholders, including informal carers and formal care providers from the not-for-profit (religious and charitable), for-profit and government sectors. These arrangements reflect, in part, the broader Australian health and welfare system, involving a similarly complex range of providers, with responsibilities for funding, regulation and service delivery shared between the three levels of government: federal, state and territory (‘state’), and local (AIHW, 2010, 2011a).
Improving the interface between primary care and mental health services is a key target in current healthcare policy in Ireland. This study examines the content of referrals from primary care to a community mental health service for apparent depression.
We retrospectively reviewed the clinical records of 100 patients with depression who consecutively attended a specialist mental health service in Ireland's midwest region. Records were reviewed for demographic and clinical information provided by the doctor at the time of referral, subsequent service engagement, diagnosis and treatment initiated.
There was considerable variation in the content and presentation of information contained in referral letters. Eleven per cent used structured HSE mental health referral forms. Seventy-six per cent of referrals contained clear information regarding name, address, symptoms and treatment previously initiated. Specifically, low mood, biological symptoms of depression and illness severity were documented in 43%, 34% and 27%, respectively. Suicide risk was documented in 20%. More detail was significantly associated with more severe illness. At initial specialist assessment, 71% had commenced antidepressant treatment, with 11% having received an adequate trial of a first antidepressant and 3% an adequate trial of two antidepressants. Two-thirds were diagnosed with mild/moderate depression. Initiation of antidepressant treatment was linked to subsequent diagnosis of depressive illness by mental health services (p < 0.001).
Our findings indicate variable referral practices from general practice to mental health in our region. Most referrals were for mild to moderate depression. Poor access to psychological services locally may be a key factor in this phenomenon.
Mindfulness-based approaches for adults are effective at enhancing mental health, but few controlled trials have evaluated their effectiveness among young people.
To assess the acceptability and efficacy of a schools-based universal mindfulness intervention to enhance mental health and well-being.
A total of 522 young people aged 12–16 in 12 secondary schools either participated in the Mindfulness in Schools Programme (intervention) or took part in the usual school curriculum (control).
Rates of acceptability were high. Relative to the controls, and after adjusting for baseline imbalances, children who participated in the intervention reported fewer depressive symptoms post-treatment (P = 0.004) and at follow-up (P = 0.005), and lower stress (P = 0.05) and greater well-being (P = 0.05) at follow-up. The degree to which students in the intervention group practised the mindfulness skills was associated with better well-being (P < 0.001) and less stress (P = 0.03) at 3-month follow-up.
The findings provide promising evidence of the programme's acceptability and efficacy.
Rapid and wide dispersal of passengers after flights makes investigation of flight-related outbreaks challenging. An outbreak of Salmonella Heidelberg was identified in a group of Irish travellers returning from Tanzania. Additional international cases sharing the same flight were identified. Our aim was to determine the source and potential vehicles of infection. Case-finding utilized information exchange using experts' communication networks and national surveillance systems. Demographic, clinical and food history information was collected. Twenty-five additional cases were identified from Ireland, The Netherlands, Norway, USA and Canada. We conducted a case-control study which indicated a significant association between illness and consumption of milk tart (OR 10·2) and an egg dish (OR 6) served on-board the flight. No food consumed before the flight was associated with illness. Cases from countries other than Ireland provided supplementary information that facilitated the identification of likely vehicles of infection. Timely, committed international collaboration is vital in such investigations.
Groupers and reef octopus are economically and ecologically important predators on Indo-Pacific coral reefs and are known as solitary hunters. Here we describe a highly unusual observation of a non-random behavioural association between the highfin grouper (Epinephelus maculatus) and the reef octopus (Octopus cyanea) in the Great Barrier Reef. Such an observation is notable given that the association crosses both phyla and the invertebrate–vertebrate divide. We hypothesize that the association is non-random and potentially the result of cooperative hunting, but this requires further evidence and testing.
The effects of source field plates on AlGaN/GaN high electron mobility transistor (HEMT) reliability under off-state stress conditions were investigated using step-stress cycling. The source field plate enhanced the drain breakdown voltage from 55 V to 155 V and the critical voltage for off-state gate stress from 40 V to 65 V, relative to devices without the field plate. Transmission electron microscopy was used to examine the degradation of the gate contacts. Cracking that appeared on both the source and drain sides of the gate edges was attributed to the inverse piezoelectric effect. In addition, a thin oxide layer was observed between the Ni gate contact and the AlGaN layer, and both Ni and oxygen had diffused into the AlGaN layer. The critical degradation voltage of AlGaN/GaN HEMTs during off-state electrical stress was determined as a function of Ni/Au gate dimensions (0.1–0.17 μm). Devices with different gate lengths and gate-drain distances were found to exhibit the onset of degradation at different source-drain biases but similar electric field strengths, showing that the degradation mechanism is primarily field-driven. The temperature dependence of sub-threshold drain current versus gate voltage at a constant drain bias voltage was used to determine the trap densities in the HEMTs before and after off-state stress. Two different trap densities were obtained for measurements conducted at 300–493 K and 493–573 K, respectively.