India's development cooperation efforts began soon after the country gained independence in 1947. Indeed, the first recorded instance was somewhat earlier, dating from the pre-independence interim government formed in September 1946, when Jawaharlal Nehru, India's first Prime Minister, established a fellowship programme for trainees from China and Indonesia.
Although India was a pioneer in such initiatives between developing countries, there has been very sparse documentation or public debate on the genesis and evolution of its development assistance programme.
In the early years after independence, India played an active role in promoting decolonisation across the world. From its own experience it was aware that the political independence of the newly emerging, or ‘developing’, countries could be sustained only if they were also masters of their economic destiny. India's political leadership fashioned a dual approach to meeting this challenge. On one hand, the UN became a platform to mobilise an international consensus behind the proposition that economic and social progress in developing countries was an international responsibility to be implemented collectively through multilateral action. On the other hand, there had to be parallel emphasis on development cooperation between developing countries themselves, bilaterally as well as multilaterally through the UN. Newly-emerging countries should not be entirely dependent upon economic assistance from the developed world; they must also pool their own resources and capabilities to help each other. It was this process that became known as ‘South–South Cooperation’ (SSC) of which India was an early and enthusiastic proponent as well as practitioner. Propagated domestically as one aspect of solidarity and common ground between poor and developing countries, Indian assistance to the developing world found tacit political acceptance and support.
India worked very hard in multilateral forums to forge an international consensus on the principle that developed countries had an international obligation to support development in their former colonies and in newly-emerging countries, not as charity or a dole-out to the poor but rather as a partnership for peace and prosperity. It is for this reason that India has never accepted the donor-recipient relationship that has come to characterise overseas development assistance (ODA) in general; instead it has hewed to the idea of development partnership, whether for North–South or South–South cooperation.
Hospital evacuations of patients with special needs are extremely challenging, and it is difficult to train hospital workers for this rare event.
Researchers conducted an in-situ simulation study investigating the effect of standardized checklists on the evacuation of a patient under general anesthesia from the operating room (OR) and hypothesized that checklists would improve the completion rate of critical actions and decrease evacuation time.
A vertical evacuation of the high-fidelity manikin (SimMan3G; Laerdal Inc.; Norway) was performed and participants were asked to lead the team and evacuate the manikin to the ground floor after a mock fire alarm. Participants were randomized to two groups: one was given an evacuation checklist (checklist group [CG]) and the other was not (non-checklist group [NCG]). A total of 19 scenarios were run with 28 participants.
Mean scenario time, evacuation preparation time, and time to transport the manikin down the stairs did not differ significantly between groups (P = .369, .462, and .935, respectively). The CG showed significantly better performance of critical actions, including securing the airway, taking additional drug supplies, and taking additional equipment supplies (P = .047, .001, and .001, respectively). In the post-evacuation surveys, 27 out of 28 participants agreed that checklists would improve the evacuation process in a real event.
Standardized checklists increase the completion rate of pre-defined critical actions in evacuations out of the OR, which likely improves patient safety. Checklist use did not have a significant effect on total evacuation time.
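As an illustration of how completion rates for a single critical action could be compared between two small groups (not necessarily the test the study authors used), here is a minimal sketch with made-up counts:

```python
from scipy import stats

# Hypothetical completion counts for one critical action (e.g. securing the
# airway) -- illustrative numbers only, not taken from the study.
cg_completed, cg_total = 9, 10     # checklist group
ncg_completed, ncg_total = 4, 9    # non-checklist group

table = [[cg_completed, cg_total - cg_completed],
         [ncg_completed, ncg_total - ncg_completed]]

# Fisher's exact test is a common choice for small 2x2 tables of this kind.
odds_ratio, p_value = stats.fisher_exact(table)
print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```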
Under the European Union’s Solvency II regulations, insurance firms are required to use a one-year VaR (Value at Risk) approach. This involves a one-year projection of the balance sheet and requires sufficient capital to be solvent in 99.5% of outcomes. The Solvency II Internal Model risk calibrations require annual changes in market indices/term structure for the estimation of risk distribution for each of the Internal Model risk drivers. This presents a significant challenge for calibrators in terms of:
Robustness of the calibration that is relevant to the current market regimes and at the same time able to represent the historically observed worst crisis;
Stability of the calibration model year on year with arrival of new information.
The above points need careful consideration to avoid credibility issues with the Solvency Capital Requirement (SCR) calculation, given that the results are subject to high levels of uncertainty.
For market risks, common industry practice to compensate for the limited number of historic annual data points is to use overlapping annual changes. Overlapping changes are dependent on each other, and this dependence can cause issues in estimation, statistical testing, and communication of uncertainty levels around risk calibrations.
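A minimal sketch of the two data-construction approaches, using a placeholder monthly index series rather than any series from the paper:

```python
import numpy as np
import pandas as pd

# Hypothetical monthly log-index levels: 30 years of monthly observations
# (placeholder data, not from the paper).
rng = np.random.default_rng(42)
log_index = pd.Series(np.cumsum(rng.normal(0.005, 0.04, 360)))

# Overlapping annual changes: one observation per month, each spanning 12 months.
# Consecutive observations share 11 months of data, so they are strongly autocorrelated.
overlapping = log_index.diff(12).dropna()

# Non-overlapping annual changes: independent under an i.i.d. monthly model,
# but only ~30 observations from 30 years of history.
non_overlapping = log_index.iloc[::12].diff().dropna()

print(len(overlapping), len(non_overlapping))   # 348 vs 29
```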
This paper discusses the issues with the use of overlapping data when producing risk calibrations for an Internal Model. A comparison of the overlapping data approach with the alternative non-overlapping data approach is presented. A comparison is made of the bias and mean squared error of the first four cumulants under four different statistical models. For some statistical models it is found that overlapping data can be used with bias corrections to obtain similarly unbiased results as non-overlapping data, but with significantly lower mean squared errors. For more complex statistical models (e.g. GARCH) it is found that published bias corrections for non-overlapping and overlapping datasets do not result in unbiased cumulant estimates and/or lead to increased variance of the process.
In order to test the goodness of fit of probability distributions to the datasets, it is common to use statistical tests. Most of these tests are not valid when applied to overlapping data, as the overlapping observations breach the independence assumption underlying them. We present and test an adjustment to one of these statistical tests (the Kolmogorov–Smirnov goodness-of-fit test) to allow for overlapping data.
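The following sketch illustrates the underlying problem rather than the paper's adjusted test: a standard Kolmogorov–Smirnov test applied naively to overlapping annual changes, whose critical values assume independent observations. The data are placeholders.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Placeholder monthly log-index and its overlapping annual changes.
rng = np.random.default_rng(42)
log_index = pd.Series(np.cumsum(rng.normal(0.005, 0.04, 360)))
overlapping = log_index.diff(12).dropna()

# Fit a normal distribution and run the *standard* (unadjusted) KS test.
mu, sigma = overlapping.mean(), overlapping.std(ddof=1)
ks_stat, p_naive = stats.kstest(overlapping, "norm", args=(mu, sigma))

# This p-value is unreliable: the parameters were estimated from the same data,
# and the overlapping observations are autocorrelated, breaching the
# independence assumption behind the standard KS critical values -- the issue
# an adjusted test needs to address.
print(f"KS statistic = {ks_stat:.3f}, naive p-value = {p_naive:.3f}")
```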
Finally, we explore methods of converting high-frequency data (e.g. monthly) to low-frequency data (e.g. annual). This is an alternative to using overlapping data: a statistical model is fitted to the monthly data and then aggregated over 12 time steps to model annual returns. A number of methods are available for this aggregation; we explore two of the most widely used approaches.
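A hedged sketch of the aggregation idea, under the simplifying assumption of i.i.d. normal monthly log-returns (a placeholder for whichever monthly model a calibrator would actually fit):

```python
import numpy as np

# Placeholder monthly log-returns; fit the simplest possible monthly model.
rng = np.random.default_rng(0)
monthly = rng.normal(0.005, 0.04, 360)
mu_m, sigma_m = monthly.mean(), monthly.std(ddof=1)

# Aggregate the monthly model over 12 time steps by Monte Carlo simulation to
# obtain an annual return distribution, then read off the 1-in-200 stress
# (the 0.5th percentile) used for a 99.5% one-year VaR.
annual = rng.normal(mu_m, sigma_m, size=(100_000, 12)).sum(axis=1)
var_99_5 = np.percentile(annual, 0.5)
print(f"99.5% one-year stress (annual log-return): {var_99_5:.3f}")
```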
The present study examines characteristics of those who benefited from a dietary Fe intervention using salt double-fortified with iodine and Fe (DFS).
Data from a randomized controlled trial were analysed to identify predictors of improved Fe status and resolution of Fe deficiency (serum ferritin (sFt) < 12 μg/l) and low body Fe (body Fe (BI) < 0·0 mg/kg) using non-parametric estimations and binomial regression models.
A tea estate in West Bengal, India.
Female tea pluckers, aged 18–55 years.
Consuming DFS significantly (P = 0·01) predicted resolution of Fe deficiency (relative risk (RR) = 2·31) and of low BI (RR = 2·78) compared with consuming iodized salt. Baseline sFt (β = –0·32 (se 0·03), P < 0·001) and treatment group (β = 0·13 (se 0·03), P < 0·001) significantly predicted change in sFt. The interaction of baseline BI with treatment group (β = –0·11 (se 0·06), P = 0·08) predicted the change in BI. DFS did not significantly predict change in Hb and marginally predicted resolution of anaemia (Hb < 120 g/l).
Baseline Fe status, as assessed by sFt and BI, and consumption of DFS predict change in Fe status and resolution of Fe deficiency and low BI. Anaemia prevalence and Hb level, although simple and inexpensive to measure, may not be adequate to predict resolution of Fe deficiency in response to an intervention of DFS in similar populations with high prevalence of Fe deficiency and multiple nutritional causes of anaemia. These findings will guide appropriate targeting of future interventions.
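For illustration, a log-binomial regression of the kind that yields relative risks such as those reported above can be sketched as follows; the data and variable names are synthetic, not the trial's:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic data: resolution of Fe deficiency (1/0) by treatment group
# (DFS vs iodized salt) -- illustrative only.
rng = np.random.default_rng(1)
df = pd.DataFrame({"dfs": rng.integers(0, 2, 400)})
df["resolved"] = rng.binomial(1, np.where(df["dfs"] == 1, 0.45, 0.20))

# Binomial regression with a log link ("log-binomial" model) estimates
# relative risks directly, unlike logistic regression which gives odds ratios.
model = smf.glm("resolved ~ dfs", data=df,
                family=sm.families.Binomial(link=sm.families.links.Log())).fit()
rr = np.exp(model.params["dfs"])
print(f"Relative risk for DFS vs iodized salt: {rr:.2f}")
```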
The use of statistical/machine learning (ML) approaches to materials science is experiencing explosive growth. Here, we review recent work focusing on the generation and application of libraries from both experiment and theoretical tools. The library data enables classical correlative ML and also opens the pathway for exploration of underlying causative physical behaviors. We highlight key advances facilitated by this approach and illustrate how modeling, macroscopic experiments, and imaging can be combined to accelerate the understanding and development of new materials systems. These developments point toward a data-driven future wherein knowledge can be aggregated and synthesized, accelerating the advancement of materials science.
Recognising the significant extent of poor-quality care and human rights issues in mental health, the World Health Organization launched the QualityRights initiative in 2013 as a practical tool for implementing human rights standards, including the United Nations Convention on the Rights of Persons with Disabilities (CRPD), at the ground level.
To describe the first large-scale implementation and evaluation of QualityRights as a scalable human rights-based approach in public mental health services in Gujarat, India.
This is a pragmatic trial involving implementation of QualityRights at six public mental health services chosen by the Government of Gujarat. For comparison, we identified three other public mental health services in Gujarat that did not receive the QualityRights intervention.
Over a 12-month period, the quality of services provided by those services receiving the QualityRights intervention improved significantly. Staff in these services showed substantially improved attitudes towards service users (effect sizes 0.50–0.17), and service users reported feeling significantly more empowered (effect size 0.07) and satisfied with the services offered (effect size 0.09). Caregivers at the intervention services also reported a moderately reduced burden of care (effect size 0.15).
To date, some countries have been hesitant to reform mental health services in line with the CRPD, which is partially attributable to a lack of knowledge and understanding about how this can be achieved. This evaluation shows that QualityRights can be effectively implemented even in resource-constrained settings and has a significant impact on the quality of mental health services.
To determine whether central findings from vestibular tests predict abnormal findings on magnetic resonance imaging.
This study was a retrospective case series at a tertiary referral centre. The main outcome measure of this diagnostic intervention study was the positive predictive value of central vestibular findings in relation to magnetic resonance imaging abnormalities.
Central vestibular findings had a 50.9 per cent positive predictive value for magnetic resonance imaging abnormalities overall, although the value varied by age group. Optokinetic nystagmus (p < 0.05) and abnormal findings on videonystagmography tests (p < 0.05) were the main predictors of magnetic resonance imaging abnormalities. White matter lesions constituted the bulk of the central lesions on magnetic resonance imaging, followed by cortical and cerebellar atrophy.
Given the 50.9 per cent positive predictive value of central vestibular findings across all age groups, magnetic resonance imaging is medically justified to further evaluate patients with central findings on vestibular studies, and it is reasonable to request it in these patients.
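As a reminder of the arithmetic behind a positive predictive value, the following sketch uses placeholder counts chosen only to reproduce the reported 50.9 per cent figure; they are not the study's actual counts:

```python
# Positive predictive value = true positives / (true positives + false positives).
# Hypothetical counts chosen solely to illustrate the calculation.
true_positives = 28    # central vestibular findings with an MRI abnormality
false_positives = 27   # central vestibular findings with a normal MRI

ppv = true_positives / (true_positives + false_positives)
print(f"PPV = {ppv:.1%}")   # ~50.9% with these placeholder counts
```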
To measure the association between receipt of specific infection prevention interventions and procedure-related cardiac implantable electronic device (CIED) infections.
Retrospective cohort with manually reviewed infection status.
Setting: National, multicenter Veterans Health Administration (VA) cohort.
Sampling of procedures entered into the VA Clinical Assessment Reporting and Tracking-Electrophysiology (CART-EP) database from fiscal years 2008 through 2015.
A sample of procedures entered into the CART-EP database underwent manual review for occurrence of CIED infection and other clinical/procedural variables. The primary outcome was 6-month incidence of CIED infection. Measures of association were calculated using multivariable generalized estimating equations logistic regression.
We identified 101 procedure-related CIED infections among 2,098 procedures (4.8% of reviewed sample). Factors associated with increased odds of infections included (1) wound complications (adjusted odds ratio [aOR], 8.74; 95% confidence interval [CI], 3.16–24.20), (2) revisions including generator changes (aOR, 2.4; 95% CI, 1.59–3.63), (3) an elevated international normalized ratio (INR) >1.5 (aOR, 1.56; 95% CI, 1.12–2.18), and (4) methicillin-resistant Staphylococcus colonization (aOR, 9.56; 95% CI, 1.55–27.77). Clinically effective prevention interventions included preprocedural skin cleaning with chlorhexidine versus other topical agents (aOR, 0.41; 95% CI, 0.22–0.76) and receipt of β-lactam antimicrobial prophylaxis versus vancomycin (aOR, 0.60; 95% CI, 0.37–0.96). The use of mesh pockets and continuation of antimicrobial prophylaxis after skin closure were not associated with reduced infection risk.
These findings regarding the real-world clinical effectiveness of different prevention strategies can be applied to the development of evidence-based protocols and infection prevention guidelines specific to the electrophysiology laboratory.
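A minimal sketch of a multivariable GEE logistic regression of the type described, clustering on facility; the dataset and variable names are hypothetical, not drawn from the VA cohort:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical procedure-level data: infection within 6 months (1/0), a few
# binary covariates, and a facility identifier used as the GEE cluster variable.
rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "facility": rng.integers(0, 60, n),
    "chlorhexidine": rng.integers(0, 2, n),
    "revision": rng.integers(0, 2, n),
    "inr_gt_1_5": rng.integers(0, 2, n),
})
logit = -3.2 - 0.9 * df["chlorhexidine"] + 0.9 * df["revision"] + 0.4 * df["inr_gt_1_5"]
df["infection"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE logistic regression with an exchangeable working correlation within
# facility; exponentiated coefficients are adjusted odds ratios (aORs).
gee = smf.gee("infection ~ chlorhexidine + revision + inr_gt_1_5",
              groups="facility", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(gee.params))
```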
Background: Micrographia is a rare neurological finding in isolation. Most cases of isolated micrographia have been found in association with focal ischemia of the left basal ganglia. Methods: We present a case of post-traumatic micrographia stemming from contusion to the left basal ganglia. We performed a detailed analysis of the patient’s writing at three-year follow-up. Results: A healthy 15-year-old male was admitted following a BM accident. CT showed contusion to the left basal ganglia/external capsule. MRI was negative for an underlying lesion. He had a short stay in the ICU and was then discharged. Two years later, he expressed concern regarding difficulty with small, cramped writing at school. Writing analysis revealed micrographia with spontaneous printing as well as printing to dictation, but not with copied English or Japanese writing. Conclusions: Isolated micrographia is a rare neurological finding. We present the occurrence of this symptom following gliding contusion to the left basal ganglia and external capsule.
Globally, grandparents are the main informal childcare providers with one-quarter of children aged ≤5 years regularly cared for by grandparents in Australia, the UK and USA. Research is conflicting; many studies claim grandparents provide excessive amounts of discretionary foods (e.g. high in fat/sugar/sodium) while others suggest grandparents can positively influence children’s diet behaviours. The present study aimed to explore the meaning and role of food treats among grandparents who provide regular informal care of young grandchildren.
Qualitative methodology utilising a grounded theory approach. Data were collected using semi-structured interviews and focus groups, then thematically analysed.
Participants were recruited through libraries, churches and playgroups in South Australia.
Grandparents (n 12) caring for grandchild/ren aged 1–5 years for 10 h/week or more.
Three themes emerged: (i) the functional role of treats (e.g. to reward good behaviour); (ii) grandparent role, responsibility and identity (e.g. the belief that grandparent and parent roles differ); and (iii) the rules regarding food treats (e.g. negotiating differences between own and parental rules). Grandparents favoured core-food over discretionary-food treats. They considered the risks (e.g. dental caries) and rewards (e.g. pleasure) of food treats and balanced their wishes with those of their grandchildren and parents.
Food treats play an important role in the grandparent–grandchild relationship and are used judiciously by grandparents to differentiate their identity and relationship from parents and other family members. This research offers an alternative narrative to the dominant discourse regarding grandparents spoiling grandchildren with excessive amounts of discretionary foods.
OBJECTIVES/SPECIFIC AIMS: Our objective was to compare the proteomics of HDL between youth with T1DM and healthy controls (HC). METHODS/STUDY POPULATION: We performed chromatography-based HDL purification and SWATH-MS-based proteomic quantitation. Proteomic alterations of HDL fractions and their associations with glycemic control were examined. Study population: 26 patients with T1DM and 13 HC. RESULTS/ANTICIPATED RESULTS: We quantified 78 proteins in isolated HDL using mass spectrometry and label-free SWATH quantification. Youth with T1DM had significantly higher protein levels of A1BG (P = 0.008), A2AP (P = 0.0448), APOA4 (P = 0.0366), CFAH (P = 0.0476), FHR2 (P = 0.0005), ITIH4 (P = 0.01) and PGRP2 (P = 0.0167), and lower levels of ALBU (P = 0.0164) and CO3 (P = 0.019), compared to HC. A1BG (r = 0.541, P < 0.001) and ITIH4 (r = 0.357, P = 0.026) were significantly positively correlated with HbA1c. DISCUSSION/SIGNIFICANCE OF IMPACT: Youth with T1DM have proteomic alterations of their HDL compared to HC, despite similar HDL cholesterol concentrations, that might affect the cardioprotective mechanisms of HDL. Future efforts should focus on investigating the role of these HDL-associated proteins in HDL function and their role in CVD risk in patients with T1DM.
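For illustration only, the kind of correlation reported above can be computed as in the sketch below; the values are simulated, not the study's measurements:

```python
import numpy as np
from scipy import stats

# Simulated per-subject values: a protein abundance (arbitrary units) and
# HbA1c (%) -- placeholders, not the study's data.
rng = np.random.default_rng(5)
hba1c = rng.normal(8.0, 1.5, 39)
protein = 0.4 * hba1c + rng.normal(0, 1.0, 39)

# Pearson correlation coefficient and its two-sided p-value.
r, p = stats.pearsonr(protein, hba1c)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")
```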
Among 300 advanced cancer patients with potential urinary tract infection (UTI), 19 had symptomatic UTI. Among the remaining patients (n = 281), 21% had asymptomatic bacteriuria or candiduria, and 14% received inappropriate therapy for a total of 279 antimicrobial days. Bacteriuria or candiduria predicted antimicrobial therapy: at 10,000 to <100,000 CFU/mL, the incidence rate ratio [IRR] was 16.9 (95% confidence interval [CI], 6.0–47.2), and at ≥100,000 CFU/mL, the IRR was 27.9 (95% CI, 10.9–71.2).
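A hedged sketch of how incidence rate ratios of this kind can be estimated with Poisson regression on person-time data; the variable names and numbers are illustrative only and not taken from the study:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative patient-level data: antimicrobial days as the count outcome,
# days at risk as the exposure time, and colony-count category as the predictor.
rng = np.random.default_rng(3)
n = 281
df = pd.DataFrame({
    "cfu_cat": rng.choice(["none", "10k_to_100k", "ge_100k"], n, p=[0.79, 0.11, 0.10]),
    "days_at_risk": rng.integers(20, 60, n),
})
rate = df["cfu_cat"].map({"none": 0.002, "10k_to_100k": 0.03, "ge_100k": 0.06})
df["abx_days"] = rng.poisson(rate * df["days_at_risk"])

# Poisson regression with a log(person-time) offset; exponentiated coefficients
# are incidence rate ratios (IRRs) relative to the reference category.
model = smf.glm("abx_days ~ C(cfu_cat, Treatment(reference='none'))",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df["days_at_risk"])).fit()
print(np.exp(model.params))
```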
OBJECTIVES/SPECIFIC AIMS: Our study aims to create a novel state-level HIV-ESRD dataset and compare patient-level characteristics on rates of transplant referral, evaluation, waitlisting, and transplantation for HIV-positive versus HIV-negative patients. Our main hypothesis is that HIV-positive patients in Georgia are less likely to be referred for kidney transplant than HIV-negative patients. METHODS/STUDY POPULATION: Three datasets will be merged to create the HIV-ESRD dataset: the United States Renal Data System (USRDS), a southeast Transplant Referral Dataset, and the patient-level Georgia Department of Public Health HIV Incidence Database. The resulting study population will include patients who are older than 18 but younger than 70, are HIV-positive, and are on dialysis in Georgia. The dataset will also identify those patients who were referred to transplantation, waitlisted, and received kidney transplants between January 2012 and December 2017. If, within a 1-year period, the prevalence of HIV-positive patients referred to transplant is lower than the 1-year period prevalence for HIV-negative patients for 3 consecutive years, the dialysis facility will be classified as having a within-facility disparity. We will then characterize patient-level and dialysis facility-level factors that may contribute to observed findings. Patient characteristics will include demographic and clinical data, proxies of socioeconomic status, and geospatial relationships to transplant centers and rural versus urban neighborhoods. Facility-level characteristics include profit status (profit vs. nonprofit), total number of staff (including full-time and part-time employees), aggregate demographic and clinical facility characteristics, and total number of treated patients. RESULTS/ANTICIPATED RESULTS: We anticipate the successful creation of the proposed dataset, which will allow accurate identification of HIV-positive patients on dialysis in Georgia and will provide the ability to determine referral, waitlisting, and transplantation rates. We predict that the overall rates of referral, waitlisting, and kidney transplantation in HIV patients will be relatively low, and that dialysis facilities treating a higher proportion of HIV-positive patients will have lower referral rates than facilities treating a higher proportion of HIV-negative patients. We expect that, among patient-level characteristics, the strongest predictor of decreased referral rates will be HIV serostatus, and that among dialysis facility factors, profit status will be associated with decreased referral rates. DISCUSSION/SIGNIFICANCE OF IMPACT: This pilot study offers the creation of the first regional dataset of HIV-ESRD patients that will include patient-level characteristics of HIV-positive patients and provide a model for other states to adopt. We will contribute an improved state-level description of incidence data for HIV-positive patients on dialysis and of current rates of transplant referral, waitlisting, and transplantation, and identify factors potentially associated with these processes. This knowledge will be used to determine the next steps in improving access to care: conducting qualitative research to understand dialysis facility views on transplant in HIV patients, understanding HIV patients’ positions on transplantation, providing education on the value of kidney transplant referral, and expanding the approach of combining patient-level HIV data to the rest of the southeast.
OBJECTIVES/SPECIFIC AIMS: In a randomized controlled trial in participants with HIV infection, recombinant human growth hormone (rhGH) reduced visceral adipose tissue (VAT); addition of rosiglitazone to rhGH prevented the accompanying decline in insulin sensitivity (SI). Within this parent RCT, we sought to determine the effect of rosiglitazone and rhGH intervention on alpha-1-acid glycoprotein (AGP), a biomarker of inflammation. We also investigated AGP as an independent risk factor for SI and VAT changes, along with any potential effect modification by AGP of the intervention. METHODS/STUDY POPULATION: Participants with HIV infection (n = 72) with abdominal adiposity and insulin resistance were randomized to rosiglitazone, rhGH, combination, or placebo for 12 weeks (NCT00130286). SI was determined by frequently sampled intravenous glucose tolerance test, and VAT by whole-body MRI. AGP concentrations were determined by immunoturbidimetric assay in available serum samples at baseline (time 0), 4, and 12 weeks (n = 41 participants with samples at all 3 time points). A linear mixed model was used to assess the impact of intervention over time on AGP concentrations. General linear models were used to assess baseline AGP concentrations as an independent predictor of SI and VAT changes by treatment group, with the model initially including age quartile, gender, race, ethnicity, BMI, HIV RNA <400 copies/mL, antiretroviral regimen, CD4 count, stavudine use, and zidovudine use, followed by stepwise removal of the least significant predictors. Effect modification was assessed by adding an interaction term between AGP and assigned intervention. RESULTS/ANTICIPATED RESULTS: AGP did not differ among treatment groups at baseline; overall median (Q1, Q3): 0.608 (0.526, 0.727) g/L, P = 0.92. Treatment with rosiglitazone, rhGH, or the combination significantly reduced AGP concentrations from baseline to week 12, compared to placebo (time-by-treatment interaction, P = 0.0038). Baseline AGP was not a significant predictor or effect modifier of SI change in response to treatment (P ≥ 0.50). Baseline AGP (g/L) was an independent predictor of VAT change (L) (β = 1.91, SE = 0.89, P = 0.038) in addition to a treatment effect (P < 0.001) and age quartile effect (P < 0.001). No other predictors or interactions were significant, including effect modification by AGP (AGP-by-treatment interaction, P = 0.50). DISCUSSION/SIGNIFICANCE OF IMPACT: It is known that immune and metabolic pathways are highly integrated, and biomarkers of inflammation have predictive ability for cardiovascular and metabolic disease outcomes. This analysis provides data showing that treatment with rosiglitazone or rhGH in the context of HIV reduces AGP concentrations, indicating efficacy in reducing systemic inflammation. Baseline AGP was an independent risk factor for VAT changes, as those with lower AGP at baseline showed a greater reduction in VAT in response to treatment. Biomarkers of inflammation may provide prognostic information for individualized patient outcomes to help guide treatment and follow-up.
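A minimal sketch of a linear mixed model for repeated AGP measurements with a random intercept per subject, broadly in the spirit of the analysis described; the data and variable names are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: AGP (g/L) at weeks 0, 4 and 12 per participant,
# with a four-arm treatment assignment -- illustrative only.
rng = np.random.default_rng(11)
subjects = np.repeat(np.arange(41), 3)
weeks = np.tile([0, 4, 12], 41)
treatment = np.repeat(rng.choice(["placebo", "rosi", "rhGH", "combo"], 41), 3)
agp = (0.6 + rng.normal(0, 0.05, 41).repeat(3)
       - 0.004 * weeks * (treatment != "placebo")
       + rng.normal(0, 0.03, 123))
df = pd.DataFrame({"subject": subjects, "week": weeks,
                   "treatment": treatment, "agp": agp})

# Linear mixed model with a random intercept per subject; the week-by-treatment
# interaction tests whether AGP trajectories differ by intervention.
mixed = smf.mixedlm("agp ~ week * treatment", data=df, groups=df["subject"]).fit()
print(mixed.summary())
```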
Darius J. Khambata practises before the Bombay High Court, Supreme Court of India and other High Courts and Tribunals across India.
Aditya N. Mehta practises before the Bombay High Court and various tribunals.
India's approach to foreign investment has evolved from strict protectionism to a liberalisation of its exchange control laws. This chapter analyses the safety and stability currently afforded to foreign investment in India by its laws and the legal redress available in India. Despite the compelling argument for attracting greater foreign investment and thus rapidly expanding its economy, India has only equivocally opened its doors to foreign capital. Yet as India advances to being one of the world's largest economies, new conundrums are arising, including whether India can adequately protect its own investments abroad.
THE HISTORICAL CONTEXT
India has been a trading nation from antiquity. Yet for several decades after 1947, when India attained independence from British rule, its economic philosophy reflected its reaction to Western colonialism and was largely protectionist.
The British were the predominant colonial power in South Asia (now India, Pakistan, Bangladesh, Sri Lanka and Burma). The devastation of Indian industry and trade over about 190 years of colonisation was achieved by taxation and expropriation. India was reduced to a deindustrialised exporter (largely to Britain) of raw materials such as cotton, jute, silk, coal, iron ore, rice, diamonds, spices and tea. Colonial rule left India penurious and distrustful of foreign business interests. In 1700, India's share of the world economy had been a staggering 23 percent, equal to that of all of Europe. At the end of British colonial rule it was 3 percent.
EXPROPRIATION AND PROTECTIONISM
India enacted the Foreign Exchange Regulation Act 1947 (FERA 1947) with the limited object of regulating the outflow of foreign exchange from India. FERA 1947 was initially introduced as a temporary measure, and it was only in 1957 that it was made permanent. FERA 1947 subjected persons and entities resident in India to exchange control regulation. The controls were not stringent and India permitted business to be carried on relatively freely by branches of foreign firms and companies in which non-residents had a substantial interest.
By the late 1960s India's economic policy had become protectionist and state-oriented. India nationalised several industries, including the Indian subsidiaries of the foreign oil companies Esso, Burmah-Shell and Caltex, all major private Indian banks in 1969, and jute companies in 1980.
This analysis was conducted to evaluate the evidence for the efficacy of iron biofortification interventions on iron status and functional outcomes. Iron deficiency is a major public health problem worldwide, with a disproportionate impact on women and young children, particularly those living in resource-limited settings. Biofortification, or the enhancement of micronutrient content in staple crops, is a promising and sustainable agriculture-based approach to improving nutritional status. Previous randomised efficacy trials and meta-analyses have demonstrated that iron-biofortification interventions improved iron biomarkers; however, no systematic reviews to date have examined the efficacy of biofortification interventions on health outcomes. We conducted a systematic review of the efficacy of iron-biofortified staple crops on iron status and functional outcomes: cognitive function (e.g. attention, memory) and physical performance. Five studies from three randomised efficacy trials (of rice, pearl millet and beans) conducted in the Philippines, India and Rwanda were identified for inclusion in this review. Iron status (Hb, serum ferritin, soluble transferrin receptor, total body iron, α-1-acid glycoprotein) was measured at baseline and endline in each trial; two studies reported cognitive outcomes, and no studies reported other functional outcomes. Meta-analyses were conducted using DerSimonian and Laird random-effects methods. Iron-biofortified crop interventions significantly improved cognitive performance in attention and memory domains, compared with conventional crops. There were no significant effects on categorical outcomes such as iron deficiency or anaemia. Further studies are needed to determine the efficacy of iron-biofortified staple crops on human health, including additional functional outcomes and other high-risk populations.
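A compact sketch of DerSimonian and Laird random-effects pooling, applied to hypothetical study-level effects and standard errors rather than the review's extracted data:

```python
import numpy as np

# Hypothetical study-level effects (e.g. standardised mean differences) and
# their standard errors -- placeholders, not the review's data.
effects = np.array([0.25, 0.10, 0.35, 0.18, 0.05])
se = np.array([0.10, 0.12, 0.15, 0.09, 0.20])

# DerSimonian-Laird estimate of the between-study variance (tau^2).
w_fixed = 1.0 / se**2
theta_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
q = np.sum(w_fixed * (effects - theta_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

# Random-effects pooled estimate and 95% confidence interval.
w_random = 1.0 / (se**2 + tau2)
theta_re = np.sum(w_random * effects) / np.sum(w_random)
se_re = np.sqrt(1.0 / np.sum(w_random))
lo, hi = theta_re - 1.96 * se_re, theta_re + 1.96 * se_re
print(f"tau^2 = {tau2:.4f}, pooled effect = {theta_re:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```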
Aberrations in reward and penalty processing are implicated in depression and putatively reflect altered dopamine signalling. This study exploits the advantages of a placebo-controlled design to examine how a novel D2 antagonist with adjunctive antidepressant properties modifies activity in the brain's reward network in depression.
We recruited 43 medication-naïve subjects across the range of depression severity (Beck's Depression Inventory-II score range: 0–43), including healthy volunteers, as well as people meeting full-criteria for major depressive disorder. In a double-blind placebo-controlled cross-over design, all subjects received either placebo or lurasidone (20 mg) across two visits separated by 1 week. Functional magnetic resonance imaging with the Monetary Incentive Delay (MID) task assessed reward functions via neural responses during anticipation and receipt of gains and losses. Arterial spin labelling measured cerebral blood flow (CBF) at rest.
Lurasidone altered fronto-striatal activity during anticipation and outcome phases of the MID task. A significant three-way Medication-by-Depression severity-by-Outcome interaction emerged in the anterior cingulate cortex (ACC) after correction for multiple comparisons. Follow-up analyses revealed significantly higher ACC activation to losses in high- v. low depression participants in the placebo condition, with a normalisation by lurasidone. This effect could not be accounted for by shifts in resting CBF.
Lurasidone acutely normalises reward processing signals in individuals with depressive symptoms. Lurasidone's antidepressant effects may arise from reducing responses to penalty outcomes in individuals with depressive symptoms.