Background: Telemedicine has been defined as the use of technology to provide healthcare when the provider and patient are geographically separated. Use of telemedicine to meet the needs of specific populations has become increasingly common across Canada. The current study employs the Ontario Telemedicine Network (OTN) to connect the emergency departments of a community hospital system and a pediatric tertiary care hospital. OTN functions through a two-way video-conferencing system, allowing physicians at the tertiary site to see and hear the patient being treated in the community hospitals. Aim Statement: The aim of this project is to ensure that essential care is provided to CTAS 1 and 2 pediatric patients who present to Niagara Health emergency departments and to increase the number of appropriate patient transfers. Measures & Design: Data for this project include a) description of common diagnoses, b) time of call, c) occurrence of transfers, and d) professional perceptions of the technology. A descriptive design was used, together with the implementation of quality improvement cycles as the intervention occurred. Quality improvement methodologies, including plan-do-study-act (PDSA) cycles, ensured continuous improvement to the process of OTN use, and therefore to patient safety, throughout the study. Evaluation/Results: Since the intervention was implemented on December 17, 2018, there have been a total of 19 cases, for which 4 transfers were requested. Changes to the process were made, including the addition of weekly technology tests and feedback to the health professionals involved to garner further support for its use. Results indicated that seizure was the most common diagnosis, accounting for 37% of cases. The majority of calls were placed after 19:00, with no calls placed between 24:00 and 10:00.
Discussion/Impact: Healthcare providers had positive perceptions of the technology, agreeing that decision making between on-site and remote teams was timely and collaborative, and that patient care and outcomes were improved with its use. The results of this study will be used to determine the benefits of employing telemedicine in the emergency departments of other hospital systems.
Cardiac disease has been the leading cause of overall maternal mortality in the UK since the 2002–2004 triennium. The maternal death rate from cardiac disease has increased from 1.65 per 100,000 maternities in the 1997–1999 triennium to 2.34 per 100,000 maternities in the 2013–2015 triennium. This is thought to be due to increasing maternal age, increasing levels of obesity and better recognition of cardiac pathology at autopsy.
Currently there is no consensus regarding how long antipsychotic medication should be continued following a first/single psychotic episode. Clinically, patients often request discontinuation after a period of remission. This is one of the first double-blind randomized controlled studies designed to address the issue.
Patients with DSM-IV schizophrenia and related psychoses (excluding substance-induced psychosis) who had remitted well following a first/single episode, and had remained well on maintenance medication for one year, were randomized to receive either maintenance therapy with quetiapine (400 mg/day) or placebo for 12 months. Relapse was defined by the presence of (i) an increase in at least one of the following PANSS psychotic symptom items to a threshold score (delusion, hallucinatory behaviour, conceptual disorganization, unusual thought content, suspiciousness); (ii) a CGI Severity of Illness score of 3 or above; and (iii) a CGI Improvement score of 5 or above.
A total of 178 patients were randomized, of whom 144 (80.9%) completed the study. The relapse rate was 33.7% (30/89) for the maintenance group and 66.3% (59/89) for the placebo group (log-rank test, chi-square=13.328, p<0.001). Relapse was not related to age or gender. Other significant predictors of relapse included medication status, pre-morbid schizotypal traits, verbal memory and soft neurological signs.
There is a substantial risk of relapse if medication is discontinued in remitted first-episode psychosis patients after one year of maintenance therapy. Conversely, 33.7% of patients who discontinued medication remained well.
Medication discontinuation in remitted single-episode patients after a period of maintenance therapy is a major clinical decision, and thus the identification of risk factors, controlling for medication status, is important.
Following a first/single episode of DSM-IV schizophrenia or related psychoses, remitted patients who had remained well on maintenance medication for at least one year were randomized to receive either maintenance therapy (quetiapine 400 mg/day) or placebo for 12 months.
A total of 178 patients were randomized. Relapse rates were 33.7% (30/89) in the maintenance group and 66.3% (59/89) in the placebo group. Potential predictors were initially identified in univariate Cox regression models (p<0.1) and were subsequently entered into a multivariate Cox regression model of relapse risk. Significant predictors included medication status (hazard ratio, 0.41; CI, 0.25–0.68; p=0.001); having more pre-morbid schizotypal traits (hazard ratio, 2.32; CI, 1.33–4.04; p=0.003); scoring lower in the logical memory test (hazard ratio, 0.94; CI, 0.90–0.99; p=0.028); and having more soft neurological signs (disinhibition) (hazard ratio, 1.33; CI, 1.02–1.74; p=0.039).
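The hazard ratios quoted above combine multiplicatively under the Cox model, which a short sketch can illustrate. This is not the study's analysis code: only the hazard-ratio values come from the abstract, while the predictor names, units, and the example covariate differences are hypothetical.

```python
# Hypothetical sketch (not the study's analysis code): under a Cox
# proportional hazards model, h(t|x) = h0(t) * exp(beta . x), so the
# reported hazard ratios HR = exp(beta) combine multiplicatively.
import math

# Hazard ratios quoted in the abstract, per one-unit change in each
# predictor (the unit/direction conventions here are illustrative).
HR = {
    "schizotypal_traits": 2.32,
    "logical_memory_score": 0.94,
    "soft_signs_disinhibition": 1.33,
}

def relative_hazard(deltas):
    """Relative hazard for a patient whose covariates differ from a
    reference patient by `deltas` units; the model is log-linear, so
    each HR is raised to the covariate difference and multiplied."""
    return math.prod(HR[k] ** d for k, d in deltas.items())

# Example: one unit more schizotypal traits, five points lower memory score.
rh = relative_hazard({"schizotypal_traits": 1, "logical_memory_score": -5})
```

Under these made-up covariate differences, the combined relative hazard is roughly threefold, which shows how modest per-unit ratios compound.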
Relapse predictors may help to inform clinical decisions about discontinuation of maintenance therapy, specifically for patients with a first/single episode of psychosis after at least one year of maintenance therapy.
We are grateful to Dr TJ Yao at the Clinical Trials Center, University of Hong Kong, for statistical advice. The study was supported by an investigator-initiated trial award from AstraZeneca and by the Research Grants Council of Hong Kong (Project number: 765505).
This paper investigates recidivism among criminal justice-involved clients of the Mount Sinai Hospital mental health court support program in Toronto, Canada. It also examines relationships between recidivism and factors including gender, age and ethnicity.
Clients were followed for up to 48 months after initial admission to the program, and the frequency of re-offense was observed. The significance of risk factors was analyzed using t-tests and chi-square tests.
A total of 191 clients were admitted to the Mount Sinai Hospital Court Support Program between September 2001 and June 2007. At first admission, the mean ± s.d. age was 35.8 ± 9.8 years (range=18-74 years; n=184); the median age was 35 years and the modal age 34 years. Of the 191 clients, 16 (8.4%) reoffended. Two of them (12.5%) had a third offense, and 1 (6.3%) had a total of four offenses within this tracking period. Re-offense appeared most likely between 13 and 24 months, and no re-offense was noted beyond 48 months. The gender distribution was not significantly different between reoffenders and non-reoffenders. The mean age at first admission also did not differ between reoffenders and non-reoffenders, nor did the distribution of ethnic groups.
The findings suggest that recidivism has no relationship with gender, age or ethnic group. The comprehensiveness and length of support services appear more important in preventing recidivism.
Continued reliance on chemical methods for controlling annual bluegrass has resulted in many populations evolving resistance to PRE and POST herbicides, particularly in warm-season turfgrass species such as zoysiagrass. Soil seedbank management is critically important when managing herbicide-resistant weeds. Fraise mowing (also spelled fraze, frase, and fraize) is a new turfgrass cultivation practice designed to remove aboveground biomass while allowing turf to regrow vegetatively. We hypothesized that this process would remove annual bluegrass seed and therefore be a mechanical means of controlling annual bluegrass in turfgrass. Zoysiagrass field plots were fraise-mowed in June 2015 only, June 2016 only, June 2015 and June 2016, or left untreated. The fraise mower was configured to remove the uppermost 25 mm of plot surface (i.e., 15-mm verdure and 10-mm soil). Annual bluegrass infestation was quantified in April following fraise mowing via grid count. Soil cores (10.8 cm diameter) were extracted from each plot after grid count data were collected to assess effects of fraise mowing on the soil seedbank. Moreover, replicated subsamples (7.6 L) of debris generated during fraise mowing were collected to better understand weed seed content removed during the fraise mowing process. Fraise mowing in June offered a slight reduction (24%) in annual bluegrass cover the following April. Whereas 28% of the seed in fraise-mowing debris consisted of annual bluegrass, there was no difference in the quantity of annual bluegrass seed remaining in the soil seedbank between fraise-mowed and non–fraise-mowed plots. Although fraise mowing may help to temporarily reduce existing annual bluegrass infestations via mechanical removal, the frequency and depth we studied did not effectively reduce the seedbank. Fraise mowing is a useful tool for providing mechanical suppression of annual bluegrass, but it is not a replacement for properly timed herbicide applications.
Red meat is an important dietary source of protein and many other essential nutrients, including omega(n)-3 polyunsaturated fatty acids (PUFA), which provide numerous benefits to human health. It is well known that grass-fed meat contains a more favourable fatty acid profile, compared to other feeding regimes, but the feasibility of grass finishing is in decline for many farmers/producers. Therefore, alternative methods to enhance the fatty acid profile of red meats, such as beef, are needed to meet increasing consumer demands for ‘healthier’ products. This study compared plasma PUFA concentrations across cattle finished on three different feeding regimes. Three farms supplied livestock to the current study, where cattle were fed three different feeding regimes for a minimum of 15 weeks prior to slaughter. Feeding regimes were ad lib concentrate (negative control), n3-enriched ad lib concentrate (treatment) or grass-fed only (positive control). Blood was collected at slaughter into EDTA tubes and plasma aliquots were stored at -80°C until analysis. A validated gas chromatography–mass spectrometry (GC-MS) method was used to quantify individual PUFA concentrations in mg/ml [linoleic acid (LA); arachidonic acid (AA); alpha-linolenic acid (ALA); eicosapentaenoic acid (EPA); docosapentaenoic acid (DPA); docosahexaenoic acid (DHA)]. Samples from 23, 49 and 40 animals (in control, treatment and grass groups, respectively) were available for the current analysis. One-way ANOVA tests revealed significant differences between groups in all PUFA concentrations quantified (all P < 0.026). Post-hoc (LSD) tests showed mean ± SD n3 PUFA concentrations were significantly different within all three groups (all P < 0.04), increasing from negative control (0.049 ± 0.013 mg/ml), to treatment (0.095 ± 0.034 mg/ml) and grass-fed groups (0.461 ± 0.132 mg/ml). The opposite was observed for mean ± SD n6 PUFA concentrations (1.060 ± 0.297 vs. 0.918 ± 0.267 vs. 
0.355 ± 0.085 mg/ml, respectively; all P < 0.02). Cattle finished on either treatment or grass regimes had a more favourable n6:n3 PUFA ratio, compared to negative control (11.98 and 0.79 vs. 22.65, respectively). This study demonstrates that the finishing diet can impact plasma PUFA concentrations of beef cattle. Animals finished on the n3-enriched concentrate had, on average, double the total n3 PUFA concentrations, as well as an improved n6:n3 ratio, compared to control cattle. These results provide preliminary data on an alternative n3-enriched feeding regime for beef cattle to improve PUFA concentrations. Further research, however, is required to confirm if such beneficial changes are also observed in bovine muscle, which would have direct benefits for consumers.
Antibody-mediated rejection is a major clinical challenge that limits graft survival. Various modalities of treatment have been reported in small studies of paediatric heart recipients. A novel approach is to use complement-inhibiting agents, such as eculizumab, which inhibits cleavage of C5 into C5a and C5b, thereby limiting formation of the membrane attack complex and terminal complement-mediated injury triggered by tissue-bound antibodies. This medical modality of treatment has theoretical advantages, but the collective experience with its use in the solid organ transplant community remains small. We add to this experience by combining 14 cases from six paediatric heart centres in this descriptive study.
Maternal inflammation in early pregnancy has been identified epidemiologically as a prenatal pathogenic factor for the offspring's later mental illness. Early newborn manifestations of the effects of maternal inflammation on human fetal brain development are largely unknown.
Maternal infection, depression, obesity, and other factors associated with inflammation were assessed at 16 weeks gestation, along with maternal C-reactive protein (CRP), cytokines, and serum choline. Cerebral inhibition was assessed by inhibitory P50 sensory gating at 1 month of age, and infant behavior was assessed by maternal ratings at 3 months of age.
Maternal CRP diminished the development of cerebral inhibition in newborn males but paradoxically increased inhibition in females. Similar sex-dependent effects were seen in mothers' assessment of their infant's self-regulatory behaviors at 3 months of age. Higher maternal choline levels partly mitigated the effect of CRP in male offspring.
The male fetal-placental unit appears to be more sensitive to maternal inflammation than the female. Effects are particularly marked on cerebral inhibition. Deficits in cerebral inhibition 1 month after birth, similar to those observed in several mental illnesses, including schizophrenia, indicate fetal developmental pathways that may lead to later mental illness. Deficits in early infant behavior follow. Early intervention before birth, including prenatal vitamins, folate, and choline supplements, may help prevent the fetal development of pathophysiological deficits that can have life-long consequences for mental health.
High-resolution Chirp sub-bottom data were obtained offshore from the Northern Channel Islands (NCI), California, to image submerged paleoshorelines and assess local uplift rates. Although modern bathymetry is often used for modeling paleoshorelines, Chirp data image paleoshorelines buried beneath sediment that obscures their seafloor expression. The NCI were a unified landmass during the last glacial maximum (LGM; ~20 ka), when eustatic sea level was ~120 m lower than present. We identified seven paleoshorelines, ranging from ~28 to 104 m in depth, across this now-submerged LGM platform. Paleoshoreline depths were compared to local sea-level curves to estimate ages, which suggest that some were reoccupied over multiple sea-level cycles. Additionally, previous studies determined conflicting uplift rates for the NCI, ranging from 0.16 to 1.5 m/ka. Our results suggest that a rate on the lower end of this range better fits the observed submerged paleoshorelines. Using the uplift rate of ~0.16 m/ka, we estimate that paleoshorelines formed during Marine Oxygen Isotope Stage 3, the LGM, and the Younger Dryas stade are preserved on the NCI platform. These results help clarify uplift rates for the NCI and illustrate the importance of sub-bottom data for mapping submerged paleoshorelines.
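The depth-to-age reasoning described above can be illustrated with a toy calculation: a shoreline cut at eustatic level E(t) and uplifted at rate u should now lie at depth -(E(t) + u·t). The sketch below is hypothetical; its linear sea-level curve is a deliberately crude stand-in for the published curves such a study would use, and only the ~0.16 m/ka uplift rate and the depth range come from the abstract.

```python
# Toy sketch (not the study's method): estimate a paleoshoreline's age by
# scanning candidate ages for the best match between its observed depth and
# the depth predicted from eustatic sea level plus tectonic uplift.
# The sea-level curve here is a crude linear stand-in, NOT a published curve.

def eustatic(t_ka):
    """Toy eustatic sea level (m relative to present): linear fall from
    0 m today to -120 m at the LGM (~20 ka)."""
    return -6.0 * t_ka if t_ka <= 20.0 else -120.0

def best_age(observed_depth_m, uplift_m_per_ka=0.16, t_max=20.0, dt=0.1):
    """Age (ka) minimising the mismatch between the observed depth and the
    predicted present-day depth -(eustatic(t) + uplift * t)."""
    best_t, best_err = 0.0, float("inf")
    t = 0.0
    while t <= t_max:
        predicted_depth = -(eustatic(t) + uplift_m_per_ka * t)
        err = abs(predicted_depth - observed_depth_m)
        if err < best_err:
            best_t, best_err = t, err
        t += dt
    return best_t
```

Under this toy curve, the deepest (~104 m) paleoshoreline maps to a near-LGM age, which is consistent with the qualitative argument above; a real analysis would substitute a published local sea-level curve.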
Introduction: Ureteral colic is a common painful disorder. Early surgical intervention is an attractive management option, but existing evidence does not clarify which patients benefit. Reflecting this lack of evidence, current national specialty guidelines provide conflicting recommendations regarding who is a candidate for early intervention. We compared treatment failure rates in patients receiving early intervention to those in patients offered spontaneous passage, to identify subgroups that benefit from early intervention. Methods: We used administrative data and structured chart review to study consecutive patients attending one of nine hospitals in two provinces with an index emergency department (ED) visit and a confirmed 2.0-9.9 mm ureteral stone. We described patient, stone and treatment variables, and used multivariable regression to identify factors associated with treatment failure, defined as the need for rescue intervention or hospitalization within 60 days. Our secondary outcome was the ED revisit rate. Results: Overall, 1168 (37.9%) of 3081 eligible patients underwent early intervention. Patients with small stones (<5 mm) experienced more treatment failures (31.5% v. 9.9%) and more ED revisits (38.5% v. 19.7%) with early intervention than with spontaneous passage. Patients with large stones (≥7.0 mm) experienced fewer treatment failures (34.7% v. 58.6%) and similar ED revisit rates with early intervention. Patients with intermediate-sized (5.0-6.9 mm) stones had fewer treatment failures with intervention (37.4% v. 55.5%), but only if the stones were in the proximal or middle ureter. Conclusion: This study clarifies stone characteristics that identify patients likely to benefit from early intervention. We recommend that low-risk patients with uncomplicated stones <5 mm generally undergo an initial trial of spontaneous passage, while high-risk patients with proximal or middle stones >5 mm, or any stone >7 mm, be offered early intervention.
Introduction: The optimal initial management approach for ureteral colic is unclear. Guidelines recommend spontaneous passage for most patients, but early stone intervention may rapidly terminate acute episodes. We compared 60-day treatment failure rates in matched patients undergoing early intervention versus spontaneous passage. Methods: We used administrative data and structured chart review to study all emergency department (ED) patients at nine Canadian hospitals who had an index ureteral colic visit and a computed tomography (CT)-confirmed 2.0-9.9 mm stone during 2014. Using Cox proportional hazards models, we assessed 60-day treatment failure, defined as hospitalization or rescue intervention, in patients undergoing early intervention compared to propensity-score-matched controls undergoing a trial of spontaneous passage. Results: From 3,081 eligible patients (mean age 51 years; 70% male), we matched 577 patients in each group (total 1,154). Control and intervention cohorts were balanced on all parameters, and propensity scores, which reflect the conditional probability that a patient would undergo early intervention, were similarly distributed. In the time-to-event analysis, 21.8% in both groups experienced the composite primary outcome of treatment failure (difference = 0%; 95% CI, -4.8 to 4.8%). Early intervention patients required more ED revisits (36.1% v. 25.5%; difference 10.6%; 95% CI, 5.3 to 15.9%) and more 60-day hospitalizations (20.1% v. 12.8%). The strongest predictors of adverse outcome were stone size, proximal or middle stone location, and ED length of stay. Conclusion: If applied broadly to patients with 2.0-9.9 mm ureteral stones, an early interventional approach was associated with similar rates of treatment failure but more hospitalizations and emergency revisits. Research clarifying the subgroups most likely to benefit will facilitate better targeting of early intervention, potentially reducing patient morbidity and improving system utilization.
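Propensity-score matching of the kind described can be sketched in a few lines. This is a generic illustration, not the study's procedure: the greedy 1:1 nearest-neighbour rule and the 0.05 caliper are common defaults assumed here, and the scores passed in are taken to be precomputed (e.g. from a logistic model of the probability of early intervention).

```python
# Generic illustration (not the study's procedure): greedy 1:1
# nearest-neighbour matching on precomputed propensity scores with a
# caliper. The 0.05 caliper and the greedy ordering are assumptions.

def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated propensity score with the closest unused control
    score within `caliper`; returns (treated_index, control_index) pairs."""
    unused = dict(enumerate(controls))
    pairs = []
    # A common heuristic: match the highest-scoring (hardest-to-match)
    # treated subjects first.
    for ti in sorted(range(len(treated)), key=lambda i: treated[i], reverse=True):
        best, best_d = None, caliper
        for ci, score in unused.items():
            d = abs(treated[ti] - score)
            if d <= best_d:
                best, best_d = ci, d
        if best is not None:
            pairs.append((ti, best))
            del unused[best]  # 1:1 matching: each control used at most once
    return pairs
```

Treated subjects with no control inside the caliper are simply left unmatched, which is why matched designs like this one end up with fewer patients (577 per arm) than the eligible pool.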
The Pain Catastrophizing Scale (PCS) measures three aspects of catastrophic cognitions about pain—rumination, magnification, and helplessness. To facilitate assessment and clinical application, we aimed to (a) develop a short version on the basis of its factorial structure and the items’ correlations with key pain-related outcomes, and (b) identify the threshold on the short form indicative of risk for depression.
Social centers for older people.
664 Chinese older adults with chronic pain.
Besides the PCS, pain intensity, pain disability, and depressive symptoms were assessed.
For the full scale, confirmatory factor analysis showed that the hypothesized 3-factor model fit the data moderately well. On the basis of the factor loadings, two items were selected from each of the three dimensions. An additional item significantly associated with pain disability and depressive symptoms, over and above these six items, was identified through regression analyses. A short-PCS composed of seven items was formed, which correlated at r=0.97 with the full scale. Subsequently, receiver operating characteristic (ROC) curves were plotted against clinically significant depressive symptoms, defined as a score of ≥12 on a 10-item version of the Center for Epidemiologic Studies-Depression Scale. This analysis showed a score of ≥7 to be the optimal cutoff for the short-PCS, with sensitivity = 81.6% and specificity = 78.3% when predicting clinically significant depressive symptoms.
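The cutoff selection described above can be illustrated with a minimal sketch that scans candidate cutoffs and scores each by Youden's J (sensitivity + specificity - 1), a common way to pick an operating point from an ROC analysis. The authors' exact selection criterion is not stated in the abstract, and the toy data below are invented.

```python
# Minimal sketch (not the authors' code) of picking a screening cutoff
# from an ROC-style analysis: evaluate sensitivity and specificity at
# each candidate cutoff and keep the one maximising Youden's J
# (J = sensitivity + specificity - 1). The toy data below are invented.

def best_cutoff(scores, depressed):
    """scores: short-form totals; depressed: matching booleans (True =
    clinically significant depressive symptoms). Returns the best
    (youden_j, cutoff, sensitivity, specificity)."""
    best = None
    for c in sorted(set(scores)):
        tp = sum(s >= c and d for s, d in zip(scores, depressed))
        fn = sum(s < c and d for s, d in zip(scores, depressed))
        tn = sum(s < c and not d for s, d in zip(scores, depressed))
        fp = sum(s >= c and not d for s, d in zip(scores, depressed))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# Toy example: scores of 7 and above perfectly separate the two groups.
j, cutoff, sens, spec = best_cutoff([2, 3, 7, 8, 9, 4],
                                    [False, False, True, True, True, False])
```

On real data the separation is imperfect, which is why the reported cutoff of ≥7 yields sensitivity and specificity below 100% (81.6% and 78.3%).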
The short-PCS may be used in lieu of the full scale and as a brief screen to identify individuals with serious catastrophizing.
This study investigated the characteristics of subjective memory complaints (SMCs) and their association with current and future cognitive functions.
A cohort of 209 community-dwelling individuals without dementia aged 47–90 years old was recruited for this 3-year study. Participants underwent neuropsychological and clinical assessments annually. Participants were divided into SMCs and non-memory complainers (NMCs) using a single question at baseline and a memory complaints questionnaire following baseline, to evaluate differential patterns of complaints. In addition, comprehensive assessment of memory complaints was undertaken to evaluate whether severity and consistency of complaints differentially predicted cognitive function.
SMC and NMC individuals were significantly different on various features of SMCs. Greater overall severity (but not consistency) of complaints was significantly associated with current and future cognitive functioning.
SMC individuals present distinctive features of memory complaints as compared to NMCs. Further, the severity of complaints was a significant predictor of future cognition. However, SMC did not significantly predict change over time in this sample. These findings warrant further research into the specific features of SMCs that may portend subsequent neuropathological and cognitive changes when screening individuals at increased future risk of dementia.
Some centres favour early intervention for ureteral colic while others prefer trial of spontaneous passage, and relative outcomes are poorly described. Calgary and Vancouver have similar populations and physician expertise, but differing approaches to ureteral colic. We studied 60-day hospitalization and intervention rates for patients having a first emergency department (ED) visit for ureteral colic in these diverse systems.
We used administrative data and structured chart review to study all Vancouver and Calgary patients with an index visit for ureteral colic during 2014. Patient demographics, arrival characteristics and triage category were captured from ED information systems, while ED visits and admissions were captured from linked regional hospital databases. Laboratory results were obtained from electronic health records and stone characteristics were abstracted from diagnostic imaging reports. Our primary outcome was hospitalization or urological intervention from 0 to 60 days. Secondary outcomes included ED revisits, readmissions and rescue interventions. A time-to-event analysis was conducted, and Cox proportional hazards modelling was performed to adjust for covariate imbalance.
We studied 3283 patients with CT-defined stones. Patient and stone characteristics were similar for the cities. Hospitalization or intervention occurred in 60.9% of Calgary patients and 31.3% of Vancouver patients (p<0.001). Calgary patients had higher index intervention rates (52.1% v. 7.5%), and experienced more ED revisits and hospital readmissions during follow-up. The data suggest that outcome events were associated with overtreatment of small stones in one city and undertreatment of large stones in the other.
An early interventional approach was associated with higher ED revisit, hospitalization and intervention rates. If these events are markers of patient disability, then a less interventional approach to small stones and earlier definitive management of large stones may reduce system utilization and improve outcomes for patients with acute ureteral colic.
The purpose of this study was to investigate whether significant differences exist in the radiation dose delivered to organs at risk during megavoltage computed tomography (MVCT) verification using three predefined scanning modes, namely fine (2 mm), normal (4 mm) and coarse (6 mm). This will inform the imaging protocol of tomotherapy for the left breast.
Materials and methods
Organ doses were measured using thermoluminescent dosimeters (TLD-100) placed within a female Rando phantom during MVCT imaging. A Kruskal–Wallis test, with p<0·05 taken as significant, was used to evaluate differences between the three MVCT scanning modes.
A statistically significant difference in organ absorbed dose existed between scan mode selections (p<0·001). Relative to the normal scan selection (4 mm), the absorbed dose to the organs of interest is scaled down by a factor of 0·7 for coarse scans (6 mm) and scaled up by a factor of 2·1 for fine scans (2 mm).
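The reported scaling factors lend themselves to a back-of-envelope estimate. The factors (0.7 and 2.1, relative to the normal 4 mm mode) come from the results above; the mode labels and the example dose value in this sketch are made up for illustration.

```python
# Back-of-envelope sketch using the scaling factors reported above.
# The factors (0.7 and 2.1, relative to the normal 4 mm mode) are from
# the study; the mode labels and example dose value are made up.

SCALE = {"fine_2mm": 2.1, "normal_4mm": 1.0, "coarse_6mm": 0.7}

def estimated_dose(normal_mode_dose, mode):
    """Estimate organ dose for a given MVCT scan mode from the dose
    measured (or planned) in the normal 4 mm mode; same units in and out."""
    return normal_mode_dose * SCALE[mode]

# e.g. a hypothetical 2.0 cGy organ dose in normal mode becomes roughly
# 1.4 cGy in coarse mode and 4.2 cGy in fine mode.
```

The threefold spread between coarse and fine modes is what motivates the protocol recommendation that follows.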
Optimisation of imaging protocols is of paramount importance to keep the radiation exposure ‘as low as reasonably achievable’. The recommendation to use the coarse mode for daily MVCT verification in breast tomotherapy not only mitigates the radiation exposure to normal tissues but also trims the scan-acquisition time.
This study examined the processing of derivational morphology and its association with measures of morphological awareness and literacy outcomes in 30 Dutch-speaking high-functioning adults with dyslexia and 30 controls matched for age and reading comprehension. A masked priming experiment was conducted in which the semantic overlap between morphologically related pairs was manipulated as part of a lexical decision task. Morphological awareness was assessed using a specifically designed sentence-completion task. Significant priming effects were found in each group, yet the adults with dyslexia benefited more from the morphological structure than the controls. The adults with dyslexia were influenced by both form (morpho-orthographic) and meaning (morphosemantic) properties of morphemes, while controls were mainly influenced by morphosemantic properties. These results suggest that morphological processing is intact in high-functioning adults with dyslexia and is a relative strength compared with controls matched for reading comprehension and age. The results thus support morphological processing as a potential factor in the reading compensation of adults with dyslexia. However, the adults with dyslexia performed significantly worse than controls on the morphological awareness measures.
As part of further investigations into three linked haemorrhagic fever with renal syndrome (HFRS) cases in Wales and England, 21 rats from a breeding colony in Cherwell and three rats from a household in Cheltenham were screened for hantavirus. Hantavirus RNA was detected in the lungs and/or kidneys of 17/21 (81%) of the Cherwell rats tested, a higher proportion than previously detected by blood testing alone (7/21, 33%), and in the kidneys of all three Cheltenham rats. The partial L gene sequences obtained from 10 of the Cherwell rats and the three Cheltenham rats were identical to each other and to the previously reported UK Cherwell strain. Seoul hantavirus (SEOV) RNA was detected in the heart, kidney, lung, salivary gland and spleen (but not the liver) of an individual rat from the Cherwell colony suspected of being the source of SEOV. Sera from 20/20 of the Cherwell rats and the two associated HFRS cases had high levels of SEOV-specific antibodies (by virus neutralisation). The high prevalence of SEOV at both sites and the moderately severe disease in the pet rat owners suggest that SEOV in pet rats poses a greater public health risk than previously considered.
The brain-derived neurotrophic factor (BDNF) Val66Met polymorphism Met allele exacerbates amyloid-β (Aβ)-related decline in episodic memory (EM) and hippocampal volume (HV) over 36–54 months in preclinical Alzheimer's disease (AD). However, the extent to which Aβ status and BDNF Val66Met are related to circulating markers of BDNF (e.g. in serum) is unknown. We aimed to determine the effect of Aβ and the BDNF Val66Met polymorphism on levels of serum mBDNF, EM, and HV at baseline and over 18 months.
Non-demented older adults (n = 446) underwent Aβ neuroimaging and BDNF Val66Met genotyping. EM and HV were assessed at baseline and 18 months later. Fasted blood samples were obtained from each participant at baseline and at 18-month follow-up. Aβ PET neuroimaging was used to classify participants as Aβ– or Aβ+.
At baseline, Aβ+ adults showed worse EM impairment and lower serum mBDNF levels relative to Aβ– adults. The BDNF Val66Met polymorphism did not affect serum mBDNF, EM, or HV at baseline. Over 18 months, compared with Aβ– Val homozygotes, Aβ+ Val homozygotes showed significant decline in EM and HV but not in serum mBDNF. Similarly, compared with Aβ+ Val homozygotes, Aβ+ Met carriers showed significant decline in EM and HV over 18 months but no change in serum mBDNF.
While allelic variation in BDNF Val66Met may influence Aβ+ related neurodegeneration and memory loss over the short term, this is not related to serum mBDNF. Longer follow-up intervals may be required to further determine any relationships between serum mBDNF, EM, and HV in preclinical AD.
In this paper, a control approach called Artificial Neural Tissue (ANT) is applied to multirobot excavation for lunar base preparation tasks, including clearing landing pads and burying habitat modules. We show, for the first time, a team of autonomous robots excavating a terrain to match a given three-dimensional (3D) blueprint. Constructing mounds around landing pads will provide physical shielding from debris during launch/landing. Burying human habitat modules under 0.5 m of lunar regolith is expected to provide radiation shielding and maintain temperatures of −25 °C. This minimizes base life-support complexity and reduces launch mass. ANT is compelling for a lunar mission because it does not require a team of astronauts for excavation and it requires minimal supervision. The robot teams are shown to autonomously interpret blueprints, excavate and prepare sites for a lunar base. Because little pre-programmed knowledge is provided, the controllers discover creative techniques. ANT evolves techniques such as slot-dozing that would otherwise require excavation experts. This is critical in making an excavation mission feasible when it is prohibitively expensive to send astronauts. The controllers evolve elaborate negotiation behaviors to work in close quarters. These and other techniques, such as concurrent evolution of the controller and team size, are shown to tackle the problem of antagonism, in which too many robots interfere with one another, reducing overall efficiency or, worse, resulting in gridlock. Although many challenges remain with this technology, our work shows a compelling pathway for field testing this approach.