Functional changes in the brain during aging can alter learning and memory, gait, and balance, in some cases leading to early cognitive decline, disability, or injurious falls among older adults. Dietary interventions with strawberry (SB) have been associated with improvements in neuronal, psychomotor, and cognitive function in rodent models of aging. We hypothesized that dietary supplementation with SB would improve mobility and cognition among older adults. In this study, 22 men and 15 women, between the ages of 60 and 75 years, were recruited into a randomized, double-blind, placebo-controlled trial in which they consumed either freeze-dried SB (24 g/d, equivalent to 2 cups of fresh SB) or a SB placebo for 90 days. Participants completed a battery of balance, gait, and cognitive tests at baseline and again at 45 and 90 days of intervention. Significant supplement group by study visit interactions were observed on tests of learning and memory. Participants in the SB group showed significantly shorter latencies in a virtual spatial navigation task (p = 0.020, ηp2 = 0.106) and increased word recognition in the California Verbal Learning Test (p = 0.014, ηp2 = 0.159) across study visits, relative to controls. However, no improvement in gait or balance was observed. These findings show that the addition of SB to the diets of healthy older adults can improve some aspects of cognition, but not gait or balance, although studies with larger samples and longer follow-up are needed to confirm this finding.
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.
The study used naturalistic data on the production of nominal prefixes in the Otopamean language Northern Pame (autonym: Xi'iuy) to test Whole Word (constructivist) and Minimal Word (prosodic) theories for the acquisition of inflection. Whole Word theories assume that children store words in their entirety; Minimal Word theories assume that children produce words as binary feet. Northern Pame uses obligatory portmanteau prefixes to inflect nouns for class, number, animacy and possessor. Singular nouns constitute 90 percent of the nouns that the children hear, and yet all five two-year-old children frequently omitted the singular noun prefixes but produced the low-frequency noun suffixes for dual and animate plural. Neither the children's production of the noun-class prefixes nor their prefix overextensions correlated with the adult type and token frequencies of production. Northern Pame children constructed Minimal Words that contain binary feet and disfavor the production of initial, extrametrical prefixes.
Background: Antimicrobial resistance is an urgent public health threat. Identifying trends in antimicrobial susceptibility can inform public health policy at the state and local levels.
Objective: To determine the ability of statewide antibiogram aggregation for public health surveillance to identify changes in antimicrobial resistance trends.
Design: Facility-level trend analysis.
Methods: Crude and adjusted trend analyses of the susceptibility of Escherichia coli and Klebsiella pneumoniae to particular antibiotics, as reported by aggregated antibiograms, were examined from 2008 through 2018. Multivariable regression analyses via generalized linear mixed models were used to examine associations between hospital characteristics and trends of E. coli and K. pneumoniae susceptibility to ciprofloxacin and ceftriaxone.
Results: E. coli and K. pneumoniae showed inverse trends in drug susceptibility over time. K. pneumoniae susceptibility to fluoroquinolones increased by 5% between 2008 and 2018 (P < .05). In contrast, E. coli susceptibility declined during the same period to ceftriaxone (6%), gentamicin (4%), and fluoroquinolones (4%) (P < .05). When compared to Boston hospitals, E. coli isolates from hospitals in other regions had a >4% higher proportion of susceptibility to ciprofloxacin and a >3% higher proportion of susceptibility to ceftriaxone (P < .05). Isolates of K. pneumoniae had higher susceptibility to ciprofloxacin (>3%) and ceftriaxone (>1.5%) in all regions when compared to Boston hospitals (P < .05).
Conclusions: Cumulative antibiograms can be used to monitor antimicrobial resistance, to discern regional and facility differences, and to detect changes in trends. Furthermore, because the number of years that hospitals contributed reports to the state-level aggregate had no significant influence on susceptibility trends, other states should not be discouraged by incomplete hospital compliance.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool but clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits. 
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
OBJECTIVES/GOALS: Between 2014 and 2019, the National Institutes of Health (NIH), through the National Center for Advancing Translational Sciences (NCATS), awarded about $2.7 billion to U.S. academic medical centers to build a national network of clinical and translational science program hubs that serve to meet its key goals and initiatives. Today there are about 60 Clinical and Translational Science Award (CTSA) program hubs. Each CTSA program hub has a corresponding website highlighting its clinical and translational science centered programs and activities. These websites are a critical communication gateway for promoting NCATS goals and initiatives. The objective of this research is to evaluate the NIH-funded CTSA program hub websites for NCATS goal and initiative content alignment, navigability, and interactivity. METHODS/STUDY POPULATION: Each CTSA program hub website was systematically evaluated for information or tools that align with the five NCATS/CTSA goals and the eight nationally identified CTSA program initiatives. Each NCATS goal and CTSA initiative was subsequently ranked by information diversity level (text, tool, interactivity) and navigation level (click distance from the home page). RESULTS/ANTICIPATED RESULTS: Four of the five NCATS goals are thoroughly and consistently represented across the CTSA Consortium, with workforce development, patient and community engagement, and quality and efficiency of research being the top three. Informatics is thoroughly and consistently represented, but not always clearly identified on the home page. The most underrepresented goal is integration of special and underserved populations, which was identified on only 60% of CTSA program hub websites. Of the eight CTSA program initiatives, the most common focus on program hub websites is the Trial Innovation Network; SMART IRB is a distant second.
The remaining six initiatives are severely underrepresented. DISCUSSION/SIGNIFICANCE OF IMPACT: The identification of these gaps among the CTSA program hubs builds an understanding of content management and website functionality across the consortium from three principal perspectives. First, it creates an understanding of CTSA program hub content alignment with the goals and initiatives of the funding source. Such an understanding presents an opportunity for the funding source to promote a better-aligned consortium, with improved collaboration pathways, through program hub website content standards. Second, it creates an opportunity for program hubs to understand and respond to the messaging their websites present as it relates to the funding source. Third, it provides an opportunity to identify the specific program initiatives and goals that CTSA institutions independently chose to highlight, which can open a dialogue toward better understanding the value of the program initiatives as they relate to the needs of CTSA program hubs. Ultimately, content alignment should lead CTSA websites to an improved user experience.
Interesterified (IE) fats are widely used to replace partially hydrogenated fats as hard fats with the functional and sensory properties needed for spreads/margarines, baked goods, and confectionery, while avoiding the health hazards of trans fats. Detailed mechanistic work to determine the metabolic effects of interesterification of commonly consumed hard fats has not yet been done. Earlier studies using less commonly consumed fats have shown either a neutral or a lowering effect on postprandial lipaemia. We investigated postprandial lipaemia, lipoprotein remodelling, and triacylglycerol-rich lipoprotein (TRL) fraction apolipoprotein concentrations following a common IE blend of palm oil/kernel fractions versus its non-IE counterpart, alongside a reference monounsaturated (MUFA) oil. A 3-armed, double-blind, randomized controlled trial (clinicaltrials.gov NCT03191513) in healthy adults (n = 20; 10 men, 10 women) aged 45–75 y assessed effects of single meals (897 kcal, 50 g fat, 16 g protein, 88 g carbohydrate) on postprandial plasma triacylglycerol (TAG) concentrations, lipoprotein profiles, and TRL fraction apolipoprotein B48 and TAG concentrations. Test fats were IE 80:20 palm stearin/palm kernel fat, the equivalent non-IE fat, and a high-MUFA reference oil (rapeseed oil, RO). Blood was collected at baseline and hourly for 8 h. Linear mixed modelling was performed, adjusting for treatment order and baseline values (ver. 24.0; SPSS Inc., Chicago, IL, USA). Total 8-h incremental areas under the curve (iAUC) for plasma TAG concentration were lower following IE and non-IE compared with RO (mean difference in iAUC: non-IE vs. RO −1.8 mmol/L·h (95% CI −3.3, −0.2); IE vs. RO −2.6 mmol/L·h (95% CI −5.3, 0.0)), but iAUCs for IE and non-IE were not significantly different. There were no differences between IE and non-IE in chylomicron fraction apoB48 concentrations or the TAG:apoB48 ratio.
No differences were observed between IE and non-IE for lipoprotein (VLDL, HDL, LDL) particle size or sub-class particle concentrations. However, LDL particle diameters were reduced at 5 and 6 h following IE vs RO (P < 0.05). XXL- (including chylomicron remnants and VLDL particles), XL- and L-VLDL particle concentrations (average diameters > 75, 64, and 53.6 nm respectively) were higher following IE and non-IE vs. RO at 6 h (P < 0.05) and 8 h postprandially (P < 0.005–0.05). In conclusion, both IE and non-IE palmitic acid-rich fats generated a greater preponderance of pro-atherogenic large TRL remnant particles in the late postprandial phase relative to an oleic acid-rich oil. However, the process of interesterification did not modify postprandial TAG response or lipoprotein metabolism.
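The 8-h iAUC reported above is a standard trapezoidal computation on hourly samples, with area below the baseline value clipped to zero. A minimal sketch of that calculation; the time points and TAG values below are illustrative, not the trial's data:

```python
def incremental_auc(times, values):
    """Incremental area under the curve (iAUC) above the baseline
    (time-zero) value, by the trapezoidal rule; area below baseline
    is clipped to zero, a common postprandial convention."""
    baseline = values[0]
    total = 0.0
    for i in range(1, len(times)):
        h0 = max(values[i - 1] - baseline, 0.0)  # height above baseline
        h1 = max(values[i] - baseline, 0.0)
        total += (h0 + h1) / 2.0 * (times[i] - times[i - 1])
    return total

# Illustrative hourly plasma TAG (mmol/L) over 8 h; baseline at t = 0.
times = [0, 1, 2, 3, 4, 5, 6, 7, 8]
tag = [1.1, 1.4, 1.9, 2.3, 2.1, 1.8, 1.5, 1.2, 1.1]
print(round(incremental_auc(times, tag), 2))  # 4.5 (mmol/L·h)
```

Group means of such per-participant iAUCs are what the mixed models above compare between test fats.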
Weeds can cause significant yield loss in watermelon production systems. Commercially acceptable weed control is difficult to achieve, even with heavy reliance on herbicides. A study was conducted to evaluate a spring-seeded cereal rye cover crop with different herbicide application timings for weed management between row middles in watermelon production systems. Common lambsquarters and pigweed species (namely, Palmer amaranth and smooth pigweed) densities and biomasses were often lower with cereal rye compared with no cereal rye, regardless of herbicide treatment. The presence of cereal rye did not negatively influence the number of marketable watermelon fruit, but average marketable fruit weight in cereal rye versus no cereal rye treatments varied by location. These results demonstrate that a spring-seeded cereal rye cover crop can help reduce weed density and weed biomass, and potentially enhance overall weed control. Cereal rye alone did not provide full-season weed control, so additional research is needed to determine the best methods to integrate spring cover cropping with other weed management tactics in watermelon for effective, full-season control.
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
Childhood adversity is associated with poor mental and physical health outcomes across the life span. Alterations in the hypothalamic–pituitary–adrenal axis are considered a key mechanism underlying these associations, although findings have been mixed. These inconsistencies suggest that other aspects of stress processing may underlie variations in these associations, and that differences in adversity type, sex, and age may be relevant. The current study investigated the relationship between childhood adversity, stress perception, and morning cortisol, and examined whether differences in adversity type (generalized vs. threat and deprivation), sex, and age had distinct effects on these associations. Salivary cortisol samples, daily hassle stress ratings, and retrospective measures of childhood adversity were collected from a large sample of youth at risk for serious mental illness, including psychoses (n = 605, mean age = 19.3). Results indicated that childhood adversity was associated with increased stress perception, which subsequently predicted higher morning cortisol levels; however, these associations were specific to threat exposures in females. These findings highlight the role of stress perception in stress vulnerability following childhood adversity and point to potential sex differences in the impact of threat exposures.
Timely herbicide applications for no-till soybean can be challenging given the diverse communities of both winter and summer annual weeds that are often present. Research was conducted to compare various approaches for nonselective and preplant weed control for no-till soybean. Nonselective herbicide application timings of fall (with and without a residual herbicide) followed by early-spring (4 wk before planting), late-spring (1 to 2 wk before planting), or sequential-spring applications (4 wk before planting and at planting) were compared. Spring applications also included a residual herbicide. For consistent control of winter annual weeds, two herbicide applications were needed, either a fall application followed by a spring application or sequential-spring applications. When a fall herbicide application did not include a residual herbicide, greater winter annual weed control resulted from early- or sequential-spring treatments. However, application timings that effectively controlled winter annual weeds did not effectively control summer annual weeds that have a prolonged emergence period. Palmer amaranth and large crabgrass control at 4 wk after planting was better when the spring residual treatment (chlorimuron plus metribuzin) was applied 1 to 2 wk before planting or at planting, compared with 4 wk before planting. Results indicate that in order to optimize control, herbicide application programs in soybean should coincide with seasonal growth cycles of winter and summer annual weeds.
Much of the interest in youth at clinical high risk (CHR) of psychosis has been in understanding conversion. Recent literature has suggested that less than 25% of those who meet established criteria for being at CHR of psychosis go on to develop a psychotic illness. However, little is known about the outcome of those who do not make the transition to psychosis. The aim of this paper was to examine clinical symptoms and functioning in the second North American Prodrome Longitudinal Study (NAPLS 2) of those individuals who, by the end of 2 years in the study, had not developed psychosis.
In NAPLS 2, 278 CHR participants completed 2-year follow-ups and had not made the transition to psychosis. At 2 years, the sample was divided into three groups: those whose symptoms were in remission, those who were still symptomatic, and those whose symptoms had become more severe.
There was no difference between those who remitted early in the study and those who remitted at 1 or 2 years. At 2 years, those in remission had fewer symptoms and improved functioning compared with the two symptomatic groups. However, all three groups had poorer social functioning and cognition than healthy controls.
A detailed examination of the clinical and functional outcomes of those who did not make the transition to psychosis did not contribute to predicting who may make the transition or who may have an earlier remission of attenuated psychotic symptoms.
The deep subsurface of other planetary bodies is of special interest for robotic and human exploration. The subsurface provides access to planetary interior processes, thus yielding insights into planetary formation and evolution. On Mars, the subsurface might harbour the most habitable conditions. In the context of human exploration, the subsurface can provide refugia for habitation from extreme surface conditions. We describe the fifth Mine Analogue Research (MINAR 5) programme at 1 km depth in the Boulby Mine, UK, in collaboration with Spaceward Bound NASA and the Kalam Centre, India, to test instruments and methods for the robotic and human exploration of deep environments on the Moon and Mars. The geological context in Permian evaporites provides an analogue to evaporitic materials on other planetary bodies such as Mars. A wide range of sample acquisition instruments (NASA drills, Small Planetary Impulse Tool (SPLIT) robotic hammer, universal sampling bags), analytical instruments (Raman spectroscopy, Close-Up Imager, MinION DNA sequencing technology, methane stable isotope analysis, biomolecule and metabolic life detection instruments) and environmental monitoring equipment (passive air particle samplers and particle detectors) was deployed in an integrated campaign. Investigations included studying the geochemical signatures of chloride and sulphate evaporitic minerals, testing methods for life detection and planetary protection around human-tended operations, and investigations of the radiation environment of the deep subsurface. The MINAR analogue activity occurs in an active mine, showing how the development of space exploration technology can contribute to addressing immediate Earth-based challenges. During the campaign, in collaboration with the European Space Agency (ESA), MINAR was used for astronaut familiarization with future exploration tools and techniques.
Primary and secondary school curriculum materials, including primary-to-secondary transition materials, were also developed on-site during the campaign and were focused on a classroom extravehicular activity simulation.
While our fascination with understanding the past is sufficient to warrant an increased focus on synthesis, solutions to important problems facing modern society require understandings based on data that only archaeology can provide. Yet, even as we use public monies to collect ever-greater amounts of data, modes of research that can stimulate emergent understandings of human behavior have lagged behind. Consequently, a substantial amount of archaeological inference remains at the level of the individual project. We can more effectively leverage these data and advance our understandings of the past in ways that contribute to solutions to contemporary problems if we adapt the model pioneered by the National Center for Ecological Analysis and Synthesis to foster synthetic collaborative research in archaeology. We propose the creation of the Coalition for Archaeological Synthesis coordinated through a U.S.-based National Center for Archaeological Synthesis. The coalition will be composed of established public and private organizations that provide essential scholarly, cultural heritage, computational, educational, and public engagement infrastructure. The center would seek and administer funding to support collaborative analysis and synthesis projects executed through coalition partners. This innovative structure will enable the discipline to address key challenges facing society through evidentially based, collaborative synthetic research.
Crop safety is an important consideration in determining PRE herbicide applications, especially when multiple herbicide sites of action are used. This research examined relative corn injury as the result of PRE applications containing ALS- and/or HPPD-inhibiting herbicides to a sandy loam soil. Herbicide premixes containing clopyralid, flumetsulam, isoxaflutole, mesotrione, rimsulfuron, tembotrione, thifensulfuron, and thiencarbazone were applied at twice the labeled rate. In general, isoxaflutole alone was the safest herbicide evaluated, while PRE applications of rimsulfuron-containing herbicides caused the most corn stunting, a lower recovery rate, and lower yields. However, POST applications of mesotrione plus rimsulfuron stunted corn less than 2%. Although there was little correlation between corn injury and yield, growers should be aware of other factors, such as soil texture and environment, that may affect crop production.
The developmental course of daily functioning prior to first psychosis-onset remains poorly understood. This study explored age-related periods of change in social and role functioning. The longitudinal study included youth (aged 12–23, mean follow-up years = 1.19) at clinical high risk (CHR) for psychosis (converters [CHR-C], n = 83; nonconverters [CHR-NC], n = 275) and a healthy control group (n = 164). Mixed-model analyses were performed to determine age-related differences in social and role functioning. We limited our analyses to functioning before psychosis conversion; thus, data of CHR-C participants gathered after psychosis onset were excluded. In controls, social and role functioning improved over time. From at least age 12, functioning in CHR was poorer than in controls, and this lag persisted over time. Between ages 15 and 18, social functioning in CHR-C stagnated and diverged from that of CHR-NC, who continued to improve (p = .001). Subsequently, CHR-C lagged behind in improvement between ages 21 and 23, further distinguishing them from CHR-NC (p < .001). A similar period of stagnation was apparent for role functioning, but to a lesser extent (p = .007). The results remained consistent when we accounted for the time to conversion. Our findings suggest that CHR-C start lagging behind CHR-NC in social and role functioning in adolescence, followed by a period of further stagnation in adulthood.
Metribuzin will control many problematic weed species in winter wheat in the mid-Atlantic states, including herbicide-resistant biotypes, but it has not been recommended due to crop safety concerns. In a three-year trial, metribuzin was applied at 105 or 210 g ai ha−1 to wheat at the PRE, 2-leaf (Feekes stage 1 to 2), early-spring (Feekes stage 3 to 4), and late-spring (Feekes stage 4 to 6) growth stages using wheat cultivars sensitive to metribuzin. Early-spring applications caused the least injury; injury at this timing was transient, and yield was not reduced. Yield loss was observed with the other application timings in at least one of the three years. Rainfall shortly after application appears to increase the risk of wheat injury.
Crop safety is one of the many considerations when deciding which POST herbicide to use. This research examined relative corn injury as a result of POST herbicides and the effect of including the safener isoxadifen, the choice of a sensitive or tolerant hybrid, or both. The herbicides included commercial combinations of dicamba, diflufenzopyr, nicosulfuron, rimsulfuron, and thifensulfuron, all at twice the labeled rate. Isoxadifen reduced twisting from dicamba plus diflufenzopyr but not with dicamba plus rimsulfuron. Isoxadifen had negligible effect on chlorosis. In general, rimsulfuron plus thifensulfuron caused the most corn stunting, whereas including isoxadifen or using a tolerant hybrid often reduced corn injury. In two of the four years, treatments with rimsulfuron plus thifensulfuron resulted in yield reductions. Although using products with isoxadifen or selecting tolerant hybrids may influence injury, herbicide selection will have the greatest effect on corn injury.
Integrated weed management (IWM) for agronomic and vegetable production systems utilizes all available options to effectively manage weeds. Late-season weed control measures are often needed to improve crop harvest and stop additions to the weed seed bank. Eliminating the production of viable weed seeds is one of the key IWM practices. The objective of this research was to determine how termination method and timing influence viable weed seed production of late-season weed infestations. Research was conducted in Delaware, Michigan, and New York over a 2-yr period. The weeds studied included common lambsquarters, common ragweed, giant foxtail, jimsonweed, and velvetleaf. Three termination methods were imposed: cutting at the plant base (simulating hand hoeing), chopping (simulating mowing), and applying glyphosate. The three termination timings were flowering, immature seeds present, and mature seeds present. Following termination, plants were stored in the field in mesh bags until mid-fall, when seeds were counted and tested for viability. Termination timing influenced viable seed development; however, termination method did not. Common ragweed and giant foxtail produced viable seeds when terminated at the time of flowering. All species produced some viable seed when immature seeds were present at the time of termination. The time of viable seed formation varied based on species and site-year, ranging from plants terminated the day of flowering to 1,337 growing degree d after flowering (base 10, 0 to 57 calendar d). Viable seed production was reduced by 64 to 100% when common lambsquarters, giant foxtail, jimsonweed, and velvetleaf were terminated with immature seeds present, compared to when plants were terminated with some mature seeds present. Our results suggest that terminating common lambsquarters, common ragweed, and giant foxtail prior to flowering, and velvetleaf and jimsonweed less than 2 and 3 wk after flowering, respectively, greatly reduces weed seed bank inputs.
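The growing degree day accumulation used above (base 10 °C) is conventionally the daily sum of max(0, mean temperature − base). A minimal sketch; the daily temperatures below are illustrative, not the study's data:

```python
def growing_degree_days(daily_max, daily_min, base=10.0):
    """Accumulate growing degree days (GDD): for each day, add the
    amount by which the daily mean temperature exceeds the base
    temperature, counting negative contributions as zero."""
    total = 0.0
    for tmax, tmin in zip(daily_max, daily_min):
        mean_temp = (tmax + tmin) / 2.0
        total += max(mean_temp - base, 0.0)
    return total

# Illustrative week of daily temperatures (deg C).
tmax = [28, 30, 27, 25, 29, 31, 26]
tmin = [16, 18, 15, 14, 17, 19, 15]
print(growing_degree_days(tmax, tmin))  # 85.0
```

Summing daily values from the date of flowering in this way gives the "growing degree d after flowering" scale used to report the timing of viable seed formation.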
Scott David works at the intersections of law and technology, where theory informs practice.
Barbara Endicott-Popovsky is Executive Director of the Center for Information Assurance and Cybersecurity at the University of Washington.
Introduction: the IM problem and solution landscape
This chapter suggests that the networked IM (information management) challenges of security, privacy and risk/liability are just symptoms of a single condition: the lack of agreement among stakeholders. This condition is acute, since networked IM systems (the ‘cloud’) operate as distributed, socio-technical systems, i.e. systems that simultaneously serve and are constituted from both people and technology acting in concert. However, IM systems are typically designed, developed and deployed as if they were composed solely of technology and as if the problems with their operation could be fixed by technology alone. This ignores the people who operate IM systems and thereby destabilizes the resulting solutions.
Technology solutions are a natural focus because their performance is more readily measurable than that of people and institutions. Unfortunately, like the man searching for his lost wristwatch only under the streetlight because the light is better there, we are unlikely to find the solutions we seek just because they are easier to look for. The actions of individuals and institutions in networked IM systems, not the technology, are the source of most current security, privacy and risk/liability concerns, and a focus on technology alone does not adequately address the system operational variables that arise from the behaviours of IM system stakeholders. That blind spot is revealed by the fact that the vast majority of data breaches result from human, rather than technological, factors. Technological fixes alone are insufficient for improving socio-technical systems where human negligence and intentional misconduct are the chief causes of a lack of system integrity. We need socio-technical processes to evolve socio-technical systems. Markets and other formal and informal rule-making processes generate enforceable ‘agreements’ among people that also reduce risk and offer an additional solution space for networked IM.
Big strides in improving security and privacy, together with the introduction of measures to mitigate liability, will allow stakeholders to develop their own performance standards, which can be measured meaningfully and provide feedback that can be used to police both technological and human failure in systems.