Objective:
To determine which anthropometric diagnostic criteria best discriminate higher from lower risk of death in children, and to explore programme implications.
Design:
A multiple-cohort, individual-data meta-analysis of mortality risk (within six months of measurement) by anthropometric case definitions. Sensitivity, specificity, informedness and inclusivity in predicting mortality were assessed, along with face validity and compatibility with current standards and practice, and operational consequences were modelled.
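Informedness, the least familiar of these metrics, is Youden's J: sensitivity plus specificity minus one. A minimal sketch with invented counts (not the study's data) shows the computation for a binary case definition:

```python
# Sketch: informedness (Youden's J) for an anthropometric case definition,
# from hypothetical 2x2 counts of death within six months vs. flagged status.
def informedness(tp: int, fn: int, fp: int, tn: int) -> float:
    """Sensitivity + specificity - 1 for a binary case definition."""
    sensitivity = tp / (tp + fn)  # deaths correctly flagged
    specificity = tn / (tn + fp)  # survivors correctly not flagged
    return sensitivity + specificity - 1

# Hypothetical counts for children flagged by "MUAC < 115 mm or WAZ < -3":
print(informedness(tp=80, fn=40, fp=900, tn=9000))  # ~0.58
```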
Setting:
Community-based cohort studies in 12 low-income countries between 1977 and 2013 in settings where treatment of wasting was not widespread.
Participants:
Children aged 6 to 59 months.
Results:
Of the 12 anthropometric case definitions, four had the highest informedness in predicting mortality: weight-for-age Z-score (WAZ) <-2; mid-upper-arm circumference (MUAC) <125 mm; MUAC <115 mm or WAZ <-3; and WAZ <-3. A combined case definition (MUAC <115 mm or WAZ <-3) was better at predicting deaths associated with weight-for-height Z-score (WHZ) <-3 and concurrent wasting and stunting (WaSt) than the single WAZ <-3 case definition. After assessment of all criteria, the combined case definition performed best. The simulated workload for programmes admitting on MUAC <115 mm or WAZ <-3, when adjusted with a proxy for required intensity and/or duration of treatment, was 1.87 times larger than for programmes admitting on MUAC <115 mm alone.
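The 1.87 workload ratio comes from the study's own simulation; the general mechanics (count admissions under each case definition, weighting each child by a proxy for treatment intensity) can be sketched as follows, with invented records and an invented proxy:

```python
# Illustration only: relative programme workload under two admission
# criteria. Records and the intensity proxy are invented, not the model
# used in the study.
children = [
    {"muac_mm": 110, "waz": -3.4},
    {"muac_mm": 118, "waz": -3.2},
    {"muac_mm": 121, "waz": -2.1},
    {"muac_mm": 112, "waz": -2.6},
]

def intensity(c):
    # Proxy: deeper deficits assumed to need longer or more intense care.
    return 2.0 if (c["muac_mm"] < 110 or c["waz"] < -3.5) else 1.0

muac_only = sum(intensity(c) for c in children if c["muac_mm"] < 115)
combined = sum(intensity(c) for c in children
               if c["muac_mm"] < 115 or c["waz"] < -3)
print(combined / muac_only)  # workload of combined admission vs MUAC alone
```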
Conclusions:
A combined case definition detects nearly all deaths associated with severe anthropometric deficits, suggesting that therapeutic feeding programmes may achieve higher impact (preventing mortality and improving coverage) by using it. Operational questions remain to be examined before wide-scale adoption can be recommended.
Objective:
To compare the prognostic value of mid-upper-arm circumference (MUAC), weight-for-height Z-score (WHZ) and weight-for-age Z-score (WAZ) for predicting death over periods of one, three and six months of follow-up in children.
Design:
Pooled analysis of 12 prospective studies examining survival after anthropometric assessment. Sensitivity and false-positive ratios to predict death within one, three and six months were compared for three individual anthropometric indices and their combinations.
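As a hedged illustration of this comparison, the sketch below computes sensitivity and the false-positive ratio of a baseline case definition against death within different follow-up windows; the records are invented:

```python
# Sketch with invented records: sensitivity and false-positive ratio of a
# baseline case definition for death within 1, 3 and 6 months of follow-up.
records = [
    {"death_day": 20, "flagged": True},
    {"death_day": 150, "flagged": False},
    {"death_day": None, "flagged": True},    # survived, flagged anyway
    {"death_day": None, "flagged": False},
]

def sens_fpr(records, window_days):
    died = [r for r in records
            if r["death_day"] is not None and r["death_day"] <= window_days]
    survived = [r for r in records
                if r["death_day"] is None or r["death_day"] > window_days]
    sens = sum(r["flagged"] for r in died) / len(died)
    fpr = sum(r["flagged"] for r in survived) / len(survived)
    return sens, fpr

for window in (30, 90, 180):  # one, three and six months
    print(window, sens_fpr(records, window))
```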
Setting:
Community-based, prospective studies from 12 countries in Africa and Asia.
Participants:
Children aged 6-59 months living in the study areas.
Results:
For all anthropometric indices, the receiver operating characteristic curves were higher for shorter than for longer durations of follow-up. Sensitivity for death was higher with one month of follow-up than with six months: by 49% (95% CI: 30-69%) for MUAC <115 mm (p<0.001), 48% (95% CI: 9.4-87%) for WHZ <-3 (p<0.01) and 28% (95% CI: 7.6-42%) for WAZ <-3 (p<0.005). This was accompanied by an increase in false positives of only 3% or less. For all durations of follow-up, WAZ <-3 identified more children who died and were not identified by WHZ <-3 or by MUAC <115 mm, 120 mm or 125 mm, but the use of WAZ <-3 increased the false-positive ratio to up to 16.4% (95% CI: 12.0-20.9%), compared with 3.5% (95% CI: 0.4-6.5%) for MUAC <115 mm alone.
Conclusions:
Frequent anthropometric measurements significantly improve the identification of malnourished children with a high risk of death without markedly increasing false positives. Combining two indices increases sensitivity but also increases false positives among children meeting case definitions.
How can psychiatrists best provide care in complex, sometimes overwhelming disasters? COVID-19 strained every aspect of health care to the breaking point, from finances to pharmaceutical supply lines. We can expect more challenges to prescribing in the future, as shown by recent hurricanes in Puerto Rico, fires in California, and ice storms in Texas. When medications become scarce or inaccessible, clinicians need to make difficult prescribing decisions. We suggest that a culture of deprescribing, a systematic approach to reducing or simplifying medications, could be applied to a wide variety of crises. Deprescribing is defined as the planned reduction of medications to improve patient health or to reduce side effects (see deprescribing.org). It has been used to reduce polypharmacy in geriatric and other complex populations. It provides evidence-based guidance for phasing out many classes of medications. It is part of the larger program to reduce waste in health care and to make pharmacy more rational. Disasters and resource scarcity, however, require a different approach. In contrast to routine care focused on individual patients, crisis standards of care (CSC) shift the clinical focus to the community. Instead of deprescribing guidelines for individual clinicians, CSC deprescribing would be national policies addressing shortages of important medications. We conducted a scoping review looking for studies of deprescribing in a crisis.
Methods/Results
We retrieved 1340 references from Google Scholar (2016 to 2021) using (deprescribing) AND (disaster OR crisis OR climate OR pandemic OR supply lines). A scan of texts found 160 references matching our criteria; only 19 of them addressed deprescribing as a strategy to strengthen health systems or providers in an emergency. Most of those related to scarce supplies during COVID, and a few addressed the carbon impact of medications. We also reviewed related literatures on medication supply-chain vulnerabilities, WHO Essential Medicines, and healthcare rationing.
Implications
Deprescribing gained attention during the COVID pandemic as a response both to disrupted supply lines and to the need to improve patient safety. Writers concerned with climate change support deprescribing to reduce the carbon impact of medications. Deprescribing as crisis policy could help streamline national stockpiles, supply chains, and manufacturing. Education could make deprescribing second nature for clinicians, potentially decreasing stress and increasing flexibility in future emergencies. Barriers to deprescribing generally include cultural inertia, industry lobbying, gaps in education, and malpractice fears. In a crisis, deprescribing guidelines could give clinicians confidence and flexibility while conserving scarce resources. Research is needed to evaluate deprescribing guidelines for crises, especially to ensure equity in how they reduce polypharmacy and save money.
The COVID-19 crisis has severely stressed our healthcare system and pushed our economy to the brink. This long emergency will probably cause years of severe suffering in every region. Health expenses greatly increased, supply chains were disrupted, and governments coped with much less revenue. Good clinicians plan for all contingencies, and we need to consider that the current disaster may get much worse. How can we adapt psychiatry to a long emergency? This goes far beyond previous work on crisis standards of care because the emergency is severe, prolonged, and widespread. If we had to spend much less on psychotropics, which medications would stay on the formulary? If we had to close hospitals, which patients would get a bed? What adaptations could be used if demand exceeds the supply of providers? Very little is known about how to make severe, permanent cuts to healthcare. Our previous systematic review found no scholarship addressing the ethics of severe and prolonged healthcare rationing. Global catastrophes need a global health policy, but this one has no experts. The present study starts the project by surveying experts whose related experience could inform future plans.
Method
We used purposive sampling to find 18 professionals with experience in healthcare rationing in underserved and indigenous communities, homeless programs, and African nations. We also interviewed ethicists, pharmacists, administrators, NGO clinicians, and military personnel. Interviews were transcribed and coded using basic inductive techniques. Because so little is known about this topic, we used grounded theory, an iterative approach, to guide further sampling, refine interviews, and draw some preliminary conclusions.
Results
Participants all agreed that this crisis planning is extremely important and complex. They described diverse concerns regarding ethical decision making: some had confidence in top-down government policy, while others recommended a grassroots approach. Minority participants had less confidence in government. There was no consensus on a best ethical framework. Most were confident that clinicians will ultimately do the right thing. Native American leaders had confidence in a holistic, preventive approach. All agreed that social justice should be central in measuring the economic impact of long emergencies and choosing ethical options. We collected suggestions for innovative approaches to rationing.
Conclusions
This research program illuminates the difficult ethical questions about adapting psychiatry to a prolonged, widespread, and severe emergency. Our interviews identify areas where severe but ethical cuts can be made in medications, hospitals, clinical staff, and administration. Next steps include evidence-based formularies, utilitarian staff cuts, and ethical standards for closing beds or revamping state hospitals. Underserved and diverse communities with rationing experience must have a voice in the discussion.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to be used to evaluate specific aspects of the instrument performance of both X-ray and neutron powder diffractometers. This report describes SRM 640f, the seventh generation of this powder diffraction SRM, which is designed to be used primarily for calibrating powder diffractometers with respect to line position; it also can be used for the determination of the instrument profile function. It is certified with respect to the lattice parameter and consists of approximately 7.5 g of silicon powder prepared to minimize line broadening. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the lattice parameter of the Si powder. Both statistical and systematic uncertainties have been assigned to yield a certified value for the lattice parameter at 22.5 °C of a = 0.5431144 ± 0.000008 nm.
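To make the certified value concrete, here is a minimal sketch (not part of the certificate) of how a lattice parameter maps to reference 2θ line positions via Bragg's law; the Cu Kα1 wavelength below is a standard literature value, an assumption rather than certified data:

```python
import math

# Sketch: expected 2-theta line positions for diamond-cubic Si computed
# from the certified lattice parameter via Bragg's law. The Cu K-alpha1
# wavelength is a standard literature value, not part of the certificate.
A = 0.5431144         # certified lattice parameter, nm (at 22.5 C)
WAVELENGTH = 0.15406  # Cu K-alpha1 wavelength, nm (assumed)

for hkl in [(1, 1, 1), (2, 2, 0), (3, 1, 1)]:
    d = A / math.sqrt(sum(i * i for i in hkl))           # d-spacing, nm
    two_theta = 2 * math.degrees(math.asin(WAVELENGTH / (2 * d)))
    print(hkl, round(two_theta, 2))
```

The first allowed silicon reflection, (111), lands near 28.44° 2θ, which is why that line is a familiar calibration check.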
Recent investigations suggest that cerebrovascular reactivity (CVR) is impaired in Alzheimer’s disease (AD) and may underpin part of the disease’s neurovascular component. However, our understanding of the relationship between the magnitude of CVR, the speed of cerebrovascular response, and the progression of AD is still limited. This is especially true in patients with mild cognitive impairment (MCI), which is recognized as an intermediate stage between normal aging and dementia. The purpose of this study was to investigate AD and MCI patients by mapping repeatable and accurate measures of cerebrovascular function, namely the magnitude and speed of cerebrovascular response (τ) to a vasoactive stimulus, in key predilection sites for vascular dysfunction in AD.
Methods:
Thirty-three subjects (age range: 52–83 years, 20 males) were prospectively recruited. CVR and τ were assessed using blood oxygen level-dependent MRI during a standardized carbon dioxide stimulus. Temporal and parietal cortical regions of interest (ROIs) were generated from anatomical images using the FreeSurfer image analysis suite.
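One common way to estimate the speed of response, assumed here for illustration (the study's exact pipeline may differ), is to convolve the CO2 stimulus with a single-exponential kernel and fit the time constant τ by least squares. A sketch with synthetic signals:

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of one common tau-estimation approach (an assumption for
# illustration, not necessarily this study's pipeline): convolve the CO2
# trace with a unit-area exponential kernel and fit tau by least squares.
t = np.arange(0, 300, 1.0)                        # seconds, 1 s sampling
co2 = np.where((t > 60) & (t < 180), 10.0, 0.0)   # synthetic CO2 step, mmHg

def model(t, amplitude, tau):
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()                        # unit-area kernel
    return amplitude * np.convolve(co2, kernel)[: len(t)]

rng = np.random.default_rng(0)
bold = model(t, 1.5, 40.0) + rng.normal(0, 0.2, len(t))  # synthetic BOLD

(amplitude, tau), _ = curve_fit(model, t, bold, p0=(1.0, 20.0))
print(f"CVR magnitude ~ {amplitude:.2f} per mmHg, tau ~ {tau:.0f} s")
```

A larger fitted τ corresponds to a slower cerebrovascular response, the quantity compared between groups in the Results.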
Results:
Of the 33 subjects recruited, 3 were excluded, leaving 30 for analysis: 6 with early AD, 11 with MCI, and 13 older healthy controls (HCs). τ was significantly higher in the AD group than in the HC group in both the temporal (p = 0.03) and parietal (p = 0.01) cortices, following a one-way ANCOVA correcting for age and microangiopathy score, with Bonferroni post-hoc correction.
Conclusion:
The study findings suggest that AD is associated with a slowing of the cerebrovascular response in the temporal and parietal cortices.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to evaluate specific aspects of instrument performance of both X-ray and neutron powder diffractometers. This report describes SRM 660c, the fourth generation of this powder diffraction SRM, which is used primarily for calibrating powder diffractometers with respect to line position and line shape for the determination of the instrument profile function (IPF). It is certified with respect to lattice parameter and consists of approximately 6 g of lanthanum hexaboride (LaB6) powder. To make this SRM applicable to the neutron diffraction community, the powder was prepared from an isotopically enriched 11B precursor material. The microstructure of the LaB6 powder was engineered specifically to yield a crystallite size above that where size broadening is typically observed and to minimize the crystallographic defects that lead to strain broadening. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the lattice parameter of the LaB6 powder. Both Type A, statistical, and Type B, systematic, uncertainties have been assigned to yield a certified value for the lattice parameter at 22.5 °C of a = 0.415 682 6 ± 0.000 008 nm (95% confidence).
Rapid diagnosis of dementia is essential to ensure optimum patient care. This study used real-world data to quantify the dementia diagnostic pathway in Australia.
Design:
A real-world, cross-sectional survey of physicians and patients.
Setting:
Clinical practice.
Participants:
Primary care or specialist physicians managing patients with cognitive impairment (CI).
Measurements:
Descriptive analyses focused on key events in the diagnostic pathway. Regression modeling related the duration between first consultation and formal diagnosis to various factors.
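As a hedged illustration of the regression step (data and field names below are hypothetical, not the survey's actual variables):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative sketch: relate time from first consultation to diagnosis
# to CI severity and referral status. Data and field names are invented.
df = pd.DataFrame({
    "months_to_diagnosis": [1.0, 2.5, 6.0, 12.0, 0.5, 4.0],
    "ci_severity": ["mild", "moderate", "mild", "mild", "severe", "moderate"],
    "referred": [0, 1, 1, 1, 0, 1],
})

fit = smf.ols("months_to_diagnosis ~ C(ci_severity) + referred", data=df).fit()
print(fit.params)  # coefficients for severity levels and referral
```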
Results:
Data for 600 patients were provided by 60 physicians. Mean time from initial symptoms to first consultation was 6.1 ± 4.4 months; 20% of patients had moderate or severe CI at first consultation. Mean time from first consultation to formal diagnosis was 4.0 ± 7.4 months (1.2 ± 3.6 months if not referred to a secondary physician, and 5.3 ± 8.3 months if referred). Time from first consultation to diagnosis was significantly associated with CI severity at first consultation: time was shorter with more severe CI. There was no association between disease severity and referral to a secondary physician; 69.5% of patients were referred, the majority (57.1%) to a geriatrician. The highest proportion of patients (47.4%) were diagnosed by geriatricians. Some form of test or scale was used to aid diagnosis in 98.8% of patients.
Conclusions:
A substantial number of Australians experience cognitive decline and behavioral changes some time before consulting a physician or being diagnosed with dementia. Increasing public awareness of the importance of early diagnosis is essential to improve the proportion of patients receiving comprehensive support prior to disease progression.
Electronic health records (EHRs) hold great promise for identifying cohorts and enhancing research recruitment. Such approaches are sorely needed, but there are few descriptions in the literature of prevailing practices to guide their use. A multidisciplinary workgroup was formed to examine current practices in the use of EHRs for recruitment and to propose future directions. The group surveyed consortium members regarding current practices. Over 98% of Clinical and Translational Science Award (CTSA) Consortium institutions responded to the survey. Brokered and self-service data warehouse access are in early or full operation at 94% and 92% of institutions, respectively, whereas EHR alerts to providers and to research teams are at 45% and 48%, respectively, and use of patient portals for research is at 20%. However, these percentages rise to 88% and above if planning and exploratory work are considered cumulatively. For most approaches, implementation reflected perceived demand. Regulatory and workflow processes were similarly varied, and many respondents described substantive restrictions arising from logistical constraints and limitations on collaboration and data sharing. Survey results reflect wide variation in implementation and approach, and point to a strong need for comparative research and development of best practices to protect patients and facilitate interinstitutional collaboration and multisite research.
Greenhouse studies were conducted to determine the host status of weed species for Rhizoctonia solani AG-1, which causes Rhizoctonia foliar blight (RFB) of soybean. Weed species were barnyardgrass, broadleaf signalgrass, common cocklebur, entireleaf morningglory, hemp sesbania, itchgrass, johnsongrass, large crabgrass, northern jointvetch, prickly sida, purple nutsedge, redweed, sicklepod, and smooth pigweed. Seedling weeds were inoculated with suspensions containing intraspecific group IA and IB isolates of the fungus. In the first study, sclerotia of IA were recovered from tissue of all weeds except smooth pigweed, and mycelia of IA were recovered from all except smooth pigweed and redweed. In that study, neither microsclerotia nor mycelia of IB were recovered from sicklepod, barnyardgrass, or large crabgrass, and only microsclerotia were recovered from itchgrass and purple nutsedge. In the second study, sclerotia of IA, microsclerotia of IB, and mycelia of each isolate were recovered from all weed species. In other studies, R. solani spread from at least six of seven weed species to a noninfected soybean plant growing in close proximity. These studies emphasize the importance of weed control, not only for reducing plant competition and increasing yield, but also for its potential impact on the development of RFB.
Field studies evaluated the response of soybean to Rhizoctonia foliar blight (RFB) in combination with varying densities of common cocklebur, hemp sesbania, or johnsongrass. Soybean plants at both V10 and R1 growth stages were either not inoculated or inoculated with suspensions containing equal concentrations of Rhizoctonia solani AG-1 IA and IB mycelia. Intensity of RFB was rated weekly beginning at the V1 soybean growth stage, and the data were used to determine areas under the disease progress curve (AUDPC). Intensity of RFB was greater in 1993 than in 1994. When averaged across weed species and weed densities, soybean yield in 1993 was reduced 18% in plots inoculated with R. solani compared with those not inoculated. Intensity of RFB, however, did not differ between inoculated and noninoculated plots in 1994. Interactions between R. solani and weed density for RFB intensity and yield were not significant in either year. Soybean yields in 1994, however, were reduced by hemp sesbania and johnsongrass in inoculated plots. Soybean maturity was delayed in both years when hemp sesbania was present.
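AUDPC is conventionally computed by trapezoidal integration of repeated ratings over time: AUDPC = Σ ((y_i + y_{i+1})/2) × (t_{i+1} − t_i). A minimal sketch with hypothetical ratings:

```python
# Sketch: area under the disease progress curve (AUDPC) via the standard
# trapezoid rule. Assessment days and ratings below are hypothetical.
def audpc(days, severity):
    """Trapezoidal AUDPC from assessment days and disease ratings."""
    return sum((severity[i] + severity[i + 1]) / 2 * (days[i + 1] - days[i])
               for i in range(len(days) - 1))

days = [0, 7, 14, 21, 28]       # weekly ratings
severity = [0, 5, 15, 30, 40]   # percent foliage blighted (hypothetical)
print(audpc(days, severity))    # 490.0 percent-days
```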
Acifluorfen, alachlor, glufosinate, glyphosate, paraquat, and pendimethalin were evaluated for their effects on mycelial growth and sclerotia/microsclerotia production by Rhizoctonia solani AG-1 IA and IB in culture. All of these herbicides except glufosinate and glyphosate were evaluated for effects on severity of Rhizoctonia foliar blight of soybean in the field. In laboratory studies, all herbicides reduced colony radius of R. solani. Growth reductions for IB were greater than for IA in the presence of pendimethalin, alachlor, and acifluorfen, but glufosinate reduced growth of IA more than IB. Sclerotia production by both isolates was prevented by paraquat, greatly reduced by glufosinate, but markedly less affected by the other herbicides tested. In field studies, all tested herbicides influenced severity of Rhizoctonia foliar blight when disease pressure was low, but only paraquat reduced severity when disease pressure was high.
Objective
To determine whether living in a food swamp (≥4 corner stores within 0·40 km (0·25 miles) of home) or a food desert (generally, no supermarket or access to healthy foods) is associated with consumption of snacks/desserts or fruits/vegetables, and whether neighbourhood-level socio-economic status (SES) confounds these relationships.
Design
Cross-sectional. Assessments included diet (Youth/Adolescent FFQ, skewed dietary variables normalized) and measured height/weight (BMI-for-age percentiles/Z-scores calculated). A geographic information system geocoded home addresses and mapped food deserts/food swamps. Associations examined using multiple linear regression (MLR) models adjusting for age and BMI-for-age Z-score.
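As an illustration of the food-swamp classification (≥4 corner stores within 0·40 km of home), a sketch using great-circle distances; all coordinates are invented:

```python
import math

# Sketch: flag a "food swamp" home (>= 4 corner stores within 0.40 km).
# All coordinates are invented for illustration.
def km(a, b):
    """Haversine distance in km between (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

home = (39.2904, -76.6122)
corner_stores = [(39.2910, -76.6130), (39.2899, -76.6110),
                 (39.2920, -76.6100), (39.2890, -76.6140),
                 (39.3050, -76.6000)]

nearby = sum(km(home, s) <= 0.40 for s in corner_stores)
print("food swamp:", nearby >= 4)   # True: four stores within 0.40 km
```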
Setting
Baltimore City, MD, USA.
Subjects
Early adolescent girls (6th/7th grade, n 634; mean age 12·1 years; 90·7 % African American; 52·4 % overweight/obese), recruited from twenty-two urban, low-income schools.
Results
Girls’ consumption of fruit, vegetables and snacks/desserts: 1·2, 1·7 and 3·4 servings/d, respectively. Girls’ food environment: 10·4 % food desert only, 19·1 % food swamp only, 16·1 % both food desert/swamp and 54·4 % neither food desert/swamp. Average median neighbourhood-level household income: $US 35 298. In MLR models, girls living in both food deserts/swamps consumed additional servings of snacks/desserts v. girls living in neither (β=0·13, P=0·029; 3·8 v. 3·2 servings/d). Specifically, girls living in food swamps consumed more snacks/desserts than girls who did not (β=0·16, P=0·003; 3·7 v. 3·1 servings/d), with no confounding effect of neighbourhood-level SES. No associations were identified with food deserts or consumption of fruits/vegetables.
Conclusions
Early adolescent girls living in food swamps consumed more snacks/desserts than girls not living in food swamps. Dietary interventions should consider the built environment/food access when addressing adolescent dietary behaviours.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to address specific aspects of the performance of X-ray powder diffraction instruments. This report describes SRM 1878b, the third generation of this powder diffraction SRM. SRM 1878b is intended for use in the preparation of calibration standards for the quantitative analysis of α-quartz by X-ray powder diffraction in accordance with National Institute for Occupational Safety and Health Analytical Method 7500, or equivalent. A unit of SRM 1878b consists of approximately 5 g of α-quartz powder bottled in an argon atmosphere. It is certified with respect to crystalline phase purity, or amorphous phase content, and lattice parameter. Neutron powder diffraction, both time-of-flight and constant-wavelength, was used to certify the phase purity using SRM 676a as an internal standard. A NIST-built diffractometer, incorporating many advanced design features, was used for the lattice parameter certification measurements.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to address specific aspects of the performance of X-ray powder diffraction instruments. This report describes SRM 1976b, the third generation of this powder diffraction SRM. SRM 1976b consists of a sintered alumina disc, approximately 25.6 mm in diameter by 2.2 mm in thickness, intended for use in the calibration of X-ray powder diffraction equipment with respect to line position and intensity as a function of 2θ-angle. The sintered form of the SRM eliminates the effect of sample loading procedures on intensity measurements. Certified data include the lattice parameters and relative peak intensity values from 13 lines in the 2θ region between 20° and 145° using CuKα radiation. A NIST-built diffractometer, incorporating many advanced and unique design features, was used to make the certification measurements.
Lower extremity amputations are performed for tumors, trauma, peripheral vascular disease, infection, or congenital deformity. The goal of treatment is to return the patient to a functional level allowing pain-free ambulation, which is best achieved through a multidisciplinary approach involving the physician, physical therapist, and prosthetics team. Because of the psychological aspects of care, it is important to involve the patient in the decision-making process. This helps the patient to understand the intervention and (hopefully) to concur with the medical staff regarding the importance and necessity of the amputation, as well as postoperative expectations.
The vast majority of amputations are performed for vascular disease and infection resulting from diabetic neuropathy; the most common level is a below-knee amputation. The more proximal the amputation, the greater the metabolic cost of walking. Studies have shown that walking speed is decreased and oxygen consumption is increased with more proximal amputations.
Preoperative consideration of several important factors will directly affect the patient’s ability to recover successfully from the amputation. The goal of surgery is to leave enough viable tissue to heal and allow prosthetic fitting. A serum albumin level below 3.5 g/dL indicates a malnourished patient, and an absolute lymphocyte count below 1,500/mm³ is a sign of immune deficiency; these values should be corrected prior to any elective amputation. Some advocate optimizing serum glucose levels in patients with diabetes, but the benefit of this approach is not entirely clear. To maximize the health and nourishment of the patient, an internist and a nutritionist should be included in the treatment team.
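The laboratory thresholds above lend themselves to a simple pre-operative screen; the sketch below is illustrative only, not clinical software:

```python
# Sketch: the albumin and lymphocyte thresholds from the text as a simple
# pre-operative screen. Illustrative only; not clinical decision software.
def ready_for_elective_amputation(albumin_g_dl: float,
                                  lymphocytes_per_mm3: float) -> bool:
    malnourished = albumin_g_dl < 3.5             # serum albumin, g/dL
    immunodeficient = lymphocytes_per_mm3 < 1500  # absolute lymphocyte count
    return not (malnourished or immunodeficient)

print(ready_for_elective_amputation(3.8, 1800))  # True
print(ready_for_elective_amputation(3.2, 1800))  # False: correct first
```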
This report describes SRM 660b, the third generation of this powder diffraction SRM, used primarily for determination of the instrument profile function (IPF). It is certified with respect to the unit-cell parameter. It consists of approximately 6 g of LaB6 powder prepared using an 11B isotopically enriched precursor material so as to render the SRM applicable to the neutron diffraction community. The microstructure of the LaB6 powder was engineered to produce a crystallite size above that where size broadening is typically observed and to minimize the crystallographic defects that lead to strain broadening. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the unit-cell parameter of the LaB6 powder. Both Type A, statistical, and Type B, systematic, uncertainties have been assigned to yield a certified value for the unit-cell parameter of a = 0.415691(8) nm at 22.5 °C.
The National Institute of Standards and Technology (NIST) certifies a variety of Standard Reference Materials (SRMs) to address specific aspects of instrument performance for divergent-beam diffractometers. This paper describes SRM 640d, the fifth generation of this powder diffraction SRM, which is certified with respect to the lattice parameter. It consists of approximately 7.5 g of silicon powder specially prepared to produce strain-free particles in a size range between 1 μm and 10 μm to eliminate size-broadening effects. It is typically used for calibrating powder diffractometers with respect to line position and line shape. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the lattice parameter of the silicon powder measured at 22.5 °C. Both Type A, statistical, and Type B, systematic, uncertainties have been assigned to yield a certified value for the lattice parameter of a = 0.543 159 ± 0.000 020 nm.