The Dayao Paleolithic site, located in Inner Mongolia on the eastern margin of China's vast northwestern drylands, was a lithic quarry-workshop utilized by Pleistocene humans migrating through the region. Determining the age of this activity has previously yielded controversial results. Our magnetostratigraphic and optically stimulated luminescence (OSL) dating results suggest that the two artifact-bearing paleosols correlate with MIS 5 and 7, respectively. Correlating paleoclimatic data with marine δ18O records leads us to conclude that two sandy gravel layers containing many artifacts in the lower part of the Dayao sequence were formed during MIS 9 and 11, if not earlier. Our results reveal that the earliest human occupation at the Dayao site occurred before ca. 400 ka during a relatively warm and moist interglacial period, similar to several subsequent occupations, documenting the earliest and northernmost archaeological assemblage yet reported in China's arid northwest. We conclude that the northward and southward displacements of the East Asian summer monsoon rain belt during past interglacial-glacial cycles were responsible for the discontinuous human occupation detected at the Dayao site. The penetration of this precipitation regime into dryland ecologies via the Huanghe (Yellow River) Valley effectively created a corridor for hominin migration into China's arid northwest.
Late-life depression has substantial impacts on individuals, families and society. Knowledge gaps remain in estimating the economic impacts associated with late-life depression by symptom severity, which has implications for resource prioritisation and research design (such as in modelling). This study examined the incremental health and social care expenditure of depressive symptoms by severity.
We analysed data collected from 2707 older adults aged 60 years and over in Hong Kong. The Patient Health Questionnaire-9 (PHQ-9) and the Client Service Receipt Inventory were used, respectively, to measure depressive symptoms and service utilisation as a basis for calculating care expenditure. Two-part models were used to estimate the incremental expenditure associated with symptom severity over 1 year.
The average PHQ-9 score was 6.3 (standard deviation, s.d. = 4.0). The percentages of respondents with mild, moderate and moderately severe symptoms and non-depressed were 51.8%, 13.5%, 3.7% and 31.0%, respectively. Overall, the moderately severe group generated the largest average incremental expenditure (US$5886; 95% CI 1126–10 647, or a 272% increase), followed by the mild group (US$3849; 95% CI 2520–5177, or a 176% increase) and the moderate group (US$1843; 95% CI 854–2831, or an 85% increase). Non-psychiatric healthcare was the main cost component in the mild symptom group, after controlling for other chronic conditions and covariates. The average incremental association between PHQ-9 score and overall care expenditure peaked at a PHQ-9 score of 4 (US$691; 95% CI 444–939), gradually fell to negative values between scores of 12 (US$−35; 95% CI −530 to 460) and 19 (US$−171; 95% CI −417 to 76), and rebounded to a positive value at a score of 23 (US$601; 95% CI −1652 to 2854).
The association between depressive symptoms and care expenditure is stronger among older adults with mild and moderately severe symptoms. Older adults with the same symptom severity have different care utilisation and expenditure patterns. Non-psychiatric healthcare is the major cost element. These findings inform ways to optimise policy efforts to improve the financial sustainability of health and long-term care systems, including the involvement of primary care physicians and other geriatric healthcare providers in preventing and treating depression among older adults and related budgeting and accounting issues across services.
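The two-part expenditure model named in the methods can be sketched in miniature: the first part estimates the probability of any care use, the second the mean cost among users, and expected cost is their product. The data and groups below are hypothetical illustrations, not the study's Hong Kong sample.

```python
# Hedged sketch of a two-part cost model. In the study, each part is a
# regression with covariates; here sample proportions and means stand in.
def two_part_expected_cost(costs):
    """costs: list of annual expenditures (0 = no service use)."""
    users = [c for c in costs if c > 0]
    p_any = len(users) / len(costs)        # part 1: P(cost > 0)
    mean_pos = sum(users) / len(users)     # part 2: E[cost | cost > 0]
    return p_any * mean_pos                # E[cost] = p * conditional mean

group_a = [0, 0, 1200, 800, 0, 2000]   # hypothetical mild-symptom group
group_b = [0, 0, 0, 500, 0, 700]       # hypothetical non-depressed group
incremental = two_part_expected_cost(group_a) - two_part_expected_cost(group_b)
```

The incremental expenditure reported in the abstract is the adjusted analogue of this difference in expected costs between severity groups.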
Stratospheric airships are promising aircraft, usually designed as non-rigid airships. As an essential part of a non-rigid airship, the envelope plays a significant role in maintaining its shape and bearing external force loads. Generally, the envelope material of a flexible airship is a plain-weave fabric composed of warp and weft fibre yarns. At present, biaxial tensile experiments are the primary method used to study the stress–strain characteristics of such flexible airship materials. In this work, biaxial tensile testing of UN-5100 material was carried out. The strain in the material under different stresses and stress ratios was obtained using Digital Image Correlation (DIC) technology, and the stress–strain curves were corrected by polynomial fitting. The slope of the stress–strain curve at different points, the Membrane Structures Association of Japan (MSAJ) standard and the Radial Basis Function (RBF) model were compared to identify the stress–strain characteristics of the materials. The resulting conclusions on the mechanical properties of the flexible airship material will play a significant role in the design of such envelopes.
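The polynomial-correction step can be illustrated with a small least-squares fit: a noisy stress–strain record is smoothed by fitting a polynomial and the slope (tangent stiffness) is read off the fit. The quadratic form, coefficients and noise level below are illustrative assumptions, not UN-5100 data.

```python
# Hedged sketch: smoothing a DIC-derived stress-strain curve by polynomial
# fitting, then evaluating the initial slope from the fitted polynomial.
import numpy as np

strain = np.linspace(0.0, 0.02, 21)                    # measured strain
stress = 6.0e3 * strain - 5.0e4 * strain**2            # assumed response (MPa)
noisy = stress + np.random.default_rng(1).normal(0.0, 0.5, strain.size)
coeffs = np.polyfit(strain, noisy, deg=2)              # fit a quadratic
smooth = np.polyval(coeffs, strain)                    # corrected curve
slope0 = np.polyval(np.polyder(coeffs), 0.0)           # initial stiffness
```

The fitting order actually used in the study is not stated here; the point is only that the corrected curve and its slope come from the fitted polynomial rather than the raw record.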
We present the elements required to construct two devices used in an undergraduate plasma physics laboratory. The materials and construction costs of the sources, vacuum systems, probe drives and electrical circuits are presented in detail in the text and the first appendix. We also provide the software for probe motion and data acquisition, as well as the electrical schematics for key components. Experiments which have been performed are listed, and two (resonance cones and whistler waves) are described in greater detail. The machines are flexible, and original research is possible with them.
From 2011 through 2018, there was a notable increase in sporadic Legionnaires' disease in the state of Minnesota. Sporadic cases are those not associated with a documented outbreak. Outbreak-related cases are typically associated with a common identified contaminated water system, whereas sporadic cases typically have no identified common source. Because of this, it is hypothesised that weather and environmental factors can be used as predictors of sporadic Legionnaires' disease. An ecological design was used with case report surveillance data from the state of Minnesota during 2011 through 2018. Over this 8-year period, there were 374 confirmed Legionnaires' disease cases included in the analysis. Precipitation, temperature and relative humidity (RH) data were collected from weather stations across the state. A Poisson regression analysis examined the risk of Legionnaires' disease associated with precipitation, temperature, RH, land use and age. The lagged 14-day average precipitation had the strongest association with Legionnaires' disease (RR 2.5, CI 2.1–2.9) when accounting for temperature, RH, land use and age. Temperature, RH and land use also had statistically significant associations with Legionnaires' disease, but with smaller risk ratios. This study adds to the body of evidence that weather and environmental factors play an important role in the risk of sporadic Legionnaires' disease and can be used to target additional research and prevention strategies.
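The lagged 14-day average precipitation predictor can be sketched as follows; the daily series and the exact lag convention (preceding 14 days, exclusive of the index day) are illustrative assumptions, not the study's data handling.

```python
# Hedged sketch: building the lagged-average predictor for a Poisson model
# of daily case counts, log E[cases_d] = b0 + b1 * x14_d; exp(b1 * delta)
# is then the rate ratio for a `delta` increase in the predictor.
def lagged_mean(series, day, window=14):
    """Mean of the `window` days preceding `day` (exclusive)."""
    start = max(0, day - window)
    past = series[start:day]
    return sum(past) / len(past) if past else 0.0

precip = [0.0, 2.1, 0.0, 5.3, 1.2, 0.0, 0.4, 3.8, 0.0, 0.0,
          1.5, 0.0, 2.2, 0.6, 0.0, 4.1]          # hypothetical daily precipitation
x14 = [lagged_mean(precip, d) for d in range(len(precip))]
```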
Reducing dietary CP content is an effective approach to reducing animal nitrogen excretion and saving protein feed resources. However, it is not clear how reducing dietary CP content affects nutrient digestion and absorption in the gut of ruminants; it is therefore difficult to determine accurately how much of a reduction in dietary CP content is appropriate. This study was conducted to investigate the effects of reduced dietary CP content on N balance, intestinal nutrient digestion and absorption, and rumen microbiota in growing goats. To determine N balance, 18 growing wether goats (25.0 ± 0.5 kg) were randomly assigned to one of three diets: 13.0% (control), 11.5% and 10.0% CP. Another 18 growing wether goats (25.0 ± 0.5 kg) were surgically fitted with ruminal, proximal duodenal, and terminal ileal fistulae and were randomly assigned to one of the three diets to investigate intestinal amino acid (AA) absorption and rumen microbiota. The results showed that fecal and urinary N excretion of goats fed diets containing 11.5% and 10.0% CP were lower than those of goats fed the control diet (P < 0.05). When compared with goats fed the control diet, N retention was decreased and apparent N digestibility in the entire gastrointestinal tract was increased in goats fed the 10.0% CP diet (P < 0.05). When compared with goats fed the control diet, the duodenal flow of lysine, tryptophan and phenylalanine was decreased in goats fed the 11.5% CP diet (P < 0.05) and that of lysine, methionine, tryptophan, phenylalanine, leucine, glutamic acid, tyrosine, essential AAs (EAAs) and total AAs (TAAs) was decreased in goats fed the 10.0% CP diet (P < 0.05). When compared with goats fed the control diet, the apparent absorption of TAAs in the small intestine was increased in goats fed the 11.5% CP diet (P < 0.05) and that of isoleucine, serine, cysteine, EAAs, non-essential AAs, and TAAs in the small intestine was increased in goats fed the 10.0% CP diet (P < 0.05).
When compared with goats fed the control diet, the relative abundance of Bacteroidetes and Fibrobacteres was increased and that of Proteobacteria and Synergistetes was decreased in the rumen of goats fed the 10.0% CP diet. In conclusion, reducing dietary CP content reduced N excretion and increased nutrient utilization by improving rumen fermentation, enhancing nutrient digestion and absorption, and altering the rumen microbiota in growing goats.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data, on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status) were available. Five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond with previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
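The 10-fold cross-validation scheme used with C5.0 can be sketched as follows; the classifier itself is abstracted away, and only the fold construction is shown: each fold serves once as a held-out test set while the remaining nine train the tree.

```python
# Hedged sketch of k-fold cross-validation index splitting. A real run would
# fit the tree on each train set and score it on the matching test set.
def k_fold_indices(n, k=10):
    """Yield (train, test) index lists for k-fold cross-validation."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

n = 31551   # participants in the pooled study data
splits = list(k_fold_indices(n))
```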
The risk factors for criminal behavior in patients with schizophrenia are not well explored. This study aimed to identify the risk factors for criminal behavior among patients with schizophrenia in rural China.
We used data from a 14-year prospective follow-up study (1994-2008) of criminal behavior among a cohort (n=510) of patients with schizophrenia in Xinjin County, Chengdu, China.
There were 489 patients (95.9%) who were followed up from 1994 to 2008. The rate of criminal behavior was 13.5% among these patients with schizophrenia during the follow-up period. Compared with female patients (6 cases, 20.0%), male patients had a significantly higher rate of violent criminal behavior (e.g., arson, sexual assault, physical assault, and murder) (24 cases, 80.0%) (p < 0.001). Bivariate analyses showed that the risk of criminal behavior was significantly associated with being unmarried, younger age, previous violent behavior, homelessness, lower family economic status, having no family caregivers, and higher scores on measures (PANSS) of positive, negative, and total symptoms of illness. In multiple logistic regression analyses, being unmarried and previous violent behavior were identified as independent predictors of criminal behavior in persons with schizophrenia.
The risk factors for criminal behavior among patients with schizophrenia should be understood within a particular social context. Criminal behavior may be predicted by specific characteristics of patients with schizophrenia in rural communities. These risk factors should be considered in planning community mental health care and interventions for high-risk patients and their families.
Many schizophrenia patients experience residual symptoms even after treatment. Electroconvulsive therapy (ECT) is often used in medication-resistant schizophrenia patients; however, its mechanism of action is unclear. Brain-derived neurotrophic factor (BDNF) levels are reduced in drug-naive, first-episode schizophrenia and are increased by antipsychotic treatment. We tested the hypothesis that ECT increases serum BDNF levels by measuring BDNF concentrations in schizophrenia patients before and after they received ECT.
A total of 160 patients with schizophrenia were examined. The ECT group (n = 80) was treated with antipsychotics and ECT (eight to 10 sessions administered every other day). The drug therapy group (n = 80) received only antipsychotic treatment. A healthy control group (n = 77) was also recruited to serve as a baseline for comparison.
The baseline serum BDNF level in the ECT group was lower than in controls (9.7 ± 2.1 vs. 12.4 ± 3.2 ng/ml; P < 0.001) but increased after ECT, such that there was no longer a difference between the two groups (11.9 ± 3.3 vs. 12.4 ± 3.2 ng/ml; P = 0.362). There was no correlation between patients’ Positive and Negative Syndrome Scale (PANSS) score and serum BDNF level before ECT; however, a negative correlation was observed after ECT (total: r = −0.692; P < 0.01). From baseline to remission after ECT, serum BDNF level increased (P < 0.001) and PANSS score decreased (P < 0.001). Changes in BDNF level (2.21 ± 4.10 ng/ml) and PANSS score (28.69 ± 14.96) were positively correlated in the ECT group (r = 0.630; P < 0.01).
BDNF level was lower in schizophrenia patients relative to healthy controls before ECT and medication. BDNF level increased after ECT and medication, and its longitudinal change was associated with changes in patients’ psychotic symptoms. These results suggest that BDNF may mediate the antipsychotic effects of ECT.
An improved variational optimization approach is established to optimize and analyse the propulsion efficiency of high-altitude contra-rotating propellers (CRPs) for high-altitude airships based on the Vortex Lattice Lifting Line Method. The optimum radial circulation, chord and pitch distributions are optimized under the maximum lift-to-drag ratio of the aerofoils. To consider the effects of the actual Reynolds number and Mach number of each aerofoil section, aerodynamic coefficients such as the lift coefficient, drag coefficient and lift-to-drag ratio are obtained by interpolating a CFD database, which is established by numerical simulations over different Reynolds numbers, Mach numbers and angles-of-attack. The improved method is verified by validation cases on a high-altitude CRP using a three-dimensional steady Reynolds-averaged Navier-Stokes solver and the moving reference frames technique. The optimization results of thrust, torque and efficiency for both the individual front/rear propellers and the CRP agree reasonably well with the CFD results. Using the improved approach, the influence of blade number, diameter, rotation speed, axial distance and torque ratio on the optimum efficiency of CRPs is illustrated in detail through parametric studies.
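The database-lookup step can be sketched as a table interpolation. The real database interpolates over Reynolds number, Mach number and angle-of-attack; this minimal 1-D example interpolates the lift coefficient over angle-of-attack at fixed Re and Mach, with invented table values.

```python
# Hedged sketch: linear interpolation of sectional lift coefficient from a
# precomputed (angle-of-attack, Cl) table, standing in for the CFD database.
def interp_cl(alpha_deg, table):
    """Linear interpolation in a sorted (alpha, Cl) table."""
    for (a0, c0), (a1, c1) in zip(table, table[1:]):
        if a0 <= alpha_deg <= a1:
            t = (alpha_deg - a0) / (a1 - a0)
            return c0 + t * (c1 - c0)
    raise ValueError("angle outside tabulated range")

cl_table = [(-4.0, -0.2), (0.0, 0.3), (4.0, 0.75), (8.0, 1.1)]  # illustrative
cl = interp_cl(2.0, cl_table)   # Cl at 2 degrees angle-of-attack
```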
Plant nitrogen (N) is linked with many physiological processes of crop growth and yield formation. Accurate simulation of plant N is key to predicting crop growth and yield correctly. The aim of the current study was to improve the estimation of N uptake and translocation processes in the whole rice plant as well as within plant organs in the RiceGrow model by using plant and organ maximum, critical and minimum N dilution curves. The maximum and critical N (Nc) demand (obtained from the maximum and critical curves) of shoot and root and the Nc demand of organs (leaf, stem and panicle) are calculated from N concentration and biomass. Nitrogen distribution among organs is computed differently pre- and post-anthesis. Pre-anthesis distribution is determined by maximum N demand with no priority among organs. In post-anthesis distribution, panicle demand is met first and the remaining N is then allocated to the other organs without priority. The amount of plant N uptake depends on plant N demand and the N supplied by the soil. Calibration and validation of the established model were performed on field experiments conducted in China and the Philippines with varied N rates and N split applications; the results showed that this improved model can simulate the processes of N uptake and translocation well.
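The demand calculation from a dilution curve can be sketched with the standard form Nc = a·W^(−b), where W is shoot biomass and Nc the critical N concentration; demand is concentration times biomass. The coefficients and plateau below are illustrative assumptions, not the calibrated RiceGrow values.

```python
# Hedged sketch: critical N demand (kg N/ha) from a dilution curve.
def critical_n_demand(biomass_t_ha, a=3.5, b=0.28):
    """Critical N concentration (% of dry weight) times biomass -> kg N/ha."""
    if biomass_t_ha <= 1.0:
        nc = a                          # curves typically plateau below ~1 t/ha
    else:
        nc = a * biomass_t_ha ** (-b)   # dilution: Nc falls as biomass grows
    return nc / 100 * biomass_t_ha * 1000

demand = critical_n_demand(5.0)   # critical N demand at 5 t/ha shoot biomass
```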
This study aimed to derive and validate a simple and well-performing risk calculator (RC) for predicting psychosis in individual patients at clinical high risk (CHR).
From the ongoing ShangHai-At-Risk-for-Psychosis (SHARP) program, 417 CHR cases were identified based on the Structured Interview for Prodromal Symptoms (SIPS), of whom 349 had at least a 1-year follow-up assessment. Of these 349 cases, 83 converted to psychosis. Logistic regression was used to build a multivariate model to predict conversion. The area under the receiver operating characteristic (ROC) curve (AUC) was used to test the effectiveness of the SIPS-RC. An independent sample of 100 CHR subjects was then recruited using identical baseline and follow-up procedures to validate the performance of the SIPS-RC.
Four predictors (each based on a subset of SIPS-based items) were used to construct the SIPS-RC: (1) functional decline; (2) positive symptoms (unusual thoughts, suspiciousness); (3) negative symptoms (social anhedonia, expression of emotion, ideational richness); and (4) general symptoms (dysphoric mood). The SIPS-RC showed moderate discrimination of subsequent transition to psychosis with an AUC of 0.744 (p < 0.001). A risk estimate of 25% or higher had around 75% accuracy for predicting psychosis. The personalized risk generated by the SIPS-RC provided a solid estimate of conversion outcomes in the independent validation sample, with an AUC of 0.804 [95% confidence interval (CI) 0.662–0.951].
The SIPS-RC, which is simple and easy to use, performs comparably to the NAPLS-2 RC in the Chinese clinical population. Such a tool may be used by clinicians to appropriately counsel their patients about clinical monitoring v. potential treatment options.
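The core of such a risk calculator is a logistic model plus an AUC check of discrimination, which can be sketched as follows; the coefficients and feature values are hypothetical, not the published SIPS-RC weights.

```python
# Hedged sketch: a logistic risk score and a rank-based AUC computation.
import math

def risk(features, coefs, intercept):
    """Predicted conversion probability from a logistic model."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

def auc(scores, labels):
    """Probability a random converter outscores a random non-converter."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical patient: four predictor scores, hypothetical weights
p = risk([2, 4, 3, 1], coefs=[0.3, 0.2, 0.15, 0.1], intercept=-3.0)
```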
The response of soil microbial communities to soil quality changes is a sensitive indicator of soil ecosystem health. The current work investigated soil microbial communities under different fertilization treatments in a 31-year experiment using the phospholipid fatty acid (PLFA) profile method. The experiment consisted of five fertilization treatments: no fertilizer input (CK), chemical fertilizer alone (MF), rice (Oryza sativa L.) straw residue plus chemical fertilizer (RF), a low manure rate plus chemical fertilizer (LOM), and a high manure rate plus chemical fertilizer (HOM). Soil samples were collected from the plough layer, and the results indicated that the content of PLFAs was increased in all fertilization treatments compared with the control. The iC15:0 fatty acids increased significantly in the MF treatment but decreased in RF, LOM and HOM, while aC15:0 fatty acids increased in these three treatments. Principal component (PC) analysis was conducted to determine the factors defining soil microbial community structure using the 21 PLFAs detected in all treatments: the first and second PCs explained 89.8% of the total variance. All unsaturated and cyclopropyl PLFAs except C12:0 and C15:0 were highly weighted on the first PC. The first and second PCs also explained 87.1% of the total variance among all fertilization treatments. There was no difference in the first and second PCs between the RF and HOM treatments. The results indicated that long-term combined application of straw residue or organic manure with chemical fertilizer improved soil microbial community structure more than the mineral fertilizer treatment in double-cropped paddy fields in Southern China.
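The variance-explained figures from the PC analysis can be reproduced in form (not in value) with a small sketch: the PLFA matrix (rows = samples, columns = individual PLFAs) is decomposed and the share of variance on the first components is reported. The data matrix below is random illustration, not the study's PLFA profiles.

```python
# Hedged sketch: fraction of total variance captured by the first k
# principal components, via the eigenvalues of the covariance matrix.
import numpy as np

def variance_explained(X, k=2):
    """Fraction of total variance on the first k PCs of X."""
    Xc = X - X.mean(axis=0)                       # centre each PLFA column
    cov = np.cov(Xc, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # descending
    return eigvals[:k].sum() / eigvals.sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(15, 21))        # 15 samples x 21 detected PLFAs
share = variance_explained(X, k=2)   # analogue of the reported 89.8%
```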
Previous studies have demonstrated that type 1 diabetes mellitus (T1DM) can be triggered by an early childhood infection. Whether maternal infection during pregnancy is associated with T1DM in offspring is unknown; we therefore aimed to study the association using a systematic review and meta-analysis. Eighteen studies including 4304 cases and 25 846 participants were included in this meta-analysis. Odds ratios (ORs) and 95% confidence intervals (CIs) were synthesised using random-effects models. Subgroup analyses and sensitivity analyses were conducted to assess the robustness of the associations. Overall, the pooled analysis yielded a statistically significant association between maternal infection during pregnancy and childhood T1DM (OR 1.31, 95% CI 1.07–1.62). Furthermore, six studies that tested maternal enterovirus infection showed a pooled OR of 1.54 (95% CI 1.05–2.27). Heterogeneity between studies was evident (I2 = 70.1%, P < 0.001) and was mainly attributable to differences in study design, ascertainment methods and sample size. This study provides evidence for an association between maternal infection during pregnancy and childhood T1DM.
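The random-effects pooling can be sketched with the standard DerSimonian–Laird estimator on the log-OR scale; the study ORs and standard errors below are invented for illustration, not the meta-analysis data.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of log-ORs.
import math

def dersimonian_laird(log_ors, ses):
    """Return pooled OR and between-study variance tau^2."""
    w = [1 / se**2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_ors))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(w) - 1)) / c)             # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]           # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    return math.exp(pooled), tau2

ors = [1.2, 1.5, 0.9, 1.8, 1.4]          # hypothetical study ORs
ses = [0.20, 0.25, 0.30, 0.22, 0.18]     # hypothetical SEs of log-OR
pooled_or, tau2 = dersimonian_laird([math.log(o) for o in ors], ses)
```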
The gastrointestinal tract (GIT) of animals is capable of sensing various kinds of nutrients via G-protein coupled receptor-mediated signaling transduction pathways, a process known as ‘gut nutrient chemosensing’. GPR40, GPR41, GPR43 and GPR119 are chemoreceptors for free fatty acids (FFAs) and lipid derivatives, but they are not well studied in small ruminants. The objective of this study was to determine the expression of GPR40, GPR41, GPR43 and GPR119 along the GIT of kid goats under supplemental feeding (S) v. grazing (G) during early development. In total, 44 kid goats (initial weight 1.35±0.12 kg) were slaughtered for sampling (rumen, abomasum, duodenum, jejunum, ileum, cecum, colon and rectum) between days 0 and 70. The expression of GPR41 and GPR43 was measured at both the mRNA and protein levels, whereas GPR40 and GPR119 were assayed at the protein level only. The effects of age and feeding system on their expression varied depending upon GIT segment, chemoreceptor and expression level (mRNA or protein), and feeding system × age interactions (P<0.05) were sometimes observed. Supplemental feeding enhanced expression of GPR40, GPR41 and GPR43 in most segments of the GIT of goats, whereas G enhanced expression of GPR119. GPR41 and GPR43 were mainly expressed in the rumen, abomasum and cecum, with different responses to age and feeding system. GPR41 and GPR43 expression in the abomasum at the mRNA level was greatly (P<0.01) affected by both age and feeding system, whereas their expression in the rumen and abomasum at the protein level differed: feeding system greatly (P<0.05) affected GPR41 expression but had no effect (P>0.05) on GPR43 expression, and there were no feeding system×age interactions (P>0.05) on GPR41 and GPR43 protein expression. The expression of GPR41 and GPR43 in the rumen and abomasum increased linearly (P<0.01) with increasing age (from days 0 to 70). Meanwhile, age was the main factor affecting GPR40 expression throughout the GIT.
These outcomes indicate that age and feeding system are two factors affecting the expression of chemoreceptors for FFAs and lipid derivatives in the GIT of kid goats: S enhanced the expression of chemoreceptors for FFAs, whereas G gave rise to greater expression of chemoreceptors for lipid derivatives. Our results suggest that enhanced expression of chemoreceptors for FFAs might be one of the benefits of early supplemental feeding offered to young ruminants during early development.
Knowledge about infection transmission routes is important for developing effective intervention strategies. We searched the PubMed database and identified 10 studies with 14 possible inflight influenza A(H1N1)pdm09 outbreaks. Considering the different mechanisms of the large-droplet and airborne routes, a meta-analysis of the outbreak data was carried out to study the difference in attack rates for passengers seated within and beyond two rows of the index case(s). We also explored the relationship between the attack rates and the flight duration and/or total infectivity of the index case(s). The risk ratios for passengers seated within versus beyond two rows of the index cases were 1.7 (95% confidence interval (CI) 0.98–2.84) for syndromic secondary cases and 4.3 (95% CI 1.25–14.54) for laboratory-confirmed secondary cases. Furthermore, the overall attack rate increased linearly with the product of the flight duration and the total infectivity of the index cases. The study indicates that influenza A(H1N1)pdm09 may be transmitted mainly via the airborne route during air travel. A standardised approach to reporting such inflight outbreak investigations would help to provide more convincing evidence for such inflight transmission events.
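The risk-ratio calculation behind figures such as RR 4.3 (95% CI 1.25–14.54) can be sketched as follows; the counts are invented, not the pooled outbreak data.

```python
# Hedged sketch: risk ratio with a log-scale Wald 95% confidence interval,
# comparing attack rates within vs beyond two rows of the index case(s).
import math

def risk_ratio(a, n1, b, n2, z=1.96):
    """RR of exposed group (a cases / n1) vs comparison group (b / n2)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)   # SE of log RR
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio(8, 40, 5, 100)   # hypothetical within/beyond counts
```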
Influenza is a long-standing public health concern, but its transmission remains poorly understood. To better understand influenza transmission, we carried out a detailed modelling investigation of a nosocomial influenza outbreak in Hong Kong. We identified three hypothesised transmission modes between the index patient and other inpatients based on the long-range airborne and fomite routes. We considered three kinds of healthcare workers’ routine round pathways in 1140 scenarios with various values of important parameters. In each scenario, we used a multi-agent modelling framework to estimate the infection risk for each hypothesis and conducted least-squares fitting to evaluate the hypotheses by comparing the distribution of the infection risk with that of the attack rates. Amongst the hypotheses tested in the 1140 scenarios, predictions of the modes involving the long-range airborne route fit the attack rates better, and the two-route transmission mode had the best fit, with the long-range airborne route contributing about 94% and the fomite route contributing 6% of the infections. Under the assumed conditions, the influenza virus was likely to have spread via combined long-range airborne and fomite routes, with the former predominant and the latter minor.
Introduction: Accurate identification of children with a concussion by emergency department (ED) physicians is important to initiate appropriate anticipatory guidance and management. In children meeting international criteria for concussion, we aimed to determine the proportion who were provided this diagnosis by the ED physician and which variables were associated with a physician-diagnosed concussion. We also compared persistent symptoms in concussion cases versus those with alternative diagnoses. Methods: This was a planned secondary analysis of a prospective, multicenter cohort study. Participants were children aged 5 through 17 years who met Zurich/Berlin International Consensus Statement criteria for concussion. The primary outcome was the proportion of study participants who were assigned a diagnosis of concussion by the treating ED physician. Based on available evidence, between 50% and 90% of children meeting international concussion criteria are also diagnosed by an ED physician as having a concussion. Assuming a worst-case scenario in which 50% of physicians would diagnose concussion, our anticipated study sample size of 2946 would be accompanied by a ±2% margin of error at the 95% confidence level for the primary outcome. Results: Among the 2946 eligible children, 2340 [79.4% (95% CI 78.0, 80.8)] were diagnosed with a concussion by an ED physician. Twelve variables were associated with this ED diagnosis, five of which had an odds ratio (OR) > 1.5: older age (13-17 vs 5-7 years, OR=2.9), longer time to presentation (>16 vs. <16 hours, OR=2.1), nausea (OR=1.7), sport mechanism (OR=1.7), and amnesia (OR=1.6). In those with physician-diagnosed concussion versus no concussion, the frequency of persistent symptoms was 62.5% vs. 38.8% (p<0.0001) at one week, 46.3% vs. 25.8% (p<0.0001) at two weeks and 33.0% vs. 23.0% (p<0.0001) at four weeks. Conclusion: Most children meeting international criteria for concussion were provided this diagnosis by the ED physician.
Five variables increased the odds of this diagnosis by at least 1.5-fold. Relative to the international criteria, the more selective assignment of concussion by ED physicians was associated with a greater frequency of persistent concussion symptoms. Nevertheless, many patients with alternative diagnoses exhibited persistent concussive symptoms at all time points. Clinicians should therefore weigh the benefits and risks of strictly applying the Zurich/Berlin international criteria versus using individual discretion.
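The stated sampling precision can be checked with the usual normal-approximation margin of error for a proportion: with n = 2946 and a worst-case proportion of 50%, the 95% half-width is about 1.8 percentage points, consistent with the reported ±2%.

```python
# Arithmetic check of the sample-size precision claim (not from the paper).
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the normal-approximation 95% CI for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(2946)   # about 0.018, i.e. ~1.8 percentage points
```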
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined the characteristics of HIV and viral hepatitis coinfections using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were also reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were also reported with HIV. A greater proportion of persons coinfected with HIV and HBV were male and black/African American, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights the epidemiological characteristics of coinfected persons and can be used to routinely monitor health status and guide state and national public health interventions.
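The automated deterministic matching can be sketched as an exact-key join between two registries: records link when a normalized key agrees exactly. The field names and records below are hypothetical, not the jurisdictions' actual matching variables.

```python
# Hedged sketch of deterministic record linkage between surveillance
# registries. Real systems typically use several key combinations and
# manual review; this shows only the exact-match core.
def match_key(record):
    """Normalized exact-match key (hypothetical fields)."""
    return (record["last"].upper(), record["first"].upper(),
            record["dob"], record["sex"])

def deterministic_match(hiv_cases, hep_cases):
    """Return HIV records whose key also appears in the hepatitis registry."""
    hep_keys = {match_key(r) for r in hep_cases}
    return [r for r in hiv_cases if match_key(r) in hep_keys]

hiv = [{"last": "Doe", "first": "Jo", "dob": "1980-01-02", "sex": "M"}]
hep = [{"last": "DOE", "first": "jo", "dob": "1980-01-02", "sex": "M"},
       {"last": "Roe", "first": "Al", "dob": "1975-05-06", "sex": "F"}]
linked = deterministic_match(hiv, hep)   # case normalization makes these match
```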