Historical sources reveal that Copenhagen was founded in the late 12th century AD by Bishop Absalon. However, during excavation for the new metro in central Copenhagen, a previously unknown early medieval cemetery was discovered and excavated at the Town Hall Square. Radiocarbon (14C) analysis was conducted on the 9 individuals found in situ, together with 11 individuals from the other early medieval cemetery in Copenhagen, belonging to the St Clemens church. The radiocarbon analysis places the onset of the cemeteries in the early 11th century AD and therefore calls into question the accepted age of Copenhagen, and hence the archaeological and historical perception of the Danish historical record. Here a detailed account of the radiocarbon-based Bayesian model is presented.
Certain hypotheses cannot be directly confirmed for theoretical, practical, or moral reasons. For some of these hypotheses, however, there might be a workaround: confirmation based on analogical reasoning. In this paper we take up Dardashti, Hartmann, Thébault, and Winsberg’s (2019) idea of analyzing confirmation based on analogical inference Bayesian style. We identify three types of confirmation by analogy and show that Dardashti et al.’s approach can cover two of them. We then highlight possible problems with their model as a general approach to analogical inference and argue that these problems can be avoided by supplementing Bayesian update with Jeffrey conditionalization.
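The update rule at issue can be stated compactly. Below is a minimal sketch of Jeffrey conditionalization on a binary partition {E, not-E}; the function name and numbers are illustrative, not drawn from the paper.

```python
def jeffrey_update(p_h_given_e, p_h_given_not_e, new_p_e):
    """Jeffrey conditionalization on the partition {E, not-E}:
    P_new(H) = P(H|E) * P_new(E) + P(H|not-E) * (1 - P_new(E))."""
    return p_h_given_e * new_p_e + p_h_given_not_e * (1 - new_p_e)

# Experience shifts confidence in E to 0.8 without making E certain.
p_h = jeffrey_update(p_h_given_e=0.9, p_h_given_not_e=0.2, new_p_e=0.8)
# 0.9*0.8 + 0.2*0.2 = 0.76

# Setting P_new(E) = 1 recovers strict Bayesian conditionalization.
p_h_strict = jeffrey_update(0.9, 0.2, 1.0)  # equals P(H|E) = 0.9
```

Because the rule reduces to strict conditionalization when the evidence becomes certain, Jeffrey's rule is the natural generalization for the uncertain-evidence cases the paper considers.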
Navigational accidents (collisions and groundings) account for approximately 85% of maritime accidents, and consequence estimation for such accidents is essential both for emergency resource allocation when such accidents occur and for risk management in the framework of a formal safety assessment. As the traditional Bayesian network requires expert judgement to develop the graphical structure, this paper proposes a mutual information-based Bayesian network method to reduce the requirement for expert judgement. The central premise of the proposed Bayesian network method is to calculate mutual information in order to quantify the dependence among multiple influencing factors. Seven hundred and ninety-seven historical navigational accident records from 2006 to 2013 were used to validate the methodology. It is anticipated that the model will provide a practical and reasonable method for consequence estimation of navigational accidents.
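As a hedged illustration of the quantitative core of such a method, the pairwise mutual information between two discrete influencing factors can be estimated directly from paired samples. The function and data below are our own illustrative sketch, not the paper's implementation.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = sum_{x,y} p(x,y) log2[p(x,y) / (p(x)p(y))]
    from paired samples of two discrete variables."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint counts
    px = Counter(xs)            # marginal counts for X
    py = Counter(ys)            # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p(x) * p(y)) rewritten with counts to avoid extra divisions
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Perfectly dependent binary variables carry 1 bit of mutual information;
# independent ones carry 0 bits.
mi_dep = mutual_information([0, 1, 0, 1], [0, 1, 0, 1])    # 1.0
mi_ind = mutual_information([0, 0, 1, 1], [0, 1, 0, 1])    # 0.0
```

In a structure-learning setting, such pairwise scores can rank candidate arcs between influencing factors, which is how mutual information can substitute for some expert judgement about the graph.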
An en primeur agreement is an unconventional forward contract. In this article, we provide a new conceptual framework for analyzing the properties of en primeur prices based on the cost of carry approach. The results, based upon Bayesian modeling, indicate that the cost of carry increases up to 0.9598 when en primeur and bottled wines are traded in parallel. Moreover, our findings confirm that price dispersion around the mean value is greater for en primeur wines (22.42%) than for standard bottled wines (8.2%) traded after the sale of en primeur wines has ended. (JEL Classifications: G12, G15, L66, Q02)
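For readers unfamiliar with the cost-of-carry relation underlying the analysis, here is a minimal sketch assuming the textbook form F = S * exp(c * tau); the function names and numbers are illustrative, not the paper's model or data.

```python
from math import exp, log

def forward_price(spot, carry, tau):
    """Cost-of-carry relation: F = S * exp(c * tau)."""
    return spot * exp(carry * tau)

def implied_cost_of_carry(spot, forward, tau):
    """Invert the relation: c = ln(F / S) / tau."""
    return log(forward / spot) / tau

# Round trip: a 5% annualized carry over half a year is recovered
# from the spot and forward prices.
f = forward_price(100.0, 0.05, 0.5)
c = implied_cost_of_carry(100.0, f, 0.5)  # approximately 0.05
```

An en primeur contract is "unconventional" partly because the underlying (bottled wine) does not yet exist at the contract date, so the implied carry can behave very differently from storage-cost-driven carries in standard commodity forwards.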
Lange (2000) famously argues that although Jeffrey Conditionalization is non-commutative over evidence, it's not defective in virtue of this feature. Since reversing the order of the evidence in a sequence of updates that don't commute does not reverse the order of the experiences that underwrite these revisions, the conditions required to generate commutativity failure at the level of experience will fail to hold in cases where we get commutativity failure at the level of evidence. If our interest in commutativity is, fundamentally, an interest in the order-invariance of information, an updating sequence that does not violate such a principle at the more fundamental level of experiential information should not be deemed defective. This paper claims that Lange's argument fails as a general defense of the Jeffrey framework. Lange's argument entails that the inputs to the Jeffrey framework differ from those of classical Bayesian Conditionalization in a way that makes them defective. Therefore, either the Jeffrey framework is defective in virtue of not commuting its inputs, or else it is defective in virtue of commuting the wrong kinds of inputs.
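The order sensitivity at issue can be seen in a small numerical sketch (the toy joint distribution below is ours, not the paper's): two Jeffrey shifts on the same partition do not commute, because the later shift overwrites the probability of E while conditional probabilities within the cells stay fixed.

```python
def jeffrey_shift(joint, new_p_e):
    """joint maps (h, e) to a probability; rescale so that P(E) = new_p_e
    while keeping the conditional probabilities within E and not-E fixed."""
    p_e = joint[(True, True)] + joint[(False, True)]
    return {
        (h, e): p * (new_p_e / p_e if e else (1 - new_p_e) / (1 - p_e))
        for (h, e), p in joint.items()
    }

def p_h(joint):
    """Marginal probability of the hypothesis H."""
    return joint[(True, True)] + joint[(True, False)]

# Toy prior: P(E) = 0.5, P(H|E) = 0.8, P(H|not-E) = 0.2.
prior = {(True, True): 0.4, (False, True): 0.1,
         (True, False): 0.1, (False, False): 0.4}

a = jeffrey_shift(jeffrey_shift(prior, 0.7), 0.4)  # P(H) is about 0.44
b = jeffrey_shift(jeffrey_shift(prior, 0.4), 0.7)  # P(H) is about 0.62
```

The final belief in H depends only on the last shift applied to the partition, which is precisely the kind of order dependence Lange's defense must explain away.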
The introduction reviews, summarizes, and illustrates fundamental connections among Bayesian inference, numerical quadrature, Gaussian process regression, polyharmonic splines, information-based complexity, optimal recovery, and game theory that form the basis for the book. It then describes a sample of the results derived from these interplays, including those in numerical homogenization, operator-adapted wavelets, fast solvers, and Gaussian process regression. It finishes with an outline of the structure of the book.
Experiments should be designed to facilitate the detection of experimental measurement error. To this end, we advocate the implementation of identical experimental protocols employing diverse experimental modes. We suggest iterative nonparametric estimation techniques for assessing the magnitude of heterogeneous treatment effects across these modes. And we propose two diagnostic strategies—measurement metrics embedded in experiments, and measurement experiments—that help assess whether any observed heterogeneity reflects experimental measurement error. To illustrate our argument, first we conduct and analyze results from four identical interactive experiments: in the lab; online with subjects from the CESS lab subject pool; online with an online subject pool; and online with MTurk workers. Second, we implement a measurement experiment in India with CESS Online subjects and MTurk workers.
Fremont societies of the Uinta Basin incorporated domesticates into a foraging lifeway over a 1,000-year period from AD 300 to 1300. Fremont research provides a unique opportunity to critically examine the social and ecological processes behind the adoption and abandonment of domesticates by hunter-gatherers. We develop and integrate a 2,115-year precipitation reconstruction with a Bayesian chronological model for the growth of Fremont societies in the Cub Creek reach of Dinosaur National Monument. Comparison of the archaeological chronology with the precipitation record suggests that the florescence of Fremont societies was an adaptation to multidecadal precipitation variability with an approximately 30-plus-year periodicity over most, but not all, of the last 2,115 years. Fremont societies adopted domesticates to enhance their resilience to periodic droughts. We propose that reduced precipitation variability from AD 750 to AD 1050, superimposed over consistent mean precipitation availability, was the tipping point that increased maize production, initiated agricultural intensification, and resulted in increased population and development of pithouse communities. Our study develops a multidecadal/multigenerational model within which to evaluate the strategies underwriting the adoption of domesticates by foragers, the formation of Fremont communities, and the inherent vulnerabilities to resource intensification that implicate the eventual dissolution of those communities.
Biodiversity conservation in forest fragments surrounded by a low-quality matrix requires an understanding of how ecological conditions prevailing in the matrix enter the fragments and interact with local habitat conditions. We assessed the regeneration of oak species along edge–interior gradients in forest fragments at the periphery of Mexico City. The abundance of oak saplings was sampled along transects into the forest, while the edge effect was analysed using segmented zero-inflated Poisson models for abundance data. Three oak species were dominant in terms of their relative abundances: Quercus laeta, Quercus castanea and Quercus obtusata. Regeneration of nine oak species responded nonlinearly to the edge distance, with greater sapling abundance from the edge up to 10 m into the fragment. Canopy cover and tree height decreased from edge to fragment interior, while saplings increased in open areas within the fragments (i.e., independent of edge distance). A posterior analysis indicated that Q. obtusata reacted positively to edges. These results indicate that oak regeneration is promoted by suitable habitat conditions near the boundaries. Therefore, we suggest that forest management should focus on promoting seed production and oak establishment in forest interior habitats.
Does the importance of the economy change during a government's time in office? Governments arguably become more responsible for current economic conditions as their tenure progresses. This might lead voters to hold experienced governments more accountable for economic conditions. However, voters also accumulate information about governments' competence over time. If voters are Bayesian learners, then this growing stock of information should crowd out the importance of current economic conditions. This article explores these divergent predictions about the relationship between tenure and the economic vote using three datasets. First, using country-level data from a diverse set of elections, the study finds that support for more experienced governments is less dependent on economic growth. Secondly, using individual-level data from sixty election surveys covering ten countries, the article shows that voters' perceptions of the economy have a greater impact on government support when the government is inexperienced. Finally, the article examines a municipal reform in Denmark that assigned some voters to new local incumbents and finds that these voters responded more strongly to the local economy. In conclusion, all three studies point in the same direction: economic voting decreases with time in office.
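The crowding-out intuition in this abstract has a simple closed form under normal-normal updating. The stylized sketch below uses our own parameter names and is not the paper's estimation model: with each additional past signal of equal precision, the weight the posterior mean places on the current signal shrinks.

```python
def weight_on_current_signal(n_past, prior_precision=1.0, signal_precision=1.0):
    """Posterior-mean weight on the newest of n_past + 1 equally precise
    signals about a fixed competence parameter (normal-normal updating).
    The weight is the signal's precision divided by total precision."""
    return signal_precision / (prior_precision + (n_past + 1) * signal_precision)

# A new government: the current year's economy gets weight 1/2.
# After four more years of signals, the weight falls to 1/6.
weights = [weight_on_current_signal(n) for n in range(5)]
```

This monotone decline is the Bayesian-learning prediction the article tests: current economic conditions should matter less for experienced governments, all else equal.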
Radiocarbon dating is rarely used in historical or contact-era North American archaeology because of idiosyncrasies of the calibration curve that result in ambiguous calendar dates for this period. We explore the potential and requirements for radiocarbon dating and Bayesian analysis to create a time frame for early contact-era sites in northeast North America independent of the assumptions and approximations involved in temporal constructs based on trade goods and other archaeological correlates. To illustrate, we use Bayesian chronological modeling to analyze radiocarbon dates on short-lived samples and a post from four Huron-Wendat Arendarhonon sites (Benson, Sopher, Ball, and Warminster) to establish an independent chronology. We find that Warminster was likely occupied in 1615–1616, and so is the most likely candidate for the site of Cahiagué visited by Samuel de Champlain in 1615–1616, versus the other main suggested alternative, Ball, which dates earlier, as do the Sopher and Benson sites. In fact, the Benson site seems likely to date ~50 years earlier than currently thought. We present the methods employed to arrive at these new, independent age estimates and argue that absolute redating of historic-era sites is necessary to accurately assess existing interpretations based on relative dating and associated regional narratives.
Human-like motion of robots can improve human–robot interaction and increase efficiency. In this paper, a novel human-like motion planning strategy is proposed to help anthropomorphic arms generate human-like movements accurately. The strategy consists of three parts: movement primitives, a Bayesian network (BN), and a novel coupling neural network (CPNN). The movement primitives are used to decouple the human arm movements, and this classification of arm movements improves the accuracy of human-like movements. The motion-decision algorithm based on the BN predicts the occurrence probabilities of the motions and chooses an appropriate mode of motion. Then, the novel CPNN is proposed to solve the inverse kinematics problems of anthropomorphic arms. The CPNN integrates different models into a single network and reflects the features of these models by changing the network structure. Through the strategy, the anthropomorphic arms can generate various human-like movements with satisfactory accuracy. Finally, the validity of the proposed strategy is verified by simulations of the general motion of the humanoid robot NAO.
After the sharp transition to aridity that followed the “Green Sahara” episode 5500 years ago, human settlements took refuge in Egyptian oases, which have to varying extents been “Green Oases” for centuries. In that period, synchronous with the beginning of historical times, the desert’s aridity is generally regarded as broadly comparable to the current period. Natural and anthropogenic deposits studied during 13 excavation campaigns in Bahariya Oasis (Egyptian Desert) suggest that a fairly clear transition from a relatively green environment to much more arid landscapes occurred in the first millennia BCE and CE. This article aims at establishing the chronology of human occupations and environmental change within this period, by combining archaeological and radiocarbon data, using Bayesian modeling. It reveals that the drying up of the environment experienced by desert farmers occurred at some point between the reigns of Antoninus Pius and Caracalla (2nd–3rd century CE). The accuracy of the produced chronological models made it possible to highlight synchronisms between the end of this “Green Oasis” phase and comparable aridification phenomena on regional and interregional scales. Similar degradation processes on remote sites inside the Roman Empire might be explained by globalized anthropogenic agencies overlapping with a broader climatic drying.
Disagreement is a ubiquitous feature of human life, and philosophers have dutifully attended to it. One important question related to disagreement is epistemological: How does a rational person change her beliefs (if at all) in light of disagreement from others? The typical methodology for answering this question is to endorse a steadfast or conciliatory disagreement norm (and not both) on a priori grounds and selected intuitive cases. In this paper, I argue that this methodology is misguided. Instead, a thoroughgoingly Bayesian strategy is what's needed. Such a strategy provides conciliatory norms in appropriate cases and steadfast norms in appropriate cases. I argue, further, that the few extant efforts to address disagreement in the Bayesian spirit are laudable but uncompelling. A modelling, rather than a functional, approach gets us the right norms and is highly general, allowing the epistemologist to deal with (1) multiple epistemic interlocutors, (2) epistemic superiors and inferiors (i.e. not just epistemic peers), and (3) dependence between interlocutors.
The northern lobe of the White River Ash (WRAn) is part of a bilobate distribution of tephras that originated from the Wrangell Volcanic Field near the border of Alaska, USA, and Yukon, Canada. It is distributed across northeastern Alaska and the northwestern portion of the Yukon. The timing of this eruption has seen little critical analysis relative to the younger and more extensive eastern lobe eruption of the White River Ash. We compiled 38 radiocarbon (14C) dates from above and below the WRAn and employed several statistical approaches (identifying and eliminating or down-weighting outliers, combining dates, and comparing different Bayesian models) to provide a revised age estimate for the timing of the WRAn tephra deposition. Our results indicate that the most accurate modeled age estimate for the northern lobe of the White River Ash deposition is between 1689 and 1560 cal BP, with a mean and median of 1625 and 1623 cal BP, respectively. This age range is 90 to 200 years younger than previous age estimates.
The empirical importance of news shocks—anticipated future shocks—in business cycle fluctuations has been explored by using only actual data when estimating models augmented with news shocks. This paper additionally exploits forecast data to identify news shocks in a canonical dynamic stochastic general equilibrium model. The estimated model shows new empirical evidence that technology news shocks are a major source of fluctuations in US output growth. Exploiting the forecast data not only generates more precise estimates of news shocks and other parameters in the model, but also increases the contribution of technology news shocks to the fluctuations.
This article proposes a unified framework for solving and estimating linear rational expectations models with a variety of frequency-domain techniques, some established, some new. The solution methodology is applicable to a wide class of models and leads to straightforward construction of the spectral density for performing likelihood-based inference. We also generalize the well-known spectral decomposition of the Gaussian likelihood function to a composite version implied by several competing models. Taken together, these techniques yield fresh insights into the model's theoretical and empirical implications beyond what conventional time-domain approaches can offer. We illustrate the proposed framework using a prototypical new Keynesian model with fiscal details and two determinate monetary–fiscal policy regimes. The model is simple enough to deliver an analytical solution that makes the policy effects transparent under each regime, yet still able to shed light on the empirical interactions between US monetary and fiscal policies along different frequencies.
In the context of a motivating study of dynamic network flow data on a large-scale e-commerce website, we develop Bayesian models for online/sequential analysis for monitoring and adapting to changes reflected in node–node traffic. For large-scale networks, we customize core Bayesian time series analysis methods using dynamic generalized linear models (DGLMs). These are integrated into the context of multivariate networks using the concept of decouple/recouple that was recently introduced in multivariate time series. This method enables flexible dynamic modeling of flows on large-scale networks and exploitation of partial parallelization of analysis while maintaining coherence with an over-arching multivariate dynamic flow model. This approach is anchored in a case study on Internet data, with flows of visitors to a commercial news website defining a long time series of node–node counts on over 56,000 node pairs. Central questions include characterizing inherent stochasticity in traffic patterns, understanding node–node interactions, adapting to dynamic changes in flows and allowing for sensitive monitoring to flag anomalies. The methodology of dynamic network DGLMs applies to many dynamic network flow studies.
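As a hedged sketch of the kind of sequential update a DGLM performs for count data, the step below combines discount-based evolution with a conjugate gamma-Poisson observation update, in the spirit of the dynamic-model literature; the parameter names, discount factor, and numbers are ours, not the paper's.

```python
def dglm_step(alpha, beta, y, discount=0.95):
    """One step for a Poisson flow rate with a gamma(alpha, beta) state:
    power discounting inflates uncertainty (state evolution), then the
    observed count y is absorbed by the conjugate Poisson update."""
    a, b = alpha * discount, beta * discount   # evolve: widen the prior
    return a + y, b + 1.0                      # update: conjugate Poisson step

# A node pair whose rate is believed to be ~1 visit per interval
# (gamma mean 10/10); a burst of 5 visits pulls the posterior mean up.
a, b = dglm_step(10.0, 10.0, y=5)
posterior_mean = a / b  # larger than the prior mean of 1.0
```

Running one such recursion per node pair is what makes the decouple/recouple strategy parallelizable: each flow series updates independently, and cross-series structure is reintroduced at the recoupling stage.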
In political science, data with heterogeneous units are used in many studies, such as those involving legislative proposals in different policy areas, electoral choices by different types of voters, and government formation in varying party systems. To disentangle decision-making mechanisms by units, traditional discrete choice models focus exclusively on the conditional mean and ignore the heterogeneous effects within a population. This paper proposes a conditional binary quantile model that goes beyond this limitation to analyze discrete response data with varying alternative-specific features. This model offers an in-depth understanding of the relationship between the explanatory and response variables. Compared to conditional mean-based models, the conditional binary quantile model relies on weak distributional assumptions and is more robust to distributional misspecification. The model also relaxes the assumption of the independence of irrelevant alternatives, which is often violated in practice. The method is applied to a range of political studies to show the heterogeneous effects of explanatory variables across the conditional distribution. Substantive interpretations from counterfactual scenarios are used to illustrate how the conditional binary quantile model captures unobserved heterogeneity, which extant models fail to do. The results point to the risk of averaging out the heterogeneous effects across units by conditional mean-based models.