The social sciences underwent rapid development in post-war America. Problems once framed in social terms were gradually redefined as individual in both scope and remedy, with economics and psychology gaining influence among the social sciences. By the 1970s, both economics and psychology had spread their intellectual remits wide: psychology's concepts suffused everyday language, while economists entered a myriad of policy debates. Psychology and economics contributed to, and benefited from, a conception of society that was increasingly skeptical of social explanations and interventions. Sociology, in particular, lost intellectual and policy ground to its peers, even regarding 'social problems' that the discipline had long considered its settled domain. The book's ten chapters explore this shift, each refracted through a single 'problem' (the family, crime, urban concerns, education, discrimination, poverty, addiction, war, and mental health), examining the effects an increasingly individualized lens has had on the way we see these problems.
To physicists and engineers, the real world is perceived and analysed as data, models, and algorithms. Data are noisy by nature, and classical statistical tools have so far been successful in dealing with relatively small levels of randomness. The recent emergence of Big Data, and the computing power required to analyse it, has rendered classical tools outdated and insufficient. Tools such as random matrix theory and the study of large sample covariance matrices can efficiently process these big data sets and help make sense of modern, deep learning algorithms. Presenting an introductory calculus course for random matrices, the book focusses on modern concepts in matrix theory, generalising the standard concept of probabilistic independence to non-commuting random variables. Concretely worked-out examples and applications to financial engineering and portfolio construction make this unique book an essential tool for physicists, engineers, data analysts, and economists.
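The central claim about large sample covariance matrices can be made concrete. The sketch below (illustrative only, not from the book) shows the classic random-matrix effect: even when the true covariance is the identity, the eigenvalues of the sample covariance of high-dimensional data spread over the whole Marchenko–Pastur interval, so naive eigenvalue estimates are biased when the dimension is comparable to the sample size.

```python
import numpy as np

# Hypothetical illustration: eigenvalues of a large sample covariance
# matrix of pure unit-variance noise concentrate on the Marchenko-Pastur
# support [(1 - sqrt(q))^2, (1 + sqrt(q))^2], where q = p / n.
rng = np.random.default_rng(0)

n, p = 4000, 1000          # n observations of a p-dimensional i.i.d. signal
q = p / n                  # aspect ratio of the data matrix
X = rng.standard_normal((n, p))
S = X.T @ X / n            # sample covariance; true covariance is identity

eigs = np.linalg.eigvalsh(S)
lam_min = (1 - np.sqrt(q)) ** 2
lam_max = (1 + np.sqrt(q)) ** 2

# Although every true eigenvalue equals 1, the sample eigenvalues
# spread across the whole Marchenko-Pastur interval.
print(eigs.min(), eigs.max())   # close to lam_min and lam_max
print(lam_min, lam_max)         # 0.25 and 2.25 for q = 0.25
```

The larger `q` is (i.e. the closer the dimension gets to the sample size), the wider the spread, which is exactly the regime where the classical tools mentioned above break down.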
Get up to speed on the modelling, design, technologies, and applications of tunable circuits and reconfigurable mm-wave systems. Coverage includes smart antennas and frequency-agile RF components, a detailed comparison of three key technologies for the design of tunable mm-wave circuits (CMOS, RF MEMS, and microwave liquid crystals), and measurement results of state-of-the-art prototypes. Numerous examples of tunable circuits and systems are included that can be practically implemented for the reader's own needs. Ideal for graduate students studying RF/microwave engineering, and researchers and engineers involved in circuit and system design for new communication platforms such as mm-wave 5G and beyond, high-throughput satellites in GSO, and future satellite constellations in MEO/LEO, as well as for automotive radars, security and biomedical mm-wave systems.
The 2-degrees target of the Paris Agreement and Sustainable Development Goal 7 on energy are intrinsically intertwined and highlight the urgency of an effective and integrated approach on climate change and energy. However, there are over a hundred international and transnational institutions with different characteristics and priorities that aim to address climate and energy-related targets. While prior research has contributed useful insights into the complexity of climate and energy governance, respectively, an integrated and coherent analysis of the climate-energy nexus is lacking. This chapter therefore maps, visualizes, and analyzes this nexus, i.e. institutions that seek to govern climate change and energy simultaneously. In addition, the chapter zooms in on three specific subsets of institutions: renewable energy, fossil fuel subsidy reform, and carbon pricing. The mapping and analysis are based on a new dataset and provide first insights into the gaps, overlaps, and varying degrees of complexity of the climate-energy nexus and across its subfields. Moreover, the chapter serves as the empirical basis for further analyses of coherence, management, legitimacy, and effectiveness, and as the first step in creating a knowledge base to guide actors who seek to navigate the institutionally complex landscape of the climate-energy nexus.
This chapter establishes four evaluative themes that will be employed across this volume to analyze the institutional complexity of policy fields in the climate-energy nexus: coherence, management, legitimacy, and effectiveness. Coherence among institutions is conceptualized along four dimensions: convergence on an overarching core norm for the policy field, balanced coverage and distribution of memberships (private, public, hybrid), balanced coverage and distribution of governance functions (standards and commitments, operational activities, information and networking, financing), and mechanisms underlying cross-institutional relations (cognitive, normative, behavioural). Management will be examined according to types of managing agents, political levels (from domestic to global), and the consequences of management efforts in enhancing coherence. Legitimacy will be assessed along nine dimensions, among them expertise, transparency, accountability, and procedural and distributive fairness. Effectiveness, finally, will be examined in terms of the normative and legal output produced by the institutions, their behaviour-changing outcome, and their ultimate problem-solving impact. Altogether, the four themes and their dimensions make up a novel framework for an in-depth analysis of a governance nexus. They help us examine a variety of important questions in a comparative research design, combining a high level of ambition with feasibility and novelty.
Leading accounts of the politics of the welfare state focus on societal demands for risk-spreading policies. Yet current measures of the welfare state focus not on risk, but on inequality. To address this gap, this letter describes the development of two new measures, risk incidence and risk reduction, which correspond to the prevalence of large income losses and the degree to which welfare states reduce that prevalence, respectively. Unlike existing indicators, these measures require panel data, which the authors harmonize for twenty-one democracies. The study finds that large losses affect all income and education levels, making the welfare state valuable to a broad cross-section of citizens. It also finds that taxes and transfers greatly reduce the prevalence of such losses, though to varying degrees across countries and over time. Finally, it disaggregates the measures to identify specific ‘triggers’ of large losses, and finds that these triggers are associated with risks on which welfare states focus, such as unemployment and sickness.
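The two measures described in the letter lend themselves to a simple operational sketch. The code below is a hypothetical illustration, not the authors' implementation: it assumes a "large loss" means a year-over-year income drop of at least 25%, defines risk incidence as the share of person-years with such a drop in market income, and defines risk reduction as the proportional fall in that share when disposable (post-tax-and-transfer) income is used instead. The panel data and the stylized transfer rule are both invented for the example.

```python
import numpy as np

def large_loss_share(income, threshold=0.25):
    """Share of person-years with a year-over-year income drop >= threshold.

    income: array of shape (n_people, n_years), as in harmonized panel data.
    """
    prev, curr = income[:, :-1], income[:, 1:]
    drops = (prev - curr) / prev >= threshold
    return drops.mean()

# Hypothetical panel: market income for 5,000 people over 10 years.
rng = np.random.default_rng(1)
market = rng.lognormal(mean=10.5, sigma=0.3, size=(5000, 10))

# A stylized welfare state: transfers keep disposable income from
# falling below 80% of each person's own average market income.
floor = 0.8 * market.mean(axis=1, keepdims=True)
disposable = np.maximum(market, floor)

risk_incidence = large_loss_share(market)        # prevalence of large losses
post_gov = large_loss_share(disposable)
risk_reduction = 1 - post_gov / risk_incidence   # share of losses removed

print(f"incidence: {risk_incidence:.3f}, reduction: {risk_reduction:.2%}")
```

Varying the threshold, the income concept, or the transfer rule is how such measures can be disaggregated to probe specific 'triggers' of large losses.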
Atom probe tomography (APT) analysis conditions play a major role in composition measurement accuracy. Preferential evaporation (PE), which biases the apparent composition more than other well-known phenomena in APT, is strongly connected to those analysis conditions. One way to optimize them, in order to obtain the most accurate measurement, is therefore to be able to predict and then estimate their influence on the apparent composition. An analytical model is proposed to quantify the PE. This model is applied to three different alloys: NiCu, FeCrNi, and FeCu. The model explains not only the analysis temperature dependence, as in an already existing model, but also the dependence on the pulse fraction and the pulse frequency. Moreover, the model can also provide an energetic constant directly linked to the energy barrier required to field-evaporate an atom from the sample surface.
This paper presents a theoretical and experimental analysis of the capabilities of the dual-input Doherty power amplifier (DPA) architecture to mitigate efficiency and output power degradations when used in a mismatched load environment. Following a simplified piecewise-linear approach, an analytical demonstration is proposed to derive the optimal radio frequency drives applied to the auxiliary path of the DPA to restore power performance while avoiding large-signal voltage clipping of the active cells. The proposed analytical study is corroborated with harmonic balance simulation results of a C-band, 20-W GaN DPA prototype. The fabricated dual-input DPA prototype was measured under 1.5-VSWR mismatch configurations to validate the proposed analysis.
This paper presents a reduced-order modelling strategy for Rayleigh–Bénard convection of a radiating gas, based on the proper orthogonal decomposition (POD). Direct numerical simulation (DNS) of coupled natural convection and radiative transfer in a cubic Rayleigh–Bénard cell is performed for a radiating gas mixture at room temperature and at a fixed Rayleigh number. It is shown that radiative transfer between the isothermal walls and the gas triggers convection growth outside the boundary layers. Mean and turbulent kinetic energy increase with radiation, as do temperature fluctuations, albeit to a lesser extent. As in the uncoupled case, the large-scale circulation (LSC) settles in one of the two diagonal planes of the cube with a clockwise or anticlockwise motion, and experiences occasional brief reorientations, i.e. rotations of the LSC in the horizontal plane. A POD analysis reveals that the dominant POD eigenfunctions are preserved with radiation while the POD eigenvalues are increased. Two POD-based reduced-order models including radiative transfer effects are then derived: the first is based on coupled DNS data, while the second is an a priori model based on uncoupled DNS data. Owing to the weak temperature differences, radiation effects on mode amplitudes are linear in the models. Both models capture the increase in energy with radiation and are able to reproduce the low-frequency dynamics of reorientations and the high-frequency dynamics associated with the LSC velocity observed in the coupled DNS.
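The POD step underlying both reduced-order models can be sketched with the snapshot method: stack flow snapshots column-wise, take an SVD, and read off spatial modes (left singular vectors) and modal energies (squared singular values). The data below are synthetic, standing in for the paper's DNS fields.

```python
import numpy as np

# Minimal POD sketch (not the paper's solver). Snapshots of a 1-D field
# are stacked column-wise; POD modes are the left singular vectors and
# POD eigenvalues are the squared singular values (modal energies).
rng = np.random.default_rng(2)

n_points, n_snapshots = 500, 100
t = np.linspace(0, 2 * np.pi, n_snapshots)
x = np.linspace(0, 1, n_points)[:, None]

# Synthetic flow: two coherent structures plus small-scale noise.
snapshots = (np.sin(2 * np.pi * x) * np.cos(t)
             + 0.3 * np.sin(4 * np.pi * x) * np.sin(2 * t)
             + 0.01 * rng.standard_normal((n_points, n_snapshots)))

mean_flow = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_flow                # POD acts on fluctuations

U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
pod_eigenvalues = s**2 / n_snapshots         # modal energies
modes = U                                    # spatial POD eigenfunctions
amplitudes = np.diag(s) @ Vt                 # temporal mode coefficients

energy_fraction = np.cumsum(pod_eigenvalues) / pod_eigenvalues.sum()
print(f"modes for 99% energy: {np.searchsorted(energy_fraction, 0.99) + 1}")
```

A reduced-order model then retains only the first few `modes` and evolves their `amplitudes`; the paper's comparison of coupled and uncoupled DNS amounts to asking how radiation changes `pod_eigenvalues` while leaving the dominant `modes` intact.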
Unlike randomized controlled trials, real-world evidence (RWE) studies raise concerns about methodological rigor. The objective of this study was to characterize the methodological practices of studies collecting pharmacoeconomic data in a real-world setting for the management of type 2 diabetes mellitus (T2DM).
A systematic literature review was performed using the PICO framework: population consisted of T2DM patients, interventions and comparators were any intervention for T2DM care or absence of intervention, and outcomes were resource utilization, productivity loss or utility. Only RWE studies were included, defined as studies that were not clinical trials and that collected de novo data (no retrospective analysis).
The literature search identified 1,158 potentially relevant studies, of which sixty were included in the literature review. Many studies showed a lack of transparency by not reporting the source for outcome and exposure measurement, the source for patient selection, the number of study sites, recruitment duration, sample size calculation, sampling method, handling of missing data, approval by an ethics committee, patient consent, conflicts of interest, or funding. A significant proportion of studies had poor quality scores and was at high risk of bias.
RWE from T2DM studies lacks transparency and credibility. There is a need for good procedural practices that can increase confidence in RWE studies. Standardized methodologies specifically adapted for RWE studies collecting pharmacoeconomic data for the management of T2DM could help future reimbursement decision making in this major public health problem.
We assessed Clostridioides difficile toxin testing and positivity for all patients in Manitoba hospitals during June 2016–November 2018. The testing rate was 30 per 10,000 patient bed days (95% confidence interval [CI], 30–31) and the incidence rate was 3.5 per 10,000 patient bed days (95% CI, 3.3–3.7). The context of testing is essential to the interpretation of data among jurisdictions.
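Rates of this form are straightforward to reproduce. The sketch below uses hypothetical counts (the abstract reports only the resulting rates, not the raw numerators or denominator) and a normal approximation to the Poisson confidence interval; the specific numbers are chosen so the rates match those reported.

```python
import math

def rate_per_10k(events, bed_days):
    """Rate per 10,000 patient bed days with an approximate 95% Poisson CI."""
    rate = events / bed_days * 10_000
    # Normal approximation: the SE of a Poisson count is sqrt(events).
    se = math.sqrt(events) / bed_days * 10_000
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

# Hypothetical counts for illustration only; chosen to reproduce the
# reported rates of 30 (testing) and 3.5 (positivity) per 10,000 bed days.
tests, positives, bed_days = 6_000, 700, 2_000_000
print(rate_per_10k(tests, bed_days))      # testing rate per 10,000 bed days
print(rate_per_10k(positives, bed_days))  # incidence rate per 10,000 bed days
```

Because the interval width scales with the square root of the event count, comparing such rates across jurisdictions also requires comparable denominators and testing practices, which is the abstract's point about context.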