After the Meiji Restoration of 1868, Japan introduced Western institutions and new technologies. Japan was the first Asian economy to make the transformation to ‘modern economic growth’, but its development process was not smooth, and catching up with Western economies took until the 1970s. From a supply-side perspective, the main drivers of economic growth to the 1970s were capital accumulation and total factor productivity (TFP) growth. From a demand-side perspective, private and government investment played a more significant role than private consumption. After the high-speed growth era, investment and TFP growth slowed down and exports took centre stage in propelling the economy forward. Moreover, especially since the 1990s, some features that had underpinned Japan’s remarkable economic performance, such as lifetime employment, the dual economy, and the high saving rate, came to hold Japan back. Globalization and slow or negative growth in the working-age population have also posed new challenges.
In March 2020, at the onset of the coronavirus disease 2019 (COVID-19) pandemic in the United States, the Southern California Extracorporeal Membrane Oxygenation (ECMO) Consortium was formed. The consortium included physicians and coordinators from the 4 ECMO centers in San Diego County. Guidelines were created to ensure that ECMO was delivered equitably and in a resource-effective manner across the county during the pandemic. A biomedical ethicist reviewed the guidelines to ensure ECMO use would provide maximal community benefit of this limited resource. The San Diego County Health and Human Services Agency further incorporated the guidelines into its plans for the allocation of scarce resources. The consortium held weekly video conferences to review countywide ECMO capacity (including census and staffing), share data, and discuss clinical practices and difficult cases. Equipment exchanges between ECMO centers maximized regional capacity. From March 1 to November 30, 2020, consortium participants placed 97 patients on ECMO. No eligible patients were denied ECMO due to lack of resources or capacity. The Southern California ECMO Consortium may serve as a model for other communities seeking to optimize ECMO resources during the current COVID-19 pandemic or future pandemics.
The National Institute for Health and Care Excellence (NICE), the UK's primary health care priority-setting body, has traditionally described its decisions as being informed by ‘social value judgements’ about how resources should be allocated across society. This paper traces the intellectual history of this term and suggests that, in NICE's adoption of the idea of the ‘social value judgement’, we are hearing the echoes of welfare economics at a particular stage of its development, when logical positivism provided the basis for thinking about public policy choice. As such, it is argued that the term offers an overly simplistic conceptualisation of NICE's normative approach and contributes to a situation in which NICE finds itself without the necessary language fully and accurately to articulate its basis for decision-making. It is suggested that the notion of practical public reasoning, based on reflection about coherent principles of action, might provide a better characterisation of the enterprise in which NICE is, or hopes to be, engaged.
With the increase of access point (AP) density and the exponential growth of mobile devices supported by ultra-dense networks (UDNs), overlapped user-centric (UC) clustering is becoming a promising design principle for guaranteeing the quality of service (QoS) required by each item of user equipment (UE). However, overlapped UC clustering has to be designed jointly with resource allocation in UDNs, where both traffic-load balancing and the limited availability of orthogonal resource blocks (RBs) must be carefully considered. To tackle these challenges, we formulate a joint overlapped UC clustering and resource allocation problem with the goal of maximizing the system’s spectral efficiency (SE). With the aid of a graph-theoretical framework, the problem is decoupled into two independent subproblems, and a distributed overlapped UC clustering solution as well as a graph-based resource allocation scheme are proposed. Our numerical results quantify the superior performance of the proposed framework in terms of both its per-area aggregated user rate (PAAR) and user rate.
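The idea of a graph-based RB assignment can be illustrated with a minimal sketch (an assumption for illustration, not the paper's actual algorithm): overlapped UC clusters that share an AP conflict with one another, so mutually orthogonal RBs can be assigned by greedy colouring of the resulting conflict graph. All cluster and AP names below are hypothetical.

```python
# Greedy graph-colouring heuristic for assigning orthogonal resource blocks
# (RBs) to overlapped user-centric clusters. Two clusters conflict if they
# share at least one serving AP and must therefore receive different RBs.

def build_conflict_graph(clusters):
    """Return cluster names and the set of conflicting pairs."""
    names = list(clusters)
    edges = set()
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if clusters[a] & clusters[b]:  # shared AP -> conflict
                edges.add((a, b))
    return names, edges

def greedy_rb_assignment(clusters):
    names, edges = build_conflict_graph(clusters)
    neighbours = {n: set() for n in names}
    for a, b in edges:
        neighbours[a].add(b)
        neighbours[b].add(a)
    rb_of = {}
    # Colour clusters in descending-degree order (a common heuristic).
    for n in sorted(names, key=lambda n: -len(neighbours[n])):
        used = {rb_of[m] for m in neighbours[n] if m in rb_of}
        rb = 0
        while rb in used:  # smallest RB index not used by a neighbour
            rb += 1
        rb_of[n] = rb
    return rb_of

clusters = {            # cluster -> set of serving APs (hypothetical)
    "UE1": {"AP1", "AP2"},
    "UE2": {"AP2", "AP3"},  # overlaps UE1 at AP2
    "UE3": {"AP4"},
}
assignment = greedy_rb_assignment(clusters)
```

Non-overlapping clusters may reuse the same RB, which is what lets overlapped clustering coexist with a limited orthogonal RB budget.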
This chapter investigates the application of non-orthogonal multiple access (NOMA) in heterogeneous ultra-dense networks (HUDNs). In particular, we first propose a unified NOMA framework and then discuss its applications in HUDNs. Because small cells are densely deployed and resources are shared non-orthogonally, the system suffers severe interference. In this chapter, we identify the key challenges in unified NOMA-enabled HUDNs, especially for user association and resource allocation. In addition, we carry out related case studies for the proposed unified NOMA-enabled HUDNs, including user association based on matching theory and resource allocation based on optimization techniques. Furthermore, some critical insights are provided for the design of NOMA-enabled HUDNs, which can increase network access capacity in the next generation of communication systems.
Recently, software-defined networking (SDN) has emerged as an efficient technology for realizing flexible resource management and system performance control by separating resource management from geo-distributed resources, especially in heterogeneous ultra-dense networks (HetUDNs). This work establishes an SDN-based architecture for mobile traffic offloading in HetUDNs, which consist of densely deployed macro-cell base stations (MBSs) and small-cell base stations (SBSs). Additionally, we explore a scenario with information asymmetry: the capacity of the SBSs is accessible, but their offloading performance cannot be observed by the SDN controller. To address this asymmetry, we propose a bundle of traffic offloading contracts that encourage each SBS to select the contract designed specifically for it by guaranteeing its maximum utility. Moreover, by designing contracts that satisfy individual rationality and incentive compatibility for different SBS types, the characteristics of a large number of SBSs are aggregated to support the efficient selection of SBSs to provide traffic offloading. We then derive a closed-form expression for SBS types and prove the monotonicity and incentive compatibility of the resulting contracts. Furthermore, simulation results validate the system performance and the effectiveness and efficiency of the proposed contract-based traffic offloading mechanism.
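The contract-theoretic mechanism can be made concrete with a toy sketch (an assumption for illustration; the paper's closed-form type expression and utility functions are not reproduced here): a menu of (offloaded traffic, reward) contracts indexed by SBS type, plus a brute-force check that the menu is incentive compatible, i.e., each type gets its highest utility from its own contract.

```python
# Toy contract menu for traffic offloading. An SBS of type theta earns the
# contract reward pi but pays an offloading cost that shrinks as its type
# (capability) grows. Utility form and all numbers are hypothetical.

def utility(theta, q, pi, unit_cost=1.0):
    """Hypothetical SBS utility: reward minus type-scaled offloading cost."""
    return pi - unit_cost * q / theta

def is_incentive_compatible(types, menu):
    """True if every type weakly prefers its own contract to all others."""
    for i, theta in enumerate(types):
        own = utility(theta, *menu[i])
        if any(utility(theta, *menu[j]) > own + 1e-12
               for j in range(len(menu))):
            return False
    return True

types = [1.0, 2.0, 3.0]                       # ascending SBS types
menu = [(1.0, 1.0), (2.0, 1.6), (3.0, 2.0)]   # (traffic q, reward pi) per type
ic = is_incentive_compatible(types, menu)     # this menu is IC
```

Note that both offloaded traffic and reward increase with type, mirroring the monotonicity property the abstract refers to; a menu violating that ordering would fail the same check.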
The frameworks used by Health Technology Assessment (HTA) agencies for value assessment of medicines aim to optimize healthcare resource allocation. However, they may not be effective at capturing the value of antimicrobial drugs.
To analyze stakeholder perceptions regarding how antimicrobials are assessed for value for reimbursement purposes and how the Australian HTA framework accommodates the unique attributes of antimicrobials in cost-effectiveness evaluation.
Eighteen individuals representing the pharmaceutical industry or policy-makers were interviewed. Interviews were transcribed verbatim, coded, and thematically analyzed.
Key emergent themes were that reimbursement decision-making should consider the antibiotic spectrum when assessing value, risk of shortages, the impact of procurement processes on low-priced comparators, and the need for methodological transparency when antimicrobials are incorporated into the economic evaluation of other treatments.
Participants agreed that the current HTA framework for antimicrobial value assessment is inadequate to properly inform funding decisions, as the contemporary definition of cost-effectiveness fails to explicitly incorporate the risk of future resistance. Policy-makers were uncertain about how to incorporate future resistance into economic evaluations without a systematic method to capture costs avoided due to good stewardship. Lacking financial reward for the benefits of narrower-spectrum antimicrobials, companies will likely focus on developing broad-spectrum agents with wider potential use. The perceived risks of shortages have influenced the funding of generic antimicrobials in Australia, with policy-makers suggesting a willingness to pay more for assured supply. Although antibiotics often underpin the effectiveness of other medicines, it is unclear how this is incorporated into economic models.
The use of extracorporeal membrane oxygenation (ECMO) has grown rapidly for patients in severe cardiac or respiratory failure. As a result, ECMO networks are being developed across the world using a “hub and spoke” model. Current guidelines call for all patients transported on ECMO to be accompanied by a physician during transport. However, as ECMO centers and networks grow, the increasing number of transports will be limited by this mandate.
The aim of this study was to compare rates of adverse events occurring during transport of ECMO patients with and without an additional clinician, defined as a physician, nurse practitioner (NP), or physician assistant (PA).
This is a retrospective cohort study of all adults transported while cannulated on ECMO from 2011-2018 via ground and air between 21 hospitals in the northeastern United States, comparing transports with and without additional clinicians. The primary outcome was the rate of major adverse events, and the secondary outcome was minor adverse events.
Over the seven-year study period, 93 patients on ECMO were transported. Twenty-three transports (24.7%) were accompanied by a physician or other additional clinician. Major adverse events occurred in 21.5% of all transports. There was no difference in the total rate of major adverse events between accompanied and unaccompanied transports (P = .91). Multivariate analysis did not demonstrate any parameter as being predictive of major adverse events.
In a retrospective cohort study of transports of ECMO patients, there was no association between the overall rate of major adverse events in transport and the accompaniment of an additional clinician. No variables were associated with major adverse events in either cohort.
In order to realize the online allocation of collaborative processing resources in smart workshops in the context of cloud manufacturing, a multi-objective optimization model of workshop collaborative resources (MOM-WCR) was proposed. MOM-WCR was constructed around four optimization objectives: processing time, processing cost, product qualification rate, and resource utilization. Based on the time sequence of workshop processing tasks, the workshop's collaborative manufacturing resources were integrated into MOM-WCR. The fuzzy analytic hierarchy process (FAHP) was adopted to reduce the multi-objective problem to a single-objective problem. Then, an improved firefly algorithm integrating particle swarm optimization (IFA-PSA) was used to solve MOM-WCR. Finally, a group of connecting-rod processing experiments was used to verify the proposed model. The results show that the model is feasible for workshop-level resource allocation in the context of cloud manufacturing, and the improved firefly algorithm performs well in solving the multi-objective resource allocation problem.
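The FAHP step collapses the four objectives into one scalar score. A minimal weighted-sum sketch of that reduction is shown below (an assumption for illustration; the paper's actual FAHP weights and normalization are not reproduced). Objectives to be minimized (time, cost) are inverted after min–max normalization so that higher scores are always better.

```python
# Weighted-sum scalarization of a multi-objective resource-allocation score.
# candidates: one tuple of raw objective values per allocation candidate.

def scalarize(candidates, weights, maximize):
    """Min-max normalise each objective across candidates, invert the
    minimisation objectives, then return one weighted score per candidate."""
    cols = list(zip(*candidates))  # column view: one tuple per objective
    scores = []
    for cand in candidates:
        s = 0.0
        for j, w in enumerate(weights):
            lo, hi = min(cols[j]), max(cols[j])
            norm = 0.0 if hi == lo else (cand[j] - lo) / (hi - lo)
            if not maximize[j]:
                norm = 1.0 - norm  # smaller raw value -> better score
            s += w * norm
        scores.append(s)
    return scores

candidates = [
    (10.0, 100.0, 0.95, 0.7),   # (time, cost, qualification rate, utilization)
    (12.0,  80.0, 0.90, 0.8),
]
weights = [0.4, 0.3, 0.2, 0.1]           # hypothetical FAHP-derived weights
maximize = [False, False, True, True]    # minimise time/cost, maximise the rest
scores = scalarize(candidates, weights, maximize)
best = max(range(len(scores)), key=scores.__getitem__)
```

With these hypothetical weights the first candidate wins: its advantage on the heavily weighted time objective outweighs its higher cost. The resulting single-objective score is what a metaheuristic such as the hybrid firefly/PSO solver would then maximize.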
One of the good practice principles for health technology assessment (HTA) is having a clear link between the assessment and decision making. The objective of the 2019 Latin American Policy Forum (LatamPF) of Health Technology Assessment International was to explore different models of connection between HTA and decision making and to discuss the potential applicability of such models in Latin America.
This paper is based on a background document and the deliberations of the members of the LatamPF (fifty-four participants from twelve countries) where a design-thinking methodology was used.
The participants agreed that insufficient links between HTA and decision making undermine the legitimacy of decisions, expose the HTA process to excessive political and judicial influence, and promote the exclusion of some stakeholders from participating in the assessment process and decision making. High priority aspects of the HTA process that could feasibly be improved and which hold the greatest potential to generate positive changes in the health systems in the region were identified. The majority of these aspects were associated with the appropriate institutionalization of HTA, a greater degree of participation by different stakeholders, and improved transparency in the HTA process.
The LatamPF identified barriers and recommended actions to strengthen the link between HTA and decision making. Participants emphasized that there is now a window of opportunity in the region as many societal actors see this as a priority. For this reason, health system stakeholders must take this opportunity to increase efforts toward strengthening the link between HTA and decision making.
Effective administration of healthcare in an emergency setting, especially in field-hospital deployment where order must be established, needs assessed and limited resources allocated effectively, is considerably more complex than the regular patient–doctor interactions characteristic of routine times. Due to the complexity and uncertainty typical of such an environment, leadership is required not only by the field hospital staff but also by the affected public, which seeks leadership in those who are perceived to be the center of clinical service delivery.
This type of leadership demands organized command and control and practice of more than just basic leadership processes, and therefore requires, alongside the mission leader, a structured management and task-orientated chain of command.
For the hospital to operate effectively and independently, it is necessary to also define the organizational structure. The organizational structure discussed in this chapter is a model tested over the past three decades by the IDF Medical Corps hospital in numerous missions. This structure is generally similar to the basic structure of a small- to medium-scale hospital in routine times. At the same time, it allows more focused and simple managing processes required in non-routine scenarios such as emergencies or disasters.
Flower and leaf herbivory can have substantial negative impacts on plant fitness. While flower removal or damage by florivores has direct negative effects on plant fitness, folivores affect plant fitness by reducing resource allocation to reproduction. In this study, we examine the effects of both flower and leaf herbivory by leaf-cutting ants on the reproductive success of the shrub Miconia nervosa (Smith) Triana (Family Melastomataceae) in a fragment of Atlantic Forest in Northeast Brazil. We conducted a randomized block-designed field experiment with nine replicates (blocks), in which three plants per block were assigned to one of the three following treatments: undamaged plants (ant exclusion), leaf-damaged plants (ant exclusion from reproductive organs, but not from leaves), and flower + leaf-damaged plants (no exclusion of ants). We then measured flower production, fruit set, and fruit production. Our results showed that flower production in flower + leaf-damaged plants was reduced nearly twofold relative to undamaged plants, while flower production in leaf-damaged plants remained constant. The proportion of flowers that turned into fruits (i.e., fruit set), however, increased by 15% in flower + leaf-damaged plants, while it slightly decreased in leaf-damaged compared to undamaged plants. In contrast, fruit production was similar across all treatments. Taken together, our results suggest a prominent role of ant floral herbivory across different stages of the reproductive cycle in M. nervosa, with no consequences for final fruit production. The tolerance of M. nervosa to leaf-cutting ant herbivory might explain its high abundance in human-modified landscapes where leaf-cutting ants are hyper-abundant.
In the western Serengeti of Tanzania, African elephant Loxodonta africana populations are increasing, which is rare across the species’ range. Here, conservation objectives come into conflict with competing interests such as agriculture. Elephants regularly damage crops, which threatens livelihoods and undermines local support for conservation. For damage reduction efforts to be successful, limited resources must be used efficiently and strategies for mitigation and prevention should be informed by an understanding of the spatial and temporal distribution of crop damage. We assessed historical records of crop damage by elephants to describe the dynamics and context of damage in the western Serengeti. We used binary data and generalized additive models to predict the probability of crop damage at the village level in relation to landscape features and metrics of human disturbance. During 2012–2014 there were 3,380 reports of crop damage by elephants submitted to authorities in 42 villages. Damage was concentrated in villages adjacent to a reserve boundary and peaked during periods of crop maturity and harvest. The village-level probability of crop damage was negatively associated with distance from a reserve, positively with length of the boundary shared with a reserve, and peaked at moderate levels of indicators of human presence. Spatially aggregated historical records can provide protected area managers and regional government agencies with important insights into the distribution of conflict across the landscape and between seasons, and can guide efforts to optimize resource allocation and future land use planning efforts.
Quantifying interconnected performances of the modules in a colonial organism (feeding, sexual reproduction, rejuvenation, dormancy) into an integral picture enables studying functional dynamics and resource allocation at different levels – from module to population. Testing this approach on the common boreal-Arctic bryozoan Cribrilina annulata in the White Sea, we describe its life history, comparing colonies on two algal substrates with contrasting size and lifespan. Colonies living on kelps were much larger and had a higher proportion of dormant zooids, whereas the percentage of reproducing, feeding and rejuvenating zooids was higher in colonies on red algae (with the colonies also exhibiting longer reproductive period). Colony lifespan was dependent both on substrate type and on time of colony establishment, lasting from 4–5 to up to 17 months on kelps and 14–18 months on red algae. During the reproductive season (May–September) the C. annulata population consisted of colonies of three cohorts on both substrata: overwintered and two summer generations that behaved differently. Whereas overwintered and summer colonies established in June–early August produced larvae, most of the colonies established after mid-summer were preparing for hibernation and postponed reproduction until next spring. Moreover, young reproducing colonies formed brooding hermaphrodite zooids of ordinary size, whereas overwintered colonies budded smaller-sized basal and frontal (dwarf) hermaphrodites. Finally, overall zooidal performance in co-existing colonies of the overwintered and young generations was different on kelps, but similar on red algae. Altogether our findings indicate that the life histories of colonial epibionts are much more complex and evolutionarily flexible than generally acknowledged.
As referrals to specialist palliative care (PC) grow in volume and diversity, an evidence-based triage method is needed to enable services to manage waiting lists in a transparent, efficient, and equitable manner. Discrete choice experiments (DCEs) have not to date been used among PC clinicians, but may serve as a rigorous and efficient method to explore and inform the complex decision-making involved in PC triage. This article presents the protocol for a novel application of an international DCE as part of a mixed-method research program, ultimately aiming to develop a clinical decision-making tool for PC triage.
Five stages of protocol development were undertaken: (1) identification of attributes of interest; (2) creation and (3) execution of a pilot DCE; and (4) refinement and (5) planned execution of the final DCE.
Six attributes of interest to PC triage were identified and included in a DCE that was piloted with 10 palliative care practitioners. The pilot was found to be feasible, with an acceptable cognitive burden, but refinements were made, including the creation of an additional attribute to allow independent analysis of concepts involved. Strategies for recruitment, data collection, analysis, and modeling were confirmed for the final planned DCE.
Significance of results
This DCE protocol serves as an example of how the sophisticated DCE methodology can be applied to health services research in PC. Discussion of key elements that improved the utility, integrity, and feasibility of the DCE provide valuable insights.
This systematic review aimed to identify criteria being used for priority setting for resource allocation decisions in low- and middle-income countries (LMICs). Furthermore, the included studies were analyzed from a policy perspective to understand priority setting processes in these countries.
Searches were carried out in PubMed, Embase, Econlit, and Cochrane databases, supplemented with pre-identified Web sites and bibliographic searches of relevant papers. Quality appraisal of included studies was undertaken. The review protocol is registered in International Prospective Register of Systematic Reviews PROSPERO CRD42017068371.
Of 16,412 records screened by title and abstract, 112 papers were identified for full-text screening and 44 studies were included in the final analysis. Overall, cost-effectiveness (52 percent; n = 22) and health benefits (45 percent; n = 19) were the most frequently cited criteria used for priority setting in public health resource allocation. Variations in criteria were also noted between regions (across LMICs) and between approaches (such as health technology assessment, multi-criteria decision analysis (MCDA), and accountability for reasonableness (AFR)). Our review found that the MCDA approach was more frequently used in upper-middle-income countries and AFR in lower-income countries for priority setting in health. Policy makers were the most frequently consulted stakeholders in all regions.
Conclusions and Recommendations
Priority-setting criteria for health resource allocation decisions in LMICs largely comprised cost-effectiveness and health benefits. Other criteria, such as a legal and regulatory framework conducive to implementation, fairness/ethics, and political considerations, were infrequently reported and should be considered.
Research suggests that a significant minority of hospital in-patients could be more appropriately supported in the community if enhanced services were available. However, little is known about these individuals or the services they require.
To identify which individuals require what services, at what cost.
A ‘balance of care’ (BoC) study was undertaken in northern England. Drawing on routine electronic data about 315 admissions categorised into patient groups, frontline practitioners identified patients whose needs could be met in alternative settings and specified the services they required, using a modified nominal group approach. Costing employed a public-sector approach.
Community care was deemed appropriate for approximately a quarter of admissions including people with mild-moderate depression, an eating disorder or personality disorder, and some people with schizophrenia. Proposed community alternatives drew heavily on carer support services, community mental health teams and consultants, and there was widespread consensus on the need to increase out-of-hours community services. The costs of the proposed community care were relatively modest compared with hospital admission. On average social care costs increased by approximately £60 per week, but total costs fell by £1626 per week.
The findings raise strategic issues for both national policymakers and local service planners. Patients who could be managed at home can be characterised by diagnosis. Although potential financial savings were identified, the reported cost differences do not directly equate to cost savings. It is not clear whether in-patient beds could be reduced. However, existing beds could be more efficiently used.
Clinical and translational science is vitally dependent on the nation’s underlying health-care policies and programs. In a reciprocal fashion, data generated by clinical and translational research can inform both health policy and health-care delivery. It is important, therefore, to rate health reform proposals comprehensively on a set of criteria that reflect the broad goals of reform, including the potential impact on clinical and translational science and medical education. I propose that the criteria include achieving universal coverage, reducing administrative costs, retaining one’s chosen primary care physician, encouraging care coordination, empowering physicians, freeing industry from choosing and administering health plans, providing choice of specialists and hospitals, providing patient education, preventing patient overuse of services, rationalizing resource allocation, encouraging competition, limiting government’s role, supporting medical education, training, and research, and freeing industry to make personnel decisions based on business criteria rather than the impact on health-care costs to the company. I discuss the rationale for each element and offer a rating of current proposals relative to a proposal previously made.
Explanations of the state of ‘crisis’ in the English National Health Service (NHS) generally focus on the overall level of health care funding rather than the way in which funding is distributed. Describing systematic patterns in the way different areas are experiencing crisis, this paper suggests that NHS organisations in older, rural and particularly coastal areas are more likely to be ‘failing’ and that this is due to the historic underfunding of such areas. This partly reflects methodological and technical shortcomings in NHS resource allocation formulae. It is also the outcome of a philosophical shift from horizontal (equal access for equal needs) to vertical (unequal access to equalise health outcomes) principles of equity. Insofar as health inequalities are determined by factors well beyond health care, we argue that this is an ineffective approach to addressing health inequalities. Moreover, it sacrifices equity in access to health care by failing to adequately fund the health care needs of older populations. The prioritisation of vertical over horizontal equity also conflicts with public perspectives on the NHS. Against this background, we ask whether the time has come to reassert the moral and philosophical case for the principle of equal access for equal health care need.