This brief essay contrasts two modes of constitutional change: abusive constitutional projects that seek to erode democracy and restorative constitutional projects that aim to repair eroded democratic constitutional orders. Constitutional democracies are eroded and restored via the same mechanisms: formal processes of constitutional amendment and replacement, legislative amendment, changes to executive policies and practices (or respect for conventions), and processes of judicial decision-making. Under the right conditions, abusive uses of these mechanisms for antidemocratic ends can be reversed by prodemocratic or restorative uses. The more difficult question is what kinds of political discourses are most likely to sustain successful processes of democratic rebuilding. In recent work, we have pointed to the role sometimes played by liberal democratic discourses as purported justifications for processes of abusive constitutional change: we label this the rise of “abusive constitutional borrowing.” Less well understood are the kinds of discourses likely to sustain successful democratic healing or rebuilding. Often, the most popular discourse is a restorative one, which focuses on repairing damage caused by authoritarians and returning to a constitutional status quo ante. In this essay, we discuss the advantages and disadvantages of restorative constitutionalism as a response to prior episodes of democratic erosion.
Parties have emerged as a central concern of comparative constitutional law, and for good reason. Parties should be at the forefront of descriptive analyses of how constitutions function. In practice, for instance, the separation of powers depends heavily on the shape of the party system, as do many other aspects of constitutional performance.1 Moreover, breakdowns of party systems have been a significant factor in explaining recent episodes of democratic erosion or breakdown.2 In other words, crises of liberal democratic constitutionalism are often caused by crises in party systems.
But what is more challenging is developing a normative theory linking constitutions and parties. In other words, to what extent can constitutions (or comparative constitutional law) prevent or fix breakdowns in party systems? On that question, this chapter offers some skepticism, or at least outlines a series of challenges.
Section 17.2 develops a brief map of the different ways in which party systems break down.
Recent scholarship has highlighted the theoretical possibility and examples of the tools of constitutional change being used “abusively,” in order to erode the democratic order. This chapter will explore the experience of constitutional backsliding in Colombia, and the response to those efforts by the Colombian Constitutional Court and other political actors. The chapter will explain the utility of a well-developed doctrine of unconstitutional constitutional amendment as a response to potentially abusive amendments such as term limit extensions. However, it will also highlight the dependence of such a doctrinal response on particular political conditions that often do not hold throughout Latin America.
In the preceding chapters we have dealt extensively with equilibrium properties of a wide variety of models and materials. We have emphasized the importance of ensuring that equilibrium has been reached, and we have discussed the manner in which the system may approach the correct distribution of states, i.e. its behavior before it comes to equilibrium. This latter topic has been treated from the perspective of helping us understand the difficulties of achieving equilibrium. The theory of equilibrium behavior is well developed, and in many cases there is extensive, reliable experimental information available.
In this chapter we want to introduce simple importance sampling Monte Carlo techniques as applied in statistical physics, which can be used for the study of phase transitions at finite temperature. We shall discuss details, algorithms, and potential sources of difficulty using the Ising model as a paradigm. It should be understood, however, that virtually all of the discussion of the application to the Ising model is relevant to other models as well, and a few such examples will also be discussed. Other models, as well as more sophisticated approaches to the Ising model, will be discussed in later chapters. The Ising model is one of the simplest lattice models which one can imagine, and its behavior has been studied for a century. The simple Ising model consists of spins which are confined to the sites of a lattice and which may have only the values +1 or −1.
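To make the idea of importance sampling concrete, a minimal single-spin-flip Metropolis simulation of the two-dimensional Ising model can be sketched as follows. This is only an illustrative sketch, not the algorithms developed in the chapter itself: the function name, parameters, cold start, and the convention J = 1, k_B = 1 are our own choices.

```python
import math
import random

def metropolis_ising(L=16, T=2.5, sweeps=200, seed=1):
    """Single-spin-flip Metropolis sampling of the 2D Ising model
    (J = 1, k_B = 1, periodic boundaries, cold all-spins-up start).
    Returns the average |magnetization| per spin over the second
    half of the run; the first half is discarded as equilibration."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]      # cold (ordered) start
    mags = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum over the four nearest neighbors with periodic wrap-around.
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nn        # energy change if this spin flips
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] = -spins[i][j]   # accept the trial flip
        if sweep >= sweeps // 2:             # measure only after equilibration
            m = sum(map(sum, spins)) / (L * L)
            mags.append(abs(m))
    return sum(mags) / len(mags)
```

Well below the critical temperature (T_c ≈ 2.269 in these units) the sampled |m| stays close to 1, while well above T_c it is small, illustrating the order–disorder transition the chapter studies.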
Lattice gauge theories have played an important role in the theoretical description of phenomena in particle physics, and Monte Carlo methods have proven to be very effective in their study. In the lattice gauge approach a field theory is defined on a lattice by replacing partial derivatives in the Lagrangian by finite difference operators. For physical systems a quantum field theory on a four-dimensional space–time lattice is used, but simpler models in lower dimensions have also been studied in the hope of gaining some understanding of more complicated models as well as for the development of computational techniques. The present chapter is not at all intended to give a thorough treatment, but rather to convey the flavor of the subject to the non-expert.
In a Monte Carlo simulation we attempt to follow the ‘time dependence’ of a model for which change, or growth, does not proceed in some rigorously predefined fashion (e.g. according to Newton’s equations of motion) but rather in a stochastic manner which depends on a sequence of random numbers which is generated during the simulation. With a second, different sequence of random numbers the simulation will not give identical results but will yield values which agree with those obtained from the first sequence to within some ‘statistical error’. A very large number of different problems fall into this category: in percolation an empty lattice is gradually filled with particles by placing a particle on the lattice randomly with each ‘tick of the clock’. Lots of questions may then be asked about the resulting ‘clusters’ which are formed of neighboring occupied sites. Particular attention has been paid to the determination of the ‘percolation threshold’, i.e. the critical concentration of occupied sites for which an ‘infinite percolating cluster’ first appears. A percolating cluster is one which reaches from one boundary of a (macroscopic) system to the opposite one. The properties of such objects are of interest in the context of diverse physical problems such as conductivity of random mixtures, flow through porous rocks, behavior of dilute magnets, etc. Another example is diffusion limited aggregation (DLA), where a particle executes a random walk in space, taking one step at each time interval, until it encounters a ‘seed’ mass and sticks to it. The growth of this mass may then be studied as many random walkers are turned loose. The ‘fractal’ properties of the resulting object are of real interest, and while there is no accepted analytical theory of DLA to date, computer simulation is the method of choice. In fact, the phenomenon of DLA was first discovered by Monte Carlo simulation.
In the preceding chapters we described the application of Monte Carlo methods in numerous areas that can be clearly identified as belonging to physics. Although the exposition was far from complete, it should have sufficed to give the reader an appreciation of the broad impact that Monte Carlo studies have already had in statistical physics. A more recent occurrence is the application of these methods in non-traditional areas of physics related research. More explicitly, we mean subject areas that are not normally considered to be physics at all but which make use of physics principles at their core. In some cases physicists have entered these arenas by introducing quite simplified models that represent a ‘physicist’s view’ of a particular problem. Often such descriptions are oversimplified, but the hope is that some essential insight can be gained, as is the case in many traditional physics studies. (A provocative perspective on the role of statistical physics outside physics has been presented by Stauffer (2004).) In other cases, however, Monte Carlo methods are being applied by non-physicists (or ‘recent physicists’) to problems that, at best, have a tenuous relationship to physics. This chapter serves as a brief glimpse of applications of Monte Carlo methods ‘outside’ physics. The number of such studies will surely grow rapidly, and even now we wish to emphasize that we will make no attempt to be complete in our treatment.
The concepts of scaling and universality presented in Chapter 2 can be given a concrete foundation through the use of renormalization group (RG) theory. The fundamental physical ideas underlying RG theory were introduced by Kadanoff (1971) in terms of a simple coarse-graining approach, and a mathematical basis for this viewpoint was completed by Wilson (1971). Kadanoff divided the system up into cells of characteristic size ba, where a is the nearest neighbor spacing, and ba < ξ , where ξ is the correlation length of the system (see Fig. 9.1). The singular part of the free energy of the system can then be expressed in terms of cell variables instead of the original site variables, i.e.
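The coarse-graining step described above leads to the standard homogeneity relation for the singular part of the free energy density. The form below is the conventional one found in the RG literature; the symbols $y_t$ and $y_h$ for the thermal and magnetic scaling exponents, and $t$ and $h$ for the reduced temperature and field, are the usual notation and not necessarily that of the chapter's own equations:
\[
  f_s(t, h) = b^{-d}\, f_s\!\left(b^{y_t} t,\; b^{y_h} h\right),
\]
where $d$ is the spatial dimension and $b$ the cell size in units of the lattice spacing $a$. Choosing, for example, $b = |t|^{-1/y_t}$ then yields the familiar scaling laws for the critical exponents, such as $\alpha = 2 - d/y_t$.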
Within this book we have attempted to elucidate the essential features of Monte Carlo simulations and their application to problems in statistical physics. We have attempted to give the reader practical advice as well as to present theoretically based background for the methodology of the simulations as well as the tools of analysis. New Monte Carlo methods will be devised and will be used with more powerful computers, but we believe that the advice given to the reader in Section 4.8 will remain valid.
Dealing with all aspects of Monte Carlo simulation of complex physical systems encountered in condensed matter physics and statistical mechanics, this book provides an introduction to computer simulations in physics. The 5th edition contains extensive new material describing numerous powerful algorithms and methods that represent recent developments in the field. New topics such as active matter and machine learning are also introduced. Throughout, there are many applications, examples, recipes, case studies, and exercises to help the reader fully comprehend the material. This book is ideal for graduate students and researchers, both in academia and industry, who want to learn techniques that have become a third tool of physical science, complementing experiment and analytical theory.
In the previous chapters of this text we have examined a wide variety of Monte Carlo methods in depth. Although these are exceedingly useful for many different problems in statistical physics, there are some circumstances in which the systems of interest are not well suited to Monte Carlo study. Indeed, there are some problems which may not be treatable by stochastic methods at all, since the time-dependent properties, as governed by deterministic equations of motion, are the subject of the study. The purpose of this chapter is thus to provide a very brief overview of some of the other important simulation techniques in statistical physics. Our goal is not to present a complete list of other methods, or even a thorough discussion of those methods that are included, but rather to offer sufficient background to enable the reader to compare some of the different approaches and better understand the strengths and limitations of Monte Carlo simulations.
In this chapter we shall review some of the basic features of thermodynamics and statistical mechanics which will be used later in this book when devising simulation methods and interpreting results. Many good books on this subject exist and we shall not attempt to present a complete treatment. This chapter is hence not intended to replace any textbook for this important field of physics but rather to ‘refresh’ the reader’s knowledge and to draw attention to notions in thermodynamics and statistical mechanics which will henceforth be assumed to be known throughout this book.
In most of the discussion presented so far in this book, the quantum character of atoms and electrons has been ignored. The Ising spin models have been an exception, but since the Ising Hamiltonian is diagonal (in the absence of a transverse magnetic field), all energy eigenvalues are known and the Monte Carlo sampling can be carried out just as in the case of classical statistical mechanics. Furthermore, the physical properties are in accord with the third law of thermodynamics for Ising-type Hamiltonians (e.g. entropy S and specific heat vanish for temperature T → 0, etc.) in contrast to the other truly classical models dealt with in previous chapters (e.g. classical Heisenberg spin models, classical fluids and solids, etc.) which have many unphysical low temperature properties.
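The third-law behavior mentioned above can be illustrated concretely: for a small Ising ring the partition function can be evaluated exactly by enumerating all configurations, and the resulting specific heat vanishes as T → 0. The function below is an illustrative sketch with our own naming and the conventions J = 1, k_B = 1; it is not an algorithm from the text.

```python
import math
from itertools import product

def ising_ring_specific_heat(N=4, T=1.0):
    """Exact specific heat per spin of an N-spin Ising ring (J = 1, k_B = 1)
    obtained by brute-force enumeration of all 2**N configurations."""
    energies = []
    for config in product((-1, 1), repeat=N):
        # Nearest-neighbor coupling around the ring (periodic closure).
        E = -sum(config[i] * config[(i + 1) % N] for i in range(N))
        energies.append(E)
    E0 = min(energies)
    # Subtract the ground-state energy before exponentiating, for stability.
    weights = [math.exp(-(E - E0) / T) for E in energies]
    Z = sum(weights)
    e_avg = sum(E * w for E, w in zip(energies, weights)) / Z
    e2_avg = sum(E * E * w for E, w in zip(energies, weights)) / Z
    # Fluctuation relation: C = (<E^2> - <E>^2) / (k_B T^2), per spin.
    return (e2_avg - e_avg ** 2) / (T ** 2 * N)
```

Because the spectrum has a finite gap above the ground state, the specific heat is exponentially small at low temperature, in accord with the third law, whereas truly classical continuous models retain a nonzero specific heat as T → 0.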