Vitamin D deficiency is recognised as a public health problem globally, and a high prevalence of deficiency has previously been reported in Australia. This study details the prevalence of vitamin D deficiency in a nationally representative sample of Australian adults aged ≥25 years, using an internationally standardised method to measure serum 25-hydroxyvitamin D (25(OH)D) concentrations, and identifies demographic and lifestyle factors associated with vitamin D deficiency. We used data from the 2011–2013 Australian Health Survey (n 5034 with complete information on potential predictors and serum 25(OH)D concentrations). Serum 25(OH)D concentrations were measured by a liquid chromatography–tandem MS method certified to the reference measurement procedures developed by the National Institute of Standards and Technology, Ghent University and the US Centers for Disease Control and Prevention. Vitamin D deficiency and insufficiency were defined as serum 25(OH)D concentrations <50 nmol/l and 50 to <75 nmol/l, respectively. Overall, 20 % of participants (19 % men; 21 % women) were classified as vitamin D deficient, with a further 43 % classified as insufficient (45 % men; 42 % women). Independent predictors of vitamin D deficiency included being born in a country other than Australia or the main English-speaking countries, residing in southern (higher latitude) states of Australia, being assessed during winter or spring, being obese, smoking (women only), having low physical activity levels and not taking vitamin D or Ca supplements. Given our increasingly indoor lifestyles, there is a need to develop and promote strategies to maintain adequate vitamin D status through safe sun exposure and dietary approaches.
This study of loneliness across the adult lifespan examined its associations with sociodemographics, mental health (positive and negative psychological states and traits), subjective cognitive complaints, and physical functioning.
Analysis of cross-sectional data
340 community-dwelling adults in San Diego, California, mean age 62 (SD = 18) years, range 27–101 years, who participated in three community-based studies.
Loneliness measures included UCLA Loneliness Scale Version 3 (UCLA-3), 4-item Patient-Reported Outcomes Measurement Information System (PROMIS) Social Isolation Scale, and a single-item measure from the Center for Epidemiologic Studies Depression (CESD) scale. Other measures included the San Diego Wisdom Scale (SD-WISE) and Medical Outcomes Survey- Short form 36.
Seventy-six percent of subjects had moderate-to-high levels of loneliness on the UCLA-3, using standardized cut-points. Loneliness was correlated with worse mental health and inversely with positive psychological states/traits. Even moderate severity of loneliness was associated with worse mental and physical functioning. Loneliness severity and age had a complex relationship, with increased loneliness in the late-20s, mid-50s, and late-80s. There were no sex differences in loneliness prevalence, severity, or age relationships. The best-fit multiple regression model accounted for 45% of the variance in UCLA-3 scores, and three factors emerged with small-to-medium effect sizes: wisdom, living alone, and mental well-being.
The alarmingly high prevalence of loneliness and its association with worse health-related measures underscore major challenges for society. The non-linear age-loneliness severity relationship deserves further study. The strong negative association of wisdom with loneliness highlights the potentially critical role of wisdom as a target for psychosocial/behavioral interventions to reduce loneliness. Building a wiser society may help us develop a more connected, less lonely, and happier society.
Design problems are often presented as structured briefs with detailed constraints and requirements, suggesting a fixed definition. However, past studies have identified the importance of exploring design problems for creative design outcomes. Previous protocol studies of designers have shown that problems can “co-evolve” with the development of solutions during the design process. But to date, little evidence has been provided about how designers systematically explore presented problems to create better solutions. In this study, we conducted a qualitative analysis of 252 design problems collected from publicly available sources, including award-winning product designs and open-source design competitions. This database offers an independent sample of presented problems, designers’ alternative problem descriptions, and innovative solutions. We report the results of this large-scale qualitative analysis aimed at characterizing changes to problems during the design process. Inductive coding was used to identify content patterns in “discovered” problem descriptions, with qualitative codes reliably scored by two independent coders. A total of 32 distinct patterns of problem exploration were identified across designers and presented problems. Each pattern is described in the form of a generalized strategy to guide designers as they explore problem spaces. The exploration patterns identified in this study are the first empirical evidence of problem exploration in independent design problems. Further, the presence of exploration patterns in discovered problems is associated with the selection of the corresponding solution as a challenge finalist. These empirically identified strategies for problem exploration may be useful for computational tools supporting designers.
Most studies underline the contribution of heritable factors to psychiatric disorders. However, heritability estimates depend on the population under study, the diagnostic instruments, and the study designs, each of which has its own inherent assumptions, strengths, and biases. We aim to test the homogeneity of heritability estimates between two powerful, state-of-the-art study designs for eight psychiatric disorders.
We assessed heritability based on data from Swedish siblings (N = 4 408 646 full and maternal half-siblings), and based on summary data from eight samples with measured genotypes (N = 125 533 cases and 208 215 controls). All data were based on standard diagnostic criteria. Eight psychiatric disorders were studied: (1) alcohol dependence (AD), (2) anorexia nervosa, (3) attention deficit/hyperactivity disorder (ADHD), (4) autism spectrum disorder, (5) bipolar disorder, (6) major depressive disorder, (7) obsessive-compulsive disorder (OCD), and (8) schizophrenia.
Heritability estimates from sibling data varied from 0.30 for major depressive disorder to 0.80 for ADHD. The estimates based on the measured genotypes were lower, ranging from 0.10 for AD to 0.28 for OCD, but were significant, and correlated positively (0.19) with the national sibling-based estimates. When OCD was removed from the data, the correlation increased to 0.50.
Given the unique character of each study design, the convergent findings for these eight psychiatric conditions suggest that heritability estimates are robust across different methods. The findings also highlight large differences in genetic and environmental influences between psychiatric disorders, providing future directions for etiological psychiatric research.
Advancements in image-based technologies and body composition research over the past decade have led to increased understanding of the importance of muscle abnormalities, such as low muscle mass (sarcopenia) and, more recently, low muscle attenuation (MA), as important prognostic indicators of unfavourable outcomes in patients with cancer. Muscle abnormalities can be highly prevalent in patients with cancer (ranging between 10 and 90 %), depending on the cohort under investigation and the diagnostic criteria used. Importantly, both low muscle mass and low MA have been associated with poorer tolerance to chemotherapy, increased risk of post-operative infectious and non-infectious complications, increased length of hospital stay and poorer survival in patients with cancer. Studies have shown that systemic antineoplastic treatment can exacerbate losses in muscle mass and MA, with reported loss of skeletal muscle between 3 and 5 % per 100 d, losses which increase exponentially with progressive disease and proximity to death. At present, no effective medical intervention to improve muscle mass and MA exists. Most research to date has focused on treating muscle depletion as part of the cachexia syndrome using nutritional, exercise and pharmacological interventions; however, these single-agent therapies have not provided promising results. Rehabilitation care to modify body composition, by increasing muscle mass and/or MA, should be investigated, and its impact on oncology outcomes explored. Although the optimal timing and treatment strategy for preventing or delaying the development of muscle abnormalities are yet to be determined, multimodal interventions initiated early in the disease trajectory appear to hold the most promise.
Fatigue cracking in polycrystalline NiTi was investigated for the first time using a multiscale experimental framework for average grain sizes (GS) from 10 to 1500 nm. Macroscopic fatigue crack growth rates, measured by optical digital image correlation (DIC), were connected to microscopic crack opening and closing displacements, measured by scanning electron microscope DIC (SEM-DIC) using a high-precision external SEM scan controller. Among all grain sizes, the 1500 nm GS sample exhibited the slowest crack growth rate at the macroscale, and the largest crack opening level (stress intensity at first crack opening) and minimum crack opening displacements at the microscale. Smaller GS samples (10, 18, 42, and 80 nm) exhibited nonmonotonic trends in their fatigue performance, yet the correlation was strong between macroscale and microscale behaviors for each GS. The samples that exhibited the fastest crack growth rates (42 and 80 nm GS) showed a small crack opening level and the largest crack opening displacements. The irregular trends in fatigue performance across the nanocrystalline GS samples were consistent with nonmonotonic values of the elastic modulus reported previously, both of which may be related to the presence of residual martensite, evident only in the small GS samples (10 and 18 nm).
Resistance training (RT) and increased dietary protein are recommended to attenuate age-related muscle loss in the elderly. This study examined the effect of a lean red meat protein-enriched diet combined with progressive resistance training (RT+Meat) on health-related quality of life (HR-QoL) in elderly women. In this 4-month cluster randomised controlled trial, 100 women aged 60–90 years (mean 73 years) from self-care retirement villages participated in RT twice a week and were allocated either 160 g/d of (cooked) lean red meat consumed across 2 meals/d, 6 d/week, or ≥1 serving/d (25–30 g) of carbohydrates (control group, CRT). HR-QoL (SF-36 Health Survey questionnaire), lower limb maximum muscle strength and lean tissue mass (LTM) (dual-energy X-ray absorptiometry) were assessed at baseline and 4 months. In all, ninety-one women (91 %) completed the study (RT+Meat (n 48); CRT (n 43)). Mean protein intake was greater in RT+Meat than CRT throughout the study (1·3 (sd 0·3) v. 1·1 (sd 0·3) g/kg per d, P<0·05). Exercise compliance (74 %) was not different between groups. After 4 months there was a significant net benefit in the RT+Meat group compared with the CRT group for overall HR-QoL and the physical component summary (PCS) score (P<0·01), but there were no changes in either group in the mental component summary (MCS) score. Changes in lower limb muscle strength, but not LTM, were positively associated with changes in overall HR-QoL (muscle strength, β: 2·2 (95 % CI 0·1, 4·3), P<0·05). In conclusion, a combination of RT and increased dietary protein led to greater net benefits in overall HR-QoL in elderly women compared with RT alone, owing to greater improvements in the PCS rather than the MCS score.
Surveys were distributed to parents and childcare agency staff to determine seasonal influenza vaccine uptake. Multivariate logistic regressions identified vaccination determinants. Overall, 351 parents and staff participated (response rate, 32%). Nearly one-half (168 [48%]) received the vaccine. Vaccination predictors included healthcare provider or employer recommendation, perceived seriousness of influenza, and absence of fear of the vaccine.
Some great men receive their due from historians late. Caesarius of Arles is one of these, in good part perhaps because the established mold for writing and teaching about the tradition of spirituality and intellectuality which Roman culture contributed to early medieval Europe had its heroes defined for it early. The patterns thus set up have tended to resist the admission of intruders brought to the fore by more recent historical investigations.
Sidonius Apollinaris needs no introduction as by far our best informant for the history of Roman Gaul in the late fifth century and as one of the most accomplished practitioners of the complicated literary style so admired in his age and so often contemned in ours. Because his poems and letters provide abundant evidence for political and social history, most historians have until recently tended to see him as a predominantly secular figure. That impression is heightened by the literary genres he chose for his writings and whose accepted rules and forms he meticulously followed. However, a handful of studies since mid-century, and recently a political biography, have drawn attention to one or another neglected religious dimension of his life. Gradually a hidden Sidonius has been emerging from between the lines of reluctant sources that are not autobiographical by nature or by his intent.
The factionalism of Southern politics had continued during the 1980s, but an internal split of the SPLM/A in August 1991 proved militarily and politically disastrous, and it gave the Islamist regime in Khartoum new breathing room. Rather than exploiting this opportunity, however, the government estranged Sudan's neighbors, Egypt, Eritrea, Ethiopia, Kenya, and Uganda. This in turn gave John Garang the opportunity to establish links with those states, to regroup and to regain the military initiative. Toward the end of the decade, the military status quo ante had apparently been restored. But the political impact of violent factionalism was fundamental and has continued to this day, with disastrous results. The split forced the SPLM/A onto a new political course whereby self-determination for South Sudan became a prominent part of the Movement's platform. Meanwhile, the split itself and the end of the SPLM's Cold War attachments ensured the continuation and institutionalization of humanitarian aid to rebel-controlled areas, and in many ways this shaped South Sudan's current constellation of foreign relations.
Their darkest hour: internal factionalism in the South
On August 28, 1991, Riek Machar, Lam Akol, and Gordon Kong radioed from the small town of Nasir, in Upper Nile, to all SPLM/A units: “Why John Garang must go now.” In addition to demanding a leadership change, the trio called for reform of the SPLM/A. They faulted Garang for violations of human rights and stifling internal democracy. Perhaps most controversially, they wanted the SPLM/A to fight for an independent South Sudan rather than a reformed Sudan. The dissenters came to be known as the “Nasir-faction” or “SPLM/A Nasir,” but their initial goal was to take control of the entire Movement, not to set up a rival. They needed allies from the Bahr al-Ghazal and Equatoria, where they hoped that resentment of the “Bor Dinka” would sway rebel commanders to support their coup. Although they later gained some ground in these provinces, the core area of the Nasir faction was today's Greater Upper Nile region (the states of Jonglei, Upper Nile, and Unity).
The background and motivation for the “Nasir Declaration” have been hotly debated.
The second civil war dominated South Sudan's history during the period 1983–91. Since the late 1970s, groups of insurgents calling themselves Anya-Nya2 had already been operating in some parts of Bahr al-Ghazal and Upper Nile provinces. By the early 1980s, the political order instituted by the Addis Ababa Agreement had all but collapsed. Unrest and protests reached unprecedented levels and large swaths of the south were simmering with insecurity and violence. The new war, which started in 1983, was radically different from the first. Instead of southern secession, the Sudan People's Liberation Movement/Army (SPLM/A) advocated a reformed, secular and democratic Sudan. Compared to the Anya-Nya of the first war, the SPLM/A was also bigger, better coordinated, and more politically savvy. It found allies elsewhere in Sudan and managed to take the fight to the north. The new war was also more intense and resulted in large-scale destruction and displacement. The ensuing humanitarian crises and developments in international politics resulted in unprecedented foreign involvement.
Return to civil war in South Sudan
Violence in and around the town of Bor on May 16, 1983, marked the beginning of the second civil war. On that day the government decided to use force to end a mutiny which had already been under way since March because the soldiers there, all former Anya-Nya, had gone unpaid owing to allegations of corruption. This crisis was accompanied by renewed rumors that all former Anya-Nya were to be transferred north. A force sent from Juba clashed with the Bor garrison. The mutineers fled to Ethiopia and established themselves in bases at Itang, Bonga, and Dimma. The garrisons of Ayod, Waat, Boma, and Pochalla went with them. The future leader of the SPLM/A, John Garang de Mabior, was already in Bor – allegedly on leave. He was part of an underground network already planning a new uprising on August 18, the anniversary of the 1955 Torit Mutiny. Among other leaders were Kerubino Kwanjin Bol, William Nyoun, Arok Thon Arok, Joseph Oduho, Salva Kiir, Martin Manyiel, and Nyachigak Ng'achiluk (who was killed in 1984). All except Oduho were military officers.
South Sudan became independent on July 9, 2011, six-and-a-half years after the signing of the CPA and almost ten years after the Machakos Protocol. The event was celebrated with a remarkable assembly of statesmen, politicians, and celebrities. Footage of ecstatic Southerners traveled across the globe. But while secession was symbolically important, independence proved to be a process rather than a single event, one that had started decades earlier and has continued since 2011. Upon independence, the government of South Sudan was embroiled in a multitude of crises. Sudan and South Sudan almost went to war during negotiations over the terms of secession; oil production stopped and battles were fought. Negotiations continued until early 2013, and central issues were unresolved when a power struggle within the SPLM/A became the focus of attention. Following a government crisis in the summer, political tension escalated and, after an ultimatum from the internal opposition, exploded into large-scale violence in mid December. Civil war had returned to South Sudan. The familiar pattern of fighting, destruction, displacement, and negotiations ensued. Foreign observers were not alone in wondering whether independence had been a mistake and if the new state would ever function.
From referendum to independence: a slow and painful divorce
Although the Peace Agreement had opened the way for autonomous governance structures, in July 2011, much remained to be done to establish a sovereign state. During the summer of 2010, the parties agreed to a High Level Panel of the African Union, led by Thabo Mbeki, the former South African president, to facilitate post-Agreement arrangements; IGAD was sidelined. The policy of “making unity attractive,” vigorously policed by the NCP, had meant that even discussing terms of secession awaited the referendum in January 2011. This gave the leaders of South Sudan less than six months to prepare for sovereignty and to negotiate terms with Khartoum. Developments along the border added to the burden.
Abyei was tense. If the referendum over its future had taken place as intended, there is no doubt that the permanent residents, the Ngok Dinka, would have voted overwhelmingly in favor of joining South Sudan, and the nomadic Misseriya – who seasonally used the land and demanded a vote – would have opted for Sudan.
Like much of Europe's nineteenth-century global empire-building, the Anglo-Egyptian conquest in 1896–8 resulted from momentum. The fall of Khartoum to the Mahdi in 1885 had not brought about abandonment of the Sudan: Gordon had been sent to conduct its evacuation. During the decade that followed, officers of the British occupation in Egypt, collaborating with publishers in England, pressed to reconquer the Sudan. But when the British government finally sanctioned an advance, in 1896, it was not to “avenge Gordon” but to deflect the Mahdists from the beleaguered Italians in Eritrea. Thereafter, the campaign continued methodically until the decisive battle of Omdurman in September 1898, not in order to rescue the Sudanese from the fanatical “dervishes” but to stymie a French advance to the upper Nile. While Sudan, north and south, should, therefore, not be considered an “accidental” acquisition, it was almost incidental, in that what mattered to imperial strategists was the Nile rather than the territory – still less the people – of its watershed. The south of Sudan was but a necessary inconvenience.
The European colonial era was comparatively brief in Sudan: it was one of the last African territories taken under European rule and one of the first to shed it (in 1956). In the case of South Sudan, effective rule was even briefer, since the first two decades were spent gaining control of the territory. Nonetheless, the colonial regime, for all its shortcomings, fashioned governance structures and practices that have largely survived up to today. There is also little doubt that it was during the latter days of colonial rule that the notion of South Sudan as a nation gained a foothold there.
Sources for the study of the period are relatively copious but remain inadequate and skewed. Written material from the first quarter of the twentieth century is mostly “official,” uninformed, and concerned with discrete administrative problems of an apparently transient military occupation. Accounts by independent travelers, who anyway tended to follow established routes, are as always impressionistic; rarely did observer and observed (or governor and governed) speak the same language.
A History of South Sudan addresses several audiences and a wide variety of issues. We have chosen a conventional chronological approach, but a number of themes recur. Above all, we aim to illuminate two questions in the history of this new country: How did South Sudan become a political and administrative entity? And why did it separate from Sudan?
Answering these questions requires a new look at standard versions, for the historiography of South Sudan reflects entrenched and often diametrically opposed political views. Some nationalists’ mission to create a South Sudanese national identity has led to the invention of a “natural” and timeless political and cultural unit. But we know remarkably little about what most people even today think it means to be South Sudanese. Although this book is not a “history of an idea,” we examine some processes and events that contributed to shaping one. When South Sudanese voted, in January 2011, the proffered alternative to separation from Sudan was confederation and considerable autonomy: South Sudan would be recognized as a political and administrative unit within Sudan. Yet the vote went overwhelmingly for independence. How deep, and with what particular ramifications, was the sentiment for separation?
After all, the history of South Sudan over the past two centuries is of steadily increasing interaction between its peoples and the outside world. And since the mid-twentieth century, South Sudanese have migrated (or fled) in millions to Sudan, to neighboring countries, and beyond. Today, there are South Sudanese communities in most corners of the world. Some have impacted the places to which they have moved; many have returned to South Sudan with new allies and ideas. Thus, patterns of interaction have varied considerably over time and from place to place. So also have South Sudanese responses, their motives, and the opportunities for exchange and transformation that interaction opened up. This book aims to present at least broad outlines of how these opportunities came about and to what uses South Sudanese put them in pursuit of their own goals.
The term “South Sudan” has also become associated with war and human suffering.
The period 1963–72 has received inadequate attention in accounts of South Sudan's contemporary history. As a consequence, the personalities, events, and processes that plunged South Sudan into civil war, and contributed to its continuing for almost ten years, have been pushed into relative obscurity. In general overviews, some themes tend to be mentioned in passing: Southern involvement in the ousting of the Abboud regime in November 1964; the Round Table peace talks of March 1965; the endless fragmentation of political parties and rebel movements in exile in the period 1965–9; the military coup in 1969; the consolidation of the Anya-Nya rebel groups under Joseph Lagu; and, finally, the negotiations that culminated in the March 1972 Addis Ababa Peace Agreement. But a more coherent account of the period would explain the impact of civil war on South Sudan and the narrowing of the range of possible future relations between the region and the rest of the Sudanese polity.
1963–1964: the beginning of civil war
The beginning of organized diaspora politics, foremost in Uganda but also in Ethiopia, Kenya, and Congo, dates from 1962. We have seen that a growing stream of refugees to neighboring countries in the early 1960s included politicians, former government employees, and other “intellectuals.” Among them, the trio of Fr. Saturnino Lahure, Joseph Oduho, and William Deng stand out as leaders of the militant diaspora. In February 1962, they formed the Sudan African Closed District National Union (SACDNU), which in early 1963 changed its name to the less unwieldy Sudan African National Union (SANU). SANU was evidently forged in the same mold as contemporary anticolonial movements in other African countries – for example, the Kenya African National Union and Tanganyika African National Union – and its leaders sought to portray the Southern situation as one in which one colonial master had been exchanged for another. A demand for an independent South Sudan had already replaced the call for autonomy and federation within a united Sudan. Many former supporters of federalism had been radicalized by recent developments and become separatists in the process. Without any way of wielding political influence peacefully, they viewed armed rebellion against an increasingly repressive regime as their only option.