America’s understanding, or perhaps more appropriately its misunderstanding, of Africa is based on a long history of explorer, traveler, and missionary experience recounted in travelogues and the popular press. For the most part, the images of Africa expressed in these accounts portray a world of barbarous, uncivilized peoples living in the unbearable climes of torrid deserts and tropical swelter. Today, there is little evidence to suggest any fundamental change in these earlier perceptions. As Michael McCarthy comments, “[the] ‘dark continent’ image of Africa as a mixture of desert and jungle, savage beasts and beastly savages, has persisted to such an extent that it has become over time the essential way in which most Americans have come to understand African realities.”