On the epistemological repercussions of 1492

Lecture 11 of Yuval Harari's online course A brief history of humankind is titled "The discovery of ignorance." In it he presents the thesis that by arriving at a land mass entirely different from the one that the then-standard geographical treatises had led him to expect, Columbus unwittingly dealt a mortal blow to all of the conventional wisdom of his time, especially that received from tradition and from ancient texts.

Harari therefore concludes that the so-called "discovery of America" was also, in a very literal sense, a discovery by Europeans of their ignorance (hence the lecture's title). Indeed, according to Harari, up until 1492, the general and unquestioned belief was that all that was worth knowing was either immediately apparent to the senses (i.e. "common sense"), or else it was contained in the traditional wisdom transmitted from one generation to the next and in canonical texts preserved since antiquity. Columbus's landing in the New World made this belief untenable, and Europeans for the first time got an inkling of the extent of their ignorance. (Of course, this realization did not happen overnight: it took generations for the implications of 1492 to be fully digested. Also, it should go without saying that none of the above should imply that Europeans were more ignorant than anybody else.)

According to Harari, there is a direct connection between the events of 1492 and the publication in 1620 of Bacon's Novum Organum Scientiarum, generally considered the first enunciation of what we now refer to as the "scientific method."

(Indeed, it is consonant with Harari's thesis that the frontispiece of Bacon's work features a galleon sailing beyond Gibraltar's Pillars of Hercules, into the ne plus ultra: a clear metaphor for the newly discovered Unknown, and for the challenge to venture into it.)

Harari's lecture was the first time I had ever encountered this causal link, as it were, from Columbus's 1492 to Bacon's 1620.

My question is: Is Harari the originator of this thesis, or is he quoting earlier thinkers?


Columbian Exchange


Columbian Exchange, the largest part of a more general process of biological globalization that followed the transoceanic voyaging of the 15th and 16th centuries. Ecological provinces that had been torn apart by continental drift millions of years ago were suddenly reunited by oceanic shipping, particularly in the wake of Christopher Columbus’s voyages that began in 1492. The consequences profoundly shaped world history in the ensuing centuries, most obviously in the Americas, Europe, and Africa. The phrase “the Columbian Exchange” is taken from the title of Alfred W. Crosby’s 1972 book, which divided the exchange into three categories: diseases, animals, and plants.


1. History, Problems, and Issues

Traditional epistemology has its roots in Plato and the ancient skeptics. One strand emerges from Plato’s interest in the problem of distinguishing between knowledge and true belief. His solution was to suggest that knowledge differs from true belief in being justified. Ancient skeptics complained that all attempts to provide any such justification were hopelessly flawed. Another strand emerges from the attempt to provide a reconstruction of human knowledge showing how the pieces of human knowledge fit together in a structure of mutual support. This project got its modern stamp from Descartes and comes in empiricist as well as rationalist versions which in turn can be given either a foundational or coherentist twist. The two strands are woven together by a common theme. The bonds that hold the reconstruction of human knowledge together are the justificational and evidential relations which enable us to distinguish knowledge from true belief.

The traditional approach is predicated on the assumption that epistemological questions have to be answered in ways which do not presuppose any particular knowledge. The argument is that any such appeal would obviously be question begging. Such approaches may be appropriately labeled “transcendental.”

The Darwinian revolution of the nineteenth century suggested an alternative approach first explored by Dewey and the pragmatists. Human beings, as the products of evolutionary development, are natural beings. Their capacities for knowledge and belief are also the products of a natural evolutionary development. As such, there is some reason to suspect that knowing, as a natural activity, could and should be treated and analyzed along lines compatible with its status, i.e., by the methods of natural science. On this view, there is no sharp division of labor between science and epistemology. In particular, the results of particular sciences such as evolutionary biology and psychology are not ruled a priori irrelevant to the solution of epistemological problems. Such approaches, in general, are called naturalistic epistemologies, whether they are directly motivated by evolutionary considerations or not. Those which are directly motivated by evolutionary considerations and which argue that the growth of knowledge follows the pattern of evolution in biology are called “evolutionary epistemologies.”

Evolutionary epistemology is the attempt to address questions in the theory of knowledge from an evolutionary point of view. Evolutionary epistemology involves, in part, deploying models and metaphors drawn from evolutionary biology in the attempt to characterize and resolve issues arising in epistemology and conceptual change. As disciplines co-evolve, models are traded back and forth. Thus, evolutionary epistemology also involves attempts to understand how biological evolution proceeds by interpreting it through models drawn from our understanding of conceptual change and the development of theories. The term “evolutionary epistemology” was coined by Donald Campbell (1974).

1.1 The Evolution of Epistemological Mechanisms (EEM) versus The Evolutionary Epistemology of Theories (EET)

There are two interrelated but distinct programs which go by the name “evolutionary epistemology.” One focuses on the development of cognitive mechanisms in animals and humans. This involves a straightforward extension of the biological theory of evolution to those aspects or traits of animals which are the biological substrates of cognitive activity, e.g., their brains, sensory systems, motor systems, etc. The other program attempts to account for the evolution of ideas, scientific theories, epistemic norms and culture in general by using models and metaphors drawn from evolutionary biology. Both programs have their roots in 19th century biology and social philosophy, in the work of Darwin, Spencer, James and others. There have been a number of attempts in the intervening years to develop the programs in detail (see Campbell 1974, Bradie 1986, Cziko 1995). Much of the contemporary work in evolutionary epistemology derives from the work of Konrad Lorenz (1977), Donald Campbell (1974a, et al.), Karl Popper (1972, 1984) and Stephen Toulmin (1967, 1972).

The two programs have been labeled EEM and EET (Bradie, 1986). EEM is the label for the program which attempts to provide an evolutionary account of the development of cognitive structures. EET is the label for the program which attempts to analyze the development of human knowledge and epistemological norms by appealing to relevant biological considerations. Some of these attempts involve analyzing the growth of human knowledge in terms of selectionist models and metaphors (e.g., Popper 1972, Toulmin 1972, Hull 1988; see Renzi and Napolitano 2011 for a critique of these efforts). Others argue for a biological grounding of epistemological norms and methodologies but eschew selectionist models of the growth of human knowledge as such (e.g., Ruse 1986, Rescher 1990).

The EEM and EET programs are interconnected but distinct. A successful EEM selectionist explanation of the development of cognitive brain structures provides no warrant, in itself, for extrapolating such models to understand the development of human knowledge systems. Similarly, endorsing an EET selectionist account of how human knowledge systems grow does not, in itself, warrant concluding that specific or general brain structures involved in cognition are the result of natural selection for enhanced cognitive capacities. The two programs, though similar in design and drawing upon the same models and metaphors, do not stand or fall together.

1.2 Ontogeny versus Phylogeny

Biological development involves both ontogenetic and phylogenetic considerations. Thus, the development of specific traits, such as the opposable thumb in humans, can be viewed both from the point of view of the development of that trait in individual organisms (ontogeny) and the development of that trait in the human lineage (phylogeny). The development of knowledge and knowing mechanisms exhibits a parallel distinction. We can consider the growth of an individual’s corpus of knowledge and epistemological norms or of an individual’s brain (ontogeny), or the growth of human knowledge and establishment of epistemological norms across generations or the development of brains in the human lineage (phylogeny). The EEM/EET distinction cuts across this distinction since we may be concerned either with the ontogenetic or phylogenetic development of, e.g., the brain or the ontogenetic or phylogenetic development of norms and knowledge corpora. One might expect that since current orthodoxy maintains that biological processes of ontogenesis proceed differently from the selectionist processes of phylogenesis, evolutionary epistemologies would reflect this difference. Curiously enough, however, for the most part they do not. For example, the theory of “neural Darwinism” as put forth by Edelman (1987) and Changeux (1985) offers a selectionist account of the ontogenetic development of the neural structures of the brain. Karl Popper’s conjectures and refutations model of the development of human knowledge is a well known example of a selectionist account which has been applied both to the ontogenetic growth of knowledge in individuals as well as the trans-generational (phylogenetic) evolution of scientific knowledge. B. F. Skinner’s theory of operant conditioning, which deals with the ontogenesis of individual behavior, is explicitly based upon the Darwinian selectionist model (Skinner 1981).

1.3 Descriptive versus Prescriptive Approaches

A third distinction concerns descriptive versus prescriptive approaches to epistemology and the growth of human knowledge. Traditionally, epistemology has been construed as a normative project whose aim is to clarify and defend conceptions of knowledge, foundations, evidential warrant and justification. Many have argued that neither the EEM programs nor the EET programs have anything at all to do with epistemology properly (i.e., traditionally) understood. The basis for this contention is that epistemology, properly understood, is a normative discipline, whereas the EEM and EET programs are concerned with the construction of causal and genetic (i.e., descriptive) models of the evolution of cognitive capacities or knowledge systems. No such models, it is alleged, can have anything important to contribute to normative epistemology (e.g., Kim 1988). The force of this complaint depends upon how one construes the relationship between evolutionary epistemology and the tradition.

There are three possible configurations of the relationship between descriptive and traditional epistemologies. (1) Descriptive epistemologies can be construed as competitors to traditional normative epistemologies. On this view, both are trying to address the same concerns and offering competing solutions. Riedl (1984) defends this position. A standard objection to such approaches is that descriptive accounts are not adequate to do justice to the prescriptive elements of normative methodologies. The extent to which an evolutionary approach contributes to the resolution of traditional epistemological and philosophical problems is a function of which approach one adopts (cf. Dretske 1971, Bradie 1986, Ruse 1986, Radnitsky and Bartley 1987, Kim 1988). (2) Descriptive epistemology might be seen as a successor discipline to traditional epistemology. On this reading, descriptive epistemology does not address the questions of traditional epistemology because it deems them irrelevant or unanswerable or uninteresting. Many defenders of naturalized epistemologies fall into this camp (e.g., Munz 1993). (3) Descriptive epistemology might be seen as complementary to traditional epistemology. This appears to be Campbell’s view. On this analysis, the function of the evolutionary approach is to provide a descriptive account of knowing mechanisms while leaving the prescriptive aspects of epistemology to more traditional approaches. At best, the evolutionary analyses serve to rule out normative approaches which are either implausible or inconsistent with an evolutionary origin of human understanding.

1.4 Future Prospects

EEM programs are saddled with the typical uncertainties of phylogenetic reconstructions. Is this or that organ or structure an adaptation and if so, for what? In addition, there are the uncertainties which result from the necessarily sparse fossil record of brain and sensory organ development. The EET programs are even more problematic. While it is plausible enough to think that the evolutionary imprint on our organs of thought influences what and how we do think, it is not at all clear that the influence is direct, significant or detectible. Selectionist epistemologies which endorse a “trial and error” methodology as an appropriate model for understanding scientific change are not analytic consequences of accepting that the brain and other ancillary organs are adaptations which have evolved primarily under the influence of natural selection. The viability of such selectionist models is an empirical question which rests on the development of adequate models. Hull’s (1988) is, as he himself admits, but the first step in that direction. Cziko (1995) is a manifesto urging the development of such models (cf. also the evolutionary game theory modeling approach of Harms 1997). Much hard empirical work needs to be done to sustain this line of research. Non-selectionist evolutionary epistemologies, along the lines of Ruse (1986), face a different range of difficulties. It remains to be shown that any biological considerations are sufficiently restrictive to narrow down the range of potential methodologies in any meaningful way. A non-selectionist approach to evolutionary epistemology, based on “Poincaréan dynamics,” has been proposed by Barham (1990). An alternative approach, which exploits the fact that organisms and their environments co-evolve as a result of dialectical interactions between them, has led to the development of “non-adaptational” evolutionary epistemologies (Gontier et al. 2006). A critical review of the problems facing the development of the naturalistic turn in evolutionary epistemology can be found in Callebaut and Stotz (1998).

Nevertheless, the emergence in the latter quarter of the twentieth century of serious efforts to provide an evolutionary account of human understanding has potentially radical consequences. The application of selectionist models to the development of human knowledge, for example, creates an immediate tension. Standard traditional accounts of the emergence and growth of scientific knowledge see science as a progressive enterprise which, under the appropriate conditions of rational and free inquiry, generates a body of knowledge which progressively converges on the truth. Selectionist models of biological evolution, on the other hand, are generally construed to be non-progressive or, at most, locally so. Rather than generating convergence, biological evolution produces diversity. Popper’s evolutionary epistemology attempts to embrace both but does so uneasily. Kuhn’s “scientific revolutions” account draws tentatively upon a Darwinian model, but when criticized, Kuhn retreated (cf. Kuhn 1972, pp. 172f with Lakatos and Musgrave 1970, p. 264). Toulmin (1972) is a noteworthy exception. On his account, concepts of rationality are purely “local” and are themselves subject to evolution. This, in turn, seems to entail the need to abandon any sense of “goal directedness” in scientific inquiry. This is a radical consequence which few have embraced. Pursuing an evolutionary approach to epistemology raises fundamental questions about the concepts of knowledge, truth, realism, justification and rationality.

1.5 Expanding the Circle

Although Campbell and Popper both pointed to the continuity between the evolution of human knowledge and the evolution of knowledge in non-human organisms, much of the early work in evolutionary epistemology focused on the human condition. However, recent empirical investigations by psychologists, cognitive ethologists, cognitive neuroscientists and animal behaviorists have revealed that animals, both primates and non-primates, have much more sophisticated cognitive capacities than were previously suspected (Panksepp 1998, Heyes and Huber 2000, Rogers and Kaplan 2004, Lurz 2011, van Schaik 2010). From an evolutionary perspective this is not surprising given the evolutionary heritage that all animals share. Taking Darwin seriously means reconsidering and reassessing the nature of human knowledge in the light of our increased awareness of the cognitive capabilities of the members of other species. In addition, once a firm empirical basis of the scope and limits of animal cognitive capacities has been established, we will be in a position to reassess our philosophical evaluations of the mental lives of animals and their epistemic and moral status as well. Further field research promises to revolutionize our understanding of the sense in which human beings are one among the animals.

The KLI Theory Lab of the Konrad Lorenz Institute publishes Biological Theory: Integrating Development, Evolution and Cognition, a journal devoted to issues in evolutionary epistemology as well as other applications of biological theory.


The Consequences of Columbus

Just when we were convinced that Newsweek, like its counterpart Time, is an essentially superficial magazine for hurried people who want information without having to think, the mail brought the Fall/Winter 1991 Columbus Special Issue. Produced in collaboration with the "Seeds of Change" exhibit on display at the Smithsonian's Museum of Natural History in Washington, the magazine provides over eighty pages of generally sound history and analysis of Columbus and of the vast changes in flora and fauna, diet and cultural patterns that followed 1492. Whatever else may be said of Columbus, write the editors, he had consequences, and those consequences "hold the key to the meaning of Columbus' voyages."

In strictly philosophical terms, this argument may come dangerously close to the consequentialism favored by some suspect philosophers and theologians. Newsweek and the Smithsonian clearly feel ill-equipped to face the partisan firestorm over Columbus per se and have taken the prudent course of describing instead 1492's consequences, good and bad. Whatever the shortcomings of this approach, it at least has the merit of examining closely and impartially a wide variety of facts about the last five hundred years in the Americas and the world.

By comparison, the National Council of Churches' document "A Faithful Response to the 500th Anniversary of the Arrival of Christopher Columbus," published in 1990 amidst much controversy, was a far less morally serious text. Newsweek amply demonstrates the wealth of interesting and morally relevant material on Columbus, Native Americans, the Spanish settlements, disease, slavery, and a host of other issues available to anyone who takes the trouble to look. The NCC, however, apparently thought a "faithful response" meant moralistic denunciation on the basis of a few vague references to what we all, of course, recognize as the historical impartiality of Howard Zinn and the moral finality of "Black Elk Speaks."

In December of 1990, the National Conference of Catholic Bishops (NCCB) issued a far better informed and reasoned pastoral letter, "Heritage and Hope: Evangelization in America." In it, besides proper rejection of past atrocities and present neglect of Native Americans, a picture emerges of the complex interaction between European and native peoples that began after 1492. Yet even the Catholic bishops tread gingerly around various questions. They seem so concerned to show their solidarity with the current plight of Native Americans that they make blanket statements about pre-Columbian Indian cultures that do not accurately reflect historical reality. In the bishops' reading, the hundreds of different Native American cultures all had a natural piety already. The missionaries had only the modest task of explaining "how Christianity complemented their beliefs and challenged those things in their culture that conflicted with Christ's message."

The high Indian civilizations of the Aztecs, Mayas, and Incas in fact needed a far more vigorous spiritual liberation than that (see my "1492 and All That," First Things, May 1991). And even some less-developed tribes were engaged in activities we would not speak about in the bishops' measured tones were they still practiced today. For example, almost everyone deplores Columbus' mistreatment of the gentle and peace-loving Arawaks he encountered in the Caribbean. But how many people are aware that one of the reasons the Arawaks welcomed the Europeans so warmly was their fear of the Carib Indians who were, as one modern historian puts it, "then expanding across the Lesser Antilles and literally eating the Arawaks up"?


A condescending noblesse oblige continues to cloud our discussions of European and Native cultures. Whether the issue is natives carrying out human sacrifice, torture, cannibalism, and environmental damage in the past, or Indians poised (in age-old custom) to burn tropical rain forests in the present, the tendency is to paint European sins all the blacker by whitewashing their native counterparts. Native spokesmen and their advocates in institutions like the NCC have a point, but fail at the morally important responsibility of identifying not only Europe's sins, but those aspects of Native cultures that have been changed for the better by the encounter with Europe.

In fact, the very form of the typical moral argument against the European arrival presupposes some European principles that we have wrongly come to take for granted. European behavior in the New World is usually denounced for its cultural arrogance and its violation of universal human and political rights. Yet no other culture in the world conceived of universal respect for human persons and the embodiment of that principle in international law prior to European development of these doctrines, prodded in part by the encounter with American natives. We now blithely assume that all two-legged creatures who look like us are persons deserving human treatment, including a proper valuing of their cultures. But that realization was won by hard thinking in the face of some difficult circumstances.

China, for example, was a high ancient civilization that until the last century knew little about other cultures. It regarded itself as normative for the rest of the world and worried little about "rights." Most other cultures felt more or less the same, particularly tribal societies, which were often at perpetual war with one another. Before 1492, Europe had had some contact with Jews, Muslims, and Asians, which forced it to develop some ideas about tolerance and pluralism in civic life. But the contact with America was the event that caused a profound rethinking of everything.

To begin with, there was a religious question. One of the controversies from the Middle Ages that Columbus' voyage reignited was not whether the world was round (every educated person knew that), but whether people could exist at the antipodes (the ends of the Earth). Far from being the kind of idle speculation that some anti-medievalists associate with angels dancing on the heads of pins, this question had profound repercussions. Would God have created any people outside of all contact with the Old and New Testaments? One of the consequences of such a creation would be that the people would have been left without at least potential knowledge of what was needed for salvation. The problem arose, thus, not from ignorance, but from profound concern about the form of God's universal charity.

This dispute had immediate importance for moral reflections on the Indians. Having lived separate from the Old World, they could not be held responsible for failure to accept the Gospel (as some thought Jews and Muslims could be held responsible). Bartolomé de las Casas, the widely acclaimed Dominican priest who defended the Indians, went so far as to argue that even human sacrifice and cannibalism among the natives should not be held against them because both practices showed deep reverence and a spirit of sacrifice towards the Almighty.


Las Casas' defense was noble and in part effective, even though we may now think somewhat differently about the full moral and religious significance of these native practices. His work, however, forced Europe into unprecedented reflections on what constituted a rational being. Native religion and life were, whatever their shortcomings, clearly not the creation of irrational brutes. The Spanish crown was so sensitive to these moral arguments that in 1550 it ordered all military activity to cease in the Americas and created a royal commission at Valladolid to examine Spain's behavior in the New World. No other growing empire in history has ever similarly interrupted itself to take up moral issues. Ultimately, greed and ineffective Spanish administration led to the abuses we know of, but the commission did bring about penalties for some of the worst offenders, as well as certain reforms in administration and policy.

At Valladolid, Las Casas argued against Juan Ginés de Sepúlveda, another theologian, that Indians were human beings. Sepúlveda rejected that argument, but to establish his case he had to try to prove that reason was so weak in the Indians that, left to themselves, they could not live according to reason. By commonly accepted Christian principles, only rational incapacity, not (as is often assumed) the mere assertion of European cultural superiority, could justify Spanish control of natives, and even then only for the good of the Indians. The judges of the debate did not reach a definite conclusion, but Valladolid represents a consolidation of Spanish and papal misgivings going back to 1500, and gross mistreatment of the Indians gradually lessened.

The second great moral result of the European arrival in the New World came in the area of international law. Again, we now take it for granted that even nations deeply alien to us have a right to their own territory and culture, but it is largely due to the reflections begun by Francisco de Vitoria, a Dominican theologian and friend of Las Casas, that we have such principles. Vitoria was highly respected by the Spanish king, who appointed him to several royal commissions (unfortunately, he died before the great debate at Valladolid). But Vitoria did not hesitate to tell the monarch that he had no right to lands occupied by Indians, nor could he make slaves out of rational beings. Furthermore, Vitoria went so far as to call the 1494 Treaty of Tordesillas, in which the Pope ceded lands to the Spanish and Portuguese, improper because the pontiff did not have temporal sovereignty over the earth, particularly over lands already occupied by natives.

In this, Vitoria was developing principles that were also coming to have an influence over Pope Paul III, who in response to reports from the New World proclaimed in his 1537 encyclical Sublimis Deus:

Indians and all other people who may later be discovered by the Christians are by no means to be deprived of their liberty or the possession of their property, even though they be outside the faith of Jesus Christ; and that they may and should, freely and legitimately, enjoy their liberty and the possession of their property; nor should they be in any way enslaved; should the contrary happen, it shall be null and of no effect. . . . By virtue of our apostolic authority we declare . . . that the said Indians and other peoples should be converted to the faith of Jesus Christ by preaching the word of God and by the example of good and holy living.

Christian sentiments like these led Vitoria to the elaboration of the beginnings of that system of global law that has borne fruit today.

Perhaps the most telling sign of the new moral reflection stimulated by contact with natives, however, is a little-known proposal by the first viceroy of New Spain, the shrewd and competent Antonio de Mendoza. In an attempt to deal with the various factions contending over the Indian question in the New World, Mendoza suggested a simple solution: "Treat the Indians like any other people and do not make special rules and regulations for them. There are few persons in these parts who are not motivated in their opinions of the Indians by some interest, good or bad." In this early wisdom, the seeds of fair and impartial treatment for all, regardless of origin, begin to sprout: a strongly, almost uniquely, American trait necessitated by the rich mixture of various peoples on these shores.

When we are tallying up the moral accounts of the last five hundred years, a good practice periodically for any people, we should recall that ethical developments, too, are a consequence of Columbus. Newsweek and the Smithsonian might have treated more fully the significant stages in that story. Is it too much to hope that our churches, even in their current condition, will come to appreciate and to remind us how Christianity, in spite of a tortuous and tangled history, has contributed to the moral development of our still sadly underdeveloped humanity?

Royal, Robert. "Consequences of Columbus." First Things 20 (February, 1992): 9-11.



Epistemic Disobedience: The Radical Potential of Deconstructing the Logic of Coloniality

Earlier, it was argued that Berna Reale's work, specifically her performance (Americano, 2015) at a maximum-security state penitentiary in northern Brazil and the video of it screened at the 2015 Venice Biennale, exemplifies epistemic disobedience. The discussion now moves to the radical potential of her work for deconstructing and reconstructing normative processes of knowledge production. This type of mediation is important in reaffirming the significance of situated knowledge – a crucial step in disconnecting knowledge from the grip of one-sided Western epistemology.

Reale's performances generate a social critique to the extent that her work is focused on the precariousness of human life, which she sees as rooted in various kinds of societal violence, from poverty to racism to corruption. In a video interview, Luíz Camillo Osório, curator of the Brazil Pavilion at the 56th Venice Biennale, stated that the Brazilian artworks presented at the biennial used “strong poetic language and symbolic metaphors” to talk about “inequalities, social division, and political conflict in Brazil” (BiennaleChannel 2015). Further, in Americano, Reale uses her body as a medium to show that the “Olympics are an important component, not only of the formal part of the society [e.g., people who might attend, view, or participate in the Games] but also the excluded part, those who live under inhumane prison conditions” (BiennaleChannel 2015).

In an ARTnews review of the biennale, Douglas (2015), cited earlier, makes a quite different argument. She writes that Reale's intent, as stated on the biennale's gallery label/caption describing the artwork, is to “contrast the meanings embodied by that famous torch – reason, wisdom, liberty, freedom, human rights – with the grim conditions of imprisonment.” How did the Olympic flame come to signify these values (e.g., freedom and human rights) in the Western imaginary? Is Douglas's review an uncritical reflection of the artist's intent, and is it epistemologically obedient or complicit with Eurocentric knowledge production and zero point epistemology?

Depending on what one knows or chooses to know, the Olympic flame relay originated in either ancient Greece or Nazi Germany. The fire-bearing ceremonies of Greek mythology, the athletic contests that took place during the classical period of ancient Greece, and the first modern Olympics in Athens in 1896, despite popular belief, were not precursors to the torch relay of the modern-day Olympics (Maguire 2014; Rolim and Zarpellon Mazo 2008; Young 2004). In the historical versus mythological narrative, the torch relay event was created by Carl Diem, the chief organizer of the 1936 Olympics in Berlin, and institutionalized by Joseph Goebbels, Hitler's propaganda minister, to showcase on an unprecedented scale (live and through closed-circuit television) the “new” Germany (Maguire 2014; Rolim and Zarpellon Mazo 2008).

Brazil's sports commissioner and other important stakeholders witnessed the symbolic power of the torch relay at the opening ceremonies of the 1936 Berlin Games. Two years later, Brazil's National Defense League, with the enthusiastic support of the sports clubs of Porto Alegre, the capital of Rio Grande do Sul state, appropriated the flame relay ceremony, thereby creating the National Torch Relay (Rolim and Zarpellon Mazo 2008). The celebrations were meant to create national memories: for example, during Brazil's National Week, hundreds of relay runners ran from city to city in an effort to promote the country's accomplishments (Rolim and Zarpellon Mazo 2008). The appropriation of the Olympic torch relay – that is, performing and re-performing the event in different places, carrying the flame and its symbolic elements over time – “may be regarded as a practice which helps us to understand how the Olympic initiatives are dislocated and reinvented in different contexts” (Rolim and Zarpellon Mazo 2008). The Olympics resumed and were once again dislocated and reinvented after World War II. In spite of the dark history of the relays, the organizers of the 1948 London Olympics, like the Brazilians 10 years earlier, adopted Nazi Germany's torch-bearing ceremony as a symbol of peace and goodwill. What types of knowledge(s) or reasons are necessary in order to justify holding on to the belief that the Olympic torch relay is a symbol of peace and goodwill?

Despite the worldwide rhetoric in which the Olympics are represented as a bastion of democracy and peace, in reality, the Olympics are a part of a mega-sport events industry that mirrors and replicates neoliberal globalization. This argument is beyond the scope of the present study. Briefly, nonetheless, it is worth noting that the research of activist and scholar Helen Jefferson Lenskyj provides an important critique of the social, economic, and ethical deficiencies of the Olympics industry. For example, she cites practices such as the exploitation of migrant contract laborers and hospitality workers, the displacement of people from their homes and neighborhoods to make room for Olympics-related construction, the forced removal of the homeless and the poor from public places, large-scale corporate involvement, and the sweatshop labor that goes hand in hand with manufacturing merchandise for corporate brands (Lenskyj 2008; Milton-Smith 2002). There are also deep-seated connections between the Olympics and human trafficking due to increased demands for sexual exploitation (e.g., prostitution) in the countries hosting the games (Lenskyj 2008, 2012).

In short, the colonial matrix of power (coloniality) that renders life precarious is well rehearsed in the Olympics on a worldwide scale. Yet, the rhetoric of modernity (e.g., discovery, expansion, and progress) and the logic of coloniality (i.e., the expendability of human life) continue to haunt us. The potential achievements related to hosting the Olympics, such as upgraded city infrastructure, new construction, increased tourism and more jobs, and world recognition of the country and host city, are celebrated through the spectacle of the media – and reified by appearances by world-famous performers such as Lady Gaga, who received US$2 million to sing at the European Games in Baku, the capital of Azerbaijan, in 2015 (Janbazian 2015).

But not everyone is buying the rhetoric of modernity and the logic of coloniality associated with the Olympics. Mignolo (2011a) contests the a priori of Eurocentric universal knowledge systems and their historical timeline. He maintains that civilization did not originate in Greece and that it did not progress through Rome to Europe to the modern world (6). In fact, “Eurocentrism is a question not of geography but of epistemology” (19). Along the same lines, scholars, artists, activists, and other interested citizens have used and continue to use this mega sporting events industry and spectacle to create platforms for protesting human rights violations and political repression (e.g., the contestation of segregation during the 1968 Olympics in Mexico City). These people call our attention to the corporate takeover of the Olympics (e.g., by Coca-Cola, Dow Chemicals, Adidas, and the British oil company BP, to name just a few of the companies involved in this way) (Dooling 2012; Lenskyj 2008; Milton-Smith 2002); the exorbitant cost associated with hosting the games (US$51 billion in Sochi, Russia, in 2014, and US$44 billion in Beijing, China, in 2008) (Wikipedia 2015); and the deception, corruption, and scandals associated with the bidding process for and hosting of the Olympics and other major events (Lenskyj 2008; Milton-Smith 2002), from the Salt Lake City Winter Olympics (2002) to the Beijing Summer Olympics (2008) to the Baku European Games (2015).

In 2013–2014, thousands of Brazilians took to the streets to protest the exorbitant amount of public money spent on hosting the International Federation of Association Football (FIFA) World Cup (e.g., the 600 million reais price tag to build a stadium in Brasília). This public money, the Brazilians argued, should have been spent on education, hospitals, and infrastructure (“Massive Clashes,” 2013) as well as on other social goods, such as socioeconomic parity and prison reform. “Not accepting the options that are available to [them]” or “delinking” from the grips of coloniality (Mignolo 2011b), Brazilians engaged in critical debate, protest, and activism surrounding the 2016 Summer Olympics (Millington and Darnell 2014), which cost Rio de Janeiro approximately US$13.1 billion (Watson 2017). Alongside the activism previously mentioned, Berna Reale's video performance Americano strips away the pretense of the Olympics in Brazil.


Conclusion: What really goes extinct, anyway?

Can we hope to see the thylacine again? People will certainly try. Still, the best answer to that question is one that quotes the wizard Gandalf, by way of Tolkien: "There never was much hope. Just a fool's hope." That's why thylacines are extinct.

One might call this misplaced attribution or affirming the consequent or some other horrifying dereliction of philosophical duty, but I do think there's an important lesson about extinction to be drawn here. One feature common to all extinction concepts is the improbability of observation; where the concepts differ, they differ in the degree of improbability. If one could measure such a thing as the global probability--that is, the probability of any random observer in any random place, quantified over all observers in all places--of encountering a thylacine, then that probability approached zero in 1933, decreased ever so slightly in 1936, and will bottom out if (well: when) de-extinction efforts fail. There may be different underlying processes that account for those probability shifts, but the extremely low probability of encounter is nevertheless common to all extinct species.
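One informal way to make that "global probability" precise is as an average encounter probability. This is a sketch of my own, with hypothetical notation, not a formalism the post itself commits to: let $O$ be the set of all observers, $X$ the set of all places, and let $\mathrm{Enc}(o, x, t)$ be the event that observer $o$ encounters a living member of the species at place $x$ and time $t$. Then

$$
P(t) \;=\; \frac{1}{\lvert O\rvert\,\lvert X\rvert} \sum_{o \in O} \sum_{x \in X} \Pr\big[\mathrm{Enc}(o, x, t)\big],
$$

and extinction claims become claims about the trajectory of $P(t)$: for the thylacine, $P(t)$ was already close to zero by 1933, slipped slightly lower in 1936, and reaches its floor if de-extinction efforts fail.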

What this means is that we can resolve the metaphysical problem of extinction by way of resolving the epistemological problem. Extinction is problematic if conceived as a property of species per se, but it isn't problematic if conceived as a relation between a species and its observers. An extinct species is one that can't be observed. This may raise a host of questions about what constitutes observation, but that's an essay for a different blog--you know, one not named "Extinct."

This resolution suggests a sobering conclusion that's worth bearing in mind as the year 2018 kicks into gear: species aren't what really goes extinct. Our hope does.


3. Levels of Stigma: Theory & Epistemology

Illustrating the constructs underlying the formation of stigma helps us understand three specific levels of stigma – social stigma, self-stigma, and professional stigma. In this context, ‘levels’ does not refer to a hierarchy of importance among these varied stigmas, but rather to different social fields of stigma that can be differentiated from each other. In addition, further definition and theory behind these three ‘levels’ of stigma must be presented. First, stigmatized attitudes and beliefs towards individuals with mental health and drug use disorders often take the form of social stigma, which is structural within the general public. Second, social stigma, or even the perception that social stigma exists, can become internalized by a person, resulting in what is often called self-stigma. Finally, another, less studied level of stigma is that which is held among health professionals toward their clients. Since health professionals are part of the general public, their attitudes may in part reflect social stigma; however, their unique roles and responsibility to ‘help’ may create a specific barrier. The following theories are presented as an aid to understanding how each ‘level’ of stigma may develop in society.

Social Stigma

The first, and most frequently discussed, ‘level’ is social stigma. Social stigma is structural in society and can create barriers for persons with a mental or behavioral disorder. Structural means that stigma is a belief held by a large faction of society in which persons with the stigmatized condition are less equal or are part of an inferior group. In this context, stigma is embedded in the social framework to create inferiority. This belief system may result in unequal access to treatment services or the creation of policies that disproportionately and differentially affect the population. Social stigma can also cause disparities in access to basic services and needs such as renting an apartment.

Several distinct schools of thought have contributed to the understanding of how social stigma develops and plays out in society. Unfortunately, to this point, social work has offered limited contributions to this literature. Nonetheless, one of the leading disciplines of stigma research has been social psychology. Stigma development in most social psychology research focuses on social identity resulting from cognitive, behavioral, and affective processes (Yang, Kleinman, Link, Phelan, Lee, & Good, 2007). Researchers in social psychology often suggest that there are three specific models of public stigmatization. These include socio-cultural, motivational, and social cognitive models (Crocker & Lutsky, 1986; Corrigan, 1998; Corrigan et al., 2001). The socio-cultural model suggests that stigma develops to justify social injustices (Crocker & Lutsky, 1986). For instance, this may occur as a way for society to identify and label individuals with mental and behavioral illnesses as unequal. Second, the motivational model focuses on the basic psychological needs of individuals (Crocker & Lutsky, 1986). One example of this model may be the belief that since persons with mental and behavioral disorders are often in lower socio-economic groups, they are inferior. Finally, the social cognitive model attempts to make sense of basic society using a cognitive framework (Corrigan, 1998), such that a person with a mental disorder would be labeled in one category and differentiated from non-ill persons.

Most psychologists, including Corrigan and colleagues (2001), prefer the social cognitive model to explain and understand the concept of stigma. One such understanding of this perspective – Attribution Theory – is related to three specific dimensions of stigma, including stability, controllability, and pity (Corrigan et al., 2001), that were discussed earlier. Using this framework, a recent study by these researchers found that the public often stigmatizes mental and behavioral disorders to a greater degree than physical disorders. In addition, this research found stigma variability based on the public’s “attributions.” For example, cocaine dependence was perceived as the most controllable whereas ‘mental retardation’ was seen as least stable, and both therefore received the most severe ratings in their corresponding stigma category (Corrigan et al., 2001). These findings suggest that combinations of attributions may signify varying levels of stigmatized beliefs.

Sociologists have also heavily contributed to the stigma literature. These theories have generally been seen through the lens of social interaction and social regard. The first of these theorists was Goffman (1963), who believed that individuals move between more or less ‘stigmatized’ categories depending on their knowledge and disclosure of their stigmatizing condition. These socially constructed categories parallel Lemert’s (2000) discussion of social reaction theory. In this theory, two social categories of deviance are created: primary deviance, the belief that people with mental and behavioral disorders are not acting within the norms of society, and secondary deviance, deviance that develops after society stigmatizes a person or group. Similarly, research demonstrating that higher levels of stigmatization are attributed towards individuals with more “severe” disorders (Angermeyer & Matschinger, 2005) also resembles these hierarchical categories and the disruptiveness and stability dimensions of stigma.

Furthermore, Link and Phelan clearly illustrated the view of sociology towards stigma in their article titled Conceptualizing Stigma (2001). Link and Phelan (2001) argue that stigma is the co-occurrence of several components including labeling, stereotyping, separation, status loss, and discrimination. First, labeling develops as a result of a social selection process to determine which differences matter in society. Differences such as race are easily identifiable and allow society to categorize people into groups. The same scenario may occur when society reacts to the untreated outward symptoms of several severe mental illnesses (e.g., schizophrenia). Labels connect a person, or group of people, to a set of undesirable characteristics, which can then be stereotyped. This labeling and stereotyping process gives rise to separation. Society does not want to be associated with unattractive characteristics and thus hierarchical categories are created. Once these categories develop, the groups who have the most undesirable characteristics may become victims of status loss and discrimination. The entire process is accompanied by significant embarrassment by the individuals themselves and by those associated with them (Link & Phelan, 2001).

While social psychology and sociology are the primary contributors to the stigma literature, other disciplines have provided insight as well. Communications, Anthropology, and Ethnography all favor theories that revolve around threat. In Communications literature, stigma is the result of an “us versus them” approach (Brashers, 2008). For example, the use of specific in-group language can reinforce in-group belongingness as well as promote out-group differentiation (Brashers, 2008). This is referenced in research on peer group relationships such that youth often rate interactions with their same-age peers more positively than with older adults (whether family members or not) (Giles, Noels, Williams, Ota, Lim, Ng, et al., 2003). This can also be applied to those with mental disorders in that individuals in the out-group (mental disorders) are perceived less favorably than the non-ill in-group.

Anthropology and Ethnography also prefer the identity model. From this perspective, the focus is on the impact of stigma within the lived experience of each person. Stigma may impact persons with mental illnesses through their social network, including how it exists in the structures of lived experiences such as employment, relationships, and status. Further, the impact of stigma is a response to threat, which may be a natural or tactical self-preservation strategy. However, it only worsens the suffering of the stigmatized person (Yang et al., 2007). It is important to note again that while many disciplines have been leaders in social stigma theory, social work-specific literature has been largely devoid of discussion on this topic. This is particularly unusual, since stigma is an obvious factor that impacts the lives of social work clients on a daily basis.

Self-Stigma

Crocker (1999) demonstrates that stigma is not only held among others in society but can also be internalized by the person with the condition. Thus, the continued impact of social/public stigma can influence an individual to feel guilty and inadequate about his or her condition (Corrigan, 2004). In addition, the collective representations of meaning in society – including shared values, beliefs, and ideologies – can act in place of direct public/social stigma in these situations (Crocker & Quinn, 2002). These collective representations include historical, political, and economic factors (Corrigan, Markowitz, and Watson, 2004). Thus, in self-stigma, the knowledge that stigma is present within society can have an impact on an individual even if that person has not been directly stigmatized. This impact can have a deleterious effect on a person’s self-esteem and self-efficacy, which may lead to altered behavioral presentation (Corrigan, 2007). Nonetheless, Crocker (1999) highlights that individuals are able to internalize stigma differently based on their given situations. This suggests that personal self-esteem may or may not be as affected by stigma depending on individual coping mechanisms (Crocker & Major, 1989).

Similarly, other theories have provided insight into the idea of self-stigma. In modified labeling theory, the expectations of becoming stigmatized, in addition to actually being stigmatized, are factors that influence psychosocial well-being (Link, Cullen, Struening, Shrout, & Dohrenwend, 1989). In this context, it is primarily the fear of being labeled that causes the individual to feel stigmatized. Similarly, Weiner (1995) proposed that stigmatized beliefs provoke an emotional response. This can be interpreted from the standpoint of the afflicted individual, such that he or she may feel stigmatized and respond emotionally with embarrassment, isolation, or anger.

Health Professional Stigma

It may seem unlikely that social workers and other health professionals would carry stigmatized beliefs towards clients, especially those whom they know are affected by a variety of barriers to treatment engagement. Nonetheless, recent literature is beginning to document the initial impact of health professional stigma (Nordt, Rössler, & Lauber, 2006; Volmer, Mäesalu, & Bell, 2008). While limited evidence exists specifically on social worker attitudes, pharmacy students who desire more social distance towards individuals with schizophrenia are also less willing to provide them medications counseling (Volmer et al., 2008). In addition, one Swiss study (psychiatrists, nurses, and psychologists) found that mental health professionals did not differ from the general public in their desired social distance from individuals with mental health conditions (Nordt et al., 2006). Other studies have also come to similar conclusions (Lauber et al., 2006; Tsao, Tummala, & Roberts, 2008; Sriram & Jabbarpour, 2005; Üçok, Polat, Sartorius, Erkoc, & Atakli, 2004). Clients have also reported feeling ‘labeled’ and ‘marginalized’ by health professionals (Liggins & Hatcher, 2005). Individuals with mental illnesses may not even receive equivalent care (compared to non-mentally ill patients) in general health settings once health professionals become aware of their mental health conditions (Desai, Rosenheck, Druss, & Perlin, 2002).

Theory on health professional stigma is very limited, but some literature does provide insight into its possible development. In one way, stigma by health professionals may develop very much the same as the social stigma evident in the general public. Social workers may develop their own biases from their upbringing or even from burnout in their own working roles, particularly when working with individuals who have severe and persistent mental illnesses (Acker & Lawrence, 2009). Nonetheless, some indications suggest that health professional stigma may also develop in a unique way. For instance, social workers and other health professionals, similar to persons in the general public, experience their own mental health and drug use problems and often have friends or family members who experience these same issues (Siebert, 2004; Fewell, King, & Weinstein, 1993). Individuals may also self-select into a helping profession due in part to these experiences (Stanley, Manthorpe, & White, 2007). When social workers and other health professionals deal with mental health and drug use problems, they may experience burnout and/or become more or less likely to recognize similar problems among their clients (Siebert, 2003). Some research suggests that mental health conditions are more prevalent among helping professionals than in the general public (Schernhammer, 2005). This problem has also been shown to impair professional social work practice behaviors (Siebert, 2004; Sherman, 1996). For example, Siebert (2003) found that social workers who used marijuana were less likely to recognize marijuana use as a problem among their clients.

The counter-transference that can develop as a result of personal experiences or behaviors may impact clients, who may be vulnerable when participating in treatment and may not have the appropriate resources to determine when they are not being treated adequately (Siebert, 2004; Hepworth, Rooney, & Larsen, 2002; Rayner, Allen, & Johnson, 2005). Clients may also be disenfranchised by the treatment process and become more likely to end current treatment and less likely to seek treatment in the future. This creates a barrier to the overall well-being of individuals by preventing adequate treatment, but it also may impact the acknowledgement of their disorder. Overall, health professionals may not provide adequate intervention, early detection, or community referral options for individuals with mental or behavioral disorders (Gassman, Demone, & Albilal, 2001; Tam, Schmidt, & Weisner, 1996), because of their own stigmatizing beliefs and personal histories (Siebert, 2004, 2005).


What is the legacy of 1492?

Digital History TOPIC ID 102

In 1992 the peoples of the Americas marked the five hundredth anniversary of Columbus's discovery of the New World. To Americans of Italian and Spanish descent, the anniversary was an occasion for celebrations. From this perspective, Columbus's voyage was a vehicle of discovery and progress, which forged a lasting link between the civilizations of the Old World and the native peoples of the New World.

Many Americans of Indian and African descent, however, regarded the anniversary in less positive terms. To many of these people, the legacy of Columbus's voyages is slavery and colonialism. Rather than regarding Columbus as a discoverer, many Latin Americans see him as an invader who set in motion a train of events that devastated New World peoples and cultures. Some note that it was Columbus who inaugurated the Atlantic slave trade. Others maintain, not entirely without basis, that Europe's prosperity was rooted, at least in part, in the exploitation of the New World.

Assessing the impact of Columbus's voyages is not an easy task.

Disease and death were among the consequences of Columbus's voyages. Pre-Columbian America had been isolated from many infections that had swept through Asia, Europe, and much of Africa. American Indians had been spared most of the diseases common to societies that raise livestock. The New World thus provided a fertile environment for epidemics of smallpox, influenza, and measles, which were most lethal to adults in their most productive years. The eight million Arawak Indians, who lived on Hispaniola, site of the first Spanish New World colony, were reduced to ten thousand by 1520. Twenty-five million Indians in Central Mexico were reduced to 1.9 million by 1585. Indian populations in the Andes and in North America were also decimated.

The development of the African slave trade was another important consequence of Columbus's voyage. Within decades, Spain introduced black slaves and sugar plants into the New World. With the Indians seemingly on the path to extinction, the Spanish and Portuguese turned to African laborers, who were used to mine gold and silver and to raise crops and livestock.

The "discovery" of the New World carried epochal implications for European thought. America offered a screen on which Old World fears and aspirations could be projected. The Indians, for example, seemed to embody innocence and freedom, lacking sexual restraints, law, or private property, yet possessing health and enjoying eternal youth. Columbus's voyage also helped invigorate the utopian impulse in European thought. To take just one example, it was in 1516, just twenty-four years after Columbus's first voyage, that Sir Thomas More published his book Utopia, in which he described an ideal country where poverty crime, injustice, and other ills did not exist.

Columbus's voyages represent one of the major discontinuities in human history: a true watershed, with vast repercussions for all aspects of life in both the Old World and the New. The year 1492, perhaps more than any other year in modern history, was a landmark moment, carrying enormous implications for the natural environment, for intellectual thought, and for the international economy.

1. How would you assess the significance of Columbus's voyages?

2. Were his voyages a vehicle of progress, in your view, or more negative in their impact?


Causal discourse and the teaching of history. How do teachers explain historical causality?

In this paper we aim to describe how secondary school teachers explain multicausal historical events. To that end, we recorded and analyzed seven classes on “The discovery and colonization of America”. The results show that secondary school teachers do not simply deal with history as a catalog of actions, characters and dates. On the contrary, historical contents are presented as a mesh of events and factors, explicitly or implicitly interwoven. In the discourse analyzed, causal-conditional relationships are predominant, although some intentional and narrative elements are also integrated. The teachers asked some questions specifically aimed at involving students in causal reasoning. Despite the fact that some students recalled a great deal of information, they tended to describe the historical accounts without explaining why the events occurred. Recall protocols contained many more narrative elements than causal ones. Most of the students only remembered and understood those causal relationships which had been signaled and supported verbally by teachers during the explanation. Implications for future research are discussed.



The danger of thinking and dreaming

What is knowing and what defines whether something is classified as knowledge? We start from this question in order to talk about our Native communities and the systematic denial of the knowledges constructed and produced from our people’s ways of conceiving life, which is in community, between humanity and mother and sister nature, with our spirituality as the energetic presence that gives us strength. I write in the plural because my thinking is an organic part of Abya Yala’s communitarian feminism – an organization and social movement that, with my contributions and those of my sisters, has made an important epistemic break: decolonizing feminism. This is a proposal that calls for dialogue to repair and heal the world. But, contrary to what is hoped for, once again as Native women we continue to be subjected to violence, persecution, and defamation, just as our maternal ancestors were. The ‘good’ Native woman will be the subdued and colonized Native woman.

