

Nihilism – Wikipedia

Nihilism (from Latin nihil, meaning ‘nothing’) is the philosophical viewpoint that denies or rejects the reputedly meaningful aspects of life. Most commonly, nihilism is presented in the form of existential nihilism, which argues that life is without objective meaning, purpose, or intrinsic value.[1] Moral nihilists assert that there is no inherent morality, and that accepted moral values are abstractly contrived. Nihilism may also take epistemological, ontological, or metaphysical forms, holding respectively that, in some aspect, knowledge is not possible, or that reality does not actually exist.

The term is sometimes used in association with anomie to explain the general mood of despair at a perceived pointlessness of existence that one may develop upon realising there are no necessary norms, rules, or laws.[2] Movements such as futurism and deconstruction,[3] among others, have been identified by commentators[4] as “nihilistic”.

Nihilism has also been described as conspicuous in or constitutive of certain historical periods: for example, Jean Baudrillard and others have called postmodernity a nihilistic epoch;[5] and some religious theologians and figures of religious authority have asserted that postmodernity[6] and many aspects of modernity[3] represent a rejection of theism, and that such rejection of theistic doctrine entails nihilism.

Nihilism has many definitions, and thus can describe multiple arguably independent philosophical positions.

Metaphysical nihilism is the philosophical theory that concrete objects and physical constructs might not exist at all, or that even if there are possible worlds that contain some concrete objects, there is at least one possible world that contains only abstract objects.

Extreme metaphysical nihilism is commonly defined as the belief that nothing exists at all as a correspondent component of a self-sufficient world.[7] The American Heritage Medical Dictionary defines one form of nihilism as “an extreme form of skepticism that denies all existence.”[8] A similar skepticism concerning the concrete world can be found in solipsism. However, although both deny the certainty of objects’ true existence, the nihilist would deny the existence of the self whereas the solipsist would affirm it.[9] Both of these positions are considered forms of anti-realism. (see also: Psychosis)[10]

Epistemological nihilism is a form of skepticism in which all knowledge is accepted as possibly untrue or as impossible to confirm as true.

Mereological nihilism (also called compositional nihilism) is the position that objects with proper parts do not exist (not only objects in space, but also objects existing in time, which are held to have no temporal parts); only basic building blocks without parts exist. The world we see and experience, full of objects with parts, is on this view a product of human misperception (i.e., if we could see clearly, we would not perceive composite objects).

This interpretation of existence must be based on resolution: the resolution at which humans see and perceive the “improper parts” of the world is not an objective fact of reality, but an implicit trait that can only be qualitatively explored and expressed. There is therefore no rigorous way to measure the validity of mereological nihilism. For example, an ant can get lost on a large cylindrical object because the circumference of the object is so large relative to the ant that the ant effectively feels as though the object has no curvature. The resolution at which the ant views the world it exists “within” is thus a very important determining factor in how the ant experiences this “within the world” feeling.
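To make that geometric intuition concrete, here is a minimal formalization (an illustration added to this discussion, not part of the source text), treating the object as a cylinder of radius r and the ant as a probe of body length ℓ:

```latex
% Curvature of a cylinder of radius r, and the sagitta s: the gap between
% the curved surface and a flat chord of length \ell (the ant's body).
\[
  \kappa = \frac{1}{r},
  \qquad
  s = r - \sqrt{r^{2} - \tfrac{\ell^{2}}{4}}
    \approx \frac{\ell^{2}}{8r}
    \longrightarrow 0
  \quad \text{as } r \to \infty .
\]
```

The deviation from flatness detectable at the ant’s own length scale vanishes as the cylinder grows, which is why the object “has no curvature” at the ant’s resolution even though it objectively curves.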

Existential nihilism is the belief that life has no intrinsic meaning or value. With respect to the universe, existential nihilism posits that a single human or even the entire human species is insignificant, without purpose and unlikely to change in the totality of existence. The meaninglessness of life is largely explored in the philosophical school of existentialism.

Moral nihilism, also known as ethical nihilism, is the meta-ethical view that morality does not exist as something inherent to objective reality; therefore no action is necessarily preferable to any other. For example, a moral nihilist would say that killing someone, for whatever reason, is not inherently right or wrong.

Other nihilists may argue not that there is no morality at all, but that if it exists, it is a human construction and thus artificial, with all meaning relative to different possible outcomes. For example, if someone kills someone else, such a nihilist might argue that killing is not inherently bad, or bad independently of our moral beliefs, because morality is constructed as a rudimentary dichotomy: what is called bad is given a higher negative weighting than what is called good, so killing the individual was “bad” only because it did not let the individual live, which was arbitrarily given a positive weighting. In this way a moral nihilist believes that all moral claims are void of any truth value. An alternative scholarly perspective is that moral nihilism is a morality in itself; Cooper writes, “In the widest sense of the word ‘morality’, moral nihilism is a morality.”[11]

Political nihilism follows the characteristic nihilist rejection of non-rationalized or unproven assertions, in this case directed at the necessity of the most fundamental social and political structures, such as government, family, and law. An influential analysis of political nihilism is presented by Leo Strauss.[12]

The Russian Nihilist movement was a Russian trend of the 1860s that rejected all authority.[13] After the assassination of Tsar Alexander II in 1881, the Nihilists gained a reputation throughout Europe as proponents of the use of violence for political change.[citation needed] The Nihilists expressed anger at what they described as the abusive nature of the Eastern Orthodox Church and of the tsarist monarchy, and at the domination of the Russian economy by the aristocracy.

Although the term Nihilism was coined by the German theologian Friedrich Heinrich Jacobi (1743–1819), its widespread usage began with the 1862 novel Fathers and Sons by the Russian author Ivan Turgenev. The main character of the novel, Eugene Bazarov, who describes himself as a Nihilist, wants to educate the people. The “go to the people – be the people” campaign reached its height in the 1870s, during which underground groups such as the Circle of Tchaikovsky, the People’s Will, and Land and Liberty formed. It became known as the Narodnik movement, whose members believed that the newly freed serfs were merely being sold into wage slavery at the onset of the Industrial Revolution, and that the middle and upper classes had effectively replaced landowners.

The Russian state attempted to suppress the nihilist movement. In actions described by the Nihilists as “propaganda of the deed”, many government officials were assassinated. In 1881 Alexander II was killed on the very day he had approved a proposal to call a representative assembly to consider new reforms.

The term nihilism was first used by Friedrich Heinrich Jacobi (1743–1819). Jacobi used the term to characterize rationalism,[14] and in particular Immanuel Kant’s “critical” philosophy, in order to carry out a reductio ad absurdum according to which all rationalism (philosophy as criticism) reduces to nihilism, and thus it should be avoided and replaced with a return to some type of faith and revelation. Bret W. Davis writes, for example, “The first philosophical development of the idea of nihilism is generally ascribed to Friedrich Jacobi, who in a famous letter criticized Fichte’s idealism as falling into nihilism. According to Jacobi, Fichte’s absolutization of the ego (the ‘absolute I’ that posits the ‘not-I’) is an inflation of subjectivity that denies the absolute transcendence of God.”[15] A related but oppositional concept is fideism, which sees reason as hostile and inferior to faith.

With the popularizing of the word nihilism by Ivan Turgenev, a new Russian political movement called the Nihilist movement adopted the term. They supposedly called themselves nihilists because nothing “that then existed found favor in their eyes”.[16]

Søren Kierkegaard (1813–1855) posited an early form of nihilism, which he referred to as levelling.[17] He saw levelling as the process of suppressing individuality to a point where the individual’s uniqueness becomes non-existent and nothing meaningful in her existence can be affirmed:

Levelling at its maximum is like the stillness of death, where one can hear one’s own heartbeat, a stillness like death, into which nothing can penetrate, in which everything sinks, powerless. One person can head a rebellion, but one person cannot head this levelling process, for that would make him a leader and he would avoid being levelled. Each individual can in his little circle participate in this levelling, but it is an abstract process, and levelling is abstraction conquering individuality.

Kierkegaard, an advocate of a philosophy of life, generally argued against levelling and its nihilistic consequence, although he believed it would be “genuinely educative to live in the age of levelling [because] people will be forced to face the judgement of [levelling] alone.”[18] George Cotkin asserts Kierkegaard was against “the standardization and levelling of belief, both spiritual and political, in the nineteenth century [and he] opposed tendencies in mass culture to reduce the individual to a cipher of conformity and deference to the dominant opinion.”[19] In his day, tabloids (like the Danish magazine Corsaren) and apostate Christianity were instruments of levelling and contributed to the “reflective apathetic age” of 19th-century Europe.[20] Kierkegaard argues that individuals who can overcome the levelling process are stronger for it, and that it represents a step in the right direction towards “becoming a true self.”[18][21] Since we must overcome levelling,[22] Hubert Dreyfus and Jane Rubin argue that Kierkegaard’s interest, “in an increasingly nihilistic age, is in how we can recover the sense that our lives are meaningful”.[23]

Note however that Kierkegaard’s meaning of “nihilism” differs from the modern definition in the sense that, for Kierkegaard, levelling led to a life lacking meaning, purpose or value,[20] whereas the modern interpretation of nihilism posits that there was never any meaning, purpose or value to begin with.

Nihilism is often associated with the German philosopher Friedrich Nietzsche, who provided a detailed diagnosis of nihilism as a widespread phenomenon of Western culture. Though the notion appears frequently throughout Nietzsche’s work, he uses the term in a variety of ways, with different meanings and connotations. Karen Carr describes Nietzsche’s characterization of nihilism “as a condition of tension, as a disproportion between what we want to value (or need) and how the world appears to operate.”[24] When we find out that the world does not possess the objective value or meaning that we want it to have or have long since believed it to have, we find ourselves in a crisis.[25] Nietzsche asserts that with the decline of Christianity and the rise of physiological decadence,[clarification needed] nihilism is in fact characteristic of the modern age,[26] though he implies that the rise of nihilism is still incomplete and that it has yet to be overcome.[27] Though the problem of nihilism becomes especially explicit in Nietzsche’s notebooks (published posthumously), it is mentioned repeatedly in his published works and is closely connected to many of the problems mentioned there.

Nietzsche characterized nihilism as emptying the world, and especially human existence, of meaning, purpose, comprehensible truth, or essential value. This observation stems in part from Nietzsche’s perspectivism, or his notion that “knowledge” is always by someone of some thing: it is always bound by perspective, and it is never mere fact.[28] Rather, there are interpretations through which we understand the world and give it meaning. Interpreting is something we cannot do without; in fact, it is something we need. One way of interpreting the world is through morality, one of the fundamental ways in which people make sense of the world, especially in regard to their own thoughts and actions. Nietzsche distinguishes a morality that is strong or healthy, meaning that the person in question is aware that he constructs it himself, from weak morality, where the interpretation is projected onto something external. Regardless of its strength, morality presents us with meaning, whether created or ‘implanted’, which helps us get through life.[29]

Nietzsche discusses Christianity, one of the major topics in his work, at length in the context of the problem of nihilism in his notebooks, in a chapter entitled “European Nihilism”.[30] Here he states that the Christian moral doctrine provides people with intrinsic value, belief in God (which justifies the evil in the world) and a basis for objective knowledge. In this sense, in constructing a world where objective knowledge is possible, Christianity is an antidote against a primal form of nihilism, against the despair of meaninglessness. However, it is exactly the element of truthfulness in Christian doctrine that is its undoing: in its drive towards truth, Christianity eventually finds itself to be a construct, which leads to its own dissolution. This is why Nietzsche states that we have outgrown Christianity “not because we lived too far from it, rather because we lived too close”.[31] As such, the self-dissolution of Christianity constitutes yet another form of nihilism. Because Christianity was an interpretation that posited itself as the interpretation, Nietzsche states that this dissolution leads beyond skepticism to a distrust of all meaning.[32][33]

Stanley Rosen identifies Nietzsche’s concept of nihilism with a situation of meaninglessness in which “everything is permitted.” According to him, the loss of higher metaphysical values that exist in contrast to the base reality of the world, or to merely human ideas, gives rise to the idea that all human ideas are therefore valueless. Rejecting idealism thus results in nihilism, because only similarly transcendent ideals would live up to the previous standards that the nihilist still implicitly holds.[34] The inability of Christianity to serve as a source of value for the world is reflected in Nietzsche’s famous aphorism of the madman in The Gay Science.[35] The death of God, in particular the statement that “we killed him”, is similar to the self-dissolution of Christian doctrine: owing to the advances of the sciences, which for Nietzsche show that man is the product of evolution, that Earth has no special place among the stars and that history is not progressive, the Christian notion of God can no longer serve as a basis for a morality.

One such reaction to the loss of meaning is what Nietzsche calls passive nihilism, which he recognises in the pessimistic philosophy of Schopenhauer. Schopenhauer’s doctrine, which Nietzsche also refers to as Western Buddhism, advocates separating oneself from will and desires in order to reduce suffering. Nietzsche characterises this ascetic attitude as a “will to nothingness”, whereby life turns away from itself, as there is nothing of value to be found in the world. This stripping away of all value in the world is characteristic of the nihilist, although in this, the nihilist appears inconsistent:[36]

A nihilist is a man who judges of the world as it is that it ought not to be, and of the world as it ought to be that it does not exist. According to this view, our existence (action, suffering, willing, feeling) has no meaning: the pathos of ‘in vain’ is the nihilists’ pathos – at the same time, as pathos, an inconsistency on the part of the nihilists.

Nietzsche’s relation to the problem of nihilism is a complex one. He approaches the problem of nihilism as deeply personal, stating that this predicament of the modern world is a problem that has “become conscious” in him.[37] Furthermore, he emphasises both the danger of nihilism and the possibilities it offers, as seen in his statement that “I praise, I do not reproach, [nihilism’s] arrival. I believe it is one of the greatest crises, a moment of the deepest self-reflection of humanity. Whether man recovers from it, whether he becomes master of this crisis, is a question of his strength!”[38] According to Nietzsche, it is only when nihilism is overcome that a culture can have a true foundation upon which to thrive. He wished to hasten its coming only so that he could also hasten its ultimate departure.[26]

He states that there is at least the possibility of another type of nihilist in the wake of Christianity’s self-dissolution, one that does not stop after the destruction of all value and meaning and succumb to the following nothingness. This alternate, ‘active’ nihilism instead destroys in order to level the field for constructing something new. This form of nihilism is characterized by Nietzsche as “a sign of strength,”[39] a willful destruction of the old values to wipe the slate clean and lay down one’s own beliefs and interpretations, contrary to the passive nihilism that resigns itself to the decomposition of the old values. This willful destruction of values and the overcoming of the condition of nihilism by the construction of new meaning, this active nihilism, could be related to what Nietzsche elsewhere calls a ‘free spirit’[40] or the Übermensch from Thus Spoke Zarathustra and The Antichrist, the model of the strong individual who posits his own values and lives his life as if it were his own work of art. It may be questioned, though, whether “active nihilism” is indeed the correct term for this stance, and some question whether Nietzsche takes the problems nihilism poses seriously enough.[41]

Martin Heidegger’s interpretation of Nietzsche influenced many postmodern thinkers who investigated the problem of nihilism as put forward by Nietzsche. Only recently has Heidegger’s influence on Nietzschean nihilism research faded.[42] As early as the 1930s, Heidegger was giving lectures on Nietzsche’s thought.[43] Given the importance of Nietzsche’s contribution to the topic of nihilism, Heidegger’s influential interpretation of Nietzsche is important for the historical development of the term nihilism.

Heidegger’s method of researching and teaching Nietzsche is explicitly his own. He does not specifically try to present Nietzsche as Nietzsche. Rather, he tries to incorporate Nietzsche’s thoughts into his own philosophical system of Being, Time and Dasein.[44] In his Nihilism as Determined by the History of Being (1944–46),[45] Heidegger tries to understand Nietzsche’s nihilism as trying to achieve a victory through the devaluation of the, until then, highest values. The principle of this devaluation is, according to Heidegger, the Will to Power. The Will to Power is also the principle of every earlier valuation of values.[46] How does this devaluation occur, and why is it nihilistic? One of Heidegger’s main critiques of philosophy is that philosophy, and more specifically metaphysics, has forgotten to discriminate between investigating the notion of a being (Seiende) and Being (Sein). According to Heidegger, the history of Western thought can be seen as the history of metaphysics. And because metaphysics has forgotten to ask about the notion of Being (what Heidegger calls Seinsvergessenheit), it is a history about the destruction of Being. That is why Heidegger calls metaphysics nihilistic.[47] This makes Nietzsche’s metaphysics not a victory over nihilism, but a perfection of it.[48]

Heidegger, in his interpretation of Nietzsche, was inspired by Ernst Jünger. Many references to Jünger can be found in Heidegger’s lectures on Nietzsche. For example, in a letter to the rector of Freiburg University of November 4, 1945, Heidegger, inspired by Jünger, tries to explain the notion that “God is dead” as the reality of the Will to Power. Heidegger also praises Jünger for defending Nietzsche against a too biological or anthropological reading during the Third Reich.[49]

Heidegger’s interpretation of Nietzsche influenced a number of important postmodernist thinkers. Gianni Vattimo points at a back-and-forth movement in European thought between Nietzsche and Heidegger. During the 1960s, a Nietzschean ‘renaissance’ began, culminating in the work of Mazzino Montinari and Giorgio Colli, who began work on a new and complete edition of Nietzsche’s collected works, making Nietzsche more accessible for scholarly research. Vattimo explains that with this new edition of Colli and Montinari, a critical reception of Heidegger’s interpretation of Nietzsche began to take shape. Like other contemporary French and Italian philosophers, Vattimo does not want, or only partially wants, to rely on Heidegger for understanding Nietzsche. On the other hand, Vattimo judges Heidegger’s intentions authentic enough to keep pursuing them.[50] Philosophers whom Vattimo cites as part of this back-and-forth movement are the French philosophers Deleuze, Foucault and Derrida, and the Italian philosophers Cacciari, Severino and Vattimo himself.[51] Jürgen Habermas, Jean-François Lyotard and Richard Rorty are also philosophers influenced by Heidegger’s interpretation of Nietzsche.[52]

Postmodern and poststructuralist thought question the very grounds on which Western cultures have based their ‘truths’: absolute knowledge and meaning, a ‘decentralization’ of authorship, the accumulation of positive knowledge, historical progress, and certain ideals and practices of humanism and the Enlightenment.

Jacques Derrida, whose deconstruction is perhaps most commonly labeled nihilistic, did not himself make the nihilistic move that others have claimed. Derridean deconstructionists argue that this approach rather frees texts, individuals or organizations from a restrictive truth, and that deconstruction opens up the possibility of other ways of being.[53] Gayatri Chakravorty Spivak, for example, uses deconstruction to create an ethics of opening up Western scholarship to the voice of the subaltern and to philosophies outside of the canon of western texts.[54] Derrida himself built a philosophy based upon a ‘responsibility to the other’.[55] Deconstruction can thus be seen not as a denial of truth, but as a denial of our ability to know truth (it makes an epistemological claim compared to nihilism’s ontological claim).

Lyotard argues that, rather than relying on an objective truth or method to prove their claims, philosophers legitimize their truths by reference to a story about the world that cannot be separated from the age and system the stories belong to, which Lyotard calls meta-narratives. He then goes on to define the postmodern condition as characterized by a rejection both of these meta-narratives and of the process of legitimation by meta-narratives. “In lieu of meta-narratives we have created new language-games in order to legitimize our claims which rely on changing relationships and mutable truths, none of which is privileged over the other to speak to ultimate truth.”[citation needed] This concept of the instability of truth and meaning leads in the direction of nihilism, though Lyotard stops short of embracing the latter.

Postmodern theorist Jean Baudrillard wrote briefly of nihilism from the postmodern viewpoint in Simulacra and Simulation. He dealt mainly with interpretations of the real world as against the simulations of which the real world is composed. The uses of meaning were an important subject in Baudrillard’s discussion of nihilism:

The apocalypse is finished; today it is the precession of the neutral, of forms of the neutral and of indifference. All that remains is the fascination for desertlike and indifferent forms, for the very operation of the system that annihilates us. Now, fascination (in contrast to seduction, which was attached to appearances, and to dialectical reason, which was attached to meaning) is a nihilistic passion par excellence; it is the passion proper to the mode of disappearance. We are fascinated by all forms of disappearance, of our disappearance. Melancholic and fascinated, such is our general situation in an era of involuntary transparency.

In Nihil Unbound: Extinction and Enlightenment, Ray Brassier maintains that philosophy has avoided the traumatic idea of extinction, instead attempting to find meaning in a world conditioned by the very idea of its own annihilation. Thus Brassier critiques both the phenomenological and hermeneutic strands of Continental philosophy, as well as the vitalism of thinkers like Gilles Deleuze, who work to ingrain meaning in the world and stave off the threat of nihilism. Instead, drawing on thinkers such as Alain Badiou, François Laruelle, Paul Churchland, and Thomas Metzinger, Brassier defends a view of the world as inherently devoid of meaning. That is, rather than avoiding nihilism, Brassier embraces it as the truth of reality. Brassier concludes from his readings of Badiou and Laruelle that the universe is founded on the nothing,[56] but also that philosophy is the “organon of extinction,” that it is only because life is conditioned by its own extinction that there is thought at all.[57] Brassier then defends a radically anti-correlationist philosophy proposing that Thought is conjoined not with Being, but with Non-Being.

The term Dada was first used by Richard Huelsenbeck and Tristan Tzara in 1916.[58] The movement, which lasted from approximately 1916 to 1922, arose during World War I, an event that influenced the artists.[59] The Dada movement began in the old town of Zürich, Switzerland, known as the “Niederdorf” or “Niederdörfli”, at the Cabaret Voltaire.[60] The Dadaists claimed that Dada was not an art movement but an anti-art movement, sometimes using found objects in a manner similar to found poetry. The “anti-art” drive is thought to have stemmed from a post-war emptiness. This tendency toward the devaluation of art has led many to claim that Dada was an essentially nihilistic movement. Given that Dada created its own means for interpreting its products, it is difficult to classify alongside most other contemporary art expressions. Hence, due to its ambiguity, it is sometimes classified as a nihilistic modus vivendi.[59]

The term “nihilism” was actually popularized by Ivan Turgenev in his novel Fathers and Sons, whose hero, Bazarov, was a nihilist and recruited several followers to the philosophy. He found his nihilistic ways challenged upon falling in love.[61]

Anton Chekhov portrayed nihilism when writing Three Sisters. The phrase “what does it matter”, or variants of it, is often spoken by several characters in response to events; the significance of some of these events suggests that these characters subscribe to nihilism as a type of coping strategy.

The philosophical ideas of the French author, the Marquis de Sade, are often noted as early examples of nihilistic principles.[62]


Nihilism | Internet Encyclopedia of Philosophy

Nihilism is the belief that all values are baseless and that nothing can be known or communicated. It is often associated with extreme pessimism and a radical skepticism that condemns existence. A true nihilist would believe in nothing, have no loyalties, and no purpose other than, perhaps, an impulse to destroy. While few philosophers would claim to be nihilists, nihilism is most often associated with Friedrich Nietzsche who argued that its corrosive effects would eventually destroy all moral, religious, and metaphysical convictions and precipitate the greatest crisis in human history. In the 20th century, nihilistic themes–epistemological failure, value destruction, and cosmic purposelessness–have preoccupied artists, social critics, and philosophers. Mid-century, for example, the existentialists helped popularize tenets of nihilism in their attempts to blunt its destructive potential. By the end of the century, existential despair as a response to nihilism gave way to an attitude of indifference, often associated with antifoundationalism.


“Nihilism” comes from the Latin nihil, or nothing, which means not anything, that which does not exist. It appears in the verb “annihilate,” meaning to bring to nothing, to destroy completely. Early in the nineteenth century, Friedrich Jacobi used the word to negatively characterize transcendental idealism. It only became popularized, however, after its appearance in Ivan Turgenev’s novel Fathers and Sons (1862) where he used “nihilism” to describe the crude scientism espoused by his character Bazarov who preaches a creed of total negation.

In Russia, nihilism became identified with a loosely organized revolutionary movement (c. 1860–1917) that rejected the authority of the state, church, and family. In his early writing, the anarchist leader Mikhail Bakunin (1814–1876) composed the notorious entreaty still identified with nihilism: “Let us put our trust in the eternal spirit which destroys and annihilates only because it is the unsearchable and eternally creative source of all life–the passion for destruction is also a creative passion!” (Reaction in Germany, 1842). The movement advocated a social arrangement based on rationalism and materialism as the sole source of knowledge and individual freedom as the highest goal. By rejecting man’s spiritual essence in favor of a solely materialistic one, nihilists denounced God and religious authority as antithetical to freedom. The movement eventually deteriorated into an ethos of subversion, destruction, and anarchy, and by the late 1870s a nihilist was anyone associated with clandestine political groups advocating terrorism and assassination.

The earliest philosophical positions associated with what could be characterized as a nihilistic outlook are those of the Skeptics. Because they denied the possibility of certainty, Skeptics could denounce traditional truths as unjustifiable opinions. When Demosthenes (384–322 BC), for example, observes that “What he wished to believe, that is what each man believes” (Olynthiac), he posits the relational nature of knowledge. Extreme skepticism, then, is linked to epistemological nihilism, which denies the possibility of knowledge and truth; this form of nihilism is currently identified with postmodern antifoundationalism. Nihilism, in fact, can be understood in several different ways. Political nihilism, as noted, is associated with the belief that the destruction of all existing political, social, and religious order is a prerequisite for any future improvement. Ethical nihilism or moral nihilism rejects the possibility of absolute moral or ethical values. Instead, good and evil are nebulous, and values addressing them are the product of nothing more than social and emotive pressures. Existential nihilism is the notion that life has no intrinsic meaning or value, and it is, no doubt, the most commonly used and understood sense of the word today.

Max Stirner’s (1806–1856) attacks on systematic philosophy, his denial of absolutes, and his rejection of abstract concepts of any kind often place him among the first philosophical nihilists. For Stirner, achieving individual freedom is the only law; and the state, which necessarily imperils freedom, must be destroyed. Even beyond the oppression of the state, though, are the constraints imposed by others, because their very existence is an obstacle compromising individual freedom. Thus Stirner argues that existence is an endless “war of each against all” (The Ego and Its Own, trans. 1907).

Among philosophers, Friedrich Nietzsche is most often associated with nihilism. For Nietzsche, there is no objective order or structure in the world except what we give it. Penetrating the façades buttressing convictions, the nihilist discovers that all values are baseless and that reason is impotent. “Every belief, every considering something-true,” Nietzsche writes, “is necessarily false because there is simply no true world” (Will to Power [notes from 1883–1888]). For him, nihilism requires a radical repudiation of all imposed values and meaning: “Nihilism is . . . not only the belief that everything deserves to perish; but one actually puts one’s shoulder to the plough; one destroys” (Will to Power).

The caustic strength of nihilism is absolute, Nietzsche argues, and under its withering scrutiny “the highest values devalue themselves. The aim is lacking, and ‘Why’ finds no answer” (Will to Power). Inevitably, nihilism will expose all cherished beliefs and sacrosanct truths as symptoms of a defective Western mythos. This collapse of meaning, relevance, and purpose will be the most destructive force in history, constituting a total assault on reality and nothing less than the greatest crisis of humanity:

What I relate is the history of the next two centuries. I describe what is coming, what can no longer come differently: the advent of nihilism. . . . For some time now our whole European culture has been moving as toward a catastrophe, with a tortured tension that is growing from decade to decade: restlessly, violently, headlong, like a river that wants to reach the end. . . . (Will to Power)

Since Nietzsche’s compelling critique, nihilistic themes–epistemological failure, value destruction, and cosmic purposelessness–have preoccupied artists, social critics, and philosophers. Convinced that Nietzsche’s analysis was accurate, for example, Oswald Spengler in The Decline of the West (1926) studied several cultures to confirm that patterns of nihilism were indeed a conspicuous feature of collapsing civilizations. In each of the failed cultures he examines, Spengler noticed that centuries-old religious, artistic, and political traditions were weakened and finally toppled by the insidious workings of several distinct nihilistic postures: the Faustian nihilist “shatters the ideals”; the Apollinian nihilist “watches them crumble before his eyes”; and the Indian nihilist “withdraws from their presence into himself.” Withdrawal, for instance, often identified with the negation of reality and resignation advocated by Eastern religions, is in the West associated with various versions of epicureanism and stoicism. In his study, Spengler concludes that Western civilization is already in the advanced stages of decay with all three forms of nihilism working to undermine epistemological authority and ontological grounding.

In 1927, Martin Heidegger, to cite another example, observed that nihilism in various and hidden forms was already “the normal state of man” (The Question of Being). Other philosophers’ predictions about nihilism’s impact have been dire. Outlining the symptoms of nihilism in the 20th century, Helmut Thielicke wrote that “Nihilism literally has only one truth to declare, namely, that ultimately Nothingness prevails and the world is meaningless” (Nihilism: Its Origin and Nature, with a Christian Answer, 1969). From the nihilist’s perspective, one can conclude that life is completely amoral, a conclusion, Thielicke believes, that motivates such monstrosities as the Nazi reign of terror. Gloomy predictions of nihilism’s impact are also charted in Eugene Rose’s Nihilism: The Root of the Revolution of the Modern Age (1994). If nihilism proves victorious–and it’s well on its way, he argues–our world will become “a cold, inhuman world” where “nothingness, incoherence, and absurdity” will triumph.

While nihilism is often discussed in terms of extreme skepticism and relativism, for most of the 20th century it has been associated with the belief that life is meaningless. Existential nihilism begins with the notion that the world is without meaning or purpose. Given this circumstance, existence itself–all action, suffering, and feeling–is ultimately senseless and empty.

In The Dark Side: Thoughts on the Futility of Life (1994), Alan Pratt demonstrates that existential nihilism, in one form or another, has been a part of the Western intellectual tradition from the beginning. The Skeptic Empedocles’ observation that “the life of mortals is so mean a thing as to be virtually un-life,” for instance, embodies the same kind of extreme pessimism associated with existential nihilism. In antiquity, such profound pessimism may have reached its apex with Hegesias of Cyrene. Because miseries vastly outnumber pleasures, happiness is impossible, the philosopher argues, and subsequently advocates suicide. Centuries later during the Renaissance, William Shakespeare eloquently summarized the existential nihilist’s perspective when, in this famous passage near the end of Macbeth, he has Macbeth pour out his disgust for life:

Out, out, brief candle!
Life’s but a walking shadow, a poor player
That struts and frets his hour upon the stage
And then is heard no more; it is a tale
Told by an idiot, full of sound and fury,
Signifying nothing.

In the twentieth century, it’s the atheistic existentialist movement, popularized in France in the 1940s and 50s, that is responsible for the currency of existential nihilism in the popular consciousness. Jean-Paul Sartre’s (1905–1980) defining proposition for the movement, “existence precedes essence,” rules out any ground or foundation for establishing an essential self or a human nature. When we abandon illusions, life is revealed as nothing; and for the existentialists, nothingness is the source not only of absolute freedom but also of existential horror and emotional anguish. Nothingness reveals each individual as an isolated being “thrown” into an alien and unresponsive universe, barred forever from knowing why yet required to invent meaning. It’s a situation that’s nothing short of absurd. Writing from the enlightened perspective of the absurd, Albert Camus (1913–1960) observed that Sisyphus’ plight, condemned to eternal, useless struggle, was a superb metaphor for human existence (The Myth of Sisyphus, 1942).

The common thread in the literature of the existentialists is coping with the emotional anguish arising from our confrontation with nothingness, and they expended great energy responding to the question of whether surviving it was possible. Their answer was a qualified “Yes,” advocating a formula of passionate commitment and impassive stoicism. In retrospect, it was an antidote tinged with desperation, because in an absurd world there are absolutely no guidelines, and any course of action is problematic. Passionate commitment, be it to conquest, creation, or whatever, is itself meaningless. Enter nihilism.

Camus, like the other existentialists, was convinced that nihilism was the most vexing problem of the twentieth century. Although he argues passionately that individuals could endure its corrosive effects, his most famous works betray the extraordinary difficulty he faced building a convincing case. In The Stranger (1942), for example, Meursault has rejected the existential suppositions on which the uninitiated and weak rely. Just moments before his execution for a gratuitous murder, he discovers that life alone is reason enough for living, a raison d’être that in context, however, seems scarcely convincing. In Caligula (1944), the mad emperor tries to escape the human predicament by dehumanizing himself with acts of senseless violence, fails, and surreptitiously arranges his own assassination. The Plague (1947) shows the futility of doing one’s best in an absurd world. And in his last novel, the short and sardonic The Fall (1956), Camus posits that everyone has bloody hands, because we are all responsible for making a sorry state worse by our inane action and inaction alike. In these and other works by the existentialists, one is often left with the impression that living authentically with the meaninglessness of life is impossible.

Camus was fully aware of the pitfalls of defining existence without meaning, and in his philosophical essay The Rebel (1951) he faces the problem of nihilism head-on. In it, he describes at length how metaphysical collapse often ends in total negation and the victory of nihilism, characterized by profound hatred, pathological destruction, and incalculable violence and death.

By the late 20th century, “nihilism” had assumed two different castes. In one form, “nihilist” is used to characterize the postmodern person, a dehumanized conformist, alienated, indifferent, and baffled, directing psychological energy into hedonistic narcissism or into a deep ressentiment that often explodes in violence. This perspective is derived from the existentialists’ reflections on nihilism stripped of any hopeful expectations, leaving only the experience of sickness, decay, and disintegration.

In his study of meaninglessness, Donald Crosby writes that the source of modern nihilism paradoxically stems from a commitment to honest intellectual openness. “Once set in motion, the process of questioning could come to but one end, the erosion of conviction and certitude and collapse into despair” (The Specter of the Absurd, 1988). When sincere inquiry is extended to moral convictions and social consensus, it can prove deadly, Crosby continues, promoting forces that ultimately destroy civilizations. Michael Novak’s recently revised The Experience of Nothingness (1968, 1998) tells a similar story. Both studies are responses to the existentialists’ gloomy findings from earlier in the century. And both optimistically discuss ways out of the abyss by focusing on the positive implications nothingness reveals, such as liberty, freedom, and creative possibilities. Novak, for example, describes how since WWII we have been working to “climb out of nihilism” on the way to building a new civilization.

In contrast to the efforts to overcome nihilism noted above is the uniquely postmodern response associated with the current antifoundationalists. The philosophical, ethical, and intellectual crisis of nihilism that has tormented modern philosophers for over a century has given way to mild annoyance or, more interestingly, an upbeat acceptance of meaninglessness.

French philosopher Jean-François Lyotard characterizes postmodernism as an “incredulity toward metanarratives,” those all-embracing foundations that we have relied on to make sense of the world. This extreme skepticism has undermined intellectual and moral hierarchies and made “truth” claims, transcendental or transcultural, problematic. Postmodern antifoundationalists, paradoxically grounded in relativism, dismiss knowledge as relational and “truth” as transitory, genuine only until something more palatable replaces it (reminiscent of William James’ notion of “cash value”). The critic Jacques Derrida, for example, asserts that one can never be sure that what one knows corresponds with what is. Since human beings participate in only an infinitesimal part of the whole, they are unable to grasp anything with certainty, and absolutes are merely “fictional forms.”

American antifoundationalist Richard Rorty makes a similar point: “Nothing grounds our practices, nothing legitimizes them, nothing shows them to be in touch with the way things are” (“From Logic to Language to Play,” 1986). This epistemological cul-de-sac, Rorty concludes, leads inevitably to nihilism. “Faced with the nonhuman, the nonlinguistic, we no longer have the ability to overcome contingency and pain by appropriation and transformation, but only the ability to recognize contingency and pain” (Contingency, Irony, and Solidarity, 1989). In contrast to Nietzsche’s fears and the angst of the existentialists, nihilism becomes for the antifoundationalists just another aspect of our contemporary milieu, one best endured with sang-froid.

In The Banalization of Nihilism (1992), Karen Carr discusses the antifoundationalist response to nihilism. Although it still inflames a paralyzing relativism and subverts critical tools, “cheerful nihilism” carries the day, she notes, distinguished by an easy-going acceptance of meaninglessness. Such a development, Carr concludes, is alarming. If we accept that all perspectives are equally non-binding, then intellectual or moral arrogance will determine which perspective has precedence. Worse still, the banalization of nihilism creates an environment where ideas can be imposed forcibly with little resistance, raw power alone determining intellectual and moral hierarchies. It’s a conclusion that dovetails nicely with that of Nietzsche, who pointed out that all interpretations of the world are simply manifestations of will to power.

It has been over a century now since Nietzsche explored nihilism and its implications for civilization. As he predicted, nihilism’s impact on the culture and values of the 20th century has been pervasive, its apocalyptic tenor spawning a mood of gloom and a good deal of anxiety, anger, and terror. Interestingly, Nietzsche himself, a radical skeptic preoccupied with language, knowledge, and truth, anticipated many of the themes of postmodernity. It’s helpful to note, then, that he believed we could–at a terrible price–eventually work through nihilism. If we survived the process of destroying all interpretations of the world, we could then perhaps discover the correct course for humankind:

I praise, I do not reproach, [nihilism’s] arrival. I believe it is one of the greatest crises, a moment of the deepest self-reflection of humanity. Whether man recovers from it, whether he becomes master of this crisis, is a question of his strength. It is possible. . . . (Complete Works Vol. 13)

Alan Pratt
Email: pratta@db.erau.edu
Embry-Riddle University
U.S.A.


Nihilism | Definition of Nihilism by Merriam-Webster

1 a : a viewpoint that traditional values and beliefs are unfounded and that existence is senseless and useless

b : a doctrine that denies any objective ground of truth and especially of moral truths

2 a : a doctrine or belief that conditions in the social organization are so bad as to make destruction desirable for its own sake independent of any constructive program or possibility

b capitalized : the program of a 19th century Russian party advocating revolutionary reform and using terrorism and assassination


nihilism | Definition & History | Britannica.com

Nihilism, (from Latin nihil, “nothing”), originally a philosophy of moral and epistemological skepticism that arose in 19th-century Russia during the early years of the reign of Tsar Alexander II. The term was famously used by Friedrich Nietzsche to describe the disintegration of traditional morality in Western society. In the 20th century, nihilism encompassed a variety of philosophical and aesthetic stances that, in one sense or another, denied the existence of genuine moral truths or values, rejected the possibility of knowledge or communication, and asserted the ultimate meaninglessness or purposelessness of life or of the universe.

The term is an old one, applied to certain heretics in the Middle Ages. In Russian literature, nihilism was probably first used by N.I. Nadezhdin, in an 1829 article in the Messenger of Europe, in which he applied it to Aleksandr Pushkin. Nadezhdin, as did V.V. Bervi in 1858, equated nihilism with skepticism. Mikhail Nikiforovich Katkov, a well-known conservative journalist who interpreted nihilism as synonymous with revolution, presented it as a social menace because of its negation of all moral principles.

It was Ivan Turgenev, in his celebrated novel Fathers and Sons (1862), who popularized the term through the figure of Bazarov the nihilist. Eventually, the nihilists of the 1860s and 70s came to be regarded as disheveled, untidy, unruly, ragged men who rebelled against tradition and social order. The philosophy of nihilism then began to be associated erroneously with the regicide of Alexander II (1881) and the political terror that was employed by those active at the time in clandestine organizations opposed to absolutism.

If to the conservative elements the nihilists were the curse of the time, to liberals such as N.G. Chernyshevsky they represented a mere transitory factor in the development of national thought – a stage in the struggle for individual freedom – and a true spirit of the rebellious young generation. In his novel What Is to Be Done? (1863), Chernyshevsky endeavoured to detect positive aspects in the nihilist philosophy. Similarly, in his Memoirs, Prince Peter Kropotkin, the leading Russian anarchist, defined nihilism as the symbol of struggle against all forms of tyranny, hypocrisy, and artificiality and for individual freedom.

Fundamentally, 19th-century nihilism represented a philosophy of negation of all forms of aestheticism; it advocated utilitarianism and scientific rationalism. Classical philosophical systems were rejected entirely. Nihilism represented a crude form of positivism and materialism, a revolt against the established social order; it negated all authority exercised by the state, by the church, or by the family. It based its belief on nothing but scientific truth; science would be the solution of all social problems. All evils, nihilists believed, derived from a single source – ignorance – which science alone would overcome.

The thinking of 19th-century nihilists was profoundly influenced by philosophers, scientists, and historians such as Ludwig Feuerbach, Charles Darwin, Henry Buckle, and Herbert Spencer. Since nihilists denied the duality of human beings as a combination of body and soul, of spiritual and material substance, they came into violent conflict with ecclesiastical authorities. Since nihilists questioned the doctrine of the divine right of kings, they came into similar conflict with secular authorities. Since they scorned all social bonds and family authority, conflict between parents and children became equally imminent, and it is this theme that is best reflected in Turgenev’s novel.


American Nihilist Underground Society: Nihilism, Nihilists …

Social Media Finalized The Death Of The Internet (June 2, 2017)

A few years ago a working farm opened up near me. These are farms that are open to the public, but show you how that exotic class of human beings known as “farmers” actually make food and survive without Amazon Prime accounts.

The farm came about because a farmer allowed school groups to witness the slaughter, breeding and care of animals. Then they wanted to see how the potatoes were planted. Now tours of the fields were added, including a visit to the manure pile, where tourists could genuflect and debase themselves in order to assert humility, which always pleases the crowd.

Soon the tours became more valuable than the farm output.

They came from the cities — doctors, plumbers, lawyers, carpenters, architects — looking for a way to school their children in a way of life that had passed into history, hoping to bestow “authenticity” on a life defined by conformity, products, political correctness and public relations. They wanted an escape from the transactional life of the city, and an insight instead into what life is like when results in reality matter more than what other people think.

In this way, the needs of the herd overwhelmed the realistic nature of original human behavior. The farm became a stage, and soon a gift shop appeared, and then there were videos and public image adjustments. Reality was forgotten and replaced by the human, as happens with every homo sapiens endeavor when it is about to fail.

Humans love posturing and pretending. For them, to act like a farmer is to be the real thing, because that is what people in their social group react to. They have no concern for being accurate, only for having other people nod and acknowledge them as having achieved another milestone on the path to greatness.

Social media is the same thing. No one can tell you are a dog on the internet; via social media, however, you can be whatever you want. Ignore that failed marriage, day job in a cubicle, and personal ineptitude. On the social media internet, you are whatever you can project.

Starting in 2007, the internet permanently shifted to the mobile device consumer audience, which means that it plunged far below the 120 IQ point minimum required by the old internet. Before Eternal September, the internet was limited to those who had demonstrated competence. After that, the herd began coming in.

With the rise of Google and Facebook, the herd dominated the internet. This merely showed us the need for hierarchy and aristocracy: if left up to the Crowd, every human venture degenerates to the lowest common denominator, and whatever makes it exceptional is lost.

Social media is democracy with no standards: whatever herd shows up, and whatever majority emerges from its midst, takes the day. It is the equivalent of the audience for a circus or tent revival deciding our future, and in the case of social media, they choose our path by excluding anything that is not popular.

Following that pattern, social media selects lies over truth. It prefers what most people want to believe is true over what is real according to the best minds we have. It is the triumph of the herd in denying reality so that each member of the herd may pretend to be a king, hero, genius, artist or inventor.

On the other hand, this means the rise of an underground within the internet: the sites that cannot be found by Google, will not show up in your news feed, and will be censored by Facebook, Twitter, Reddit, Tumblr, Pinterest and Instagram. This is the underground internet, and it is rising as the utility of the public internet plummets, since it is now designed for and populated with the same people who watched a lot of daytime TV in the 1980s: the poor, the old, housewives, cube McJob slaves, the mentally ill, the physically broke, the neurotic, the intoxicated and the lonely.

Ten Types Of Modern Fool (June 4, 2017)

When you live in a dying time, the most common response is to go into denial, which consists of ignoring the actual problem and finding some way to distract oneself instead. For example, heroin addicts routinely insist that their problem is too much clutter around the house, a speck in the face of the larger problem of heroin addiction that looms over them like an unseen predatory god.

With complex problems, craftier and cannier evasions commence. That is: people find proxies for dealing with the problem, or substitutes, excuses, rationalizations and justifications. These are symbolic problems that they either can conquer or that will persist whether conquered or not, making them the safest enemy (one against whom the knight cannot fail or prevail). Null proxies like this constitute 90% of the activity of a democratic state or a troupe of monkeys in the wild.

You will find the following non-answers happily promoted by humans from every race, caste, sex, class, religion, political alignment and sexual orientation. We all know our civilization has fallen and we are living in a vile and evil time, and that the solution is to give up our arrogance, which insists we have won the lottery of being able to do pretty much whatever we want while also having to suffer others doing the same.

These null proxies are used by people who will proclaim them as “the solution” and then, like a monkey who has found a bone, will use that answer to beat on all the other monkeys to force them into submission to the will of the original monkey. Any time you hear someone speaking in this way, you may be dealing with an idiot — not always — but you are certainly dealing with someone who has lied to themselves at such a fundamental level that they will never tell the truth again:

Link:

American Nihilist Underground Society: Nihilism, Nihilists …

Nihilism, In Our Time – BBC Radio 4

Melvyn Bragg and guests discuss the history of Nihilism. The nineteenth-century philosopher Friedrich Nietzsche wrote, "There can be no doubt that morality will gradually perish: this is the great spectacle in a hundred acts reserved for the next two centuries in Europe." And, with chilling predictions like these, Nihilism was born. The hard view that morals are pointless, loyalty is a weakness and truths are illusory has excited, confused and appalled western thinkers ever since. But what happened to Nietzsche's revolutionary ideas about truth, morality and a life without meaning? Existentialism can claim lineage to Nietzsche, as can Post-Modernism, but then so can Nazism. With so many interpretations, and claims of ownership from the left and the right, has anything positive come out of the great philosopher of nothing?
With Rob Hopkins, Senior Lecturer in Philosophy, University of Birmingham; Professor Raymond Tallis, Doctor and Philosopher; Professor Catherine Belsey, University of Cardiff.

Follow this link:

Nihilism, In Our Time – BBC Radio 4

Roger Ver: "Ethereum Will Overtake Bitcoin in Market Cap"

Roger Ver is quite an intriguing and somewhat controversial figure in the cryptocurrency world. As a notorious early Bitcoin investor, he has also kept close tabs on other currencies. During a recent interview, Roger mentioned how Ethereum will eventually overtake Bitcoin. A remarkable thought, even though Bitcoin is not without flaws by any means.

In the world of cryptocurrency, virtually everyone knows the name Roger Ver. He is one of the earliest Bitcoin investors and made a lot of good money doing so. He is also a very strong [financial] supporter of the Bitcoin Cash venture. This Bitcoin hard fork has made quite the impact on the overall cryptocurrency market in recent months. Diversification is key in the world of cryptocurrency, as there is a lot more to check out than just Bitcoin.

Unsurprisingly, Roger Ver is not too convinced Bitcoin will remain the top dog for much longer. In a recent interview, he mentions how Ethereum will overtake Bitcoin in the near future. Given the recent price surge of Ether, Ethereum's native token, it is evident things will only get better from here on out. A lot of innovation is coming to Ethereum, as are some much-needed network improvements.

According to Roger Ver, Ether is well on its way to surpassing Bitcoin. All it takes is doubling in price one more time to effectively reach this goal. That is, assuming the Bitcoin price doesn't increase further. Rest assured BTC is not done just yet in this regard either. An interesting battle has been going on between both of these currencies. Market cap is just one of the metrics people need to pay attention to when it comes to these cryptocurrencies, though.
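
To make the "one more doubling" claim concrete: market capitalization is simply unit price multiplied by circulating supply, so the price Ether would need in order to match Bitcoin's market cap falls out of a two-line calculation. The following is a minimal sketch in Python; the prices and supplies are illustrative assumptions (roughly mid-2017 magnitudes), not figures taken from the interview.

```python
# Hypothetical figures for illustration only; real prices and supplies
# change constantly and are not taken from the article.
btc_price = 2500.0         # USD per BTC (assumed)
btc_supply = 16_400_000    # circulating BTC (assumed)
eth_price = 230.0          # USD per ETH (assumed)
eth_supply = 92_000_000    # circulating ETH (assumed)

# Market capitalization = unit price * circulating supply.
btc_cap = btc_price * btc_supply
eth_cap = eth_price * eth_supply

# Price Ether would need, at fixed supply, to match Bitcoin's market cap.
eth_price_needed = btc_cap / eth_supply

print(f"BTC market cap: ${btc_cap:,.0f}")
print(f"ETH market cap: ${eth_cap:,.0f}")
print(f"ETH price to match BTC: ${eth_price_needed:,.2f} "
      f"({eth_price_needed / eth_price:.2f}x current)")
```

With these assumed inputs the required move works out to roughly a doubling, which matches the spirit of Ver's remark; with different inputs the multiple obviously shifts.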

More specifically, Ether has surpassed Bitcoin in a few other key metrics. It is cheaper to use most of the time, and a lot faster in terms of confirmations. Ethereum's throughput has also surpassed that of Bitcoin on multiple occasions in the past. All of this may change with improved SegWit adoption and the Lightning Network launch in a few months. Until that happens, it seems Ether will remain in the lead regarding the metrics that actually matter.

Roger Ver is also impressed with Ethereum's developers, by the look of things. In his opinion, Bitcoin no longer holds the top spot in a lot of regards. Once people start to realize that is exactly the case, things will get very interesting across all markets. Ethereum's switch to proof-of-stake, especially, will be pretty significant for the network as a whole. When Ethereum will overtake Bitcoin remains to be determined, though.

All of this paints an interesting outlook for both Bitcoin and Ethereum. Assuming Roger Ver is correct in his assessment, we will either see a major Ethereum price increase or a big Bitcoin price dip. Right now, the latter seems almost impossible, as the Bitcoin price has been moving up as of late. Ethereum, on the other hand, has been stuck in sideways momentum for several weeks now. It will be interesting to see how things unfold in this regard. Anything is possible in the cryptocurrency world.

See the original post:

Roger Ver: "Ethereum Will Overtake Bitcoin in Market Cap"

Stem-cell therapy – Wikipedia


Stem-cell therapy is the use of stem cells to treat or prevent a disease or condition.[1]

Bone marrow transplant is the most widely used stem-cell therapy, but some therapies derived from umbilical cord blood are also in use. Research is underway to develop various sources for stem cells, and to apply stem-cell treatments for neurodegenerative diseases and conditions such as diabetes, heart disease, and other conditions.

Stem-cell therapy has become controversial following developments such as the ability of scientists to isolate and culture embryonic stem cells, to create stem cells using somatic cell nuclear transfer, and their use of techniques to create induced pluripotent stem cells. This controversy is often related to abortion politics and to human cloning. Additionally, efforts to market treatments based on transplant of stored umbilical cord blood have been controversial.

For over 30 years, bone marrow has been used to treat cancer patients with conditions such as leukaemia and lymphoma; this is the only form of stem-cell therapy that is widely practiced.[2][3][4] During chemotherapy, most growing cells are killed by the cytotoxic agents. These agents, however, cannot discriminate between the leukaemia or neoplastic cells, and the hematopoietic stem cells within the bone marrow. It is this side effect of conventional chemotherapy strategies that the stem-cell transplant attempts to reverse; a donor’s healthy bone marrow reintroduces functional stem cells to replace the cells lost in the host’s body during treatment. The transplanted cells also generate an immune response that helps to kill off the cancer cells; this process can go too far, however, leading to graft vs host disease, the most serious side effect of this treatment.[5]

Another stem-cell therapy, called Prochymal, was conditionally approved in Canada in 2012 for the management of acute graft-vs-host disease in children who are unresponsive to steroids.[6] It is an allogeneic stem-cell therapy based on mesenchymal stem cells (MSCs) derived from the bone marrow of adult donors. MSCs are purified from the marrow, cultured and packaged, with up to 10,000 doses derived from a single donor. The doses are stored frozen until needed.[7]

The FDA has approved five hematopoietic stem-cell products derived from umbilical cord blood, for the treatment of blood and immunological diseases.[8]

In 2014, the European Medicines Agency recommended approval of limbal stem cells for people with severe limbal stem cell deficiency due to burns in the eye.[9]

Stem cells are being studied for a number of reasons. The molecules and exosomes released from stem cells are also being studied in an effort to make medications.[10] The paracrine soluble factors produced by stem cells, known as the stem cell secretome, have been found to be the predominant mechanism by which stem cell-based therapies mediate their effects in degenerative, auto-immune and inflammatory diseases.[11]

Research has been conducted on the effects of stem cells on animal models of brain degeneration, such as in Parkinson’s, Amyotrophic lateral sclerosis, and Alzheimer’s disease.[12][13][14] There have been preliminary studies related to multiple sclerosis.[15][16]

Healthy adult brains contain neural stem cells which divide to maintain general stem-cell numbers, or become progenitor cells. In healthy adult laboratory animals, progenitor cells migrate within the brain and function primarily to maintain neuron populations for olfaction (the sense of smell). Pharmacological activation of endogenous neural stem cells has been reported to induce neuroprotection and behavioral recovery in adult rat models of neurological disorder.[17][18][19]

Stroke and traumatic brain injury lead to cell death, characterized by a loss of neurons and oligodendrocytes within the brain. Clinical and animal studies have been conducted into the use of stem cells in cases of spinal cord injury.[20][21][22]

Stem cells are being studied in those with severe heart disease.[23]

The work[24] by Bodo-Eckehard Strauer has been discredited by the identification of hundreds of factual contradictions.[25] Among several clinical trials reporting that adult stem-cell therapy is safe and effective, powerful effects have been reported from only a few laboratories, for infarcts as well as for heart failure not arising from myocardial infarction.[26] While initial animal studies demonstrated therapeutic effects,[27][28] later clinical trials achieved only modest, though statistically significant, improvements.[29][30]

Stem-cell therapy for treatment of myocardial infarction usually makes use of autologous bone-marrow stem cells (a specific type or all); however, other types of adult stem cells may be used, such as adipose-derived stem cells.[31] Adult stem cell therapy for treating heart disease was commercially available on at least five continents as of 2007.[citation needed]

Possible mechanisms of recovery include:[12]

It may be possible to have adult bone-marrow cells differentiate into heart muscle cells.[12]

The first successful integration of human embryonic stem cell derived cardiomyocytes in guinea pigs (mouse hearts beat too fast) was reported in August 2012. The contraction strength was measured four weeks after the guinea pigs underwent simulated heart attacks and cell treatment. The cells contracted synchronously with the existing cells, but it is unknown if the positive results were produced mainly from paracrine as opposed to direct electromechanical effects from the human cells. Future work will focus on how to get the cells to engraft more strongly around the scar tissue. Whether treatments from embryonic or adult bone marrow stem cells will prove more effective remains to be seen.[32]

In 2013 the pioneering reports of powerful beneficial effects of autologous bone marrow stem cells on ventricular function were found to contain "hundreds" of discrepancies.[33] Critics report that of 48 reports there seemed to be just five underlying trials, and that in many cases whether they were randomized or merely observational accepter-versus-rejecter was contradictory between reports of the same trial. One pair of reports with identical baseline characteristics and final results was presented in two publications as, respectively, a 578-patient randomized trial and a 391-patient observational study. Other reports required (impossible) negative standard deviations in subsets of patients, or contained fractional patients or negative NYHA classes. Overall, many more patients were published as having received stem cells in trials than the number of stem cells processed in the hospital's laboratory during that time. A university investigation, closed in 2012 without reporting, was reopened in July 2013.[34]

One of the most promising benefits of stem cell therapy is the potential for cardiac tissue regeneration to reverse the tissue loss underlying the development of heart failure after cardiac injury.[35]

The specificity of the human immune-cell repertoire is what allows the human body to defend itself from rapidly adapting antigens. However, the immune system is vulnerable to degradation upon the pathogenesis of disease, and because of the critical role that it plays in overall defense, its degradation is often fatal to the organism as a whole. Diseases of hematopoietic cells are diagnosed and classified via a subspecialty of pathology known as hematopathology. The specificity of the immune cells is what allows recognition of foreign antigens, causing further challenges in the treatment of immune disease. Identical matches between donor and recipient must be made for successful transplantation treatments, but matches are uncommon, even between first-degree relatives. Research using both hematopoietic adult stem cells and embryonic stem cells has provided insight into the possible mechanisms and methods of treatment for many of these ailments.[citation needed]

Fully mature human red blood cells may be generated ex vivo by hematopoietic stem cells (HSCs), which are precursors of red blood cells. In this process, HSCs are grown together with stromal cells, creating an environment that mimics the conditions of bone marrow, the natural site of red-blood-cell growth. Erythropoietin, a growth factor, is added, coaxing the stem cells to complete terminal differentiation into red blood cells.[36] Further research into this technique could yield benefits for gene therapy, blood transfusion, and topical medicine.

In 2004, scientists at King’s College London discovered a way to cultivate a complete tooth in mice[37] and were able to grow bioengineered teeth stand-alone in the laboratory. Researchers are confident that the tooth regeneration technology can be used to grow live teeth in human patients.

In theory, stem cells taken from the patient could be coaxed in the lab into forming a tooth bud which, when implanted in the gums, would give rise to a new tooth, expected to grow within about three weeks.[38] It would fuse with the jawbone and release chemicals that encourage nerves and blood vessels to connect with it. The process is similar to what happens when humans grow their original adult teeth. Many challenges remain, however, before stem cells could be a choice for the replacement of missing teeth in the future.[39][40]

Heller has reported success in re-growing cochlea hair cells with the use of embryonic stem cells.[41]

Since 2003, researchers have successfully transplanted corneal stem cells into damaged eyes to restore vision. "Sheets of retinal cells used by the team are harvested from aborted fetuses, which some people find objectionable." When these sheets are transplanted over the damaged cornea, the stem cells stimulate renewed repair, eventually restoring vision.[42] The latest such development was in June 2005, when researchers at the Queen Victoria Hospital of Sussex, England were able to restore the sight of forty patients using the same technique. The group, led by Sheraz Daya, was able to successfully use adult stem cells obtained from the patient, a relative, or even a cadaver. Further rounds of trials are ongoing.[43]

Diabetes patients lose the function of insulin-producing beta cells within the pancreas.[44] In recent experiments, scientists have been able to coax embryonic stem cells into turning into beta cells in the lab. In theory, if beta cells are transplanted successfully, they will be able to replace malfunctioning ones in a diabetic patient.[45]

Clinical case reports in the treatment of orthopaedic conditions have been published. To date, the focus in the literature for musculoskeletal care appears to be on mesenchymal stem cells. Centeno et al. have published MRI evidence of increased cartilage and meniscus volume in individual human subjects.[46][unreliable medical source?][47] The results of trials that include a large number of subjects are yet to be published. However, a published safety study conducted in a group of 227 patients over a 3-4-year period shows adequate safety and minimal complications associated with mesenchymal cell transplantation.[48]

Wakitani has also published a small case series of nine defects in five knees involving surgical transplantation of mesenchymal stem cells with coverage of the treated chondral defects.[49]

Stem cells can also be used to stimulate the growth of human tissues. In an adult, wounded tissue is most often replaced by scar tissue, which is characterized in the skin by disorganized collagen structure, loss of hair follicles and irregular vascular structure. In the case of wounded fetal tissue, however, wounded tissue is replaced with normal tissue through the activity of stem cells.[50] A possible method for tissue regeneration in adults is to place adult stem cell “seeds” inside a tissue bed “soil” in a wound bed and allow the stem cells to stimulate differentiation in the tissue bed cells. This method elicits a regenerative response more similar to fetal wound-healing than adult scar tissue formation.[50] Researchers are still investigating different aspects of the “soil” tissue that are conducive to regeneration.[50]

Culture of human embryonic stem cells in mitotically inactivated porcine ovarian fibroblasts (POF) causes differentiation into germ cells (precursor cells of oocytes and spermatozoa), as evidenced by gene expression analysis.[51]

Human embryonic stem cells have been stimulated to form spermatozoon-like cells, though the cells were still slightly damaged or malformed.[52] This could potentially treat azoospermia.

In 2012, oogonial stem cells were isolated from adult mouse and human ovaries and demonstrated to be capable of forming mature oocytes.[53] These cells have the potential to treat infertility.

Destruction of the immune system by the HIV is driven by the loss of CD4+ T cells in the peripheral blood and lymphoid tissues. Viral entry into CD4+ cells is mediated by the interaction with a cellular chemokine receptor, the most common of which are CCR5 and CXCR4. Because subsequent viral replication requires cellular gene expression processes, activated CD4+ cells are the primary targets of productive HIV infection.[54] Recently scientists have been investigating an alternative approach to treating HIV-1/AIDS, based on the creation of a disease-resistant immune system through transplantation of autologous, gene-modified (HIV-1-resistant) hematopoietic stem and progenitor cells (GM-HSPC).[55]

Stem cells are thought to mediate repair via five primary mechanisms: 1) providing an anti-inflammatory effect, 2) homing to damaged tissues and recruiting other cells, such as endothelial progenitor cells, that are necessary for tissue growth, 3) supporting tissue remodeling over scar formation, 4) inhibiting apoptosis, and 5) differentiating into bone, cartilage, tendon, and ligament tissue.[56][57]

To further enrich blood supply to the damaged areas, and consequently promote tissue regeneration, platelet-rich plasma could be used in conjunction with stem cell transplantation.[58][59] The efficacy of some stem cell populations may also be affected by the method of delivery; for instance, to regenerate bone, stem cells are often introduced in a scaffold where they produce the minerals necessary for generation of functional bone.[58][59][60][61]

Stem cells have also been shown to have a low immunogenicity due to the relatively low number of MHC molecules found on their surface. In addition, they have been found to secrete chemokines that alter the immune response and promote tolerance of the new tissue. This allows for allogeneic treatments to be performed without a high rejection risk.[62]

The ability to grow functional adult tissues indefinitely in culture through directed differentiation creates new opportunities for drug research. Researchers are able to grow differentiated cell lines and then test new drugs on each cell type to examine possible interactions in vitro before performing in vivo studies. This is critical in the development of drugs for use in veterinary research because of the possibility of species-specific interactions. The hope is that having these cell lines available for research use will reduce the number of research animals needed, because effects on human tissue in vitro will provide insight not normally known before the animal testing phase.[63]

Stem cells are being explored for use in conservation efforts. Spermatogonial stem cells have been harvested from a rat and placed into a mouse host and fully mature sperm were produced with the ability to produce viable offspring. Currently research is underway to find suitable hosts for the introduction of donor spermatogonial stem cells. If this becomes a viable option for conservationists, sperm can be produced from high genetic quality individuals who die before reaching sexual maturity, preserving a line that would otherwise be lost.[64]

Most stem cells intended for regenerative therapy are generally isolated either from the patient's bone marrow or from adipose tissue.[59][61] Because mesenchymal stem cells can differentiate into the cells that make up bone, cartilage, tendons, and ligaments, as well as muscle, neural and other progenitor tissues, they have been the main type of stem cells studied in the treatment of diseases affecting these tissues.[65][66] The number of stem cells transplanted into damaged tissue may alter the efficacy of treatment. Accordingly, stem cells derived from bone marrow aspirates, for instance, are cultured in specialized laboratories for expansion to millions of cells.[59][61] Although adipose-derived tissue also requires processing prior to use, the culturing methodology for adipose-derived stem cells is not as extensive as that for bone marrow-derived cells.[67][68] While it is thought that bone-marrow-derived stem cells are preferred for bone, cartilage, ligament, and tendon repair, others believe that the less challenging collection techniques and the multi-cellular microenvironment already present in adipose-derived stem cell fractions make the latter the preferred source for autologous transplantation.[58]

New sources of mesenchymal stem cells are being researched, including stem cells present in the skin and dermis, which are of interest because of the ease with which they can be harvested with minimal risk to the animal.[69] Hematopoietic stem cells have also been discovered to travel in the blood stream, and they possess differentiating ability equal to that of other mesenchymal stem cells, again with a very non-invasive harvesting technique.[70]

There is widespread controversy over the use of human embryonic stem cells. This controversy primarily targets the techniques used to derive new embryonic stem cell lines, which often require the destruction of the blastocyst. Opposition to the use of human embryonic stem cells in research is often based on philosophical, moral, or religious objections.[71] There is other stem cell research that does not involve the destruction of a human embryo, such as research involving adult stem cells, amniotic stem cells, and induced pluripotent stem cells.

On 23 January 2009, the US Food and Drug Administration gave clearance to Geron Corporation for the initiation of the first clinical trial of an embryonic stem-cell-based therapy on humans. The trial aimed to evaluate the drug GRNOPC1, embryonic stem cell-derived oligodendrocyte progenitor cells, on patients with acute spinal cord injury. The trial was discontinued in November 2011 so that the company could focus on therapies in the "current environment of capital scarcity and uncertain economic conditions".[72] In 2013 biotechnology and regenerative medicine company BioTime (AMEX: BTX) acquired Geron's stem cell assets in a stock transaction, with the aim of restarting the clinical trial.[73]

Scientists have reported that MSCs transfused within a few hours of thawing may show reduced function or decreased efficacy in treating diseases, compared with MSCs that are in the log phase of cell growth (fresh). Cryopreserved MSCs should therefore be brought back into the log phase of cell growth in in vitro culture before they are administered in clinical trials or experimental therapies; re-culturing MSCs helps the cells recover from the shock they receive during freezing and thawing. Various clinical trials that used cryopreserved MSCs immediately post-thaw have failed, compared with clinical trials that used fresh MSCs.[74]

Research conducted on horses, dogs, and cats can benefit the development of stem cell treatments in veterinary medicine and can target a wide range of injuries and diseases such as myocardial infarction, stroke, tendon and ligament damage, osteoarthritis, osteochondrosis and muscular dystrophy, both in large animals and in humans.[75][76][77][78] While investigation of cell-based therapeutics generally reflects human medical needs, the high frequency and severity of certain injuries in racehorses has put veterinary medicine at the forefront of this novel regenerative approach.[79] Companion animals can serve as clinically relevant models that closely mimic human disease.[80][81]

Veterinary applications of stem cell therapy as a means of tissue regeneration have been largely shaped by research that began with the use of adult-derived mesenchymal stem cells to treat animals with injuries or defects affecting bone, cartilage, ligaments and/or tendons.[82][65][83] There are two main categories of stem cells used for treatments: allogeneic stem cells, derived from a genetically different donor within the same species,[61][84] and autologous mesenchymal stem cells, derived from the patient prior to use in various treatments.[58] A third category, xenogenic stem cells, or stem cells derived from a different species, is used primarily for research purposes, especially for human treatments.[63]

Bone has a unique and well documented natural healing process that normally is sufficient to repair fractures and other common injuries. Misaligned breaks due to severe trauma, as well as treatments like tumor resections of bone cancer, are prone to improper healing if left to the natural process alone. Scaffolds composed of natural and artificial components are seeded with mesenchymal stem cells and placed in the defect. Within four weeks of placing the scaffold, newly formed bone begins to integrate with the old bone and within 32 weeks, full union is achieved.[85] Further studies are necessary to fully characterize the use of cell-based therapeutics for treatment of bone fractures.

Stem cells have been used to treat degenerative bone diseases. The normally recommended treatment for dogs that have Legg–Calvé–Perthes disease is to remove the head of the femur after the degeneration has progressed. Recently, mesenchymal stem cells have been injected directly into the head of the femur, with success not only in bone regeneration, but also in pain reduction.[85]

Because of the generally positive healing capabilities of stem cells, they have gained interest for the treatment of cutaneous wounds. This is of particular interest for those with reduced healing capabilities, such as diabetics and those undergoing chemotherapy. In one trial, stem cells were isolated from the Wharton's jelly of the umbilical cord. These cells were injected directly into the wounds. Within a week, full re-epithelialization of the wounds had occurred, compared to minor re-epithelialization in the control wounds. This showed the capability of mesenchymal stem cells to repair epidermal tissues.[86]

Soft-palate defects in horses are caused by a failure of the embryo to fully close at the midline during embryogenesis. These are often not found until after they have become worse because of the difficulty in visualizing the entire soft palate. This lack of visualization is thought to also contribute to the low success rate in surgical intervention to repair the defect. As a result, the horse often has to be euthanized. Recently, the use of mesenchymal stem cells has been added to the conventional treatments. After the surgeon has sutured the palate closed, autologous mesenchymal cells are injected into the soft palate. The stem cells were found to be integrated into the healing tissue especially along the border with the old tissue. There was also a large reduction in the number of inflammatory cells present, which is thought to aid in the healing process.[87]

Autologous stem cell-based treatments for ligament injury, tendon injury, osteoarthritis, osteochondrosis, and sub-chondral bone cysts have been commercially available to practicing veterinarians to treat horses since 2003 in the United States and since 2006 in the United Kingdom. Autologous stem cell based treatments for tendon injury, ligament injury, and osteoarthritis in dogs have been available to veterinarians in the United States since 2005. Over 3000 privately owned horses and dogs have been treated with autologous adipose-derived stem cells. The efficacy of these treatments has been shown in double-blind clinical trials for dogs with osteoarthritis of the hip and elbow and horses with tendon damage.[88][89]

Racehorses are especially prone to injuries of the tendon and ligaments. Conventional therapies are very unsuccessful in returning the horse to full functioning potential. Natural healing, guided by the conventional treatments, leads to the formation of fibrous scar tissue that reduces flexibility and full joint movement. Traditional treatments prevented a large number of horses from returning to full activity and also have a high incidence of re-injury due to the stiff nature of the scarred tendon. Introduction of both bone marrow and adipose-derived stem cells, along with natural mechanical stimulus, promoted the regeneration of tendon tissue. The natural movement promoted the alignment of the new fibers and tendocytes with the natural alignment found in uninjured tendons. Stem cell treatment not only allowed more horses to return to full duty but also greatly reduced the re-injury rate over a three-year period.[62]

The use of embryonic stem cells has also been applied to tendon repair. The embryonic stem cells were shown to have a better survival rate in the tendon as well as better migrating capabilities to reach all areas of damaged tendon. The overall repair quality was also higher, with better tendon architecture and collagen formed. There was also no tumor formation seen during the three-month experimental period. Long-term studies need to be carried out to examine the long-term efficacy and risks associated with the use of embryonic stem cells.[62] Similar results have been found in small animals.[62]

Osteoarthritis is the main cause of joint pain both in animals and in humans. Horses and dogs are the species most frequently affected by arthritis. Natural cartilage regeneration is very limited, and no current drug therapies are curative; rather, they look to reduce the symptoms associated with the degeneration. Different types of mesenchymal stem cells and other additives are still being researched to find the best type of cell and method for long-term treatment.[62]

Adipose-derived mesenchymal cells are currently the most often used because of the non-invasive harvesting. There has been a lot of success recently injecting mesenchymal stem cells directly into the joint. This is a recently developed, non-invasive technique for easier clinical use. Dogs receiving this treatment showed greater flexibility in their joints and less pain.[90]

Stem cells have successfully been used to ameliorate healing in the heart after myocardial infarction in dogs. Adipose and bone marrow derived stem cells were removed and induced to a cardiac cell fate before being injected into the heart. The heart was found to have improved contractility and a reduction in the damaged area four weeks after the stem cells were applied.[91]

A different trial is underway for a patch made of a porous substance onto which the stem cells are “seeded” in order to induce tissue regeneration in heart defects. Tissue was regenerated and the patch was well incorporated into the heart tissue. This is thought to be due, in part, to improved angiogenesis and reduction of inflammation. Although cardiomyocytes were produced from the mesenchymal stem cells, they did not appear to be contractile. Other treatments that induced a cardiac fate in the cells before transplanting had greater success at creating contractile heart tissue.[92]

Spinal cord injuries are among the most common traumas brought into veterinary hospitals.[85] Spinal injuries occur in two phases after the trauma: the primary mechanical damage, and secondary processes, like inflammation and scar formation, in the days following the trauma. The cells involved in the secondary damage response secrete factors that promote scar formation and inhibit cellular regeneration. Mesenchymal stem cells that are induced to a neural cell fate are loaded onto a porous scaffold and are then implanted at the site of injury. The cells and scaffold secrete factors that counteract those secreted by scar-forming cells and promote neural regeneration. Eight weeks later, dogs treated with stem cells showed immense improvement over those treated with conventional therapies. Dogs treated with stem cells were able to occasionally support their own weight, which has not been seen in dogs undergoing conventional therapies.[93][94][95]

Treatments are also in clinical trials to repair and regenerate peripheral nerves. Peripheral nerves are more likely to be damaged, but the effects of the damage are not as widespread as seen in injuries to the spinal cord. Treatments are currently in clinical trials to repair severed nerves, with early success. Stem cells induced to a neural fate were injected into a severed nerve. Within four weeks, regeneration of the previously damaged nerve and completely formed nerve bundles were observed.[69]

Stem cells are also in clinical phases for treatment in ophthalmology. Hematopoietic stem cells have been used to treat corneal ulcers of different origin in several horses. These ulcers were resistant to the conventional treatments available, but responded quickly and positively to the stem cell treatment. Stem cells were also able to restore sight in one eye of a horse with retinal detachment, allowing the horse to return to daily activities.[70]

Pre-clinical models of Sjögren's syndrome[96][97] have culminated in allogeneic MSCs implanted around the lacrimal glands in KCS (keratoconjunctivitis sicca) dogs that were refractory to current therapy. Significantly improved scores in ocular discharge, conjunctival hyperaemia, corneal changes and Schirmer tear tests (STT) were seen.[98]

Stem-cell research and treatment have been practiced in the People's Republic of China. The Ministry of Health of the People's Republic of China has permitted the use of stem-cell therapy for conditions beyond those approved in Western countries. The Western world has scrutinized China for its failed attempts to meet international documentation standards for these trials and procedures.[99]

In 2005, South Korean scientists claimed to have generated stem cells that were tailored to match the recipient. Each of the 11 new stem cell lines was developed using somatic cell nuclear transfer (SCNT) technology. The resultant cells were thought to match the genetic material of the recipient, thus suggesting minimal to no cell rejection.[100]

As of 2013, Thailand still considered hematopoietic stem cell transplants experimental. Kampon Sriwatanakul began a clinical trial in October 2013 with 20 patients: 10 were to receive stem-cell therapy for Type-2 diabetes and the other 10 stem-cell therapy for emphysema. Chotinantakul's research is on hematopoietic cells and their role in hematopoietic system function in homeostasis and immune response.[101]

See the rest here:

Stem-cell therapy – Wikipedia

Stem Cell Therapy, Surgery, Transplant & Treatment – San …

Knee Pain Stem Cell Treatment

Just like anything else, the more or longer something is used, the faster it will wear down; this includes the human body. The joints, especially the knees, are extremely

Read more

Pain in the bones and joints can reduce a person's comfort and quality of life when it hampers everyday tasks. These ailments can impact various parts of the body, including the

Read more

View original post here:

Stem Cell Therapy, Surgery, Transplant & Treatment – San …

Stem cell – Wikipedia

Stem cells are biological cells that can differentiate into other types of cells and can divide to produce more of the same type of stem cells. They are found in multicellular organisms.

In mammals, there are two broad types of stem cells: embryonic stem cells, which are isolated from the inner cell mass of blastocysts, and adult stem cells, which are found in various tissues. In adult organisms, stem cells and progenitor cells act as a repair system for the body, replenishing adult tissues. In a developing embryo, stem cells can differentiate into all the specialized cells of the ectoderm, endoderm and mesoderm (see induced pluripotent stem cells), but also maintain the normal turnover of regenerative organs, such as blood, skin, or intestinal tissues.

There are three known accessible sources of autologous adult stem cells in humans: bone marrow, which requires extraction by harvesting; adipose tissue, extracted by liposuction; and blood, extracted through apheresis.

Stem cells can also be taken from umbilical cord blood just after birth. Of all stem cell types, autologous harvesting involves the least risk. By definition, autologous cells are obtained from one’s own body, just as one may bank his or her own blood for elective surgical procedures.

Adult stem cells are frequently used in various medical therapies (e.g., bone marrow transplantation). Stem cells can now be artificially grown and transformed (differentiated) into specialized cell types with characteristics consistent with cells of various tissues such as muscles or nerves. Embryonic cell lines and autologous embryonic stem cells generated through somatic cell nuclear transfer or dedifferentiation have also been proposed as promising candidates for future therapies.[1] Research into stem cells grew out of findings by Ernest A. McCulloch and James E. Till at the University of Toronto in the 1960s.[2][3]

The classical definition of a stem cell requires that it possesses two properties: self-renewal, the ability to go through numerous cycles of cell division while maintaining the undifferentiated state, and potency, the capacity to differentiate into specialized cell types; both are discussed below.

Two mechanisms exist to ensure that a stem cell population is maintained:

1. Obligatory asymmetric replication: a stem cell divides into one mother cell that is identical to the original stem cell, and another daughter cell that is differentiated.

When a stem cell self-renews, it divides and does not disrupt the undifferentiated state. This self-renewal demands control of the cell cycle as well as upkeep of multipotency or pluripotency, which all depends on the stem cell.[4]

2. Stochastic differentiation: when one stem cell develops into two differentiated daughter cells, another stem cell undergoes mitosis and produces two stem cells identical to the original.
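
These two maintenance mechanisms lend themselves to a toy illustration. The following is a minimal sketch in Python, not a biological model: it assumes every stem cell divides once per round and that symmetric self-renewal and symmetric differentiation are equally likely, under which stochastic differentiation preserves the expected stem cell count, while obligatory asymmetric replication preserves it exactly.

```python
import random

def divide_asymmetric(stem_count):
    """Obligatory asymmetric replication: each division yields exactly one
    stem cell and one differentiated daughter, so the stem cell count is
    preserved deterministically."""
    return stem_count, stem_count  # (stem cells, differentiated daughters)

def divide_stochastic(stem_count, p_renewal=0.5):
    """Stochastic differentiation: each division yields either two stem
    cells (probability p_renewal) or two differentiated cells. With
    p_renewal = 0.5 the expected stem cell count is unchanged, though the
    actual count drifts randomly."""
    stem, differentiated = 0, 0
    for _ in range(stem_count):
        if random.random() < p_renewal:
            stem += 2            # symmetric self-renewal
        else:
            differentiated += 2  # symmetric differentiation
    return stem, differentiated

random.seed(42)
stem = 1000
for round_no in range(1, 6):
    stem, diff = divide_stochastic(stem)
    print(f"round {round_no}: {stem} stem cells, {diff} differentiated")
```

Swapping in divide_asymmetric shows the contrast: the stem cell count stays at exactly 1000 every round, while the stochastic variant merely hovers around it.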

Potency specifies the differentiation potential (the potential to differentiate into different cell types) of the stem cell.[5]

In practice, stem cells are identified by whether they can regenerate tissue. For example, the defining test for bone marrow or hematopoietic stem cells (HSCs) is the ability to transplant the cells and save an individual without HSCs. This demonstrates that the cells can produce new blood cells over a long term. It should also be possible to isolate stem cells from the transplanted individual, which can themselves be transplanted into another individual without HSCs, demonstrating that the stem cell was able to self-renew.

Properties of stem cells can be illustrated in vitro, using methods such as clonogenic assays, in which single cells are assessed for their ability to differentiate and self-renew.[8][9] Stem cells can also be isolated by their possession of a distinctive set of cell surface markers. However, in vitro culture conditions can alter the behavior of cells, making it unclear whether the cells will behave in a similar manner in vivo. There is considerable debate as to whether some proposed adult cell populations are truly stem cells.[citation needed]

Embryonic stem (ES) cells are the cells of the inner cell mass of a blastocyst, an early-stage embryo.[10] Human embryos reach the blastocyst stage 4–5 days post fertilization, at which time they consist of 50–150 cells. ES cells are pluripotent and give rise during development to all derivatives of the three primary germ layers: ectoderm, endoderm and mesoderm. In other words, they can develop into each of the more than 200 cell types of the adult body when given sufficient and necessary stimulation for a specific cell type. They do not contribute to the extra-embryonic membranes or the placenta.

During embryonic development these inner cell mass cells continuously divide and become more specialized. For example, a portion of the ectoderm in the dorsal part of the embryo specializes as ‘neurectoderm’, which will become the future central nervous system.[11] Later in development, neurulation causes the neurectoderm to form the neural tube. At the neural tube stage, the anterior portion undergoes encephalization to generate or ‘pattern’ the basic form of the brain. At this stage of development, the principal cell type of the CNS is considered a neural stem cell. These neural stem cells are pluripotent, as they can generate a large diversity of many different neuron types, each with unique gene expression, morphological, and functional characteristics. The process of generating neurons from stem cells is called neurogenesis. One prominent example of a neural stem cell is the radial glial cell, so named because it has a distinctive bipolar morphology with highly elongated processes spanning the thickness of the neural tube wall, and because historically it shared some glial characteristics, most notably the expression of glial fibrillary acidic protein (GFAP).[12][13] The radial glial cell is the primary neural stem cell of the developing vertebrate CNS, and its cell body resides in the ventricular zone, adjacent to the developing ventricular system. Neural stem cells are committed to the neuronal lineages (neurons, astrocytes, and oligodendrocytes), and thus their potency is restricted.[11]

Nearly all research to date has made use of mouse embryonic stem cells (mES) or human embryonic stem cells (hES) derived from the early inner cell mass. Both have the essential stem cell characteristics, yet they require very different environments in order to maintain an undifferentiated state. Mouse ES cells are grown on a layer of gelatin as an extracellular matrix (for support) and require the presence of leukemia inhibitory factor (LIF) in serum media. A drug cocktail containing inhibitors to GSK3B and the MAPK/ERK pathway, called 2i, has also been shown to maintain pluripotency in stem cell culture.[14] Human ES cells are grown on a feeder layer of mouse embryonic fibroblasts (MEFs) and require the presence of basic fibroblast growth factor (bFGF or FGF-2).[15] Without optimal culture conditions or genetic manipulation,[16] embryonic stem cells will rapidly differentiate.

A human embryonic stem cell is also defined by the expression of several transcription factors and cell surface proteins. The transcription factors Oct-4, Nanog, and Sox2 form the core regulatory network that ensures the suppression of genes that lead to differentiation and the maintenance of pluripotency.[17] The cell surface antigens most commonly used to identify hES cells are the glycolipids stage-specific embryonic antigens 3 and 4 and the keratan sulfate antigens Tra-1-60 and Tra-1-81. By using human embryonic stem cells to produce specialized cells like nerve cells or heart cells in the lab, scientists can gain access to adult human cells without taking tissue from patients. They can then study these specialized adult cells in detail to try to catch complications of diseases, or to study cells' reactions to potentially new drugs. The molecular definition of a stem cell includes many more proteins and continues to be a topic of research.[18]

There are currently no approved treatments using embryonic stem cells. The first human trial was approved by the US Food and Drug Administration in January 2009.[19] However, the human trial was not initiated until October 13, 2010 in Atlanta for spinal cord injury research. On November 14, 2011 the company conducting the trial (Geron Corporation) announced that it would discontinue further development of its stem cell programs.[20] ES cells, being pluripotent cells, require specific signals for correct differentiation; if injected directly into another body, ES cells will differentiate into many different types of cells, causing a teratoma. Differentiating ES cells into usable cells while avoiding transplant rejection are just a few of the hurdles that embryonic stem cell researchers still face.[21] Due to ethical considerations, many nations currently have moratoria or limitations on either human ES cell research or the production of new human ES cell lines. Because of their combined abilities of unlimited expansion and pluripotency, embryonic stem cells remain a theoretically potential source for regenerative medicine and tissue replacement after injury or disease.[22]

[Image: human embryonic stem cell colony on a mouse embryonic fibroblast feeder layer]

The primitive stem cells located in the organs of fetuses are referred to as fetal stem cells.[23] There are two types of fetal stem cells: fetal proper stem cells, from the tissue of the fetus itself, and extraembryonic fetal stem cells, from the extraembryonic membranes.

Adult stem cells, also called somatic (from Greek σωματικός, "of the body") stem cells, are stem cells which maintain and repair the tissue in which they are found.[25] They can be found in children as well as adults.[26]

Pluripotent adult stem cells are rare and generally small in number, but they can be found in umbilical cord blood and other tissues.[27] Bone marrow is a rich source of adult stem cells,[28] which have been used in treating several conditions including liver cirrhosis,[29] chronic limb ischemia[30] and end-stage heart failure.[31] The quantity of bone marrow stem cells declines with age and is greater in males than females during reproductive years.[32] Much adult stem cell research to date has aimed to characterize their potency and self-renewal capabilities.[33] DNA damage accumulates with age in both stem cells and the cells that comprise the stem cell environment. This accumulation is considered to be responsible, at least in part, for increasing stem cell dysfunction with aging (see DNA damage theory of aging).[34]

Most adult stem cells are lineage-restricted (multipotent) and are generally referred to by their tissue origin (mesenchymal stem cell, adipose-derived stem cell, endothelial stem cell, dental pulp stem cell, etc.).[35][36] Muse cells (multi-lineage differentiating stress enduring cells) are a recently discovered pluripotent stem cell type found in multiple adult tissues, including adipose, dermal fibroblasts, and bone marrow. While rare, muse cells are identifiable by their expression of SSEA-3, a marker for undifferentiated stem cells, and by general mesenchymal stem cell markers such as CD105. When subjected to single-cell suspension culture, the cells will generate clusters that are similar to embryoid bodies in morphology as well as gene expression, including canonical pluripotency markers Oct4, Sox2, and Nanog.[37]

Adult stem cell treatments have been successfully used for many years to treat leukemia and related bone/blood cancers through bone marrow transplants.[38] Adult stem cells are also used in veterinary medicine to treat tendon and ligament injuries in horses.[39]

The use of adult stem cells in research and therapy is not as controversial as the use of embryonic stem cells, because the production of adult stem cells does not require the destruction of an embryo. Additionally, in instances where adult stem cells are obtained from the intended recipient (an autograft), the risk of rejection is essentially non-existent. Consequently, more US government funding is being provided for adult stem cell research.[40]

Multipotent stem cells are also found in amniotic fluid. These stem cells are very active, expand extensively without feeders and are not tumorigenic. Amniotic stem cells are multipotent and can differentiate in cells of adipogenic, osteogenic, myogenic, endothelial, hepatic and also neuronal lines.[41] Amniotic stem cells are a topic of active research.

Use of stem cells from amniotic fluid overcomes the ethical objections to using human embryos as a source of cells. Roman Catholic teaching forbids the use of embryonic stem cells in experimentation; accordingly, the Vatican newspaper “Osservatore Romano” called amniotic stem cells “the future of medicine”.[42]

It is possible to collect amniotic stem cells for donors or for autologous use: the first US amniotic stem cell bank[43][44] was opened in 2009 in Medford, MA, by Biocell Center Corporation[45][46][47] and collaborates with various hospitals and universities all over the world.[48]

Adult stem cells have limitations with their potency; unlike ESCs, they are not able to differentiate into cells from all three germ layers. As such, they are deemed multipotent.

However, reprogramming allows for the creation of pluripotent cells, induced pluripotent stem cells, from adult cells. It is important to note that these are not adult stem cells, but adult cells (e.g. epithelial cells) reprogrammed to give rise to cells with pluripotent capabilities. Using genetic reprogramming with protein transcription factors, pluripotent stem cells with ESC-like capabilities have been derived.[49][50][51] The first demonstration of induced pluripotent stem cells was conducted by Shinya Yamanaka and his colleagues at Kyoto University.[52] They used the transcription factors Oct3/4, Sox2, c-Myc, and Klf4 to reprogram mouse fibroblast cells into pluripotent cells.[49][53] Subsequent work used these factors to induce pluripotency in human fibroblast cells.[54] Junying Yu, James Thomson, and their colleagues at the University of Wisconsin–Madison used a different set of factors, Oct4, Sox2, Nanog and Lin28, and carried out their experiments using cells from human foreskin.[49][55] However, they were able to replicate Yamanaka's finding that inducing pluripotency in human cells was possible.

It is important to note that iPSCs and ESCs are not equivalent. They have many similar properties, such as pluripotency and differentiation potential, the expression of pluripotency genes, epigenetic patterns, embryoid body and teratoma formation, and viable chimera formation.[52][53] However, similar does not mean they are the same. In fact, there are many differences within these properties. Importantly, the chromatin of iPSCs appears to be more "closed" or methylated than that of ESCs.[52][53] Similarly, the gene expression pattern differs between ESCs and iPSCs, and even between iPSCs sourced from different origins.[52] There are thus questions about the "completeness" of reprogramming and the somatic memory of induced pluripotent stem cells. Despite this, inducing adult cells to be pluripotent appears to be viable.

As a result of the success of these experiments, Ian Wilmut, who helped create the first cloned animal Dolly the Sheep, has announced that he will abandon somatic cell nuclear transfer as an avenue of research.[56]

Furthermore, induced pluripotent stem cells provide several therapeutic advantages. Like ESCs, they are pluripotent. They thus have great differentiation potential; theoretically, they could produce any cell within the human body (if reprogramming to pluripotency was "complete").[52] Moreover, unlike ESCs, they potentially could allow doctors to create a pluripotent stem cell line for each individual patient.[57] In fact, frozen blood samples can be used as a source of induced pluripotent stem cells, opening a new avenue for obtaining the valued cells.[58] Patient-specific stem cells allow for screening for side effects before drug treatment, as well as a reduced risk of transplantation rejection.[57] Despite their current limited use therapeutically, iPSCs hold great potential for future use in medical treatment and research.

To ensure self-renewal, stem cells undergo two types of cell division (see Stem cell division and differentiation diagram). Symmetric division gives rise to two identical daughter cells both endowed with stem cell properties. Asymmetric division, on the other hand, produces only one stem cell and a progenitor cell with limited self-renewal potential. Progenitors can go through several rounds of cell division before terminally differentiating into a mature cell. It is possible that the molecular distinction between symmetric and asymmetric divisions lies in differential segregation of cell membrane proteins (such as receptors) between the daughter cells.[59]

An alternative theory is that stem cells remain undifferentiated due to environmental cues in their particular niche. Stem cells differentiate when they leave that niche or no longer receive those signals. Studies in Drosophila germarium have identified the signals decapentaplegic and adherens junctions that prevent germarium stem cells from differentiating.[60][61]

Stem cell therapy is the use of stem cells to treat or prevent a disease or condition. Bone marrow transplant is a form of stem cell therapy that has been used for many years without controversy. No stem cell therapies other than bone marrow transplant are widely used.[62][63]

Stem cell treatments may require immunosuppression because of a requirement for radiation before the transplant to remove the person’s previous cells, or because the patient’s immune system may target the stem cells. One approach to avoid the second possibility is to use stem cells from the same patient who is being treated.

Pluripotency in certain stem cells could also make it difficult to obtain a specific cell type. It is also difficult to obtain the exact cell type needed, because not all cells in a population differentiate uniformly. Undifferentiated cells can create tissues other than desired types.[64]

Some stem cells form tumors after transplantation;[65] pluripotency is linked to tumor formation, especially in embryonic stem cells, fetal proper stem cells, and induced pluripotent stem cells. Fetal proper stem cells form tumors despite multipotency.[66]

Some of the fundamental patents covering human embryonic stem cells are owned by the Wisconsin Alumni Research Foundation (WARF); they are patents 5,843,780, 6,200,806, and 7,029,913, invented by James A. Thomson. WARF does not enforce these patents against academic scientists, but does enforce them against companies.[67]

In 2006, a request for the US Patent and Trademark Office (USPTO) to re-examine the three patents was filed by the Public Patent Foundation on behalf of its client, the non-profit patent-watchdog group Consumer Watchdog (formerly the Foundation for Taxpayer and Consumer Rights).[67] In the re-examination process, which involves several rounds of discussion between the USPTO and the parties, the USPTO initially agreed with Consumer Watchdog and rejected all the claims in all three patents;[68] in response, however, WARF amended the claims of all three patents to make them more narrow, and in 2008 the USPTO found the amended claims in all three patents to be patentable. The decision on one of the patents (7,029,913) was appealable, while the decisions on the other two were not.[69][70] Consumer Watchdog appealed the granting of the '913 patent to the USPTO's Board of Patent Appeals and Interferences (BPAI), which granted the appeal, and in 2010 the BPAI decided that the amended claims of the '913 patent were not patentable.[71] However, WARF was able to re-open prosecution of the case and did so, amending the claims of the '913 patent again to make them more narrow, and in January 2013 the amended claims were allowed.[72]

In July 2013, Consumer Watchdog announced that it would appeal the decision to allow the claims of the ‘913 patent to the US Court of Appeals for the Federal Circuit (CAFC), the federal appeals court that hears patent cases.[73] At a hearing in December 2013, the CAFC raised the question of whether Consumer Watchdog had legal standing to appeal; the case could not proceed until that issue was resolved.[74]

Diseases and conditions where stem cell treatment is being investigated include:

Research is underway to develop various sources for stem cells, and to apply stem cell treatments for neurodegenerative diseases and conditions, diabetes, heart disease, and other conditions.[90] Research is also underway in generating organoids using stem cells, which would allow for further understanding of human development, organogenesis, and modeling of human diseases.[91]

In more recent years, with the ability of scientists to isolate and culture embryonic stem cells, and with scientists’ growing ability to create stem cells using somatic cell nuclear transfer and techniques to create induced pluripotent stem cells, controversy has crept in, both related to abortion politics and to human cloning.

Hepatotoxicity and drug-induced liver injury account for a substantial number of failures of new drugs in development and market withdrawal, highlighting the need for screening assays, such as those using stem cell-derived hepatocyte-like cells, that are capable of detecting toxicity early in the drug development process.[92]

See the original post here:

Stem cell – Wikipedia

NeuroGen – Stem Cell Therapy & Treatment in Mumbai, India

Happy Father Shares His Delight At Improvement Seen In His Daughter After Stem Cell Therapy At Neurogen

I am very happy to inform you about Pratibha’s progress in the last one year. The latest development is that Pratibha has given sitar performances on four major stages [3 in Kochi and one in Trivandrum] during this Navratri. She is now able to perform for about 40–45 minutes on stage, in which she performs one raag for 30 minutes, another raag for 10 minutes, and a couple of small dhuns.

Read this article:

NeuroGen – Stem Cell Therapy & Treatment in Mumbai, India

Stem Cell Malaysia | Stem Cell Therapy & Reverse Aging

Somaplus: All Natural – Improvement Month by Month. As time goes on, we start to experience a decline in body function: we get tired easily, our skin becomes thinner, wrinkles become noticeable, recovery from sickness takes longer, eyesight weakens, we forget easily, sexual function declines, and the list goes on. Consuming Somaplus will help a person experience noticeable improvement, which will continue month by month until reaching an optimum level in 6 months.

Phyto-Berries is a delicious fruit-berry drink presented in dried powder form. It is made from 9 different berries grown on farms in Maine, California and North Carolina in the USA. Phyto-Berries is carefully blended to retain 100% natural pulp, seeds and skin to provide the highest-quality antioxidants.

ORAC (Oxygen Radical Absorbance Capacity) is a scientific method for measuring the antioxidant capacity of food, developed by the National Institute on Aging, Baltimore, USA. The US Recommended Daily Allowance (RDA) is between 3,000 and 5,000 ORAC units per day.

Each serving of Phyto-Berries provides an ORAC value of 4,000, meeting the US RDA recommendation. In fact, each serving of Phyto-Berries is equivalent to 3 mugs of fruit berries.
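As a rough sanity check on the arithmetic in the ad copy above, here is a minimal sketch in Python using only the two figures quoted in the text (4,000 ORAC units per serving, and an RDA of 3,000–5,000 units per day); the constant and function names are purely illustrative, not the vendor’s:

```python
# Back-of-the-envelope check of the quoted ORAC figures (illustrative only).
ORAC_PER_SERVING = 4000          # ORAC units per serving, per the ad copy
RDA_LOW, RDA_HIGH = 3000, 5000   # US RDA range, ORAC units per day, per the ad copy

def servings_for(daily_target: float, per_serving: float = ORAC_PER_SERVING) -> float:
    """Servings needed to reach a daily ORAC target."""
    return daily_target / per_serving

print(f"Servings for the low end of the RDA:  {servings_for(RDA_LOW):.2f}")   # 0.75
print(f"Servings for the high end of the RDA: {servings_for(RDA_HIGH):.2f}")  # 1.25
```

On these numbers, a single 4,000-unit serving does sit inside the quoted 3,000–5,000 range, which is all the “meeting US RDA recommendation” claim amounts to.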

See original here:

Stem Cell Malaysia | Stem Cell Therapy & Reverse Aging


Neoliberalism – Wikipedia

Neoliberalism or neo-liberalism[1] refers primarily to the 20th-century resurgence of 19th-century ideas associated with laissez-faire economic liberalism.[2]:7 Those ideas include economic liberalization policies such as privatization, austerity, deregulation, free trade[3] and reductions in government spending in order to increase the role of the private sector in the economy and society.[11] These market-based ideas and the policies they inspired constitute a paradigm shift away from the post-war Keynesian consensus which lasted from 1945 to 1980.[12][13]

English-speakers have used the term “neoliberalism” since the start of the 20th century with different meanings,[14] but it became more prevalent in its current meaning in the 1970s and 1980s, used by scholars in a wide variety of social sciences[15][16] as well as by critics.[17][18] Modern advocates of free market policies avoid the term “neoliberal”[19] and some scholars have described the term as meaning different things to different people[20][21] as neoliberalism “mutated” into geopolitically distinct hybrids as it travelled around the world.[4] As such, neoliberalism shares many attributes with other concepts that have contested meanings, including democracy.[22]

The definition and usage of the term have changed over time.[5] As an economic philosophy, neoliberalism emerged among European liberal scholars in the 1930s as they attempted to trace a so-called “third” or “middle” way between the conflicting philosophies of classical liberalism and socialist planning.[23]:14–15 The impetus for this development arose from a desire to avoid repeating the economic failures of the early 1930s, which neoliberals mostly blamed on the economic policy of classical liberalism. In the decades that followed, the use of the term “neoliberal” tended to refer to theories which diverged from the more laissez-faire doctrine of classical liberalism and which promoted instead a market economy under the guidance and rules of a strong state, a model which came to be known as the social market economy.

In the 1960s, usage of the term “neoliberal” heavily declined. When the term re-appeared in the 1980s in connection with Augusto Pinochet’s economic reforms in Chile, the usage of the term had shifted. It had not only become a term with negative connotations employed principally by critics of market reform, but it also had shifted in meaning from a moderate form of liberalism to a more radical and laissez-faire capitalist set of ideas. Scholars now tended to associate it with the theories of Mont Pelerin Society economists Friedrich Hayek, Milton Friedman and James M. Buchanan, along with politicians and policy-makers such as Margaret Thatcher, Ronald Reagan and Alan Greenspan.[5][24] Once the new meaning of neoliberalism became established as a common usage among Spanish-speaking scholars, it diffused into the English-language study of political economy.[5] By 1994, with the passage of NAFTA and with the Zapatistas’ reaction to this development in Chiapas, the term entered global circulation.[4] Scholarship on the phenomenon of neoliberalism has been growing over the last couple of decades.[16] The impact of the global 2008–2009 crisis has also given rise to new scholarship that criticises neoliberalism and seeks policy alternatives.[25]

An early use of the term in English was in 1898 by the French economist Charles Gide to describe the economic beliefs of the Italian economist Maffeo Pantaleoni,[26] with the term “néo-libéralisme” previously existing in French,[14] and the term was later used by others including the classical liberal economist Milton Friedman in a 1951 essay.[27] In 1938 at the Colloque Walter Lippmann, the term “neoliberalism” was proposed, among other terms, and ultimately chosen to be used to describe a certain set of economic beliefs.[23]:12–13[28] The colloquium defined the concept of neoliberalism as involving “the priority of the price mechanism, free enterprise, the system of competition, and a strong and impartial state”.[23]:13–14 To be “neoliberal” meant advocating a modern economic policy with state intervention.[23]:48 Neoliberal state interventionism brought a clash with the opposing laissez-faire camp of classical liberals, like Ludwig von Mises.[29] Most scholars in the 1950s and 1960s understood neoliberalism as referring to the social market economy and its principal economic theorists such as Eucken, Röpke, Rüstow and Müller-Armack. Although Hayek had intellectual ties to the German neoliberals, his name was only occasionally mentioned in conjunction with neoliberalism during this period due to his more pro-free market stance.[30]

During the military rule under Augusto Pinochet (1973–1990) in Chile, opposition scholars took up the expression to describe the economic reforms implemented there and its proponents (the “Chicago Boys”).[5] Once this new meaning was established among Spanish-speaking scholars, it diffused into the English-language study of political economy.[5] According to one study of 148 scholarly articles, neoliberalism is almost never defined but is used in several senses to describe ideology, economic theory, development theory, or economic reform policy. It has largely become a term of condemnation employed by critics and suggests a market fundamentalism closer to the laissez-faire principles of the paleoliberals[who?] than to the ideas of those who originally attended the colloquium. This leaves some controversy as to the precise meaning of the term and its usefulness as a descriptor in the social sciences, especially as the number of different kinds of market economies has proliferated in recent years.[5]

Another center-left movement from modern American liberalism that used the term “neoliberalism” to describe its ideology formed in the United States in the 1970s. According to David Brooks, prominent neoliberal politicians included Al Gore and Bill Clinton of the Democratic Party of the United States.[31] The neoliberals coalesced around two magazines, The New Republic and the Washington Monthly.[32] The “godfather” of this version of neoliberalism was the journalist Charles Peters,[33] who in 1983 published “A Neoliberal’s Manifesto”.[34]

Shermer argued that the term gained popularity largely among left-leaning academics in the 1970s “to describe and decry a late twentieth-century effort by policy makers, think-tank experts, and industrialists to condemn social-democratic reforms and unapologetically implement free-market policies”.[35] Neoliberal theory argues that a free market will allow efficiency, economic growth, income distribution, and technological progress to occur; on this view, any state intervention to encourage these phenomena will worsen economic performance.[36]

At a base level we can say that when we make reference to ‘neoliberalism’, we are generally referring to the new political, economic and social arrangements within society that emphasize market relations, re-tasking the role of the state, and individual responsibility. Most scholars tend to agree that neoliberalism is broadly defined as the extension of competitive markets into all areas of life, including the economy, politics and society.

The Handbook of Neoliberalism[4]

According to some scholars, neoliberalism is commonly used as a catchphrase and pejorative term, outpacing similar terms such as monetarism, neoconservatism, the Washington Consensus and “market reform” in much scholarly writing.[5] The term has been criticized,[37][38] particularly by those who often advocate for policies characterized as neoliberal.[39]:74 Historian Daniel Stedman Jones says the term “is too often used as a catch-all shorthand for the horrors associated with globalization and recurring financial crises”.[40]:2 The Handbook of Neoliberalism posits that the term has “become a means of identifying a seemingly ubiquitous set of market-oriented policies as being largely responsible for a wide range of social, political, ecological and economic problems”. Yet the handbook argues that to view the term as merely a pejorative or “radical political slogan” is to “reduce its capacity as an analytic frame. If neoliberalism is to serve as a way of understanding the transformation of society over the last few decades then the concept is in need of unpacking”.[4] Currently, neoliberalism is most commonly used to refer to market-oriented reform policies such as “eliminating price controls, deregulating capital markets, lowering trade barriers” and reducing state influence on the economy, especially through privatization and austerity.[5] Other scholars note that neoliberalism is associated with the economic policies introduced by Margaret Thatcher in the United Kingdom and Ronald Reagan in the United States.[6]

There are several distinct usages of the term that can be identified:

Sociologists Block and Somers claim there is a dispute over what to call the influence of free market ideas which have been used to justify the retrenchment of New Deal programs and policies over the last thirty years: neoliberalism, laissez-faire or “free market ideology”.[41] Others, such as Braedley and Luxton, assert that neoliberalism is a political philosophy which seeks to “liberate” the processes of capital accumulation.[42] In contrast, Piven sees neoliberalism as essentially hyper-capitalism.[43] However, Robert W. McChesney, while defining it as “capitalism with the gloves off”, goes on to assert that the term is largely unknown by the general public, particularly in the United States.[44]:78 Lester Spence uses the term to critique trends in Black politics, defining neoliberalism as “the general idea that society works best when the people and the institutions within it work or are shaped to work according to market principles”.[45] According to Philip Mirowski, neoliberalism views the market as the greatest information processor, superior to any human being, and hence as the arbiter of truth. Neoliberalism is distinct from liberalism insofar as it does not advocate laissez-faire economic policy but instead is highly constructivist and advocates a strong state to bring about market-like reforms in every aspect of society.[46]

The worldwide Great Depression of the 1930s brought about high unemployment and widespread poverty and was widely regarded as a failure of economic liberalism. To renew liberalism, a group of 25 intellectuals organised the Walter Lippmann Colloquium at Paris in August 1938. It brought together Louis Rougier, Walter Lippmann, Friedrich von Hayek, Ludwig von Mises, Wilhelm Röpke and Alexander Rüstow among others. Most agreed that the liberalism of laissez-faire had failed and that a new liberalism needed to take its place with a major role for the state. Mises and Hayek refused to condemn laissez-faire, but all participants were united in their call for a new project they dubbed “neoliberalism”.[48]:18–19 They agreed to develop the Colloquium into a permanent think tank called Centre International d’Études pour la Rénovation du Libéralisme, based in Paris.

Deep disagreements in the group separated “true (third way) neoliberals” around Rüstow and Lippmann on the one hand and old school liberals around Mises and Hayek on the other. The first group wanted a strong state to supervise, while the second insisted that the only legitimate role for the state was to abolish barriers to market entry. Rüstow wrote that Hayek and Mises were relics of the liberalism that caused the Great Depression. Mises denounced the other faction, complaining that ordoliberalism really meant “ordo-interventionism”.[48]:19–20

Neoliberalism began accelerating in importance with the founding of the Mont Pelerin Society in 1947 by Friedrich Hayek. The Colloque Walter Lippmann was largely forgotten.[49] The new society brought together the widely scattered free market thinkers and political figures.

Hayek and others believed that classical liberalism had failed because of crippling conceptual flaws and that the only way to diagnose and rectify them was to withdraw into an intensive discussion group of similarly minded intellectuals.[23]:16

With central planning in the ascendancy worldwide and few avenues to influence policymakers, the society served to bring together isolated advocates of liberalism as a “rallying point”, as Milton Friedman phrased it. Meeting annually, it would soon be a “kind of international ‘who’s who’ of the classical liberal and neo-liberal intellectuals.”[50] While the first conference in 1947 was almost half American, the Europeans dominated by 1951. Europe would remain the epicenter of the community, with Europeans dominating the leadership.[23]:16–17

In the 1960s, Latin American intellectuals began to notice the ideas of ordoliberalism; these intellectuals often used the Spanish term “neoliberalismo” to refer to this school of thought. They were particularly impressed by the social market economy and the Wirtschaftswunder (“economic miracle”) in Germany and speculated about the possibility of accomplishing similar policies in their own countries. Neoliberalism in the 1960s meant essentially a philosophy that was more moderate than classical liberalism and favored using state policy to temper social inequality and counter a tendency toward monopoly.[5]

In 1976, the military dictatorship’s economic plan, led by Martínez de Hoz, was the first attempt at a neoliberalist plan in Argentina. They implemented a fiscal austerity plan whose goal was to reduce money printing and thus inflation. In order to achieve this, salaries were frozen, but they were unable to reduce inflation, which led to a drop in the real salary of the working class. Aiming for a free market, they also decided to open the country’s borders, so that foreign goods could freely enter the country. Argentina’s industry, which had been on the rise for the previous 20 years since Frondizi’s economic plan, rapidly declined, because it was not able to compete with foreign goods. Finally, the deregulation of the financial sector gave short-term growth, but then rapidly fell apart when capital fled to the United States in the Reagan years.[citation needed] Following the measures, there was an increase in poverty from 9% in 1975 to 40% at the end of 1982.[51]

From 1989 to 2001, another neoliberalist plan was attempted by Domingo Cavallo. This time, the privatization of public services was the main objective of the government, although financial deregulation and open borders for foreign goods were also re-implemented. While some privatizations were welcomed, the majority of them were criticized for not being in the people’s best interests. Along with increased labour market flexibility, the final result of this plan was an unemployment rate of 25%, with 60% of people living under the poverty line, and 33 people killed by the police in protests that ended with the president, Fernando de la Rúa, resigning two years before his term as president was completed.[citation needed]

In Australia, neoliberal economic policies (known at the time as “economic rationalism”[52] or “economic fundamentalism”) have been embraced by governments of both the Labor Party and the Liberal Party since the 1980s. The governments of Bob Hawke and Paul Keating from 1983 to 1996 pursued economic liberalisation and a program of micro-economic reform. These governments privatized government corporations, deregulated factor markets, floated the Australian dollar and reduced trade protection.[53]

Keating, as federal treasurer, implemented a compulsory superannuation guarantee system in 1992 to increase national savings and reduce future government liability for old-age pensions.[54] The financing of universities was deregulated, requiring students to contribute to university fees through a repayable loan system known as the Higher Education Contribution Scheme (HECS) and encouraging universities to increase income by admitting full-fee-paying students, including foreign students.[55] The admission of domestic full-fee-paying students to public universities was stopped in 2009 by the Rudd Labor government.[56]

In 1955, a select group of Chilean students (later known as the Chicago Boys) were invited to the University of Chicago to pursue postgraduate studies in economics. They worked directly under Friedman and his disciple, Arnold Harberger, while also being exposed to Hayek. When they returned to Chile in the 1960s, they began a concerted effort to spread the philosophy and policy recommendations of the Chicago and Austrian schools, setting up think tanks and publishing in ideologically sympathetic media. Under the military dictatorship headed by Pinochet and severe social repression, the Chicago Boys implemented radical economic reform. The latter half of the 1970s witnessed rapid and extensive privatization, deregulation and reductions in trade barriers. In 1978, policies that would reduce the role of the state and infuse competition and individualism into areas such as labor relations, pensions, health and education were introduced.[5] These policies resulted in widening inequality as they negatively impacted the wages, benefits and working conditions of Chile’s working class.[51][59] According to Chilean economist Alejandro Foxley, by the end of Pinochet’s reign around 44% of Chilean families were living below the poverty line.[60] According to Klein, by the late 1980s the economy had stabilized and was growing, but around 45% of the population had fallen into poverty while the wealthiest 10% saw their incomes rise by 83%.[61]

In 1990, the military dictatorship ended. Hayek argued that increased economic freedom had put pressure on the dictatorship over time and increased political freedom. Years earlier, he argued that “economic control is not merely control of a sector of human life which can be separated from the rest; it is the control of the means for all our ends”.[62] The Chilean scholars Martínez and Díaz rejected this argument, pointing to the long tradition of democracy in Chile. The return of democracy required the defeat of the Pinochet regime, though it had been fundamental in saving capitalism. The essential contribution came from profound mass rebellions and, finally, old party elites using old institutional mechanisms to bring back democracy.[63]

Neoliberal ideas were first implemented in West Germany. The economists around Ludwig Erhard drew on the theories they had developed in the 1930s and 1940s and contributed to West Germany’s reconstruction after the Second World War.[64] Erhard was a member of the Mont Pelerin Society and in constant contact with other neoliberals. He pointed out that he was commonly classified as a neoliberal and that he accepted this classification.[65]

The ordoliberal Freiburg School was more pragmatic. The German neoliberals accepted the classical liberal notion that competition drives economic prosperity, but they argued that a laissez-faire state policy stifles competition, as the strong devour the weak, since monopolies and cartels could pose a threat to freedom of competition. They supported the creation of a well-developed legal system and capable regulatory apparatus. While still opposed to full-scale Keynesian employment policies or an extensive welfare state, German neoliberal theory was marked by the willingness to place humanistic and social values on par with economic efficiency. Alfred Müller-Armack coined the phrase “social market economy” to emphasize the egalitarian and humanistic bent of the idea.[5] According to Boas and Gans-Morse, Walter Eucken stated that “social security and social justice are the greatest concerns of our time”.[5]

Erhard emphasized that the market was inherently social and did not need to be made so.[48] He hoped that growing prosperity would enable the population to manage much of their social security through self-reliance and end the necessity for a widespread welfare state. Under the name of Volkskapitalismus, there were some efforts to foster private savings. However, although average contributions to the public old-age insurance were quite small, it remained by far the most important source of old-age income for a majority of the German population; therefore, despite liberal rhetoric, the 1950s witnessed what has been called a “reluctant expansion of the welfare state”. To end widespread poverty among the elderly, the pension reform of 1957 brought a significant extension of the German welfare state, which had already been established under Otto von Bismarck.[66] Rüstow, who had coined the label “neoliberalism”, criticized that development tendency and pressed for a more limited welfare program.[48]

Hayek did not like the expression “social market economy”, but stated in 1976 that some of his friends in Germany had succeeded in implementing the sort of social order for which he was pleading while using that phrase. In Hayek’s view, however, the social market economy’s aiming for both a market economy and social justice was a muddle of inconsistent aims.[67] Despite his controversies with the German neoliberals at the Mont Pelerin Society, Ludwig von Mises stated that Erhard and Müller-Armack accomplished a great act of liberalism to restore the German economy and called this “a lesson for the US”.[68] However, according to other research, Mises believed that the ordoliberals were hardly better than socialists. As an answer to Hans Hellwig’s complaints about the interventionist excesses of the Erhard ministry and the ordoliberals, Mises wrote: “I have no illusions about the true character of the politics and politicians of the social market economy”. According to Mises, Erhard’s teacher Franz Oppenheimer “taught more or less the New Frontier line of” President Kennedy’s “Harvard consultants (Schlesinger, Galbraith, etc.)”.[69]

In Germany, neoliberalism at first was synonymous with both ordoliberalism and social market economy. But over time the original term neoliberalism gradually disappeared since social market economy was a much more positive term and fit better into the Wirtschaftswunder (economic miracle) mentality of the 1950s and 1960s.[48]

Following the death of Mao Zedong, Deng Xiaoping led the country through far-ranging, market-centered reforms under the slogan of Xiǎokāng, which combined neoliberalism with centralized authoritarianism. These focused on agriculture, industry, education and science/defense.[70]

During her tenure as Prime Minister, Margaret Thatcher oversaw a number of neoliberal reforms including tax reduction, reforming exchange rates, deregulation and privatization.[71] These reforms were continued and supported by her successor John Major and although opposed by the Labour Party at the time they were largely left unaltered when the latter came to power in 1997. Instead, the Labour government under Tony Blair finished off a variety of uncompleted privatisation and deregulation measures.[72]

The Adam Smith Institute, a United Kingdom-based free market think tank and lobbying group formed in 1977 and a major driver of the aforementioned neoliberal reforms,[73] officially changed its libertarian label to neoliberal in October 2016.[74]

David Harvey traces the rise of neoliberalism in the United States to Lewis Powell’s 1971 confidential memorandum to the Chamber of Commerce.[70]:43 A call to arms to the business community to counter criticism of the free enterprise system, it was a significant factor in the rise of conservative organizations and think-tanks which advocated for neoliberal policies, such as the Business Roundtable, The Heritage Foundation, the Cato Institute, Citizens for a Sound Economy, Accuracy in Academia and the Manhattan Institute for Policy Research. For Powell, universities were becoming an ideological battleground, and he recommended the establishment of an intellectual infrastructure to serve as a counterweight to the increasingly popular ideas of Ralph Nader and other opponents of big business.[75][76][77] On the left, neoliberal ideas were developed and widely popularized by John Kenneth Galbraith while the Chicago School ideas were advanced and repackaged into a progressive, leftist perspective in Lester Thurow’s influential 1980 book “The Zero-Sum Society”.[78]

Early roots of neoliberalism were laid in the 1970s during the Carter administration, with deregulation of the trucking, banking and airline industries.[79][80][81] This trend continued into the 1980s under the Reagan administration, which included tax cuts, increased defense spending, financial deregulation and trade deficit expansion.[82] Likewise, concepts of supply-side economics, discussed by the Democrats in the 1970s, culminated in the 1980 Joint Economic Committee report “Plugging in the Supply Side”. This was picked up and advanced by the Reagan administration, with Congress following Reagan’s basic proposal and cutting federal income taxes across the board by 25% in 1981.[83]

During the 1990s, the Clinton administration also embraced neoliberalism[72] by supporting the passage of the North American Free Trade Agreement (NAFTA), continuing the deregulation of the financial sector through passage of the Commodity Futures Modernization Act and the repeal of the Glass–Steagall Act, and implementing cuts to the welfare state through passage of the Personal Responsibility and Work Opportunity Act.[82][84][85] The neoliberalism of the Clinton administration differs from that of Reagan, as the Clinton administration purged neoliberalism of neoconservative positions on militarism, family values, opposition to multiculturalism and neglect of ecological issues.[71]:50–51[disputed – discuss] Writing in New York, journalist Jonathan Chait disputed accusations that the Democratic Party had been hijacked by neoliberals, saying that its policies have largely stayed the same since the New Deal. Instead, Chait suggested these accusations came from arguments that presented a false dichotomy between free-market economics and socialism, ignoring mixed economies.[86] Historian Walter Scheidel says that both parties shifted to promote free-market capitalism in the 1970s, with the Democratic Party being “instrumental in implementing financial deregulation in the 1990s”.[87]

The Austrian School is a school of economic thought which bases its study of economic phenomena on the interpretation and analysis of the purposeful actions of individuals.[88][89][90] It derives its name from its origin in late-19th and early-20th century Vienna with the work of Carl Menger, Eugen von Böhm-Bawerk, Friedrich von Wieser and others.[91]

Among the contributions of the Austrian School to economic theory are the subjective theory of value, marginalism in price theory and the formulation of the economic calculation problem.[92] Many theories developed by “first wave” Austrian economists have been absorbed into most mainstream schools of economics. These include Carl Menger’s theories on marginal utility, Friedrich von Wieser’s theories on opportunity cost and Eugen von Böhm-Bawerk’s theories on time preference, as well as Menger and Böhm-Bawerk’s criticisms of Marxian economics. The Austrian School follows an approach termed methodological individualism, a version of which was codified by Ludwig von Mises and termed “praxeology” in his book published in English as Human Action in 1949.[93]

The former Federal Reserve Chairman Alan Greenspan, speaking of the originators of the School, said in 2000 that “the Austrian School have reached far into the future from when most of them practiced and have had a profound and, in my judgment, probably an irreversible effect on how most mainstream economists think in this country”.[94] In 1987, Nobel laureate James M. Buchanan told an interviewer: “I have no objections to being called an Austrian. Hayek and Mises might consider me an Austrian but, surely some of the others would not”.[95] Republican Congressman Ron Paul stated that he adheres to Austrian School economics and has authored six books which refer to the subject.[96][97] Paul’s former economic adviser, investment dealer Peter Schiff,[98] also calls himself an adherent of the Austrian School.[99] Jim Rogers, investor and financial commentator, also considers himself of the Austrian School of economics.[100] Chinese economist Zhang Weiying, who is known in China for his advocacy of free market reforms, supports some Austrian theories such as the Austrian theory of the business cycle.[101]

The Chicago school of economics describes a neoclassical school of thought within the academic community of economists, with a strong focus around the faculty of the University of Chicago. Chicago macroeconomic theory rejected Keynesianism in favor of monetarism until the mid-1970s, when it turned to new classical macroeconomics heavily based on the concept of rational expectations.[102] The school is strongly associated with economists such as Milton Friedman, George Stigler, Ronald Coase and Gary Becker.[103]

The school emphasizes non-intervention from government and generally rejects regulation in markets as inefficient with the exception of central bank regulation of the money supply (i.e. monetarism). Although the school’s association with neoliberalism is sometimes resisted by its proponents,[102] its emphasis on reduced government intervention in the economy and a laissez-faire ideology have brought about an affiliation between the Chicago school and neoliberal economics.[12][104]

In The Road to Serfdom, Hayek argued: “Economic control is not merely control of a sector of human life which can be separated from the rest; it is the control of the means for all our ends.”[62]

Later, in his book Capitalism and Freedom (1962), Friedman developed the argument that economic freedom, while itself an extremely important component of total freedom, is also a necessary condition for political freedom. He commented that centralized control of economic activities was always accompanied by political repression.

In his view, the voluntary character of all transactions in an unregulated market economy, and the wide diversity it permits, are fundamental threats to repressive political leaders and greatly diminish their power to coerce. Through the elimination of centralized control of economic activities, economic power is separated from political power, and each can serve as a counterbalance to the other. Friedman feels that competitive capitalism is especially important to minority groups, since impersonal market forces protect people from discrimination in their economic activities for reasons unrelated to their productivity.[105]

Amplifying Friedman’s argument, it has often been pointed out that increasing economic freedoms tend to raise expectations of political freedom, eventually leading to democracy. Other scholars see the existence of non-democratic yet market-liberal regimes and the undermining of democratic control by market processes as strong evidence that such a general, ahistorical nexus cannot be upheld.[106] Contemporary discussion of the relationship between neoliberalism and democracy has shifted to a more historical perspective, studying the extent of, and circumstances under which, the two are mutually dependent, contradictory or incompatible.

Stanley Fish argues that neoliberalization of academic life may promote a narrower and in his opinion more accurate definition of academic freedom “as the freedom to do the academic job, not the freedom to expand it to the point where its goals are infinite”. What Fish urges is “not an inability to take political stands, but a refraining from doing so in the name of academic responsibility”.[107]

Neoliberalism has received criticism both from the political left as well as the right,[108] in addition to myriad activists and academics.[109]

Much of the literature in support of neoliberalism relies on the idea that neoliberal market logic improves performance only in a very narrow, monetized sense, which is not necessarily the best measure. This focus on economic efficiency can compromise other, perhaps more important, factors. Anthropologist Mark Fleming argues that when the performance of a transit system is assessed purely in terms of economic efficiency, social goods such as strong workers’ rights are considered impediments to maximum performance, which, given the monetization of time, is equated with fast, punctual premium networks.[110] Using the case study of the San Francisco Muni, Fleming shows that this neoliberal worldview has resulted in vicious attacks on the drivers’ union, for example through the setting of impossible schedules so that drivers are necessarily late, and through brutal public smear campaigns. This ultimately resulted in the passing of Proposition G, which severely undermined the powers of the Muni drivers’ union. Workers’ rights are by no means the only casualty of the neoliberal focus on economic efficiency: this vision and metric of performance de-emphasizes every public good that is not conventionally monetized. For example, the geographers Birch and Siemiatycki contend that the growth of marketization ideology has shifted discourse toward monetary rather than social objectives, making it harder to justify public goods driven by equity, environmental concerns and social justice.[111]

David Harvey described neoliberalism as a class project, designed to impose class on society through liberalism.[112] Economists Gérard Duménil and Dominique Lévy posit that “the restoration and increase of the power, income, and wealth of the upper classes” are the primary objectives of the neoliberal agenda.[113] Economist David M. Kotz contends that neoliberalism “is based on the thorough domination of labor by capital”.[39]:43 The emergence of the “precariat”, a new class facing acute socio-economic insecurity and alienation, has been attributed to the globalization of neoliberalism.[114]

Sociologist Thomas Volscho argues that the imposition of neoliberalism in the United States arose from a conscious political mobilization by capitalist elites in the 1970s who faced two crises: the legitimacy of capitalism and a falling rate of profitability in industry. Various neoliberal ideologies (such as monetarism and supply-side economics) had been long advanced by elites, translated into policies by the Reagan administration and ultimately resulted in less governmental regulation and a shift from a tax-financed state to a debt-financed one. While the profitability of industry and the rate of economic growth never recovered to the heyday of the 1960s, the political and economic power of Wall Street and finance capital vastly increased due to the debt-financing of the state.[115]

The invisible hand of the market and the iron fist of the state combine and complement each other to make the lower classes accept desocialized wage labor and the social instability it brings in its wake. After a long eclipse, the prison thus returns to the frontline of institutions entrusted with maintaining the social order.

Loïc Wacquant[116]

Several scholars have linked the rise of neoliberalism to unprecedented levels of mass incarceration of the poor in the United States.[2]:3, 346[117][118][119][120] Sociologist Loïc Wacquant argues that the criminalization of poverty and mass incarceration are the neoliberal policy for dealing with social instability among economically marginalized populations; this follows the implementation of other neoliberal policies, which have allowed for the retrenchment of the social welfare state and the rise of punitive workfare, increased gentrification of urban areas, privatization of public functions, the shrinking of collective protections for the working class via economic deregulation, and the rise of underpaid, precarious wage labor.[118]:53–54[121] By contrast, the neoliberal state is extremely lenient in dealing with those in the upper echelons of society, in particular when it comes to the economic crimes of the privileged classes and corporations, such as fraud, embezzlement, insider trading, credit and insurance fraud, money laundering and violation of commerce and labor codes.[118][122] According to Wacquant, neoliberalism does not shrink government, but instead sets up a “centaur state” with little governmental oversight for those at the top and strict control of those at the bottom.[118][123]

In expanding upon Wacquant’s thesis, sociologist and political economist John L. Campbell of Dartmouth College suggests that through privatization, the prison system exemplifies the centaur state:

On the one hand, it punishes the lower class, which populates the prisons; on the other hand, it profits the upper class, which owns the prisons, and it employs the middle class, which runs them.

In addition, he says the prison system benefits corporations through outsourcing, as the inmates are “slowly becoming a source of low-wage labor for some US corporations”. Both through privatization and outsourcing, Campbell argues, the penal state reflects neoliberalism.[126]:61 Campbell also argues that while neoliberalism in the United States established a penal state for the poor, it also put into place a debtor state for the middle class, and that “both have had perverse effects on their respective targets: increasing rates of incarceration among the lower class and increasing rates of indebtedness, and recently home foreclosure, among the middle class”.[126]:68

David McNally, Professor of Political Science at York University, argues that while expenditures on social welfare programs have been cut, expenditures on prison construction have increased significantly during the neoliberal era, with California having “the largest prison-building program in the history of the world”.[127] The scholar Bernard Harcourt contends that the neoliberal conception of the state as inept at economic regulation but efficient in policing and punishing “has facilitated the slide to mass incarceration”.[128] Both Wacquant and Harcourt refer to this phenomenon as “Neoliberal Penality”.[129][130]

The effect of neoliberalism on global health, particularly the aspect of international aid, involves key players such as non-governmental organizations (NGOs), the International Monetary Fund (IMF) and the World Bank. According to James Pfeiffer,[131] neoliberal emphasis has been placed on free markets and privatization, which has been tied to the “new policy agenda” in which NGOs are seen as being able to provide better social welfare than governments. International NGOs have been promoted to fill holes in public services created by the World Bank and IMF through their promotion of Structural Adjustment Programs (SAPs), which reduce government health spending and which Pfeiffer criticized as unsustainable. The reduced health spending and the takeover of parts of the public health sector by NGOs cause the local health system to become fragmented, undermine local control of health programs and contribute to local social inequality between NGO workers and local individuals.[132]

In 2016, researchers for the IMF released a paper entitled “Neoliberalism: Oversold?”, which stated:

There is much to cheer in the neoliberal agenda. The expansion of global trade has rescued millions from abject poverty. Foreign direct investment has often been a way to transfer technology and know-how to developing economies. Privatization of state-owned enterprises has in many instances led to more efficient provision of services and lowered the fiscal burden on governments.

However, it was also critical of some neoliberal policies, such as freedom of capital movement and fiscal consolidation, for “increasing inequality, in turn jeopardizing durable expansion”.[133] The authors also note that some neoliberal policies are to blame for financial crises around the world growing bigger and more damaging.[134] The report contends that the implementation of neoliberal policies by economic and political elites has led to “three disquieting conclusions”:

Writing in The Guardian, Stephen Metcalf posits that the IMF paper helps “put to rest the idea that the word is nothing more than a political slur, or a term without any analytic power”.[136]

The IMF has itself been criticized for its neoliberal policies.[137][138] Rajesh Makwana writes that “the World Bank and IMF, are major exponents of the neoliberal agenda”.[139] Sheldon Richman, editor of the libertarian journal The Freeman, also sees the IMF imposing “corporatist-flavored ‘neoliberalism’ on the troubled countries of the world”. The policies of spending cuts coupled with tax increases give “real market reform a bad name and set back the cause of genuine liberalism”. Paternalistic supranational bureaucrats foster “long-term dependency, perpetual indebtedness, moral hazard, and politicization, while discrediting market reform and forestalling revolutionary liberal change”.[140]

Rowden wrote that the IMF’s monetarist approach towards prioritising price stability (low inflation) and fiscal restraint (low budget deficits) was unnecessarily restrictive and has prevented developing countries from scaling up long-term investment in public health infrastructure, resulting in chronically underfunded public health systems, demoralising working conditions that have fueled a “brain drain” of medical personnel, and the undermining of public health and the fight against HIV/AIDS in developing countries.[141]

The implementation of neoliberal policies and the acceptance of neoliberal economic theories in the 1970s are seen by some academics as the root of financialization, with the financial crisis of 2007–2008 as one of the ultimate results.[142][42][143][144][39][145]

Nicolas Firzli has argued that the rise of neoliberalism eroded the post-war consensus and Eisenhower-era Republican centrism that had resulted in the massive allocation of public capital to large-scale infrastructure projects throughout the 1950s, 1960s and 1970s in both Western Europe and North America: “In the pre-Reagan era, infrastructure was an apolitical, positively connoted, technocratic term shared by mainstream economists and policy makers […] including President Eisenhower, a praetorian Republican leader who had championed investment in the Interstate Highway System, America’s national road grid […] But Reagan, Thatcher, Delors and their many admirers amongst Clintonian, New Labour and EU Social-Democrat decision makers in Brussels sought to dismantle the generous state subsidies for social infrastructure and public transportation across the United States, Britain and the European Union”.[146]

Following Brexit, the 2016 United States presidential election and the progressive emergence of a new kind of “self-seeking capitalism” (“Trumponomics”) moving away to some extent from the neoliberal orthodoxies of the past, we may witness a “massive increase in infrastructure investment” in the United States, Britain and other advanced economies:[147][148]

“With the victory of Donald J. Trump on November 8, 2016, the neoliberal-neoconservative policy consensus that had crystallized in 1979–1980 (Deng Xiaoping’s visit to the United States, election of Reagan and Thatcher) finally came to an end […] The deliberate neglect of America’s creaking infrastructure assets (notably public transportation and water sanitation) from the early 1980s on eventually fueled a widespread popular discontent that came back to haunt both Hillary Clinton and the Republican establishment. Donald Trump was quick to seize on the issue to make a broader slap against the laissez-faire complacency of the federal government”.[149]

Others, such as Catherine Rottenberg, do not see Trump’s victory as an end to neoliberalism, but rather a new phase of it.[150]

Mark Arthur has written that the influence of neoliberalism has given rise to an “anti-corporatist” movement in opposition to it. This “anti-corporatist” movement is articulated around the need to re-claim the power that corporations and global institutions have stripped governments of. He says that Adam Smith’s “rules for mindful markets” served as a basis for the anti-corporate movement, “following government’s failure to restrain corporations from hurting or disturbing the happiness of the neighbor [Smith]”.[151]

Nicolas Firzli has argued that the neoliberal era was essentially defined by “the economic ideas of Milton Friedman, who wrote that ‘if anything is certain to destroy our free society, to undermine its very foundation, it would be a widespread acceptance by management of social responsibilities in some sense other than to make as much money as possible. This is a fundamentally subversive doctrine'”.[152] Firzli insists that prudent, fiduciary-driven long-term investors cannot ignore the environmental, social and corporate governance consequences of actions taken by the CEOs of the companies whose shares they hold as “the long-dominant Friedman stance is becoming culturally unacceptable and financially costly in the boardrooms of pension funds and industrial firms in Europe and North America”.[152]

Counterpoints to neoliberalism:

Instead of citizens, it produces consumers. Instead of communities, it produces shopping malls. The net result is an atomized society of disengaged individuals who feel demoralized and socially powerless.

Robert W. McChesney[44]:11

American scholar and cultural critic Henry Giroux alleges that neoliberalism holds that market forces should organize every facet of society, including economic and social life, and that it promotes a social Darwinist ethic which elevates self-interest over social needs.[163][164][165]

According to the economists Howell and Diallo, neoliberal policies have contributed to a United States economy in which 30% of workers earn low wages (less than two-thirds the median wage for full-time workers) and 35% of the labor force is underemployed as only 40% of the working-age population in the country is adequately employed.[166]
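To make the low-wage definition above concrete, here is a minimal sketch in Python of the two-thirds-of-median rule as the text describes it; the wage data and helper names below are made-up illustrations, not Howell and Diallo’s figures:

```python
# Illustrative computation of a low-wage threshold: two-thirds of the
# median wage for full-time workers, per the definition quoted above.
from statistics import median

def low_wage_threshold(full_time_wages: list[float]) -> float:
    """Two-thirds of the median full-time wage."""
    return (2 / 3) * median(full_time_wages)

def low_wage_share(wages: list[float], threshold: float) -> float:
    """Fraction of workers earning below the threshold."""
    return sum(w < threshold for w in wages) / len(wages)

# Hypothetical hourly wages, not US labor statistics:
full_time = [12.0, 15.0, 18.0, 22.0, 30.0, 45.0, 60.0]
threshold = low_wage_threshold(full_time)   # (2/3) * 22.0 ≈ 14.67
print(f"Low-wage threshold: {threshold:.2f}")
print(f"Low-wage share: {low_wage_share(full_time, threshold):.0%}")  # 14%
```

Under this rule, the 30% figure cited above means that nearly a third of US workers earned less than two-thirds of the full-time median wage.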

The Center for Economic and Policy Research’s (CEPR) Dean Baker (2006) argued that the driving force behind rising inequality in the United States has been a series of deliberate, neoliberal policy choices including anti-inflationary bias, anti-unionism and profiteering in the health industry.[167] However, countries have applied neoliberal policies at varying levels of intensity; for example, the OECD has calculated that only 6% of Swedish workers are beset with wages it considers low and that Swedish wages are overall lower.[168] Others argue that Sweden’s adoption of neoliberal reforms, in particular the privatization of public services and reduced state benefits, has resulted in income inequality growing faster in Sweden than in any other OECD nation.[169][170]

The rise of anti-austerity parties in Europe and SYRIZA’s victory in the Greek legislative elections of January 2015 have some proclaiming “the end of neoliberalism”.[171]

Kristen R. Ghodsee, ethnographer and Professor of Russian and East European Studies at the University of Pennsylvania, asserts that the triumphalist attitudes of Western powers at the end of the Cold War, and the fixation with linking all leftist political ideals with the excesses of Stalinism, permitted neoliberal, free market capitalism to fill the void, which undermined democratic institutions and reforms, leaving a trail of economic misery, unemployment and rising inequality throughout the former Eastern Bloc and much of the West in the following decades, and fueling the resurgence of extremist nationalisms in both.[172]

In Latin America, the “pink tide” that swept leftist governments into power at the turn of the millennium can be seen as a reaction against neoliberal hegemony and the notion that “there is no alternative” (TINA) to the Washington Consensus.[173]

Notable critics of neoliberalism in theory or practice include economists Joseph Stiglitz,[174] Amartya Sen, Michael Hudson,[175] Robert Pollin,[176] Julie Matthaei[177] and Richard D. Wolff;[158] linguist Noam Chomsky;[44] geographer and anthropologist David Harvey;[70] political activist and public intellectual Cornel West;[178] Marxist feminist Gail Dines;[179] author, activist and filmmaker Naomi Klein;[180] journalist and environmental activist George Monbiot;[181] Belgian psychologist Paul Verhaeghe;[182] journalist and activist Chris Hedges;[183] and the alter-globalization movement in general, including groups such as ATTAC. Critics of neoliberalism argue that not only is neoliberalism’s critique of socialism (as unfreedom) wrong, but neoliberalism cannot deliver the liberty that is supposed to be one of its strong points.

In protest against neoliberal globalization, South Korean farmer and former president of the Korean Advanced Farmers Federation Lee Kyung-hae committed suicide by stabbing himself in the heart during a meeting of the WTO in Cancún, Mexico in 2003. He was protesting against the decision of the South Korean government to reduce subsidies to farmers. Prior to his death, he expressed his concerns in broken English:[6]:96

My warning goes out to the all citizens that human beings are in an endangered situation that uncontrolled multinational corporations and a small number of bit WTO members officials are leading an undesirable globalization of inhuman, environment-distorting, farmer-killing, and undemocratic. It should be stopped immediately otherwise the failed logic of the neo-liberalism will perish the diversities of agriculture and disastrously to all human being.[6]:96[184]

Read the rest here:

Neoliberalism – Wikipedia

Atlas Shrugged: Part I (2011) – IMDb

Storyline

It was great to be alive, once, but the world was perishing. Factories were shutting down, transportation was grinding to a halt, granaries were empty - and key people who had once kept it running were disappearing all over the country. As the lights winked out and the cities went cold, nothing was left to anyone but misery. No one knew how to stop it, no one understood why it was happening - except one woman, the operating executive of a once mighty transcontinental railroad, who suspects the answer may rest with a remarkable invention and the man who created it - a man who once said he would stop the motor of the world. Everything now depends on finding him and discovering the answer to the question on the lips of everyone as they whisper it in fear: Who *is* John Galt? Written by Robb

Taglines: Who is John Galt?

Budget: $20,000,000 (estimated)

Opening Weekend USA: $1,686,347 (17 April 2011, Limited Release)

Gross USA: $4,752,353

Runtime: 97 min

Aspect Ratio: 2.35:1
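As a quick check on the box-office figures in this listing, here is a minimal sketch in Python using only the numbers quoted above; the ratios are plain arithmetic, not industry-adjusted returns:

```python
# Simple ratios over the quoted Atlas Shrugged: Part I figures (USD).
BUDGET = 20_000_000          # estimated budget, per the listing
OPENING_WEEKEND = 1_686_347  # US opening weekend, per the listing
GROSS_USA = 4_752_353        # total US gross, per the listing

print(f"US gross as a share of the estimated budget: {GROSS_USA / BUDGET:.1%}")           # 23.8%
print(f"Opening weekend as a share of the US gross:  {OPENING_WEEKEND / GROSS_USA:.1%}")  # 35.5%
```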

Read more from the original source:

Atlas Shrugged: Part I (2011) – IMDb

Atlas Shrugged: Part I – Wikipedia

Atlas Shrugged: Part I is a 2011 American political science fiction drama film directed by Paul Johansson. An adaptation of part of Ayn Rand’s controversial 1957 novel of the same name, the film is the first in a trilogy encompassing the entire book. After various treatments and proposals floundered for nearly 40 years,[4] investor John Aglialoro initiated production in June 2010. The film stars Taylor Schilling as Dagny Taggart and Grant Bowler as Hank Rearden.

The film begins the story of Atlas Shrugged, set in a dystopian United States where John Galt leads innovators, from industrialists to artists, in a capital strike, “stopping the motor of the world” to reassert the importance of the free use of one’s mind and of laissez-faire capitalism.[5]

A sequel, Atlas Shrugged: Part II, was released on October 12, 2012. The third part in the series, Atlas Shrugged Part III: Who Is John Galt?, was released on September 12, 2014.[6]

In 2016, the United States is in a sustained economic depression. Industrial disasters, resource shortages, and gasoline prices of $37 per gallon have made railroads the primary mode of transportation, but even they are in disrepair. After a major accident on the Rio Norte line of the Taggart Transcontinental railroad, CEO James Taggart shirks responsibility. His sister Dagny Taggart, Vice-President in Charge of Operation, defies him by replacing the aging track with new rails made of Rearden Metal, which is claimed to be lighter yet stronger than steel. Dagny, working for the sole purpose of fixing the line, lets James Taggart accept the acclaim for this major success. Dagny meets with the metal’s inventor, Hank Rearden, and they negotiate a deal they both admit serves their respective self-interests.

Politician Wesley Mouch, nominally Rearden’s lobbyist in Washington, D.C., is part of a crowd that views heads of industry as persons who must be broken or tamed. James Taggart uses political influence to ensure that Taggart Transcontinental is designated the exclusive railroad for the state of Colorado. Dagny is confronted by Ellis Wyatt, a Colorado oil man angry to be forced to do business with Taggart Transcontinental. Dagny promises him that he will get the service he needs. Dagny encounters former lover Francisco d’Anconia, who presents a façade of a playboy grown bored with the pursuit of money. He reveals that a series of copper mines he built are worthless, costing his investors (including the Taggart railroad) millions.

Rearden lives in a magnificent home with a wife and a brother who are happy to live off his effort, though they overtly disrespect it. Rearden’s anniversary gift to his wife Lillian is a bracelet made from the first batch of Rearden Metal, but she considers it a garish symbol of Hank’s egotism. At a dinner party, Dagny dares Lillian to exchange it for Dagny’s diamond necklace, which she does.

As Dagny and Rearden rebuild the Rio Norte line, talented people quit their jobs and refuse all inducements to stay. Meanwhile, Dr. Robert Stadler of the State Science Institute puts out a report implying that Rearden Metal is dangerous. Taggart Transcontinental stock plummets because of its use of Rearden Metal, and Dagny leaves Taggart Transcontinental temporarily and forms her own company to finish the Rio Norte line. She renames it the John Galt Line, in defiance of the phrase “Who is John Galt?”, which has come to stand for any question to which it is pointless to seek an answer.

A new law forces Rearden to sell most of his businesses, but he retains Rearden Steel for the sake of his metal and to finish the John Galt Line. Despite strong government and union opposition to Rearden Metal, Dagny and Rearden complete the line ahead of schedule and successfully test it on a record-setting run to Wyatt’s oil fields in Colorado. At the home of Wyatt, now a close friend, Dagny and Rearden celebrate the success of the line. As Dagny and Rearden continue their celebration into the night by fulfilling their growing sexual attraction, the shadowy figure responsible for the disappearances of prominent people visits Wyatt with an offer for a better society based on personal achievement.

The next morning, Dagny and Rearden begin investigating an abandoned prototype of an advanced motor that could revolutionize the world. They realize the genius of the motor’s creator and try to track him down. Dagny finds Dr. Hugh Akston, working as a cook at a diner, but he is not willing to reveal the identity of the inventor; Akston knows whom Dagny is seeking and says she will never find him, though he may find her.

Another new law limits rail freight and levies a special tax on Colorado. It is the final straw for Ellis Wyatt. When Dagny hears that Wyatt’s oil fields are on fire, she rushes to the scene of the fire where she finds a handwritten sign nailed to the wall that reads “I am leaving it as I found it. Take over. It’s yours.”

Wyatt declares in an answering machine message that he is “on strike”.

In 1972, Albert S. Ruddy approached Rand to produce a cinematic adaptation of Atlas Shrugged. Rand agreed that Ruddy could focus on the love story. “That’s all it ever was,” Rand said.[9][10][11] Rand insisted on having final script approval, which Ruddy refused to give her, thus preventing a deal. In 1978, Henry and Michael Jaffe negotiated a deal for an eight-hour Atlas Shrugged television miniseries on NBC. Jaffe hired screenwriter Stirling Silliphant to adapt the novel and he obtained approval from Rand on the final script. However, in 1979, with Fred Silverman’s rise as president of NBC, the project was scrapped.[12]

Rand, a former Hollywood screenwriter herself, began writing her own screenplay, but died in 1982 with only one third of it finished. She left her estate, including the film rights to Atlas Shrugged, to her student Leonard Peikoff, who sold an option to Michael Jaffe and Ed Snider. Peikoff would not approve the script they wrote and the deal fell through. In 1992, investor John Aglialoro bought an option to produce the film, paying Peikoff over $1 million for full creative control.[12]

In 1999, under Aglialoro’s sponsorship, Ruddy negotiated a deal with Turner Network Television for a four-hour miniseries, but the project was killed after the AOL Time Warner merger. After the TNT deal fell through, Howard and Karen Baldwin, while running Phillip Anschutz’s Crusader Entertainment, obtained the rights. The Baldwins left Crusader, taking the rights to Atlas Shrugged with them, and formed Baldwin Entertainment Group in 2004. Michael Burns of Lions Gate Entertainment approached the Baldwins to fund and distribute Atlas Shrugged.[12] A two-part draft screenplay written by James V. Hart[13] was re-written into a 127-page screenplay by Randall Wallace, with Vadim Perelman expected to direct.[14] Potential cast members for this production had included Angelina Jolie,[15] Charlize Theron,[16] Julia Roberts,[16] and Anne Hathaway.[16] Between 2009 and 2010, however, these deals came apart, including the studio backing from Lions Gate, so none of the stars mentioned above appear in the final film; Wallace did not write the final screenplay, and Perelman did not direct.[1][17] Aglialoro says producers spent “something in the $20 million range” on the project over the preceding 18 years.[2]

In May 2010, Brian Patrick O’Toole and Aglialoro wrote a screenplay, intent on filming in June 2010. While initial rumors claimed that the films would have a “timeless” setting – the producers say Rand envisioned the story as occurring “the day after tomorrow”[18] – the released film is set in late 2016. The writers were mindful of the desire of some fans for fidelity to the novel,[18] but gave some characters, such as Eddie Willers, short shrift and omitted others, such as the composer Richard Halley. The film is styled as a mystery, with black-and-white freeze frames as each innovator goes “missing”. However, Galt appears and speaks in the film, solving the mystery more clearly than in the first third of the novel.

Though director Johansson had been reported as playing the pivotal role of John Galt, he made it clear in an interview that with regard to who is John Galt in the film, the answer was, “Not me.”[7] He explained that his portrayal of the character would be limited to the first film as a silhouetted figure wearing a trenchcoat and fedora,[8] suggesting that another actor will be cast as Galt for the subsequent parts of the trilogy.

Though Stephen Polk was initially set to direct,[19] he was replaced by Paul Johansson nine days before filming was scheduled to begin. With the 18-year-long option on the film rights set to expire on June 15, 2010, producers Harmon Kaslow and Aglialoro began principal photography on June 13, 2010, thus allowing Aglialoro to retain the motion picture rights. Shooting took five weeks, and Aglialoro says the total production cost came in at around US$10 million,[20] though Box Office Mojo lists the production cost as $20 million.[3]

Elia Cmiral composed the score for the film.[21] Peter Debruge wrote in Variety that “More ambitious sound design and score, rather than the low-key filler from composer Elia Cmiral and music supervisor Steve Weisberg, might have significantly boosted the pic’s limited scale.”[22]


The film had a very low marketing budget and was not marketed through conventional methods.[24] Prior to the film’s release on the politically symbolic date of Tax Day, the project was promoted throughout the Tea Party movement and affiliated organizations such as FreedomWorks.[23] The National Journal reported that FreedomWorks, the Tea Party-allied group headed by former House Majority Leader Dick Armey (R-Texas), had been trying to get the movie opened in more theaters.[23] FreedomWorks also helped unveil the Atlas Shrugged movie trailer at the February 2011 Conservative Political Action Conference.[23] Additionally, it was reported that Tea Party groups across the country were plugging the movie trailer on their websites and Facebook pages.[23] Release of the movie was also covered and promoted by Fox News TV personalities John Stossel and Sean Hannity.[25][26]

The U.S. release of Atlas Shrugged: Part I opened on 300 screens on April 15, 2011, and made US$1,676,917 in its opening weekend, finishing in 14th place overall.[27] Producers announced expansion to 423 theaters several days after release and promised 1,000 theaters by the end of April,[28] but the release peaked at 465 screens. Ticket sales dropped off significantly in its second week of release, despite the addition of 165 screens; after six weeks, the film was showing on only 32 screens and total ticket sales had not crossed the $5 million mark, recouping less than a quarter of the production budget.[29]

Atlas Shrugged: Part I was released on DVD and Blu-ray Disc on November 8, 2011 by 20th Century Fox Home Entertainment.[30] More than 100,000 DVD inserts were recalled within days due to the jacket’s philosophically incorrect description of “Ayn Rand’s timeless novel of courage and self-sacrifice”.[31] As of April 2013, 247,044 DVDs had been sold, grossing $3,433,445.[32]

The film received overwhelmingly negative reviews. Rotten Tomatoes gives the film a score of 11% based on 47 reviews, with an average score of 3.6 out of 10. The site’s consensus was: “Passionate ideologues may find it compelling, but most filmgoers will find this low-budget adaptation of the Ayn Rand bestseller decidedly lacking.”[33] Metacritic gives the film a “generally unfavorable” rating of 28%, as determined by averaging 19 professional reviews.[34] Some commentators noted a gap between film critics’ reactions and audience members’ reactions; audiences gave the film high scores even before it was released.[35][36][37]

Let’s say you know the novel, you agree with Ayn Rand, you’re an objectivist or a libertarian, and you’ve been waiting eagerly for this movie. Man, are you going to get a letdown. It’s not enough that a movie agree with you, in however an incoherent and murky fashion. It would help if it were like, you know, entertaining?

Roger Ebert, Chicago Sun-Times, April 14, 2011[1]

Roger Ebert of the Chicago Sun-Times gave the film only one star, calling it “the most anticlimactic non-event since Geraldo Rivera broke into Al Capone’s vault.”[1] Columnist Cathy Young of The Boston Globe gave the film a negative review.[38] Chicago Tribune published a predominantly negative review, arguing that the film lacks Rand’s philosophical theme, while at the same time saying “the actors, none of them big names, are well-suited to the roles. The story has drive, color and mystery. It looks good on the screen.”[39] In the New York Post, Kyle Smith gave the film a mostly negative review, grading it at 2.5/4 stars, criticizing its “stilted dialogue and stern, unironic hectoring” and calling it “stiff in the joints”, but also adding that it “nevertheless contains a fire and a fury that makes it more compelling than the average mass-produced studio item.”[40]

Reviews in the conservative press were more mixed. American economist Mark Skousen praised the film, writing in Human Events, “The script is true to the philosophy of Ayn Rand’s novel.”[41] The Weekly Standard senior editor Fred Barnes noted that the film “gets Rand’s point across forcefully without too much pounding”, that it is “fast-paced” when compared with the original novel’s 1200-page length, and that it is “at least as relevant today as it was when the novel was published in 1957.”[42] Jack Hunter, contributing editor to The American Conservative, wrote, “If you ask the average film critic about the new movie adaptation of Ayn Rand’s Atlas Shrugged they will tell you it is a horrible movie. If you ask the average conservative or libertarian they will tell you it is a great movie. Objectively, it is a mediocre movie at best. Subjectively, it is one of the best mediocre movies you’ll ever see.”[43] In the National Post, Peter Foster credited the movie for the daunting job of fidelity to the novel, wryly suggested a plot rewrite along the lines of comparable current events, and concluded, “if it sinks without trace, its backers should at least be proud that they lost their own money.”[44]

The poor critical reception of Atlas Shrugged: Part I initially made Aglialoro reconsider his plans for the rest of the trilogy.[45] In an interview with The Hollywood Reporter, he said he was continuing with plans to produce Part II and Part III for release on April 15 in 2012 and 2013, respectively.[46] In a later interview with The Boston Globe, Aglialoro was ambivalent: “I learned something long ago playing poker. If you think you’re beat[en], don’t go all in. If Part 1 makes [enough of] a return to support Part 2, I’ll do it. Other than that, I’ll throw the hand in.”[47]

In July 2011, Aglialoro planned to start production of Atlas Shrugged: Part II in September, with its release timed to coincide with the 2012 U.S. elections.[48] In October 2011, producer Harmon Kaslow stated that he hoped filming for Part II would begin in early 2012, “with hopes of previewing it around the time of the nominating conventions”. Kaslow anticipated that the film, which would encompass the second third of Atlas Shrugged, would “probably be 30 to 40 minutes longer than the first movie.” Kaslow also stated his intent that Part II would have a bigger production budget, as well as a larger advertising budget.[49]

On February 2, 2012, Kaslow and Aglialoro, the producers of Atlas Shrugged: Part II, announced a start date for principal photography in April 2012 with a release date of October 12, 2012.[50] Joining the production team was Duncan Scott, who, in 1986, was responsible for creating a new, re-edited version with English subtitles of the 1942 Italian film adaptation of We the Living. The first film’s entire cast was replaced for the sequel.

The sequel film, Atlas Shrugged: Part II, was released on October 12, 2012.[51] Critics gave the film a 5% rating on Rotten Tomatoes based on 22 reviews.[52] One reviewer gave the film a “D” rating,[53] while another reviewer gave the film a “1” rating (of 4).[54] In naming Part II to its list of 2012’s worst films, The A.V. Club said “The irony of Part II’s mere existence is rich enough: The free market is a religion for Rand acolytes, and it emphatically rejected Part I.”[55]

Visit link:

Atlas Shrugged: Part I – Wikipedia

Atlas Shrugged: (Centennial Edition) by Ayn Rand …

INTRODUCTION by Leonard Peikoff

Ayn Rand is one of America’s favorite authors. In a recent Library of Congress/Book of the Month Club survey, American readers ranked Atlas Shrugged – her masterwork – as second only to the Bible in its influence on their lives. For decades, at scores of college campuses around the country, students have formed clubs to discuss the works of Ayn Rand. In 1998, the Oscar-nominated Ayn Rand: A Sense of Life, a documentary film about her life, played to sold-out venues throughout America and Canada. In recognition of her enduring popularity, the United States Postal Service in 1999 issued an Ayn Rand stamp.

Every book by Ayn Rand published in her lifetime is still in print, and hundreds of thousands of copies of them are sold every year, so far totaling more than twenty million. Why?

Ayn Rand understood, all the way down to fundamentals, why man needs the unique form of nourishment that is literature. And she provided a banquet that was at once intellectual and thrilling.

The major novels of Ayn Rand contain superlative values that are unique in our age. Atlas Shrugged (1957) and The Fountainhead (1943) offer profound and original philosophic themes, expressed in logical, dramatic plot structures. They portray an uplifted vision of man, in the form of protagonists characterized by strength, purposefulness, integrity – heroes who are not only idealists, but happy idealists, self-confident, serene, at home on earth. (See synopses later in this guide.)

Ayn Rand’s first novel, We the Living (1936), set in the post-revolutionary Soviet Union, is an indictment not merely of Soviet-style Communism, but of any and every totalitarian state that claims the right to sacrifice the supreme value of an individual human life.

Anthem (1946), a prose poem set in the future, tells of one man’s rebellion against an utterly collectivized world, a world in which joyless, selfless men are permitted to exist only for the sake of serving the group. Written in 1937, Anthem was first published in England; it was refused publication in America until 1946, for reasons the reader can discover by reading it for himself.

Ayn Rand wrote in a highly calculated literary style intent on achieving precision and luminous clarity, yet that style is at the same time colorful, sensuously evocative, and passionate. Her exalted vision of man and her philosophy for living on earth, Objectivism, have changed the lives of tens of thousands of readers and launched a major philosophic movement with a growing impact on American culture.

You are invited to sit down to the banquet which is Ayn Rand’s novels. I hope you personally enjoy them as much as I did.

About the Books

Atlas Shrugged (1957) is a mystery story, Ayn Rand once commented, “not about the murder of man’s body, but about the murder – and rebirth – of man’s spirit.” It is the story of a man – the novel’s hero – who says that he will stop the motor of the world, and does. The deterioration of the U.S. accelerates as the story progresses. Factories, farms, shops shut down or go bankrupt in ever larger numbers. Riots break out as food supplies become scarce. Is he, then, a destroyer or the greatest of liberators? Why does he have to fight his battle, not against his enemies but against those who need him most, including the woman, Dagny Taggart, a top railroad executive, whom he passionately loves? What is the world’s motor – and the motive power of every man?

Peopled by larger-than-life heroes and villains, and charged with awesome questions of good and evil, Atlas Shrugged is a novel of tremendous scope. It presents an astounding panorama of human life – from the productive genius who becomes a worthless playboy (Francisco d’Anconia), to the great steel industrialist who does not know that he is working for his own destruction (Hank Rearden), to the philosopher who becomes a pirate (Ragnar Danneskjöld), to the composer who gives up his career on the night of his triumph (Richard Halley). Dramatizing Ayn Rand’s complete philosophy, Atlas Shrugged is an intellectual revolution told in the form of an action thriller of violent events – and with a ruthlessly brilliant plot and irresistible suspense.

We do not want to spoil the plot by giving away its secret or its deeper meaning, so as a hint only we will quote here one brief exchange from the novel:

“If you saw Atlas, the giant who holds the world on his shoulders, if you saw that he stood, blood running down his chest, his knees buckling, his arms trembling but still trying to hold the world aloft with the last of his strength, and the greater the effort the heavier the world bore down upon his shoulders – what would you tell him to do?”

“I … don’t know. What … could he do? What would you tell him?”

“To shrug.”

The Fountainhead (1943) introduced the world to architect Howard Roark, an intransigent, egoistic hero of colossal stature. A man whose arrogant pride in his work is fully earned, Roark is an innovator who battles against a tradition-worshipping society. Expelled from a prestigious architectural school, refused work, reduced to laboring in a granite quarry, Roark is never stopped. He has to withstand not merely professional rejection, but also the enmity of Ellsworth Toohey, leading humanitarian; of Gail Wynand, powerful publisher; and of Dominique Francon, the beautiful columnist who loves him fervently yet, for reasons you will discover, is bent on destroying his career.

At the climax of the novel, the untalented but successful architect Peter Keating, a college friend of his, pleads with Roark for help in designing a prestigious project that Roark himself wanted but was too unpopular to win. Roark agrees to design the project secretly on condition that it be built strictly according to his drawings. During construction, however, Roark’s building is thoroughly mutilated. Having no recourse in law, Roark takes matters into his own hands in a famous act of dynamiting. In the process and the subsequent courtroom trial, he makes his stand clear, risking his career, his love, and his life.

The Fountainhead portrays individualism versus collectivism, not in politics, but in man’s soul; it presents the motivations and the basic premises that produce the character of an individualist or a collectivist.

The novel was made into a motion picture in 1949, starring Gary Cooper and Patricia Neal, for which Ayn Rand wrote the screenplay. The movie, available on video, often plays on cable TV and at art-house cinemas, where it is always received enthusiastically.

We the Living (1936), Ayn Rand’s first and most autobiographical novel, is a haunting account of men’s struggle for survival in the post-revolutionary Soviet Union. In a country where people fear being thought disloyal to the Communist state, three individuals stand forth with the mark of the unconquered in their being: Kira, who wants to become a builder, and the two men who love her – Leo, an aristocrat, and Andrei, an idealistic Communist.

When Leo becomes ill with tuberculosis, Kira strives to get him the medical attention needed to save his life. But she is trapped in a society that regards the individual as expendable. No matter where she turns, she faces closed doors and refusals. The State tells her: “One hundred thousand workers died in the civil war. Why – in the face of the Union of Socialist Soviet Republics – can’t one aristocrat die?”

Kira’s love for Leo is such that the price of saving his life is no object. To pay for sending him to a sanitarium, she becomes the mistress of Andrei Taganov – who is not only an idealist, but also an officer of the Soviet secret police. The gripping and poignant resolution of the love triangle is an indictment not merely of Soviet-style Communism, but of the totalitarian state as such.

During World War II, an Italian film of We the Living was produced without Ayn Rand’s knowledge. Largely faithful to the book, the film was approved by Italy’s Fascist government on the grounds that it was anti-communist. But the Italian public understood that the movie was just as anti-fascist as it was anti-communist. People grasped Ayn Rand’s theme that dictatorship as such is evil, and embraced the movie. Five months after its release, Mussolini’s government figured out what everyone else knew, and banned the movie. This is eloquent proof of Ayn Rand’s claim that the book is not merely “about Soviet Russia.”

After the war, the movie was re-edited under Ayn Rand’s supervision. The movie is still played at art-house cinemas, and is now available on videotape.

Anthem (1946), a novelette in the form of a prose poem, depicts a grim world of the future that is totally collectivized. Technologically primitive, it is a world in which candles are the very latest advance. From birth to death, men’s lives are directed for them by the State. At Palaces of Mating, the State enacts its eugenics program; once born and schooled, people are assigned jobs they dare not refuse, toiling in the fields until they are consigned to the Home of the Useless.

This is a world in which men live and die for the sake of the State. The State is all, the individual is nothing. It is a world in which the word “I” has vanished from the language, replaced by “We.” For the sin of speaking the unspeakable “I,” men are put to death.

Equality 7-2521, however, rebels.

Though assigned to the life work of street sweeper by the rulers who resent his brilliant, inquisitive mind, he secretly becomes a scientist. Enduring the threat of torture and imprisonment, he continues in his quest for knowledge and ultimately rediscovers electric light. But when he shares it with the Council of Scholars, he is denounced for the sin of thinking what no other men think. He runs for his life, escaping to the uncharted forest beyond the city’s edge. There, with his beloved, he begins a more intense sequence of discoveries, both personal and intellectual, that help him break free from the collectivist State’s brutal morality of sacrifice. He learns that man’s greatest moral duty is the pursuit of his own happiness. He discovers and speaks the sacred word: I.

Anthem’s theme is the meaning and glory of man’s ego.

About Objectivism

Ayn Rand held that philosophy was not a luxury for the few, but a life-and-death necessity of everyone’s survival. She described Objectivism, the intellectual framework of her novels, as a philosophy for living on earth. Rejecting all forms of supernaturalism and religion, Objectivism holds that Reality, the world of nature, exists as an objective absolute – facts are facts, independent of man’s feelings, wishes, hopes, or fears; in short, “wishing won’t make it so.” Further, Ayn Rand held that Reason – the faculty that identifies and integrates the material provided by man’s senses – is man’s only source of knowledge, both of facts and of values. Reason is man’s only guide to action, and his basic means of survival. Hence her rejection of all forms of mysticism, such as intuition, instinct, revelation, etc.

On the question of good and evil, Objectivism advocates a scientific code of morality: the morality of rational self-interest, which holds Man’s Life as the standard of moral value. The good is that which sustains Man’s Life; the evil is that which destroys it. Rationality, therefore, is man’s primary virtue. Each man should live by his own mind and for his own sake, neither sacrificing himself to others nor others to himself. Man is an end in himself. His own happiness, achieved by his own work and trade, is each man’s highest moral purpose.

In politics, as a consequence, Objectivism upholds not the welfare state, but laissez-faire capitalism (the complete separation of state and economics) as the only social system consistent with the requirements of Man’s Life. The proper function of government is the original American system: to protect each individual’s inalienable rights to life, liberty, property, and the pursuit of happiness.

Objectivism defines “art” as the re-creation of reality according to an artist’s metaphysical value-judgments. The greatest school in art history, it holds, is Romanticism, whose art represents things not as they are, but as they might be and ought to be.

The fundamentals of Objectivism are set forth in many nonfiction books, including For the New Intellectual; The Virtue of Selfishness; Capitalism: The Unknown Ideal; Return of the Primitive: The Anti-Industrial Revolution; Philosophy: Who Needs It; and The Romantic Manifesto. Objectivism: The Philosophy of Ayn Rand, written by Ayn Rand’s intellectual heir Leonard Peikoff and published in 1991, is the definitive presentation of her entire system of philosophy.

ABOUT AYN RAND

Ayn Rand was born in St. Petersburg, Russia, on February 2, 1905. At the age of nine, she decided to make fiction-writing her career. In late 1925 she obtained permission to leave the USSR for a visit to relatives in the United States. Arriving in New York in February 1926, she first spent six months with her relatives in Chicago before moving to Los Angeles.

On her second day in Hollywood, the famous director Cecil B. De Mille noticed her standing at the gate of his studio, offered her a ride to the set of his silent movie The King of Kings, and gave her a job, first as an extra and later as a script reader. During the next week at the studio, she met an actor, Frank O’Connor, whom she married in 1929; they were happily married until his death fifty years later.

After struggling for several years at various menial jobs, including one in the wardrobe department at RKO, she sold her first screenplay, “Red Pawn,” to Universal Studios in 1932 and then saw her first play, Night of January 16th, produced in Hollywood and (in 1935) on Broadway. In 1936, her first novel, We the Living, was published.

She began writing The Fountainhead in 1935. In the character of Howard Roark, she presented for the first time the Ayn Rand hero, whose depiction was the chief goal of her writing: the ideal man, man as “he could be and ought to be.” The Fountainhead was rejected by a dozen publishers but finally accepted by Bobbs-Merrill; it came out in 1943. The novel made publishing history by becoming a best-seller within two years purely through word of mouth; it gained lasting recognition for Ayn Rand as a champion of individualism.

Atlas Shrugged (1957) was her greatest achievement and last work of fiction. In this novel she dramatizes her unique philosophy of Objectivism in an intellectual mystery story that integrates ethics, metaphysics, epistemology, politics, economics, and sex. Although she considered herself primarily a fiction writer, she realized early that in order to create heroic characters, she had to identify the philosophic principles which make such people possible. She proceeded to develop a “philosophy for living on earth.” Objectivism has now gained a worldwide audience and is an ever growing presence in American culture. Her novels continue to sell in enormous numbers every year, proving themselves enduring classics of literature.

Ayn Rand died on March 6, 1982, at her home in New York City.

Recollections of Ayn Rand – A Conversation with Leonard Peikoff, Ph.D., Ayn Rand’s longtime associate and intellectual heir

Dr. Peikoff, you met Miss Rand when you were seventeen and were associated with her until her death, thirty-one years later. What were your first impressions of her? What was she like?

The strongest first impression I had of her was her passion for ideas. Ayn Rand was unlike anyone I had ever imagined. Her mind was utterly first-handed: she said what no one else had ever said or probably ever thought, but she said these things so logically – so simply, factually, persuasively – that they seemed to be self-evident. She radiated the kind of intensity that one could imagine changing the course of history. Her brilliantly perceptive eyes looked straight at you and missed nothing; neither did her methodical, painstaking, virtually scientific replies to my questions miss anything. She made me think for the first time that thinking is important. I said to myself after I left her home: “All of life will be different now. If she exists, everything is possible.”

In her fiction, Ayn Rand presented larger-than-life heroes – embodiments of her philosophy of rational egoism – that have inspired countless readers over the years. Was Ayn Rand’s own life like that of her characters? Did she practice her own ideals?

Yes, always. From the age of nine, when she decided on a career as a writer, everything she did was integrated toward her creative purpose. As with Howard Roark, dedication to thought and thus to her work was the root of Ayn Rand’s person.

In every aspect of life, she once told me, a man should have favorites. He should define what he likes or wants most and why, and then proceed to get it. She always did just that – fleeing the Soviet dictatorship for America, tripping her future husband on a movie set to get him to notice her, ransacking ancient record shops to unearth some lost treasure, even decorating her apartment with an abundance of her favorite color, blue-green.

Given her radical views in morality and politics, did she ever soften or compromise her message?

Never. She took on the whole world – liberals, conservatives, communists, religionists, Babbitts and avant-garde alike – but opposition had no power to sway her from her convictions.

I never saw her adapting her personality or viewpoint to please another individual. She was always the same and always herself, whether she was talking with me alone, or attending a cocktail party of celebrities, or being cheered or booed by a hall full of college students, or being interviewed on national television.

Couldn’t she have profited by toning things down a little?

She could never be tempted to betray her convictions. A Texas oil man once offered her up to a million dollars to use in spreading her philosophy, if she would only add a religious element to it to make it more popular. She threw his proposal into the wastebasket. “What would I do with his money,” she asked me indignantly, “if I have to give up my mind in order to get it?”

Her integrity was the result of her method of thinking and her conviction that ideas really matter. She knew too clearly how she had reached her ideas, why they were true, and what their opposites were doing to mankind.

Who are some writers that Ayn Rand respected and enjoyed reading?

She did not care for most contemporary writers. Her favorites were the nineteenth century Romantic novelists. Above all, she admired Victor Hugo, though she often disagreed with his explicit views. She liked Dostoevsky for his superb mastery of plot structure and characterization, although she had no patience for his religiosity. In popular literature, she read all of Agatha Christie twice, and also liked the early novels of Mickey Spillane.

In addition to writing best-sellers, Ayn Rand originated a distinctive philosophy of reason. If someone wants to get an insight into her intellectual and creative development, what would you suggest?

A reader ought first to read her novels and main nonfiction in order to understand her views and values. Then, to trace her early literary development, a reader could pick up The Early Ayn Rand, a volume I edited after her death. It features a selection of short stories and plays that she wrote while mastering English and the art of fiction-writing. For a glimpse of her lifelong intellectual development, I would recommend the recent book Journals of Ayn Rand, edited by David Harriman.

Ayn Rand’s life was punctuated by disappointments with people, frustration, and early poverty. Was she embittered? Did she achieve happiness in her own life?

She did achieve happiness. Whatever her disappointments or frustrations, they went down, as she said about Roark, only to a certain point. Beneath it was her self-esteem, her values, and her conviction that happiness, not pain, is what matters. I remember a spring day in 1957. She and I were walking up Madison Avenue in New York toward the office of Random House, which was in the process of bringing out Atlas Shrugged. She was looking at the city she had always loved most, and now, after decades of rejection, she had seen the top publishers in that city competing for what she knew, triumphantly, was her masterpiece. She turned to me suddenly and said: “Don’t ever give up what you want in life. The struggle is worth it.” I never forgot that. I can still see the look of quiet radiance on her face.

DISCUSSION QUESTIONS

The Fountainhead

We the Living

Anthem

a) “It is a sin to write this. It is a sin to think words no others think.”

b) “I wished to know the meaning of things. I am the meaning.”

c) “I owe nothing to my brothers, nor do I gather debts from them.”

Objectivism

Related Titles

Fiction in Paperback
Anthem (New York: Signet, 1961).
Atlas Shrugged (New York: Signet, 1959).
The Fountainhead (New York: Signet, 25th anniv. ed., 1968).
Night of January 16th (New York: Plume, 1987).
We the Living (New York: Signet, 1960).

Nonfiction in Paperback
Capitalism: The Unknown Ideal (New York: Signet, 1967).
The Early Ayn Rand: A Selection from Her Unpublished Fiction (New York: Signet, 1986).
For the New Intellectual (New York: Signet, 1963).
Philosophy: Who Needs It (New York: Signet, 1964).
Return of the Primitive: The Anti-Industrial Revolution (New York: Meridian, 1999).
The Romantic Manifesto (New York: Signet, 2nd rev. ed., 1971).
The Virtue of Selfishness (New York: Signet, 1984).

On Ayn Rand and Objectivism
The Ayn Rand Reader, edited by Gary Hull and Leonard Peikoff (New York: Plume, 1999).
Journals of Ayn Rand, edited by David Harriman (New York: Dutton, 1997).
Objectivism: The Philosophy of Ayn Rand, by Leonard Peikoff (New York: Meridian, 1993).

See more here:

Atlas Shrugged: (Centennial Edition) by Ayn Rand …

Atlas Shrugged Movie (Official Site)

12.13.17

The Atlas Society warns us all of some very serious diseases: STDs (Socially Transmitted Diseases).

11.24.17

Midas Mulligan’s Black Friday Sale is back for its 7th year!

10.14.17

Read a speech by Atlas Shrugged Producer John Aglialoro.

09.28.17

The Atlas Society founder, Dr. David Kelley, is retiring.

08.31.17

Anthem is being made into a graphic novel!

07.21.17

John Aglialoro provides an update on the Atlas Shrugged mini-series.

06.15.17

Galt’s Gulch members save $100 on FreedomFest 2017!

05.25.17

Celebrate Throwback Thursday with Dr. David Kelley’s lecture, “The Primacy of Existence.”

05.04.17

Atlas Distribution to release the final Robin Williams film.

04.18.17

WATCH: Ayn Rand on taxes and the economy.

03.30.17

Test screen “Little Pink House” before it hits theaters!

03.23.17

Atlas Shrugged Mystery Boxes are back!

03.09.17

Vote for the most influential libertarian.

02.16.17

Watch the new Dagny Taggart Draw My Life video.

02.02.17

TONIGHT: Watch Atlas Shrugged Part 1 for free.

01.26.17

Celebrate Ayn Rand’s birthday by watching Atlas Shrugged Part 1 for free.

12.31.16

Celebrate another year in Galt’s Gulch Online.

12.07.16

Another chance to snag an Atlas Shrugged Mystery Box!

11.25.16

The 6th annual Midas Mulligan’s Black Friday Gulch Sale is happening now!

11.18.16

Introducing the Atlas Shrugged Mystery Box.

10.21.16

Voting for the Atlas Art Contest is happening now.

10.06.16

WATCH: Matt Kibbe shows that the zombie apocalypse is real.

09.15.16

Weigh in on the new Atlas Shrugged movie.

09.01.16

Celebrate Atlas Shrugged Day with a NEW limited edition collector’s item.

Go here to read the rest:

Atlas Shrugged Movie (Official Site)

