Monthly Archives: July 2016

History of Evolution | Internet Encyclopedia of Philosophy

Posted: July 12, 2016 at 6:25 am

The word "evolution" in its broadest sense refers to change or growth that occurs in a particular order. Although this broad version of the term would include astronomical evolution and the evolution of computer design, this article focuses on the evolution of biological organisms. That use of the term dates back to the ancient Greeks, but today the word is more often used to refer to Darwin's theory of evolution by natural selection. This theory is sometimes crudely referred to as the theory of "survival of the fittest." It was proposed by CharlesDarwin in On the Origin of Species in 1859 and, independently, by Alfred Wallace in 1858although Wallace, unlike Darwin, said the human soul is not the product of evolution.

Greek and medieval references to "evolution" use it as a descriptive term for a state of nature, in which everything in nature has a certain order or purpose. This is a teleological view of nature. For example, Aristotle classified all living organisms hierarchically in his great scala naturae or Great Chain of Being, with plants at the bottom, moving through lesser animals, and on to humans at the pinnacle of creation, each becoming progressively more perfect in form. It was the medieval philosophers, such as Augustine, who began to incorporate teleological views of nature with religion: God is the designer of all creatures, and everything has a purpose and a place as ordained by Him.

In current times, to some, the terms "evolution" and "God" may look like unlikely bedfellows (see the discussion on teleology). This is due primarily to today's rejection by biologists of a teleological view of evolution in favor of a more mechanistic one. The process of rejection is commonly considered to have begun with Descartes and to have culminated in Darwin's theory of evolution by natural selection.

Fundamental to natural selection is the idea of change by common descent. This implies that all living organisms are related to each other; for any two species, if we look back far enough we will find that they are descended from a common ancestor. This is a radically different view from Aristotle's Great Chain of Being, in which each species is formed individually with its own purpose and place in nature and where no species evolves into a new species. Evolution by natural selection is a purely mechanistic theory of change that does not appeal to any sense of purpose or a designer. There is no foresight or purpose in nature, and there is no implication that one species is more perfect than another. There is only change driven by selection pressures from the environment. Although the modern theory of biological evolution by natural selection is well accepted among professional biologists, there is still controversy about whether natural selection selects for fit genes or fit organisms or fit species.

Evolution by natural selection is a theory about the process of change. Although Darwin's original theory did not specify that genes account for an organism's heritable traits, that is now universally accepted among modern evolutionists. In a given population, natural selection occurs when genetically based traits that promote survival in one's environment are passed on to future generations and become more frequent in later generations. Organisms develop different survival- and reproduction-enhancing traits in response to their different environments (with abundance or shortage of food, presence or absence of predators, and so forth) and, given enough time and environmental changes, these small changes can accumulate to form a whole new species. Thus for Darwin there is no sharp distinction between a new variation and a new species. This theory accounts for the diversity of Earth's organisms better than theological design theories or competing scientific theories such as Lamarck's theory that an organism can pass on to its offspring characteristics that it acquired during its lifetime.

Evolution by natural selection works on three principles: variation (within a given generation there will be variation in traits, some that aid survival and reproduction and some that don't, and some that have a genetic basis and some that don't); competition (there will be limited resources that individuals must compete for, and traits that aid survival and reproduction will help in competition); and heritability (only traits that aid survival and reproduction and have a genetic basis can be passed on to future generations).
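These three principles can be made concrete with a toy simulation. The sketch below is purely illustrative and not from the article; the population size, trait values, and mutation size are arbitrary assumptions. Each individual carries a heritable numeric trait, individuals compete for a fixed number of slots, and offspring inherit their parent's trait with small random variation.

```python
import random

# Illustrative sketch of variation, competition, and heritability.
# All numbers here are arbitrary assumptions, not from the article.

POP_SIZE = 100       # competition: the environment supports only 100 individuals
MUTATION_SD = 0.05   # variation: offspring deviate slightly from the parent

def next_generation(population):
    """One round of competition, heritability, and variation."""
    # Competition: individuals with higher trait values out-compete the rest.
    survivors = sorted(population, reverse=True)[:POP_SIZE // 2]
    # Heritability plus variation: each survivor leaves two offspring whose
    # trait is the parent's value plus a small random deviation.
    return [parent + random.gauss(0, MUTATION_SD)
            for parent in survivors
            for _ in range(2)]

population = [random.gauss(1.0, 0.1) for _ in range(POP_SIZE)]
for _ in range(50):
    population = next_generation(population)

print(f"mean trait after 50 generations: {sum(population) / len(population):.3f}")
```

Deleting either the random deviation (variation) or the survivor cut (competition) makes the mean trait stop climbing, mirroring the claim that natural selection depends on all three principles together.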

Evolution is not so much a modern discovery as some of its advocates would have us believe. It made its appearance early in Greek philosophy, and maintained its position more or less, with the most diverse modifications, and frequently confused with the idea of emanation, until the close of ancient thought. The Greeks had, it is true, no term exactly equivalent to "evolution"; but when Thales asserts that all things originated from water; when Anaximenes calls air the principle of all things, regarding the subsequent process as a thinning or thickening, they must have considered individual beings and the phenomenal world as a result of evolution, even if they did not carry the process out in detail. Anaximander is often regarded as a precursor of the modern theory of development. He deduces living beings, in a gradual development, from moisture under the influence of warmth, and suggests the view that men originated from animals of another sort, since if they had come into existence as human beings, needing fostering care for a long time, they would not have been able to maintain their existence. In Empedocles, as in Epicurus and Lucretius, who follow in his footsteps, there are rudimentary suggestions of the Darwinian theory in its broader sense; and here too, as with Darwin, the mechanical principle comes in; the process is adapted to a certain end by a sort of natural selection, without regarding nature as deliberately forming its results for these ends.

If the mechanical view is to be found in these philosophers, the teleological occurs in Heraclitus, who conceives the process as a rational development, in accordance with the Logos, and names steps of the process, as from igneous air to water, and thence to earth. The Stoics followed Heraclitus in the main lines of their physics. The primal principle is, as with him, igneous air, only that this is named God by them with much greater definiteness. The Godhead has life in itself, and develops into the universe, differentiating primarily into two kinds of elements: the finer or active, and the coarser or passive. Formation or development goes on continuously, under the impulse of the formative principle, by whatever name it is known, until all is once more dissolved by the ekpyrosis into the fundamental principle, and the whole process begins over again. Their conception of the process as analogous to the development of the seed finds special expression in their term logos spermatikos. In one point the Stoics differ essentially from Heraclitus. With them the whole process is accomplished according to certain ends indwelling in the Godhead, which is a provident, careful intelligence, while no providence is assumed in Heraclitus.

Empedocles asserts definitely that the sphairos, as the full reconciliation of opposites, is opposed, as the superior, to the individual beings brought into existence by hatred, which are then once more united by love to the primal essence, the interchange of world-periods thus continuing indefinitely. Development is to be found also in the atomistic philosopher Democritus; in a purely mechanical manner without any purpose, bodies come into existence out of atoms, and ultimately entire worlds appear and disappear from and to eternity. Like his predecessors, Democritus deduces organic beings from what is inorganic: moist earth or slime.

Development, as well as the process of becoming, in general, was denied by the Eleatic philosophers. Their doctrine, diametrically opposed to the older thoroughgoing evolutionism, had its influence in determining the acceptance of unchangeable ideas, or forms, by Plato and Aristotle. Though Plato reproduces the doctrine of Heraclitus as to the flux of all things in the phenomenal world, he denies any continuous change in the world of ideas. Change is permanent only in so far as the eternal forms stamp themselves upon individual objects. Though this, as a rule, takes place but imperfectly, the stubborn mass is so far affected that all works out as far as possible for the best. The demiurge willed that all should become as far as possible like himself; and so the world finally becomes beautiful and perfect. Here we have a development, though the principle which has the most real existence does not change; the forms, or archetypal ideas, remain eternally what they are.

In Aristotle also the forms are the real existences, working in matter but eternally remaining the same, at once the motive cause and the effectual end of all things. Here the idea of evolution is clearer than in Plato, especially for the physical world, which is wholly dominated by purpose. The transition from lifeless to living matter is a gradual one, so that the dividing-line between them is scarcely perceptible. Next to lifeless matter comes the vegetable kingdom, which seems, compared with the inorganic, to have life, but appears lifeless compared with the organic. The transition from plants to animals is again a gradual one. The lowest organisms originate from the primeval slime, or from animal differentiation; there is a continual progression from simple, undeveloped types to the higher and more perfect. As the highest stage, the end and aim of the whole process, man appears; all lower forms are merely unsuccessful attempts to produce him. The ape is a transitional stage between man and other viviparous animals. If development has so important a work in Aristotle's physics, it is not less important in his metaphysics. The whole transition from potentiality to actuality (from dynamis to entelecheia) is nothing but a transition from the lower to the higher, everything striving to assimilate itself to the absolutely perfect, to the Divine. Thus Aristotle, like Plato, regards the entire order of the universe as a sort of deification. But the part played in the development by the Godhead, the absolutely immaterial form, is less than that of the forms which operate in matter, since, being already everything, it is incapable of becoming anything else. Thus Aristotle, despite his evolutionistic notions, does not take the view of a thoroughgoing evolutionist as regards the universe; nor do the Neoplatonists, whose highest principle remains wholly unchanged, though all things emanate from it.

The idea of evolution was not particularly dominant in patristic and scholastic theology and philosophy, both on account of the dualism which runs through them as an echo of Plato and Aristotle, and on account of the generally accepted Christian theory of creation. However, evolution is not generally denied; and with Augustine (De civitate dei, xv. 1) it is taken as the basis for a philosophy of history. Erigena and some of his followers seem to teach a sort of evolution. The issue of finite beings from God is called analysis or resolution, in contrast to the reverse process, deification, the return to God, who once more assimilates all things. God himself, although denominated the beginning, middle, and end, all in all, remains unmixed in his own essence, transcendent though immanent in the world. The teaching of Nicholas of Cusa is similar to Erigena's, though a certain amount of Pythagoreanism comes in here. The world exhibits explicitly what the Godhead implicitly contains; the world is an animated, ordered whole, in which God is everywhere present. Since God embraces all things in himself, he unites all opposites: he is the complicatio omnium contradictoriorum. The idea of evolution thus appears in Nicholas in a rather pantheistic form, but it is not developed.

In spite of some obscurities in his conception of the world, Giordano Bruno is a little clearer. According to him God is the immanent first cause in the universe; there is no difference between matter and form; matter, which includes in itself forms and ends, is the source of all becoming and of all actuality. The infinite ether which fills infinite space conceals within itself the nucleus of all things, and they proceed from it according to determinate laws, yet in a teleological manner. Thus the worlds originate not by an arbitrary act, but by an inner necessity of the divine nature. They are natura naturata, as distinguished from the operative nature of God, natura naturans, which is present in all things as the being of all that is, the beauty of all that is fair. As in the Stoic teaching, with which Bruno's philosophy has much in common, the conception of evolution comes out clearly both for physics and metaphysics.

Leibniz attempted to reconcile the mechanical-physical and the teleological views, after Descartes, in his Principia philosophiae, excluding all purpose, had explained nature, both lifeless and living, as mere mechanism. It is right, however, to point out that Descartes had a metaphysics above his physics, in which the conception of God took an important place, and that thus the mechanical notion of evolution did not really include everything. In Leibniz the principles of mechanics and physics are dependent upon the direction of a supreme intelligence, without which they would be inexplicable to us. Only by such a preliminary assumption are we able to recognize that one ordered thing follows upon another continuously. It is in this sense that the law of continuity is to be understood, which is of such great importance in Leibniz. At bottom it is the same as the law of ordered development. The genera of all beings follow continuously one upon another, and between the main classes, as between animals and vegetables, there must be a continuous sequence of intermediate beings. Here again, however, evolution is not taught in its most thorough form, since the divine monad, God, does not come into the world but transcends it.

Among the German philosophers of the eighteenth century, Herder must be mentioned first among the pioneers of modern evolutionism. He lays down the doctrine of a continuous development in the unity of nature from inorganic to organic, from the stone to the plant, from the plant to the animal, and from the animal to man. As nature develops according to fixed laws and natural conditions, so does history, which is only a continuation of the process of nature. Both nature and history labor to educate man in perfect humanity; but as this is seldom attained, a future life is suggested. Lessing had dwelt on the education of the human race as a development to the higher and more perfect. It is only recently that the significance of Herder, in regard to the conception and treatment of historic development, has been adequately recognized. Goethe also followed out the idea of evolution in his zoological and botanical investigations, with his theory of the metamorphosis of plants and his endeavor to discover unity in different organisms.

Kant is also often mentioned as having been an early teacher of the modern theory of descent. It is true he considers the analogy of the forms which he finds in various classes of organisms a ground for supposing that they may have come originally from a common source. He calls the hypothesis that specifically different beings have originated one from the other "a daring adventure of the reason." But he entertains the thought that in a later epoch "an orang-outang or a chimpanzee may develop the organs which serve for walking, grasping objects, and speaking; in short, that he may evolve the structure of man, with an organ for the use of reason, which shall gradually develop itself by social culture." Here, indeed, important ideas of Darwin were anticipated; but Kant's critical system was such that development could have no predominant place in it.

The idea of evolution came out more strongly in his German idealistic successors, especially in Schelling, who regarded nature as a preliminary stage to mind, and the process of physical development as continuing in history. The unconscious productions of nature are only unsuccessful attempts to reflect itself; lifeless nature is an immature intelligence, so that in its phenomena an intelligent character appears only unconsciously. Its highest aim, that of becoming an object to itself, is only attained in the highest and last reflection: in man, or in what we call reason, through which for the first time nature returns perfectly upon itself. All stages of nature are connected by a common life, and show in their development a conclusive unity. The course of history as a whole must be conceived as offering a gradually progressive revelation of the Absolute. For this he names three periods: that of fate, that of nature, and that of providence, of which we are now in the second. Schelling's followers carried the idea of development somewhat further than their master. This is true especially of Oken, who conceives natural science as the science of the eternal transformation of God into the world, of the dissolution of the Absolute into plurality, and of its continuous further operation in this plurality. The development is continued through the vegetable and animal kingdoms up to man, who in his art and science and polity completely establishes the will of nature. Oken, it is true, conceived man as the sole object of all animal development, so that the lower stages are only abortive attempts to produce him, a theory afterward controverted by Karl Ernst von Baer and Cuvier, the former of whom, standing somewhat in opposition to Darwin, is of great interest to the student of the history of the theory of evolution.

Some evolutionistic ideas are found in Krause and Schleiermacher; but Hegel, with his absolute idealism, is a more notable representative of them. In his system philosophy is the science of the Absolute, of the absolute reason developing or unfolding itself. Reason develops itself first in the abstract element of thought, then expresses itself externally in nature, and finally returns from this externalization into itself in mind. As Heraclitus had taught eternal becoming, so Hegel, who avowedly accepted all the propositions of the Ephesian philosopher in his logic, taught eternal proceeding. The difference between the Greek and the German was that the former believed in the flux of matter, of fire transmuting itself by degrees into all things, and in nature as the sole existence, outside of which there was nothing; while the latter conceived the abstract idea or reason as that which really is or becomes, and nature as only a necessary but transient phase in the process of development. With Heraclitus evolution meant the return of all things into the primal principle followed by a new world-development; with Hegel it was an eternal process of thought, giving no answer to the question as to the end of historical development.

While Heraclitus had laid down his doctrine of eternal becoming rather by intuition than on the ground of experience, and the entire evolutionary process of Hegel had been expressly conceived as based on pure thought, Darwin's and Wallace's epoch-making doctrine rested upon a vast mass of ascertained facts. Darwin was, of course, not the first to lay down the origin of species one from another as a formal doctrine. Besides those predecessors of his to whom allusion has already been made, two others may be mentioned here: his grandfather, Erasmus Darwin, who emphasized organic variability; and still more Lamarck, who denied the immutability of species and forms, and claimed to have demonstrated by observation the gradual development of the animal kingdom. What is new in Charles Darwin is not his theory of descent, but its confirmation by the theory of natural selection and the survival of the fittest in the struggle for existence. Thus a result is brought about which corresponds as far as possible to a rational end in a purely mechanical process, without any cooperation of teleological principles, without any innate tendency in the organisms to proceed to a higher stage. This theory postulates deviations in the later organisms from the earlier ones, and holds that these deviations, in so far as they are improvements, perpetuate themselves and become generic marks of differentiation. This, however, imports a difficulty, since the origin of the first of these deviations is inexplicable. The differentia of mankind, whom Darwin, led by the force of analogy, deduces from a species of apes, consists in intellect and moral qualities, but comes into existence only by degrees. The moral sensibilities develop from the original social impulse innate in man; this impulse is an effort to secure not so much individual happiness as the general welfare.

It would be impossible to name here all those who, in different countries, have followed in Darwin's footsteps, first in the biological field and then in those of psychology, ethics, sociology, and religion. They have carried his teaching further in several directions, modifying it to some extent and making it fruitful, while positivism has not seldom come into alliance with it. In Germany Ernst Haeckel must be mentioned, with his biogenetic law, according to which the development of the individual is an epitome of the history of the race, and with his less securely grounded notion of the world-ether as a creative deity. In France Alfred Fouillée worked out a theory of idea-forces, a combination of Platonic idealism with English (though not specifically Darwinian) evolutionism. Jean-Marie Guyau understood by evolution a life led according to the fundamental law that the most intensive life is also the most extensive. He develops his ethics altogether from the facts of the social existence of mankind, and his religion is a universal sociomorphism, the feeling of the unity of man with the entire cosmos.

The most careful and thorough development of the whole system took place in England. For a long time it was represented principally by the work of Herbert Spencer, who had come out for the principle of evolution even before the publication of Darwin's Origin of Species. He carries the idea through the whole range of philosophy in his great System of Synthetic Philosophy and undertakes to show that development is the highest law of all nature, not merely of the organic. As the foundation of all that exists, though itself unknowable and only revealing itself in material and mental forms, he places a power, the Absolute, of which we have but an indefinite conception. The individual processes of the world of phenomena are classed under the head of evolution, or extension of movement, with which integration of matter, union into a single whole, is connected, and dissolution or absorption of movement, which includes disintegration of matter, the breaking of connection. Both processes go on simultaneously, and include the history of every existence which we can perceive. In the course of their development the organisms incorporate matter with themselves; the plant grows by taking into itself elements which have previously existed in the form of gases, and the animal by assimilating elements found in plants and in other animals. The same sort of integration is observed in social organisms, as when nomadic families unite into a tribe, or subjects under a prince, and princes under a king. In like manner integration is evident in the development of language, of art, and of science, especially philosophy. But as the individuals unite into a whole, a strongly marked differentiation goes on at the same time, as in the distinction between the surface and the interior of the earth, or between various climates. Natural selection is not considered necessary to account for varying species; gradually changing conditions of life create them. The aim of the development is to reach a condition of perfect balance in the whole; when this is attained, the development, in virtue of the continuous operation of external powers, passes into dissolution. These epochs of development and of dissolution follow alternately upon each other. This view of Spencer's suggests the hodos ano and hodos kato of Heraclitus, and his flowing back of individual things into the primal principle.

Similar principles are carried out not only for organic phenomena but also for mental and social; and on the basis of the theory of evolution a remarkable combination of intuitionism and empiricism is achieved. In his principles of sociology Spencer lays down the laws of hyperorganic evolution, and gives the various stages of human customs and especially of religious ideas, deducing all religion much too one-sidedly from ancestor-worship. The belief in an immortal "second self" is explained by such phenomena as shadows and echoes. The notion of gods is supposed to arise from the idea of a ghostly life after death. In his Principles of Ethics he attempts a similar compromise between intuitionism and empiricism, deducing the consciousness of duty from innumerable accumulated experiences. The compelling element in moral actions, originally arising from fear of religious, civil, or social punishment, disappears with the development of true morality. There is no permanent opposition between egoism and altruism, but the latter develops simultaneously with the former.

Spencer's ethical principles were fruitfully modified, especially by Sir Leslie Stephen and S. Alexander, though with constant adherence to the idea of development. While the doctrine of evolution in Huxley and Tyndall is associated with agnosticism, and thus freed from all connection with metaphysics, as indeed was the case with Spencer, in spite of his recognition of the Absolute as the necessary basis for religion and for thought, in another direction an attempt was made to combine evolutionism closely with a metaphysics in which the idea of God was prominent. Thus the evolution theory of Clifford and Romanes led them to a thoroughgoing monism, and that of F. C. S. Schiller to pluralism. According to the last-named, a personal deity, limited in power, exists side by side with a multitude of intellectual beings, who existed before the formation of the world in a chaotic state as absolutely isolated individuals. The process of world formation begins with the decision of the divine Spirit to bring a harmony of the cosmos out of these many existences. Though Spencer's influence in philosophical development was not so great in Germany as in England, the idea of development has continued in recent years to exert no little power. Space forbids more than a mention of Lotze's teleological idealism; Von Hartmann's absolute monism, in which the goal of the teleological development of the universe is the reversion of the will into not-willing; Wundt's metaphysics of the will, according to which the world is a development, an eternal becoming, in which nature is a preliminary stage to mind; and Nietzsche's individualism, the final point of which is the development of the superman.

The author of this article is anonymous. The IEP is actively seeking an author who will write a replacement article.

Read the rest here:

History of Evolution | Internet Encyclopedia of Philosophy


Evolution – Bulbapedia, the community-driven Pokémon encyclopedia

Posted: at 6:25 am

From Bulbapedia, the community-driven Pokémon encyclopedia.

Evolution (Japanese: 進化 evolution) is a process in which a Pokémon changes into a different species of Pokémon. This change is not merely physical, however, as Pokémon of a higher evolutionary stage have different (and usually more powerful) base stats than their predecessors, may have different moves that can be learned, and sometimes change their types, though usually at least one of the types of the previous form is preserved. Other statistics, such as Nature and EVs, as well as shininess, are preserved. With respect to real-world phenomena, Pokémon Evolution is more similar to metamorphosis than evolution. Evolution also appears to be a mostly independent phenomenon from the aging process for most species, though baby Pokémon need to evolve to their next stage in order to breed.
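As a purely illustrative data model (the class, field, and function names below are hypothetical, not taken from Bulbapedia or any game's actual code, and the stat numbers are placeholders), the split between species-level data that changes on Evolution and individual data that carries over can be sketched like this:

```python
from dataclasses import dataclass, replace
from typing import Dict, Tuple

# Hypothetical record of a single Pokemon; every name here is illustrative.
@dataclass(frozen=True)
class PokemonInstance:
    species: str
    types: Tuple[str, ...]      # species-level: replaced on evolution
    base_stats: Dict[str, int]  # species-level: replaced on evolution
    nature: str                 # individual: preserved on evolution
    evs: Dict[str, int]         # individual: preserved on evolution
    shiny: bool                 # individual: preserved on evolution

def evolve(mon: PokemonInstance, species: str, types: Tuple[str, ...],
           base_stats: Dict[str, int]) -> PokemonInstance:
    """Swap out species-level data; Nature, EVs, and shininess carry over."""
    return replace(mon, species=species, types=types, base_stats=base_stats)

charmander = PokemonInstance("Charmander", ("Fire",), {"hp": 39},
                             nature="Adamant", evs={}, shiny=False)
charmeleon = evolve(charmander, "Charmeleon", ("Fire",), {"hp": 58})
print(charmeleon.nature, charmeleon.shiny)  # Adamant False -- carried over
```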

Professor Elm and Professor Rowan are the leading experts in Pokémon Evolution. According to the latter's research, over 90% of all Pokémon are connected to at least one other through Evolution (this is true only if Legendary Pokémon are excluded). Rowan is currently investigating whether Evolution is a form of maturity in Pokémon, and looking at the implications this process has on Legendary Pokémon, which don't evolve.

An evolution family is a group of Pokémon who will all, if bred with Ditto or a Pokémon in the same Egg Group, make a Pokémon Egg that will hatch into the same Pokémon, excluding baby Pokémon. This also means that the most basic form has the potential to become any of the rest of the family, although it will ultimately be able to follow only one evolutionary path.

Pokémon can be divided into different evolutionary stages, based on where they appear in their evolution family. All Pokémon fall into one of four groups: baby Pokémon, unevolved Pokémon, first-evolution Pokémon, and second-evolution Pokémon. These groups are also the basis for the TCG's grouping of Baby Pokémon, Basic Pokémon, Stage 1 Pokémon, and Stage 2 Pokémon, respectively.

Because no evolution family contains both a baby Pokémon and a second-evolution Pokémon, many regard baby Pokémon as the most basic form, while moving their evolved counterparts one level higher. For example, Pikachu was originally regarded as an unevolved Pokémon; with the release of Pichu in Generation II, however, many now consider it to be more on par with Pokémon like Charmeleon, though its TCG classification remains the same.

Perhaps the most well-known types of evolution families are those that feature two separate evolutionary events in the Pokémon's development. Indeed, this type of evolution family is what all of the starter Pokémon in the core series are a part of (excluding the starter Pikachu in Pokémon Yellow, as Pichu did not yet exist and it could not be evolved into Raichu; and Eevee, which could only be taken by Blue), as well as all pseudo-legendary Pokémon. An example of this type of evolution family is below.

By far the most common type of evolution family, these families are based on a Pokémon that will only ever evolve once in its development. About one third of all Pokémon that would later get a baby form were part of this kind of evolution family before their baby form was revealed. An example of this type of evolution family is below.

The least common type of evolution family is that in which no evolutionary event takes place, meaning that it is made up of only one member. Many of the Pokémon that have no evolutionary relatives are Legendary Pokémon. However, there are still 61 other Pokémon that do not evolve. Below is a list of all non-Legendary Pokémon that do not evolve (Phione is not included due to its status as a Legendary being disputed).

Not belonging to an evolutionary family is not indicative of strength, or a lack thereof. Some Pokémon, such as Heracross and Skarmory, are comparable to fully evolved Pokémon, while others, like Delibird and Luvdisc, are more comparable to unevolved Pokémon. Often this leaves a Pokémon a likely candidate for new evolutions or pre-evolutions in future games.

Several families, while also one- and two-evolution families, are also branch evolution families. What this means is that there is a split in the evolutionary line at some point, so that even though two Pokémon of the same species evolve the same number of times, they can become one of two or more entirely different creatures. Eevee is the best-known example of this, evolving eight different ways depending on the method used. An example of this type of evolution family is below.

A major difference between the final forms of an evolution family with a branch in evolution is in the way that their base stats line up. For example, Kirlia can evolve into both Gardevoir and Gallade, which both have 518 total base stats. However, Gallade's base stat in Attack is 125 and its base stat in Special Attack is 65. The reverse is true for Gardevoir, whose Special Attack is 125 and whose Attack is 65. This is true of many opposing evolutions, with one focusing in one specific stat, the other focusing in a separate stat, and both having the same total stats. This is especially obvious in the Eeveelutions, who each have exactly the same base stats, though organized differently.
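The mirroring can be checked with simple arithmetic. The snippet below uses only the figures cited above (the 518 totals and the swapped 125/65 Attack and Special Attack values); the remaining stats are lumped into a single hypothetical remainder rather than guessed individually.

```python
# Only the 518 totals and the 125/65 Attack / Special Attack swap come from
# the text above; "other_stats" is just the remainder, not a real breakdown.

gardevoir = {"attack": 65, "special_attack": 125, "other_stats": 518 - 65 - 125}
gallade   = {"attack": 125, "special_attack": 65, "other_stats": 518 - 125 - 65}

for name, stats in (("Gardevoir", gardevoir), ("Gallade", gallade)):
    print(f"{name}: Atk {stats['attack']}, SpA {stats['special_attack']}, "
          f"total {sum(stats.values())}")  # both totals come out to 518
```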

A listing of the stat focuses is below.

The various triggers for a Pokémon's evolution are almost as varied as the Pokémon themselves, and some Pokémon have a unique evolution method. The most common of them is Evolution by leveling up at or above a certain level. Other methods include the following:

Additionally, holding an Everstone prevents a Pokémon from evolving, as does pressing the B Button to surprise a Pokémon while it is evolving. The latter method is known as an "Evolution cancel".

Pokémon that get knocked out during a battle will still evolve at the end of that battle if their requirements have been met. Before Generation VI, however, losing a battle would prevent Pokémon from evolving even if the conditions had been met.

Pokémon that can evolve into more than one Pokémon will usually have similar triggers for each branch, such as both evolutions being initiated by an evolutionary stone or by trading while holding an item. Closely related Pokémon, such as Nidoran♀ and Nidoran♂, will also have very similar, if not identical, evolution methods.

Some Pokémon have different evolutions depending on their gender. For example, only female Combee can evolve into Vespiquen; male Combee cannot evolve at all. Meanwhile, all Snorunt can evolve into Glalie, but female ones have the option of evolving into Froslass instead. Kirlia is similar, although in its case it is the males that have the split evolution.

Also, there are situations in which the current party must be configured in a specific manner for some Pokémon to evolve. So far, only three Pokémon have these special requirements. Mantyke will evolve into Mantine if leveled up with a Remoraid in the player's party. Nincada will evolve into Ninjask when it reaches level 20; however, if there happens to be an empty space in the player's party (and a spare Poké Ball in Generation IV onward), a Shedinja will also appear in the party. Pancham evolves into Pangoro if its level is 32 or higher and there is a Dark-type Pokémon in the player's party.
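These party-dependent conditions amount to simple predicates over the player's party. The sketch below encodes the three cases just listed; the class, field, and function names are hypothetical, not from any actual game code, and the level-up trigger itself is assumed to have already fired.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical data model; names are illustrative, not from any game's code.
@dataclass
class Pokemon:
    species: str
    level: int
    types: List[str] = field(default_factory=list)

def evolution_targets(mon: Pokemon, party: List[Pokemon],
                      empty_slot: bool) -> List[str]:
    """Return what this Pokemon evolves into, per the three cases above."""
    if mon.species == "Mantyke" and any(p.species == "Remoraid" for p in party):
        return ["Mantine"]
    if mon.species == "Nincada" and mon.level >= 20:
        # Shedinja additionally appears if the party has an empty slot
        # (and, from Generation IV onward, a spare Poke Ball).
        return ["Ninjask", "Shedinja"] if empty_slot else ["Ninjask"]
    if (mon.species == "Pancham" and mon.level >= 32
            and any("Dark" in p.types for p in party)):
        return ["Pangoro"]
    return []

party = [Pokemon("Mantyke", 18), Pokemon("Remoraid", 20, ["Water"])]
print(evolution_targets(party[0], party, empty_slot=True))  # ['Mantine']
```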

Some Pokémon evolve in other unique ways. If one trades a Karrablast for a Shelmet, they will evolve into Escavalier and Accelgor, respectively, though neither will evolve if one of them holds an Everstone. When Inkay reaches level 30, the player must hold the 3DS upside-down for it to evolve into Malamar. Also introduced was a weather-based evolution: Sliggoo will evolve into Goodra beginning at level 50, but only if it is raining in the area that the player is in. Finally, Sylveon can only be obtained by leveling up an Eevee that knows a Fairy-type move and has at least two hearts of affection in Pokémon-Amie.

Some missions in Hey You, Pikachu! involve Pikachu interacting with other Pokémon in certain ways to cause their evolution. In Caring for Caterpie, the player and Pikachu supervise a group of Caterpie, which will evolve into Metapod and then Butterfree if treated well. In Field Trip, Pikachu can water wild Oddish and Gloom, causing them to evolve into Gloom and Vileplume, respectively.

In Pokémon Colosseum and XD: Gale of Darkness, while evolution typically works as in the main series, Shadow Pokémon are incapable of evolving until they are purified and return to normal. In Pokémon XD: Gale of Darkness, the player's Eevee is incapable of evolving into Espeon or Umbreon through normal methods, because the game does not have a time mechanic. However, early in the game, the player is given their choice of evolution item to evolve it, including the Sun and Moon Shards, Key Items that will evolve Eevee into Espeon or Umbreon, respectively, after it levels up.

In Pokémon Conquest, because the mechanics of levels, experience, and friendship do not exist, Pokémon typically evolve once they reach a certain link threshold with their partnered Warrior or Warlord. Pokémon that normally evolve via high friendship in the main series games, such as Golbat, instead evolve after reaching a certain link percentage, usually between 60 and 70 percent. Pokémon that normally evolve at a set level instead evolve when a certain stat reaches a specific value. For example, Spheal evolves when its HP has reached a value of 138, which is partially determined by the link with its Warrior. Warriors with Pokémon that require an evolutionary stone to evolve must equip themselves with that item and then perform an action that causes their link to improve, such as completing a battle.

In Pokémon Pinball and Pokémon Pinball: Ruby & Sapphire, the player can evolve Pokémon they caught in Catch 'Em Mode in a separate mode called Evolution Mode (EVO Mode in Pinball RS). In this mode the player selects an evolution-capable Pokémon in their possession, then guides their ball toward three symbols representative of that Pokémon's method of evolution in the main games, such as EX for Level evolution, or a Link Cable for Trade evolution. If the player collects the three symbols in time, they can bring their ball to the Center Hole to evolve their Pokémon, awarding them with its Pokédex entry and points.

In the Mystery Dungeon series, Evolution is restricted until the player completes the main scenario of the respective game. Evolution is typically done in a ritual held at one of several locations across the Pokémon world, including Luminous Cave, the Luminous Spring, or the Tree of Life. However, starting in Explorers of Time and Explorers of Darkness, the player character and their partner may not evolve until they complete an additional scenario. Pokémon that evolve through unusual methods require an additional item to act as a catalyst.

From Pokémon Mystery Dungeon: Gates to Infinity onward, enemy Pokémon may evolve at will after defeating a member of the player's party. In Super Mystery Dungeon, the player character and their partner evolve into their final forms several times throughout the story. In addition, recruitable Pokémon that exist as NPCs in this game or previous games will refuse evolution. However, because all Pokémon can be recruited separately through the Connection Orb, the player can still access their respective evolved forms in alternate ways.

In Pokémon Snap, the player can interact with Pokémon in certain ways that will make them evolve.

In the anime, Evolution happens in much the same way as it does in the games; though level-based evolutions and trade-based evolutions do not occur using those methods, there are similarities in the way they come about. For example, Misty's Poliwhirl evolved into Politoed because it found Ash's King's Rock and was holding it when Misty sent it out, while in the games it is required that Poliwhirl be traded while holding the King's Rock for the evolution to take place (it should be noted that Poliwhirl had been through a machine, in connection with being healed at the Pokémon Center, while holding the item). When a Beedrill attacked Ash's Metapod, it caused a crack to appear in its shell, from which Butterfree emerged.

Additionally, a difference can be seen in the fact that Pokémon evolve during a battle, as opposed to after it. Pokémon may also evolve when they are needed to, for an extra boost of power or to gain new abilities, instead of after a set amount of training, such as when Ash's Charmeleon evolved into Charizard. In addition, Pokémon can sometimes choose not to evolve, even if they evolve by a 'natural' method such as leveling up. It appears that Evolution has emotional implications for Pokémon: some Pokémon, such as Team Rocket's Meowth, dislike their evolved forms, while others, such as Ash's Pikachu, simply want to prove they can be powerful without evolving. Conversely, when Pokémon do evolve, this can often be linked with an experience that causes them to mature emotionally or deal with an emotional issue, such as when the Poochyena in A Bite to Remember evolved, or the Paras in The Problem With Paras. Poochyena, for some reason, had an aversion to using the move Bite, while Paras was extremely timid and weak in battle. Both of them evolved shortly after overcoming these issues.

For a list of all evolutions that Pokémon belonging to the main cast have undergone, see List of anime Pokémon by evolution.

Evolution in the Pokémon Trading Card Game is very similar in some respects to its counterpart in the core series. However, it differs mostly in that there are no varied methods needed to evolve a Pokémon; instead, all Pokémon evolve simply by placing the next stage on top of a Pokémon in play that it evolves from.

Pokémon cannot be evolved on the first turn of the game or on the first turn they come into play. They also cannot be evolved on the same turn that they were previously evolved or devolved.

There are four different stages of evolution in the TCG: Baby Pokémon, Basic Pokémon, Stage 1 Pokémon, and Stage 2 Pokémon. Of these, only Baby and Basic Pokémon may be placed onto the Bench during the setup phase and during play; Stage 1 and Stage 2 Pokémon are considered to be evolution cards and therefore unable to be played except on top of their corresponding pre-evolved forms. The stage of evolution is indicated in a conspicuous place on each and every Pokémon card, though the placement differs among the four generations of cards.

Within the deck and discard pile, only Stage 1 and Stage 2 cards are considered to be "evolution cards" for the purpose of a Trainer card or Pokémon Power that allows them to be searched for. In play, a Basic Pokémon card can be considered an evolution card if it is evolved from its Baby stage.

A Baby Pokémon is much the same in the TCG as it is in the core series of games. In fact, as with baby Pokémon released beyond Generation II, it is not necessary for a Pokémon to go through this stage of its evolutionary line, as the Pokémon can simply start from its basic form. Baby Pokémon are among the weakest in the TCG, most often having 30 HP, as well as one of two special Poké-Bodies: one prevents all damage done to the Baby Pokémon while it is Asleep (Baby Pokémon with this Poké-Body also usually have an attack that changes their status to Asleep), and the other forces a Pokémon attempting to attack the Baby Pokémon to flip a coin, the attack doing nothing if that coin lands on tails.

A Basic Pokémon is the most basic of Pokémon cards, as can be deduced from its name. Commonly, Basic Pokémon will have low HP, a common rarity, and low damage and Energy costs. These cards can be placed directly into play without another Pokémon card needing to be in play first. Pokémon that evolve from a Pokémon released in a later generation, such as Electabuzz or Pikachu, are always Basic Pokémon, despite being the second Pokémon in their own evolutionary lines. Baby Pokémon, Shining Pokémon, Pokémon ☆, Pokémon SP, and Pokémon-EX are always Basic; the latter four cannot evolve.

A Stage 1 Pokémon is the first kind of evolution card, able to be evolved from a Basic Pokémon. Stage 1 cards are most commonly uncommon in rarity. Stage 1 Pokémon are also able to be Dark Pokémon and Light Pokémon.

A Stage 2 Pokémon is the highest of evolution cards, commonly rare or holographic in rarity, and can only, in normal conditions, be evolved from a Stage 1 Pokémon. Stage 2 Pokémon are also able to be Dark Pokémon and Light Pokémon.

M Pokémon-EX cards were introduced in the XY expansion and introduce the Mega Evolution mechanic featured in Pokémon X and Y. They are identified by a stylized graphic on the card name. M Pokémon-EX can only be played by Mega Evolving from Basic Pokémon-EX. Doing so ends the player's turn immediately. Other than this, M Pokémon-EX share the same rules and design as regular Pokémon-EX and evolving Pokémon, with the addition of boosted Hit Points and more powerful attacks.

A Pokémon card that is in the player's hand must say specifically that it evolves from a Pokémon card that is in play on the player's side. For example, Dark Blastoise states on the card "Evolves from Dark Wartortle". This means that any card named Dark Wartortle may be evolved into Dark Blastoise. However, a card simply named Wartortle cannot. Likewise, Pokémon such as Rhyhorn cannot be evolved into a Pokémon that says on it "Evolves from Team Magma's Rhyhorn".

However, Pokémon cards from different sets may evolve into one another. For example, Dark Crobat can evolve from either Dark Golbat of the Team Rocket set or Dark Golbat of the EX Team Rocket Returns set. So long as the card names match precisely, both to (here Dark Crobat) and from (here Dark Golbat), the evolution is legal. This rule can, of course, be circumvented by certain means, such as Pokémon Powers and Trainer cards; however, this is not common.
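The name-matching rule described in these two paragraphs comes down to an exact string comparison between the evolution card's "Evolves from" field and the name of the card in play. A minimal sketch, with hypothetical field and function names (only the card names are from the text above):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical card model; field names are illustrative, not from the TCG rules.
@dataclass
class Card:
    name: str                    # e.g. "Dark Blastoise"
    evolves_from: Optional[str]  # e.g. "Dark Wartortle"; None for Basic Pokemon

def evolution_is_legal(card_in_hand: Card, card_in_play: Card) -> bool:
    """Names must match exactly; the set a card comes from does not matter."""
    return (card_in_hand.evolves_from is not None
            and card_in_hand.evolves_from == card_in_play.name)

dark_blastoise = Card("Dark Blastoise", evolves_from="Dark Wartortle")
print(evolution_is_legal(dark_blastoise, Card("Dark Wartortle", "Squirtle")))  # True
print(evolution_is_legal(dark_blastoise, Card("Wartortle", "Squirtle")))       # False
```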

Evolution in Pokémon, for most species, is more akin to metamorphosis than to actual evolution. This is because real-life evolution happens to a population rather than to individuals, and happens over much larger time scales than in the Pokémon world. In the Pokémon Adventures manga, it is mentioned that Pokémon Evolution is an entirely separate phenomenon from the normal process of evolution, and is a mysterious ability exclusive to Pokémon that is still not fully understood. In Pokémon Super Mystery Dungeon, it is described, in the health class at the school in Serene Village, as a process in which a Pokémon's body rapidly grows larger, with many other changes that bring the Pokémon closer to being an adult, likening it to puberty.

Read more:

Evolution - Bulbapedia, the community-driven Pokémon encyclopedia


What is Evolution – explanation and definitions

Posted: at 6:25 am

Human evolution drawn on a chalkboard. Martin Wimmer/E+/Getty Images

The Theory of Evolution is a scientific theory that essentially states that species change over time. There are many different ways species change, but most of them are based on the idea of Natural Selection. The Theory of Evolution through Natural Selection was the first scientific theory that put together evidence of change through time as well as a mechanism for how it happens.

The idea that traits are passed down from parents to offspring has been around since the ancient Greek philosophers' time.

In the middle 1700s, Carolus Linnaeus came up with his taxonomic naming system, which grouped like species together and implied there was an evolutionary connection between species within the same group.

The late 1700s saw the first theories that species changed over time. Scientists like the Comte de Buffon and Charles Darwin's grandfather, Erasmus Darwin, both proposed that species changed over time, but neither man could explain how or why they changed.

They also kept their ideas under wraps due to how controversial the thoughts were compared to accepted religious views at the time.

Jean-Baptiste Lamarck, a student of the Comte de Buffon, was the first to publicly state that species changed over time. However, part of his theory was incorrect: Lamarck proposed that acquired traits were passed down to offspring. Georges Cuvier was able to prove that part of the theory incorrect, but he also had evidence that there were once-living species that had evolved and gone extinct.

Cuvier believed in catastrophism, meaning these changes and extinctions in nature happened suddenly and violently. James Hutton and Charles Lyell countered Cuvier's argument with the idea of uniformitarianism.

This theory said changes happen slowly and accumulate over time.

Sometimes called "survival of the fittest," Natural Selection was most famously explained by Charles Darwin in his book On the Origin of Species. In the book, Darwin proposed that individuals with traits most suitable to their environments lived long enough to reproduce and passed down those desirable traits to their offspring.

If an individual had less than favorable traits, they would die and not pass on those traits. Over time, only the "fittest" traits of the species survived. Eventually, after enough time passed, these small adaptations would add up to create new species.

Darwin was not the only person to come up with this idea at that time. Alfred Russel Wallace also had evidence and came to the same conclusions as Darwin around the same time. They collaborated for a short time and jointly presented their findings. Armed with evidence from all over the world due to their various travels, Darwin and Wallace received favorable responses in the scientific community about their ideas. The partnership ended when Darwin published his book.

One very important part of the Theory of Evolution through Natural Selection is the understanding that individuals cannot evolve; they can only adapt to their environments. Those adaptations add up over time and eventually the entire species has evolved from what it was like earlier. This can lead to new species forming and sometimes extinction of older species.

There are many pieces of evidence that support the Theory of Evolution. Darwin relied on the similar anatomies of species to link them. He also had some fossil evidence that showed slight changes in the body structure of the species over time, often leading to vestigial structures. Of course, the fossil record is incomplete and has "missing links." With today's technology, there are many other types of evidence for evolution. This includes similarities in the embryos of different species, the same DNA sequences found across all species, and an understanding of how DNA mutations work in microevolution. More fossil evidence has also been found since Darwin's time, although there are still many gaps in the fossil record.

Today, the Theory of Evolution is often portrayed in the media as a controversial subject. Primate evolution and the idea that humans evolved from monkeys have been a major point of friction between scientific and religious communities. Politicians and court decisions have debated whether schools should teach evolution or whether they should also teach alternate points of view like Intelligent Design or Creationism.

The State of Tennessee v. Scopes, or the Scopes "Monkey" Trial, was a famous court battle over teaching evolution in the classroom. In 1925, a substitute teacher named John Scopes was arrested for illegally teaching evolution in a Tennessee science class. This was the first major court battle over evolution, and it brought attention to a formerly taboo subject.

The Theory of Evolution is often seen as the main overarching theme that ties all topics of Biology together. It includes Genetics, Population Biology, Anatomy and Physiology, and Embryology, among others. While the Theory has itself evolved and expanded over time, the principles laid out by Darwin in the 1800s still hold true today.

For definitions of evolution-related terms, see our glossary.

Original post:

What is Evolution - explanation and definitions


Evolution – Biology-Online Dictionary

Posted: at 6:25 am


Definition

noun, plural: evolutions

(1) The change in genetic composition of a population over successive generations, which may be caused by natural selection, inbreeding, hybridization, or mutation.

(2) The sequence of events depicting the evolutionary development of a species or of a group of related organisms; phylogeny.

Supplement

In order for evolution to occur, there must be genetic variation; without it, there can be no evolution. There are two major mechanisms that drive evolution. The first is natural selection: individuals with advantageous traits are more likely to reproduce successfully, passing these traits to the next generation. This kind of evolution, driven by natural selection, is called adaptive evolution. The other mechanism is genetic drift, which produces random changes in the frequency of traits in a population. Evolution that arises from genetic drift is called neutral evolution.
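The contrast with selection can be made concrete with a small simulation of drift alone. The sketch below is purely illustrative (the population size and starting frequency are arbitrary assumptions): two alleles with no fitness difference, where the frequency of one wanders randomly from sampling noise each generation until it is fixed or lost.

```python
import random

# Illustrative sketch of neutral evolution by genetic drift alone:
# two alleles, "A" and "a", with no fitness difference between them.

N = 200          # population size (haploid, for simplicity; arbitrary choice)
freq_A = 0.5     # starting frequency of allele A (arbitrary choice)

generation = 0
while 0.0 < freq_A < 1.0:
    # Each offspring draws its allele at random from the parent generation,
    # so the new count of A is pure sampling noise, not selection.
    count_A = sum(1 for _ in range(N) if random.random() < freq_A)
    freq_A = count_A / N
    generation += 1

print(f"allele A {'fixed' if freq_A == 1.0 else 'lost'} "
      f"after {generation} generations")
```

Because no allele is favored, which outcome occurs varies from run to run; that randomness is what distinguishes neutral evolution from the directional change produced by natural selection.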

Word origin: Latin evolutio (an unrolling, unfolding), from ex- (from, out of) + volvere (to roll).

See also: Darwinism.


See the rest here:

Evolution - Biology-Online Dictionary


Introduction to Human Evolution | The Smithsonian Institution …

Posted: at 6:25 am

Human evolution

Human evolution is the lengthy process of change by which people originated from apelike ancestors. Scientific evidence shows that the physical and behavioral traits shared by all people originated from apelike ancestors and evolved over a period of approximately six million years.

One of the earliest defining human traits, bipedalism -- the ability to walk on two legs -- evolved over 4 million years ago. Other important human characteristics -- such as a large and complex brain, the ability to make and use tools, and the capacity for language -- developed more recently. Many advanced traits -- including complex symbolic expression, art, and elaborate cultural diversity -- emerged mainly during the past 100,000 years.

Humans are primates. Physical and genetic similarities show that the modern human species, Homo sapiens, has a very close relationship to another group of primate species, the apes. Humans and the great apes (large apes) of Africa -- chimpanzees (including bonobos, or so-called pygmy chimpanzees) and gorillas -- share a common ancestor that lived between 8 and 6 million years ago. Humans first evolved in Africa, and much of human evolution occurred on that continent. The fossils of early humans who lived between 6 and 2 million years ago come entirely from Africa.

Most scientists currently recognize some 15 to 20 different species of early humans. Scientists do not all agree, however, about how these species are related or which ones simply died out. Many early human species -- certainly the majority of them -- left no living descendants. Scientists also debate over how to identify and classify particular species of early humans, and about what factors influenced the evolution and extinction of each species.

Early humans first migrated out of Africa into Asia probably between 2 million and 1.8 million years ago. They entered Europe somewhat later, between 1.5 million and 1 million years ago. Species of modern humans populated many parts of the world much later. For instance, people first came to Australia probably within the past 60,000 years and to the Americas within the past 30,000 years or so. The beginnings of agriculture and the rise of the first civilizations occurred within the past 12,000 years.

Paleoanthropology is the scientific study of human evolution. Paleoanthropology is a subfield of anthropology, the study of human culture, society, and biology. The field involves an understanding of the similarities and differences between humans and other species in their genes, body form, physiology, and behavior. Paleoanthropologists search for the roots of human physical traits and behavior. They seek to discover how evolution has shaped the potentials, tendencies, and limitations of all people. For many people, paleoanthropology is an exciting scientific field because it investigates the origin, over millions of years, of the universal and defining traits of our species. However, some people find the concept of human evolution troubling because it can seem not to fit with religious and other traditional beliefs about how people, other living things, and the world came to be. Nevertheless, many people have come to reconcile their beliefs with the scientific evidence.

Early human fossils and archeological remains offer the most important clues about this ancient past. These remains include bones, tools and any other evidence (such as footprints, evidence of hearths, or butchery marks on animal bones) left by earlier people. Usually, the remains were buried and preserved naturally. They are then found either on the surface (exposed by rain, rivers, and wind erosion) or by digging in the ground. By studying fossilized bones, scientists learn about the physical appearance of earlier humans and how it changed. Bone size, shape, and markings left by muscles tell us how those predecessors moved around, held tools, and how the size of their brains changed over a long time. Archeological evidence refers to the things earlier people made and the places where scientists find them. By studying this type of evidence, archeologists can understand how early humans made and used tools and lived in their environments.

The process of evolution involves a series of natural changes that cause species (populations of different organisms) to arise, adapt to the environment, and become extinct. All species or organisms have originated through the process of biological evolution. In animals that reproduce sexually, including humans, the term species refers to a group whose adult members regularly interbreed, resulting in fertile offspring -- that is, offspring themselves capable of reproducing. Scientists classify each species with a unique, two-part scientific name. In this system, modern humans are classified as Homo sapiens.

Evolution occurs when there is change in the genetic material -- the chemical molecule, DNA -- which is inherited from the parents, and especially in the proportions of different genes in a population. Genes represent the segments of DNA that provide the chemical code for producing proteins. Information contained in the DNA can change by a process known as mutation. The way particular genes are expressed -- that is, how they influence the body or behavior of an organism -- can also change. Genes affect how the body and behavior of an organism develop during its life, and this is why genetically inherited characteristics can influence the likelihood of an organism's survival and reproduction.
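
To make the mutation step concrete, here is a minimal illustrative sketch, not part of the source text: the sequences are invented, and the codon table is only a small (real) subset of the standard genetic code. It shows how a single-base change in a genetic sequence can alter the protein it encodes.

```python
# Minimal sketch: a point mutation changing one amino acid.
# The table is a small, real subset of the standard genetic code (RNA codons);
# the example sequences are hypothetical.
CODON_TABLE = {
    "GAA": "Glu",  # glutamic acid
    "GUA": "Val",  # valine
    "AAA": "Lys",  # lysine
}

def translate(mrna: str) -> list[str]:
    """Read an mRNA string three bases at a time and name each amino acid."""
    return [CODON_TABLE.get(mrna[i:i + 3], "???") for i in range(0, len(mrna), 3)]

original = "GAAAAA"  # codons GAA + AAA -> Glu, Lys
mutant   = "GUAAAA"  # a single base substitution (A -> U at position 2)

print(translate(original))  # ['Glu', 'Lys']
print(translate(mutant))    # ['Val', 'Lys'] -- one base changed, new protein
```

A Glu-to-Val swap of this kind is akin to the real single-base mutation behind sickle-cell anemia, one illustration of how a small chemical change can matter for survival and reproduction.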

Evolution does not change any single individual. Instead, it changes the inherited means of growth and development that typify a population (a group of individuals of the same species living in a particular habitat). Parents pass adaptive genetic changes to their offspring, and ultimately these changes become common throughout a population. As a result, the offspring inherit those genetic characteristics that enhance their chances of survival and ability to give birth, which may work well until the environment changes. Over time, genetic change can alter a species' overall way of life, such as what it eats, how it grows, and where it can live. Human evolution took place as new genetic variations in early ancestor populations favored new abilities to adapt to environmental change and so altered the human way of life.
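
The population-level logic of that paragraph can be simulated in a few lines. The following is a rough sketch under invented assumptions (a haploid two-allele population, a 5% reproductive advantage for allele A, a population of 1,000); it is not a model from the source, only an illustration of how a heritable advantage spreads.

```python
import random

def next_generation(freq_a: float, fitness_a: float = 1.05,
                    fitness_b: float = 1.00, pop_size: int = 1000) -> float:
    """Return allele A's frequency after one round of selection and sampling."""
    weight_a = freq_a * fitness_a          # A's share, boosted by its advantage
    weight_b = (1.0 - freq_a) * fitness_b  # B's share
    p = weight_a / (weight_a + weight_b)   # expected post-selection frequency
    # Random sampling of a finite population adds genetic drift.
    survivors_with_a = sum(random.random() < p for _ in range(pop_size))
    return survivors_with_a / pop_size

freq = 0.10  # allele A starts rare
for _ in range(200):
    freq = next_generation(freq)
print(f"Frequency of A after 200 generations: {freq:.2f}")  # approaches 1.0
```

Even a small heritable advantage, compounded over generations, makes a trait common throughout a population -- which is the sense in which evolution changes populations rather than individuals.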

Dr. Rick Potts provides a short video introduction to some of the evidence for human evolution, in the form of fossils and artifacts.

The rest is here:

Introduction to Human Evolution | The Smithsonian Institution ...


EvolutionM.net – Mitsubishi Lancer Evolution | Reviews, News …

Posted: at 6:25 am

Perspective. When you take any testimony into account, it's the single most important variable to consider. My vantage point on the RS is a bit unique, coming from an Evo background, where the benchmark for a turbo-AWD car is performance first. Handling, acceleration, braking, and upgrade potential are all important to me. The furthest turbo-AWD [...] More

A friend of mine recently picked up a 2016 GT350R, and we met at our local Cars & Coffee to weigh his car. As it turns out, we were able to find all versions of the S550 present and get weights on them, sans the V6. The GT350R is gorgeous inside and out. I took [...] More

Recently I've picked up some corner scales. My intention is to start to weigh, photograph, and write mini features on local cars. I'm just getting ramped up, and it's been pretty cold locally, so there isn't much data yet. I'll be posting results in this thread: Corner weight database. I'm also using the [...] More

Today marks the closing of a book, and that makes me feel a little sad. I'm driving to Hallmark Mitsubishi in Nashville to review a Final Edition Evolution X, which will likely be the last new Evo I'll ever get to drive. I've arranged this with my friend Evan, who is the sales manager here. [...] More


Read more:

EvolutionM.net - Mitsubishi Lancer Evolution | Reviews, News ...


The Abolition of Work by Bob Black – Inspiracy

Posted: at 6:22 am

No one should ever work.

Work is the source of nearly all the misery in the world. Almost any evil you'd care to name comes from working or from living in a world designed for work. In order to stop suffering, we have to stop working.

That doesn't mean we have to stop doing things. It does mean creating a new way of life based on play; in other words, a ludic revolution. By play I mean also festivity, creativity, conviviality, commensality, and maybe even art. There is more to play than child's play, as worthy as that is. I call for a collective adventure in generalized joy and freely interdependent exuberance. Play isn't passive. Doubtless we all need a lot more time for sheer sloth and slack than we ever enjoy now, regardless of income or occupation, but once recovered from employment-induced exhaustion nearly all of us want to act.

The ludic life is totally incompatible with existing reality. So much the worse for reality, the gravity hole that sucks the vitality from the little in life that still distinguishes it from mere survival. Curiously -- or maybe not -- all the old ideologies are conservative because they believe in work. Some of them, like Marxism and most brands of anarchism, believe in work all the more fiercely because they believe in so little else.

Liberals say we should end employment discrimination. I say we should end employment. Conservatives support right-to-work laws. Following Karl Marx's wayward son-in-law Paul Lafargue I support the right to be lazy. Leftists favor full employment. Like the surrealists -- except that I'm not kidding -- I favor full unemployment. Trotskyists agitate for permanent revolution. I agitate for permanent revelry. But if all the ideologues (as they do) advocate work -- and not only because they plan to make other people do theirs -- they are strangely reluctant to say so. They will carry on endlessly about wages, hours, working conditions, exploitation, productivity, profitability. They'll gladly talk about anything but work itself. These experts who offer to do our thinking for us rarely share their conclusions about work, for all its saliency in the lives of all of us. Among themselves they quibble over the details. Unions and management agree that we ought to sell the time of our lives in exchange for survival, although they haggle over the price. Marxists think we should be bossed by bureaucrats. Libertarians think we should be bossed by businessmen. Feminists don't care which form bossing takes so long as the bosses are women. Clearly these ideology-mongers have serious differences over how to divvy up the spoils of power. Just as clearly, none of them have any objection to power as such and all of them want to keep us working.

You may be wondering if I'm joking or serious. I'm joking and serious. To be ludic is not to be ludicrous. Play doesn't have to be frivolous, although frivolity isn't triviality; very often we ought to take frivolity seriously. I'd like life to be a game -- but a game with high stakes. I want to play for keeps.

The alternative to work isn't just idleness. To be ludic is not to be quaaludic. As much as I treasure the pleasure of torpor, it's never more rewarding than when it punctuates other pleasures and pastimes. Nor am I promoting the managed time-disciplined safety-valve called leisure; far from it. Leisure is nonwork for the sake of work. Leisure is time spent recovering from work and in the frenzied but hopeless attempt to forget about work. Many people return from vacations so beat that they look forward to returning to work so they can rest up. The main difference between work and leisure is that at work at least you get paid for your alienation and enervation.

I am not playing definitional games with anybody. When I say I want to abolish work, I mean just what I say, but I want to say what I mean by defining my terms in non-idiosyncratic ways. My minimum definition of work is forced labor, that is, compulsory production. Both elements are essential. Work is production enforced by economic or political means, by the carrot or the stick. (The carrot is just the stick by other means.) But not all creation is work. Work is never done for its own sake, it's done on account of some product or output that the worker (or, more often, somebody else) gets out of it. This is what work necessarily is. To define it is to despise it. But work is usually even worse than its definition decrees. The dynamic of domination intrinsic to work tends over time toward elaboration. In advanced work-riddled societies, including all industrial societies whether capitalist or communist, work invariably acquires other attributes which accentuate its obnoxiousness.

Usually -- and this is even more true in communist than capitalist countries, where the state is almost the only employer and everyone is an employee -- work is employment, i.e., wage-labor, which means selling yourself on the installment plan. Thus 95% of Americans who work, work for somebody (or something) else. In Cuba or China or any other alternative model which might be adduced, the corresponding figure approaches 100%. Only the embattled Third World peasant bastions -- Mexico, India, Brazil, Turkey -- temporarily shelter significant concentrations of agriculturists who perpetuate the traditional arrangement of most laborers in the last several millennia, the payment of taxes (= ransom) to the state or rent to parasitic landlords in return for being otherwise left alone. Even this raw deal is beginning to look good. All industrial (and office) workers are employees and under the sort of surveillance which ensures servility.

But modern work has worse implications. People don't just work, they have jobs. One person does one productive task all the time on an or-else basis. Even if the task has a quantum of intrinsic interest (as increasingly many jobs don't) the monotony of its obligatory exclusivity drains its ludic potential. A job that might engage the energies of some people, for a reasonably limited time, for the fun of it, is just a burden on those who have to do it for forty hours a week with no say in how it should be done, for the profit of owners who contribute nothing to the project, and with no opportunity for sharing tasks or spreading the work among those who actually have to do it. This is the real world of work: a world of bureaucratic blundering, of sexual harassment and discrimination, of bonehead bosses exploiting and scapegoating their subordinates who -- by any rational-technical criteria -- should be calling the shots. But capitalism in the real world subordinates the rational maximization of productivity and profit to the exigencies of organizational control.

The degradation which most workers experience on the job is the sum of assorted indignities which can be denominated as discipline. Foucault has complexified this phenomenon but it is simple enough. Discipline consists of the totality of totalitarian controls at the workplace -- surveillance, rote work, imposed work tempos, production quotas, punching in and out, etc. Discipline is what the factory and the office and the store share with the prison and the school and the mental hospital. It is something historically original and horrible. It was beyond the capacities of such demonic dictators of yore as Nero and Genghis Khan and Ivan the Terrible. For all their bad intentions they just didn't have the machinery to control their subjects as thoroughly as modern despots do. Discipline is the distinctively diabolical modern mode of control, it is an innovative intrusion which must be interdicted at the earliest opportunity.

Such is work. Play is just the opposite. Play is always voluntary. What might otherwise be play is work if it's forced. This is axiomatic. Bernie de Koven has defined play as the suspension of consequences. This is unacceptable if it implies that play is inconsequential. The point is not that play is without consequences. This is to demean play. The point is that the consequences, if any, are gratuitous. Playing and giving are closely related, they are the behavioral and transactional facets of the same impulse, the play-instinct. They share an aristocratic disdain for results. The player gets something out of playing; that's why he plays. But the core reward is the experience of the activity itself (whatever it is). Some otherwise attentive students of play, like Johan Huizinga (Homo Ludens), define it as game-playing or following rules. I respect Huizinga's erudition but emphatically reject his constraints. There are many good games (chess, baseball, Monopoly, bridge) which are rule-governed but there is much more to play than game-playing. Conversation, sex, dancing, travel -- these practices aren't rule-governed but they are surely play if anything is. And rules can be played with at least as readily as anything else.

Work makes a mockery of freedom. The official line is that we all have rights and live in a democracy. Other unfortunates who aren't free like we are have to live in police states. These victims obey orders or-else, no matter how arbitrary. The authorities keep them under regular surveillance. State bureaucrats control even the smaller details of everyday life. The officials who push them around are answerable only to higher-ups, public or private. Either way, dissent and disobedience are punished. Informers report regularly to the authorities. All this is supposed to be a very bad thing.

And so it is, although it is nothing but a description of the modern workplace. The liberals and conservatives and libertarians who lament totalitarianism are phonies and hypocrites. There is more freedom in any moderately de-Stalinized dictatorship than there is in the ordinary American workplace. You find the same sort of hierarchy and discipline in an office or factory as you do in a prison or a monastery. In fact, as Foucault and others have shown, prisons and factories came in at about the same time, and their operators consciously borrowed from each other's control techniques. A worker is a part-time slave. The boss says when to show up, when to leave, and what to do in the meantime. He tells you how much work to do and how fast. He is free to carry his control to humiliating extremes, regulating, if he feels like it, the clothes you wear or how often you go to the bathroom. With a few exceptions he can fire you for any reason, or no reason. He has you spied on by snitches and supervisors, he amasses a dossier on every employee. Talking back is called insubordination, just as if a worker is a naughty child, and it not only gets you fired, it disqualifies you for unemployment compensation. Without necessarily endorsing it for them either, it is noteworthy that children at home and in school receive much the same treatment, justified in their case by their supposed immaturity. What does this say about their parents and teachers who work?

The demeaning system of domination I've described rules over half the waking hours of a majority of women and the vast majority of men for decades, for most of their lifespans. For certain purposes it's not too misleading to call our system democracy or capitalism or -- better still -- industrialism, but its real names are factory fascism and office oligarchy. Anybody who says these people are free is lying or stupid. You are what you do. If you do boring, stupid, monotonous work, chances are you'll end up boring, stupid, and monotonous. Work is a much better explanation for the creeping cretinization all around us than even such significant moronizing mechanisms as television and education. People who are regimented all their lives, handed to work from school and bracketed by the family in the beginning and the nursing home in the end, are habituated to hierarchy and psychologically enslaved. Their aptitude for autonomy is so atrophied that their fear of freedom is among their few rationally grounded phobias. Their obedience training at work carries over into the families they start, thus reproducing the system in more ways than one, and into politics, culture and everything else. Once you drain the vitality from people at work, they'll likely submit to hierarchy and expertise in everything. They're used to it.

We are so close to the world of work that we can't see what it does to us. We have to rely on outside observers from other times or other cultures to appreciate the extremity and the pathology of our present position. There was a time in our own past when the work ethic would have been incomprehensible, and perhaps Weber was on to something when he tied its appearance to a religion, Calvinism, which if it emerged today instead of four centuries ago would immediately and appropriately be labeled a cult. Be that as it may, we have only to draw upon the wisdom of antiquity to put work in perspective. The ancients saw work for what it is, and their view prevailed, the Calvinist cranks notwithstanding, until overthrown by industrialism -- but not before receiving the endorsement of its prophets.

Let's pretend for a moment that work doesn't turn people into stultified submissives. Let's pretend, in defiance of any plausible psychology and the ideology of its boosters, that it has no effect on the formation of character. And let's pretend that work isn't as boring and tiring and humiliating as we all know it really is. Even then, work would still make a mockery of all humanistic and democratic aspirations, just because it usurps so much of our time. Socrates said that manual laborers make bad friends and bad citizens because they have no time to fulfill the responsibilities of friendship and citizenship. He was right. Because of work, no matter what we do, we keep looking at our watches. The only thing free about so-called free time is that it doesn't cost the boss anything. Free time is mostly devoted to getting ready for work, going to work, returning from work, and recovering from work. Free time is a euphemism for the peculiar way labor, as a factor of production, not only transports itself at its own expense to and from the workplace, but assumes primary responsibility for its own maintenance and repair. Coal and steel don't do that. Lathes and typewriters don't do that. No wonder Edward G. Robinson in one of his gangster movies exclaimed, "Work is for saps!"

Both Plato and Xenophon attribute to Socrates and obviously share with him an awareness of the destructive effects of work on the worker as a citizen and as a human being. Herodotus identified contempt for work as an attribute of the classical Greeks at the zenith of their culture. To take only one Roman example, Cicero said that "whoever gives his labor for money sells himself and puts himself in the rank of slaves." His candor is now rare, but contemporary primitive societies which we are wont to look down upon have provided spokesmen who have enlightened Western anthropologists. The Kapauku of West Irian, according to Posposil, have a conception of balance in life and accordingly work only every other day, the day of rest designed "to regain the lost power and health." Our ancestors, even as late as the eighteenth century when they were far along the path to our present predicament, at least were aware of what we have forgotten, the underside of industrialization. Their religious devotion to "St. Monday" -- thus establishing a de facto five-day week 150-200 years before its legal consecration -- was the despair of the earliest factory owners. They took a long time in submitting to the tyranny of the bell, predecessor of the time clock. In fact it was necessary for a generation or two to replace adult males with women accustomed to obedience and children who could be molded to fit industrial needs. Even the exploited peasants of the ancien régime wrested substantial time back from their landlords' work. According to Lafargue, a fourth of the French peasants' calendar was devoted to Sundays and holidays, and Chayanov's figures from villages in Czarist Russia -- hardly a progressive society -- likewise show a fourth or fifth of peasants' days devoted to repose. Controlling for productivity, we are obviously far behind these backward societies. The exploited muzhiks would wonder why any of us are working at all. So should we.

To grasp the full enormity of our deterioration, however, consider the earliest condition of humanity, without government or property, when we wandered as hunter-gatherers. Hobbes surmised that life was then nasty, brutish and short. Others assume that life was a desperate unremitting struggle for subsistence, a war waged against a harsh Nature with death and disaster awaiting the unlucky or anyone who was unequal to the challenge of the struggle for existence. Actually, that was all a projection of fears for the collapse of government authority over communities unaccustomed to doing without it, like the England of Hobbes during the Civil War. Hobbes' compatriots had already encountered alternative forms of society which illustrated other ways of life -- in North America, particularly -- but already these were too remote from their experience to be understandable. (The lower orders, closer to the condition of the Indians, understood it better and often found it attractive. Throughout the seventeenth century, English settlers defected to Indian tribes or, captured in war, refused to return to the colonies. But the Indians no more defected to white settlements than West Germans climbed the Berlin Wall from the west.) The "survival of the fittest" version -- the Thomas Huxley version -- of Darwinism was a better account of economic conditions in Victorian England than it was of natural selection, as the anarchist Kropotkin showed in his book Mutual Aid, A Factor in Evolution. (Kropotkin was a scientist -- a geographer who'd had ample involuntary opportunity for fieldwork whilst exiled in Siberia -- he knew what he was talking about.) Like most social and political theory, the story Hobbes and his successors told was really unacknowledged autobiography.

The anthropologist Marshall Sahlins, surveying the data on contemporary hunter-gatherers, exploded the Hobbesian myth in an article entitled "The Original Affluent Society." They work a lot less than we do, and their work is hard to distinguish from what we regard as play. Sahlins concluded that "hunters and gatherers work less than we do; and, rather than a continuous travail, the food quest is intermittent, leisure abundant, and there is a greater amount of sleep in the daytime per capita per year than in any other condition of society." They worked an average of four hours a day, assuming they were working at all. Their "labor," as it appears to us, was skilled labor which exercised their physical and intellectual capacities; unskilled labor on any large scale, as Sahlins says, is impossible except under industrialism. Thus it satisfied Friedrich Schiller's definition of play, the only occasion on which man realizes his complete humanity by giving full play to both sides of his twofold nature, thinking and feeling. As he put it: "The animal works when deprivation is the mainspring of its activity, and it plays when the fullness of its strength is this mainspring, when superabundant life is its own stimulus to activity." (A modern version -- dubiously developmental -- is Abraham Maslow's counterposition of deficiency and growth motivation.) Play and freedom are, as regards production, coextensive. Even Marx, who belongs (for all his good intentions) in the productivist pantheon, observed that "the realm of freedom does not commence until the point is passed where labor under the compulsion of necessity and external utility is required." He never could quite bring himself to identify this happy circumstance as what it is, the abolition of work -- it's rather anomalous, after all, to be pro-worker and anti-work -- but we can.

The aspiration to go backwards or forwards to a life without work is evident in every serious social or cultural history of pre-industrial Europe, among them M. Dorothy George's England in Transition and Peter Burke's Popular Culture in Early Modern Europe. Also pertinent is Daniel Bell's essay "Work and Its Discontents," the first text, I believe, to refer to the "revolt against work" in so many words and, had it been understood, an important correction to the complacency ordinarily associated with the volume in which it was collected, The End of Ideology. Neither critics nor celebrants have noticed that Bell's end-of-ideology thesis signaled not the end of social unrest but the beginning of a new, uncharted phase unconstrained and uninformed by ideology. It was Seymour Lipset (in Political Man), not Bell, who announced at the same time that "the fundamental problems of the Industrial Revolution have been solved," only a few years before the post- or meta-industrial discontents of college students drove Lipset from UC Berkeley to the relative (and temporary) tranquillity of Harvard.

As Bell notes, Adam Smith in The Wealth of Nations, for all his enthusiasm for the market and the division of labor, was more alert to (and more honest about) the seamy side of work than Ayn Rand or the Chicago economists or any of Smith's modern epigones. As Smith observed: "The understandings of the greater part of men are necessarily formed by their ordinary employments. The man whose life is spent in performing a few simple operations has no occasion to exert his understanding... He generally becomes as stupid and ignorant as it is possible for a human creature to become." Here, in a few blunt words, is my critique of work. Bell, writing in 1956, the Golden Age of Eisenhower imbecility and American self-satisfaction, identified the unorganized, unorganizable malaise of the 1970s and since, the one no political tendency is able to harness, the one identified in HEW's report Work in America, the one which cannot be exploited and so is ignored. That problem is the revolt against work. It does not figure in any text by any laissez-faire economist -- Milton Friedman, Murray Rothbard, Richard Posner -- because, in their terms, as they used to say on Lost in Space, "it does not compute."

If these objections, informed by the love of liberty, fail to persuade humanists of a utilitarian or even paternalist turn, there are others which they cannot disregard. Work is hazardous to your health, to borrow a book title. In fact, work is mass murder or genocide. Directly or indirectly, work will kill most of the people who read these words. Between 14,000 and 25,000 workers are killed annually in this country on the job. Over two million are disabled. Twenty to twenty-five million are injured every year. And these figures are based on a very conservative estimation of what constitutes a work-related injury. Thus they don't count the half-million cases of occupational disease every year. I looked at one medical textbook on occupational diseases which was 1,200 pages long. Even this barely scratches the surface. The available statistics count the obvious cases like the 100,000 miners who have black lung disease, of whom 4,000 die every year. What the statistics don't show is that tens of millions of people have their lifespans shortened by work -- which is all that homicide means, after all. Consider the doctors who work themselves to death in their late 50s. Consider all the other workaholics.

Even if you aren't killed or crippled while actually working, you very well might be while going to work, coming from work, looking for work, or trying to forget about work. The vast majority of victims of the automobile are either doing one of these work-obligatory activities or else fall afoul of those who do them. To this augmented body-count must be added the victims of auto-industrial pollution and work-induced alcoholism and drug addiction. Both cancer and heart disease are modern afflictions normally traceable, directly or indirectly, to work.

Work, then, institutionalizes homicide as a way of life. People think the Cambodians were crazy for exterminating themselves, but are we any different? The Pol Pot regime at least had a vision, however blurred, of an egalitarian society. We kill people in the six-figure range (at least) in order to sell Big Macs and Cadillacs to the survivors. Our forty or fifty thousand annual highway fatalities are victims, not martyrs. They died for nothing -- or rather, they died for work. But work is nothing to die for.

State control of the economy is no solution. Work is, if anything, more dangerous in the state-socialist countries than it is here. Thousands of Russian workers were killed or injured building the Moscow subway. Chernobyl and other Soviet nuclear disasters, covered up until recently, make Times Beach and Three Mile Island -- but not Bhopal -- look like elementary-school air-raid drills. On the other hand, deregulation, currently fashionable, won't help and will probably hurt. From a health and safety standpoint, among others, work was at its worst in the days when the economy most closely approximated laissez-faire. Historians like Eugene Genovese have argued persuasively that -- as antebellum slavery apologists insisted -- factory wage-workers in the Northern American states and in Europe were worse off than Southern plantation slaves. No rearrangement of relations among bureaucrats and businessmen seems to make much difference at the point of production. Serious implementation of even the rather vague standards enforceable in theory by OSHA would probably bring the economy to a standstill. The enforcers apparently appreciate this, since they don't even try to crack down on most malefactors.

What I've said so far ought not to be controversial. Many workers are fed up with work. There are high and rising rates of absenteeism, turnover, employee theft and sabotage, wildcat strikes, and overall goldbricking on the job. There may be some movement toward a conscious and not just visceral rejection of work. And yet the prevalent feeling, universal among bosses and their agents and also widespread among workers themselves, is that work itself is inevitable and necessary.

I disagree. It is now possible to abolish work and replace it, insofar as it serves useful purposes, with a multitude of new kinds of free activities. To abolish work requires going at it from two directions, quantitative and qualitative. On the one hand, on the quantitative side, we have to cut down massively on the amount of work being done. At present most work is useless or worse and we should simply get rid of it. On the other hand -- and I think this is the crux of the matter and the revolutionary new departure -- we have to take what useful work remains and transform it into a pleasing variety of game-like and craft-like pastimes, indistinguishable from other pleasurable pastimes except that they happen to yield useful end-products. Surely that shouldn't make them less enticing to do. Then all the artificial barriers of power and property could come down. Creation could become recreation. And we could all stop being afraid of each other.

I don't suggest that most work is salvageable in this way. But then most work isn't worth trying to save. Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Thirty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done -- presumably the figure, if accurate, is lower now -- would satisfy our minimal needs for food, clothing and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkies and underlings also. Thus the economy implodes.

Forty percent of the workforce are white-collar workers, most of whom have some of the most tedious and idiotic jobs ever concocted. Entire industries, insurance and banking and real estate for instance, consist of nothing but useless paper-shuffling. It is no accident that the tertiary sector, the service sector, is growing while the secondary sector (industry) stagnates and the primary sector (agriculture) nearly disappears. Because work is unnecessary except to those whose power it secures, workers are shifted from relatively useful to relatively useless occupations as a measure to ensure public order. Anything is better than nothing. That's why you can't go home just because you finish early. They want your time, enough of it to make you theirs, even if they have no use for most of it. Otherwise why hasn't the average work week gone down by more than a few minutes in the last sixty years?

Next we can take a meat-cleaver to production work itself. No more war production, nuclear power, junk food, feminine hygiene deodorant -- and above all, no more auto industry to speak of. An occasional Stanley Steamer or Model T might be all right, but the auto-eroticism on which such pest-holes as Detroit and Los Angeles depend is out of the question. Already, without even trying, we've virtually solved the energy crisis, the environmental crisis and assorted other insoluble social problems.

Finally, we must do away with far and away the largest occupation, the one with the longest hours, the lowest pay and some of the most tedious tasks around. I refer to housewives doing housework and child-rearing. By abolishing wage-labor and achieving full unemployment we undermine the sexual division of labor. The nuclear family as we know it is an inevitable adaptation to the division of labor imposed by modern wage-work. Like it or not, as things have been for the last century or two it is economically rational for the man to bring home the bacon, for the woman to do the shitwork and provide him with a haven in a heartless world, and for the children to be marched off to youth concentration camps called schools, primarily to keep them out of Mom's hair but still under control, but incidentally to acquire the habits of obedience and punctuality so necessary for workers. If you would be rid of patriarchy, get rid of the nuclear family whose unpaid "shadow work," as Ivan Illich says, makes possible the work-system that makes it necessary. Bound up with this no-nukes strategy is the abolition of childhood and the closing of the schools. There are more full-time students than full-time workers in this country. We need children as teachers, not students. They have a lot to contribute to the ludic revolution because they're better at playing than grown-ups are. Adults and children are not identical but they will become equal through interdependence. Only play can bridge the generation gap.

I haven't as yet even mentioned the possibility of cutting way down on the little work that remains by automating and cybernizing it. All the scientists and engineers and technicians freed from bothering with war research and planned obsolescence should have a good time devising means to eliminate fatigue and tedium and danger from activities like mining. Undoubtedly they'll find other projects to amuse themselves with. Perhaps they'll set up world-wide all-inclusive multi-media communications systems or found space colonies. Perhaps. I myself am no gadget freak. I wouldn't care to live in a pushbutton paradise. I don't want robot slaves to do everything; I want to do things myself. There is, I think, a place for labor-saving technology, but a modest place. The historical and pre-historical record is not encouraging. When productive technology went from hunting-gathering to agriculture and on to industry, work increased while skills and self-determination diminished. The further evolution of industrialism has accentuated what Harry Braverman called the degradation of work. Intelligent observers have always been aware of this. John Stuart Mill wrote that all the labor-saving inventions ever devised haven't saved a moment's labor. Karl Marx wrote that "it would be possible to write a history of the inventions, made since 1830, for the sole purpose of supplying capital with weapons against the revolts of the working class." The enthusiastic technophiles -- Saint-Simon, Comte, Lenin, B.F. Skinner -- have always been unabashed authoritarians also; which is to say, technocrats. We should be more than skeptical about the promises of the computer mystics. They work like dogs; chances are, if they have their way, so will the rest of us. But if they have any particularized contributions more readily subordinated to human purposes than the run of high tech, let's give them a hearing.

What I really want to see is work turned into play. A first step is to discard the notions of a job and an occupation. Even activities that already have some ludic content lose most of it by being reduced to jobs which certain people, and only those people, are forced to do to the exclusion of all else. Is it not odd that farm workers toil painfully in the fields while their air-conditioned masters go home every weekend and putter about in their gardens? Under a system of permanent revelry, we will witness the Golden Age of the dilettante which will put the Renaissance to shame. There won't be any more jobs, just things to do and people to do them.

The secret of turning work into play, as Charles Fourier demonstrated, is to arrange useful activities to take advantage of whatever it is that various people at various times in fact enjoy doing. To make it possible for some people to do the things they could enjoy, it will be enough just to eradicate the irrationalities and distortions which afflict these activities when they are reduced to work. I, for instance, would enjoy doing some (not too much) teaching, but I don't want coerced students and I don't care to suck up to pathetic pedants for tenure.

Second, there are some things that people like to do from time to time, but not for too long, and certainly not all the time. You might enjoy baby-sitting for a few hours in order to share the company of kids, but not as much as their parents do. The parents meanwhile profoundly appreciate the time to themselves that you free up for them, although they'd get fretful if parted from their progeny for too long. These differences among individuals are what make a life of free play possible. The same principle applies to many other areas of activity, especially the primal ones. Thus many people enjoy cooking when they can practice it seriously at their leisure, but not when they're just fueling up human bodies for work.

Third -- other things being equal -- some things that are unsatisfying if done by yourself or in unpleasant surroundings or at the orders of an overlord are enjoyable, at least for a while, if these circumstances are changed. This is probably true, to some extent, of all work. People deploy their otherwise wasted ingenuity to make a game of the least inviting drudge-jobs as best they can. Activities that appeal to some people don't always appeal to all others, but everyone at least potentially has a variety of interests and an interest in variety. As the saying goes, anything once. Fourier was the master at speculating about how aberrant and perverse penchants could be put to use in post-civilized society, what he called Harmony. He thought the Emperor Nero would have turned out all right if as a child he could have indulged his taste for bloodshed by working in a slaughterhouse. Small children who notoriously relish wallowing in filth could be organized in Little Hordes to clean toilets and empty the garbage, with medals awarded to the outstanding. I am not arguing for these precise examples but for the underlying principle, which I think makes perfect sense as one dimension of an overall revolutionary transformation. Bear in mind that we don't have to take today's work just as we find it and match it up with the proper people, some of whom would have to be perverse indeed.

If technology has a role in all this, it is less to automate work out of existence than to open up new realms for re/creation. To some extent we may want to return to handicrafts, which William Morris considered a probable and desirable upshot of communist revolution. Art would be taken back from the snobs and collectors, abolished as a specialized department catering to an elite audience, and its qualities of beauty and creation restored to integral life from which they were stolen by work. It's a sobering thought that the Grecian urns we write odes about and showcase in museums were used in their own time to store olive oil. I doubt our everyday artifacts will fare as well in the future, if there is one. The point is that there's no such thing as progress in the world of work; if anything, it's just the opposite. We shouldn't hesitate to pilfer the past for what it has to offer, the ancients lose nothing yet we are enriched.

The reinvention of daily life means marching off the edge of our maps. There is, it is true, more suggestive speculation than most people suspect. Besides Fourier and Morris -- and even a hint, here and there, in Marx -- there are the writings of Kropotkin, the syndicalists Pataud and Pouget, anarcho-communists old (Berkman) and new (Bookchin). The Goodman brothers' Communitas is exemplary for illustrating what forms follow from given functions (purposes), and there is something to be gleaned from the often hazy heralds of alternative/appropriate/intermediate/convivial technology, like Schumacher and especially Illich, once you disconnect their fog machines. The situationists as represented by Vaneigem's Revolution of Everyday Life and in the Situationist International Anthology are so ruthlessly lucid as to be exhilarating, even if they never did quite square the endorsement of the rule of the workers' councils with the abolition of work. Better their incongruity, though, than any extant version of leftism, whose devotees look to be the last champions of work, for if there were no work there would be no workers, and without workers, whom would the left have to organize?

So the abolitionists will be largely on their own. No one can say what would result from unleashing the creative power stultified by work. Anything can happen. The tiresome debaters' problem of freedom vs. necessity, with its theological overtones, resolves itself practically once the production of use-values is coextensive with the consumption of delightful play-activity.

Life will become a game, or rather many games, but not -- as it is now -- a zero-sum game. An optimal sexual encounter is the paradigm of productive play. The participants potentiate each other's pleasures, nobody keeps score, and everybody wins. The more you give, the more you get. In the ludic life, the best of sex will diffuse into the better part of daily life. Generalized play leads to the libidinization of life. Sex, in turn, can become less urgent and desperate, more playful. If we play our cards right, we can all get more out of life than we put into it; but only if we play for keeps.

Workers of the world... relax!

This essay originated as a speech in 1980. A revised and enlarged version was published as a pamphlet in 1985, and in the first edition of The Abolition of Work and Other Essays (Loompanics Unlimited, 1986). It has also appeared in many periodicals and anthologies, including translations into French, German, Italian, Dutch and Slovene. Revised by the author for the Inspiracy Press edition.

Part I: The Abolition of Work

See the original post here:

The Abolition of Work by Bob Black - Inspiracy


Atheism – RationalWiki

Posted: at 6:20 am

Our belief is not a belief. Our principles are not a faith. We do not rely solely upon science and reason, because these are necessary rather than sufficient factors, but we distrust anything that contradicts science or outrages reason. We may differ on many things, but what we respect is free inquiry, openmindedness, and the pursuit of ideas for their own sake.

Atheism, from the Greek a-, meaning "without", and theos, meaning "god", is the absence of belief in the existence of gods. Theos includes the Abrahamic YHWH(s), Zeus, the Flying Spaghetti Monster, and every other deity from A to Z[1] (and 0-9, !, ", #, $, or any other character, obviously). For the definition of atheism, the terms "God" and "a god" are used interchangeably, as there is no difference between a monotheistic deity and a polytheistic pantheon of deities when it comes to complete disbelief in them. This also has the deliberate intent of ignoring the privileged position Yahweh has held in English grammar. Most atheists also do not believe in anything supernatural or paranormal (someone like this would be considered a naturalist).

We are all atheists about most of the gods that humanity has ever believed in. Some of us just go one god further.

Tied up with some of the more awkward aspects of defining the term "atheist" is the question of what god, or type of god, is being denied. This is particularly important for those who claim that atheism is supported by evidence (more specifically, the lack of evidence for a theistic case).

If the god being denied is the interventionist God, which most theists hold to exist, then the argument against the existence of this being is easy; the lack of any demonstrable interventions demonstrates the god's lack of existence. In this case, absence of evidence is evidence of absence. However, if the god being denied is of a less interventionist, or deist, type, then the above argument regarding evidence doesn't work. Indeed, the only possible "evidence" for a deist god is the very existence of the universe, and most sane people don't tend to deny the universe exists. On the other hand, as said "evidence" is simply asserted and isn't testable in any way, it is a lot less than wholly convincing, and we return to "What can be asserted without evidence can be dismissed without evidence."

Whether atheism also requires a person to disbelieve in all other forms of magic, or ghosts, or psychic powers is also a question. These are not "gods" in the conventional sense at all, but they are still supernatural entities or powers. More "hardline" atheists would insist that disbelief in all things supernatural is mandatory for the label of "atheist." They would argue that this follows from the fact that atheism is a rational position, and that therefore atheists should take rational positions on other matters also. What does and what does not constitute a "god" in the case of atheism can often be very subjective; the definition could be restricted to monotheistic "creator" gods, or expanded to include all supernatural entities, or used to describe only things that are worshipped or idolised. The variables that arise when trying to perfectly codify "atheism" are numerous, and this is fitting with its position as specifically a lack of belief.

However, atheism only makes sense in the context of the ubiquity of religion and theistic belief worldwide. If religions didn't exist, atheism wouldn't exist and any discussion of the subject would be inherently meaningless - the world doesn't feature books, internet debates and billboard campaigns saying that it's fine to disbelieve in Bertrand Russell's celestial teapot precisely because few, if any, people believe in the teapot. Therefore a working, albeit still slightly subjective, definition of what constitutes a "god" can be developed based on the beliefs of self-declared religions of the world. As a thought experiment we can conceive of a religion that achieves literal overnight success by promoting some god, Athkel,[3] who will become a worldwide phenomenon tomorrow. An atheist would simply not believe in Athkel tomorrow, despite having had no belief in him or her yesterday, because Athkel is the deity of a self-declared religion.

There are many ways to describe different types of atheism and some of these are explained below. These shouldn't be read as factions or sects within atheism in the same way as denominations and sects within religion (Protestant/Catholic in Christianity, Sunni/Shiite in Islam, and their multiple sub-groups, for example). One does not "join" a group of implicit atheists. Instead of being sects that dictate people's beliefs, these should be taken as models to, at least roughly, describe people's beliefs and their attitudes towards belief itself. There are many similarities, all of which are included in the blanket term "atheist." However - as is typical in atheist thought - not all atheists consider these divisions particularly relevant, worthwhile, or meaningful.

The commonality among these various modes of atheism is the statement that no god or gods created natural phenomena such as the existence of life or the universe. Instead, these are usually explained through science, specifically without resort to supernatural explanations. Morality in atheism is also not based on religious precepts such as divine commandments or revelation through a holy text - many alternative philosophies exist to derive or explain morality, such as humanism.

Implicit atheism is simply the state of not believing in any gods.

Explicit atheism is a conscious rejection, either of the belief in gods or of their existence. Explicit atheists can be weak or strong atheists, but all strong atheists are explicit atheists.

Weak atheism (sometimes equated with "pragmatic atheism" or "negative atheism") describes the state of living as if no gods exist. It does not require an absolute statement of God's non-existence. The argument is based on the fact that as there is no evidence that gods, spatial teapots or fairies exist, we have no reason to believe in them. This argument could also be classified as extreme agnosticism, or "agnostic atheism" - as it is an acknowledgment of the lack of evidence but acting as if there were no gods.

Pragmatic atheists however are frequently reluctant to make outright statements like "Gods (or fairies) do not exist", because of the great difficulties involved in proving the absolute non-existence of anything - the idea that nothing can be proved is held in the philosophy of pyrrhonism. Consequently many pragmatic atheists would argue that the burden of proof does not lie with them to provide evidence against the extraordinary concept that gods exist. They would argue that it is up to the supporters of various religions to provide evidence for the existence of their own deities, and that no argument is necessary on the atheist's part.

Christopher Hitchens put it another way when he said: "What can be asserted without evidence can also be dismissed without evidence."

Strong atheism (sometimes equated with "theoretical atheism") makes an explicit statement against the existence of gods. Strong atheists would disagree with weak atheists about the inability to disprove the existence of gods. Strong atheism specifically combats religious beliefs and other arguments for belief in some god (or gods), such as Pascal's Wager and the argument from design. These arguments tend to be geared toward demonstrating that the concept of god is logically inconsistent or incoherent in order to actively disprove the existence of a god.[4] Theological noncognitivism, which asserts the meaninglessness of religious language, is an argument commonly invoked by strong atheists.[5] In contrast, weak atheist arguments tend to concentrate on the evidence (or lack thereof) for god, while strong atheist arguments tend to concentrate on making a positive case for the non-existence of god.

An apatheist has no interest in accepting or denying claims that a god or gods exist or do not exist. An apatheist considers the very question of the existence or non-existence of gods or other supernatural beings to be irrelevant and not worth consideration under any circumstances.

In short: they simply don't care. (Well, OK, they care enough to give themselves a name - so that people explicitly know what it is they don't care anything about. But that's it.)

Antitheism is, perhaps surprisingly, technically separate from any and all positions on the existence or non-existence of any given deity. Antitheism simply argues that a given (or all possible) human implementations of religious beliefs, metaphysically "true" or not, lead to results that are harmful and undesirable, either to the adherent, to society, or - usually - to both. As justification the antitheists will often point to the incompatibility of religion-based morality with modern humanistic values, or to the atrocities and bloodshed wrought by religion and by religious wars. Religious moderation as compared to religious extremism is an example of theistic anti-theism, also known as dystheism. Dystheism also encompasses questioning the morals even of a deity you believe in, e.g. choosing to obey commandments on nonviolence over calls to violence from God, despite them both being clearly put forward by this alleged giver of all morals.

We must question the story logic of having an all-knowing all-powerful God, who creates faulty Humans, and then blames them for his own mistakes.

Not all atheists are "disaffected with religion" - some were just never raised with or indoctrinated with religious beliefs in the first place. Hence a substantial number have nothing to become disaffected with. However, in areas where religious belief is essentially taken as normal, there is a high chance that a person will have been religious before "coming out" as an atheist. As the term "atheist" only really means something in the context of ubiquitous religious belief, being disaffected or unconvinced by religion is certainly a factor in most, if not all, people who declare themselves as an atheist. As has been said previously, there is debate in the atheist community and not all atheists would agree with all of these reasons or even consider them relevant to atheism.

One of the major intellectual issues regarding disenchantment with religion is the fact that most world religions insist that all other faiths are wrong. While some moderate believers may like to take a stance that "all religions are right, they're just different interpretations", it's undeniable that heresy and apostasy are looked down upon very harshly in many faiths. This suggests the possibility that no religion is right, and further suggests that, because the vast majority of believers in any faith are born into it, being a member of the "correct" group or "the elect" is merely an accident of birth in most cases. There is also historical evidence that organized religion, while professing a peaceful moral code, is often the basis for exclusion and war as well as a method to motivate people in political conflicts. The enmity among different religions and even among sects within the same religion adds credibility to this idea.

Other reasons may be more directly to do with a religion or its specifics - namely (1) the evils that the concept of religion has produced over the ages, (2) the hypocrisy of professed believers and religious leaders who exhort their followers to help the poor, love their neighbors and behave morally but become wealthy through donations to the church and carry love for certain neighbors to an immoral extreme as defined by their own professed religious beliefs, and (3) the contradiction between talk of a loving god and a world in which children starve to death and innocent people are tortured and killed. Issues with religion may arise due to the nature of fundamentalists - insisting that their holy texts are literally true. This leads to attempts by such fundamentalists to undermine education by censoring scientific knowledge that seems to contradict their beliefs. Intelligent design is a prominent case of this (see Kitzmiller v. Dover Area School District). Often this doesn't sit well with moderate believers and especially those who may be on the verge of losing their faith, especially when the evidence provided by daily experience suggests that there may be no events that cannot be explained by common sense and scientific study.

Other issues that atheists have with religion involve the characteristics of supposed gods. Atheists sometimes view the idea that a supreme all-knowing deity would have the narcissistic need to be worshiped, and would punish anyone for worshiping a different god (or none at all), to be perverse.

Lastly, formerly religious atheists often report having had their belief system unsettled by the lack of evidence supporting the notion of the supernatural.

Arguments related to the burden of proof deal with whether atheists must disprove theism or theists must prove it. Conventionally, the burden of proof lies with the person proposing a positive idea - or, as Karl Popper fans would put it, with those proposing something falsifiable. By this standard, atheists have no need to prove anything; they just need to render arguments for the existence of God non-compelling. However, the ubiquity of religion in society and history has often shifted the burden of proof to atheists, who must subsequently prove a negative. Assuming from the outset that God exists is known as presuppositionalism; it has long been a key tenet of Christian apologetics but is usually rejected by more sensible scholars. The absurdity of being asked to prove a negative is demonstrated by Bertrand Russell's teapot thought experiment: no matter how hard you look, you can't thoroughly disprove the belief that a teapot is out there in space, orbiting the sun somewhere between Earth and Mars. This sort of presuppositional thinking is illogical, so asking an atheist to disprove God is an unreasonable request.

Occam's razor can also be invoked as a guide to making the fewest assumptions, and assuming a priori that God exists is a major assumption that should be avoided. Taken together, these considerations lay the burden of proof on theists and indicate that, without supporting evidence, the default position on God must be either weak-ish atheism or agnosticism rather than theism. Proponents of atheism argue that this burden has not been met by those proposing that a god exists, let alone the specific gods described by major religions.

If someone doesn't value evidence, what evidence are you going to provide to prove that they should value it? If someone doesn't value logic, what logical argument could you provide to show the importance of logic?

Logical arguments try to show that God cannot possibly exist (at least as described). Barring escape-hatch arguments like Goddidit, some properties of God are not compatible with each other or with known facts about the world, and thus a creator-god cannot be a logically consistent and existent entity. These arguments depend heavily on common descriptions of the Abrahamic God as a target: things such as omnipotence, omnipresence, and omnibenevolence. As a result, they are not as useful in trying to refute the claims of, say, Neopaganism, and are also vulnerable to the tactic of moving the goalposts by changing the descriptions of God.

The omnipotence paradox postulates that true omnipotence is either not logically possible or not compatible with omniscience. This is primarily a logical argument based on the general question of whether an omnipotent being could limit its own power: if yes, then in doing so it would cease to be omnipotent; if no, then it was never omnipotent to begin with. Hence the paradox, which purports to show through contradiction that God cannot exist as usually described.
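The structure here is a simple dilemma, and it can be compressed into propositional form. In the sketch below, $L$ stands for "the being can limit its own power" and $O$ for "the being is omnipotent"; the symbols are illustrative shorthand, not part of any standard formulation, and the first premise glosses over modal subtleties that philosophers of religion still dispute:

\[
L \rightarrow \neg O \quad \text{(a power that can be surrendered is not unlimited)}
\]
\[
\neg L \rightarrow \neg O \quad \text{(an inability to self-limit is itself a limit)}
\]
\[
L \lor \neg L \;\; \therefore \;\; \neg O \quad \text{(by cases, from the law of excluded middle)}
\]

Either horn of the dilemma ends in $\neg O$, which is why defenders of classical theism usually respond by redefining omnipotence (for example, as the power to do anything logically possible) rather than by attacking the inference itself.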

Other logical arguments try to show that a god is not compatible with our scientific knowledge of reality. The problem of evil states that a good god wouldn't permit gratuitous evil, yet such evil occurs, so a good god does not exist.[6] The argument from design is often given as proof of a creator, but it raises the following logical question: if the world is so complex that it must have had a creator, then the creator must be at least as complex and must therefore have a creator itself, which would in turn require a still more complex creator, ad infinitum. Also, the argument from design does not offer evidence for any specific religion; while it could be taken as support for the existence of a god or gods, it doesn't argue for the Christian God any more than, say, the Hindu pantheon.
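Schematically, the problem of evil as stated above is a one-step modus tollens. With $G$ for "a wholly good, all-powerful god exists" and $E$ for "gratuitous evil occurs" (symbols chosen here purely for illustration):

\[
G \rightarrow \neg E, \qquad E, \qquad \therefore \; \neg G.
\]

Since the inference itself is valid, theistic responses (theodicies) typically target the premises instead - most often by denying that any actual evil is truly gratuitous.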

While believers hasten to point out that their gods don't need to follow logic, let alone the known laws of physics, this is really a case of special pleading and doesn't prove anything in itself. Atheists therefore tend to reject these counters to the logical arguments, as they mostly beg the question of a creator's existence and, quite arbitrarily, plead that a creator is exempt from the same logic that was used to "prove" its existence.

I know of no society in human history that ever suffered because its people became too desirous of evidence in support of their core beliefs.

At the root of the worldview of most atheists is evidence, and atheists point out that sufficient evidence for the existence of gods is currently lacking, and thus there is no reason to believe in them. Evidential arguments are less ambitious than logical arguments because, rather than showing that there is reason not to believe in a god, they show that there is no reason to believe in a god (see Burden of proof above). It is important to remember that what constitutes sufficient evidence can be quite subjective, although rationalism and science do offer some standardization. Holy books testifying to the existence of gods, alleged miracles, and personal experiences are all claimed as evidence in favor of the existence of a god character of some sort. However, atheists reject these as insufficient because the naturalistic explanations behind them (tracing the authors of the holy texts, psychological and scientific research into such experiences, and so on) are more plausible - indeed, the very existence of plausible naturalistic explanations renders the supernatural explanations obsolete. In addition, these books make claims on behalf of a variety of faiths, so to accept the Bible's stories as evidence, one would also have to accept as evidence the miracle stories from other religions' holy books.

As evidential arguments, atheists often cite evidence that processes attributed to a god might also occur naturally. If evolution and the big bang are true, then why would a creator god be needed?[7] Occam's razor makes theistic explanations less compelling.

Many atheists argue, in a similar vein to the born-again Christian who "just knows" that God exists, that the day-to-day experience of the atheist demonstrates quite clearly that God does not. This is because they have an image in their heads of what this "God" would have to look like, viz., an entity in the vein of the God of the Old Testament who runs around zapping entire cities, turning people into pillars of salt, and generally answering people's prayers in flashes of fire and brimstone - or answering prayers for the victory of a given football team, but not those made on behalf of starving children in disadvantaged parts of the world.

Nobody knows for sure how many clergy members are secretly atheists (or are secretly on the fence, with serious doubts about their religion). But almost everyone I've spoken with in Clergy Project strongly suspects that the numbers are high.

Studying religion in depth during training for clerical work can lead a person to examine religious ideas critically. The study of Christian theology covers the whole of the Bible, including historical background that can lead to rational doubt.[9][10]

In 2011, the Freedom From Religion Foundation and the Richard Dawkins Foundation for Science and Reason launched the Clergy Project, a confidential support group for clergy who no longer believe; by December 2012 the group had almost 400 members. One of the founders of the Clergy Project is Dan Barker, co-president of the Freedom From Religion Foundation, who was an evangelical preacher for nineteen years before becoming an atheist.[11] Gretta Vosper is an openly atheist minister whose congregation supports her. Former Methodist pastor Teresa MacBain received online support from the Clergy Project before dramatically coming out as an atheist at an atheist convention in spring 2012. She became Public Relations Director of American Atheists.[12] MacBain currently helps atheist groups build communities with what she sees as the positive aspects of religion - music, ritual and community service - without God.[13]

Freethought blogger Greta Christina articulates a possible effect that clergy openly leaving Christianity may have on their parishioners' beliefs. The more traditional position of clergy is that they are somehow endowed with answers to all questions of faith. If these trained religious authorities start saying they have no answers to ordinary "crises of faith" - even more so if some of them suggest the most reasonable answer is atheism - lay Christians will find continuing in their belief more difficult.[14] It is worth noting, however, that modern clergy trained in most US or UK universities are discouraged from claiming to be exempt from such crises of faith, and are instead encouraged to invite people to share a "journey of spiritual discovery". Perhaps atheism must simply be accepted as one outcome of that endeavor.

Because atheism is effectively a lack of inherent religious or political ideology, there is very little that unifies all atheists.

That said, atheists do tend to fit a certain profile.

Specific research on atheists conducted in 2006 suggests that the true proportion of atheists is 2%[15][16][17] to 4% in the United States, 17% in Great Britain and 32% in France. A 2004 Telegraph poll found that 44% of Britons believed in a god, 35% did not, and 21% did not know.[18]

According to a 2012 WIN-Gallup International poll, 13% of the world identifies as "atheist", 23% identifies as "not religious", and 59% identifies as "religious"; these results were 3% more "atheist", 9% less "religious", and 6% more "non-religious" than 2005. Of note, in the United States 13% fewer people identified as "religious".[19]

Many studies have shown that groups with higher intelligence or more education have significantly more atheists.[20] A meta-analysis of 39 eligible studies from 1927 to 2002, published in Mensa Magazine, concluded that atheists are more likely to be of higher intelligence than their religious counterparts.[21] According to an article in the prestigious science journal Nature in 1998, belief in a personal god or afterlife was very low among the members of the U.S. National Academy of Sciences: only 7.0% believed in a personal god, compared to more than 85% of the general U.S. population.[22] A 2012 WIN-Gallup International poll found that people with a college education were 16% less likely to describe themselves as religious than those who had not completed high school.[19] A survey conducted by the Times of India in 2015 revealed that 22% of IIT-Bombay graduates do not believe in the existence of God, while another 30% do not know.[23] According to a Harvard survey, there are more atheists and agnostics entering Harvard University, one of the top-ranked schools in America, than Catholics and Protestants; according to the same study, atheists and agnostics also make up a much higher percentage of the students than of the general public.[24][25] This may suggest that more intelligent subjects are less likely to believe in a god or supernatural powers. An alternative interpretation is that having completed the kind of education that makes you likely to do well in IQ tests is also likely to have either divested you of religiosity or at least made you less susceptible to the kind of beliefs in a personal god which characterise Christian fundamentalism. Yet another possibility is that those with more education are simply more likely to have thought seriously about religion and scrutinized the things they were brought up to believe; the higher intelligence among atheists may simply be because those who achieve high levels of education tend to be smarter than average (meaning that it's not so much that smart people are atheists as that atheists tend to be smart people). If so, then if atheism were to become mainstream, we could expect the average IQ of atheists to go down, eventually approaching the average IQ of religious people.

The Programme for International Student Assessment finds the strongest educational performance in China and Singapore, and the weakest in Peru, Colombia, Qatar and Indonesia.[26] China is noted for having an atheist majority,[27] while Singapore is noted for having a religious majority of Buddhists.[28] Peru and Colombia have an overwhelmingly religious Catholic Christian majority,[29][30] and Qatar and Indonesia have an overwhelmingly religious Islamic majority.[31][32]

Education professor Yong Zhao asserts that the reason countries with such differing religious attitudes succeed, while countries with other differing religious attitudes fail, is simply the heavy workload and intensive testing of the Confucian cultural circle, whose students make for outstanding test takers.[33]

Studies have shown that groups with more income have significantly more atheists. A 2012 WIN-Gallup International poll found that people in the highest quintile of income were 17% less likely to describe themselves as religious than the bottom quintile.[19] This is likely because those with more education tend to have higher incomes.

A recent study published in the Annals of Family Medicine suggests that, despite what some may think, religiousness does not appear to have a significant effect on how much physicians care for the underserved.[34]

The Pew Research Center (2014) reports that in the US:[35]

The Pew report also found that 57% of the "unaffiliated" were male and 43% were female.

Atheists are becoming more numerous but also more diverse. White middle-class men such as Dawkins, Harris and Hitchens no longer define the movement. One blogger argues that

Other atheists [Who?] strongly disagree and want to see the atheist movement focus on philosophical arguments against religion and pseudoscience.[36]

African American atheists are a small minority (2% of the American population) facing severe prejudice.

In most African-American communities, it is more acceptable to be a criminal who goes to church on Sunday, while selling drugs to kids all week, than to be an atheist who ... contributes to society and supports his family.

Despite this, black atheists are getting together in online groups and giving each other confidence, and some online groups progress to arranging offline meetings.[37] Atheists of color frequently feel they have different priorities from white atheist groups; they may be allied with faith groups that help poor blacks and fight racial discrimination. Atheists of color also form their own groups, focusing more on the economic and social problems their communities face, and hope general atheist groups will focus more on these issues in the future. Sikivu Hutchinson is one of many atheists of color campaigning against injustice faced by poor people, black people, LGBT people, women and other oppressed groups.[38][39]

Isn't it enough to see that a garden is beautiful without having to believe that there are fairies at the bottom of it, too?

There has been a long history of rational people who have not accepted superstitious or magical explanations of natural phenomena and who have felt that "gods" are not necessary for the working of the world. The Eastern philosophy of Buddhism is broadly atheistic, explicitly eschewing the notion of a creation myth. In the Western world, there have been atheists almost as long as there has been philosophy and writing. Some of the most famous thinkers of the ancient world have been critical of belief in deities or eschewed religion entirely - many favouring logic and rationality to inform their lives and their actions, rather than religious texts. Democritus, who originally conceived of the atom, hypothesized a world without magic holding it together. Critias, one of the Thirty Tyrants of Athens, preceded Marx when he called religion a tool to control the masses.

Perhaps the best example of an explicitly atheistic ancient philosophy is the Carvaka school of thought, which originated in India in the first millennium BCE. The Carvakas posited a materialistic universe, rejected the idea of an afterlife, and emphasized the need to enjoy this life.[43]

Modern atheism in the Western world can be traced to the Age of Enlightenment. Important thinkers of that era who were atheists include Baron d'Holbach and Denis Diderot. The Scottish philosopher David Hume, though not explicitly avowing atheism, wrote critical essays on religions and religious beliefs (his most famous being a critique of belief in miracles), and posited naturalistic explanations for the origins of religion in The Natural History of Religion as well as criticizing traditional arguments for the existence of God in Dialogues Concerning Natural Religion.

Only recently, however, did the term "atheism" begin to carry its current connotation. In an increasing number of countries around the world it is a neutral or unimportant label. New Zealand, for example, has thrice elected an agnostic woman (Helen Clark) as Prime Minister, followed by its current agnostic leader (John Key). Several UK political leaders have been atheists, including Prime Minister Clement Attlee and the former deputy PM Nick Clegg. Also, the former Prime Minister of Australia, Julia Gillard, is openly atheist, and at least one other former Australian PM was an atheist. However, in more religious areas such as the United States or Saudi Arabia the term carries a heavy stigma. Indeed, prejudice against atheists is so high in the United States that one study found them to be America's most distrusted minority.[44]

The reason for such attitudes towards atheists in these nations is unclear. Firstly, there is no stated creed with which to disagree (except perhaps for "strong" atheists, whose only shared belief is that there are no gods). Nor are atheists generally organized into lobbies or interest groups or political action committees (at least none that wield massive power), unlike the many groups that lobby on behalf of various religions. And yet an atheist would be the least likely to be elected President of the United States. According to the American Values Survey, about 67% of all voters would be uncomfortable with an atheist president, and no other group - including Mormons, African Americans, and homosexuals - would lose so much of the potential vote on the basis of a single trait.[45][46] One potential reason for this is that in the United States, Christian groups have managed to implant the notion that without religion there can be no morality - often playing to people's need for absolutes and written rules - so that absolute morality is presented as something inherently true and achievable only by believers.

The mistrust of atheism is often accompanied by snarl words, straw man arguments and various other myths and legends in order to denigrate the idea of disbelief in established gods. Some misconceptions about atheism should be addressed:

Atheism is a religion in the same way as 'off' is a television station.

One of the widest misconceptions, often used as a strong criticism, is that atheism is a religion. However, while there are secular religions, atheism is most commonly defined as "no religion." To expand the definition of "religion" to include atheism would thus destroy any use the word "religion" has in describing anything. It is quite often pointed out that calling atheism a religion is akin to stating that the act of not collecting stamps is a hobby, or that being unemployed is an occupation. Following from this, atheists do not worship Charles Darwin or any other individual. Although some think that atheism requires evolution to be a complete worldview,[49] there is no worship of anything or anyone in atheism, and acceptance of evolution isn't exclusive to atheists - for that matter, there is no necessity for an atheist to accept the evidence for evolution (Stalin is a good example: he rejected Darwinian evolution, promoting Lysenkoism instead, and consistently purged evolutionary biologists in favor of Lysenkoists). By definition, if atheists worshiped Darwin as a god, they wouldn't be atheists. Basically, "atheism" is a word for a negative. However, this leads to a few semantic issues.

This confuses the religious, because they are used to terms of religious identity being declarations of allegiance to a view rather than of separation from one. This confusion then leads them to assert that a denial of their religion must be an avowal of another. They then do things like declare the so-called New Atheists hypocrites for denigrating religion while sticking to an unstated one of their own, or declare that because science has an epistemology and religion has an epistemology, science is just another faith (ignoring that science's epistemology demonstrably works much better than religion's).

Atheism is actually a religion - indeed, much like "not collecting stamps" might be called a hobby, or "not smoking" might be called a habit.

A standard response is to note that if atheism is a religion, then "bald" is a hair color, "not kicking a kitten" is a form of animal abuse, and so on. Another is to note that if the definition of religion was expanded enough to legitimately include atheism - say, by defining a religion as "any philosophy on life" - then practically everything in the world would be a religion, such as socio-economic policies or views on equality. (British law has come close to finding this in employment discrimination cases.)

A new movement of atheist churches appears to be developing (such as Sunday Assembly), but what they do is not worship; rather, they are places where like-minded people get together on Sunday mornings to have fun, celebrate life and whatever. This is a relatively new phenomenon, and its prospects for the future are unclear.[51]

Atheists, as a whole, are not a unified group, so accusations that "atheists" are doing x, y and z hold little water. In fact, a disaffection with organized religion, and with its potential for groupthink, is what causes many believers to abandon faith and come out as atheists; it doesn't follow that such individuals would happily join another organised group. Debate within the atheistic community is robust - there are even debates about whether there is an "atheistic community" at all - and the fact that this debate exists presupposes no dogmatic mandate (or at least not a widely followed one) from an organized group. It does follow from this lack of organisation that there is no atheist equivalent of the Bible, Koran, or other holy text. There are, of course, atheist writings, but one does not need to adhere to the opinions held by, say, Friedrich Nietzsche, Richard Dawkins or Christopher Hitchens to be considered an atheist. Some atheists will actively oppose what these kinds of authors do and say. In fact, some atheists wish they could believe.[52]

Believers sometimes denigrate atheists on the grounds that they "hate God." This, however, makes no sense: one cannot hate something one does not believe exists. People who make such claims about atheists are confusing atheism with misotheism.

What I'm asking you to entertain is that there is nothing we need to believe on insufficient evidence in order to have deeply ethical and spiritual lives.

Morality is one of the larger issues facing the world, and many religions and believers openly express the notion that they have a monopoly on deciding, explaining, and enforcing moral judgments. Many religious people will assume that since morals arise from (their) god, without (their) god one cannot have morals. Contrary to the claims of such people, "no gods" does not equal "no morality." There are strong humanistic, cultural, and genetic rationales for the existence of morality and ethical behavior, and many people, not just atheists, recognize this fact.

Some atheist groups do charitable work traditionally done by religious organizations, such as funding scholarships as an alternative to faith-based scholarships,[53] and at least one atheist group volunteers to do environmental protection work.[54]

In the US, where criticism of atheism is common, it often works well for politicians and evangelists to compare atheism to the "evils" of communism, or even to Communism itself. In reality, these "evils" are not inextricably fused with the values of atheism. Although most orthodox Marxists are atheists (Marxism treats religion as a "false consciousness" that needs to be eliminated), the atrocities wrought by Stalin and others were not on account of their being atheists, but on account of their being totalitarians and authoritarians. Additionally, there have been many anti-communists who were atheists or agnostics, such as Ayn Rand and the computer pioneer John von Neumann. And in North Korea, one of the few countries where communism still nominally exists (along with China, Vietnam and Cuba), it is mandatory to believe that the Kim dynasty consists of supreme omnipotent deities.

Atheism and agnosticism are not entirely mutually exclusive, and atheists are not "actually agnostic because no one can ever know whether God exists." This is a highly contested point among religious believers and atheistic philosophers alike, as most, if not all, thinking atheists would happily change their minds given the right evidence, and thus could be considered "agnostic" in this sense. However, this conflates the ideas of belief and knowledge. Atheism is a statement of a lack of belief, not a lack of knowledge - a distinction often accepted on all sides of the theistic debate. Atheism takes the position that it is rational to think that gods don't exist, based on logic and lack of evidence. Agnostics, on the other hand, state that the lack of knowledge cannot inform their opinion at all. There are agnostic atheists, who can be either weak or strong. It is at least logically possible for a theist to be an agnostic (e.g., "I believe in a pantheon of lobsterish zoomorphic deities, but cannot prove this with evidence, and acknowledge and embrace that my belief is rooted in faith") - but it is markedly difficult to find anyone who will fess up to such a position.

Atheism is not a philosophy; it is not even a view of the world; it is simply an admission of the obvious. In fact, "atheism" is a term that should not even exist. No one ever needs to identify himself as a "non-astrologer" or a "non-alchemist." We do not have words for people who doubt that Elvis is still alive or that aliens have traversed the galaxy only to molest ranchers and their cattle. Atheism is nothing more than the noises reasonable people make in the presence of unjustified religious beliefs.

One difficulty with the term "atheism" is that it defines what its adherents do not believe in, rather than what they do believe in. The lack of positive statements of belief means that there is really no overarching organisation that speaks for atheists (some would regard this as a good thing, keeping atheism from becoming an organised religion), and it has led to the comparison that organising atheists is like "herding cats", i.e., impossible. It may be that the only thing which really unites atheists is a lack of belief in gods, in which case an overarching organisation to represent them would be practically impossible.

Primarily because of the prevalence of extreme discrimination against atheists, people have tried to come up with more positive terms or campaigns to get the godless philosophy noticed and respected. This allows atheists to feel more united and happier with their beliefs (or lack thereof), but has also led to organisations that can help them in situations, such as legal cases, where individuals couldn't manage on their own. The most prominent examples:

To date, none of these alternative descriptions seems to have taken hold a great deal, and the term of choice for most people remains "atheist." "Freethinker" is probably the term with the most support, as it dates back at least to the 19th century. "Naturalism" may be the second most popular, although the name may lead people to confuse it with naturism or with some kind of eco-hippy ideal. "Bright" is the most recently invented term, and as a result is currently the most controversial and divisive. Supporters of the Brights movement see it as a positive and constructive redefinition (on par with the re-branding of homosexuality with the word "gay", which until then primarily meant "happy" or "joyous"), while its detractors see it as nothing more than a shameless attempt to turn atheism into an organized religion, and the use of "bright" as a cynical attempt to appear more intellectual.

In some contexts words such as "rationalist" and "skeptic" may also be code words for "atheist." Although not all atheists need to be rationalists, and not all rationalists need to be atheists, the connection is more in the method a person uses to derive their beliefs rather than what their beliefs actually are.

As in the quote above, some critics of religion, among them Richard Dawkins, have pointed out that the word "atheism" enforces theism as a social norm, as modern languages usually have no established terms for people who do not believe in other supernatural phenomena ("a-fairyist" for people who do not believe in fairies, "a-unicornist", "a-alchemist", "a-astrologer", etc.).

With the existence of deities being a central belief of almost all religious systems, it is not surprising that atheism is seen as more threatening than competing belief systems, regardless of how different they may be. This often manifests in the statement that "freedom of religion" doesn't include freedom from religion. It is also important for theists that the political hierarchy, the priesthood, do their utmost to discourage dissent - as true believers make better tithe givers. Most religious codes are more than a bit irritated with those who do not believe. The Bible, for example, includes clear ad hominem attacks on non-believers - "The fool has said in his heart, 'There is no God'" (Psalm 14:1 and Psalm 53:1) - while the penalty for apostasy in Islamic law is death, and this is still endorsed today. One author has proposed a correction to Psalm 53, as follows:[57]

In the USA the increased public visibility of atheism - what some commentators call the "New Atheism", seen in the popularity of books like The God Delusion - has brought renewed energy to the debate between believers and non-believers.[58] As part of that debate, some believers have put considerable effort into trying to stop what they think of as the irresponsible promotion of atheism. Their efforts range from material with academic pretensions to arguments that are plainly abusive, focusing on "smacking" atheists with PRATT arguments about how great the Bible supposedly is - and, of course, a heavy bias towards their own religion being true.[59] What these arguments tend to have in common is that they are less about providing arguments for religious belief and more about keeping atheists quiet, with questions such as "don't you have anything better to do than talk about the God you don't believe in?" or arguments that "faith is better than reason so shut up".[60] It's not entirely unexpected that this would be the thrust of several anti-atheist arguments - after all, according to several Christians in influential positions, mere knowledge that atheism exists can be dangerous.[61]

Atheists may view the Bible and other religious works as literature, fiction, mythology, epic, philosophy, agit-prop, irrelevant, history, or various combinations thereof. Many atheists may find the book repulsively ignorant and primitive, while other atheists may find inspiration from certain passages even though they don't believe in the supernatural events and miracles mentioned in the Bible. Many atheists see religious works as interesting historical records of the myths and beliefs of humanity. By definition atheists do not believe any religious text to be divinely inspired truth: in other words, "Dude, it's just a book" (or, in fact, a somewhat random collection of different books).

There are several types of evidence to support the idea that "it's just a book." Textual analysis of the various books of the Bible reveals vastly differing writing styles among the authors of the individual books of the Old and New Testaments, suggesting that these works represent many different (human) voices, not a sole, divinely inspired voice. The existence of the Apocrypha - writings dating from the time of the Bible that were not included in the official canon by Jews or Christians, and peppered with mystical events such as encounters with angels, demons, and dragons - further suggests that "divine authorship" is not a reliable claim. Within Christianity, there are even differences among sects regarding which books are Apocrypha and which are included in the Bible, or which are included under the heading "Apocrypha", indicating that they constitute holy writings but are not meant to be taken as literally as the other books. The Book of Tobit, for example, is included in the Catholic Bible but considered Apocrypha by Protestants and is wholly absent from the Jewish Bible.

Another problem with the "divine authorship" of the Bible is the existence of texts that pre-date it but contain significant similarities to certain Biblical stories. The best-known among these is the flood story, found in numerous versions in texts from across the ancient Middle East, including the Sumerian Epic of Gilgamesh, which bears textual similarities with the Biblical account. Another such story with apparent Babylonian origin is that of the Tower of Babel. It has been suggested that some of these stories were appropriated by the Jews during the Babylonian Exile.

Studies of the history of the Bible, although not undertaken with the intent of disproving it (in fact, many Biblical historians set out to prove the Bible's veracity), shed light on the Bible's nature as a set of historical documents, ones which were written by humans and were affected by the cultural circumstances surrounding their creation. It should be noted that this type of rational discourse neither proves nor requires an atheistic worldview: one can believe that the Bible is not the infallible word of God either because one adheres to a non-Judeo-Christian religion or because one is a Christian or Jew but not a Biblical literalist. These criticisms of Biblical "truth" serve mainly to counter the arguments of fundamentalists, who are among atheism's most vociferous critics.

Atheists and the nonreligious face persecution and discrimination in many nations worldwide. In Bangladesh, Egypt, Indonesia, Kuwait, Pakistan and Jordan, atheists (and others) are denied free speech through blasphemy laws. In Afghanistan, Iran, the Maldives, Mauritania, Pakistan, Saudi Arabia and Sudan, being an atheist can carry the death penalty. In many nations citizens are forced to register as adherents of a limited range of religions, which denies atheists and adherents of alternative religions the right to free expression. Atheists can lose their right to citizenship and face restrictions on their right to marry.[62][63] In many parts of the world atheists face increasing prejudice and hate speech like that which ethnic and religious minorities suffer. Saudi Arabia introduced new laws banning atheist thought in any form; there, a Muslim expressing religious views the government disliked was falsely labeled an atheist and sentenced to seven years in prison and 600 lashes. In Egypt, young people talking about their right to state atheist ideas on television or on YouTube were detained.[64]

I don't know that atheists should be considered as citizens, nor should they be considered patriots. This is one nation under God.

Research in the American Sociological Review finds that atheists are the group Americans are least likely to feel shares their vision of society, or to want to have marry into their family.[66]

From the report's conclusions:

To be an atheist in such an environment is not to be one more religious minority among many in a strongly pluralist society. Rather, Americans construct the atheist as the symbolic representation of one who rejects the basis for moral solidarity and cultural membership in American society altogether.

A 2012 Gallup poll shows that openly atheist presidential candidates are the demographic least likely to be voted into office.[67]

In some parts of the United States, people who are openly atheist may be attacked, spat on, turned out of the family home, sent to Bible camp, or forced to feign religiosity.[68]

In the US, atheists are the least trusted and liked people out of all social groups, possibly because of their cracker-stealing banana fetishes and their superior knowledge[69] of actual religious content. They top the charts when people are asked "who would you least trust to be elected President" or "who would you least want to marry your beautiful, sweet, innocent Christian daughter."[70][71] It probably doesn't help that the U.S. is one of the most religious developed countries in the world.[72]

Many have lost jobs and been harassed out of their homes for what is essentially a lack of any belief that could act as motivation to cause harm. Chuck Norris infamously claimed that he would like to tattoo "In God We Trust" onto atheist foreheads before booting them out of Jesusland,[citation needed] possibly to work as slaves in the Mines of Mora (he claims this is a joke, but few actually laughed). More extreme fundamentalists seem to want them outright banned from existence; blogger Andrew Schlafly seriously considered banning them from his website, and George H. W. Bush declared "I don't know that atheists should be considered as citizens, nor should they be considered patriots. This is one nation under God," questioning whether anyone who disbelieves in God should even be allowed to vote (or at least be allowed to vote themselves out of persecution).[73] A creationist group has refined this way of thinking, stating that atheists and other "evolutionists" should be disenfranchised, as anyone who believes the theory of evolution is clearly mentally incompetent.[74]

Six US states have laws on the books that prohibit atheists from holding public office.[75] This despite a U.S. Supreme Court ruling - Torcaso v. Watkins (1961)[76] - that prohibits discrimination against atheist officeholders.[77] These states are:

If atheism isn't a hanging offense in these places, they probably wish it were.[citation NOT needed] (Ok, maybe not Maryland, but you get the point.)

In some European countries being an atheist is unremarkable.

France has an entirely secular culture, with a suitably large proportion of the population declaring "no religion." In Scandinavia, while the majority of the population are members of their respective national churches, irreligiosity is nevertheless widespread and being openly atheist is completely unremarkable.[78] In the UK, Tony Blair's spin-doctor Alastair Campbell famously declared that "we don't do God",[79] and Tony himself said that he kept quiet about religion because people would think he was "a nutter". The previous deputy Prime Minister was an atheist, while the Prime Minister himself has said that his Church of England faith "comes and goes". Overall, atheists in Europe aren't demonized as they are in America and other countries led by fundamentalists. Despite this, British Muslims who become atheists can face ostracism, threats and even physical abuse.[80]

Read more from the original source:
Atheism - RationalWiki


Atheism – The New York Times

Posted: at 6:20 am

Latest Articles

The position also includes humanism and secular ethics and came after a $2.2 million donation from Louis J. Appignani, a retired businessman.

By LAURIE GOODSTEIN

A friend of Christopher Hitchens writes that there is simply no truth to the rumor that he abandoned atheism at the end of his life.

A new book says the impious author of God Is Not Great might have been exploring faith before he died in 2011. Mr. Hitchens's secular friends disagree.

By MARK OPPENHEIMER

Men armed with machetes surrounded the activist, Mohammad Nazim Uddin, and slashed his head, then shot him, a police official said.

By ELLEN BARRY and MAHER SATTAR

Readers discuss whether one can express certainty about the existence of God.

How can atheists and believers stop acting like enemy combatants in a spiritual or intellectual war?

By WILLIAM IRWIN

At Redeemer Presbyterian Church on the Upper West Side, weekly sessions seek converts among a fervent and growing number of atheists in this country.

By SAMUEL G. FREEDMAN

Readers who are atheists explain their views.

Are religion and science locked in a zero-sum struggle for supremacy, or is there room for common ground?

By JAMES RYERSON

Secular voters must demand candidates who reflect their values.

By SUSAN JACOBY

The court rules in favor of those faithful to the omnipotent food clump.

By JOHN HODGMAN

Talking about Hillary Clinton and Benghazi, Ted Cruz notes that his 5-year-old daughter gets a spanking when she lies.

By MATT FLEGENHEIMER

Whitmarsh argues that atheism isn't a product of the modern age but reaches back to early Western intellectual tradition in the ancient Greek world.

By REBECCA NEWBERGER GOLDSTEIN

When a pastoral change has revolutionary implications.

Those institutions offer even atheists and spiritual seekers a language of moral discourse and training in congregational leadership.

By SAMUEL G. FREEDMAN

From the International Herald Tribune archive: The head of the Jesuits warns against a new godless society.

When Francis speaks, millions listen - whether they are Muslim or Baptist, Hindu or atheist. He is a celebrity to those who admire his warmth and a rudder to those who share his concerns.

By VIVIAN YEE

In this short documentary, a former Pentecostal preacher starts a secular church in the South.

By JASON COHN and CAMILLE SERVAN-SCHREIBER

In this short documentary, a former Pentecostal preacher starts a secular congregation in the heart of the Bible Belt.

By JASON COHN and CAMILLE SERVAN-SCHREIBER

The Freedom From Religion Foundation believes the city has shown favoritism to Catholicism and has ignored the importance of separation of church and state.

By GINIA BELLAFANTE

See the rest here:
Atheism - The New York Times


North Atlantic Treaty Organization (NATO) | Britannica.com

Posted: at 6:20 am

Alternative title: NATO

North Atlantic Treaty Organization (NATO), military alliance established by the North Atlantic Treaty (also called the Washington Treaty) of April 4, 1949, which sought to create a counterweight to Soviet armies stationed in central and eastern Europe after World War II. Its original members were Belgium, Canada, Denmark, France, Iceland, Italy, Luxembourg, the Netherlands, Norway, Portugal, the United Kingdom, and the United States. Joining the original signatories were Greece and Turkey (1952); West Germany (1955; from 1990 as Germany); Spain (1982); the Czech Republic, Hungary, and Poland (1999); Bulgaria, Estonia, Latvia, Lithuania, Romania, Slovakia, and Slovenia (2004); and Albania and Croatia (2009). France withdrew from the integrated military command of NATO in 1966 but remained a member of the organization; it resumed its position in NATO's military command in 2009.

The heart of NATO is expressed in Article 5 of the North Atlantic Treaty, in which the signatory members agree that

an armed attack against one or more of them in Europe or North America shall be considered an attack against them all; and consequently they agree that, if such an armed attack occurs, each of them, in exercise of the right of individual or collective self-defense recognized by Article 51 of the Charter of the United Nations, will assist the Party or Parties so attacked by taking forthwith, individually and in concert with the other Parties, such action as it deems necessary, including the use of armed force, to restore and maintain the security of the North Atlantic area.

NATO invoked Article 5 for the first time in 2001, after terrorist attacks organized by exiled Saudi Arabian millionaire Osama bin Laden destroyed the World Trade Center in New York City and part of the Pentagon outside Washington, D.C., killing some 3,000 people.

Article 6 defines the geographic scope of the treaty as covering an armed attack on the territory of any of the Parties in Europe or North America. Other articles commit the allies to strengthening their democratic institutions, to building their collective military capability, to consulting each other, and to remaining open to inviting other European states to join.

[Image: Alben W. Barkley at the North Atlantic Treaty signing (Encyclopædia Britannica, Inc.)]

After World War II ended in 1945, western Europe was economically exhausted and militarily weak (the western Allies had rapidly and drastically reduced their armies at the end of the war), and newly powerful communist parties had arisen in France and Italy. By contrast, the Soviet Union had emerged from the war with its armies dominating all the states of central and eastern Europe, and by 1948 communists under Moscow's sponsorship had consolidated their control of the governments of those countries and suppressed all noncommunist political activity. What became known as the Iron Curtain, a term popularized by Winston Churchill, had descended over central and eastern Europe. Further, wartime cooperation between the western Allies and the Soviets had completely broken down. Each side was organizing its own sector of occupied Germany, so that two German states would emerge, a democratic one in the west and a communist one in the east.

In 1948 the United States launched the Marshall Plan, which infused massive amounts of economic aid into the countries of western and southern Europe on the condition that they cooperate with each other and engage in joint planning to hasten their mutual recovery. As for military recovery, under the Brussels Treaty of 1948, the United Kingdom, France, and the Low Countries - Belgium, the Netherlands, and Luxembourg - concluded a collective-defense agreement called the Western European Union. It was soon recognized, however, that a more formidable alliance would be required to provide an adequate military counterweight to the Soviets.

By this time Britain, Canada, and the United States had already engaged in secret exploratory talks on security arrangements that would serve as an alternative to the United Nations (UN), which was becoming paralyzed by the rapidly emerging Cold War. In March 1948, following a virtual communist coup d'état in Czechoslovakia in February, the three governments began discussions on a multilateral collective-defense scheme that would enhance Western security and promote democratic values. These discussions were eventually joined by France, the Low Countries, and Norway and in April 1949 resulted in the North Atlantic Treaty.

Spurred by the North Korean invasion of South Korea in June 1950, the United States took steps to demonstrate that it would resist any Soviet military expansion or pressures in Europe. General Dwight D. Eisenhower, the leader of the Allied forces in western Europe in World War II, was named Supreme Allied Commander Europe (SACEUR) by the North Atlantic Council (NATO's governing body) in December 1950. He was followed as SACEUR by a succession of American generals.

The North Atlantic Council, which was established soon after the treaty came into effect, is composed of ministerial representatives of the member states, who meet at least twice a year. At other times the council, chaired by the NATO secretary-general, remains in permanent session at the ambassadorial level. Just as the position of SACEUR has always been held by an American, the secretary-generalship has always been held by a European.

NATO's military organization encompasses a complete system of commands for possible wartime use. The Military Committee, consisting of representatives of the military chiefs of staff of the member states, subsumes two strategic commands: Allied Command Operations (ACO) and Allied Command Transformation (ACT). ACO is headed by the SACEUR and located at Supreme Headquarters Allied Powers Europe (SHAPE) in Casteau, Belgium. ACT is headquartered in Norfolk, Virginia, U.S. During the alliance's first 20 years, more than $3 billion worth of infrastructure for NATO forces - bases, airfields, pipelines, communications networks, depots - was jointly planned, financed, and built, with about one-third of the funding from the United States. NATO funding generally is not used for the procurement of military equipment, which is provided by the member states - though the NATO Airborne Early Warning Force, a fleet of radar-bearing aircraft designed to protect against a surprise low-flying attack, was funded jointly.

A serious issue confronting NATO in the early and mid-1950s was the negotiation of West Germany's participation in the alliance. The prospect of a rearmed Germany was understandably greeted with widespread unease and hesitancy in western Europe, but the country's strength had long been recognized as necessary to protect western Europe from a possible Soviet invasion. Accordingly, arrangements for West Germany's safe participation in the alliance were worked out as part of the Paris Agreements of October 1954, which ended the occupation of West German territory by the western Allies and provided for both the limitation of West German armaments and the country's accession to the Brussels Treaty. In May 1955 West Germany joined NATO, which prompted the Soviet Union to form the Warsaw Pact alliance in central and eastern Europe the same year. The West Germans subsequently contributed many divisions and substantial air forces to the NATO alliance. By the time the Cold War ended, some 900,000 troops - nearly half of them from six countries (United States, United Kingdom, France, Belgium, Canada, and the Netherlands) - were stationed in West Germany.

France's relationship with NATO became strained after 1958, as President Charles de Gaulle increasingly criticized the organization's domination by the United States and the intrusion upon French sovereignty by NATO's many international staffs and activities. He argued that such integration subjected France to automatic war at the decision of foreigners. In July 1966 France formally withdrew from the military command structure of NATO and required NATO forces and headquarters to leave French soil; nevertheless, de Gaulle proclaimed continued French adherence to the North Atlantic Treaty in case of unprovoked aggression. After NATO moved its headquarters from Paris to Brussels, France maintained a liaison relationship with NATO's integrated military staffs, continued to sit in the council, and continued to maintain and deploy ground forces in West Germany, though it did so under new bilateral agreements with the West Germans rather than under NATO jurisdiction. In 2009 France rejoined the military command structure of NATO.

From its founding, NATO's primary purpose was to unify and strengthen the Western Allies' military response to a possible invasion of western Europe by the Soviet Union and its Warsaw Pact allies. In the early 1950s NATO relied partly on the threat of massive nuclear retaliation from the United States to counter the Warsaw Pact's much larger ground forces. Beginning in 1957, this policy was supplemented by the deployment of American nuclear weapons in western European bases. NATO later adopted a "flexible response" strategy, which the United States interpreted to mean that a war in Europe did not have to escalate to an all-out nuclear exchange. Under this strategy, many Allied forces were equipped with American battlefield and theatre nuclear weapons under a dual-control (or dual-key) system, which allowed both the country hosting the weapons and the United States to veto their use. Britain retained control of its strategic nuclear arsenal but brought it within NATO's planning structures; France's nuclear forces remained completely autonomous.

A conventional and nuclear stalemate between the two sides continued through the construction of the Berlin Wall in the early 1960s, détente in the 1970s, and the resurgence of Cold War tensions in the 1980s after the Soviet Union's invasion of Afghanistan in 1979 and the election of U.S. President Ronald Reagan in 1980. After 1985, however, far-reaching economic and political reforms introduced by Soviet leader Mikhail Gorbachev fundamentally altered the status quo. In July 1989 Gorbachev announced that Moscow would no longer prop up communist governments in central and eastern Europe and thereby signaled his tacit acceptance of their replacement by freely elected (and noncommunist) administrations. Moscow's abandonment of control over central and eastern Europe meant the dissipation of much of the military threat that the Warsaw Pact had formerly posed to western Europe, a fact that led some to question the need to retain NATO as a military organization - especially after the Warsaw Pact's dissolution in 1991. The reunification of Germany in October 1990 and its retention of NATO membership created both a need and an opportunity for NATO to be transformed into a more political alliance devoted to maintaining international stability in Europe.

After the Cold War, NATO was reconceived as a cooperative-security organization whose mandate was to include two main objectives: to foster dialogue and cooperation with former adversaries in the Warsaw Pact and to manage conflicts in areas on the European periphery, such as the Balkans. In keeping with the first objective, NATO established the North Atlantic Cooperation Council (1991; later replaced by the Euro-Atlantic Partnership Council) to provide a forum for the exchange of views on political and security issues, as well as the Partnership for Peace (PfP) program (1994) to enhance European security and stability through joint military training exercises with NATO and non-NATO states, including the former Soviet republics and allies. Special cooperative links were also set up with two PfP countries: Russia and Ukraine.

The second objective entailed NATO's first use of military force, when it entered the war in Bosnia and Herzegovina in 1995 by staging air strikes against Bosnian Serb positions around the capital city of Sarajevo. The subsequent Dayton Accords, which were initialed by representatives of Bosnia and Herzegovina, the Republic of Croatia, and the Federal Republic of Yugoslavia, committed each state to respecting the others' sovereignty and to settling disputes peacefully; they also laid the groundwork for stationing NATO peacekeeping troops in the region. A 60,000-strong Implementation Force (IFOR) was initially deployed, though a smaller contingent remained in Bosnia under a different name, the Stabilization Force (SFOR). In March 1999 NATO launched massive air strikes against Serbia in an attempt to force the Yugoslav government of Slobodan Milošević to accede to diplomatic provisions designed to protect the predominantly Muslim Albanian population in the province of Kosovo. Under the terms of a negotiated settlement to the fighting, NATO deployed a peacekeeping force called the Kosovo Force (KFOR).

The crisis over Kosovo and the ensuing war gave renewed impetus to efforts by the European Union (EU) to construct a new crisis-intervention force, which would make the EU less dependent on NATO and U.S. military resources for conflict management. These efforts prompted significant debates about whether enhancing the EU's defensive capabilities would strengthen or weaken NATO. Simultaneously there was much discussion of the future of NATO in the post-Cold War era. Some observers argued that the alliance should be dissolved, noting that it was created to confront an enemy that no longer existed; others called for a broad expansion of NATO membership to include Russia. Most suggested alternative roles, including peacekeeping. By the start of the second decade of the 21st century, it appeared likely that the EU would not develop capabilities competitive with those of NATO or even seek to do so; as a result, earlier worries associated with the spectre of rivalry between the two Brussels-based organizations dissipated.

During the presidency of Bill Clinton (1993–2001), the United States led an initiative to enlarge NATO membership gradually to include some of the former Soviet allies. In the concurrent debate over enlargement, supporters of the initiative argued that NATO membership was the best way to begin the long process of integrating these states into regional political and economic institutions such as the EU. Some also feared future Russian aggression and suggested that NATO membership would guarantee freedom and security for the newly democratic regimes. Opponents pointed to the enormous cost of modernizing the military forces of new members; they also argued that enlargement, which Russia would regard as a provocation, would hinder democracy in that country and enhance the influence of hard-liners. Despite these disagreements, the Czech Republic, Hungary, and Poland joined NATO in 1999; Bulgaria, Estonia, Latvia, Lithuania, Romania, Slovakia, and Slovenia were admitted in 2004; and Albania and Croatia acceded to the alliance in 2009.

Meanwhile, by the beginning of the 21st century, Russia and NATO had formed a strategic relationship. No longer considered NATO's chief enemy, Russia cemented a new cooperative bond with NATO in 2001 to address such common concerns as international terrorism, nuclear nonproliferation, and arms control. This bond subsequently frayed, however, in large part for reasons associated with Russian domestic politics.

Events following the September 11 terrorist attacks in 2001 forged a new dynamic within the alliance, one that increasingly favoured the military engagement of members outside Europe, initially with a mission against Taliban forces in Afghanistan beginning in the summer of 2003 and subsequently with air operations against the regime of Muammar al-Qaddafi in Libya in early 2011. As a result of the increased tempo of military operations undertaken by the alliance, the long-standing issue of burden sharing was revived, with some officials warning that a failure to share the costs of NATO operations more equitably would lead to the unraveling of the alliance. Most observers regarded that scenario as unlikely, however.

See the original post:
North Atlantic Treaty Organization (NATO) | Britannica.com
