Transforming Health: The divisive wash-up – InDaily


Adelaide, Tuesday August 15, 2017

SA Health has commissioned consultants to evaluate the biggest hospital system overhaul in the state's history. But one conclusion is already inescapable: Transforming Health has fractured the vital relationship between SA's doctors and the bureaucrats who employ them.

"Transforming Health came with more buzz than the release of a new Apple product," says South Australian Salaried Medical Officers Association (SASMOA) senior industrial officer Bernadette Mulholland.

More than 600 medical staff and interested parties packed the Adelaide Convention Centre in November 2014 to hear about the massive change planned for South Australia's hospital system, and to be heard.

But as major changes began to roll through the system, doctors' enthusiasm soured into suspicion.

"The trust of clinicians and community so necessary to implement such broad sweeping changes was quickly eroded as it became clear that the focus by Government and SA Health prioritised economic rationalism rather than clinical, patient and community (outcomes)," says Mulholland.

"Within a short period, clinicians questioned the motivation of the Transforming Health (program) and recognised the potential devastation of health services to their local community and adverse clinical outcomes."

"What absolutely concerned me was the damage that was caused to the relationship between the administration and medical officers."

Data provided by SA Health didn't match what some doctors believed to be happening on the ground, and when concerns about the accuracy of data were raised, many felt they were not being listened to.

Clinicians felt under pressure from administrators who now referred to clinicians providing any opposition as "naysayers" and dismissed any feedback that did not support change.

"The process undermined trust and created a divide between Government and clinicians which won't be forgotten for some years."

"Trust in the administration now lost through this process will be difficult to earn back from many clinicians."

SA Health has held regular forums to discuss Transforming Health with unions including SASMOA, the Australian Nursing and Midwifery Federation (ANMF) and the Ambulance Employees Association (AEA) throughout the process.

But, according to Mulholland, "in all the time that (SA Health CEO) Vickie Kaminski has been in that job, we've met her twice."

Asked how SA Health had allowed the relationship to deteriorate so dramatically, Kaminski told InDaily that different unions had responded to the process differently, and that many doctors have been highly supportive of Transforming Health.

"SASMOA's had a tougher time wrapping their head around it (than other unions) but I think that's because it's individuals, it's, I understand it, (doctors') livelihood, it's their place of work and you're changing that."

Transforming Health clinical ambassador Dorothy Keefe tells InDaily: "There are many members of SASMOA who are actually very supportive of what's happening."

"And I think SASMOA's been struggling a bit because of the differing views within its own membership. Of course, unhappiness makes better media than happiness."

Mulholland tells InDaily she is disappointed the administration is still "bashing" SASMOA.

"It isn't constructive. I find it unhelpful," she says.

"It maintains the relationship that we don't want."

It's clear, however, that any large-scale hospital overhaul was never going to be easy for SA Health to manage.

Late last year, the department accepted the recommendations of a scathing review into the operations of the Central Adelaide Local Health Network, which oversees the Royal Adelaide Hospital and the Queen Elizabeth Hospital, among others.

That report found medical staff were largely resistant to change, instead retaining a culture "which is rooted in a mid-20th century view of the profession, of their relationship with the organisation and of care delivery."

There was no stigma against clinicians "assuming someone else will take up the mantle of change management."

And "when effective medical leadership is absent, change is inevitably difficult, lacks traction and sustainability, and is often associated with overt displays of anger and sometimes unprofessional behaviour."

Many doctors interviewed for the review reported that theirs was "a resistant culture, a culture which rewarded and encouraged stasis rather than genuine change and a culture which had failed to come to grips with the reality of a resource constrained system."

There were, however, some clear exceptions to the change-resistant culture the report describes, characterised by the effective leadership of doctors who as a result have been able to bring others to a shared view that change is both important and desirable.

The problem for South Australias ambulance service, meanwhile, has not been the pace of change, but the lack thereof.

With major specialist services to be consolidated within the state's largest hospitals under Transforming Health, more patients would have to travel farther, often in ambulances, to receive "Best Care. First Time. Every Time." (as the Transforming Health mantra goes).

The ambulance service was to be a major beneficiary of the program.

A $16 million package was promised, with new ambulance stations, new vehicles and more paramedics to help the ambulance service cope.

However, asked to describe the major successes of Transforming Health, Ambulance Employees Association General Secretary Phil Palmer tells InDaily: "From an ambulance perspective, none at this stage."

"We don't have any extra boots on the ground yet, due to (SA Health) / Treasury refusing to release funds until it was too late."

"Recruiting should have started 18 months ago at least but did not start until early this year."

"It requires a 12-month-long internship to make a degree-qualified graduate road-ready, with authority to practice as a paramedic."

Palmer says paramedics' workload continues to climb and it is already beyond SA Ambulance's capacity to cope.

"Blown-out response times are now the norm, and there have already been two deaths that had 23-minute-plus responses to cases that should have been attended in eight minutes," he says.

"(Transforming Health) has created more need for patient transfers, but no extra resources to meet increased demand."

Premier Jay Weatherill's announcement in June that Transforming Health would come to an end with the opening of the new Royal Adelaide Hospital (early next month) and the closure of the Repat (due before the end of the year) came as a surprise to the AEA.

"We heard it in the news," says Palmer.

"We do not at all accept that the process is complete."

"There has been no improvement in patient flow through hospitals; the discharge system remains inefficient, emergency departments are more overcrowded than ever, and ramping is the worst we have ever seen in South Australia."

"The policy formerly known as Transforming Health was rebranded well before its work was done."

"The negative public reaction was a result of the failure of (SA Health) to bring their workforce, and the public, with them."

"Evidence-based change was gazumped by opinion polling."

From the beginning, nurses were expected to be among the major losers out of Transforming Health.

South Australia has the highest number of nurses per head of population in the country, a fact noted regularly in public statements by Health Minister Jack Snelling.

But Weatherill told ABC Radio Adelaide this morning that his government was proud of that fact, and a major clear-out of nurses simply hasn't come to pass in South Australia, or not yet.

ANMF SA Branch CEO Elizabeth Dabars said late last year that her union had secured a commitment from the State Government that there would be no forced redundancies of nurses as a result of Transforming Health.

Kaminski tells InDaily job numbers have been going in the opposite direction: "There's been displacement, where nurses have moved around the system, (but) I think overall we're trending up."

"We'd like to, at some point, get down to the national average, but what we're trying to do right now is the location of service, and being able to make sure we're able to have the right service, right place, right time."

Kaminski said the evaluation of Transforming Health would shed further light on the outcomes of the program.

"We have engaged people to do that evaluation for us, to be objective and third-party," she said.

"We're asking them to be frank."

This is the second in InDaily's two-part series on Transforming Health.

You can read part one here.


Zaretsky: The best ‘ism’ to explain our time – Daily Commercial

Surrealism is celebrating its 100th birthday this year. The poet Guillaume Apollinaire coined the term to describe his play Les Mamelles de Tiresias (The Teats of Tiresias), which opened in a small Parisian theater in 1917. Beginning with an actress removing her breasts, and ending early with an unscripted riot featuring a pistol-flailing audience member, the play launched a movement that long convulsed French art and politics.

The centenary arrives in a surreal news environment. Indeed, among the dozens of isms used to explain the Trump presidency, from isolationism and pluto-populism to narcissism and authoritarianism, none does a better job than surrealism in capturing the current mood.

Andre Breton, the "Pope of Surrealism," defined it as "psychic automatism in its pure state," exempt "from any moral concern." In his First Manifesto of Surrealism, Breton railed against rationalism and "the reign of logic." Clarity and coherence lost bigly to the tumult of unconscious desires, while civility and courtesy were for bourgeois losers. Upping the ante in his Second Manifesto, he claimed the simplest Surrealist act consists of "dashing down into the street, pistol in hand, and firing blindly, as fast as you can pull the trigger, into the crowd."

Unarmed Surrealists were content to brandish their ids. What was once the stuff of repression was now ripe for expression. Everything that welled up into the conscious mind flowed across paper and canvas. The true Surrealist turns his mind into a receptacle, refusing to favor one group of words over another. Instead, it is up to the miraculous equivalent to intervene.

Or not. As a sober reader finds, most Surrealist literature is unreadable. The precursor to Surrealism, the Romanian Tristan Tzara, famously composed poems by cutting words from a newspaper, tossing them into a bag, pulling them out and reciting them one by one. The result, Tzara declared, "will resemble you." (Perhaps that's true if you happen to be crashed on your kitchen floor, sleeping off an all-night bender.) As for Breton, he favored automatic writing, by becoming a recording machine for his unconscious. The final product, he beamed, "shines by its extreme degree of immediate absurdity."

Trumpian word salads bear the surrealist seal of absurdity. In Exquisite Corpse, a Surrealist exercise aimed at unleashing the unconscious, you write a word on a piece of paper, pass it to your neighbor, who jots a second word without looking at the first word, and so on. This led to sentences like "The exquisite / corpse / shall drink / the new / wine." Trump's gift of free association ("His one problem is he didn't go to Russia that night because he had extracurricular activities, and they froze to death") allows him to play a solitaire variation of the game.

A French translator recently marveled that Trump "seems to have thematic clouds in his head that he would pick from with no need of a logical thread to link them." This is true not just of his speech, but also of his governing strategy.

Igniting a reaction similar to those following Marcel Duchamp entering a urinal at an art show, Trump has exhibited his Surrealist aesthetic in bureaucratic Washington. But he subverts ready-made expectations instead of ready-made objects. With a Surrealist flair for showmanship worthy of Salvador Dali, he randomly pairs titles and individuals. Thus, his son-in-law, a New York real estate developer, plays Middle East envoy one day, opioid crisis czar the next. Trump's claim that if Jared Kushner cannot bring peace to the Middle East, no one can expresses the Surrealist conviction that where reason and strategy have failed, unreason and whim will prevail.

The same aesthetic lies behind, or rather below, the Wall. Its failure to make economic, strategic or diplomatic sense is not beside the point; it is the point. Its raison d'être is to shock the political establishment and to give shape to what, until now, had been the repressed desires of Trump's base. Think of it not as a real security measure, but as a virtual sculpture that will allow its audience to touch, and not just talk about, their phobias. Like a Surrealist object, the Wall is a shape-shifter (opaque or transparent, continuous or discontinuous, topped with barbed wire or solar panels) and expresses the Surrealist values of excess and extravagance, aggression and transgression.

In the end, Trumpism, like Surrealism, seeks to force reality to conform to individual desires, no matter how illicit, illegal or simply outrageous. This might work aesthetically, even financially (just ask Dali, whose name Breton turned into the anagram "Avida Dollars") and, it seems, politically. But, one can hope, only in the short term.

Eventually, Surrealism's revolt against the reality-based community ended with a whimper, with its art relegated to post-dinner games and dorm room posters. One day, perhaps, politicians will look back on Trumpism in the same dismissive way.

Robert Zaretsky teaches at the University of Houston and is finishing a book on Catherine the Great and the French Enlightenment. He wrote this for the Los Angeles Times.


How America Lost Its Mind – The Atlantic

You are entitled to your own opinion, but you are not entitled to your own facts.

Daniel Patrick Moynihan

We risk being the first people in history to have been able to make their illusions so vivid, so persuasive, so realistic that they can live in them.

Daniel J. Boorstin, The Image: A Guide to Pseudo-Events in America (1961)

When did America become untethered from reality?

I first noticed our national lurch toward fantasy in 2004, after President George W. Bush's political mastermind, Karl Rove, came up with the remarkable phrase "reality-based community." People in "the reality-based community," he told a reporter, "believe that solutions emerge from your judicious study of discernible reality... That's not the way the world really works anymore." A year later, The Colbert Report went on the air. In the first few minutes of the first episode, Stephen Colbert, playing his right-wing-populist commentator character, performed a feature called The Word. His first selection: "truthiness." "Now, I'm sure some of the word police, the wordinistas over at Webster's, are gonna say, 'Hey, that's not a word!' Well, anybody who knows me knows that I'm no fan of dictionaries or reference books. They're elitist. Constantly telling us what is or isn't true. Or what did or didn't happen. Who's Britannica to tell me the Panama Canal was finished in 1914? If I wanna say it happened in 1941, that's my right. I don't trust books. They're all fact, no heart... Face it, folks, we are a divided nation... divided between those who think with their head and those who know with their heart... Because that's where the truth comes from, ladies and gentlemen: the gut."

Whoa, yes, I thought: exactly. America had changed since I was young, when "truthiness" and "reality-based community" wouldn't have made any sense as jokes. For all the fun, and all the many salutary effects of the 1960s, the main decade of my childhood, I saw that those years had also been the big-bang moment for truthiness. And if the '60s amounted to a national nervous breakdown, we are probably mistaken to consider ourselves over it.


Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can't prove and superstitions that make no sense. Some of my best friends are very religious, and others believe in dubious conspiracy theories. What's problematic is going overboard: letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts. The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. From the start, our ultra-individualism was attached to epic dreams, sometimes epic fantasies: every American one of God's chosen people building a custom-made utopia, all of us free to reinvent ourselves by imagination and will. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts. Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanations, small and large fantasies that console or thrill or terrify us. And most of us haven't realized how far-reaching our strange new normal has become.

Much more than the other billion or so people in the developed world, we Americans believe, really believe, in the supernatural and the miraculous, in Satan on Earth, in reports of recent trips to and from heaven, and in a story of life's instantaneous creation several thousand years ago.

We believe that the government and its co-conspirators are hiding all sorts of monstrous and shocking truths from us, concerning assassinations, extraterrestrials, the genesis of AIDS, the 9/11 attacks, the dangers of vaccines, and so much more.

And this was all true before we became familiar with the terms "post-factual" and "post-truth," before we elected a president with an astoundingly open mind about conspiracy theories, what's true and what's false, the nature of reality.

We have passed through the looking glass and down the rabbit hole. America has mutated into Fantasyland.

How widespread is this promiscuous devotion to the untrue? How many Americans now inhabit alternate realities? Any given survey of beliefs is only a sketch of what people in general really think. But reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half. Only a third of us, for instance, don't believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that angels and demons are active in the world. More than half say they're absolutely certain heaven exists, and just as many are sure of the existence of a personal God: not a vague force or universal spirit or higher power, but some guy. A third of us believe not only that global warming is no big deal but that it's a hoax perpetrated by scientists, the government, and journalists. A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth. Almost a quarter believe that vaccines cause autism, and that Donald Trump won the popular vote in 2016. A quarter believe that our previous president maybe or definitely was (or is?) the anti-Christ. According to a survey by Public Policy Polling, 15 percent believe that the media or the government adds secret mind-controlling technology to television broadcast signals, and another 15 percent think that's possible. A quarter of Americans believe in witches. Remarkably, the same fraction, or maybe less, believes that the Bible consists mainly of legends and fables, the same proportion that believes U.S. officials were complicit in the 9/11 attacks.

When I say that a third believe X and a quarter believe Y, it's important to understand that those are different thirds and quarters of the population. Of course, various fantasy constituencies overlap and feed one another; for instance, belief in extraterrestrial visitation and abduction can lead to belief in vast government cover-ups, which can lead to belief in still more wide-ranging plots and cabals, which can jibe with a belief in an impending Armageddon.

Why are we like this?

The short answer is because we're Americans; because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else's, experts be damned. Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

The word "mainstream" has recently become a pejorative, shorthand for bias, lies, oppression by the elites. Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd (media, academia, government, corporate America, professional associations, respectable opinion in the aggregate) have enabled and encouraged every species of fantasy over the past few decades.

A senior physician at one of America's most prestigious university hospitals promotes miracle cures on his daily TV show. Cable channels air documentaries treating mermaids, monsters, ghosts, and angels as real. When a political-science professor attacks the idea "that there is some public that shares a notion of reality, a concept of reason, and a set of criteria by which claims to reason and rationality are judged," colleagues just nod and grant tenure. The old fringes have been folded into the new center. The irrational has become respectable and often unstoppable.

Our whole social environment and each of its overlapping parts (cultural, religious, political, intellectual, psychological) have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.

American moxie has always come in two types. We have our wilder, faster, looser side: We're overexcited gamblers with a weakness for stories too good to be true. But we also have the virtues embodied by the Puritans and their secular descendants: steadiness, hard work, frugality, sobriety, and common sense. A propensity to dream impossible dreams is like other powerful tendencies: okay when kept in check. For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.

The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the '60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it's all relative.

The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web's 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and "facts" to support them. Before the internet, crackpots were mostly isolated, and surely had a harder time remaining convinced of their alternate realities. Now their devoutly believed opinions are all over the airwaves and the web, just like actual news. Now all of the fantasies look real.

Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.

Treating real life as fantasy and vice versa, and taking preposterous ideas seriously, is not unique to Americans. But we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people. This is American exceptionalism in the 21st century. The country has always been a one-of-a-kind place. But our singularity is different now. We're still rich and free, still more influential and powerful than any other nation, practically a synonym for "developed country." But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.

People see our shocking Trump moment, this post-truth, "alternative facts" moment, as some inexplicable and crazy new American phenomenon. But what's happening is just the ultimate extrapolation and expression of mind-sets that have made America exceptional for its entire history.

America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful, but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem's hunting witches to Joseph Smith's creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump. In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes '60s and the internet age. The result is the America we inhabit today, with reality and fantasy weirdly and dangerously blurred and commingled.

I don't regret or disapprove of many of the ways the '60s permanently reordered American society and culture. It's just that along with the familiar benefits, there have been unreckoned costs.

In 1962, people started referring to hippies, the Beatles had their first hit, Ken Kesey published One Flew Over the Cuckoo's Nest, and the Harvard psychology lecturer Timothy Leary was handing out psilocybin and LSD to grad students. And three hours south of San Francisco, on the heavenly stretch of coastal cliffs known as Big Sur, a pair of young Stanford psychology graduates founded a school and think tank they named after a small American Indian tribe that had lived on the grounds long before. In 1968, one of its founding figures recalled four decades later,

This is not overstatement. Essentially everything that became known as New Age was invented, developed, or popularized at the Esalen Institute. Esalen is a mother church of a new American religion for people who think they don't like churches or religions but who still want to believe in the supernatural. The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking (also: massage, hot baths, sex, and sex in hot baths). It was a headquarters for a new religion of no religion, and for "science" containing next to no science. The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever: the more exotic and wondrous and unfalsifiable, the better.

Not long before Esalen was founded, one of its co-founders, Dick Price, had suffered a mental breakdown and been involuntarily committed to a private psychiatric hospital for a year. His new institute embraced the radical notion that psychosis and other mental illnesses were labels imposed by the straight world on eccentrics and visionaries, that they were primarily tools of coercion and control. This was the big idea behind One Flew Over the Cuckoo's Nest, of course. And within the psychiatric profession itself this idea had two influential proponents, who each published unorthodox manifestos at the beginning of the decade: R. D. Laing (The Divided Self) and Thomas Szasz (The Myth of Mental Illness). Madness, Laing wrote when Esalen was new, "is potentially liberation and renewal." Esalen's founders were big Laing fans, and the institute became a hotbed for the idea that insanity was just an alternative way of perceiving reality.

These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people. Mental illness, both Szasz and Laing said, is a theory, not a fact. This is now the universal bottom-line argument for anyone (from creationists to climate-change deniers to anti-vaccine hysterics) who prefers to disregard science in favor of his own beliefs.

You know how young people always think the universe revolves around them, as if they're the only ones who really get it? And how before their frontal lobes, the neural seat of reason and rationality, are fully wired, they can be especially prone to fantasy? In the '60s, the universe cooperated: It did seem to revolve around young people, affirming their adolescent self-regard, making their fantasies of importance feel real and their fantasies of instant transformation and revolution feel plausible. Practically overnight, America turned its full attention to the young and everything they believed and imagined and wished.

If 1962 was when the decade really got going, 1969 was the year the new doctrines and their gravity were definitively cataloged by the grown-ups. Reason and rationality were over. The countercultural effusions were freaking out the old guard, including religious people who couldn't quite see that yet another Great Awakening was under way in America, heaving up a new religion of believers who "have no option but to follow the road until they reach the Holy City... that lies beyond the technocracy... the New Jerusalem." That line is from The Making of a Counter Culture: Reflections on the Technocratic Society and Its Youthful Opposition, published three weeks after Woodstock, in the summer of 1969. Its author was Theodore Roszak, age 35, a Bay Area professor who thereby coined the word "counterculture." Roszak spends 270 pages glorying in the younger generation's brave rejection of expertise and "all that our culture values as 'reason' and 'reality.'" (Note the scare quotes.) So-called experts, after all, are "on the payroll of the state and/or corporate structure." A chapter called "The Myth of Objective Consciousness" argues that science is really just a state religion. To create a new culture in which "the non-intellective capacities" become "the arbiters of the good [and] the true," he writes, "nothing less is required than the subversion of the scientific world view, with its entrenched commitment to an egocentric and cerebral mode of consciousness." He welcomes the radical rejection of science and technological values.

Earlier that summer, a University of Chicago sociologist (and Catholic priest) named Andrew Greeley had alerted readers of The New York Times Magazine that beyond the familiar signifiers of youthful rebellion (long hair, sex, drugs, music, protests), the truly shocking change on campuses was the rise of anti-rationalism and a return of the sacred: mysticism and magic, the occult, séances, cults based on the book of Revelation. When he'd chalked a statistical table on a classroom blackboard, one of his students had reacted with horror: "Mr. Greeley, I think you're an empiricist."

As 1969 turned to 1970, a 41-year-old Yale Law School professor was finishing his book about the new youth counterculture. Charles Reich was a former Supreme Court clerk now tenured at one of ultra-rationalism's American headquarters. But hanging with the young people had led him to a midlife epiphany and apostasy. In 1966, he had started teaching an undergraduate seminar called The Individual in America, for which he assigned fiction by Kesey and Norman Mailer. He decided to spend the next summer, the Summer of Love, in Berkeley. On the road back to New Haven, he had his Pauline conversion to the kids' values. His class at Yale became hugely popular; at its peak, 600 students were enrolled. In 1970, The Greening of America became The New York Times' best-selling book (as well as a much-read 70-page New Yorker excerpt), and remained on the list for most of a year.

At 16, I bought and read one of the 2 million copies sold. Rereading it today and recalling how much I loved it was a stark reminder of the follies of youth. Reich was shamelessly, uncritically swooning for kids like me. The Greening of America may have been the mainstream's single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers: There are three types of American consciousness, each of which makes up an individual's perception of reality, his "head," his way of life. Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new Corporate State: essentially, your grandparents. Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State: your parents.

And then there was Consciousness III, which had "made its first appearance among the youth of America," was "spreading rapidly among wider and wider segments of youth, and by degrees to older people." If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.

Reich praises the "gaiety and humor" of the new Consciousness III wardrobe, but his book is absolutely humorless, because it's a response to "this moment of utmost sterility, darkest night and most extreme peril." Conspiracism was flourishing, and Reich bought in. Now that the Corporate State has added depersonalization and repression to its other injustices, it has "threatened to destroy all meaning and suck all joy from life." Reich's magical thinking mainly concerned how the revolution would turn out. The American Corporate State, having produced this new generation of longhaired hyperindividualists who insist on trusting their gut and finding their own truth, "is now accomplishing what no revolutionaries could accomplish by themselves. The machine has begun to destroy itself." Once everyone wears Levi's and gets high, the old ways will simply be swept away in the flood.

The inevitable/imminent happy-cataclysm part of the dream didn't happen, of course. The machine did not destroy itself. But Reich was half-right. An epochal change in American thinking was under way "and not, as far as anybody knows, reversible... There is no returning to an earlier consciousness." His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified. Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami. Reich's faith was the converse of the Enlightenment rationalists' hopeful fallacy 200 years earlier. Granted complete freedom of thought, Thomas Jefferson and company assumed, most people would follow the path of reason. Wasn't it pretty to think so.

I remember when fantastical beliefs went fully mainstream, in the 1970s. My irreligious mother bought and read The Secret Life of Plants, a big best seller arguing that plants were sentient and would be "the bridesmaids at a marriage of physics and metaphysics." The amazing truth about plants, the book claimed, had been suppressed by the FDA and agribusiness. My mom didn't believe in the conspiracy, but she did start talking to her ficuses as if they were pets. In a review, The New York Times registered the book as another data point in how "the incredible is losing its pariah status." Indeed, mainstream publishers and media organizations were falling over themselves to promote and sell fantasies as nonfiction. In 1975 came a sensational autobiography by the young spoon bender and mind reader Uri Geller, as well as Life After Life, by Raymond Moody, a philosophy Ph.D. who presented the anecdotes of several dozen people who'd nearly died as evidence of an afterlife. The book sold many millions of copies; before long the International Association for Near Death Studies formed and held its first conference, at Yale.

During the '60s, large swaths of academia made a turn away from reason and rationalism as they'd been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large: All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people's needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent. The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.

These ideas percolated across multiple academic fields. In 1965, the French philosopher Michel Foucault published Madness and Civilization in America, echoing Laing's skepticism of the concept of mental illness; by the 1970s, he was arguing that rationality itself is a coercive "regime of truth": oppression by other means. Foucault's suspicion of reason became deeply and widely embedded in American academia.

Meanwhile, over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. Not only were sanity and insanity and scientific truth somewhat dubious concoctions by elites, Peter Berger and Thomas Luckmann explained; so was everything else. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone's perceptions, defining reality itself. To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the extreme step of modern science. Reality? Knowledge? "If we were going to be meticulous," Berger and Luckmann wrote, "we would put quotation marks around the two aforementioned terms every time we used them." "What is real to a Tibetan monk may not be real to an American businessman."

When I first read that, at age 18, I loved the quotation marks. If reality is simply the result of rules written by the powers that be, then isn't everyone able, no, isn't everyone obliged, to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.

A more extreme academic evangelist for the idea of all truths being equal was a UC Berkeley philosophy professor named Paul Feyerabend. His best-known book, published in 1975, was Against Method: Outline of an Anarchistic Theory of Knowledge. Rationalism, it declared, "is a secularized form of the belief in the power of the word of God," and science a "particular superstition." In a later edition of the book, published when creationists were passing laws to teach Genesis in public-school biology classes, Feyerabend came out in favor of the practice, comparing creationists to Galileo. Science, he insisted, is just another form of belief. "Only one principle," he wrote, "can be defended under all circumstances and in all stages of human development. It is the principle: anything goes."

Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely: don't judge, don't disbelieve, don't point your professorial finger. This was understandable, given the times: colonialism ending, genocide of American Indians confessed, U.S. wars in the developing world. Who were we to roll our eyes or deny what these people believed? In the '60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science. If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.

In 1968, a UC Davis psychologist named Charles Tart conducted an experiment in which, he wrote, a young woman who frequently had spontaneous out-of-body experiences (didn't claim to have them but had them) spent four nights sleeping in a lab, hooked up to an EEG machine. Her assigned task was to send her mind or soul out of her body while she was asleep and read a five-digit number Tart had written on a piece of paper placed on a shelf above the bed. He reported that she succeeded. Other scientists considered the experiments and the results bogus, but Tart proceeded to devote his academic career to proving that attempts at objectivity are a sham and magic is real. In an extraordinary paper published in 1972 in Science, he complained about the scientific establishment's almost total rejection of the knowledge gained while high or tripping. He didn't just want science to take seriously experiences of "ecstasy, mystical union, other dimensions, rapture, beauty, space-and-time transcendence." He was explicitly dedicated to going there. A "perfectly scientific theory may be based on data that have no physical existence," he insisted. The rules of the scientific method had to be revised. To work as a psychologist in the new era, Tart argued, a researcher should be in the altered state of consciousness he's studying, high or delusional "at the time of data collection" or during "data reduction and theorizing." Tart's new mode of research, he admitted, posed problems of "consensual validation," given that "only observers in the same [altered state] are able to communicate adequately with one another." Tart popularized the term "consensus reality" for what you or I would simply call reality, and around 1970 that became a permanent interdisciplinary term of art in academia. Later he abandoned the pretense of neutrality and started calling it the "consensus trance"; people committed to reason and rationality were the deluded dupes, not he and his tribe.

Even the social critic Paul Goodman, beloved by young leftists in the '60s, was flabbergasted by his own students by 1969. "There was no knowledge," he wrote, "only the sociology of knowledge. They had so well learned that research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth."

Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else. Conservatives hated how relativism undercut various venerable and comfortable ruling ideas: certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty. Yet once the intellectual mainstream thoroughly accepted that there are many equally valid realities and truths, once the idea of gates and gatekeeping was discredited not just on campuses but throughout the culture, all American barbarians could have their claims taken seriously. Conservatives are correct that the anything-goes relativism of college campuses wasn't sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right: gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more. The term "useful idiot" was originally deployed to accuse liberals of serving the interests of true believers further on the left. In this instance, however, postmodern intellectuals (post-positivists, poststructuralists, social constructivists, post-empiricists, epistemic relativists, cognitive relativists, descriptive relativists) turned out to be useful idiots most consequentially for the American right. "Reality has a well-known liberal bias," Stephen Colbert once said, in character, mocking the beliefs-trump-facts impulse of today's right. Neither side has noticed, but large factions of the elite left and the populist right have been on the same team.

As the Vietnam War escalated and careened, antirationalism flowered. In his book about the remarkable protests in Washington, D.C., in the fall of 1967, The Armies of the Night, Norman Mailer describes chants ("Out, demons, out! Back to darkness, ye servants of Satan!") and a circle of hundreds of protesters intending "to form a ring of exorcism sufficiently powerful to raise the Pentagon three hundred feet." They were hoping the building would turn orange and vibrate until "all evil emissions had fled" this levitation. At that point the war in Vietnam would end.

By the end of the '60s, plenty of zealots on the left were engaged in extreme magical thinking. They hadn't started the decade that way. In 1962, Students for a Democratic Society adopted its founding document, drafted by 22-year-old Tom Hayden. The manifesto is sweet and reasonable: decrying inequality and poverty and "the pervasiveness of racism in American life," seeing the potential benefits as well as the downsides of industrial automation, declaring the group "in basic opposition to the communist system."

Then, kaboom, the big bang. Anything and everything became believable. Reason was chucked. Dystopian and utopian fantasies seemed plausible. In 1969, the SDS's most apocalyptic and charismatic faction, calling itself Weatherman, split off and got all the attention. Its members believed that they and other young white Americans, aligned with black insurgents, would be the vanguard in a new civil war. They issued statements about "the need for armed struggle as the only road to revolution" and how "dope is one of our weapons... Guns and grass are united in the youth underground." And then factions of the new left went to work making and setting off thousands of bombs in the early 1970s.

Left-wingers weren't the only ones who became unhinged. Officials at the FBI, the CIA, and military intelligence agencies, as well as in urban police departments, convinced themselves that peaceful antiwar protesters and campus lefties in general were dangerous militants, and expanded secret programs to spy on, infiltrate, and besmirch their organizations. Which thereby validated the preexisting paranoia on the new left and encouraged its wing nuts' revolutionary delusions. In the '70s, the CIA and Army intelligence set up their infamous Project Star Gate to see whether they could conduct espionage by means of ESP.

The far right had its own glorious '60s moment, in the form of the new John Birch Society, whose founders believed that both Republican and Democratic presidential Cabinets included "conscious, deliberate, dedicated agent[s] of the Soviet conspiracy" determined to create "a world-wide police state, absolutely and brutally governed from the Kremlin," as the society's founder, Robert Welch, put it in a letter to friends.

This furiously, elaborately suspicious way of understanding the world started spreading across the political spectrum after the assassination of John F. Kennedy in 1963. Dallas couldn't have been the work of just one nutty loser with a mail-order rifle, could it have? Surely the Communists or the CIA or the Birchers or the Mafia or some conspiratorial combination must have arranged it all, right? The shift in thinking didn't register immediately. In his influential book The Paranoid Style in American Politics, published two years after the president's murder, Richard Hofstadter devoted only two sentences and a footnote to it, observing that conspiratorial explanations of Kennedy's assassination don't have much currency in the United States.

Elaborate paranoia was an established tic of the Bircherite far right, but the left needed a little time to catch up. In 1964, a left-wing American writer published the first book about a JFK conspiracy, claiming that a Texas oilman had been the mastermind, and soon many books were arguing that the official government inquiry had ignored the hidden conspiracies. One of them, Rush to Judgment, by Mark Lane, a lawyer on the left, was a New York Times best seller for six months. Then, in 1967, New Orleans's district attorney, Jim Garrison, indicted a local businessman for being part of a conspiracy of gay right-wingers to assassinate Kennedy: "a Nazi operation, whose sponsors include some of the oil-rich millionaires in Texas," according to Garrison, with the CIA, FBI, and Robert F. Kennedy complicit in the cover-up. After NBC News broadcast an investigation discrediting the theory, Garrison said the TV segment was "a piece of thought control," obviously commissioned by NBC's parent company RCA, one of the top 10 defense contractors and thus desperate "because we are in the process of uncovering their hoax."

The notion of an immense and awful JFK-assassination conspiracy became conventional wisdom in America. As a result, more Americans than ever became reflexive conspiracy theorists. Thomas Pynchon's novel Gravity's Rainbow, a complicated global fantasy about the interconnections among militarists and Illuminati and stoners, and the validity of paranoid thinking, won the 1974 National Book Award. Conspiracy became the high-end Hollywood dramatic premise: Chinatown, The Conversation, The Parallax View, and Three Days of the Condor came out in the same two-year period. Of course, real life made such stories plausible. The infiltration by the FBI and intelligence agencies of left-wing groups was then being revealed, and the Watergate break-in and its cover-up were an actual criminal conspiracy. Within a few decades, the belief that a web of villainous elites was covertly seeking to impose a malevolent global regime made its way from the lunatic right to the mainstream. Delusional conspiracism wouldn't spread quite as widely or as deeply on the left, but more and more people on both sides would come to believe that an extraordinarily powerful cabal (international organizations and think tanks and big businesses and politicians) secretly ran America.

Each camp, conspiracists on the right and on the left, was ostensibly the enemy of the other, but they began operating as de facto allies. Relativist professors enabled science-denying Christians, and the antipsychiatry craze in the '60s appealed simultaneously to left-wingers and libertarians (as well as to Scientologists). Conspiracy theories were more of a modern right-wing habit before people on the left signed on. However, the belief that the federal government had secret plans to open detention camps for dissidents sprouted in the '70s on the paranoid left before it became a fixture on the right.

Americans felt newly entitled to believe absolutely anything. I'm pretty certain that the unprecedented surge of UFO reports in the '70s was not evidence of extraterrestrials' increasing presence but a symptom of Americans' credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did. What made the UFO mania historically significant rather than just amusing, however, was the web of elaborate stories that were now being spun: not just of sightings but of landings and abductions, and of government cover-ups and secret alliances with interplanetary beings. Those earnest beliefs planted more seeds for the extravagant American conspiracy thinking that by the turn of the century would be rampant and seriously toxic.

A single idée fixe like this often appears in both frightened and hopeful versions. That was true of the suddenly booming belief in alien visitors, which tended toward the sanguine as the '60s turned into the '70s, even in fictional depictions. Consider the extraterrestrials that Jack Nicholson's character in Easy Rider earnestly describes as he's getting high for the first time, and those at the center of Close Encounters of the Third Kind eight years later. One evening in southern Georgia in 1969, the year Easy Rider came out, a failed gubernatorial candidate named Jimmy Carter saw a moving moon-size white light in the sky that didn't have any solid substance to it and got closer and closer, stopped, turned blue, then red and back to white, and then zoomed away.

The first big nonfiction abduction tale appeared around the same time, in a best-selling book about a married couple in New Hampshire who believed that while driving their Chevy sedan late one night, they saw a bright object in the sky that the wife, a UFO buff already, figured might be a spacecraft. She began having nightmares about being abducted by aliens, and both of them underwent hypnosis. The details of the abducting aliens and their spacecraft that each described were different, and changed over time. The man's hypnotized description of the aliens bore an uncanny resemblance to the ones in an episode of The Outer Limits broadcast on ABC just before his hypnosis session. Thereafter, hypnosis became the standard way for people who believed that they had been abducted (or that they had past lives, or that they were the victims of satanic abuse) to recall the supposed experience. And the couple's story established the standard abduction-tale format: Humanoid creatures take you aboard a spacecraft, communicate telepathically or in spoken English, medically examine you by inserting long needles into you, then let you go.

The husband and wife were undoubtedly sincere believers. The sincerely credulous are perfect suckers, and in the late '60s, a convicted thief and embezzler named Erich von Däniken published Chariots of the Gods?, positing that extraterrestrials helped build the Egyptian pyramids, Stonehenge, and the giant stone heads on Easter Island. That book and its many sequels sold tens of millions of copies, and the documentary based on it had a huge box-office take in 1970. Americans were ready to believe von Däniken's fantasy to a degree they simply wouldn't have been a decade earlier, before the '60s sea change. Certainly a decade earlier NBC wouldn't have aired an hour-long version of the documentary in prime time. And while I'm at it: Until we'd passed through the '60s and half of the '70s, I'm pretty sure we wouldn't have given the presidency to some dude, especially a born-again Christian, who said he'd recently seen a huge, color-shifting, luminescent UFO hovering near him.

By the 1980s, things appeared to have returned more or less to normal. Civil rights seemed like a done deal, the war in Vietnam was over, young people were no longer telling grown-ups they were worthless because they were grown-ups. Revolution did not loom. Sex and drugs and rock and roll were regular parts of life. Starting in the '80s, loving America and making money and having a family were no longer unfashionable.

The sense of cultural and political upheaval and chaos dissipated, which lulled us into ignoring all the ways that everything had changed, that Fantasyland was now scaling and spreading and becoming the new normal. What had seemed strange and amazing in 1967 or 1972 became normal and ubiquitous.

Extreme religious and quasi-religious beliefs and practices, Christian and New Age and otherwise, didn't subside, but grew and thrived, and came to seem unexceptional.

Relativism became entrenched in academia; tenured, you could say. Michel Foucault's rival Jean Baudrillard became a celebrity among American intellectuals by declaring that rationalism was a tool of oppressors that no longer worked as a way of understanding the world, pointless and doomed. In other words, as he wrote in 1986, the secret of theory (this whole intellectual realm now called itself simply "theory") is that "truth does not exist."

This kind of thinking was by no means limited to the ivory tower. The intellectuals' new outlook was as much a product as a cause of the smog of subjectivity that now hung thick over the whole American mindscape. After the '60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.

Belief in gigantic secret conspiracies thrived, ranging from the highly improbable to the impossible, and moved from the crackpot periphery to the mainstream.

Many Americans announced that they'd experienced fantastic horrors and adventures, abuse by Satanists, and abduction by extraterrestrials, and their claims began to be taken seriously. Parts of the establishment (psychology and psychiatry, academia, religion, law enforcement) encouraged people to believe that all sorts of imaginary traumas were real.

America didn't seem as weird and crazy as it had around 1970. But that's because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of "amusing ourselves to death."

The Reagan presidency was famously a triumph of truthiness and entertainment, and in the 1990s, as problematically batty beliefs kept going mainstream, presidential politics continued merging with the fantasy-industrial complex.

In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment. American politics happened on television; it was a TV series, a reality show just before TV became glutted with reality shows. A titillating new story line that goosed the ratings of an existing series was an established scripted-TV gimmick. The audience had started getting bored with The Clinton Administration, but the Monica Lewinsky subplot got people interested again.

Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley Jr.'s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh's national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared.

For most of the 20th century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions. With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.

Limbaugh's virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying on an occasional magazine or newsletter to confirm your gnarly view of the world, now you had talk radio drilling it into your head for hours every day. As Limbaugh's show took off, in 1992 the producer Roger Ailes created a syndicated TV show around him. Four years later, when NBC hired someone else to launch a cable news channel, Ailes, who had been working at NBC, quit and created one with Rupert Murdoch.

Fox News brought the Limbaughvian talk-radio version of the world to national TV, offering viewers an unending and immersive propaganda experience of a kind that had never existed before.

For Americans, this was a new condition. Over the course of the century, electronic mass media had come to serve an important democratic function: presenting Americans with a single shared set of facts. Now TV and radio were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America's earlier centuries.

And there was also the internet, which eventually would have mooted the Fairness Doctrine anyhow. In 1994, the first modern spam message was sent, visible to everyone on Usenet: "global alert for all: jesus is coming soon." Over the next year or two, the masses learned of the World Wide Web. The tinder had been gathered and stacked since the '60s, and now the match was lit and thrown. After the '60s and '70s happened as they happened, the internet may have broken America's dynamic balance between rational thinking and magical thinking for good.

Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland (every screwball with a computer and an internet connection) suddenly had an unprecedented way to instruct and rile up and mobilize believers, and to recruit more. False beliefs were rendered both more real-seeming and more contagious, creating a kind of fantasy cascade in which millions of bedoozled Americans surfed and swam.

Why did Senator Daniel Patrick Moynihan begin remarking frequently during the '80s and '90s that people were entitled to their own opinions but not to their own facts? Because until then, that had not been necessary to say. Our marketplace of ideas became exponentially bigger and freer than ever, it's true. Thomas Jefferson said that he'd rather be exposed to "the inconveniences attending too much liberty" than "those attending too small a degree of it," because in the new United States, reason is left free to combat every sort of error of opinion. However, I think if he and our other Enlightenment forefathers returned, they would see the present state of affairs as too much of a good thing. Reason remains free to combat unreason, but the internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside of the internet seems at least as profound as the upside.

The way internet search was designed to operate in the '90s (that is, the way information and beliefs now flow, rise, and fall) is democratic in the extreme. Internet search algorithms are an example of Gresham's law, whereby the bad drives out, or at least overruns, the good. On the internet, the prominence granted to any factual assertion or belief or theory depends on the preferences of billions of individual searchers. Each click on a link is effectively a vote pushing that version of the truth toward the top of the pile of results.
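
To make that click-as-vote mechanism concrete, here is a toy sketch in Python. It is not any search engine's actual ranking code, and the URLs and click counts are invented for illustration; it shows only the bare logic of popularity-driven ranking, in which nothing weighs accuracy.

    # Toy model of click-driven ranking: each click is a vote, and
    # results are ordered by vote count alone.
    from collections import Counter

    clicks = Counter()  # hypothetical click log: URL -> number of clicks

    def record_click(url):
        clicks[url] += 1  # every click pushes that page up the pile

    def ranked_results(candidates):
        # Sort purely by popularity: a heavily clicked falsehood
        # outranks a rarely clicked correction.
        return sorted(candidates, key=lambda url: clicks[url], reverse=True)

    # Invented illustration: the exciting falsehood attracts more clicks.
    for _ in range(900):
        record_click("chemtrails-proof.example")
    for _ in range(100):
        record_click("contrails-explained.example")

    print(ranked_results(["contrails-explained.example", "chemtrails-proof.example"]))
    # -> ['chemtrails-proof.example', 'contrails-explained.example']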

Exciting falsehoods tend to do well in the perpetual referenda, and become self-validating. A search for almost any alternative theory or belief seems to generate more links to true believers' pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results. For instance, beginning in the '90s, conspiracists decided that contrails, the skinny clouds of water vapor that form around jet-engine exhaust, were composed of exotic chemicals, part of a secret government scheme to test weapons or poison citizens or mitigate climate change, and renamed them "chemtrails." When I Googled "chemtrails proof," the first seven results offered so-called evidence of the nonexistent conspiracy. When I searched for "government extraterrestrial cover-up," only one result in the first three pages didn't link to an article endorsing a conspiracy theory.

Before the web, it really wasn't easy to stumble across false or crazy information convincingly passing itself off as true. Today, however, as the Syracuse University professor Michael Barkun saw back in 2003 in A Culture of Conspiracy, such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another but rather mingle:

The consequence of such mingling is that an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.

Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, "Individuals' explicit religious and paranormal beliefs are the best predictors of their perception of purpose in life events," their tendency "to view the world in terms of agency, purpose, and design." Americans have believed for centuries that the country was inspired and guided by an omniscient, omnipotent planner and interventionist manager. Since the '60s, that exceptional religiosity has fed the tendency to believe in conspiracies. In a recent paper called "Conspiracy Theories and the Paranoid Style(s) of Mass Opinion," based on years of survey research, two University of Chicago political scientists, J. Eric Oliver and Thomas J. Wood, confirmed this special American connection. The likelihood of supporting conspiracy theories is strongly predicted, they found, by "a propensity to attribute the source of unexplained or extraordinary events to unseen, intentional forces" and a weakness for "melodramatic narratives as explanations for prominent events, particularly those that interpret history relative to universal struggles between good and evil." Oliver and Wood found the single strongest driver of conspiracy belief to be belief in end-times prophecies.

As a 13-year-old, I watched William F. Buckley Jr.'s Firing Line with my conservative dad, attended Teen Age Republicans summer camp, and, at the behest of a Nixon-campaign advance man in Omaha, ripped down Rockefeller and Reagan signs during the 1968 Nebraska primary campaign. A few years later, I was a McGovern-campaign volunteer, but I still watched and admired Buckley on PBS. Over the years, I've voted for a few Republicans for state and local office. Today I disagree about political issues with friends and relatives to my right, but we agree on the essential contours of reality.

People on the left are by no means all scrupulously reasonable. Many give themselves over to the appealingly dubious and the untrue. But fantastical politics have become highly asymmetrical. Starting in the 1990s, America's unhinged right became much larger and more influential than its unhinged left. There is no real left-wing equivalent of Sean Hannity, let alone Alex Jones. Moreover, the far right now has unprecedented political power; it controls much of the U.S. government.

Why did the grown-ups and designated drivers on the political left manage to remain basically in charge of their followers, while the reality-based right lost out to fantasy-prone true believers?

One reason, I think, is religion. The GOP is now quite explicitly Christian. The party is the American coalition of white Christians, papering over doctrinal and class differences, and now led, weirdly, by one of the least religious presidents ever. If more and more of a political party's members hold more and more extreme and extravagantly supernatural beliefs, doesn't it make sense that the party will be more and more open to make-believe in its politics?

I doubt the GOP elite deliberately engineered the synergies between the economic and religious sides of their contemporary coalition. But as the incomes of middle- and working-class people flatlined, Republicans pooh-poohed rising economic inequality and insecurity. Economic insecurity correlates with greater religiosity, and among white Americans, greater religiosity correlates with voting Republican. For Republican politicians and their rich-getting-richer donors, that's a virtuous circle, not a vicious one.

Religion aside, America simply has many more fervid conspiracists on the right, as research about belief in particular conspiracies confirms again and again. Only the American right has had a large and organized faction based on paranoid conspiracism for the past six decades. As the pioneer vehicle, the John Birch Society zoomed along and then sputtered out, but its fantastical paradigm and belligerent temperament have endured in other forms and under other brand names. When Barry Goldwater was the right-wing Republican presidential nominee in 1964, he had to play down any streaks of Bircher madness, but by 1979, in his memoir With No Apologies, he felt free to rave on about the globalist conspiracy and its pursuit of a "new world order" and impending period of slavery; the Council on Foreign Relations' secret agenda for "one-world rule"; and the Trilateral Commission's plan for "seizing control of the political government of the United States." The right has had three generations to steep in this, its taboo vapors wafting more and more into the main chambers of conservatism, becoming familiar, seeming less outlandish. Do you believe that a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government? Yes, say 34 percent of Republican voters, according to Public Policy Polling.

In the late 1960s and '70s, the reality-based left more or less won: retreat from Vietnam, civil-rights and environmental-protection laws, increasing legal and cultural equality for women, legal abortion, Keynesian economics triumphant.

But then the right wanted its turn to win. It pretty much accepted racial and gender equality and had to live with social welfare and regulation and bigger government, but it insisted on slowing things down. The political center moved right, but in the '70s and '80s not yet unreasonably. Most of America decided that we were all free marketeers now, that business wasn't necessarily bad, and that government couldn't solve all problems. We still seemed to be in the midst of the normal cyclical seesawing of American politics. In the '90s, the right achieved two of its wildest dreams: The Soviet Union and international communism collapsed; and, as violent crime radically declined, law and order was restored.

But also starting in the '90s, the farthest-right quarter of Americans, let's say, couldn't and wouldn't adjust their beliefs to comport with their side's victories and the dramatically new and improved realities. They'd made a god out of Reagan, but they ignored or didn't register that he was practical and reasonable, that he didn't completely buy his own antigovernment rhetoric. After Reagan, his hopped-up true-believer faction began insisting on total victory. But in a democracy, of course, total victory by any faction is a dangerous fantasy.

Another way the GOP got loopy was by overdoing libertarianism. I have some libertarian tendencies, but at full-strength purity it's an ideology most boys grow out of. On the American right since the '80s, however, they have not. Republicans are very selective, cherry-picking libertarians: Let business do whatever it wants and don't spoil poor people with government handouts; let individuals have gun arsenals but not abortions or recreational drugs or marriage with whomever they wish; and don't mention Ayn Rand's atheism. Libertarianism, remember, is an ideology whose most widely read and influential texts are explicitly fiction. "I grew up reading Ayn Rand," Speaker of the House Paul Ryan has said, "and it taught me quite a bit about who I am and what my value systems are, and what my beliefs are." It was that fiction that allowed him and so many other higher-IQ Americans to see modern America as a dystopia in which selfishness is righteous and they are the last heroes. "I think a lot of people," Ryan said in 2009, "would observe that we are right now living in an Ayn Rand novel." I'm assuming he meant Atlas Shrugged, the novel that Trump's secretary of state (and former CEO of ExxonMobil) has said is his favorite book. It's the story of a heroic cabal of men's-men industrialists who cause the U.S. government to collapse so they can take over, start again, and make everything right.

For a while, Republican leaders effectively encouraged and exploited the predispositions of their variously fantastical and extreme partisans. Karl Rove was stone-cold cynical, the Wizard of Oz's evil twin coming out from behind the curtain for a candid chat shortly before he won a second term for George W. Bush, about how "judicious study of discernible reality [is] not the way the world really works anymore." These leaders were rational people who understood that a large fraction of citizens don't bother with rationality when they vote, that a lot of voters resent the judicious study of discernible reality. Keeping those people angry and frightened won them elections.

But over the past few decades, a lot of the rabble they roused came to believe all the untruths. "The problem is that Republicans have purposefully torn down the validating institutions," the political journalist Josh Barro, a Republican until 2016, wrote last year. "They have convinced voters that the media cannot be trusted; they have gotten them used to ignoring inconvenient facts about policy; and they have abolished standards of discourse." The party's ideological center of gravity swerved way to the right of Rove and all the Bushes, finally knocking them and their clubmates aside. What had been the party's fantastical fringe became its middle. Reasonable Republicanism was replaced by absolutism: no new taxes, virtually no regulation, abolish the EPA and the IRS and the Federal Reserve.

When I was growing up in Nebraska, my Republican parents loathed all Kennedys, distrusted unions, and complained about confiscatory federal income-tax rates of 91 percent. But conservatism to them also meant conserving the natural environment and allowing people to make their own choices, including about abortion. They were emphatically reasonable, disinclined to believe in secret Communist/Washington/elite plots to destroy America, rolling their eyes and shaking their heads about far-right acquaintances, such as our neighbors, the parents of the future Mrs. Clarence Thomas, who considered Richard Nixon suspiciously leftish. My parents never belonged to a church. They were godless Midwestern Republicans, born and raised, which wasn't so odd 40 years ago. Until about 1980, "the Christian right" was not a phrase in American politics. In 2000, my widowed mom, having voted for 14 Republican presidential nominees in a row, quit a party that had become too Christian for her.

Read more from the original source:

How America Lost Its Mind - The Atlantic

The philosopher who poisoned German theology – Catholic Herald Online (blog)

Portrait by Jakob Schlesinger, Berlin 1831

Modern German Catholic thought is influenced by a heretical view of God's nature

Otto von Bismarck, the 19th-century Chancellor of Germany, tried and failed to bring the Catholic Church to heel. He would have been delighted to see its state today. With pews emptying at a great rate, and few priestly vocations, the fact that the Church remains one of the largest employers could only prove that it had become the servant of the state that he hoped it would be. Yet perhaps Bismarck might want to know: how have others achieved what I failed to bring about? At least part of the answer comes from within the German Church.

Theologically, Germany has been ground zero for centuries: just think of Albert the Great mentoring St Thomas Aquinas, or the Jesuit-led Counter-Reformation which answered Luther's schismatic dissent. But German theology has never quite recovered from its greatest challenge: Enlightenment rationalism and the attempts to overcome it through Hegelian dialectic. Even today, Hegel's influence dominates German theology.

The Hegelian view of God's involvement in the unfolding of history as Geist (Spirit) is at root a Christian heresy, reminiscent of the spiritualism of the 12th-century theologian Joachim de Fiore. For the Hegelian, God suffers with, and changes, precisely through the sin and suffering of his creatures, dialectically pouring out his love and mercy through the progress of history.

Citing a Lutheran hymn, "God Himself is Dead," Hegel argues that God unites death to his nature. And so when we encounter suffering and death, we taste the particularities of the eternal divine history. As he puts it, suffering is "a moment in the nature of God himself; it has taken place in God himself." For Hegel, suffering is an aspect of God's eternal nature. Our sin and suffering are necessary for God to be God.

This heretical view has had widespread influence in modern Catholic and Protestant accounts of God's nature. It's often given a pastoral veneer of the God who weeps with us. Yet, tragically unaware of his error, the Hegelian homilist preaches a God who cannot save: a God who is so eternally bound to our tears he cannot truly wipe them away.

Many 20th-century German theologians followed in Hegel's footsteps. A basic principle was Hegel's dialectical process itself as revelatory, which is to say they smuggled into their ideas on doctrinal development the notion that God was continuing to reveal himself in history, as though there was always something becoming in God, and thus, in the Church. Hegel's spiritual forerunner Joachim de Fiore had predicted a third age of the Holy Spirit which would sing a new Church into being, and it's striking how many German theologians have been entranced by the idea of a future Church very different to the holy and apostolic one of the past.

This is not to say Hegel is the answer to Bismarck's hypothetical question. There is a great difference between the Left Hegelian Ludwig Feuerbach's idea of religion as projection of inner spirit and the theologies of Karl Rahner or Walter Kasper. But there is nevertheless something deeply Hegelian about making the unfolding of human experience in history a standard for theological development to which God or the Church, always in mercy, must conform. Unfortunately, this is a terrible standard for change which leads not only to false reform, but to apostasy and desolation.

The standard for development, as the 19th-century German theologian Matthias Scheeben understood as well as Cardinal Newman did, must be divinely revealed truths, the deposit of faith passed from Christ to his apostles. Spiritual renewal in Germany can only begin if German bishops, priests, and laity alike recognize that change and development must be ordered to eternal truths, not to the needs of state, the Geist of culture, or the historical unfolding of inner human experience. The Church conforms not to the needs of nations, but to the fullness of Truth revealed by God Incarnate in Jesus Christ.

C C Pecknold is associate professor of theology at The Catholic University of America

This article first appeared in the August 11 2017 issue of the Catholic Herald.

Originally posted here:

The philosopher who poisoned German theology - Catholic Herald Online (blog)

Hegelianism – Wikipedia

Hegelianism is the philosophy of G. W. F. Hegel which can be summed up by the dictum that "the rational alone is real",[1] which means that all reality is capable of being expressed in rational categories. His goal was to reduce reality to a more synthetic unity within the system of absolute idealism.

Hegel's method in philosophy consists of the triadic development (Entwicklung) in each concept and each thing. Thus, he hopes, philosophy will not contradict experience, but will give the data of experience a philosophical, that is, an ultimately true, explanation. If, for instance, we wish to know what liberty is, we take that concept where we first find it: the unrestrained action of the savage, who does not feel the need of repressing any thought, feeling, or tendency to act.

Next, we find that the savage has given up this freedom in exchange for its opposite, the restraint, or, as he considers it, the tyranny, of civilization and law. Finally, in the citizen under the rule of law, we find the third stage of development, namely liberty in a higher and a fuller sense than how the savage possessed it: the liberty to do, say, and think many things beyond the power of the savage.

In this triadic process, the second stage is the direct opposite, the annihilation, or at least the sublation, of the first. The third stage is the first returned to itself in a higher, truer, richer, and fuller form. The three stages are, therefore, styled: the stage of being-in-itself; the stage of being-out-of-itself; and the stage of being-in-and-for-itself.

These three stages are found succeeding one another throughout the whole realm of thought and being, from the most abstract logical process up to the most complicated concrete activity of organized mind in the succession of states or the production of systems of philosophy.

In logic (which, according to Hegel, is really metaphysic) we have to deal with the process of development applied to reality in its most abstract form. According to Hegel, in logic, we deal in concepts robbed of their empirical content: in logic we are discussing the process in vacuo, so to speak. Thus, at the very beginning of Hegel's study of reality, he finds the logical concept of being.

Now, being is not a static concept according to Hegel, as Aristotle supposed it was. It is essentially dynamic, because it tends by its very nature to pass over into nothing, and then to return to itself in the higher concept, becoming. For Aristotle, there was nothing more certain than that being equaled being, or, in other words, that being is identical with itself, that everything is what it is. Hegel does not deny this; but, he adds, it is equally certain that being tends to become its opposite, nothing, and that both are united in the concept becoming. For instance, the truth about this table, for Aristotle, is that it is a table. (This is not necessarily true. Aristotle made a distinction between things made by art and things made by nature. Things made by art, such as a table, follow this description of thinghood. Living things, however, are self-generating and constantly creating their own being. Being in the sense of a living thing is highly dynamic and is defined by the thing creating its own being. He describes life not in terms of being but coming-into-being. For instance, a baby's goal is to become old. It is neither absolutely young nor absolutely old, but somewhere in the process of being young and becoming old. It sounds like Hegel made the comparison between being and not being, while Aristotle made the comparison between art and nature.)

For Hegel, the equally important truth is that it was a tree, and it "will be" ashes. The whole truth, for Hegel, is that the tree became a table and will become ashes. Thus, becoming, not being, is the highest expression of reality. It is also the highest expression of thought because then only do we attain the fullest knowledge of a thing when we know what it was, what it is, and what it will be, in a word, when we know the history of its development.

In the same way as "being" and "nothing" develop into the higher concept becoming, so, farther on in the scale of development, life and mind appear as the third terms of the process and in turn are developed into higher forms of themselves. (It is interesting here to note that Aristotle saw "being" as superior to "becoming", because anything which is still becoming something else is imperfect. Hence, God, for Aristotle, is perfect because He never changes, but is eternally complete.) But one cannot help asking: what is it that develops or is developed?

Its name, Hegel answers, is different in each stage. In the lowest form it is "being", higher up it is "life", and in still higher form it is "mind". The only thing always present is the process (das Werden). We may, however, call the process by the name of "spirit" (Geist) or "idea" (Begriff). We may even call it God, because at least in the third term of every triadic development the process is God.

The first and most wide-reaching consideration of the process of spirit, God, or the idea, reveals to us the truth that the idea must be studied (1) in itself; this is the subject of logic or metaphysics; (2) out of itself, in nature; this is the subject of the philosophy of nature; and (3) in and for itself, as mind; this is the subject of the philosophy of mind (Geistesphilosophie).

Passing over the rather abstract considerations by which Hegel shows in his Logik the process of the idea-in-itself through being to becoming, and finally through essence to notion, we take up the study of the development of the idea at the point where it enters into otherness in nature. In nature the idea has lost itself, because it has lost its unity and is splintered, as it were, into a thousand fragments. But the loss of unity is only apparent, because in reality the idea has merely concealed its unity.

Studied philosophically, nature reveals itself as so many successful attempts of the idea to emerge from the state of otherness and present itself to us as a better, fuller, richer idea, namely, spirit, or mind. Mind is, therefore, the goal of nature. It is also the truth of nature. For whatever is in nature is realized in a higher form in the mind which emerges from nature.

The philosophy of mind begins with the consideration of the individual, or subjective, mind. It is soon perceived, however, that individual, or subjective, mind is only the first stage, the in-itself stage, of mind. The next stage is objective mind, or mind objectified in law, morality, and the State. This is mind in the condition of out-of-itself.

There follows the condition of absolute mind, the state in which mind rises above all the limitations of nature and institutions, and is subject to itself alone in art, religion, and philosophy. For the essence of mind is freedom, and its development must consist in breaking away from the restrictions imposed on it in its otherness by nature and human institutions.

Hegel's philosophy of the State, his theory of history, and his account of absolute mind are perhaps the most often read portions of his philosophy due to their accessibility. The State, he says, is mind objectified. The individual mind, which, on account of its passions, its prejudices, and its blind impulses, is only partly free, subjects itself to the yoke of necessity, the opposite of freedom, in order to attain a fuller realization of itself in the freedom of the citizen.

This yoke of necessity is first met with in the recognition of the rights of others, next in morality, and finally in social morality, of which the primal institution is the family. Aggregates of families form civil society, which, however, is but an imperfect form of organization compared with the State. The State is the perfect social embodiment of the idea, and stands in this stage of development for God Himself.

The State, studied in itself, furnishes for our consideration constitutional law. In relation to other States it develops international law; and in its general course through historical vicissitudes it passes through what Hegel calls the "Dialectics of History".

Hegel teaches that the constitution is the collective spirit of the nation and that the government and the written constitution is the embodiment of that spirit. Each nation has its own individual spirit, and the greatest of crimes is the act by which the tyrant or the conqueror stifles the spirit of a nation.

War, Hegel suggests, can never be ruled out, as one can never know when or if one will occur; an example is the Napoleonic overrunning of Europe and putting down of Royalist systems. War represents a crisis in the development of the idea which is embodied in the different States, and out of this crisis usually the State which holds the more advanced spirit wins out, though it may also suffer a loss, lick its wounds, yet still win in the spiritual sense, as happened, for example, when the northerners sacked Rome: its forms of legality and religion all "won" out in spite of the losses on the battlefield.

A peaceful revolution is also possible according to Hegel when the changes required to solve the crisis are ascertained by thoughtful insight and when this insight spreads throughout the body politic:

If a people [Volk] can no longer accept as implicitly true what its constitution expresses to it as the truth, if its consciousness or Notion and its actuality are not at one, then the people's spirit is torn asunder. Two things may then occur. First, the people may either by a supreme internal effort dash into fragments this law which still claims authority, or it may more quietly and slowly effect changes on the yet operative law, which is, however, no longer true morality, but which the mind has already passed beyond. In the second place, a people's intelligence and strength may not suffice for this, and it may hold to the lower law; or it may happen that another nation has reached its higher constitution, thereby rising in the scale, and the first gives up its nationality and becomes subject to the other. Therefore it is of essential importance to know what the true constitution is; for what is in opposition to it has no stability, no truth, and passes away. It has a temporary existence, but cannot hold its ground; it has been accepted, but cannot secure permanent acceptance; that it must be cast aside, lies in the very nature of the constitution. This insight can be reached through Philosophy alone. Revolutions take place in a state without the slightest violence when the insight becomes universal; institutions, somehow or other, crumble and disappear, each man agrees to give up his right. A government must, however, recognize that the time for this has come; should it, on the contrary, knowing not the truth, cling to temporary institutions, taking what though recognized is unessential, to be a bulwark guarding it from the essential (and the essential is what is contained in the Idea), that government will fall, along with its institutions, before the force of mind. The breaking up of its government breaks up the nation itself; a new government arises, or it may be that the government and the unessential retain the upper hand.[2]

The "ground" of historical development is, therefore, rational; since the State, if it is not in contradiction, is the embodiment of reason as spirit. Many, at first considered to be, contingent events of history can become, in reality or in necessity, stages in the logical unfolding of the sovereign reason which gets embodied in an advanced State. Such a "necessary contingency" when expressed in passions, impulse, interest, character, personality, get used by the "cunning of reason", which, in retrospect, was to its own purpose.

We are, therefore, to understand historical happenings as the stern, reluctant working of reason towards the full realization of itself in perfect freedom. Consequently, we must interpret history in rational terms, and throw the succession of events into logical categories; this interpretation is, for Hegel, a mere inference from actual history.

Thus, the widest view of history reveals three most important stages of development: Oriental imperial (the stage of oneness, of suppression of freedom), Greek social democracy (the stage of expansion, in which freedom was lost in unstable demagogy), and Christian constitutional monarchy (which represents the reintegration of freedom in constitutional government).

Even in the State, mind is limited by subjection to other minds. There remains the final step in the process of the acquisition of freedom, namely, that by which absolute mind in art, religion, and philosophy subjects itself to itself alone. In art, mind has the intuitive contemplation of itself as realized in the art material, and the development of the arts has been conditioned by the ever-increasing "docility" with which the art material lends itself to the actualization of mind or the idea.

In religion, mind feels the superiority of itself to the particularizing limitations of finite things. Here, as in the philosophy of history, there are three great moments: Oriental religion, which exaggerated the idea of the infinite; Greek religion, which gave undue importance to the finite; and Christianity, which represents the union of the infinite and the finite. Last of all, absolute mind, as philosophy, transcends the limitations imposed on it even in religious feeling, and, discarding representative intuition, attains all truth under the form of reason.

Whatever truth there is in art and in religion is contained in philosophy, in a higher form, and free from all limitations. Philosophy is, therefore, "the highest, freest and wisest phase of the union of subjective and objective mind, and the ultimate goal of all development."

The far-reaching influence of Hegel is due in a measure to the undoubted vastness of the scheme of philosophical synthesis which he conceived and partly realized. A philosophy which undertook to organize under the single formula of triadic development every department of knowledge, from abstract logic up to the philosophy of history, has a great deal of attractiveness to those who are metaphysically inclined. But Hegel's influence is due in a still larger measure to two extrinsic circumstances.

His philosophy is the highest expression of that spirit of collectivism which characterized the nineteenth century. In theology especially Hegel revolutionized the methods of inquiry. The application of his notion of development to Biblical criticism and to historical investigation is obvious to anyone who compares the spirit and purpose of contemporary theology with the spirit and purpose of the theological literature of the first half of the nineteenth century.

In science, too, and in literature, the substitution of the category of becoming for the category of being is a very patent fact, and is due to the influence of Hegel's method. In political economy and political science the effect of Hegel's collectivistic conception of the State supplanted to a large extent the individualistic conception which was handed down from the eighteenth century to the nineteenth century.

Hegel's philosophy became known outside Germany from the 1820s onwards, and Hegelian schools developed in northern Europe, Italy, France, Eastern Europe, America and Britain.[3] These schools are collectively known as post-Hegelian philosophy, post-Hegelian idealism or simply post-Hegelianism.[4]

Hegel's immediate followers in Germany are generally divided into the "Right Hegelians" and the "Left Hegelians" (the latter also referred to as the "Young Hegelians").

The Rightists developed his philosophy along lines which they considered to be in accordance with Christian theology. They included Karl Friedrich Göschel, Johann Philipp Gabler, Johann Karl Friedrich Rosenkranz, and Johann Eduard Erdmann.

The Leftists accentuated the anti-Christian tendencies of Hegel's system and developed schools of materialism, socialism, rationalism, and pantheism. They included Ludwig Feuerbach, Karl Marx, Bruno Bauer, and David Strauss. Max Stirner socialized with the left Hegelians but built his own philosophical system largely opposing that of these thinkers.

In Britain, Hegelianism was represented during the nineteenth century by, and largely overlapped with, the British Idealist school of James Hutchison Stirling, Thomas Hill Green, William Wallace, John Caird, Edward Caird, Richard Lewis Nettleship, F.H. Bradley, and J. M. E. McTaggart.

In Denmark, Hegelianism was represented by Johan Ludvig Heiberg and Hans Lassen Martensen from the 1820s to the 1850s.

In mid-19th century Italy, Hegelianism was represented by Bertrando Spaventa.

Hegelianism in North America was represented by Friedrich August Rauch, Thomas Watson and William T. Harris, as well as the St. Louis Hegelians. In its most recent form it seems to take its inspiration from Thomas Hill Green, and whatever influence it exerts is opposed to the prevalent pragmatic tendency.

In Poland, Hegelianism was represented by Karol Libelt, August Cieszkowski and Józef Kremer.

Benedetto Croce and Étienne Vacherot were the leading Hegelians towards the end of the nineteenth century in Italy and France, respectively. Among Catholic philosophers who were influenced by Hegel the most prominent were Georg Hermes and Anton Günther.

Hegelianism also inspired Giovanni Gentile's philosophy of actual idealism and Fascism, the concept that people are motivated by ideas and that social change is brought about by leaders.

Hegelianism spread to Imperial Russia through St. Petersburg in the 1840s, and was, as other intellectual waves were, considered an absolute truth amongst the intelligentsia, until the arrival of Darwinism in the 1860s.[5]

Excerpt from:

Hegelianism - Wikipedia

The best ism to explain our time: Surrealism, which turns 100 this year – Los Angeles Times

Surrealism is celebrating its 100th birthday this year. The poet Guillaume Apollinaire coined the term to describe his play Les Mamelles de Tirésias (The Teats of Tiresias), which opened in a small Parisian theater in 1917. Beginning with an actress removing her breasts, and ending early with an unscripted riot featuring a pistol-flailing audience member, the play launched a movement that long convulsed French art and politics.

The centenary arrives in a surreal news environment. Indeed, among the dozens of isms used to explain the Trump presidency (from isolationism and pluto-populism to narcissism and authoritarianism), none does a better job than surrealism in capturing the current mood.

André Breton, the Pope of Surrealism, defined it as "psychic automatism in its pure state," exempt from "any moral concern." In his First Manifesto of Surrealism, Breton railed against rationalism and "the reign of logic." Clarity and coherence lost bigly to the tumult of unconscious desires, while civility and courtesy were for bourgeois losers. Upping the ante in his Second Manifesto, he claimed the simplest Surrealist act consists of "dashing down into the street, pistol in hand, and firing blindly, as fast as you can pull the trigger, into the crowd."

Unarmed Surrealists were content to brandish their ids. What was once the stuff of repression was now ripe for expression. Everything that welled up into the conscious mind flowed across paper and canvas. The true Surrealist turns his mind into a receptacle, refusing to favor one group of words over another. Instead, it is up to the miraculous equivalent to intervene.

Or not. As a sober reader finds, most Surrealist literature is unreadable. The precursor to Surrealism, the Romanian Tristan Tzara, famously composed poems by cutting words from a newspaper, tossing them into a bag, pulling them out and reciting them one by one. The result, Tzara declared, "will resemble you." (Perhaps that's true if you happen to be crashed on your kitchen floor, sleeping off an all-night bender.) As for Breton, he favored automatic writing by becoming a recording machine for his unconscious. The final product, he beamed, shines by "its extreme degree of immediate absurdity."
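
Tzara's recipe, read literally, is a tiny algorithm: cut, shuffle, recite. Here is a minimal sketch of it in Python (the function name is invented, and the sample line simply reuses the Exquisite Corpse sentence quoted below):

    import random

    def cut_up_poem(article_text):
        words = article_text.split()  # scissors: cut the article into words
        random.shuffle(words)         # the bag: discard the original order
        return " ".join(words)        # pull the words out and recite them

    print(cut_up_poem("the exquisite corpse shall drink the new wine"))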

Trumpian word salads bear the surrealist seal of absurdity. In Exquisite Corpse, a Surrealist exercise aimed at unleashing the unconscious, you write a word on a piece of paper, pass it to your neighbor, who jots a second word without looking at the first word, and so on. This led to sentences like "The exquisite / corpse / shall drink / the new / wine." Trump's gift of free association ("His one problem is he didn't go to Russia that night because he had extracurricular activities, and they froze to death") allows him to play a solitaire variation of the game.

A French translator recently marveled that Trump "seems to have thematic clouds in his head that he would pick from with no need of a logical thread to link them." This is true not just of his speech, but also of his governing strategy.

Igniting a reaction similar to those that followed Marcel Duchamp's entering a urinal into an art show, Trump has exhibited his Surrealist aesthetic in bureaucratic Washington. But he subverts ready-made expectations instead of ready-made objects. With a Surrealist flair for showmanship worthy of Salvador Dalí, he randomly pairs titles and individuals. Thus, his son-in-law, a New York real estate developer, plays Middle East envoy one day, opioid crisis czar the next. Trump's claim that if Jared Kushner cannot bring peace to the Middle East, no one can expresses the Surrealist conviction that where reason and strategy have failed, unreason and whim will prevail.

The same aesthetic lies behind, or, rather, below, the Wall. Its failure to make economic, strategic or diplomatic sense is not beside the point; it is the point. Its raison d'être is to shock the political establishment and to give shape to what, until now, had been the repressed desires of Trump's base. Think of it not as a real security measure, but as a virtual sculpture that will allow its audience to touch, and not just talk about, their phobias. Like a Surrealist object, the Wall is a shape-shifter (opaque or transparent, continuous or discontinuous, topped with barbed wire or solar panels) and expresses the Surrealist values of excess and extravagance, aggression and transgression.

In the end, Trumpism, like Surrealism, seeks to force reality to conform to individual desires, no matter how illicit, illegal or simply outrageous. This might work aesthetically, even financially (just ask Dalí, whose name Breton turned into the anagram "Avida Dollars") and, it seems, politically. But, one can hope, only in the short term.

Eventually, Surrealism's revolt against the reality-based community ended with a whimper, with its art relegated to post-dinner games and dorm room posters. One day, perhaps, politicians will look back on Trumpism in the same dismissive way.

Robert Zaretsky teaches at the University of Houston and is finishing a book on Catherine the Great and the French Enlightenment.

More:

The best ism to explain our time: Surrealism, which turns 100 this year - Los Angeles Times

How Silicon Valley’s Workplace Culture Produced James Damore’s Google Memo – The New Yorker

Last week, a software engineer at Google, James Damore, posted a ten-page memo, titled "Google's Ideological Echo Chamber," to an internal company network. Citing a range of psychological studies, Wikipedia entries, and media articles on our "culture of shaming and misrepresentation," Damore argued that women are underrepresented in the tech industry largely because of their innate biological differences from men: their stronger interest in people rather than things, their propensity for neuroticism, their higher levels of anxiety. Damore criticized the company's diversity initiatives, which focus on recruitment, hiring, and professional development, as discriminatory, and advanced concrete suggestions for improving them: "de-moralize diversity," "de-emphasize empathy," "stop alienating conservatives," and be open about the science of human nature. On Monday, Google's C.E.O., Sundar Pichai, sent a note to his employees decrying the memo's "harmful gender stereotypes" and noting that portions of it violated the company's code of conduct. Damore was fired, and promptly filed a charge with the National Labor Relations Board.

As soon as news of the memo broke, tech workers took to the Internet. (Ours is a privileged moment: never before has it been so easy to gain access to the errant musings, rapid-fire opinions, and random proclivities of venture capitalists and others we enrich.) There were calls for Damore to be blacklisted from the industry; nuanced analyses of the memo's underlying assumptions and ripple effects; facile analyses of the same; message-board debates about sexual harassment, affirmative action, evolutionary biology, eugenics, and wrongthink; and disagreements about the appropriateness of Google's response. ("Firing people for their ideas should be opposed," Jeet Heer, a self-described Twitter Essayist and an editor at The New Republic, tweeted.) George Orwell's 1984 was trotted out, discursively, and quickly retired. More than a handful of people pointed out that the field of programming was created, and once dominated, by women. Eric Weinstein, the managing director of Thiel Capital, an investment firm helmed by Peter Thiel, tweeted disapprovingly at Google's corporate account, "Stop teaching my girl that her path to financial freedom lies not in coding but in complaining to HR."

Though Damore's memo draws on familiar political rhetoric, its style and structure are unique products of Silicon Valley's workplace culture. At software companies, in particular, people talk (and argue, and dogpile, and offer unsolicited opinions) all the time, all over the place, including in forums like the one where Damore posted "Google's Ideological Echo Chamber." In my experience in the tech industry, such forums serve as repositories for all sorts of discussions (feature launches, bug fixes, birth announcements, introductions, farewells) and are meant, in part, to promote the open-source ethos that everyone can, and should, pitch in. But they also favor the kind of discourse that people outside the industry may recognize from online platforms such as Reddit and Hacker News; it is solution-oriented, purporting to value objectivity and rationalism above all, and tends to see the engineer's dispassion as a tool for solving a whole range of technical and social problems. ("Being emotionally unengaged helps us better reason about the facts," Damore writes.) But the format is ill-suited to conversations about politics and social justice.

One of the documents that resurfaced in the online discussion of the Google memo was "What You Can't Say," by Paul Graham, the co-founder, along with his wife, Jessica Livingston, of the startup accelerator Y Combinator, which runs Hacker News. The five-thousand-word essay, which Graham published on his personal blog, in 2004, begins with the premise that there exist "moral fashions" that are both arbitrary and pernicious. "Fashion is mistaken for good design; moral fashion is mistaken for good," he writes. The essay makes a case for contrarian thinking through a series of flattering analogies (Galileo was seen as a heretic in his time; John Milton was advised to keep quiet about the evils of the Roman Inquisition) and argues that opinions considered unfashionable in their time are often retroactively respected, if not taken as gospel. "The statements that make people mad are the ones they worry might be believed," Graham writes. "I suspect the statements that make people maddest are those they worry might be true." At several points, he refers to political correctness.

"What You Can't Say" is by no means a seminal text, but it is the sort of text that has, historically, spoken to a tech audience. "Google's Ideological Echo Chamber," with its veneer of cool rationalism, echoes Graham's essay in certain ways. But, where Graham's argument is made thoughtfully and in good faith (he is a proponent of intellectual inquiry, even if the outcome is controversial), Damore's is a sort of performance. His memo shows a deep misunderstanding of what constitutes power in Silicon Valley, and where that power lies. True, Google and its peers have put money and other company resources toward diversity efforts, and they very likely will continue to do so. But today, in mid-2017, men (white men) are still very much in the majority. It is still largely white men who make decisions, and largely white men who prosper. By positioning diversity programs as discriminatory, Damore paints exactly the opposite picture. He frames employees like himself as a silenced minority, and his contrarian opinions as a kind of Galilean heresy.

It is conceivable, of course, that Damore distributed his memo to thousands of his colleagues because he genuinely thought that it was the best way to strike up a conversation. "Open and honest discussion with those who disagree can highlight our blind spots and help us grow," he writes. Perhaps he expected that the ensuing dialogue would be akin to a debate over a chunk of code. But, given the memo's various denigrating assertions about his co-workers, it is difficult to imagine that it was offered in good faith. Damore wasn't fired for his political views; he was fired for how (and where) he applied them. The memo also hints at a larger anxiety: a fear, possibly, of the future. But technological advancement and social change move at different velocities; someone like Damore might sooner be automated out of a job than replaced by a woman.

Minority groups in tech are no strangers to being second-guessed, condescended to, overlooked, underpaid, and uncredited. But seeing Damore's arguments made public (and, in some cases, seeing them elicit support) was a fresh smack in the face. It was a reminder that plenty of tech workers and executives still consider hiring women and people of color "lowering the bar," and that proving one's place is a constant, Sisyphean task. After all, not so long ago, advocacy on behalf of women (and black, Latino, nonbinary, and otherwise underrepresented people) was the unfashionable, contrarian alternative in the tech industry.

Read the rest here:

How Silicon Valley's Workplace Culture Produced James Damore's Google Memo - The New Yorker

Muslims and Modernism – Kasmir Monitor

The nineteenth century witnessed a great change in the outlook of Muslims of the Subcontinent. Colonialism, along with the development of scientific attitude, affected the religious universe drastically. And, this, in turn, led to a hot debate on religious dogmas and rationality; rather a paradigm shift in the thought of educated Muslims. This shift created a modernist school, comprised mainly of those Muslims who showed a keen receptiveness to western institutions of learning and who judged things through the prism of modernity. This intellectual vibrancy took place in a more enthusiastic and radical way around the person of Sir Syed Ahmad khan, who was born in Delhi in 1817. To make the re-conciliation between religion and western attitude was central to his religious philosophy. He started a famous periodical Tahdhib al-Akhlaq and set up a scientific society for translating English books into Urdu so that the Muslims of the subcontinent would get acquainted with the advanced/progressive ideas of the West. While expounding the belief in naturalism, he stated, Today we are in need of modern Ilm al Kalam by which we should refute the dogmas of modern Science or show that they are in conformity with the Islamic creeds. According to him, whole physical universe including man is the work of God and religion is His word, so there cant be any contradiction between the two. The only touchstone of a real religion can be this: if it is in conformity with human nature or with nature in general, then it is true and real. Like the modernists of Christian world, he too tried to relinquish the metaphysical realities from the realm of faith. Reason and empiricism, according to him, are the only yardsticks to measure the reality. Swathed with the ideas of rationalism, he maintained that there is no intermediary between God and the prophet(SAW). Gabriel is in reality a symbolic representation of the prophetic faculty. Eschatologically, he further maintained that paradise and hell described in a sensuous terms in the sacred text are just emblematical representation of the psychological states of individuals in the life after death. Ibn Khuldun, a great Muslim historian and thinker, dealt well with the people like Sir Syed who were the preachers of rationalism during medieval era and has rightly mentioned in his famous Muqadimah that the mind is an accurate scale, whose recordings are certain and reliable; but to use it to weigh questions relating to the unity of God, or after life, or nature of prophecy or other such subjects falling outside its range, is like trying to use a goldsmiths scale to weigh mountains. To reconstruct the edifice of Muslim civilization, Sir Syed strongly advocated the ijtihadic endeavour. Apart from trying to untie the cosmic knots with reason and science, his buttressing to nullify Taqlid was very energetic and progressive. Taqlid is the sole reason, according to him, for the downfall of Muslim Ummah. Sir Syed Ahmad Khan not only started a sort of neo-Muttazilite understanding of the cosmos and the sacred text but also endeavoured to dilute the antagonistic attitude of western colonials. To meet this end, he dedicated himself to write an exegesis of the Bible in the light of Islamic intellectualism. Tabayin al-Kalam fi Tafsir al-Torahwa al-injilalamillahal-Islam is the name of that exegesis. It is not a commentary in a sense of Muslim Tafsir of the Quran. 
It is a collection of critical essays on certain aspects of Christianity that tends to stress the common ground, rather than the differences, between Christians and Muslims. The main contention of Sir Syed, as Syed Munir Wasti puts it, is that there is no fundamental difference between the account of Christianity given in the Bible and that given in the Quran. Muslim society in India was very hesitant to mingle socially with Christians. In order to dismantle this social barrier, he wrote a booklet, entitled Ahkam-i Taam-i Ahl-i Kitab, to explain that Muslim jurisprudence doesn't prevent Muslims from dining with the People of the Book, provided Haram food is not served.


How well do you know your suburb? – Daily Advertiser

11 Aug 2017, 2:21 p.m.

How well do you know where you live?

Are your neighbours likely to be young or old? Single or with kids? Renting or paying off a home? Born overseas or in Australia?

Take our seven-question quiz and find out. And if you get stuck, try again: you'll get different questions each time. There are also some hints below.

Enter the name of your suburb.

Once you have your score you can compare your result with other people from your area.

The quiz covers almost every one of Australia's 15,000-plus suburbs. The only ones not included are those with tiny populations.

Oceania includes Australia, Papua New Guinea, New Zealand and Pacific islands such as Fiji, Vanuatu and Tonga.

The Americas includes North and South America.

Family households include any home with a couple or dependent children. For example, a family household can be a married couple without kids, a same-sex couple living together, a single parent looking after their two children, or a blended household with step-parents and stepchildren.

Christianity takes in all denominations, such as Catholicism, Protestantism and Seventh-day Adventism.

No Religion includes Agnosticism, Atheism and secular beliefs such as Rationalism and Humanism.

The data used in this quiz comes from the 2016 Census.


Arithmetic Is on China’s Side – Truthdig

Eden Collinsworth

Eden Collinsworth is a former media executive and business consultant. She was president of Arbor House Publishing Co. and founder of the Los Angeles monthly lifestyle magazine Buzz before becoming a vice president at Hearst Corp...

Some 30 years ago, I took a bullet train from the airport in Shanghai to the center of that city. I was being hurled ahead at 268 miles an hour on a thin layer of air between the train and the magnetized narrow tracks. But that was not what fueled my disbelief. Far more disconcerting was what I saw outside the window when we slowed down: some of the peasants, knee deep in rice paddies, were talking on cell phones.

Entering the telecommunications age with satellite-based platforms, the Chinese were able to leapfrog over the expensive cable-based systems in the West. Currently, over 75 percent of its 1.3 billion-plus people have at least one cell phone.

Even before Donald Trump forfeited the United States' place at the global table, China, the world's largest nation, with a self-appointed government that seeks access to global markets and resources, had been chipping away at the edifice of America's dominance. As of May this year, the U.S. debt to China was $1.102 trillion, or 28 percent of the $3.9 trillion in Treasury bills, notes and bonds held by foreign countries.

China's state-owned firms have sought out iconic Western companies for direct investment, taking stakes in Greece's largest port, Portugal's biggest power plant, London's Heathrow Airport and Canada's energy giant, Nexen. These are only a few of the Asian country's investments in an unprecedented range of overseas deals projected to be worth between $2 trillion and $3 trillion by 2020.

The Chinese government has built other countries infrastructures, and it has made loans to nations hobbled by deficit. To add to this outreach, China is spending billions of dollars a year in the most extensive program of image-building the world has ever seen.

There was a time when Westerners, Americans in particular, thought that the Chinese would convert to Western ways. But China has not become more like us. Indeed, it would be an understatement to say that China stage-manages the exposure of Western ideas to its citizens. So, no, the Chinese do not intend to become like us.

Chinese culture, formed over 2,500 years, embraces a Confucian perspective, which is in stark contrast to the linear rationalism attributed to Western belief. Confucius' Analects (sayings) concentrate on the practical rather than the theoretical. They advise against reducing morality to a universal truth. Unlike the West, where Judeo-Christian ethics designate a non-negotiable right and wrong, the Chinese do not adhere to absolutes. Since China comprises 20 percent of the planet's population, one in five people in the world believes there is no single way of being wrong and many ways of being right.

Where does that leave the rest of us?

We in the West would like to believe that individual freedom determines our choices, but in reality we are ruled in large part by the prevailing time in which we live, and our world today is interconnected. Like it or not, we must confront a challenging question: What will anchor us in our own distinct beliefs and ethics while respecting other distinct cultures with different ethical and political systems?

To consider this, we'll need to take our eyes off the mirror in front of us and look at people with different truths and values, because only then will we be able to make the difficult decisions. At times, those decisions will require us to relinquish some part of our ground. At other times, they will call on us to protect the ground we are determined to hold. It will be these hard-won decisions, not the false promises of politicians, that will enable us to navigate the future of our conflicted, crowded world.


Donald Trump’s face-off with North Korea has made more than a few people terrified – the Irish News


"All machismo, no rationalism #NorthKorea," tweeted Key, Esq. (@kishenybarot) on August 8, 2017. "Trump is sittin' here threatening Kim Jong Un and instead of him being scared we are," wrote Sam Without A Hoodie (@hood_goat) the same day. Observers noted the ...


Thinking their way through new superstitions – Print – Times of India

Bengaluru: "Challenge accepted" -- AS Nataraj has been waiting to hear these words for the past 16 years after framing a seemingly simple challenge of 10 questions. To make it easier, he insists on only eight correct answers for the challenger to be eligible for the Rs 1 crore reward. The catch? The answers would involve the challenger accurately predicting an individual's future using a janam kundali, or astrological chart. Now you didn't see that coming, did you?

"The reward was Rs 10 lakh when I first issued the challenge in 2001. I increased it to Rs 1 crore because no one came forward despite initial promises. I am now sure that even if I raise the prize to Rs 100 crore, nobody will volunteer," says Nataraj, the 77-year-old founder of Akhila Karnataka Vicharavadi Sangha. His aim is to debunk astrology's main claim to fame - the power to pinpoint the future. "I know it is not true because I was also an astrologer," laughs Nataraj, author of Jyothishyakke Savaalu (Challenge to Astrology) and a veteran TV talking head on the matter.

The other challenge doing the rounds is aimed at busting a scientifically untested brain-training programme. Narendra Nayak, the rationalist crusader from Mangaluru, has been holding demonstrations and challenging proponents of mid-brain activation for the last two years. The groups behind this fad take money from parents to enhance the brainpower of their children through the 'activation'. Those trained can apparently see after being blindfolded. "People fall for new tricks all the time. Mid-brain activation involves teaching children to lie (about peeking from behind the blindfold). The organisers use pseudo-science jargon and it becomes difficult for laypersons to understand," says Nayak, president of the Federation of Indian Rationalist Associations (FIRA).

LOGIC WINS

For every new trickster in town, there are a few rationalists like Nayak who demand that fantastic claims be backed by evidence, scientific reasoning and stone-cold logic. If not, people like him resort to dramatic one-upmanship and myth-busting on public platforms to uphold what they see as truth and rationality.

"Earlier, we used to go after petty godmen who produced ash from thin air or put their hands in boiling water. Now, the picture has changed," says Nayak, a 67-year-old trained bio-chemist. The new age miracles involve coming up with sales pitches to sell anything from yoga, millets, salt room therapy and apple cider vinegar as cures for various ills, including cancer, he says. The marketers rely on scientific terms or the ancient Indian label to bamboozle people.

As a trained scientist, Nayak is riled by pseudo-science. Recently, he wrote a detailed complaint to the Advertising Standards Council of India about tall claims made by a coconut oil manufacturer in an ad. The regulatory body found that many of the claims, such as that the oil is a 'natural antiseptic' and 'restores thyroid function and reduces obesity', were not substantiated and hence misleading. It asked the advertiser to withdraw or modify the ad.

ATHEISTIC START

For most such activists, rationalism starts with a healthy dose of atheism. Nayak says he became an atheist at the age of 11 after coming to a conclusion ("maybe hasty") about there being no god despite his prayers. A national science talent scholarship cemented his rationalist leanings and later, after a meeting with the legendary rationalist Abraham Kovoor, he joined the movement.

It isn't easy to break down strong beliefs. Nataraj, who became a rationalist after practicing astrology for several years, says he can hold his own in heated TV debates because he has studied several works about astrology. "There are times when TV astrologers have asked me in private why I oppose astrology as I know so much about it. I tell them we have to have proof," says Nataraj.

UPHILL BATTLE

Public confrontations have a tendency to deteriorate quickly. Sanal Edamaruku, a Delhi-based rationalist, had to relocate to Finland to avoid arrest in a blasphemy case filed by a Mumbai church. Edamaruku, who exposed 'Pilot' Baba and other assorted godmen across India, says that in the Mumbai case he was held up at a TV studio for hours after a violent mob thronged outside, incensed by his saying that the miracle tears of a statue came from a leaky drainpipe. "I am not a hatemonger but I gave my opinion after observation (he was invited to see the statue). Listeners can choose to disbelieve. But the situation turned violent and I escaped through the studio's back gate after three-four hours," says Edamaruku, who is bringing out a memoir detailing 25 of the most memorable investigations he has done so far.


empiricism | philosophy | Britannica.com

Empiricism, in philosophy, the view that all concepts originate in experience, that all concepts are about or applicable to things that can be experienced, or that all rationally acceptable beliefs or propositions are justifiable or knowable only through experience. This broad definition accords with the derivation of the term empiricism from the ancient Greek word empeiria, "experience."

Concepts are said to be a posteriori (Latin: "from the latter") if they can be applied only on the basis of experience, and they are called a priori ("from the former") if they can be applied independently of experience. Beliefs or propositions are said to be a posteriori if they are knowable only on the basis of experience and a priori if they are knowable independently of experience (see a posteriori knowledge). Thus, according to the second and third definitions of empiricism above, empiricism is the view that all concepts, or all rationally acceptable beliefs or propositions, are a posteriori rather than a priori.

The first two definitions of empiricism typically involve an implicit theory of meaning, according to which words are meaningful only insofar as they convey concepts. Some empiricists have held that all concepts are either mental copies of items that are directly experienced or complex combinations of concepts that are themselves copies of items that are directly experienced. This view is closely linked to the notion that the conditions of application of a concept must always be specified in experiential terms.


The third definition of empiricism is a theory of knowledge, or theory of justification. It views beliefs, or at least some vital classes of belief (e.g., the belief that this object is red), as depending ultimately and necessarily on experience for their justification. An equivalent way of stating this thesis is to say that all human knowledge is derived from experience.

Empiricism regarding concepts and empiricism regarding knowledge do not strictly imply each other. Many empiricists have admitted that there are a priori propositions but have denied that there are a priori concepts. It is rare, however, to find a philosopher who accepts a priori concepts but denies a priori propositions.

Stressing experience, empiricism often opposes the claims of authority, intuition, imaginative conjecture, and abstract, theoretical, or systematic reasoning as sources of reliable belief. Its most fundamental antithesis is with the latter, i.e., with rationalism, also called intellectualism or apriorism. A rationalist theory of concepts asserts that some concepts are a priori and that these concepts are innate, or part of the original structure or constitution of the mind. A rationalist theory of knowledge, on the other hand, holds that some rationally acceptable propositions, perhaps including the proposition that everything must have a sufficient reason for its existence (the principle of sufficient reason), are a priori. A priori propositions, according to rationalists, can arise from intellectual intuition, from the direct apprehension of self-evident truths, or from purely deductive reasoning.

In both everyday attitudes and philosophical theories, the experiences referred to by empiricists are principally those arising from the stimulation of the sense organs, i.e., from visual, auditory, tactile, olfactory, and gustatory sensation. (In addition to these five kinds of sensation, some empiricists also recognize kinesthetic sensation, or the sensation of movement.) Most philosophical empiricists, however, have maintained that sensation is not the only provider of experience, admitting as empirical the awareness of mental states in introspection or reflection (such as the awareness that one is in pain or that one is frightened); such mental states are then often described metaphorically as being present to an "inner sense." It is a controversial question whether still further types of experience, such as moral, aesthetic, or religious experience, ought to be acknowledged as empirical. A crucial consideration is that, as the scope of experience is broadened, it becomes increasingly difficult to distinguish a domain of genuinely a priori propositions. If, for example, one were to take the mathematician's intuition of relationships between numbers as a kind of experience, one would be hard-pressed to identify any kind of knowledge that is not ultimately empirical.

Even when empiricists agree on what should count as experience, however, they may still disagree fundamentally about how experience itself should be understood. Some empiricists, for example, conceive of sensation in such a way that what one is aware of in sensation is always a mind-dependent entity (sometimes referred to as a sense datum). Others embrace some version of direct realism, according to which one can directly perceive or be aware of physical objects or physical properties (see epistemology: realism). Thus there may be radical theoretical differences even among empiricists who are committed to the notion that all concepts are constructed out of elements given in sensation.


Two other viewpoints related to but not the same as empiricism are the pragmatism of the American philosopher and psychologist William James, an aspect of which was what he called radical empiricism, and logical positivism, sometimes also called logical empiricism. Although these philosophies are empirical in some sense, each has a distinctive focus that warrants its treatment as a separate movement. Pragmatism stresses the involvement of ideas in practical experience and action, whereas logical positivism is more concerned with the justification of scientific knowledge.

When describing an everyday attitude, the word empiricism sometimes conveys an unfavourable implication of ignorance of or indifference to relevant theory. Thus, to call a doctor an Empiric has been to call him a quack, a usage traceable to a sect of medical men who were opposed to the elaborate medical (and in some views metaphysical) theories inherited from the Greek physician Galen of Pergamum (129–c. 216 ce). The medical empiricists opposed to Galen preferred to rely on treatments of observed clinical effectiveness, without inquiring into the mechanisms sought by therapeutic theory. But empiricism, detached from this medical association, may also be used, more favourably, to describe a hard-headed refusal to be swayed by anything but the facts that the thinker has observed for himself, a blunt resistance to received opinion or precarious chains of abstract reasoning.

As a more strictly defined movement, empiricism reflects certain fundamental distinctions and occurs in varying degrees.


A distinction that has the potential to create confusion is the one that contrasts the a posteriori not with the a priori but with the innate. Since logical problems are easily confused with psychological problems, it is difficult to disentangle the question of the causal origin of concepts and beliefs from the question of their content and justification.

A concept, such as "five," is said to be innate if a person's possession of it is causally independent of his experience, e.g., his perception of various groupings of five objects. Similarly, a belief is innate if its acceptance is causally independent of the believer's experience. It is therefore possible for beliefs to be innate without being a priori: for example, the baby's belief that its mother's breast will nourish it is arguably causally independent of its experience, though experience would be necessary to justify it.

Another supposedly identical, but in fact more or less irrelevant, property of concepts and beliefs is the universality of their possession or acceptance: the claim that a priori or innate concepts and beliefs must be held by everyone. There may be, in fact, some basis for inferring universality from innateness, since many innate characteristics, such as the fear of loud noises, appear to be common to the whole human species. But there is no inconsistency in the supposition that a concept or belief is innate in one person and learned from experience in another.

Two main kinds of concept have been held to be a priori. First, there are certain formal concepts of logic and of mathematics that reflect the basic structure of discourse: not, and, or, if, all, some, existence, unity, number, successor, and infinity. Secondly, there are the categorial concepts, such as substance, cause, mind, and God, which, according to some philosophers, are imposed by the mind upon the raw data of sensation in order to make experiences possible. One might add to these the more specific theoretical concepts of physics, which are sometimes said to apply to entities that are unobservable in principle.

In the long history of debate over the a priori, it was long taken for granted that all a priori propositions are necessarily true, i.e., true by virtue of the meanings of their terms (analytic) or true by virtue of the fact that their negations imply a contradiction. Propositions such as "all triangles have three sides," "all bachelors are unmarried," and "all red things are coloured" are necessarily true in one or both of these senses. Likewise, it was held that propositions that are contingently true, or true merely by virtue of the way the world happens to be, are a posteriori. "John is a bachelor" and "John's house is red" are propositions of this type.

In the 1970s, however, the American philosopher Saul Kripke argued to the contrary that some a priori propositions are contingent and some a posteriori propositions are necessary. According to Kripke, the referential properties of natural-kind terms like "heat" can be understood by imagining that their referents were fixed, upon their introduction into the language, by means of certain definite descriptions, such as "the cause of sensations of warmth." In other words, "heat" was introduced as a name for whatever phenomenon happened to satisfy the description "the cause of sensations of warmth." Of course, the phenomenon in question is now known to be molecular motion. Thus "heat" refers to molecular motion, then and now, because molecular motion was the cause of sensations of warmth when the term was introduced. Given this introduction, however, the proposition "heat causes sensations of warmth" must be a priori. Because its introduction stipulated that heat is the phenomenon that causes sensations of warmth, it is knowable independently of experience that heat causes sensations of warmth, even though it is only a contingent matter of fact that it does. On the other hand, the proposition "heat is molecular motion" is a posteriori, because this fact about heat was discovered (and could only be discovered) through empirical scientific investigation. But the proposition is also necessary, according to Kripke, because once the referent of "heat" has been fixed as molecular motion, there are no imaginable circumstances in which the term could refer to anything else. This conclusion is supported by the intuition that, if it were discovered tomorrow that sensations of warmth in humans are actually caused by something other than molecular motion, one would not say that heat is not molecular motion but rather that sensations of warmth are caused by something other than heat. Kripke proposed a similar analysis of the referential properties of proper names like "Aristotle," according to which a proposition like "Aristotle was the teacher of Alexander the Great" is contingent but a priori.

Empiricism, whether concerned with concepts or knowledge, can be held with varying degrees of strength. On this basis, absolute, substantive, and partial empiricisms can be distinguished.

Absolute empiricists hold that there are no a priori concepts, either formal or categorial, and no a priori beliefs or propositions. Absolute empiricism about the former is more common than that about the latter, however. Although nearly all Western philosophers admit that obvious tautologies (e.g., "all red things are red") and definitional truisms (e.g., "all triangles have three sides") are a priori, many of them would add that these represent a degenerate case.

A more moderate form of empiricism is that of the substantive empiricists, who are unconvinced by attempts that have been made to interpret formal concepts empirically and who therefore concede that formal concepts are a priori, though they deny that status to categorial concepts and to the theoretical concepts of physics, which they hold are a posteriori. According to this view, allegedly a priori categorial and theoretical concepts are either defective, reducible to empirical concepts, or merely useful fictions for the prediction and organization of experience.

The parallel point of view about knowledge assumes that the truth of logical and mathematical propositions is determined, as is that of definitional truisms, by the relationships between meanings that are established prior to experience. The truth often espoused by ethicists, for example, that one is truly obliged to rescue a person from drowning only if it is possible to do so, is a matter of meanings and not of facts about the world. On this view, all propositions that, in contrast to the foregoing example, are in any way substantially informative about the world are a posteriori. Even if there are a priori propositions, they are formal or verbal or conceptual in nature, and their necessary truth derives simply from the meanings attached to the words they contain. A priori knowledge is useful because it makes explicit the hidden implications of substantive, factual assertions. But a priori propositions do not themselves express genuinely new knowledge about the world; they are factually empty. Thus "All bachelors are unmarried" merely gives explicit recognition to the commitment to describe as unmarried anyone who has been described as a bachelor.

Substantive empiricism about knowledge regards all a priori propositions as being more-or-less concealed tautologies. If a person's duty is thus defined as "that which he should always do," the statement "A person should always do his duty" then becomes "A person should always do what he should always do." Deductive reasoning is conceived accordingly as a way of bringing this concealed tautological status to light. That such extrication is nearly always required means that a priori knowledge is far from trivial.

For the substantive empiricist, truisms and the propositions of logic and mathematics exhaust the domain of the a priori. Science, on the other hand, from the fundamental assumptions about the structure of the universe to the singular items of evidence used to confirm its theories, is regarded as a posteriori throughout. The propositions of ethics and those of metaphysics, which deals with the ultimate nature and constitution of reality (e.g., "only that which is not subject to change is real"), are either disguised tautologies or pseudo-propositions, i.e., combinations of words that, despite their grammatical respectability, cannot be taken as true or false assertions at all.

The least thoroughgoing type of empiricism here distinguished, ranking third in degree, can be termed partial empiricism. According to this view, the realm of the a priori includes some concepts that are not formal and some propositions that are substantially informative about the world. The theses of the transcendental idealism of Immanuel Kant (1724–1804), the general scientific conservation laws, the basic principles of morality and theology, and the causal laws of nature have all been held by partial empiricists to be both synthetic (substantially informative) and a priori. As noted above, philosophers who embrace the Kripkean notion of reference fixing would add to this class propositions such as "heat is the cause of sensations of warmth" and "Aristotle was the teacher of Alexander the Great," both of which derive their presumed aprioricity from the hypothetical circumstances in which their subject terms were introduced. At any rate, in all versions of partial empiricism there remain a great many straightforwardly a posteriori concepts and propositions: ordinary singular propositions about matters of fact and the concepts that figure in them are held to fall in this domain.

So-called common sense might appear to be inarticulately empiricist; and empiricism might be usefully thought of as a critical force resisting the pretensions of a more speculative rationalist philosophy. In the ancient world the kind of rationalism that many empiricists oppose was developed by Plato (c. 428–c. 348 bce), the greatest of rationalist philosophers. The ground was prepared for him by three earlier bodies of thought: the Ionian cosmologies of the 6th century bce, with their distinction between sensible appearance and a reality accessible only to pure reason; the philosophy of Parmenides (early 5th century bce), the important early monist, in which purely rational argument is used to prove that the world is really an unchanging unity; and Pythagoreanism, which, holding that the world is really made of numbers, took mathematics to be the repository of ultimate truth.

The first empiricists in Western philosophy were the Sophists, who rejected such rationalist speculation about the world as a whole and took humanity and society to be the proper objects of philosophical inquiry. Invoking skeptical arguments to undermine the claims of pure reason, they posed a challenge that invited the reaction that comprised Plato's philosophy.

Plato and, to a lesser extent, Aristotle were rationalists. But Aristotle's successors in the ancient Greek schools of Stoicism and Epicureanism advanced an explicitly empiricist account of the formation of human concepts. For the Stoics the human mind is at birth a clean slate, which comes to be stocked with concepts by the sensory impingement of the material world upon it. Yet they also held that there are some concepts or beliefs, the "common notions," that are present to the minds of all humans; and these soon came to be conceived in a nonempirical way. The empiricism of the Epicureans, however, was more pronounced and consistent. For them human concepts are memory images, the mental residues of previous sense experience, and knowledge is as empirical as the ideas of which it is composed.

Most medieval philosophers after St. Augustine (354–430) took an empiricist position, at least about concepts, even if they recognized much substantial but nonempirical knowledge. The standard formulation of this age was: "There is nothing in the intellect that was not previously in the senses." Thus St. Thomas Aquinas (1225–74) rejected innate ideas altogether. Both soul and body participate in perception, and all ideas are abstracted by the intellect from what is given to the senses. Human ideas of unseen things, such as angels and demons and even God, are derived by analogy from the seen.

The 13th-century scientist Roger Bacon emphasized empirical knowledge of the natural world and anticipated the polymath Renaissance philosopher of science Francis Bacon (1561–1626) in preferring observation to deductive reasoning as a source of knowledge. The empiricism of the 14th-century Franciscan nominalist William of Ockham was more systematic. All knowledge of what exists in nature, he held, comes from the senses, though there is, to be sure, abstractive knowledge of necessary truths; but this is merely hypothetical and does not imply the existence of anything. His more extreme followers extended his line of reasoning toward a radical empiricism, in which causation is not a rationally intelligible connection between events but merely an observed regularity in their occurrence.

In the earlier and unsystematically speculative phases of Renaissance philosophy, the claims of Aristotelian logic to yield substantial knowledge were attacked by several 16th-century logicians; in the same century, the role of observation was also stressed. One mildly skeptical Christian thinker, Pierre Gassendi (1592–1655), advanced a deliberate revival of the empirical doctrines of Epicurus. But the most important defender of empiricism was Francis Bacon, who, though he did not deny the existence of a priori knowledge, claimed that, in effect, the only knowledge that is worth having (as contributing to the relief of the human condition) is empirically based knowledge of the natural world, which should be pursued by the systematic, indeed almost mechanical, arrangement of the findings of observation and is best undertaken in the cooperative and impersonal style of modern scientific research. Bacon was, in fact, the first to formulate the principles of scientific induction.

A materialist and nominalist, Thomas Hobbes (1588–1679) combined an extreme empiricism about concepts, which he saw as the outcome of material impacts on the bodily senses, with an extreme rationalism about knowledge, of which, like Plato, he took geometry to be the paradigm. For him all genuine knowledge is a priori, a matter of rigorous deduction from definitions. The senses provide ideas; but all knowledge comes from reckoning, from deductive calculations carried out on the names that the thinker has assigned to them. Yet all knowledge also concerns material and sensible existences, since everything that exists is a body. (On the other hand, many of the most important claims of Hobbes's ethics and political philosophy certainly seem to be a posteriori, insofar as they rely heavily on his experience of human beings and the ways in which they interact.)

The most elaborate and influential presentation of empiricism was made by John Locke (1632–1704), an early Enlightenment philosopher, in the first two books of his Essay Concerning Human Understanding (1690). All knowledge, he held, comes from sensation or from reflection, by which he meant the introspective awareness of the workings of one's own mind. Locke often seemed not to separate clearly the two issues of the nature of concepts and the justification of beliefs. His Book I, though titled "Innate Ideas," is largely devoted to refuting innate knowledge. Even so, he later admitted that much substantial knowledge, in particular that of mathematics and morality, is a priori. He argued that infants know nothing; that if humans are said to know innately what they are capable of coming to know, then all knowledge is, trivially, innate; and that no beliefs whatever are universally accepted. Locke was more consistent about the empirical character of all concepts, and he described in detail the ways in which simple ideas can be combined to form complex ideas of what has not in fact been experienced. One group of dubiously empirical concepts, those of unity, existence, and number, he took to be derived both from sensation and from reflection. But he allowed one a priori concept, that of substance, which the mind adds, seemingly from its own resources, to its conception of any regularly associated group of perceptible qualities.

Bishop George Berkeley (1685–1753), a theistic idealist and opponent of materialism, applied Locke's empiricism about concepts to refute Locke's account of human knowledge of the external world. Because Berkeley was convinced that in sense experience one is never aware of anything but what he called "ideas" (mind-dependent qualities), he drew and embraced the inevitable conclusion that physical objects are simply collections of perceived ideas, a position that ultimately leads to phenomenalism, i.e., to the view that propositions about physical reality are reducible to propositions about actual and possible sensations. He accounted for the continuity and orderliness of the world by supposing that its reality is upheld in the perceptions of an unsleeping God. The theory of spiritual substance involved in Berkeley's position seems to be vulnerable, however, to most of the same objections as those that he posed against Locke. Although Berkeley admitted that he did not have an idea of mind (either his own or the mind of God), he claimed that he was able to form what he called a "notion" of it. It is not clear how to reconcile the existence of such notions with a thoroughgoing empiricism about concepts.

The Scottish skeptical philosopher David Hume (1711–76) fully elaborated Locke's empiricism and used it reductively to argue that there can be no more to the concepts of body, mind, and causal connection than what occurs in the experiences from which they arise. Like Berkeley, Hume was convinced that perceptions involve no constituents that can exist independently of the perceptions themselves. Unlike Berkeley, he could find neither an idea nor a notion of mind or self, and as a result his radical empiricism contained an even more parsimonious view of what exists. While Berkeley thought that only minds and their ideas exist, Hume thought that only perceptions exist and that it is impossible to form an idea of anything that is not a perception or a complex of perceptions. For Hume all necessary truth is formal or conceptual, determined by the various relations that hold between ideas.

Voltaire (1694–1778) imported Locke's philosophy into France. Its empiricism, in a very stark form, became the basis of sensationalism, in which all of the constituents of human mental life are analyzed in terms of sensations alone.

A genuinely original and clarifying attempt to resolve the controversy between empiricists and their opponents was made in the transcendental idealism of Kant, who drew upon both Hume and Gottfried Wilhelm Leibniz (1646–1716). With the dictum that, although all knowledge begins with experience, it does not all arise from experience, he established a clear distinction between the innate and the a priori. He held that there are a priori concepts, or categories, substance and cause being the most important, and also substantial or synthetic a priori truths. Although not derived from experience, the latter apply to experience. A priori concepts and propositions do not relate to a reality that transcends experience; they reflect, instead, the mind's way of organizing the amorphous mass of sense impressions that flow in upon it.

Lockean empiricism prevailed in 19th-century England until the rise of Hegelianism in the last quarter of the century (see also Georg Wilhelm Friedrich Hegel). To be sure, the Scottish philosophers who followed Hume but avoided his skeptical conclusions insisted that humans do have substantial a priori knowledge. But the philosophy of John Stuart Mill (1806–73) is thoroughly empiricist. He held that all knowledge worth having, including mathematics, is empirical. The apparent necessity and aprioricity of mathematics, according to Mill, is the result of the unique massiveness of its empirical confirmation. All real knowledge for Mill is inductive and empirical, and deduction is sterile. (It is not clear that Mill consistently adhered to this position, however. In both his epistemology and his ethics, he sometimes seemed to recognize the need for first principles that could be known without proof.) The philosopher of evolution Herbert Spencer (1820–1903) offered another explanation of the apparent necessity of some beliefs: they are the well-attested (or naturally selected) empirical beliefs inherited by living humans from their evolutionary ancestors. Two important mathematicians and pioneers in the philosophy of modern physics, William Kingdon Clifford (1845–79) and Karl Pearson (1857–1936), defended radically empiricist philosophies of science, anticipating the logical empiricism of the 20th century.

The most influential empiricist of the 20th century was the great British philosopher and logician Bertrand Russell (1872–1970). Early in his career Russell admitted both synthetic a priori knowledge and concepts of unobservable entities. Later, through discussions with his pupil Ludwig Wittgenstein (1889–1951), Russell became convinced that the truths of logic and mathematics are analytic and that logical analysis is the essence of philosophy. In his empiricist phase, Russell analyzed concepts in terms of what one is directly acquainted with in experience (where experience was construed broadly enough to include not only awareness of sense data but also awareness of properties construed as universals). In his neutral monist phase, he tried to show that even the concepts of formal logic are ultimately empirical, though the experience that supplies them may be introspective instead of sensory.

Doctrines developed by Russell and Wittgenstein influenced the German-American philosopher Rudolf Carnap (1891–1970) and the Vienna Circle, a discussion group in which the philosophy of logical positivism was developed. The empirical character of logical positivism is especially evident in its formulation of what came to be known as the verification principle, according to which a sentence is meaningful only if it is either tautologous or in principle verifiable on the basis of sense experience.

Later developments in epistemology served to make some empiricist ideas about knowledge and justification more attractive. One of the traditional problems faced by more radical forms of empiricism was that they seemed to provide too slender a foundation upon which to justify what humans think they know. If sensations can occur in the absence of physical objects, for example, and if what one knows immediately is only the character of one's own sensations, how can one legitimately infer knowledge of anything else? Hume argued that the existence of a sensation is not a reliable indicator of anything other than itself. In contrast, adherents of a contemporary school of epistemology known as externalism have argued that sensations (and other mental states) can play a role in justifying what humans think they know, even though the vast majority of humans are unaware of what that role is. The crude idea behind one form of externalism, reliabilism, is that a belief is justified when it is produced through a reliable process, i.e., a process that reliably produces true beliefs. Humans may be evolutionarily conditioned to respond to certain kinds of sensory stimuli with a host of generally true, hence justified, beliefs about their environment. Thus, within the framework of externalist epistemology, empiricism might not lead so easily to skepticism.


Rationalism and Nuclear Lunacy – Center for Research on Globalization

The Democratic People's Republic of Korea (North Korea) is ramping up its nuclear deterrence, and this is causing consternation and wild proclamations from Western officials and corporate media. What is particularly galling for the United States is that North Korea appears to have achieved the capability of hitting the US mainland with ICBMs.

However, is the US not capable of hitting North Korea from anywhere? So why does a rival created by the US [1] cause panicked rhetoric upon achieving an ICBM capacity?

If your castle is capable of being targeted by a bellicose castle with inter-castle projectiles, would you leave yourself undefended? Especially when the bellicose castle has already destroyed the disarmed Iraqi castle as well as the disarmed Libyan castle.

US Senator Lindsey Graham said,

"The only way they [the North Korean government] are going to change is if they believe there is a credible threat of military force on the table."

Graham believes any war will be confined to the East Asian region.

Why would Graham speak such provocative words? Follow the money. Graham's campaign fundraising appears aimed at the arms industry: "Security through Strength."

US Secretary of State Rex Tillerson is advocating peaceful pressure against North Korea and a willingness to hold talks. However, there is a condition, and it certainly will not entice the North Koreans to talks: that they disarm themselves of nuclear weapons and the means to deliver them. What is unstated is that the US will not disarm in any way whatsoever. The lessons of the disarmed and subsequently destroyed Iraqi and Libyan castles would seem to urge a cautionary approach.

Jack Rice, a former CIA agent, referred to North Korea as a threat. Why? Who is threatening whom? North Korea has pledged no first use of nukes. The US has not. So who is the actual threat?

The US is modernizing its nuclear stockpile, which is a stark abrogation of its undertaking as a signatory of the Treaty on the Non-Proliferation of Nuclear Weapons. The NPT's Article VI states:

Each of the Parties to the Treaty undertakes to pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament, and on a treaty on general and complete disarmament under strict and effective international control. [emphasis added]

North Korea has never attacked the US. It was the US that attacked North Korea during the so-called Korean War. The US used chemical and biological weapons, and an estimated 4-10 million Koreans were killed. [2]

A Rational Analysis of What a Nuclear-armed North Korea Portends

1. It is clear from the cases of Iraq and Libya that a disarmed US-designated enemy is not spared from a violent opportunistic attack. That North Korea was included in George W. Bush's "axis of evil" along with Iraq triggered alarm bells in North Korea.

2. The US refuses a peace treaty with North Korea. [3] And the sanctions against North Korea constitute an act of war. Trump tweeted, "China could easily solve this problem." But it is not China maintaining a state of war with North Korea.

3. The US is nuclear-armed, has used nuclear weapons, and does not adhere to a no-first-use policy.

Given the above three points, would it be rational to be without an effective deterrent against a military attack?

Furthermore, when North Korea did enter into an Agreed Framework with the US in 1994, the obligations included an end to hostilities; normalization of relations; no nuclearization of the peninsula; freezing operation and construction of North Korean nuclear reactors in exchange for two proliferation-resistant nuclear power reactors; and, while awaiting completion of those reactors, the provision of oil by the US for North Korean energy needs. The US did not fulfill its obligations. In other words, the US cannot be trusted to uphold its end of any agreement.

If North Korea were ever to launch a nuclear weapon, or even launch a non-nuclear attack against another country, the North Korean government would be committing an act of suicide. Kim Jong-un's grandfather and father were not suicidal, so there is no reason to suspect familial psychosis.

If North Korea has achieved and maintains an effective nuclear deterrence, then a US attack is imaginable only in a nightmare Bizarro World. An attack on a nuclear-armed North Korea would be mad. The US would not emerge unscathed from such an attack. Major population centers such as Seoul, Busan, and Tokyo (all places where US troops are stationed), and perhaps the US mainland, would be hit. Of course, North Korea would be obliterated. And even if the continental US were not hit by nukes, the radiation from nuclear fallout and a potential nuclear winter would affect the entire planet.

Consequently, all the talk in the media of a war is irrational conjecture or bluffing.

Rationality demands that all sides avoid any brinkmanship.

Kim Petersen is a former co-editor of the Dissident Voice newsletter. He can be reached at: [emailprotected]. Twitter: @kimpetersen.

Notes

1. At the end of World War II, the Korean People's Republic arose, and the first cabinet was formed on 14 September 1945. The US scuttled the Korean People's Republic. See Nhial Esso, What You Don't Know about North Korea Could Fill a Book (Intransitive Publishers, 2013): 15%. See Bruce Cumings, Korea's Place in the Sun: A Modern History (New York: W.W. Norton & Co., 2005): 238.

2. See Korean Truth Commission, Report on U.S. Crimes in Korea: 1945-2001 (New York: 2001).

3. Said former US Secretary of State Colin Powell: "We won't do nonaggression pacts or treaties, things of that nature." Quoted in Steven R. Weisman, "U.S. Weighs Reward if North Korea Scraps Nuclear Arms," New York Times, 13 August 2003.

Featured image is from Infowars.


A New Short Film Offers a Private Look Into the Life of an Italian Architect and Design Enigma – Vogue.com

Though he was one of Italy's most influential mid-20th-century architects and interior designers, very little is known about the inner world of the Turinese legend Carlo Mollino. Born in 1905 in the northern Italian city of Turin, Mollino became a figure of fascination for design enthusiasts worldwide, many of whom were transfixed by his hidden private life and his ability to create dreamy, sensuous spaces inspired by his various obsessions, which ranged from the voluptuousness of the female form to symbols and talismans of witchcraft and the occult. At a time when the style of the day was, for the most part, defined by the movement known as Rationalism (led by fellow design giants like Gio Ponti and the Castiglioni brothers, who looked to architecture primarily as a self-effacing entity, created more for streamlined functionality than for decoration), Mollino's work was singular, overtly romantic, and a far cry from the goings-on in Milan.

Carlo Mollinos RAI Auditorium, built in 1952. Photo: Courtesy of Oscar Humphries

After graduating from college, where he studied engineering, architecture, and art history, Mollino began working for his father's architecture firm. There, he entered several design competitions, winning for projects like the Agricultural Federation in Cuneo, Italy, and the Turin Equestrian Association headquarters, both of which were unusually artsy for buildings intended for public use and illustrated his predilection for sloping forms and circular spaces. After Mollino left his father's firm, he spent the rest of his life picking and choosing his own projects, many of them commissions for private homes hidden from public view. His most famous work, the grand Teatro Regio in Turin, an opera house, is one of his only buildings still standing today.

As appreciation of Mollino's oeuvre has grown over the years, the scarcity of work available to view and acquire has only added fuel to the fire. In 2005, a Mollino table set a record for 20th-century furniture at Christie's, selling for $3.8 million. "Its great appeal is the immediately seductive look," Philippe Garner, a former director at Christie's, told The New York Times in a 2009 interview. "The fact that virtually every piece can be traced to a specific commission and that production was very limited add the appeal of rarity."

The chairs in Carlo Mollinos RAI Auditorium. Photo: Courtesy of Oscar Humphries

It was only when Mollino expert and curator Fulvio Ferrari and his son Napoleone discovered and restored an apartment Mollino had been secretly working on that the doors to the architect's world opened. A social recluse for most of his life, Mollino spent years creating and decorating a home for himself on the River Po in which to live out his later days. Inside, both his dark strangeness and his genius were revealed: immaculately decorated rooms, strange voodoo imagery hung on walls and ceilings, and hundreds of erotic Polaroids taken of women who modeled for him. Obsessed with Ancient Egyptian mummification and its attendant beliefs, Mollino also created a wooden boat-like bed that served as a symbolic vessel of passage into the afterlife, placed in a room meticulously prepared for his death. Though he never actually lived in this apartment, it spoke most aptly to his deep love of all things beautiful, revealing how carefully he tried to construct the world around him. It is within this space, now known as the Museo Casa Mollino, a highlight for visitors to Turin, that Mollino has been brought back to life.

In a beautiful new short film, directed by Felipe Sanguinetti, produced by Oscar Humphries, narrated by Fulvio Ferrari, and given exclusively to Vogue, we are offered visits to Mollino's Teatro Regio and Casa Mollino. The film provides private insights into Mollino's mind and how he saw the world. Shot from around corners and through half-opened doors, the visual narrative is atmospheric in its secrecy, just as one would imagine for spaces of Mollino's. His presence is palpable and, in many ways, evidently vulnerable in the navigation of the camera's lens: as viewers, we get the distinct impression that we are walking side by side with Mollino himself, re-seeing the spaces so close to his heart.

The completed Teatro Regio, 1973. Photo: Courtesy of Oscar Humphries

"Mollino is so famous for the Polaroids he took and his iconic pieces of design that as an architect he's often overlooked," said Humphries, who shot the film with his friend Sanguinetti in June. "But he was an architect first, and we wanted to show that."

Of the film's humanized perspective, Sanguinetti noted: "I wanted to share what I felt in these two spaces. It's unlike anything I've ever experienced before, and what Mollino brings out in people is such a unique and emotional response to his work. I hope the spectator, when watching the film, can feel that."


Reading the Bible with the Founding Fathers by Daniel Dreisbach – Church Times

NAME the American Founding Fathers, or at least the ones everyone knows, and then describe their religion. George Washington: reticent, probably lukewarm; Jefferson: accused of atheism, disliked organised religion, basically a deist; Franklin: another deist; Hamilton: youthfully religious, lost his enthusiasm, not keen on churchgoing; Madison: largely indifferent. Only John Adams and John Jay can really lay claim to piety.

Add to this the Fathers' undeniable enthusiasm for Enlightenment rationalism, the new nation's desire to keep Church and State separate, and the First Amendment to the Constitution ("Congress shall make no law respecting an establishment of religion"), and you seem to have built up a pretty godless, or at least God-uninterested, picture.

One of the many merits of Daniel Dreisbach's book is to show how misleading this picture is. Against this popular image, the Bible was referenced more often than any other text, or even writer, during the Revolutionary period. The most prominent Founding Fathers were not typical of American revolutionaries, and even they were steeped in, and often fascinated by, biblical ideas and figures.

Dreisbach shows how prevalent the Bible was in early American culture and politics: think England or Scotland c. 1650, and you wouldn't be far wrong. He also demonstrates, in the book's best chapter, that the Revolutionaries' political theorising, in particular their justification for rebellion, would have been impossible (or at least unrecognisable) without two preceding centuries of Protestant resistance theology: Vindiciae contra tyrannos was an extremely useful text when you found yourself defending liberty against those you considered to be tyrants.

Dreisbach recognises that the Fathers' biblical rhetoric was sometimes only skin-deep, borrowing figures and phrases to lend political speechifying a weight, dignity, and significance that it would not otherwise have had. Nevertheless, to dismiss it all as theological window-dressing is mistaken. Even when the Bible was not embedded in the Fathers' lives (and chapter three shows that it often was), it underpinned and defined the sense of justice, rights, duty, liberty, providence, and destiny that created the new nation.

Reading the Bible with the Founding Fathers is a scholarly book, drawing on an abundance of source material and demonstrating an admirable familiarity with the period and the Bible. It is also somewhat repetitive: having established that the Fathers were deeply informed by biblical language, narrative and ideas, Dreisbach effectively goes on repeating the conclusion with different examples and from different angles. By the end, you have well and truly got the point.

Still, it is a point that needs to be got. In the United States' polarised climate, this book will remind culture warriors that the nation's robust constitutional secularism was grounded, paradoxically, in its equally robust Christianity.

Nick Spencer is Research Director at Theos.

Reading the Bible with the Founding Fathers

Daniel Dreisbach

OUP £19.99

(978-0-19-998793-1)

Church Times Bookshop £18


With the Mooch gone, rationalism finally has a chance – The Globe and Mail

The comedy industry should certainly be giving thanks to Anthony (The Mooch) Scaramucci, and so, too, should the Republican Party. So, too, should Donald Trump. The Mooch might turn out to be the best thing that never happened to him.

"A great day at the White House," Mr. Trump tweeted following the latest upheavals, including the Mooch's rapid execution. He could be right. It could turn out, in terms of management style, to be a turning point for the White House.

At the House of Trump, chaos had reached critical mass, Mr. Scaramucci's plutonium-enriched persona being the prime cause. His smut-laden rampage in a New Yorker interview, besting even the President's normal ribaldry, was wrenching enough to finally force real change: the appointment of retired Marine Corps general John Kelly as chief of staff.

In the scattershot, morally bankrupt Trump world, there will now be, for the first time, a chain of command. Things should get better not only because the bar is so low they can't get much worse, but because adolescence has been derailed.

As a military man, Mr. Kelly will bring discipline. Frogmarching the Mooch out the door was the perfect opening gambit, establishing his authority. Reince Priebus, the former White House chief of staff, was a welterweight. Competing power centres blossomed all around him, chewed him up, spit him out. The grenade-hurling Mr. Scaramucci's first act was to blow up Mr. Priebus before thankfully detonating charges under himself.

Not insignificantly, he also humiliated Stephen Bannon, the alt-right impresario whose clout has been shrinking steadily. Mr. Kelly is no fan of Mr. Bannon and his crew of white nationalist America-firsters, which is another reason why things should get better. Conventional thinkers now hold more sway. Rationalism has a chance.

There's another reason why this past week should be seen as a critical juncture. It was the week that congressional Republicans finally got the message through to Mr. Trump that they are not going to take it any more. They forced him to back down on his intention to fire Attorney-General Jeff Sessions for the senseless reason of his doing the right thing in recusing himself from the Russian-meddling investigation. They put him on notice that any intent to torpedo special counsel Robert Mueller's inquiry into Russian collusion would be suicidal. As well, three Republicans came forward to defeat his bid to repeal and/or replace Obamacare.

Mr. Trump can't go on the way he has been. He is the oddest of leaders in that while others seek to avoid controversy, he seeks to create it. He revels in the havoc and the storm. Mr. Scaramucci was viewed, given his brashness, his vulgarity, his ego on stilts, as a mini-Trump. Had his appointment as director of communications taken hold, it would have buttressed and augmented all of the President's seething quixotic tendencies.

It's no sure bet that Mr. Kelly will be able to rein them in. In his work as head of Homeland Security, he dismayed some by how readily he sided with Mr. Trump's attitudes on immigration. He curried too much favour, they say. Small wonder the President likes him so much.

But Mr. Kelly, widely experienced in Washington, has a mandate to bring order, which is what military men do best. Two other generals, national security adviser H.R. McMaster and Defence Secretary James Mattis, both no-nonsense individuals, will likely see their clout enhanced.

For all his madcap proclivities, Mr. Trump is sometimes capable of listening to reason. He didn't rip up the Iran nuclear deal or the North American free-trade agreement, lift sanctions against Russia, or move the Israeli embassy to Jerusalem. As for the Mexican wall, Mr. Kelly has been pushing him to back off. He may get his wish.

On all these issues, rationalists have made headway. They were able to do so in getting Mr. Trump to fire the Mooch as well. That decision, which required seeing the scars in someone with a persona and modus operandi similar to his own, may be a sign that his presidency is not a hopeless cause.


More and more Puneites come forward to donate body for research … – The Indian Express

Written by Anuradha Mascarenhas | Pune | Published: August 3, 2017 6:34 am

The family of Taher Poonawala, a rationalist who died on Monday night, donated his body for educational and research purposes at the Sane Guruji Hospital, Hadapsar. This is not an isolated case: according to anatomy experts, body donation is increasingly common across the city. "Taher Poonawala had already pledged to donate his body for medical research several years ago," said Dr Girish Kulkarni, associate professor of the Department of Anatomy at the Sane Guruji Hospital.

The hospital has the capacity to preserve as many as 30 bodies and, every month, receives at least four forms pledging body donations. "This is a trend that has picked up over the years. Noted socialist leader G P Pradhan had donated his body and several prominent people had pledged to donate their bodies," said Kulkarni.

At the B J Medical College and Sassoon General Hospital, Professor B H Baheti, head of the Department of Anatomy, said that nearly 40-45 bodies are donated every year. "The trend has picked up. In fact, the hospital has a capacity to preserve 30-35 bodies and we receive a lot of applications pledging body donations," said Baheti.

The Armed Forces Medical College has also seen an increase in body donations, said official spokesperson Colonel Abhijit Rudra. "This year, till July, we have had 10 body donations and many have also filled forms pledging body donations. Overall, the awareness levels have increased and people are encouraged after they see a sympathetic interaction between the staff and relatives of those who donate their bodies," said Colonel Rudra.

Other activists remember Taherbhai: A grateful salaam for him

For several activists in the city, the death of eminent rationalist and progressive thinker Taher Poonawala was a huge loss. "Taherbhai was a friend, philosopher and guide for my father, Dr Narendra Dabholkar. A strong supporter of the Maharashtra Andhashradda Nirmulan Samiti, he was the one who actively supported the need for progressive thinking," said Hamid Dabholkar, son of the slain activist. Poonawala, who was 95, died on Sunday night. He is survived by his wife and a daughter.

Anwar Rajan, who was a member of the People's Union for Civil Liberties along with Poonawala, recalled how Poonawala had been excommunicated due to his revolutionary views.

"Phir bhi kisike saath dushmani nahi thi (Still, he did not have any enemies). We thought his shop would shut down due to immense pressure, but Taherbhai is an example of how the excommunication turned out to be a good opportunity to spread his progressive thinking," said Rajan.

Social activist Razia Patel said Poonawala had strongly opposed orthodoxy in the Bohra community. "It is difficult to stand up against religious authorities, but he did it. In his personal life, he staunchly followed principles of secularism and rationalism. How can we ever forget him? A grateful salaam for our Taherbhai," said Patel.

Ajit Abhyankar, a member of the CPI-M's state secretariat, remembered Taherbhai's kind heart and great sense of humour. He was a committed rationalist and was associated with several social organisations like the Mahatma Phule Samata Pratishthan, Rashtriya Ekatmata Samiti, Samaji Krutdnyata Nidhi and People's Union for Civil Liberties.


We can’t rehabilitate our way out of Baltimore’s crime problems – Baltimore Sun

The Readers Respond comments regarding crime and punishment in Baltimore ("Yet another reminder of why I left Baltimore," Aug. 1) should prompt Baltimore's civic leaders to reconsider how best to address our horrific homicide rate and increasing criminal activity. Their perspective on the causality of crime, and the corresponding more lenient sentencing trends, seems rooted primarily in the belief that the best approach to mitigating crime is a rehabilitative one. While rehabilitation and the resolution of some of our systemic poverty issues are certainly needed, our city leaders must not forget that other mitigation models must continue to be used to prevent further rampant crime and homicide in the city.

In 2010, David Muhlhausen, Research Fellow in Empirical Policy Analysis for The Heritage Foundation, testified before Congress on the foundation's analysis of theories of punishment and mandatory minimum sentences. In his testimony, Mr. Muhlhausen cited the generally accepted methods of reducing criminal activity: deterrence, incapacitation and rehabilitation.

Deterrence postulates that increasing the risk of apprehension and punishment deters members of society as a whole from committing crime. In layman's terms, deterrence ensures that the administration of punishment is certain and swift, and imposes a severity commensurate with the crime, sending a message that crime will not be tolerated. According to the deterrence model, criminals are no different from law-abiding people: they rationally maximize their own self-interest subject to the constraints they face in the marketplace and elsewhere. Increasing the certainty, swiftness and severity of punishment will therefore result in the utilitarian goal of reduced crime.

Incapacitation does not require any assumptions about the criminal's rationalism or the root causes of the criminal's behavior. Incarceration is beneficial because physical restraint prevents the commission of further crimes against society for the duration of the sentence.

Rehabilitation assumes that society is the root cause of criminality. Under this model, crime is predominantly a product of social factors: criminal behavior is determined by societal forces, such as poverty, racial discrimination and lack of employment opportunities, so the object of criminal justice is to mitigate or eliminate those harmful forces. If structural defects in society cause crime, then criminals deserve rehabilitation, not punishment. Supporters of the rehabilitation model hold that correctional treatment programs can successfully reduce crime.

The testimony found that while rehabilitation is an important societal goal, it cannot come at the expense of deterrence and incapacitation. The root causes (poverty, racial discrimination and lack of employment opportunities) are systemic issues, and the best approaches to mitigating them remain under debate. In the meantime, criminals will continue to commit crimes, to the detriment of society, including those living in the root-cause environments cited above. Rehabilitation is a much-needed and important component of mitigating our crime problem, but it cannot be used in isolation. The immediacy of criminal activity and the safety of our citizens demand the recognized use of deterrence (swift and sure punishment) and, when warranted, incarceration as well. Society cannot rely solely on altruistic thinking while criminals continue to threaten our safety and well-being. This type of broad, holistic approach will better serve the needs of our city.

Jerry Cothran, Baltimore


Thinking their way through new superstitions – Times of India

Bengaluru: "Challenge accepted." AS Nataraj has been waiting to hear these words for the past 16 years, ever since framing a seemingly simple challenge of 10 questions. To make it easier, he insists on only eight correct answers for the challenger to be eligible for the Rs 1 crore reward. The catch? The answers would involve the challenger accurately predicting an individual's future using a janam kundali, or astrological chart. Now you didn't see that coming, did you?

"The reward was Rs 10 lakh when I first issued the challenge in 2001. I increased it to Rs 1 crore because no one came forward despite initial promises. I am now sure that even if I raise the prize to Rs 100 crore, nobody will volunteer," says Nataraj, the 77-year-old founder of Akhila Karnataka Vicharavadi Sangha. His aim is to debunk astrology's main claim to fame - the power to pinpoint the future. "I know it is not true because I was also an astrologer," laughs Nataraj, author of Jyothishyakke Savaalu (Challenge to Astrology) and a veteran TV talking head on the matter.

The other challenge doing the rounds is aimed at busting a scientifically untested brain-training programme. Narendra Nayak, the rationalist crusader from Mangaluru, has been holding demonstrations and challenging proponents of mid-brain activation for the last two years. The groups behind this fad take money from parents to enhance the brainpower of their children through the 'activation'. Those trained can apparently see even while blindfolded. "People fall for new tricks all the time. Mid-brain activation involves teaching children to lie (about peeking from behind the blindfold). The organisers use pseudo-science jargon and it becomes difficult for lay persons to understand," says Nayak, president of the Federation of Indian Rationalist Associations (FIRA).

LOGIC WINS

For every new trickster in town, there are a few rationalists like Nayak who demand that fantastic claims be backed by evidence, scientific reasoning and stone-cold rationale. If not, people like him resort to dramatic one-upmanship and myth-busting on public platforms to uphold what they see as truth and rationality.

"Earlier, we used to go after petty godmen who produced ash from thin air or put their hands in boiling water. Now, the picture has changed," says Nayak, a 67-year-old trained bio-chemist. The new age miracles involve coming up with sales pitches to sell anything from yoga, millets, salt room therapy and apple cider vinegar as cures for various ills, including cancer, he says. The marketers rely on scientific terms or the ancient Indian label to bamboozle people.

The pseudo-science gets Nayak, a trained scientist, going. Recently, he wrote a detailed complaint to the Advertising Standards Council of India about tall claims made by a coconut oil manufacturer in an ad. The regulatory body found that many of the claims, such as the oil being a 'natural antiseptic' that 'restores thyroid function and reduces obesity', were not substantiated and hence misleading. It asked the advertiser to withdraw the ad or modify it.

ATHEISTIC START

For most such activists, rationalism starts with a healthy dose of atheism. Nayak says he became an atheist at the age of 11, after coming to a conclusion ("maybe hasty") that there was no god, his prayers notwithstanding. A national science talent scholarship cemented his rationalist leanings and later, after a meeting with the legendary rationalist Abraham Kovoor, he joined the movement.

It isn't easy to break down strong beliefs. Nataraj, who became a rationalist after practising astrology for several years, says he can hold his own in heated TV debates because he has studied several works about astrology. "There are times when TV astrologers have asked me in private why I oppose astrology as I know so much about it. I tell them we have to have proof," says Nataraj.

UPHILL BATTLE

Public confrontations have a tendency to deteriorate quickly. Sanal Edamaruku, a Delhi-based rationalist, had to relocate to Finland to avoid arrest in a blasphemy case filed by a Mumbai church. Edamaruku, who exposed 'Pilot' Baba and other assorted godmen across India, says that in the Mumbai case he was held up at a TV studio for hours after a violent mob thronged outside, opposing him for saying that the 'miracle tears' of a statue came from a leaky drainpipe. "I am not a hatemonger but I gave my opinion after observation (he was invited to see the statue). Listeners can choose to disbelieve. But the situation turned violent and I escaped through the studio's back gate after three-four hours," says Edamaruku, who is bringing out a memoir detailing 25 of his most memorable investigations so far.
