Java Makes Programmers Want To Do Absolutely Anything Else With Their Time

This question originally appeared on Quora: Why do many software engineers not like Java?

Answer by Michael O. Church, functional programmer and machine learning engineer, on Quora

First, let's cover the technical issues. It's verbose, it combines the worst of both worlds of static and dynamic typing by having a type system that is hobbled yet extremely clunky, and it mandates running on a VM with a macroscopic startup time (not an issue for long-running servers, but painful for command-line applications). While it performs pretty well nowadays, it still isn't competitive with C or C++, and, with a little love, Haskell and OCaml can or will eclipse it in that domain. For real-world production servers, it tends to require a fair amount of JVM tuning.
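As a rough illustration of the verbosity complaint (this example is mine, not the answer's), here is what a trivial immutable two-field value type looked like in the Java of that era: constructor, getters, equals, hashCode and toString, all written by hand, where an ML-family language gets by with a one-line record type.

```java
// A hedged sketch of routine Java ceremony: a value class carrying two ints.
public final class Point {
    private final int x;
    private final int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    public int getX() { return x; }
    public int getY() { return y; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    @Override
    public int hashCode() {
        return 31 * x + y;   // hand-rolled, like everything else here
    }

    @Override
    public String toString() {
        return "Point(" + x + ", " + y + ")";
    }
}
```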

The VM itself has a lot to recommend it. It offers concurrency and garbage collection at a level of quality that, until recently, wasn't found anywhere else. Python may have a better user experience, but it also has a GIL, which rules out thread-level parallelism within a single process. Much important software in the early 2000s was written in Java because, at the time, it was the best choice, even taking the mediocrity of the language itself into account. It had Unicode (albeit UTF-16) from the start and a strong concurrency story, and it was a notch above C++ in terms of user experience (because, really, who wants to debug template errors deep in someone else's legacy code?).
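To make the concurrency contrast concrete, here is a minimal sketch (mine, not from the answer) of ordinary java.util.concurrent usage: CPU-bound work fanned out across a fixed thread pool and running genuinely in parallel inside one process, which is what CPython's GIL disallows for threads.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

// Sum a large range across all cores using a plain fixed thread pool.
public class ParallelSum {
    public static void main(String[] args) throws Exception {
        int workers = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        List<Future<Long>> results = new ArrayList<>();

        for (int w = 0; w < workers; w++) {
            final int start = w;
            results.add(pool.submit(() -> {
                long sum = 0;
                // CPU-bound loop; each worker takes every workers-th number.
                for (long i = start; i < 100_000_000L; i += workers) {
                    sum += i;
                }
                return sum;
            }));
        }

        long total = 0;
        for (Future<Long> f : results) {
            total += f.get();   // blocks until each worker finishes
        }
        pool.shutdown();
        System.out.println("total = " + total);
    }
}
```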

If you put Java on a technical trial, it doesn't do so badly. The language sucks, but the platform is pretty good for most purposes. I do hate the dominant interpretation of object-oriented programming with a passion, because it objectively sucks. See: Michael O. Church's answer to Was object-oriented programming a failure?

So let's talk about the political and cultural issues. First, the dominant Java culture is one of mediocrity and bad taste, with MetaModelVibratorVisitorFactory classes dominating. I've heard a number of experts on the Java issue argue that Java's biggest problem is the community, and that comes directly from the fact that good programmers don't want to deal with the bastardization of OOP that has entrenched itself in mainstream corporate development. You have a lot of people who trained up as Java programmers, have never seen a command line, and have no clue how the computer actually works. Most of them have never actually written a program; they just write classes, and some Senior Chief Architect (who makes $246,001 per year and hasn't written a line of code since the 1990s) figures out how to stitch them together, then tells some other clueless junior how to implement the glue, in the gutshot hope that one of them will actually have the talent to make a working program out of the mess.
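For readers who haven't suffered this style, here is a deliberately exaggerated sketch of the layering being mocked (the class names are invented for illustration, not drawn from any real codebase): an interface, an implementation, a factory and a service wrapper, all in service of concatenating two strings. The equivalent is a one-line function; the ceremony, not the problem, generates most of the code.

```java
// A caricature of "class soup": four types where one function would do.
interface GreetingStrategy {
    String greet(String name);
}

class PoliteGreetingStrategy implements GreetingStrategy {
    @Override
    public String greet(String name) {
        return "Good day, " + name + ".";
    }
}

class GreetingStrategyFactory {
    public GreetingStrategy createGreetingStrategy() {
        return new PoliteGreetingStrategy();
    }
}

public class GreetingService {
    private final GreetingStrategy strategy;

    public GreetingService(GreetingStrategyFactory factory) {
        this.strategy = factory.createGreetingStrategy();
    }

    public String greetUser(String name) {
        return strategy.greet(name);   // all this machinery to join two strings
    }

    public static void main(String[] args) {
        System.out.println(new GreetingService(new GreetingStrategyFactory()).greetUser("world"));
    }
}
```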

This isn't inherent to the JVM, because Clojure (currently hosted on the JVM, although its endgame seems to be language-agnosticism) has a radically different (and better) community. Scala's community is more mixed, but the top Scala engineers (the ones making tools like Spark and Kestrel) are really freaking good.

The root problem, lying under all of this, is that God clearly intended for the programmer-to-program ratio to be one-to-many. It's much more productive and engaging to work that way. Programs should be small, and when you need a lot of code to solve a large problem, you should create a system and give it the respect that systems deserve. The vision that seems ensconced in the modern Java community is one of Big Programs, where the programmer-to-program ratio is many-to-one. I've written at length about why this leads inexorably to political behavior and low productivity: Java Shop Politics.


Climate deniers and other pimped-out professional skeptics: The paranoid legacy of Nietzsche's "problem of science"

Looking back years later at his first major work, The Birth of Tragedy, the philosopher Friedrich Nietzsche gave himself credit for being the first modern thinker to tackle "the problem of science itself," for presenting science for the first time as "problematic and questionable." Dude! If the perverse German genius could only have known how far the problem of science would extend in our age, or to what ends his critique of Socratic reason would be twisted. He might be delighted or horrified in equal measure (one thing you can say for Nietzsche is that his attitudes are never predictable) to see how much we now live in a world he made, or at least made possible.

It may seem like a ridiculous leap to connect a scholarly work about ancient Greek culture published in 1872 with the contemporary rise of climate denialism and other forms of pimped-out skepticism, in which every aspect of science is treated by the media and the public as a matter of ideological debate and subjective interpretation. I'm not suggesting that the leading climate skeptics, corporate shills and other professional mind-clouders seen in Robert Kenner's new documentary "Merchants of Doubt" have read Nietzsche and based their P.R. playbook on what he would have termed an appeal to the Dionysian impulse, the primitive, violent and ecstatic forces that lie below the surface of civilization. (You can see two prime specimens at the top of the page: James Taylor of the libertarian-oriented Heartland Institute and longtime oil lobbyist William O'Keefe, who now heads the George C. Marshall Institute, a climate-obsessed right-wing think tank.) They didn't have to. That impulse is baked into human culture at this point, and it can be exploited without entirely being recognized or understood.

I'm not discounting the most obvious elements of the 21st-century assault on science, which are amply addressed in Kenner's film and other recent works on the subject. There is certainly a heated cultural and political conflict over the issue of climate change, but there is no scientific debate, no matter how many times Fox News hosts repeat that phrase. Enormous financial interests are at stake, as oil companies and other big stakeholders in the fossil-fuel economy seek to fend off or delay a major social restructuring that could destroy their business. Ideological hangover from the Cold War and the 1960s, especially among a certain paranoid strain of the conservative movement, has turned the climate issue into a symbolic confrontation between American freedom and the sinister global forces of academia and environmentalism, often understood as the new faces of Communism. As former Republican congressman Bob Inglis, a staunch conservative and former climate skeptic who was defeated by a Tea Party rebel in 2010, puts it, issues of tribal loyalty are at work here that trump rational questions about the validity of scientific evidence.

Inglis is the most interesting individual interviewee in "Merchants of Doubt," partly because he stands apart from the competing ideological choruses on this issue and has taken on the thankless task of proselytizing his fellow Christian conservatives, one terrifying Deep South radio show at a time. His remarks about tribalism also nudge us toward the Nietzschean subtext of the climate fight, by which I mean not just the question of what political or corporate agendas are being served (since that's pretty obvious) but why the right-wing counterattack against a previously uncontroversial scientific consensus has been so effective with the general public.

In other words, we need to ask new versions of the questions Nietzsche himself asked: What does all science in general mean, considered as a symptom of life? What is the point of all that science and, even more serious, where did it come from? Beneath the political, economic and tribal conflict over climate science lies a profound sense that what Nietzsche described as the Apollonian forces of social order, in this case the book-learning of the professoriate and the rules and regulations of government, cannot contain or comprehend the chaotic and mysterious nature of reality. There is considerable truth in that, which was Nietzsche's great insight; how much truth and what kind of truth, and how these competing forces can best be managed, are precisely the important questions.

For the sanctimonious forces of liberalism, committed to a one-way human narrative from darkness into enlightenment, it is always tempting to blame such retrograde impulses on a uniquely American combination of ignorance, isolation and religiosity. Those factors have played their part in our nation's history, but self-righteous rube-shaming is unlikely to lead to political victory, and does not address what appears to be a deep-seated species preference for passion over reason, sensuality over intellect, Dionysian excess over Apollonian discipline. To say that such a phenomenon exists and must be confronted is not to endorse it uncritically, a confusion that has often led to misreadings of Nietzsche. If those of us who would like to save the planet ignore or deny the dark allure of the Dionysian impulse, we have already conceded the high ground on the battlefield of human imagination, and are likely to lose everything.

"Merchants of Doubt" is primarily based on the influential 2010 book of the same name by science historians Naomi Oreskes and Erik M. Conway, which traces the strategy and tactics of climate denial back to the tobacco industry's 50-year propaganda war against clear-cut medical evidence and increased government regulation. "Our product is doubt," as one infamous internal memo, found amid the reams of tobacco-industry documents pried free from the corporate vaults, put it. Advised by consultants at the P.R. firm Hill & Knowlton never to directly deny the mounting evidence that cigarettes were addictive and deadly, tobacco execs and their hired scientific hands insisted for decades that they simply weren't sure. Maybe and maybe not! We need more research and more evidence! We don't personally believe these things are harmful just because smokers are many times more likely to die of lung cancer, but who really knows?

In a devastating montage near the end of Kenner's film, we see how leading Republican politicians, who appeared to accept the scientific consensus on climate change until a few years ago, have come to echo this rhetoric almost word for word. John McCain, Mitt Romney, John Boehner and even George W. Bush all used to agree that climate change was real and in large part caused by human activity; Newt Gingrich and Nancy Pelosi once did a public-service announcement together urging bipartisan action on the issue. Those were the days, my friends. After the Tea Party uprising of 2010 and climate counterattacks by the Koch brothers' Americans for Prosperity, the oil industry-funded blogger and pundit Marc Morano and numerous others, that all changed. Boehner, Gingrich, Romney and every other Republican candidate or official in the country was forced to flip to the "Heck, I'm no scientist" school of mandatory agnosticism. (We should spare half a kind thought for McCain, who even in his diminished and compromised post-Sarah Palin condition retains a few shreds of integrity.)

Building on the work of numerous other scholars, notably the Australian economist and ethicist Clive Hamilton, whose book Requiem for a Species goes somewhat deeper into the same issues, Oreskes and Conway identify a tiny group of renegade right-wing scientists who have established themselves as professional contrarians and saboteurs, seeking to muddy the waters on a whole range of issues from tobacco to acid rain to pesticides and carbon emissions. This cabal has been led by the physicists Bill Nierenberg, Fred Seitz and Fred Singer, who were leading figures in Cold War weapons design but possess no academic expertise in any discipline relating to climate science. Their importance to the climate-denial movement lies in their possession of legitimate Ph.D.s, their ability to comb through scientific studies and cherry-pick confusing or contradictory data points, and most of all their eagerness to defend free-market capitalism against all efforts to restrain it or redirect it.

This handful of devoted obfuscators, buttressed by an army of industry-funded experts from recently invented right-wing think tanks (Morano, O'Keefe, Taylor and pretty much all the other dudes who show up on TV in that role possess no actual background in science), has ingeniously capitalized on the mainstream media's fetish for balance and succeeded in sowing widespread confusion. Since Barack Obama took office in 2009 (which coincided, not by accident, with the launch of a major climate-skeptic counterattack), opinion polling has consistently reported that at least 40 percent of Americans believe that the seriousness of global warming is exaggerated. That level had never been reached in 12 years of previous surveys. It's bizarre and distressing that such transparently bogus tactics worked so well, but it could only have happened if the seeds fell on fertile ground. For a whole range of reasons, reflecting both America's chronic political divisions and the deeper cultural forces at work beneath them, many people ached to believe that the scientific bad news simply wasn't true.


Why cars are the next frontier for the Internet of Things

Vehicles aren't just becoming roving Wi-Fi hot spots; they're becoming connected devices that eventually will be part of the Internet of Things, the growing trend of objects that interact with each other over the Internet.

Having a SIM card embedded in a car isn't new. One of the longest-standing examples is General Motors' OnStar service, which has operated through a partnership with Verizon, one of the largest wireless providers in the United States. The GSM Association (GSMA) forecasts a sevenfold increase in new vehicles equipped with mobile connectivity by 2018, and expects adoption to grow substantially beyond that.

Trying to lead the way is GM with its 4G LTE in-car Internet service, designed to give the car its own data plan. The service doesn't cost extra at the dealership, nor does it require an OnStar subscription. A majority of vehicles spread across the 2015 Chevrolet, Cadillac, Buick and GMC lines are equipped with it.

"It's integrated into the vehicle, and we have the antenna to achieve optimal signal strength and coverage on the roof instead of in your pocket," says Fred Dixon, technology manager at GM Canada. "You can connect up to seven devices to it and it's seamless. You enter the car, it automatically connects and you can use that data, never having to turn on your phone's hot spot."

Customers get a three-month trial with a hard cap of three gigabytes of data to play with. AT&T is the wireless carrier partner, and through agreements with most of its Canadian counterparts, Dixon confirms there are no roaming charges incurred in Canada and the United States. The car will automatically connect to whichever network is strongest in any particular locale within Canada, although users wouldn't notice the shift.

This all-in collaboration means consumers can't add the car to a shared monthly data plan they use with a smartphone and tablet, for example. The car's data is a separate cost paid to GM, and plans start at $10 a month for 200 megabytes all the way up to $250 for 10 gigabytes over 12 months. Rates are slightly cheaper for OnStar subscribers.

Mansell Nelson, senior vice-president of products and solutions for Rogers' enterprise business, believes closer partnerships between auto makers and carriers are inevitable. Bell has long been the carrier partner for OnStar in Canada, and despite the carrier agnosticism of the 4G LTE service, carriers will want to compete to do business with the original equipment manufacturers.

"The car is becoming a computer and just one big API (application programming interface) that will need constant changing and updating," says Nelson. There are about 1.3 million new cars per year in Canada, and if more of them are connected, network capacity will have to continue to grow to accommodate them as connected devices.

He cites the example of Tesla, which has pushed updates hundreds of megabytes or even a gigabyte in size. And just like with Tesla, system updates pushed to GM's vehicles won't count against the customer's data bucket.

"This is the future," says Nelson. "People weren't used to the abrupt torque acceleration in Tesla's cars, so they complained and the company rolled out a firmware update over-the-air [using AT&T's network] that they could choose to install to reprogram their Tesla to accelerate more like a gasoline car."


Agnosticism | Inters.org

I. Agnosticism as a Philosophical Position

1. Definition. The term "agnosticism," like its counterparts in other modern languages (Fr. agnosticisme, It. agnosticismo, Germ. Agnostizismus), has its etymological roots in the Greek word agnostos, that is, "unknowable." Although agnosticism as a philosophical school of thought has a long history and has been described from time to time with diverse connotations, it was the English naturalist Thomas H. Huxley (1825-1895) who coined the term "agnosticism" as an antithesis to the "gnostic" of Church history. Huxley saw the gnostic as someone who claims to know much about things which another does not (cf. Collected Essays, V, London, 1898, pp. 237-245). Huxley coined the term in the context of a meeting of the Metaphysical Society of London in 1869 and later reiterated it in his essay "Agnosticism" in 1889. It is important to point out the antithesis posed by Huxley between a religious gnosis, which claims to know the unknowable, and the agnosticism of the scientist, which refuses to determine a priori the solution to the problems that form the object of his or her research. In fact, it is within this refusal that the meaning of modern agnosticism resides, inasmuch as it does not wish to be, in the majority of cases, a hostile refutation of metaphysical or religious topics, as in the case of atheism, but rather a suspension of judgment in regard to the question of God and of the Absolute. The question of God and of the Absolute is neither denied nor affirmed by agnosticism, in order to allow scientific research to be uninhibited. Whereas atheism holds that God does not exist, agnosticism limits itself to affirming that we do not possess, above all from a scientific and cognitive point of view, adequate rational instruments to affirm or negate the reality of God or of the Absolute. In a letter of 1879, C. Darwin declared himself an agnostic in the same sense as coined by Huxley. Similarly, H. Spencer, maintaining in his work First Principles (1862) the impossibility of scientifically demonstrating the mysterious force that sustains natural phenomena, was classified as an agnostic. The physiologist Emil du Bois-Reymond, in his work The Seven Enigmas of the World (1880), held that, faced with the great enigmas of the world and of existence, it is most responsible for man, and above all for the scientist, to pronounce an ignorabimus ("we will not know"), since those enigmas go beyond the realm of scientific knowledge. One may conjecture that modern agnosticism, which is not to be confused with the agnostic tendencies that have existed since the origins of the history of philosophy, predominantly has a scientific background and is motivated in particular by the framing that Kantian criticism gave to the metaphysical question.

2. The Critique of the Principle of Causality. In fact, the most rigorous modern formulation of metaphysical agnosticism was given by Immanuel Kant (1724-1804). Kant's metaphysical agnosticism has decisively influenced both the philosophical and scientific agnosticism and the religious agnosticism of the 19th and 20th centuries. In The Critique of Pure Reason (1781), especially in the third part (Transcendental Dialectic), and in The Critique of Practical Reason (1788), Kant clearly shows how the presuppositions of metaphysical agnosticism derive, on the one hand, from the empiricism of David Hume (1711-1776), particularly from his critique of the metaphysical concept of causality, and on the other hand from the idea of ratio separata proper to modern rationalism. The empiricism of Hume did indeed affirm as absolute the principle of experience, already formulated by John Locke (1632-1704) in An Essay Concerning Human Understanding (1688) and later elaborated by George Berkeley (1685-1753) in A Treatise Concerning the Principles of Human Knowledge (1710) with the famous statement esse est percipi (to be is to be perceived). Basing himself upon the principle of experience, in A Treatise of Human Nature (1740) and later in his Enquiry Concerning Human Understanding (1748), Hume denies that abstract ideas have truth-value corresponding to experience, including even the idea of matter. It follows then that both the idea of cause and the consequent metaphysical principle of causality, according to which ontological causes are the foundation of physical causes, must be rejected as deceptive because they are contrary to the principle of experience. The distinction between ideas and impressions leads Hume to maintain that only those ideas which make reference to immediate impressions have truth-value. Now, since the idea of cause makes reference only to an impression of sequences of events, it signifies only the order of this succession, and not the inference of a causal principle other than experience. The idea of cause, Hume concludes, is therefore only something that one feels, or rather a belief, which arises in one's consciousness because one observes in experience sequences that tend to repeat. These repetitions mistakenly lead one to believe in the possibility of locating the cause in one of the elements of the sequence, and the effect in the other (cf. A Treatise of Human Nature, Book I, part III, 14-15; cf. also part II, 6 and part IV, 2).

The demolition of the idea of cause, based upon the radicalization of the principle of experience formulated by Hume, inevitably led to the elimination of the very foundation of metaphysics. Starting from the second period of Plato's works (cf. Phaedo, 79a, 98c-e, 99e, 100c-d) and later with the Metaphysics of Aristotle (cf. Books I and II), metaphysics had made precisely the principle of causality the cornerstone of ontology, setting out from there toward a knowledge that would no longer limit itself to observing effects, but rather would be capable of rising to the fundamental causes of being.

1. Kant and Metaphysical Agnosticism. From Hume's critique of the idea of cause, Immanuel Kant was able, in effect, to draw out all the essential gnoseological consequences in order to formulate his critical evaluation of metaphysical knowledge. Already Sextus Empiricus (180-220), in Outlines of Pyrrhonism, had criticized the principle of causality, just as some of the representatives of nominalism would do much later in the Middle Ages, in particular Nicholas d'Autrecourt (1300-1350), Pierre d'Ailly (1350-1420), and William of Ockham (1280-1349). Yet, as already observed, in Kantian metaphysical agnosticism this critique is joined to the acceptance of the primacy of experience proper to empiricism, as well as to the recognition of the value of the autonomous activity of the intellect proper to modern rationalism.

For the philosopher of Königsberg, all knowledge that is to have truth-value must be modeled upon the type of knowledge that makes science possible. In other words, only knowledge that results from the synthesis between matter, constituted by phenomena as the proper object of empirical observation, and the action of a priori forms, through which those phenomena are grasped by a specific category of our intellect, has truth-value. For Kant, then, the task is the examination of the nature of synthetic a priori judgments, in which he locates the foundation not only of scientific knowledge, but of all knowledge valuable for humanity. All knowledge that claims the character of science must therefore be the result of a synthesis between matter, offered by the vastness of phenomenal experience, and an a priori form, given by the intellect. Inasmuch as the "I think" is the fount and root of every a priori category of the intellect, it constitutes the transcendental condition of all knowledge, and such knowledge must be understood as the transcendental constitution of experience. As a result, philosophical knowledge is modeled after scientific knowledge, which in turn becomes the paradigm of all sensible knowledge. Post-Kantian philosophy will often understand itself solely as the methodology of science, or epistemology, i.e., as a reflection on the scientific status of the theories of science. Thus philosophy progressively loses its nature as knowledge in order to become a reflection on the modalities of knowledge. It is clear then that metaphysics, which claims to go beyond the appearance of experience (phenomenon) to grasp the essence of things in themselves (noumenon), which are not subject to experience, becomes, in a Kantian scheme, a knowledge that has no object, and therefore cannot claim to be a well-founded knowledge. In Kant's own image, metaphysics, venturing outside the realm of experience, is like a dove that seeks to fly without air beneath its wings. For this reason, when metaphysics asks questions about the existence of God, of the soul, of the world, of freedom (all realities that escape a phenomenal type of experience), it falls into insurmountable antinomies (cf. Kant, The Critique of Pure Reason, I, 2, ch. 2: The Antinomy of Pure Reason). Metaphysical agnosticism, therefore, consists not in the a priori denial of such realities, but in the thesis that no metaphysical knowledge of them can be attained, because they lie outside the domain of phenomenal experience.

2. Kant and Scientific Agnosticism. Numerous philosophies of the 19th and 20th centuries were inspired by the Kantian model of knowledge and drew out all the consequences implicit in the metaphysical agnosticism expressed in The Critique of Pure Reason. One can say that scientific agnosticism constitutes the flip side of metaphysical agnosticism, inasmuch as it presupposes it and radicalizes it by affirming the primacy of an agnostic scientific knowledge, indifferent in principle to the great themes of metaphysics, particularly those of religion. Such is the positivism of Auguste Comte (1798-1857), which considers "facts," i.e., that which can be described according to concrete experience, to be the only truth and, similarly to Kant, judges all inquiry into the metaphysical causes of the facts themselves to be without foundation (cf. Discourse on the Positive Spirit, 1844; Course of Positive Philosophy, 1830-42). By applying the principles of Comte's positivism to the study of primitive peoples, the French sociological school (E. Durkheim, M. Mauss, L. Lévy-Bruhl) would mount a strong critique of religion, affirming that the religious dimension manifested by a specific people is nothing other than the fruit of an imposition exerted by the dominant part of the group (cf. E. Durkheim, The Elementary Forms of Religious Life, 1912).

A particular type of scientific agnosticism was represented by Herbert Spencer (1820-1903). In his work The Factors of Organic Evolution (1887), Spencer maintains that all of nature and the entire cosmos are regulated by an evolutionistic principle which is not finalistic (see Finalism), in the sense that, starting from the study of natural phenomena, it would not be possible to infer the existence of God as creator and orderer of the cosmos. Nonetheless, such existence cannot be denied for this reason alone, inasmuch as Spencer himself holds that at the limits of human experience and of scientific knowledge there exists the Unknowable, which is precisely that which lies beyond the confines of experience and science (cf. System of Synthetic Philosophy, London 1858). The Unknowable is for Spencer that which metaphysics and religion have called God and which, even though it is not a part of the cognitive categories of science, nonetheless cannot be denied by them, as scientific atheism, on the other hand, would claim to do.

Contemporary epistemology, developing after the crisis of scientific positivism, which had attributed a paradigmatic value to scientific knowledge, subjected the latter to a searching critique on the part of authors such as Poincaré, Boutroux, Duhem, Mach, Bergson, Hilbert, Peano, and Frege. Numerous scientific discoveries, as well as the progress made in mathematics and logic and the corresponding paradigms of interpretation formulated in the 20th century, drove scientists and philosophers of science towards a conception of the laws of nature formulated by scientific theories that was no longer static and mechanistic, but dynamic and probabilistic, marked by unpredictability because it had been opened to the emergence of complexity. Such rethinking gave birth to diverse epistemological currents: logical neo-positivism (Schlick, Carnap, Ayer, Russell), according to which only experimental or factual propositions, that is, those whose content is empirically verifiable, have scientific value; the metaphysics of science (Meyerson, Eddington), according to which all science implies a metaphysics, and scientific knowledge itself must be understood as a progressive discovery of reality, able to find its ultimate foundation once again in a metaphysics; and scientific rationalism (Popper, Feyerabend), according to which science is nothing other than a rational construction of man, and the observed facts nothing other than elements dependent upon the scientific theories used to organize them, while the theories themselves are, in their turn, responses to preceding theoretical problems and, in the final analysis, systems of bold conjectures to which experiment adds nothing true. If a scientific theory is the elaboration of a theory capable of resolving unresolved problems, experimental verification then plays the role of a continuous check on the theory itself, with the caveat of Karl Popper (1902-1994) that one ought not to speak of verification in a positivistic sense, but rather of falsification, because every scientific theory is not definitive but provisional, subject to being falsified by a better theory.

Although contemporary epistemology has strongly contested the Kantian and positivistic conception of knowledge, it has not managed to strip scientific agnosticism of its implications. In effect, the Kantian anti-metaphysical prejudice has remained present in almost all forms of contemporary epistemology, in the sense that, although science itself evolves and the very evaluation of the objective value of scientific theories is transformed, science nonetheless continues to be considered the sole area of knowledge valuable for humanity. The questions that go beyond the domain of science (the problem of God in particular) can at most be accepted as questions that, as in Kant, have meaning for the existence of man, but not for his knowledge. Scientific agnosticism consists precisely in dismissing the idea that science, however one understands it, represents an area where metaphysical and religious questions can be formulated or at least recognized as significant, i.e., where they have the sense of a question and the value of knowledge.


Milbank: Scott Walker's insidious agnosticism

"I don't know."

Thus proclaimed Scott Walker, the Wisconsin governor and Republican presidential hopeful, when asked by The Post's Dan Balz and Robert Costa on Saturday whether President Obama is a Christian.

This is not a matter of conjecture. The correct answer is yes: Obama is Christian, and he frequently speaks about it in public. Balz and Costa presented Walker with this information to give him a second chance to answer.

But even when prompted with the facts, Walker, in Washington for the National Governors Association meeting, persisted, saying, "I've actually never talked about it or I haven't read about that," and, "I've never asked him that," and, "You've asked me to make statements about people that I haven't had a conversation with about that."

This is an intriguing standard. I've never had a conversation with Walker about whether he's a cannibal, a eunuch, a sleeper cell for the Islamic State, a sufferer of irritable bowel syndrome or a grand wizard of the Ku Klux Klan. By Walker's logic, it would be fair for me to let stand the possibility that he just might be any of those, simply because I have no personal and direct refutation from him.

Walker justifies his agnosticism on grounds that he is avoiding "gotcha" questions. He caused a furor when he used the same logic last week to avoid saying whether Obama loves his country, after Rudy Giuliani, at a dinner with Walker, volunteered his view that Obama does not. "To me, this is a classic example of why people hate Washington and, increasingly, they dislike the press," he told my colleagues Balz and Costa, two of the best in the business.

This is insidious, and goes beyond last week's questioning of Obama's patriotism, because it allows Walker to wink and nod at the far-right fringe where people really believe that Obama is a Muslim from Kenya who hates America. The governor is flirting with a significant segment of the Republican primary electorate: those who have peddled the notion (accepted by 17 percent of Americans at the end of Obama's first term) that Obama is a Muslim.

Beyond that, Walker's technique shuts down all debate, because there's no way to have a constructive argument once you've disqualified your opponent as unpatriotic, un-Christian and anti-American. On the Internet, Godwin's Law indicates that any reasonable discussion ceases when the Nazi accusations come out; Walker is essentially doing the same by refusing to grant his opponent legitimacy as an American and a Christian.

But if this is Walkers standard, it seems only fair that it should be applied to him, as well. Here is what one of those meet-the-candidate Q&As might look like if the answers were drawn from actual demurrals Walker has used in other contexts in recent weeks:

Why does Scott Walker hate America?


$290 Million Mobile Printer Market Hotbed for Solution Innovation, According to VDC Research

Natick, MA (PRWEB) February 13, 2015

Mobile printer revenues posted 6% growth in 2014 to $290 million and are expected to increase to $371 million by 2018, according to new global market analysis by VDC Research. The Americas will drive global revenue growth with an anticipated 7.3% CAGR through the forecast period, while the European and Asian markets will generate revenues at a slower rate. According to insights from VDC's recently conducted end-user survey, mobile printer deployments will rise sharply in Europe and North America in the next three years, especially among industrial supply chain participants. Mobile printers are primed to account for a 20% share of the overall printer installed base in these regions.

These printers are riding the mobility wave with several enterprises capitalizing on rising interest and existing growth opportunities. Workforce mobilization and related requirements will spur current and anticipated product development enhancements, requiring support for various mobile operating systems, and will also spur seamless integration with various transaction-enabling solutions and contactless technologies like NFC. Mobile printer adoption will not be concentrated in any one vertical market. That said, applications such as mobile point-of-sale (for receipts), ticket printing, and shelf-edge labeling will drive growth.

"The mobile printer market is faced with equal parts growth opportunities and challenges due to highly fragmented user environments," said VDC Senior Analyst Richa Gupta. "Increased investments in R&D, distribution channels, and solution partnering are critical to ensure that product and market development strategies align with evolving user requirements and preferences."

Competition from emerging markets has intensified. The past two to three years have seen an influx of suppliers, largely from Asia-Pacific, designing, manufacturing, and selling low-cost mobile printers. Market leaders such as Datamax-O'Neil, Intermec (now part of Honeywell), SATO, and Zebra Technologies face stiff price competition from these entrants, including Sewoo Tech and Woosim Systems, eroding margins and overall profitability. Vendors' go-to-market strategies will need to become more application-specific as they seek ways to compete effectively and differentiate themselves in this highly fragmented global marketplace. For this, partnerships with value-adding channels like systems integrators (SIs) and independent software vendors (ISVs) will be critical. Channel organizations with domain expertise in POS and direct store delivery-type applications are in especially high demand.

Mobile device connectivity agnosticism will also be critical to product success in the long run. Demand is rising for solutions that are not only Wi-Fi or Bluetooth-enabled but that can also be integrated with a range of mobile options for both on-premise and in-field use. As smart device options increase, so will the need for mobile printers to be both platform- as well as OS-independent. It is also important for vendors to develop and release the necessary drivers to support the breadth of mobile OS platforms.

About VDC Research: Founded in 1971, VDC Research provides in-depth insights to technology vendors, end users, and investors across the globe. As a market research and consulting firm, VDC's coverage of AutoID, enterprise mobility, industrial automation, and IoT and embedded technologies is among the most advanced in the industry, helping our clients make critical decisions with confidence. Offering syndicated reports and custom consultation, our methodologies consistently provide accurate forecasts and unmatched thought leadership for deeply technical markets. Located in Natick, Massachusetts, VDC prides itself on its close personal relationships with clients, delivering an attention to detail and a unique perspective that is second to none. For more information, go to http://www.vdcresearch.com.


Africa: The Wild Cards Offering Climate Hope

By Roger Williamson

In 2015, the world's governments are meant to sign up to a binding climate change agreement and a new set of development goals, to follow on from the Millennium Development Goals.

At the start of each year, the Economist predicts what to expect in the coming 12 months. This year's edition, The World in 2015, finds a few - actually very few - pages to discuss the prospects for the two agreements. [1]

Yet the most instructive parts are the spaces where The Economist owns up to some of its failed predictions from the year before.

Many different people seem to have been the first to warn us about the dangers of making predictions, especially about the future, because - as the joke runs - those are the ones that often go wrong.

One example is Nobel economics laureate Daniel Kahneman, who said: "Economists ... are quite good at explaining what has happened after it has happened, but rarely before." [2]

It is impossible to predict the outcome of climate change negotiations or calculate the odds of their success with mathematical certainty. The future is open and will surprise us. Wild cards often crop up in policymaking and new political constellations emerge.

But spotting those wild cards as they emerge can suggest the direction negotiations are heading in. I want to identify two such developments, and show how science figures within each.

How science's role is understood by big policy players makes a political difference. To say: "Yes, the Intergovernmental Panel on Climate Change is giving us yet another 'last chance' to save the planet", as news media often do, decreases the chances of international agreement by fuelling cynicism or, at best, agnosticism. That message suggests: sit on your hands and wait and see. But once the proof is in, that this was indeed the last chance, it will be too late.

There are better ways to use science than to frighten people into learned helplessness. And the two new sources of policy influence that I see illustrate this, while offering hope and momentum.


Sites in English

There is only one playwright on the planet who could change the way you think about human existence in 90 minutes and leave you wishing there was a bit more to his new play. And that playwright is the great Tom Stoppard.

The Hard Problem comes with an almost unfair weight of expectations: it's Stoppard's first play in nine years; it follows 2006's Rock 'n' Roll, which WAS an unqualified success, and there is, face facts, every chance that it'll be the 77-year-old's final stage work.

The Hard Problem follows Hilary (Olivia Vinall), a young psychology researcher who is attempting to unravel the riddle of whether there is such a thing as a truly good person, something she has personally attempted to be ever since a trauma in her teens. Much of the play is set at the Krohl Institute, a high-powered research centre where Hilary is hired and taken under the wing of the eccentric Leo (Jonathan Coy). He's a cranky professor who rejects better qualified candidates because he likes her spirited, some might say naive, determination to pick away at the questions science seems incapable of answering. Namely, the hard problem: if existence is only matter, what is consciousness?

And there's, er, not really much more to the plot than that: Hilary occasionally indulges in bickering sexposition with Damien Molony's hunky cynic Spike; there is a slightly hard-to-swallow (though partly justified) resolution to her trauma; and in the background the economy tanks. Vinall is a compelling actor, and Hilary's not a total drip, but you kind of wish there was more to her than earnest goodness and background sorrow, while the other characters barely scrape two dimensions between them. As a drama, I couldn't help but think it's overshadowed by Lucy Prebble's not dissimilar The Effect, which played the same theatre a couple of years back.

But no playwright does ideas like Stoppard, and the arguments he places in his characters' mouths, both for and against the possibility of something more to our existence, are lucid, digestible, immaculately researched and at moments almost dazzlingly audacious: in one scene he appears to make a case that the collapse of the stock markets is evidence for the possible existence of God. With typical Stoppardian mischief, The Hard Problem is probably the most eloquent case for agnosticism you'll ever see. And Nicholas Hytner's old-fashioned, light-touch production is the perfect vehicle.

Not champagne Stoppard, but still quintessentially Stoppard: if this is the last act of his stage career, then he goes out undimmed.

'The Hard Problem' will be broadcast live to cinemas on April 16
