

Can Libertarianism Be a Governing Philosophy?

The discussion we are about to have naturally divides itself into two aspects:

First: Could libertarianism, if implemented, sustain a state apparatus and not devolve into autocracy or anarchy? By that I mean the lawless versions of autocracy and anarchy, not stable monarchy or emergent rule of law without a state. Second: Even if the answer were Yes (or at least Yes, if . . .), we would still need to know whether enough citizens desired a libertarian order that it could feasibly be voluntarily chosen. That is, I am ruling out the involuntary imposition of libertarianism by force as a governing philosophy.

I will address both questions, but want to assert at the outset that the first is the more important and more fundamental one. If the answer to it is No, there is no point in moving on to the second question. If the answer is Yes, it may be possible to change people's minds about accepting a libertarian order.

The Destinationalists

As I have argued elsewhere[1], there are two main paths to deriving libertarian principles: destinations and directions. The destinationist approach shares the method of most other ethical paradigms: the enunciation of timeless moral and ethical precepts that describe the ideal libertarian society.

What makes for a distinctly libertarian set of principles is two precepts: the self-ownership principle, which holds that each person owns his or her own body and labor, and the non-aggression principle, which holds that no one may justly initiate force against the person or property of another.

The extreme forms of these principles, for destinationists, can be hard for outsiders to accept. One example is noted by Matt Zwolinski, who cites opinion data gathered from libertarians by Liberty magazine and presented in its periodic Liberty Poll. One question frequently included in the survey was:

Suppose that you are on a friend's balcony on the 50th floor of a condominium complex. You trip, stumble and fall over the edge. You catch a flagpole on the next floor down. The owner opens his window and demands that you stop trespassing.

Zwolinski writes that in 1988, 84 percent of respondents to the flagpole question

said they believed that in such circumstances they should enter the owner's residence against the owner's wishes. 2% (one respondent) said that they should let go and fall to their death, and 15% said they should hang on and wait for somebody to throw them a rope. In 1999, the numbers were 86%, 1%, and 13%. In 2008, they were 89.2%, 0.9%, and 9.9%.

The interesting thing is that, while the answers to the flagpole question were almost unchanged over time, with a slight upward drift in those who would aggress by trespassing, support for the non-aggression principle itself plummeted. Writes Zwolinski:

Respondents were asked to say whether they agreed or disagreed with [the non-aggression principle]. In 1988, a full 90% of respondents said that they agreed. By 1999, however, the percentage expressing agreement had dropped by almost half to 50%. And by 2008, it was down to 39.7%.

If we take support for the non-aggression principle as a Rorschach test, it does not appear that most people, perhaps not even all who identify as libertarians, are fully convinced that the principle is an absolute, categorical moral principle.

The Directionalists

Of course, it could be true that many who identify now as libertarians, and those who might be attracted to libertarianism in the future, are directionalists. A directional approach holds that any policy action that increases the liberty and welfare of individuals is an improvement, and should be supported by libertarians, even if the policy itself violates either the self-ownership principle or the non-aggression principle.

A useful example here might be school vouchers. Instead of being a monopoly provider of public school education, the state might specialize in funding but leave the provision of education at least partly to private sector actors. The destinationist would object, correctly, that the policy still involves the initiation of violence in collecting taxes imposed involuntarily on at least those individuals who would not pay without the threat of coercion. In contrast, the directionalist might support vouchers, since parents would at least be afforded more liberty in choosing schools for their children, and the system would be subject to more competition, thus holding providers responsible for the quality of the education delivered.

Here, then, is a slightly modified take on the central question: Would a hybrid version of libertarianism, one that advocated for the destination but accepted directional improvements, be a viable governing philosophy? Even with this amendment, allowing for directional improvements as part of the core governing philosophy, is libertarianism, to use a trope of the moment, sustainable? The reason this approach could be useful is that it corresponds to one of the great divisions within the libertarian movement: the split between political anarchists, who believe that any coercive state apparatus is ultimately incompatible with liberty, and minarchists, who believe that a limited government is desirable, even necessary, and that it is also possible.

Limiting Leviathan: Getting Power to Stay Where You Put It

For a state to be consistent with both the self-ownership principle and the non-aggression principle, there must be certain core rights to property, expression, and action that are inviolable. This inviolability extends even to situations where initiating force would greatly benefit most people, meaning that consequentialist considerations cannot outweigh the rights of individuals.

Where might such a state originate, and how could it be continually limited to only those functions for which it was originally justified? One common answer is a form of contractarianism. (Another is convention, which is beyond the scope of this essay. See Robert Sugden[2] and Gerald Gaus[3] for a review of some of the issues.) This is not to say that actual states are the results of explicitly contractual arrangements; rather, there is an "as if" element: rational citizens in a state of nature would have voluntarily consented to the limited coercion of a minarchist state, given the substantial and universal improvement in welfare that results from having a provider of public goods and a neutral enforcer of contracts. Without a state, claims the minarchist, these two functions, public goods provision and contract enforcement, are either impossible or so difficult as to make the move to create a coercive state universally welcome for all citizens.

Contractarianism is of course an enormous body of work in philosophy, ranging from Thomas Hobbes and Jean-Jacques Rousseau to David Gauthier and John Rawls. Our contractarians, the libertarian versions, start with James Buchanan and Jan Narveson. Buchanan's contractarianism is stark: Rules start with us, and the justification for coercion is, but can only be, our consent to being coerced. It is not clear that Buchanan would accept the full justification of political authority by tacit contract, but Buchanan also claims that each group in society should "start from where we are now," meaning that changes in the rules require something as close to unanimous consent as possible.[4]

Narveson's view is closer to the "necessary evil" claim for justifying government. We need a way to be secure from violence, and to be able to enter into binding agreements that are enforceable. He wrote in The Libertarian Idea (1988) that there is "no alternative that can provide reasons to everyone for accepting it, no matter what their personal values or philosophy of life may be," thus motivating this informal, yet society-wide institution. He goes on to say:

Without resort to obfuscating intuitions, of self-evident rights and the like, the contractarian view offers an intelligible account both of why it is rational to want a morality and of what, broadly speaking, the essentials of that morality must consist in: namely, those general rules that are universally advantageous to rational agents. We each need morality, first because we are vulnerable to the depredations of others, and second because we can all benefit from cooperation with others. So we need protection, in the form of the ability to rely on our fellows not to engage in activities harmful to us; and we need to be able to rely on those with whom we deal. We each need this regardless of what else we need or value.

The problem, or so the principled political anarchist would answer, is that Leviathan cannot be limited unless for some reason Leviathan wants to limit itself.

One of the most interesting proponents of this view is Anthony de Jasay, an independent philosopher of political economy. Jasay would not dispute the value of credible commitments for contracts. His quarrel comes when contractarians invoke a founding myth. When I think of the Social Contract (the capitals signify how important it is!), I am reminded of that scene from Monty Python where King Arthur is talking to the peasants:

King Arthur: I am your king.

Woman: Well, I didn't vote for you.

King Arthur: You don't vote for kings.

Woman: Well how'd you become king then?

[holy music . . . ]

King Arthur: The Lady of the Lake, her arm clad in the purest shimmering samite held aloft Excalibur from the bosom of the water, signifying by divine providence that I, Arthur, was to carry Excalibur. That is why I am your king.

Dennis: [interrupting] Listen, strange women lyin' in ponds distributin' swords is no basis for a system of government. Supreme executive power derives from a mandate from the masses, not from some farcical aquatic ceremony.

According to Jasay, there are two distinct problems with contractarian justifications for the state. Each, separately and independently, is fatal for the project, in his view. Together they put paid to the notion that a libertarian could favor minarchism.

The first problem is the enforceable contracts justification. The second is the limiting Leviathan problem.

The usual statement of the first comes from Hobbes: Covenants, without the sword, are but words. That means that individuals cannot enter into binding agreements without some third party to enforce the agreement. Since entering into binding agreements is a central precondition for mutually beneficial exchange and broad-scale market cooperation, we need a powerful, neutral enforcer. So, we all agree on that; the enforcer collects the taxes that we all agreed on and, in exchange, enforces all our contracts for us. (See John Thrasher[5] for some caveats.)

But wait. Jasay compares this to jumping over your own shadow. If contracts cannot be enforced save by coercion from a third party, how can the contract between citizens and the state be enforced? "[I]t takes courage to affirm that rational people could unanimously wish to have a sovereign contract enforcer bound by no contract," wrote Jasay in his book Against Politics (1997). By "courage" he does not intend a compliment. Either those who make this claim are contradicting themselves (since we can't have contracts, we'll use a contract to solve the problem) or the argument is circular (cooperation requires enforceable contracts, but these require a norm of cooperation).

Jasay put the question this way in On Treating Like Cases Alike: Review of Politics by Principle Not Interest, his 1999 essay in the Independent Review:

If man can no more bind himself by contract than he can jump over his own shadow, how can he jump over his own shadow and bind himself in a social contract? He cannot be both incapable of collective action and capable of it when creating the coercive agency needed to enforce his commitment. One can, without resorting to a bootstrap theory, accept the idea of an exogenous coercive agent, a conqueror whose regime is better than anything the conquered people could organize for themselves. Consenting to such an accomplished fact, however, can hardly be represented as entering into a contract, complete with a contracts ethical implications of an act of free will. [Emphasis in original]

In sum, the former claim, that contracts cannot be enforced, cannot then be used to conjure enforceable contracts out of a shadow. The latter claim, that people will cooperate on their own, means that no state is necessary in the first place. The conclusion Jasay reaches is that states, if they exist, may well be able to compel people to obey. The usual argument goes like this:

The state exists and enjoys the monopoly of the use of force for some reason, probably a historical one, that we need not inquire into. What matters is that without the state, society could not function tolerably, if at all. Therefore all rational persons would choose to enter into a social contract to create it. Indeed, we should regard the state as if it were the result of our social contract, hence indisputably legitimate.[6]

Jasay concludes that this argument must be false. As Robert Nozick famously put it in Anarchy, State, and Utopia (1974), tacit consent "isn't worth the paper it's not written on." We cannot confect a claim that states deserve our obedience based on consent. For consent is what true political authority requires: not that our compliance can be compelled, but that the state deserves our compliance. Ordered anarchy with no formal state is therefore a better solution, in Jasay's view, because consent is either not real or is not enough.

Of course, this is simply an extension of a long tradition in libertarian thought, dating at least to Lysander Spooner. As Spooner said:

If the majority, however large, of the people of a country, enter into a contract of government, called a constitution, by which they agree to aid, abet or accomplish any kind of injustice, or to destroy or invade the natural rights of any person or persons whatsoever, whether such persons be parties to the compact or not, this contract of government is unlawful and void, and for the same reason that a treaty between two nations for a similar purpose, or a contract of the same nature between two individuals, is unlawful and void. Such a contract of government has no moral sanction. It confers no rightful authority upon those appointed to administer it. It confers no legal or moral rights, and imposes no legal or moral obligation upon the people who are parties to it. The only duties, which any one can owe to it, or to the government established under color of its authority, are disobedience, resistance, destruction.[7]

Now for the other problem highlighted by Jasay, that of limiting Leviathan. Let us assume the best of state officials: that they genuinely intend to do good. We might make the standard Public Choice assumption that officials want to use power to benefit themselves, but let us put that aside; instead, officials genuinely want to improve the lives of their citizens.

This means a minarchist state is not sustainable. Officials, thinking of the society as a collective rather than as individuals with inviolable rights, will immediately discover opportunities to raise taxes, and create new programs and new powers that benefit those in need. In fact, it is precisely the failure of the Public Choice assumptions of narrow self-interest that ensure this outcome. It might be possible in theory to design a principal-agent system of bureaucratic contract that constrains selfish officials. But if state power attracts those who are willing to sacrifice the lives or welfare of some for the greater good, then minarchy is quickly breached and Leviathan swells without the possibility of constraint.

I hasten to add that it need not be true, for Jasay's claim to go through, that the concept of "the greater good" have any empirical content. It is enough that a few people believe, and can brandish "the greater good" like a truncheon, smashing rules and laws designed to stop the expansion of state power. No one who wants to do good will pass up a chance to do good, even if it means changing the rules. This process is much like that described by F.A. Hayek in "Why the Worst Get on Top" (see Chapter 10 of The Road to Serfdom) or Bertrand de Jouvenel's Power (1945).

So, again, we reach a contradiction: Either 1) minarchy is not possible, because it is overwhelmed by the desire to do good, or it is not legitimate, because it is based on a mythical tacit consent; or 2) no state, minarchist or otherwise, is necessary, because people can limit their actions on their own. Citizens might conclude that such self-imposed limits on their own actions are morally required, and that reputation and competition can limit the extent of depredation and reward cooperation in settings with repeated interaction. Jasay would argue, then, that constitutions and parchment barriers are either unnecessary (if people are self-governing) or ineffective (if they are not). Leviathan either cannot exist or else it is illimitable.

But Thats Not Enough

What I have argued so far is that destinationist libertarianism that is fully faithful to the self-ownership principle and the non-aggression principle could not be an effective governing philosophy. The only exception to this claim would be if libertarianism were universally believed, and people all agreed to govern themselves in the absence of a coercive state apparatus of any kind. Of course, one could object that even then something like a state would emerge, because of the economies of scale in the provision of defense, leading to a dominant protection network as described by Nozick. Whether that structure of service-delivery is necessarily a state is an interesting question, but not central to our current inquiry.

My own view is that libertarianism is, and in fact should be, a philosophy of governing that is robust and useful. But then I am a thoroughgoing directionalist. The state and its deputized coercive instruments have expanded the scope and intensity of their activities far beyond what people need to achieve cooperative goals, and beyond what they want in terms of immanent intrusions into our private lives.

Given the constant push and pull of politics, and the desire of groups to create and maintain rents for themselves, the task of leaning into the prevailing winds of statism will never be done. But it is a coherent and useful governing philosophy. When someone asks how big the state should be, there aren't many people who think the answer is zero. But that's not on the table, anyway. My answer is "smaller than it is now." Any policy change that grants greater autonomy (but also responsibility) to individual citizens, or that lessens government control over private action, is desirable; and libertarians are crucial for providing compelling intellectual justifications for why this is so.

In short, I don't advocate abandoning destinationist debates. The positing of an ideal is an important device for recruitment and discussion. But at this point we have been going in the wrong direction, for decades. It should be possible to find allies and fellow travelers. They may want to get off the train long before we arrive at the end of the line, but for many miles our paths toward smaller government follow the same track.

[1] Michael Munger, Basic Income Is Not an Obligation, but It Might Be a Legitimate Choice, Basic Income Studies 6:2 (December 2011), 1-13.

[2] Robert Sugden, "Can a Humean Be a Contractarian?" in Perspectives in Moral Science, edited by Michael Baurmann and Bernd Lahno, Frankfurt School Verlag (2009), 11–23.

[3] Gerald Gaus, "Why the Conventionalist Needs the Social Contract (and Vice Versa)," Rationality, Markets and Morals, Frankfurt School Verlag, 4 (2013), 71–87.

[4] For more on the foundation of Buchanan's thought, see my forthcoming essay in the Review of Austrian Economics, "Thirty Years After the Nobel: James Buchanan's Political Philosophy."

[5] John Thrasher, "Uniqueness and Symmetry in Bargaining Theories of Justice," Philosophical Studies 167 (2014), 683–699.

[6] Anthony de Jasay, Pious Lies: The Justification of States and Welfare States, Economic Affairs 24:2 (2004), 63-64.

[7] Lysander Spooner, The Unconstitutionality of Slavery (Boston: Bela Marsh, 1860), pp. 9-10.


6 Reasons Why I Gave Up On Libertarianism Return Of Kings

These days, libertarianism tends to be quite discredited. It is now associated with the goofy candidacy of Gary Johnson, having a rather narrow range of issues (legalize weed! lower taxes!), cucking one's way into politics by sweeping all the embarrassing problems under the carpet, then surrendering to liberal virtue-signaling and endorsing anti-white diversity.

Now, everyone on the Alt-Right, in the manosphere and so on, is laughing at those whose adherence to a bunch of abstract premises leads them to endorse globalist capital; and now that Trump officially heads the State, we'd be better off if some private companies were nationalized rather than left to shadowy overlords.

To Americans, libertarianism has been a constant background presence. Its main icons, be they Ayn Rand, Murray Rothbard or Friedrich Hayek, were always read and discussed here and there, and never fell into oblivion even though they barely had media attention. The academic and political standing of libertarianism may be marginal, but it has always been granted small platforms and resurrected from time to time in the public landscape, one of the most conspicuous examples being the Tea Party demonstrations.

To a frog like yours truly (Kek being now praised by thousands of well-meaning memers, I can embrace the frog moniker gladly), libertarianism does not have the same standing at all. In French universities, libertarian thinkers are barely discussed, even in classes that are supposed to tackle economics: for one hour spent talking about Hayek, Keynes easily enjoys ten, and the same goes when comparing the attention given to, respectively, Adam Smith and Karl Marx.

From a wider perspective, much of contemporary French identity is built on Jacobinism, i.e. on crushing underfoot organic regional sociability in the name of a bureaucratized and Masonic republic. The artificial construction of France is exactly the kind of endeavour libertarianism loathes. No wonder the public choice school, for example, is barely studied here: pompous leftist teachers and mediocre fonctionnaires are too busy gushing about themselves, sometimes hiding the emptiness of their lives behind a ridiculous epic narrative that turns social achievements into heroic feats, to give a fair hearing to pertinent criticism.

When I found out about libertarianism, I was already sick of the dominant "fifty shades of leftism" political culture. The gloomy mediocrity of small bureaucrats, including most school teachers, combined with their petty political righteousness, always repelled me. Thus, the discovery of laissez-faire advocates felt like stumbling on an entirely new scene of thought, and my initial feeling was vindicated when I found out about the naturalism often associated with it, something refreshing and intuitively more satisfying than the mainstream culture-obsessed, biology-denying view.

Libertarianism looked like it could solve everything. More entrepreneurship, more rights for those who actually create wealth and live by the good values of personal responsibility and work ethic, fewer parasites, be they bureaucrats or immigrants, and no more repressive speech laws. Coincidentally, a new translation of Ayn Rand's Atlas Shrugged was published at this time: I devoured it, loving the sense of life, the heroism, the epic, the generally great and achieving ethos contained in it. Aren't John Galt and Hank Rearden more appealing than any corrupt politician or beta bureaucrat who pretends to be altruistic while backstabbing his own colleagues and parasitizing the country?

Now, although I still support small-scale entrepreneurship wholeheartedly, I would never defend naked libertarianism, and here is why.

Part of the Rothschild family, where nepotism and consanguinity keep the money in

Unity makes strength, and trust is much easier to cultivate in a small group where everyone truly belongs than in an anonymous great society. Some ethnic groups, especially whites, tend to be instinctively individualistic, with a lot of people favouring personal liberty over belonging, while others, especially Jews, tend to favor extended family business and nepotism.

In the short term, mobile individuals can do better than those who are bound by many social obligations. In the long run, however, extended families manage to create an environment of trust and to concentrate capital. And whereas individuals may start cheating each other or scattering their wealth away, for lack of a proper economic network, families and tribes will be able to invest heavily in some of their members and keep their wealth inside. This has been true for Jewish families, whether their members work as moneylenders or diamond dealers, for Asians investing in new restaurants or other business projects of their own, and for North Africans taking over pubs and small shops in France.

The latter example is especially telling. White bartenders, butchers, grocers and the like have been chased out of French suburbs by daily North African and black violence. No one helped them, everyone being afraid of getting harassed as well and busy with their own business. (Yep, just like what happened and still happens in Rotherham.) As a result, these isolated, unprotected shop-owners sold their outlets for a cheap price and fled. North Africans always covered each other's violence and responded in groups to any obstacle, whereas whites lowered their heads and hoped not to be next on the list.

Atlas Shrugged was wrong. Loners get wrecked by groups. Packs of hyenas corner and eat the lone dog.

Libertarianism is not good for individuals in the long run: it turns them into asocial weaklings, soon to be legally enslaved by global companies or beaten by groups, be they made of nepotistic family members or thugs.

How the middle classes end up after jobs have been sent overseas and wages lowered

People often believe, thanks to Leftist media and cuckservative posturing, that libertarians are big bosses. This is mostly, if not entirely, false. Most libertarians are middle-class guys who want more opportunities and less taxation, and who believe that libertarianism will help them turn into successful entrepreneurs. They may be right in very specific circumstances: during the 2000s, small companies overturned the market for electronics, benefiting both their independent founders and society as a whole; but ultimately, they got bought by giants like Apple and Google, who are much better off when backed by a corrupt State than on a truly free market.

Libertarianism is a fake alternative, just as impossible to realize as communism: far from putting everyone in his place, it leaves ample room for mafias, monopolies, and the unemployment caused by mechanization and global competition. If one wants the middle classes to survive, one must protect the employment and relative independence of their members, bankers and billionaires be damned.

Spontaneous order helped by a weak government. I hope they at least smoke weed.

A good feature of libertarianism is that it usually goes along with a positive stance on biology and human nature, in contrast with the "everything is cultural and ought to be deconstructed" left. However, this stance often leads to an exaggerated optimism about human nature. In a society of laissez-faire, the libertarians say, people flourish and order appears spontaneously.

Well, this is plainly false. As all of the great religions say, after what Christians call the Fall, man is a sinner. If you let children flourish without moral standards and role models, they become spoiled, entitled, manipulative, emotionally fragile and deprived of self-control. If you let women flourish without suspicion, you give free rein to their propensities for hypergamy, hysteria, self-entitlement and everything we can witness in them today. If you let men do as they please, you let them become greedy, envious, and prone to bullying. As a Muslim proverb says, people must be flogged to enter into paradise; and as Aristotle put forth, virtues are trained dispositions, no matter the magnitude of innate talents and propensities.

Michelle The Man Obama and Lying Crooked at a Democrat meeting

When laissez-faire rules, some will succeed on the market more than others, owing to differences in investment, work, and natural abilities. Some will succeed enough to be able to buy someone else's business: this is the natural consequence of differences in wealth and of greed. When corrupt politicians enter the game, things become worse, as they will usually help some large business owners shield their positions against competitors, at the expense of most people, who then lose their independence and live off a wage.

In the end, what we get is a handful of very wealthy individuals who have managed to concentrate most capital and power levers in their hands, a big crowd of low-wage employees ready to cut each other's throats for a small promotion, and females waiting in line to get notched by the one percent while finding the other ninety-nine percent boring.

Censorship by massive social pressure, monopoly over the institutions and crybullying is perfectly legal. What could go wrong?

On the surface, libertarianism looks good here, because it protects the individual's rights against left-hailing Statism and cuts off the welfare programs that have attracted dozens of millions of immigrants. Beneath the surface, however, things are quite dire. Libertarianism enshrines the free-speech rights the leftists abuse, allows the pressure tactics used by radicals, and lets freethinking individuals be singled out by SJWs as long as these do not resort to overt theft or overt physical violence. As for the immigrants, libertarianism tends to oppose the very notion of non-private boundaries, thus leaving local cultures and identities defenseless against both greedy capitalists and subproletarian masses.

Supporting an ideology that allows the leftists to destroy society more or less legally amounts to cucking, plain and simple. Desiring an ephemeral cohabitation with rabid ideological warriors is stupid. We should aim at a lasting victory, not at pretending to constrain them through useless means.

Am I the only one to find that Gary Johnson looks like a snail (Spongebob notwithstanding)?

In 2013, one of the rare French libertarian academics, Jean-Louis Caccomo, was forced into a mental ward at the request of his university president. He then spent more than a year being drugged. Mr. Caccomo had no real psychological problem: his confinement was part of a vicious strategy of pathologization and career destruction of the kind already used by the Soviets. French libertarians could have widely denounced the abuse. Nonetheless, most of them freaked out, and almost no one dared to actually defend him publicly.

Why should rational egoists team up and risk their careers to defend one of their own, after all? They would rather posture at confidential social events, rail at organic solidarity and protectionism, or troll individuals of their own social milieu because "I've got the right to mock X; it's my right to free speech!" The few libertarian people I knew firsthand, and the few events I witnessed in that small milieu, were enough to give me serious doubts about libertarianism: how can a good political ideology breed such an unhealthy mindset?

Political ideologies are tools. They are not ends in themselves. Not all forms of government are fit for every people or every era. Political actors must know at least the most important ones to draw some inspiration, but ultimately, said actors win on the ground, not in philosophical debates.

Individualism, mindless consumerism, careerism, and hedonism are part of the problem. Individual rights granted regardless of one's abilities, situation, and identity are a disaster. The time has come to overcome modernity, not to stall in one of its false alternatives. The merchant caste must be regulated, though neither micromanaged nor hampered by a parasitic bureaucracy, nor denied its members' right to small-scale independence. Individual rights must be conditional, boundaries must be restored, minority identities based on anti-white-male resentment must be crushed so they cannot devour sociability from the inside again, and the pater familias must assert himself anew.

Long live the State and protectionism, as long as they defend the backbone of society and healthy relationships between the sexes, and no quarter for those who think they have a right to wage grievance-mongering against us, whether they want to use the State or private companies. In the end, the socialism-libertarianism dichotomy is quite secondary.


6 Reasons Why I Gave Up On Libertarianism – Return Of Kings

Nanotechnology – Wikipedia

Nanotechnology (“nanotech”) is manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form “nanotechnologies” as well as “nanoscale technologies” to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. Through 2012, the USA has invested $3.7 billion using its National Nanotechnology Initiative, the European Union has invested $1.2 billion, and Japan has invested $750 million.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, energy storage,[4][5] microfabrication,[6] molecular engineering, etc.[7] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly,[8] from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era.

First, the scanning tunneling microscope, invented in 1981, provided unprecedented visualization of individual atoms and bonds, and was successfully used to manipulate individual atoms in 1989. The microscope’s developers, Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, received a Nobel Prize in Physics in 1986.[10][11] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[12][13] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[14] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[15]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[16][17]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[18][19] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth (10⁻⁹) of a meter. By comparison, typical carbon–carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double helix has a diameter of around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm, following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, with a kinetic diameter of approximately a quarter of a nanometer), since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[20] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[21]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[22] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[22]
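
That marble comparison can be sanity-checked with a few lines of arithmetic. The snippet below is an illustrative sketch; the Earth-diameter value and the assumption that a marble is roughly a centimeter across are mine, not figures from the article:

```python
# Check the analogy: a nanometer is to a meter as a "marble" is to the Earth.
NM_PER_M = 1e-9              # one nanometer expressed in meters
EARTH_DIAMETER_M = 1.2742e7  # mean Earth diameter, roughly 12,742 km

# The implied "marble" diameter under the analogy:
marble_m = EARTH_DIAMETER_M * NM_PER_M
print(f"{marble_m * 100:.2f} cm")  # ~1.27 cm, about the size of a marble
```

The result lands within the usual size of a marble, which is why the analogy works.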

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[23] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[24]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the "quantum size effect" where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface-area-to-volume ratio altering mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. The mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
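
One standard way to see why the quantum size effect appears below roughly 100 nm (a textbook sketch, not taken from this article) is the one-dimensional particle-in-a-box model, whose ground-state confinement energy scales as 1/L²:

```python
# Particle-in-a-box ground state: E1 = h^2 / (8 * m * L^2).
H = 6.626e-34    # Planck constant, J*s
M_E = 9.109e-31  # electron mass, kg
EV = 1.602e-19   # joules per electron-volt

def ground_state_ev(box_nm: float) -> float:
    """Ground-state confinement energy (in eV) of an electron in a box of width box_nm."""
    L = box_nm * 1e-9  # convert nm to meters
    return H**2 / (8 * M_E * L**2) / EV

print(ground_state_ev(1.0))    # ~0.38 eV: dominates kT (~0.026 eV at room temperature)
print(ground_state_ev(100.0))  # ~4e-5 eV: negligible, classical behavior recovered
```

The 1/L² scaling means a 100-fold reduction in size raises confinement energies by a factor of 10,000, which is why electronic properties shift at the nanoscale but not the microscale.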

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[25]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base-pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term "nanotechnology" was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi), it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogues of traditional machine components demonstrated that molecular machines were possible: the countless examples found in biology show that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[26] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[27] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[28] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulty of mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[29] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[30] and a nanoelectromechanical relaxation oscillator.[31] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[34]

Bottom-up approaches seek to arrange smaller components into more complex assemblies.

Top-down approaches seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than on the details of how such inventions could actually be created.

Nanomaterials can be classified as 0D, 1D, 2D, and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical, and biological properties. As dimensionality decreases, the surface-to-volume ratio increases, meaning that lower-dimensional nanomaterials have a higher surface area than 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug-delivery, and biosensor applications.
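
The surface-to-volume claim above is easy to make concrete: for a sphere the ratio works out to 3/r, so it grows inversely with particle size. A minimal sketch, where the radii are arbitrary illustrative values:

```python
def sa_to_v_ratio(radius_m: float) -> float:
    """Surface-area-to-volume ratio of a sphere: (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r."""
    return 3.0 / radius_m

# Shrinking a particle's radius from 1 micrometer to 10 nm (100x smaller)
# raises its surface-to-volume ratio by the same factor of ~100.
print(sa_to_v_ratio(10e-9) / sa_to_v_ratio(1e-6))  # ~100
```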

There are several important modern developments. The atomic force microscope (AFM) and the Scanning Tunneling Microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[52][53] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography, such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography and nanoimprint lithography, were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[54]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[52][53] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

New therapeutic products based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and are already approved for human use in some countries.[55]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[17] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of "first generation" passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[56] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[16]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[57] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[58] Nanotechnology may also be used to build structures for on-chip computing with light, for example on-chip optical quantum information processing and picosecond transmission of information.[59]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[60] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[61]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[62] Platinum is used in both the reduction and the oxidation catalysts.[63] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst's surface area that is exposed to the exhaust fumes is maximized, the efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[64]
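
The geometry behind that surface-area strategy can be sketched numerically. Dividing a fixed volume V of catalyst into equal spheres of radius r exposes a total area of 3V/r, so smaller (non-merging) particles expose proportionally more platinum. The volume and radii below are made-up illustrative numbers, not the project's actual figures:

```python
def total_surface_area(volume_m3: float, particle_radius_m: float) -> float:
    """Total area when a fixed volume is split into equal spheres of radius r.
    n = V / ((4/3) * pi * r^3) spheres, each of area 4 * pi * r^2, so A = 3 * V / r."""
    return 3.0 * volume_m3 / particle_radius_m

V = 1e-9  # one cubic millimeter of platinum, for illustration
# Same amount of metal, 1000x smaller particles -> ~1000x the exposed area.
print(total_surface_area(V, 5e-9) / total_surface_area(V, 5e-6))  # ~1000
```

This is why keeping nanoparticles from merging saves material: every merge grows r and shrinks the exposed area for the same volume of metal.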

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell's microenvironment to direct its differentiation down a suitable lineage.[65] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[66]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[67]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health, are actively conducting research on potential health effects stemming from exposures to nanoparticles.[68][69]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[70] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[71]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[72]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[73] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[74] Cambridge, Massachusetts in 2008 considered enacting a similar law,[75] but ultimately rejected it.[76] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[77] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly.

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[78] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[79] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[80][81]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[82]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the "nanotechnology revolution", could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully."[83] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[84] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[85][86][87][88]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[89] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by "bolting on" nanotechnology to existing regulations, but there are clear gaps in these regimes.[90] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[91]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[92] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[93] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[94][95]

The Royal Society report[14] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application, with participants in public deliberations more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[72]


What is Nanotechnology? | Nano

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers.

Physicist Richard Feynman, the father of nanotechnology.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled There’s Plenty of Room at the Bottom by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology (CalTech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn’t until 1981, with the development of the scanning tunneling microscope that could "see" individual atoms, that modern nanotechnology began.

It's hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ meters.
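To get a concrete feel for the scale, here is a small sketch of the unit conversions involved. The inch conversion is exact by definition; the paper-thickness comparison is a commonly cited approximation, not a figure from this article:

```python
NM_PER_M = 1e9  # nanometres per metre, by definition

def metres_to_nm(metres: float) -> float:
    """Convert a length in metres to nanometres."""
    return metres * NM_PER_M

# One inch is exactly 2.54 cm, so there are 25,400,000 nm in an inch.
nm_per_inch = metres_to_nm(0.0254)

# A sheet of paper is roughly 100 micrometres thick (a commonly cited
# estimate), i.e. about 100,000 nm -- a thousand times the upper end
# of the 1-100 nm nanoscale.
paper_thickness_nm = metres_to_nm(100e-6)

print(f"{nm_per_inch:,.0f} nm per inch")
print(f"paper is ~{paper_thickness_nm:,.0f} nm thick")
```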

Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms: the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

But something as small as an atom is impossible to see with the naked eye. In fact, it's impossible to see with the microscopes typically used in high school science classes. The microscopes needed to see things at the nanoscale were invented relatively recently, about 30 years ago.

Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born.

Although modern nanoscience and nanotechnology are quite new, nanoscale materials were used for centuries. Alternate-sized gold and silver particles created colors in the stained glass windows of medieval churches hundreds of years ago. The artists back then just didn't know that the process they used to create these beautiful works of art actually led to changes in the composition of the materials they were working with.

Today’s scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of their enhanced properties, such as higher strength, lighter weight, increased control of the light spectrum, and greater chemical reactivity than their larger-scale counterparts.

Read the original:

What is Nanotechnology? | Nano

Nanotechnology | Britannica.com

Nanotechnology, the manipulation and manufacture of materials and devices on the scale of atoms or small groups of atoms. The nanoscale is typically measured in nanometres, or billionths of a metre (nanos, the Greek word for dwarf, being the source of the prefix), and materials built at this scale often exhibit distinctive physical and chemical properties due to quantum mechanical effects. Although usable devices this small may be decades away (see microelectromechanical system), techniques for working at the nanoscale have become essential to electronic engineering, and nanoengineered materials have begun to appear in consumer products. For example, billions of microscopic nanowhiskers, each about 10 nanometres in length, have been molecularly hooked onto natural and synthetic fibres to impart stain resistance to clothing and other fabrics; zinc oxide nanocrystals have been used to create invisible sunscreens that block ultraviolet light; and silver nanocrystals have been embedded in bandages to kill bacteria and prevent infection.

Possibilities for the future are numerous. Nanotechnology may make it possible to manufacture lighter, stronger, and programmable materials that require less energy to produce than conventional materials, that produce less waste than with conventional manufacturing, and that promise greater fuel efficiency in land transportation, ships, aircraft, and space vehicles. Nanocoatings for both opaque and translucent surfaces may render them resistant to corrosion, scratches, and radiation. Nanoscale electronic, magnetic, and mechanical devices and systems with unprecedented levels of information processing may be fabricated, as may chemical, photochemical, and biological sensors for protection, health care, manufacturing, and the environment; new photoelectric materials that will enable the manufacture of cost-efficient solar-energy panels; and molecular-semiconductor hybrid devices that may become engines for the next revolution in the information age. The potential for improvements in health, safety, quality of life, and conservation of the environment are vast.

At the same time, significant challenges must be overcome for the benefits of nanotechnology to be realized. Scientists must learn how to manipulate and characterize individual atoms and small groups of atoms reliably. New and improved tools are needed to control the properties and structure of materials at the nanoscale; significant improvements in computer simulations of atomic and molecular structures are essential to the understanding of this realm. Next, new tools and approaches are needed for assembling atoms and molecules into nanoscale systems and for the further assembly of small systems into more-complex objects. Furthermore, nanotechnology products must provide not only improved performance but also lower cost. Finally, without integration of nanoscale objects with systems at the micro- and macroscale (that is, from millionths of a metre up to the millimetre scale), it will be very difficult to exploit many of the unique properties found at the nanoscale.

Nanotechnology is highly interdisciplinary, involving physics, chemistry, biology, materials science, and the full range of the engineering disciplines. The word nanotechnology is widely used as shorthand to refer to both the science and the technology of this emerging field. Narrowly defined, nanoscience concerns a basic understanding of physical, chemical, and biological properties on atomic and near-atomic scales. Nanotechnology, narrowly defined, employs controlled manipulation of these properties to create materials and functional systems with unique capabilities.

In contrast to recent engineering efforts, nature developed nanotechnologies over billions of years, employing enzymes and catalysts to organize with exquisite precision different kinds of atoms and molecules into complex microscopic structures that make life possible. These natural products are built with great efficiency and have impressive capabilities, such as the power to harvest solar energy, to convert minerals and water into living cells, to store and process massive amounts of data using large arrays of nerve cells, and to replicate perfectly billions of bits of information stored in molecules of deoxyribonucleic acid (DNA).

There are two principal reasons for qualitative differences in material behaviour at the nanoscale (traditionally defined as less than 100 nanometres). First, quantum mechanical effects come into play at very small dimensions and lead to new physics and chemistry. Second, a defining feature at the nanoscale is the very large surface-to-volume ratio of these structures. This means that no atom is very far from a surface or interface, and the behaviour of atoms at these higher-energy sites has a significant influence on the properties of the material. For example, the reactivity of a metal catalyst particle generally increases appreciably as its size is reduced: macroscopic gold is chemically inert, whereas at nanoscales gold becomes extremely reactive and catalytic and even melts at a lower temperature. Thus, at nanoscale dimensions material properties depend on and change with size, as well as composition and structure.
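The surface-to-volume argument can be made quantitative with a toy model: for a simple cubic cluster of n × n × n atoms, count the outer shell as surface. Assuming an atomic spacing of about 0.25 nm (a rough, generic value, not from the source), a few-nanometre particle puts nearly half of its atoms on the surface:

```python
def surface_fraction(n_edge: int) -> float:
    """Fraction of atoms on the surface of a simple cubic
    n_edge x n_edge x n_edge cluster (toy model)."""
    if n_edge <= 2:
        return 1.0  # every atom is on the surface
    interior = (n_edge - 2) ** 3
    return 1.0 - interior / n_edge ** 3

# With ~0.25 nm atomic spacing (assumed), a 2.5 nm cube is about
# 10 atoms on a side; the surface fraction falls off quickly with size.
for n in (10, 40, 400):
    size_nm = n * 0.25
    print(f"{size_nm:7.1f} nm cube -> {surface_fraction(n):.1%} surface atoms")
```

The 2.5 nm cluster comes out at roughly 49% surface atoms, consistent with the "up to half" figure quoted later in the article, while a 100 nm cube is already down to a few percent.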

Using the processes of nanotechnology, basic industrial production may veer dramatically from the course followed by steel plants and chemical factories of the past. Raw materials will come from the atoms of abundant elements (carbon, hydrogen, and silicon), and these will be manipulated into precise configurations to create nanostructured materials that exhibit exactly the right properties for each particular application. For example, carbon atoms can be bonded together in a number of different geometries to create variously a fibre, a tube, a molecular coating, or a wire, all with the superior strength-to-weight ratio of another carbon material, diamond. Additionally, such material processing need not require smokestacks, power-hungry industrial machinery, or intensive human labour. Instead, it may be accomplished either by growing new structures through some combination of chemical catalysts and synthetic enzymes or by building them through new techniques based on patterning and self-assembly of nanoscale materials into useful predetermined designs. Nanotechnology ultimately may allow people to fabricate almost any type of material or product allowable under the laws of physics and chemistry. While such possibilities seem remote, even approaching nature's virtuosity in energy-efficient fabrication would be revolutionary.

Even more revolutionary would be the fabrication of nanoscale machines and devices for incorporation into micro- and macroscale systems. Once again, nature has led the way with the fabrication of both linear and rotary molecular motors. These biological machines carry out such tasks as muscle contraction (in organisms ranging from clams to humans) and shuttling little packets of material around within cells while being powered by the recyclable, energy-efficient fuel adenosine triphosphate. Scientists are only beginning to develop the tools to fabricate functioning systems at such small scales, with most advances based on electronic or magnetic information processing and storage systems. The energy-efficient, reconfigurable, and self-repairing aspects of biological systems are just becoming understood.

The potential impact of nanotechnology processes, machines, and products is expected to be far-reaching, affecting nearly every conceivable information technology, energy source, agricultural product, medical device, pharmaceutical, and material used in manufacturing. Meanwhile, the dimensions of electronic circuits on semiconductors continue to shrink, with minimum feature sizes now reaching the nanorealm, under 100 nanometres. Likewise, magnetic memory materials, which form the basis of hard disk drives, have achieved dramatically greater memory density as a result of nanoscale structuring to exploit new magnetic effects at nanodimensions. These latter two areas represent another major trend, the evolution of critical elements of microtechnology into the realm of nanotechnology to enhance performance. They are immense markets driven by the rapid advance of information technology.

In a lecture in 1959 to the American Physical Society, "There's Plenty of Room at the Bottom," American Nobelist Richard P. Feynman presented his audience with a vision of what could be done with extreme miniaturization. He began his lecture by noting that the Lord's Prayer had been written on the head of a pin and asked,

Why cannot we write the entire 24 volumes of the Encyclopædia Britannica on the head of a pin? Let's see what would be involved. The head of a pin is a sixteenth of an inch across. If you magnify it by 25,000 diameters, the area of the head of the pin is then equal to the area of all the pages of the Encyclopædia Britannica. Therefore, all it is necessary to do is to reduce in size all the writing in the Encyclopædia by 25,000 times. Is that possible? The resolving power of the eye is about 1/120 of an inch; that is roughly the diameter of one of the little dots on the fine half-tone reproductions in the Encyclopædia. This, when you demagnify it by 25,000 times, is still 80 angstroms in diameter, 32 atoms across, in an ordinary metal. In other words, one of those dots still would contain in its area 1,000 atoms. So, each dot can easily be adjusted in size as required by the photoengraving, and there is no question that there is enough room on the head of a pin to put all of the Encyclopædia Britannica.
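Feynman's figures can be reproduced with simple arithmetic. The sketch below redoes them, assuming an atomic spacing of about 2.5 angstroms for an ordinary metal (the value implicit in his "32 atoms across"):

```python
INCH_TO_ANGSTROM = 2.54e8   # 1 inch = 2.54 cm = 2.54e8 angstroms

eye_resolution_in = 1 / 120           # resolving power of the eye, inches
dot_in = eye_resolution_in / 25_000   # half-tone dot demagnified 25,000 times
dot_angstrom = dot_in * INCH_TO_ANGSTROM

ATOM_SPACING = 2.5                    # angstroms, assumed metal spacing
atoms_across = dot_angstrom / ATOM_SPACING

print(f"demagnified dot: ~{dot_angstrom:.0f} angstroms,"
      f" ~{atoms_across:.0f} atoms across")
```

The exact result is about 85 angstroms and 34 atoms across, matching Feynman's rounded "80 angstroms" and "32 atoms"; a circular dot of that diameter does contain on the order of 1,000 atoms in its area.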

Feynman was intrigued by biology and pointed out that

cells are very tiny, but they are very active; they manufacture various substances; they walk around; they wiggle; and they do all kinds of marvelous things, all on a very small scale. Also, they store information. Consider the possibility that we too can make a thing very small which does what we want, that we can manufacture an object that maneuvers at that level!

He also considered using big tools to make smaller tools that could make yet smaller tools, eventually obtaining nanoscale tools for directly manipulating atoms and molecules. In considering what all this might mean, Feynman declared,

I can hardly doubt that when we have some control of the arrangement of things on a small scale we will get an enormously greater range of possible properties that substances can have, and of different things that we can do.

Perhaps the biggest barrier to following these prophetic thoughts was simply the immediate lack of tools to manipulate and visualize matter at such a small scale. The availability of tools has always been an enabling aspect of the advance of all science and technology, and some of the key tools for nanotechnology are discussed in the next section, Pioneers.

Starting with a 1981 paper in the Proceedings of the National Academy of Sciences and following with two popular books, Engines of Creation (1986) and Nanosystems (1992), American scientist K. Eric Drexler became one of the foremost advocates of nanotechnology. In fact, Drexler was the first person anywhere to receive a Ph.D. in molecular nanotechnology (from the Massachusetts Institute of Technology). In his written works he takes a molecular view of the world and envisions molecular machines doing much of the work of the future. For example, he refers to assemblers, which will manipulate individual atoms to manufacture structures, and replicators, which will be able to make multiple copies of themselves in order to save time dealing with the billions of atoms needed to make objects of useful size. In an article for Encyclopædia Britannica's 1990 Yearbook of Science and the Future, Drexler wrote:

Cells and tissues in the human body are built and maintained by molecular machinery, but sometimes that machinery proves inadequate: viruses multiply, cancer cells spread, or systems age and deteriorate. As one might expect, new molecular machines and computers of subcellular size could support the body's own mechanisms. Devices containing nanocomputers interfaced to molecular sensors and effectors could serve as an augmented immune system, searching out and destroying viruses and cancer cells. Similar devices programmed as repair machines could enter living cells to edit out viral DNA sequences and repair molecular damage. Such machines would bring surgical control to the molecular level, opening broad new horizons in medicine.

Drexler's futurist visions have stimulated much thought, but the assembler approach has failed to account for the strong influence of atomic and molecular forces (i.e., the chemistry) at such dimensions. The controversy surrounding these popularizations, and the potential dangers of entities such as intelligent replicators (however remote), have stimulated debate over the ethical and societal implications of nanotechnology.

A number of key technological milestones have been achieved by working pioneers. Molecular beam epitaxy, invented by Alfred Cho and John Arthur at Bell Labs in 1968 and developed in the 1970s, enabled the controlled deposition of single atomic layers. This tool provided for nanostructuring in one dimension as atomic layers were grown one upon the next. It subsequently became important in the area of compound semiconductor device fabrication. For example, sandwiching one-nanometre-thick layers of nonmagnetic-sensor materials between magnetic layers in computer disk drives resulted in large increases in storage capacity, and a similar use of nanostructuring resulted in more energy-efficient semiconductor lasers for use in compact disc players.

In 1981 Gerd Binnig and Heinrich Rohrer developed the scanning tunneling microscope at IBMs laboratories in Switzerland. This tool provided a revolutionary advance by enabling scientists to image the position of individual atoms on surfaces. It earned Binnig and Rohrer a Nobel Prize in 1986 and spawned a wide variety of scanning probe tools for nanoscale observations.

The observation of new carbon structures marked another important milestone in the advance of nanotechnology, with Nobel Prizes for the discoverers. In 1985 Robert F. Curl, Jr., Harold W. Kroto, and Richard E. Smalley discovered the first fullerene, the third known form of pure carbon (after diamond and graphite). They named their discovery buckminsterfullerene (buckyball) for its resemblance to the geodesic domes promoted by the American architect R. Buckminster Fuller. Technically called C60 for the 60 carbon atoms that form their hollow spherical structure, buckyballs resemble a football one nanometre in diameter (see figure). In 1991 Sumio Iijima of NEC Corporation in Japan discovered carbon nanotubes, in which the carbon ringlike structures are extended from spheres into long tubes of varying diameter. Taken together, these new structures surprised and excited the imaginations of scientists about the possibilities of forming well-defined nanostructures with unexpected new properties.

The scanning tunneling microscope not only allowed for the imaging of atoms by scanning a sharp probe tip over a surface, but it also allowed atoms to be pushed around on the surface. With a slight bias voltage applied to the probe tip, certain atoms could be made to adhere to the tip used for imaging and then to be released from it. Thus, in 1990 Donald Eigler spelled out the letters of his company's logo, IBM, by moving 35 xenon atoms into place on a nickel surface. This demonstration caught the public's attention because it showed the precision of the emerging nanoscale tools.

At nanoscale dimensions the properties of materials no longer depend solely on composition and structure in the usual sense. Nanomaterials display new phenomena associated with quantized effects and with the preponderance of surfaces and interfaces.

Quantized effects arise in the nanometre regime because the overall dimensions of objects are comparable to the characteristic wavelength for fundamental excitations in materials. For example, electron wave functions (see also de Broglie wave) in semiconductors are typically on the order of 10 to 100 nanometres. Such excitations include the wavelength of electrons, photons, phonons, and magnons, to name a few. These excitations carry the quanta of energy through materials and thus determine the dynamics of their propagation and transformation from one form to another. When the size of structures is comparable to the quanta themselves, it influences how these excitations move through and interact in the material. Small structures may limit flow, create wave interference effects, and otherwise bring into play quantum mechanical selection rules not apparent at larger dimensions.

Quantum mechanical properties for confinement of electrons in one dimension have long been exploited in solid-state electronics. Semiconductor devices are grown with thin layers of differing composition so that electrons (or holes in the case of missing electron charges) can be confined in specific regions of the structure (known as quantum wells). Thin layers with larger energy bandgaps can serve as barriers that restrict the flow of charges to certain conditions under which they can tunnel through these barriersthe basis of resonant tunneling diodes. Superlattices are periodic structures of repeating wells that set up a new set of selection rules which affect the conditions for charges to flow through the structure. Superlattices have been exploited in cascade lasers to achieve far infrared wavelengths. Modern telecommunications is based on semiconductor lasers that exploit the unique properties of quantum wells to achieve specific wavelengths and high efficiency.
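The confinement effect behind quantum wells can be estimated with the textbook infinite-square-well formula, E_n = n²h²/8mL². The sketch below uses the free-electron mass for simplicity; real semiconductors have smaller effective masses, so actual level spacings are larger than these estimates:

```python
H = 6.626e-34       # Planck constant, J*s
M_E = 9.109e-31     # free-electron mass, kg
EV = 1.602e-19      # joules per electron volt

def well_level_ev(n: int, width_m: float, mass: float = M_E) -> float:
    """Energy of level n in an infinite 1-D square well, in eV."""
    return n**2 * H**2 / (8 * mass * width_m**2) / EV

# Confinement energy scales as 1/L^2, so narrowing the well from
# 100 nm to 1 nm raises the ground level by a factor of 10,000.
for width_nm in (100, 10, 1):
    e1 = well_level_ev(1, width_nm * 1e-9)
    print(f"{width_nm:4d} nm well -> E1 = {e1:.3g} eV")
```

For a 1 nm well the ground level is a few tenths of an electron volt, which is why layer thicknesses of a few nanometres are enough to shift laser wavelengths by useful amounts.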

The propagation of photons is altered dramatically when the size and periodicity of the transient structure approach the wavelength of visible light (400 to 800 nanometres). When photons propagate through a periodically varying dielectric constant (for example, semiconductor posts surrounded by air), quantum mechanical rules define and limit the propagation of the photons depending on their energy (wavelength). This new behaviour is analogous to the quantum mechanical rules that define the motion of electrons through crystals, giving bandgaps for semiconductors. In one dimension, compound semiconductor superlattices can be grown epitaxially with the alternating layers having different dielectric constants, thus providing highly reflective mirrors for specific wavelengths as determined by the repeat distance of layers in the superlattice. These structures are used to provide built-in mirrors for vertical-cavity surface-emitting lasers, which are used in communications applications. In two and three dimensions, periodic structures known as photonic crystals offer additional control over photon propagation.

Photonic crystals are being explored in a variety of materials and periodicities, such as two-dimensional hexagonal arrays of posts fabricated in compound semiconductors or stacked loglike arrays of silicon bars in three dimensions. The dimensions of these structures depend on the wavelength of light being propagated and are typically in the range of a few hundred nanometres for wavelengths in the visible and near infrared. Photonic crystal properties based on nanostructured materials offer the possibility of confining, steering, and separating light by wavelength on unprecedented small scales and of creating new devices such as lasers that require very low currents to initiate lasing (called near-thresholdless lasers). These structures are being extensively investigated as the tools for nanostructuring materials are steadily advancing. Researchers are particularly interested in the infrared wavelengths, where dimensional control is not as stringent as at the shorter visible wavelengths and where optical communications and chemical sensing provide motivation for potential new applications.

Nanoscale materials also have size-dependent magnetic behaviour, mechanical properties, and chemical reactivity. At very small sizes (a few nanometres), magnetic nanoclusters have a single magnetic domain, and the strongly coupled magnetic spins on each atom combine to produce a particle with a single giant spin. For example, the giant spin of a ferromagnetic iron particle rotates freely at room temperature for diameters below about 16 nanometres, an effect termed superparamagnetism. Mechanical properties of nanostructured materials can reach exceptional strengths. As a specific example, the introduction of two-nanometre aluminum oxide precipitates into thin films of pure nickel results in yield strengths increasing from 0.15 to 5 gigapascals, which is more than twice that for a hard bearing steel. Another example of exceptional mechanical properties at the nanoscale is the carbon nanotube, which exhibits great strength and stiffness along its longitudinal axis.

The preponderance of surfaces is a major reason for the change in behaviour of materials at the nanoscale. Since up to half of all the atoms in nanoparticles are surface atoms, properties such as electrical transport are no longer determined by solid-state bulk phenomena. Likewise, the atoms in nanostructures have a higher average energy than atoms in larger structures, because of the large proportion of surface atoms. For example, catalytic materials have a greater chemical activity per atom of exposed surface as the catalyst is reduced in size at the nanoscale. Defects and impurities may be attracted to surfaces and interfaces, and interactions between particles at these small dimensions can depend on the structure and nature of chemical bonding at the surface. Molecular monolayers may be used to change or control surface properties and to mediate the interaction between nanoparticles.

Surfaces and their interactions with molecular structures are basic to all biology. The intersection of nanotechnology and biotechnology offers the possibility of achieving new functions and properties with nanostructured surfaces. In this surface- and interface-dominated regime, biology does an exquisite job of selectively controlling functions through a combination of structure and chemical forces. The transcription of information stored in genes and the selectivity of biochemical reactions based on chemical recognition of complex molecules are examples where interfaces play the key role in establishing nanoscale behaviour. Atomic forces and chemical bonds dominate at these dimensions, while macroscopic effects, such as convection, turbulence, and momentum (inertial forces), are of little consequence.

As discussed in the section Properties at the nanoscale, material properties (electrical, optical, magnetic, mechanical, and chemical) depend on their exact dimensions. This opens the way for development of new and improved materials through manipulation of their nanostructure. Hierarchical assemblies of nanoscale-engineered materials into larger structures, or their incorporation into devices, provide the basis for tailoring radically new materials and machines.

Nature's assemblies point the way to improving structural materials. The often-cited abalone seashell provides a beautiful example of how the combination of a hard, brittle inorganic material with nanoscale structuring and a soft, tough organic material can produce a strong, durable nanocomposite; basically, these nanocomposites are made of calcium carbonate bricks held together by a glycoprotein glue. New engineered materials are emerging, such as polymer-clay nanocomposites, that are not only strong and tough but also lightweight and easier to recycle than conventional reinforced plastics. Such improvements in structural materials are particularly important for the transportation industry, where reduced weight directly translates into improved fuel economy. Other improvements can increase safety or decrease the impact on the environment of fabrication and recycling. Further advances, such as truly smart materials that signal their impending failure or are even able to self-repair flaws, may be possible with composites of the future.

Sensors are central to almost all modern control systems. For example, multiple sensors are used in automobiles for such diverse tasks as engine management, emission control, security, safety, comfort, vehicle monitoring, and diagnostics. While such traditional applications for physical sensing generally rely on microscale sensing devices, the advent of nanoscale materials and structures has led to new electronic, photonic, and magnetic nanosensors, sometimes known as smart dust. Because of their small size, nanosensors exhibit unprecedented speed and sensitivity, extending in some cases down to the detection of single molecules. For example, nanowires made of carbon nanotubes, silicon, or other semiconductor materials exhibit exceptional sensitivity to chemical species or biological agents. Electrical current through nanowires can be altered by having molecules attached to their surface that locally perturb their electronic band structure. By means of nanowire surfaces coated with sensor molecules that selectively attach particular species, charge-induced changes in current can be used to detect the presence of those species. This same strategy is adopted for many classes of sensing systems. New types of sensors with ultrahigh sensitivity and specificity will have many applications; for example, sensors that can detect cancerous tumours when they consist of only a few cells would be a very significant advance.

Nanomaterials also make excellent filters for trapping heavy metals and other pollutants from industrial wastewater. One of the greatest potential impacts of nanotechnology on the lives of the majority of people on Earth will be in the area of economical water desalination and purification. Nanomaterials will very likely find important use in fuel cells, bioconversion for energy, bioprocessing of food products, waste remediation, and pollution-control systems.

A recent concern regarding nanoparticles is whether their small sizes and novel properties may pose significant health or environmental risks. In general, ultrafine particles, such as the carbon in photocopier toners or in soot produced by combustion engines and factories, have adverse respiratory and cardiovascular effects on people and animals. Studies are under way to determine if specific nanoscale particles pose higher risks that may require special regulatory restrictions. Of particular concern are potential carcinogenic risks from inhaled particles and the possibility for very small nanoparticles to cross the blood-brain barrier to unknown effect. Nanomaterials currently receiving attention from health officials include carbon nanotubes, buckyballs, and cadmium selenide quantum dots. Studies of the absorption through the skin of titanium oxide nanoparticles (used in sunscreens) are also planned. More far-ranging studies of the toxicity, transport, and overall fate of nanoparticles in ecosystems and the environment have not yet been undertaken. Some early animal studies, involving the introduction of very high levels of nanoparticles which resulted in the rapid death of many of the subjects, are quite controversial.

Nanotechnology promises to impact medical treatment in multiple ways. First, advances in nanoscale particle design and fabrication provide new options for drug delivery and drug therapies. More than half of the new drugs developed each year are not water-soluble, which makes their delivery difficult. In the form of nanosized particles, however, these drugs are more readily transported to their destination, and they can be delivered in the conventional form of pills.

More important, nanotechnology may enable drugs to be delivered to precisely the right location in the body and to release drug doses on a predetermined schedule for optimal treatment. The general approach is to attach the drug to a nanosized carrier that will release the medicine in the body over an extended period of time or when specifically triggered to do so. In addition, the surfaces of these nanoscale carriers may be treated to seek out and become localized at a disease site, for example, attaching to cancerous tumours. One type of molecule of special interest for these applications is an organic dendrimer. A dendrimer is a special class of polymeric molecule that weaves in and out from a hollow central region. These spherical fuzz balls are about the size of a typical protein but cannot unfold like proteins. Interest in dendrimers derives from the ability to tailor their cavity sizes and chemical properties to hold different therapeutic agents. Researchers hope to design different dendrimers that can swell and release their drug on exposure to specifically recognized molecules that indicate a disease target. This same general approach to nanoparticle-directed drug delivery is being explored for other types of nanoparticles as well.

Another approach involves gold-coated nanoshells whose size can be adjusted to absorb light energy at different wavelengths. In particular, infrared light will pass through several centimetres of body tissue, allowing a delicate and precise heating of such capsules in order to release the therapeutic substance within. Furthermore, antibodies may be attached to the outer gold surface of the shells to cause them to bind specifically to certain tumour cells, thereby reducing the damage to surrounding healthy cells.

A second area of intense study in nanomedicine is that of developing new diagnostic tools. Motivation for this work ranges from fundamental biomedical research at the level of single genes or cells to point-of-care applications for health delivery services. With advances in molecular biology, much diagnostic work now focuses on detecting specific biological signatures. These analyses are referred to as bioassays. Examples include studies to determine which genes are active in response to a particular disease or drug therapy. A general approach involves attaching fluorescing dye molecules to the target biomolecules in order to reveal their concentration.

Another approach to bioassays uses semiconductor nanoparticles, such as cadmium selenide, which emit light of a specific wavelength depending on their size. Different-size particles can be tagged to different receptors so that a wider variety of distinct colour tags are available than can be distinguished for dye molecules. The degradation in fluorescence with repeated excitation for dyes is avoided. Furthermore, various-size particles can be encapsulated in latex beads and their resulting wavelengths read like a bar code. This approach, while still in the exploratory stage, would allow for an enormous number of distinct labels for bioassays.
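The size-dependent emission that makes these particle "colour tags" possible can be sketched with the widely used Brus effective-mass model. All material parameters below (bulk bandgap, effective masses, and dielectric constant for CdSe) are assumed textbook-style values, and the model is only a rough approximation:

```python
import math

H = 6.626e-34        # Planck constant, J*s
M0 = 9.109e-31       # free-electron mass, kg
Q = 1.602e-19        # elementary charge, C (also joules per eV)
EPS0 = 8.854e-12     # vacuum permittivity, F/m
HC_EV_NM = 1239.84   # photon energy (eV) times wavelength (nm)

def cdse_emission_nm(radius_nm: float) -> float:
    """Rough Brus-model emission wavelength for a CdSe dot (assumed params)."""
    r = radius_nm * 1e-9
    e_gap = 1.74                     # assumed bulk CdSe bandgap, eV
    m_e, m_h = 0.13 * M0, 0.45 * M0  # assumed effective masses
    eps_r = 10.6                     # assumed dielectric constant
    confinement = H**2 / (8 * r**2) * (1 / m_e + 1 / m_h) / Q
    coulomb = 1.8 * Q / (4 * math.pi * EPS0 * eps_r * r)
    return HC_EV_NM / (e_gap + confinement - coulomb)

# Smaller dots emit at shorter (bluer) wavelengths:
for radius in (1.5, 2.0, 3.0):
    print(f"radius {radius} nm -> ~{cdse_emission_nm(radius):.0f} nm")
```

Under these assumptions, shrinking the dot radius from 3 nm to 1.5 nm sweeps the emission from the red-orange toward the violet, which is why a set of different-size particles can serve as distinguishable labels.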

Another nanotechnology variation on bioassays is to attach one half of the single-stranded complementary DNA segment for the genetic sequence to be detected to one set of gold particles and the other half to a second set of gold particles. When the material of interest is present in a solution, the two attachments cause the gold balls to agglomerate, providing a large change in optical properties that can be seen in the colour of the solution. If both halves of the sequence do not match, no agglomeration will occur and no change will be observed.
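The agglomeration logic of this assay reduces to a complementarity check: the solution changes colour only when both probe-carrying particle sets find their halves of the target. A toy sketch follows; the sequences and the function names are hypothetical, and strand directionality is ignored for simplicity.

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(seq):
    """Base-by-base complement (strand directionality ignored)."""
    return "".join(COMPLEMENT[base] for base in seq)

def agglomerates(probe_a, probe_b, target):
    """True only if probe_a pairs with the first half of the target
    sequence and probe_b pairs with the second half, so that both sets
    of gold particles bind and the solution changes colour."""
    half = len(target) // 2
    return (complement(target[:half]) == probe_a
            and complement(target[half:]) == probe_b)

print(agglomerates("TACG", "GCAT", "ATGCCGTA"))  # both halves match
print(agglomerates("TACG", "AAAA", "ATGCCGTA"))  # second half mismatches
```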

Approaches that do not involve optical detection techniques are also being explored with nanoparticles. For example, magnetic nanoparticles can be attached to antibodies that in turn recognize and attach to specific biomolecules. The magnetic particles then act as tags and handlebars through which magnetic fields can be used for mixing, extracting, or identifying the attached biomolecules within microlitre- or nanolitre-sized samples. For example, magnetic nanoparticles stay magnetized as a single domain for a significant period, which enables them to be aligned and detected in a magnetic field. In particular, attached antibody-magnetic-nanoparticle combinations rotate slowly and give a distinctive magnetic signal. In contrast, magnetically tagged antibodies that are not attached to the biological material being detected rotate more rapidly and so do not give the same distinctive signal.

Microfluidic systems, or labs-on-chips, have been developed for biochemical assays of minuscule samples. Typically cramming numerous electronic and mechanical components into a portable unit no larger than a credit card, they are especially useful for conducting rapid analysis in the field. While these microfluidic systems primarily operate at the microscale (that is, millionths of a metre), nanotechnology has contributed new concepts and will likely play an increasing role in the future. For example, separation of DNA is sensitive to entropic effects, such as the entropy required to unfold DNA of a given length. A new approach to separating DNA could take advantage of its passage through a nanoscale array of posts or channels such that DNA molecules of different lengths would uncoil at different rates.

Other researchers have focused on detecting signal changes as nanometre-wide DNA strands are threaded through a nanoscale pore. Early studies used pores punched in membranes by viruses; artificially fabricated nanopores are also being tested. By applying an electric potential across the membrane in a liquid cell to pull the DNA through, changes in ion current can be measured as different repeating base units of the molecule pass through the pores. Nanotechnology-enabled advances in the entire area of bioassays will clearly impact health care in many ways, from early detection, rapid clinical analysis, and home monitoring to new understanding of molecular biology and genetic-based treatments for fighting disease.

Another biomedical application of nanotechnology involves assistive devices for people who have lost or lack certain natural capabilities. For example, researchers hope to design retinal implants for vision-impaired individuals. The concept is to implant chips with photodetector arrays to transmit signals from the retina to the brain via the optic nerve. Meaningful spatial information, even if only at a rudimentary level, would be of great assistance to the blind. Such research illustrates the tremendous challenge of designing hybrid systems that work at the interface between inorganic devices and biological systems.

Closely related research involves implanting nanoscale neural probes in brain tissue to activate and control motor functions. This requires effective and stable wiring of many electrodes to neurons. It is exciting because of the possibility of recovery of control for motor-impaired individuals. Studies employing neural stimulation of damaged spinal cords by electrical signals have demonstrated the return of some locomotion. Researchers are also seeking ways to assist in the regeneration and healing of bone, skin, and cartilage, for example by developing synthetic biocompatible or biodegradable structures with nanosized voids that would serve as templates for regenerating specific tissue while delivering chemicals to assist in the repair process. At a more sophisticated level, researchers hope to someday build nanoscale or microscale machines that can repair, assist, or replace more-complex organs.

Semiconductor experts agree that the ongoing shrinkage in conventional electronic devices will inevitably reach fundamental limits due to quantum effects such as tunneling, in which electrons jump out of their prescribed circuit path and create atomic-scale interference between devices. At that point, radically new approaches to data storage and information processing, such as systems based on quantum computing or biomolecular computing, will be required for further advances.

The use of molecules for electronic devices was suggested by Mark Ratner of Northwestern University and Avi Aviram of IBM as early as the 1970s, but proper nanotechnology tools did not become available until the turn of the 21st century. Wiring up molecules some half a nanometre wide and a few nanometres long remains a major challenge, and an understanding of electrical transport through single molecules is only beginning to emerge. A number of groups have been able to demonstrate molecular switches, for example, that could conceivably be used in computer memory or logic arrays. Current areas of research include mechanisms to guide the selection of molecules, architectures for assembling molecules into nanoscale gates, and three-terminal molecules for transistor-like behaviour. More-radical approaches include DNA computing, where single-stranded DNA on a silicon chip would encode all possible variable values and complementary strand interactions would be used for a parallel processing approach to finding solutions. An area related to molecular electronics is that of organic thin-film transistors and light emitters, which promise new applications such as video displays that can be rolled out like wallpaper and flexible electronic newspapers.

Carbon nanotubes have remarkable electronic, mechanical, and chemical properties. Depending on their specific diameter and the bonding arrangement of their carbon atoms, nanotubes exhibit either metallic or semiconducting behaviour. Electrical conduction within a perfect nanotube is ballistic (negligible scattering), with low thermal dissipation. As a result, a wire made from a nanotube, or a nanowire, can carry much more current than an ordinary metal wire of comparable size. At 1.4 nanometres in diameter, nanotubes are about a hundred times smaller than the gate width of silicon semiconductor devices. In addition to nanowires for conduction, transistors, diodes, and simple logic circuits have been demonstrated by combining metallic and semiconductor carbon nanotubes. Similarly, silicon nanowires have been used to build experimental devices, such as field-effect transistors, bipolar transistors, inverters, light-emitting diodes, sensors, and even simple memory. A major challenge for nanowire circuits, as for molecular electronics, is connecting and integrating these devices into a workable high-density architecture. Ideally, the structure would be grown and assembled in place. Crossbar architectures that combine the function of wires and devices are of particular interest.
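The dependence of metallic versus semiconducting behaviour on the bonding arrangement is usually stated through the standard chirality rule for a (n, m) nanotube: the tube is (to first approximation) metallic when n - m is a multiple of 3, and semiconducting otherwise. A minimal sketch, with the graphene lattice constant as the only physical input:

```python
import math

def is_metallic(n, m):
    """A (n, m) carbon nanotube is (to first approximation) metallic
    when n - m is a multiple of 3; otherwise it is semiconducting."""
    return (n - m) % 3 == 0

def diameter_nm(n, m, a=0.246):
    """Nanotube diameter from its chiral indices; a is the graphene
    lattice constant in nanometres."""
    return a * math.sqrt(n * n + n * m + m * m) / math.pi

print(is_metallic(10, 10), round(diameter_nm(10, 10), 2))  # armchair tube, ~1.36 nm
print(is_metallic(10, 0))  # a (10, 0) zigzag tube is semiconducting
```

Reassuringly, the computed (10, 10) diameter of about 1.36 nm is consistent with the roughly 1.4-nanometre figure quoted above.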

At nanoscale dimensions the energy required to add one additional electron to a small island (an isolated physical region), for example through a tunneling barrier, becomes significant. This change in energy provides the basis for devising single-electron transistors. At low temperatures, where thermal fluctuations are small, various single-electron-device nanostructures are readily achievable, and extensive research has been carried out for structures with confined electron flow. However, room-temperature applications will require that sizes be reduced significantly, to the one-nanometre range, to achieve stable operation. For large-scale application with millions of devices, as found in current integrated circuits, the need for structures with very uniform size to maintain uniform device characteristics presents a significant challenge. Also, in this and many new nanodevices being explored, the lack of gain is a serious drawback limiting implementation in large-scale electronic circuits.
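The temperature argument can be made roughly quantitative: modelling the island as an isolated sphere gives a self-capacitance C = 4πε₀r, and stable single-electron behaviour requires the charging energy e²/2C to exceed the thermal energy kT by a comfortable margin. The sketch below is an order-of-magnitude estimate only; the margin factor of 10 is an assumed rule of thumb, not a value from the text.

```python
import math

Q = 1.602e-19     # elementary charge, C
EPS0 = 8.854e-12  # vacuum permittivity, F/m
KB = 1.381e-23    # Boltzmann constant, J/K

def charging_energy_j(radius_nm):
    """e^2 / 2C for an island modelled as an isolated sphere with
    self-capacitance C = 4*pi*eps0*r."""
    c = 4 * math.pi * EPS0 * radius_nm * 1e-9
    return Q**2 / (2 * c)

def stable_at(radius_nm, temp_k, margin=10.0):
    """Crude stability criterion: the charging energy must exceed the
    thermal energy kT by the assumed margin factor."""
    return charging_energy_j(radius_nm) > margin * KB * temp_k

print(stable_at(0.5, 300))   # a ~1-nm island works at room temperature
print(stable_at(50.0, 300))  # a ~100-nm island does not
print(stable_at(50.0, 1.0))  # but it does at cryogenic temperature
```

This is consistent with the text: larger islands suffice at low temperature, while room-temperature operation pushes sizes toward the one-nanometre range.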

Spintronics refers to electronic devices that perform logic operations based on not just the electrical charge of carriers but also their spin. For example, information could be transported or stored through the spin-up or spin-down states of electrons. This is a new area of research, and issues include the injection of spin-polarized carriers, their transport, and their detection. The role of nanoscale structure and electronic properties of the ferromagnetic-semiconductor interface on the spin injection process, the growth of new ferromagnetic semiconductors with nanoscale control, and the possible use of nanostructured features to manipulate spin are all of interest.

Current approaches to information storage and retrieval include high-density, high-speed, solid-state electronic memories, as well as slower (but generally more spacious) magnetic and optical discs (see computer memory). As the minimum feature size for electronic processing approaches 100 nanometres, nanotechnology provides ways to decrease further the bit size of the stored information, thus increasing density and reducing interconnection distances for obtaining still-higher speeds. For example, the basis of the current generation of magnetic disks is the giant magnetoresistance effect. A magnetic read/write head stores bits of information by setting the direction of the magnetic field in nanometre-thick metallic layers that alternate between ferromagnetic and nonferromagnetic. Differences in spin-dependent scattering of electrons at the interface layers lead to resistance differences that can be read by the magnetic head. Mechanical properties, particularly tribology (friction and wear of moving surfaces), also play an important role in magnetic hard disk drives, since magnetic heads float only about 10 nanometres above spinning magnetic disks.

Another approach to information storage that is dependent on designing nanometre-thick magnetic layers is under commercial development. Known as magnetic random access memory (MRAM), a line of electrically switchable magnetic material is separated from a permanently magnetized layer by a nanoscale nonmagnetic interlayer. A resistance change that depends on the relative alignment of the fields is read electrically from a large array of wires through cross lines. MRAM will require a relatively small evolution from conventional semiconductor manufacturing, and it has the added benefit of producing nonvolatile memory (no power or batteries are needed to maintain stored memory states).

Still at an exploratory stage, studies of electrical conduction through molecules have generated interest in their possible use as memory. While still very speculative, molecular and nanowire approaches to memory are intriguing because of the small volume in which the bits of memory are stored and the effectiveness with which biological systems store large amounts of information.

Nanoscale structuring of optical devices, such as vertical-cavity surface-emitting lasers (VCSELs), quantum dot lasers, and photonic crystal materials, is leading to additional advances in communications technology.

VCSELs have nanoscale layers of compound semiconductors epitaxially grown into their structure: alternating dielectric layers that act as mirrors, and quantum wells. Quantum wells allow the charge carriers to be confined in well-defined regions and provide the energy conversion into light at desired wavelengths. They are placed in the laser's cavity to confine carriers at the nodes of a standing wave and to tailor the band structure for more efficient radiative recombination. One-dimensional nanotechnology techniques involving precise growth of very thin epitaxial semiconductor layers were developed during the 1990s. Such nanostructuring has enhanced the efficiency of VCSELs and reduced the current required for lasing to start (called the threshold current). Because of improving performance and their compatibility with planar manufacturing technology, VCSELs are fast becoming a preferred laser source in a variety of communications applications.

More recently, the introduction of quantum dots (regions so small that they can be given a single electric charge) into semiconductor lasers has been investigated and found to give additional benefits: both further reductions in threshold current and narrower line widths. Quantum dots further confine the optical emission modes within a very narrow spectrum and give the lowest threshold current densities for lasing achieved to date in VCSELs. The quantum dots are introduced into the laser during the growth of strained layers, by a process called Stranski-Krastanov growth. They arise because of the lattice-mismatch stress and surface tension of the growing film. Improvements in ways to control the resulting quantum dots precisely to a more uniform single size are still being sought.

Photonic crystals provide a new means to control the steering and manipulation of photons based on periodic dielectric lattices with repeat dimensions on the order of the wavelength of light. These materials can have very exotic properties, such as not allowing light within certain wavelengths to be propagated in a material based on the particular periodic structure. Photonic lattices can act as perfect wavelength-selective mirrors to reflect back incident light from all orientations. They provide the basis for optical switching, steering, and wavelength separation on unprecedented small scales. The periodic structures required for these artificial crystals can be configured as both two- and three-dimensional lattices. Optical sources, switches, and routers are being considered, with two-dimensional planar geometries receiving the most attention, because of their greater ease of fabrication.

Another potentially important communications application for nanotechnology is microelectromechanical systems (MEMS), devices sized at the micrometre level (millionths of a metre). MEMS are currently poised to have a major impact on communications via optical switching. In the future, electromechanical devices may shrink to nanodimensions to take advantage of the higher frequencies of mechanical vibration at smaller masses. The natural (resonant) frequency of vibration for small mechanical beams increases as their size decreases, so that little power is needed to drive them as oscillators. Their efficiency is rated by a quality factor, known as Q, which is a ratio of the energy stored per cycle versus the energy dissipated per cycle. The higher the Q, the more precise the absolute frequency of an oscillator. The Q is very high for micro- and nanoscale mechanical oscillators, and these devices can reach very high frequencies (up to microwave frequencies), making them potential low-power replacements for electronic-based oscillators and filters.
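The claim that smaller beams resonate faster can be illustrated with the textbook formula for the fundamental flexural mode of a doubly clamped beam, f ≈ 1.03 (t/L²)√(E/ρ). The sketch below uses rough silicon values for Young's modulus and density as assumed defaults; the specific dimensions are illustrative.

```python
import math

def beam_frequency_hz(thickness_m, length_m, youngs_modulus=169e9, density=2330.0):
    """Fundamental flexural frequency of a doubly clamped beam,
    f ~ 1.03 * (t / L^2) * sqrt(E / rho); defaults are rough silicon values."""
    return 1.03 * (thickness_m / length_m**2) * math.sqrt(youngs_modulus / density)

f_micro = beam_frequency_hz(1e-6, 100e-6)  # 1-um-thick, 100-um-long beam
f_nano = beam_frequency_hz(100e-9, 1e-6)   # 100-nm-thick, 1-um-long beam
print(f"{f_micro:.3g} Hz vs {f_nano:.3g} Hz")
```

Because f scales as t/L², shrinking every dimension by a factor of 100 raises the frequency a thousandfold, carrying the nanoscale beam from the sub-megahertz range up toward the microwave frequencies mentioned above.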

Mechanical oscillators have been made from silicon at dimensions of 10 to 100 nanometres, where more than 10 percent of the atoms are less than one atomic distance from the surface. While highly homogeneous materials can be made at these dimensions (for example, single-crystal silicon bars), surfaces play an increasing role at nanoscales, and energy losses increase, presumably because of surface defects and molecular species adsorbed on surfaces.

It is possible to envision even higher frequencies, in what might be viewed as the ultimate in nanomechanical systems, by moving from nanomachined structures to molecular systems. As an example, multiwalled carbon nanotubes are being explored for their mechanical properties. When the ends of the outer nanotube are removed, the inner tube may be pulled partway out from the outer tube where van der Waals forces between the two tubes will supply a restoring force. The inner tube can thus oscillate, sliding back and forth inside the outer tube. The resonant frequency of oscillation for such structures is predicted to be above one gigahertz (one billion cycles per second). It is unknown whether connecting such systems to the macro world and protecting them from surface effects will ever be practical.

Source: Nanotechnology | Britannica.com

Nanotechnology – Wikipedia

Nanotechnology (“nanotech”) is manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form “nanotechnologies” as well as “nanoscale technologies” to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. Through 2012, the USA has invested $3.7 billion using its National Nanotechnology Initiative, the European Union has invested $1.2 billion, and Japan has invested $750 million.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, energy storage,[4][5] microfabrication,[6] molecular engineering, etc.[7] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly,[8] from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era.

First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds; the instrument was successfully used to manipulate individual atoms in 1989. The microscope’s developers, Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, received a Nobel Prize in Physics in 1986.[10][11] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[12][13] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[14] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[15]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[16][17]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[18][19] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12 to 0.15 nm, and a DNA double helix has a diameter of around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, with a kinetic diameter of approximately a quarter of a nm) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[20] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[21]
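The marble analogy can be checked with a line of arithmetic; the marble (~1 cm) and Earth (~12,700 km) diameters below are assumed round figures.

```python
marble_m = 0.01    # assumed marble diameter: ~1 cm
earth_m = 1.27e7   # assumed Earth diameter: ~12,700 km
nm_per_m = 1e-9

# The marble-to-Earth ratio lands within a factor of ~1.3 of the
# nanometre-to-metre ratio, so the comparison holds up.
ratio = marble_m / earth_m
print(ratio)
```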

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[22] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[22]

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[23] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[24]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface-area-to-volume ratio, which alters the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
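For a spherical particle the surface-area-to-volume effect reduces to the ratio 3/r, so shrinking a particle from a micrometre to a nanometre multiplies the ratio by a thousand. A one-line sketch:

```python
def surface_to_volume(radius_nm):
    """Surface-area-to-volume ratio of a sphere,
    (4*pi*r^2) / ((4/3)*pi*r^3) = 3 / r, in units of 1/nm."""
    return 3.0 / radius_nm

print(surface_to_volume(1000.0))  # 1-um particle: 0.003 per nm
print(surface_to_volume(1.0))     # 1-nm particle: 3.0 per nm, a 1000-fold increase
```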

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[25]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson-Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson-Crick base-pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogies of traditional machine components demonstrated that molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[26] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[27] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[28] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[29] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[30] and a nanoelectromechanical relaxation oscillator.[31] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[34]

Bottom-up approaches seek to arrange smaller components into more complex assemblies.

Top-down approaches seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than on the details of how such inventions could actually be created.

Nanomaterials can be classified as 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological properties. As dimensionality decreases, the surface-to-volume ratio increases, meaning that lower-dimensional nanomaterials have a higher surface area than 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.
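The surface-to-volume scaling described above can be made concrete with a short calculation. The sketch below is illustrative only (the function name and the chosen radii are this example's, not the source's); it shows that for a sphere the ratio grows as 1/r as the radius shrinks toward the nanoscale:

```python
# Surface-area-to-volume ratio of a sphere: (4*pi*r^2) / (4/3*pi*r^3) = 3/r,
# so halving the radius doubles the ratio.
def sa_to_vol_ratio(radius_nm: float) -> float:
    """Surface area divided by volume for a sphere; units: 1/nm."""
    return 3.0 / radius_nm

# From a bulk-like sphere down to a 1 nm nanoparticle:
for r in (1000.0, 100.0, 10.0, 1.0):
    print(f"r = {r:7.1f} nm  ->  SA/V = {sa_to_vol_ratio(r):.3f} nm^-1")
```

The thousand-fold increase in SA/V across these radii is the geometric reason nanoscale materials expose so much more reactive surface per unit of material.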

There are several important modern developments. The atomic force microscope (AFM) and the Scanning Tunneling Microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[52][53] However, this is still a slow process because of the low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography or nanoimprint lithography were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era; they are extensions of earlier scientific advances rather than techniques devised with the sole purpose of creating nanotechnology or produced by nanotechnology research.[54]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[52][53] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

New therapeutic products based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[55]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[17] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[56] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[16]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[57] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[58] Nanotechnology is also being used to build structures for on-chip computing with light, for example on-chip optical quantum information processing and picosecond transmission of information.[59]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[60] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[61]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[62] Platinum is used in both the reduction and the oxidation catalysts.[63] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst’s surface area that is exposed to the exhaust fumes is maximized, the efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[64]
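The geometric intuition behind maximizing catalyst surface area can be sketched numerically. Assuming, purely for illustration (these figures are not from the project described above), that a fixed volume of platinum is divided into equal spheres, the total exposed area grows as the cube root of the particle count, which is why preventing nanoparticles from merging preserves catalytic efficiency:

```python
import math

# Split a fixed catalyst volume into n equal spheres and compute the
# total exposed surface area. Each small sphere has volume V/n, so its
# radius is (3*(V/n)/(4*pi))**(1/3); total area = n * 4*pi*r^2, which
# works out to n**(1/3) times the area of a single sphere of volume V.
def total_surface_area(volume: float, n_particles: int) -> float:
    r = (3.0 * (volume / n_particles) / (4.0 * math.pi)) ** (1.0 / 3.0)
    return n_particles * 4.0 * math.pi * r * r

v = 1.0  # arbitrary volume units
one = total_surface_area(v, 1)
many = total_surface_area(v, 1000)
print(f"1000 particles expose {many / one:.1f}x the area of one particle")
```

Conversely, if two particles merge, the same volume exposes less area, which is the surface-energy effect the researchers are trying to suppress.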

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell’s microenvironment to direct its differentiation down a suitable lineage.[65] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[66]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[67]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[68][69]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[70] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[71]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[72]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[73] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[74] Cambridge, Massachusetts in 2008 considered enacting a similar law,[75] but ultimately rejected it.[76] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[77] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly.

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[78] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[79] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[80][81]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[82]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the “nanotechnology revolution”, could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said “We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully.”[83] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[84] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[85][86][87][88]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[89] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by “bolting on” nanotechnology to existing regulations, but there are clear gaps in these regimes.[90] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[91]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[92] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[93] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[94][95]

The Royal Society report[14] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently, depending on application with participants in public deliberations more positive about nanotechnologies for energy than health applications suggesting that any public calls for nano regulations may differ by technology sector.[72]

See the article here:

Nanotechnology – Wikipedia

History of nanotechnology – Wikipedia

The history of nanotechnology traces the development of the concepts and experimental work falling under the broad category of nanotechnology. Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time. The emergence of nanotechnology in the 1980s was caused by the convergence of experimental advances such as the invention of the scanning tunneling microscope in 1981 and the discovery of fullerenes in 1985, with the elucidation and popularization of a conceptual framework for the goals of nanotechnology beginning with the 1986 publication of the book Engines of Creation. The field was subject to growing public awareness and controversy in the early 2000s, with prominent debates about both its potential implications as well as the feasibility of the applications envisioned by advocates of molecular nanotechnology, and with governments moving to promote and fund research into nanotechnology. The early 2000s also saw the beginnings of commercial applications of nanotechnology, although these were limited to bulk applications of nanomaterials rather than the transformative applications envisioned by the field.

The earliest evidence of the use and applications of nanotechnology can be traced back to carbon nanotubes, cementite nanowires found in the microstructure of wootz steel manufactured in ancient India from the time period of 600 BC and exported globally.[1]

Although nanoparticles are associated with modern science, they were used by artisans as far back as the ninth century in Mesopotamia for creating a glittering effect on the surface of pots.[2][3]

In modern times, pottery from the Middle Ages and Renaissance often retains a distinct gold- or copper-colored metallic glitter. This luster is caused by a metallic film that was applied to the transparent surface of a glazing, which contains silver and copper nanoparticles dispersed homogeneously in the glassy matrix of the ceramic glaze. These nanoparticles are created by the artisans by adding copper and silver salts and oxides together with vinegar, ochre, and clay on the surface of previously-glazed pottery. The technique originated in the Muslim world. As Muslims were not allowed to use gold in artistic representations, they sought a way to create a similar effect without using real gold. The solution they found was using luster.[3][4]

The American physicist Richard Feynman delivered the lecture “There’s Plenty of Room at the Bottom” at an American Physical Society meeting at Caltech on December 29, 1959, which is often held to have provided inspiration for the field of nanotechnology. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and Van der Waals attraction would become more important.[5]
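The scaling argument Feynman noted can be illustrated with a toy calculation. In this sketch (illustrative, with normalized units; not taken from the lecture itself), volume-dependent forces such as gravity shrink as the cube of the length scale while surface-dependent forces shrink only as the square, so their ratio grows as devices get smaller:

```python
# Toy scaling model: shrink a machine by a factor s (0 < s <= 1).
# Gravity scales with mass ~ s**3; surface forces (adhesion, surface
# tension acting over an area) scale ~ s**2, so the surface/gravity
# ratio grows as 1/s at small scales.
def force_ratio(scale: float) -> float:
    """(surface force / gravitational force), normalized to 1 at scale = 1."""
    return (scale ** 2) / (scale ** 3)

for s in (1.0, 1e-3, 1e-6, 1e-9):
    print(f"shrink by {s:g}: surface/gravity advantage = {force_ratio(s):g}x")
```

At micrometer and nanometer scales the surface-force advantage reaches a million-fold and beyond, which is why adhesion and Van der Waals attraction dominate the behavior of nanoscale machinery.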

After Feynman’s death, scholars studying the historical development of nanotechnology have concluded that his actual role in catalyzing nanotechnology research was limited, based on recollections from many of the people active in the nascent field in the 1980s and 1990s. Chris Toumey, a cultural anthropologist at the University of South Carolina, found that the published versions of Feynman’s talk had a negligible influence in the twenty years after it was first published, as measured by citations in the scientific literature, and not much more influence in the decade after the scanning tunneling microscope was invented in 1981. Subsequently, interest in “Plenty of Room” in the scientific literature greatly increased in the early 1990s. This is probably because the term “nanotechnology” gained serious attention just before that time, following its use by K. Eric Drexler in his 1986 book, Engines of Creation: The Coming Era of Nanotechnology, which took the Feynman concept of a billion tiny factories and added the idea that they could make more copies of themselves via computer control instead of control by a human operator; and in a cover article headlined “Nanotechnology”,[6][7] published later that year in a mass-circulation science-oriented magazine, OMNI. Toumey’s analysis also includes comments from distinguished scientists in nanotechnology who say that “Plenty of Room” did not influence their early work, and in fact most of them had not read it until a later date.[8][9]

These and other developments hint that the retroactive rediscovery of Feynman’s “Plenty of Room” gave nanotechnology a packaged history that provided an early date of December 1959, plus a connection to the charisma and genius of Richard Feynman. Feynman’s stature as a Nobel laureate and as an iconic figure in 20th-century science surely helped advocates of nanotechnology and provided a valuable intellectual link to the past.[10]

The Japanese scientist Norio Taniguchi of Tokyo University of Science was the first to use the term “nano-technology”, in a 1974 conference paper,[11] to describe semiconductor processes such as thin film deposition and ion beam milling exhibiting characteristic control on the order of a nanometer. His definition was, “‘Nano-technology’ mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or one molecule.” However, the term was not used again until 1981, when Eric Drexler, who was unaware of Taniguchi’s prior use of the term, published his first paper on nanotechnology.[12][13][14]

In the 1980s the idea of nanotechnology as a deterministic, rather than stochastic, handling of individual atoms and molecules was conceptually explored in depth by K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and two influential books.

In 1980, Drexler encountered Feynman’s provocative 1959 talk “There’s Plenty of Room at the Bottom” while preparing his initial scientific paper on the subject, Molecular Engineering: An approach to the development of general capabilities for molecular manipulation, published in the Proceedings of the National Academy of Sciences in 1981.[15] The term “nanotechnology” (which paralleled Taniguchi’s “nano-technology”) was independently applied by Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity. He also first published the term “grey goo” to describe what might happen if a hypothetical self-replicating machine, capable of independent operation, were constructed and released. Drexler’s vision of nanotechnology is often called “Molecular Nanotechnology” (MNT) or “molecular manufacturing.”

His 1991 Ph.D. work at the MIT Media Lab was the first doctoral degree on the topic of molecular nanotechnology and (after some editing) his thesis, “Molecular Machinery and Manufacturing with Applications to Computation,”[16] was published as Nanosystems: Molecular Machinery, Manufacturing, and Computation,[17] which received the Association of American Publishers award for Best Computer Science Book of 1992. Drexler founded the Foresight Institute in 1986 with the mission of “Preparing for nanotechnology.” Drexler is no longer a member of the Foresight Institute.[citation needed]

Nanotechnology and nanoscience got a boost in the early 1980s with two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1985 and the structural assignment of carbon nanotubes a few years later.

The scanning tunneling microscope, an instrument for imaging surfaces at the atomic level, was developed in 1981 by Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, for which they were awarded the Nobel Prize in Physics in 1986.[18][19] Binnig, Calvin Quate and Christoph Gerber invented the first atomic force microscope in 1986. The first commercially available atomic force microscope was introduced in 1989.

IBM researcher Don Eigler was the first to manipulate atoms using a scanning tunneling microscope in 1989. He used 35 xenon atoms to spell out the IBM logo.[20] He shared the 2010 Kavli Prize in Nanoscience for this work.[21]

Interface and colloid science had existed for nearly a century before they became associated with nanotechnology.[22][23] The first observations and size measurements of nanoparticles had been made during the first decade of the 20th century by Richard Adolf Zsigmondy, winner of the 1925 Nobel Prize in Chemistry, who made a detailed study of gold sols and other nanomaterials with sizes down to 10nm using an ultramicroscope which was capable of visualizing particles much smaller than the light wavelength.[24] Zsigmondy was also the first to use the term “nanometer” explicitly for characterizing particle size. In the 1920s, Irving Langmuir, winner of the 1932 Nobel Prize in Chemistry, and Katharine B. Blodgett introduced the concept of a monolayer, a layer of material one molecule thick. In the early 1950s, Derjaguin and Abrikosova conducted the first measurement of surface forces.[25]

In 1974 the process of atomic layer deposition for depositing uniform thin films one atomic layer at a time was developed and patented by Tuomo Suntola and co-workers in Finland.[26]

In another development, the synthesis and properties of semiconductor nanocrystals were studied. This led to a rapidly increasing number of studies of semiconductor nanoparticles known as quantum dots.

Fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry. Smalley’s research in physical chemistry investigated formation of inorganic and semiconductor clusters using pulsed molecular beams and time-of-flight mass spectrometry. As a consequence of this expertise, Curl introduced him to Kroto in order to investigate a question about the constituents of astronomical dust. These are carbon-rich grains expelled by old stars such as R Coronae Borealis. The result of this collaboration was the discovery of C60 and the fullerenes as the third allotropic form of carbon. Subsequent discoveries included the endohedral fullerenes, and the larger family of fullerenes the following year.[27][28]

The discovery of carbon nanotubes is largely attributed to Sumio Iijima of NEC in 1991, although carbon nanotubes have been produced and observed under a variety of conditions prior to 1991.[29] Iijima’s discovery of multi-walled carbon nanotubes in the insoluble material of arc-burned graphite rods in 1991[30] and Mintmire, Dunlap, and White’s independent prediction that if single-walled carbon nanotubes could be made, then they would exhibit remarkable conducting properties[31] helped create the initial buzz that is now associated with carbon nanotubes. Nanotube research accelerated greatly following the independent discoveries[32][33] by Bethune at IBM[34] and Iijima at NEC of single-walled carbon nanotubes and methods to specifically produce them by adding transition-metal catalysts to the carbon in an arc discharge.

In the early 1990s Huffman and Krätschmer, of the University of Arizona, discovered how to synthesize and purify large quantities of fullerenes. This opened the door to their characterization and functionalization by hundreds of investigators in government and industrial laboratories. Shortly after, rubidium-doped C60 was found to be a mid-temperature (Tc = 32 K) superconductor. At a meeting of the Materials Research Society in 1992, Dr. T. Ebbesen (NEC) described to a spellbound audience his discovery and characterization of carbon nanotubes. This event sent those in attendance and others downwind of his presentation into their laboratories to reproduce and push those discoveries forward. Using the same or similar tools as those used by Huffman and Krätschmer, hundreds of researchers further developed the field of nanotube-based nanotechnology.

The National Nanotechnology Initiative is a United States federal nanotechnology research and development program. The NNI serves as “the central point of communication, cooperation, and collaboration for all Federal agencies engaged in nanotechnology research, bringing together the expertise needed to advance this broad and complex field.”[35] Its goals are to advance a world-class nanotechnology research and development (R&D) program, foster the transfer of new technologies into products for commercial and public benefit, develop and sustain educational resources, a skilled workforce, and the supporting infrastructure and tools to advance nanotechnology, and support responsible development of nanotechnology. The initiative was spearheaded by Mihail Roco, who formally proposed the National Nanotechnology Initiative to the Office of Science and Technology Policy during the Clinton administration in 1999, and was a key architect in its development. He is currently the Senior Advisor for Nanotechnology at the National Science Foundation, as well as the founding chair of the National Science and Technology Council subcommittee on Nanoscale Science, Engineering and Technology.[36]

President Bill Clinton advocated nanotechnology development. In a 21 January 2000 speech[37] at the California Institute of Technology, Clinton said, “Some of our research goals may take twenty or more years to achieve, but that is precisely why there is an important role for the federal government.” Feynman’s stature and concept of atomically precise fabrication played a role in securing funding for nanotechnology research, as mentioned in President Clinton’s speech:

My budget supports a major new National Nanotechnology Initiative, worth $500 million. Caltech is no stranger to the idea of nanotechnology, the ability to manipulate matter at the atomic and molecular level. Over 40 years ago, Caltech’s own Richard Feynman asked, “What would happen if we could arrange the atoms one by one the way we want them?”[38]

President George W. Bush further increased funding for nanotechnology. On December 3, 2003 Bush signed into law the 21st Century Nanotechnology Research and Development Act,[39] which authorizes expenditures for five of the participating agencies totaling US$3.63 billion over four years.[40] The NNI budget supplement for Fiscal Year 2009 provides $1.5 billion to the NNI, reflecting steady growth in the nanotechnology investment.[41]

“Why the future doesn’t need us” is an article written by Bill Joy, then Chief Scientist at Sun Microsystems, in the April 2000 issue of Wired magazine. In the article, he argues that “Our most powerful 21st-century technologies, robotics, genetic engineering, and nanotech, are threatening to make humans an endangered species.” Joy argues that these developing technologies pose a much greater danger to humanity than any technology before them. In particular, he focuses on genetics, nanotechnology and robotics. He argues that 20th-century technologies of destruction, such as the nuclear bomb, were limited to large governments, due to the complexity and cost of such devices, as well as the difficulty in acquiring the required materials. He also voices concern about increasing computer power. His worry is that computers will eventually become more intelligent than we are, leading to such dystopian scenarios as robot rebellion. He notably quotes the Unabomber on this topic. After the publication of the article, Bill Joy suggested assessing technologies to gauge their implicit dangers, as well as having scientists refuse to work on technologies that have the potential to cause harm.

In the AAAS Science and Technology Policy Yearbook 2001 article titled A Response to Bill Joy and the Doom-and-Gloom Technofuturists, Bill Joy was criticized for having technological tunnel vision on his prediction, by failing to consider social factors.[42] In Ray Kurzweil’s The Singularity Is Near, he questioned the regulation of potentially dangerous technology, asking “Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk that these same technologies may someday be used for malevolent purposes?”.

Prey is a 2002 novel by Michael Crichton which features an artificial swarm of nanorobots which develop intelligence and threaten their human inventors. The novel generated concern within the nanotechnology community that the novel could negatively affect public perception of nanotechnology by creating fear of a similar scenario in real life.[43]

Richard Smalley, best known for co-discovering the soccer ball-shaped buckyball molecule and a leading advocate of nanotechnology and its many applications, was an outspoken critic of the idea of molecular assemblers, as advocated by Eric Drexler. He introduced scientific objections to them, attacking the notion of universal assemblers in a 2001 Scientific American article,[44] leading to a rebuttal later that year from Drexler and colleagues,[45] and eventually to an exchange of open letters in 2003.[46]

Smalley criticized Drexler’s work on nanotechnology as naive, arguing that chemistry is extremely complicated, that reactions are hard to control, and that a universal assembler is science fiction: such assemblers, he believed, were not physically possible. His two principal technical objections, which he termed the “fat fingers problem” and the “sticky fingers problem”, argued against the feasibility of molecular assemblers being able to precisely select and place individual atoms. He also believed that Drexler’s speculation about the apocalyptic dangers of molecular assemblers threatened public support for the development of nanotechnology.

Smalley first argued that “fat fingers” made molecular nanotechnology (MNT) impossible. He later argued that nanomachines would have to resemble chemical enzymes more than Drexler’s assemblers and could only work in water. He believed these constraints would exclude the possibility of “molecular assemblers” that worked by precisely picking and placing individual atoms. Smalley also argued that nearly all of modern chemistry involves reactions that take place in a solvent (usually water), because the small molecules of a solvent contribute many things, such as lowering binding energies for transition states. Since nearly all known chemistry requires a solvent, Smalley felt that Drexler’s proposal to use a high-vacuum environment was not feasible.

To address the debate between Drexler and Smalley regarding molecular assemblers, Chemical & Engineering News published a point-counterpoint consisting of an exchange of letters that addressed the issues.[46]

Drexler and coworkers responded to these two issues in a 2001 publication.[45] They noted that Drexler never proposed universal assemblers able to make absolutely anything, but instead proposed more limited assemblers able to make a very wide variety of things, and they challenged the relevance of Smalley’s arguments to the more specific proposals advanced in Nanosystems. Drexler maintained that both objections were straw man arguments; in the case of enzymes, Prof. Klibanov had written in 1994, “…using an enzyme in organic solvents eliminates several obstacles…”[47] Drexler also addressed this in Nanosystems by showing mathematically that well-designed catalysts can provide the effects of a solvent and can fundamentally be made even more efficient than a solvent/enzyme reaction could ever be. Drexler had difficulty getting Smalley to respond, but in December 2003, Chemical & Engineering News carried a four-part debate.[46]

Ray Kurzweil devotes four pages of his book The Singularity Is Near to showing that Richard Smalley’s arguments are not valid, disputing them point by point. Kurzweil ends by stating that Drexler’s visions are practicable and are even being realized already.[48]

The Royal Society and Royal Academy of Engineering’s 2004 report on the implications of nanoscience and nanotechnologies[49] was inspired by Prince Charles’ concerns about nanotechnology, including molecular manufacturing. However, the report spent almost no time on molecular manufacturing.[50] In fact, the word “Drexler” appears only once in the body of the report (in passing), and “molecular manufacturing” or “molecular nanotechnology” not at all. The report covers various risks of nanoscale technologies, such as nanoparticle toxicology, and also provides a useful overview of several nanoscale fields. It contains an annex (appendix) on grey goo, which cites a weaker variation of Richard Smalley’s contested argument against molecular manufacturing. It concludes that there is no evidence that autonomous, self-replicating nanomachines will be developed in the foreseeable future, and suggests that regulators should be more concerned with issues of nanoparticle toxicology.

The early 2000s saw the beginnings of the use of nanotechnology in commercial products, although most applications are limited to the bulk use of passive nanomaterials. Examples include titanium dioxide and zinc oxide nanoparticles in sunscreen, cosmetics and some food products; silver nanoparticles in food packaging, clothing, disinfectants and household appliances such as Silver Nano; carbon nanotubes for stain-resistant textiles; and cerium oxide as a fuel catalyst.[51] As of March 10, 2011, the Project on Emerging Nanotechnologies estimated that over 1300 manufacturer-identified nanotech products were publicly available, with new ones hitting the market at a pace of 3–4 per week.[52]

The National Science Foundation funded researcher David Berube to study the field of nanotechnology. His findings are published in the monograph Nano-Hype: The Truth Behind the Nanotechnology Buzz. This study concludes that much of what is sold as nanotechnology is in fact a recasting of straightforward materials science, which is leading to a “nanotech industry built solely on selling nanotubes, nanowires, and the like” that “will end up with a few suppliers selling low-margin products in huge volumes.” Further applications which require actual manipulation or arrangement of nanoscale components await further research. Though technologies branded with the term “nano” are sometimes little related to and fall far short of the most ambitious and transformative technological goals of the sort found in molecular manufacturing proposals, the term still connotes such ideas. According to Berube, there may be a danger that a “nano bubble” will form, or is forming already, from the use of the term by scientists and entrepreneurs to garner funding, regardless of interest in the transformative possibilities of more ambitious and far-sighted work.[53]

History of nanotechnology – Wikipedia

Institute of Bioengineering and Nanotechnology

Professor Jonathan Clayden, School of Chemistry, University of Bristol, UK

Tuesday, January 23, 2018 9:00 am to 10:00 am

Discovery Theatrette, Level 4, The Matrix, 30 Biopolis Street, Biopolis

Abstract: Biology solves the problem of communicating information through cell membranes by means of conformationally switchable proteins, of which the most important are the G-protein coupled receptors (GPCRs). The lecture will describe the design and synthesis of dynamic foldamers as artificial mimics of GPCRs, with the ultimate aim of controlling function in the interior of an artificial vesicle. Techniques that allow detailed dynamic conformational information to be extracted both in solution and in the membrane phase will be described.

About the Speaker: Jonathan Clayden was born in Uganda in 1968, grew up in the county of Essex in the East of England, and was an undergraduate at Churchill College, Cambridge. In 1992 he completed a PhD at the University of Cambridge with Dr Stuart Warren. After postdoctoral work with Professor Marc Julia at the École Normale Supérieure in Paris, he moved in 1994 to Manchester as a lecturer. In 2001 he was promoted to full professor, and in 2015 he moved to a position as Professor of Chemistry at the University of Bristol.

His research interests encompass various areas of synthesis and stereochemistry, particularly where conformation has a role to play: asymmetric synthesis, atropisomerism, organolithium chemistry, long-range stereocontrol. He has pioneered the field of dynamic foldamer chemistry for the synthesis of artificial molecules with biomimetic function.

He is a co-author of the widely used textbook Organic Chemistry, and his book Organolithiums: Selectivity for Synthesis was published by Pergamon in 2002.

He has received the Royal Society of Chemistry’s Meldola (1997) and Corday–Morgan (2003) medals, Stereochemistry Prize (2005), Hickinbottom Fellowship (2006) and Merck Prize (2011), and the Novartis Young European Investigator Award (2004). He held senior research fellowships from the Leverhulme Trust and the Royal Society in 2003–4 and 2009–10, and has held a Royal Society Wolfson Research Merit award and a European Research Council Advanced Investigator Grant (€2.5M).

This seminar is free and no registration is required.


Nanotechnology – Wikipedia

Nanotechnology (“nanotech”) is manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form “nanotechnologies” as well as “nanoscale technologies” to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. Through 2012, the USA has invested $3.7 billion using its National Nanotechnology Initiative, the European Union has invested $1.2 billion, and Japan has invested $750 million.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, energy storage,[4][5] microfabrication,[6] molecular engineering, etc.[7] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly,[8] from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in modern era.

First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds, and the instrument was successfully used to manipulate individual atoms in 1989. The microscope’s developers Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory received a Nobel Prize in Physics in 1986.[10][11] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[12][13] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes, and sometimes “Bucky tubes”) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[14] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[15]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[16][17]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[18][19] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon–carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter of around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, with a kinetic diameter of approximately a quarter of a nanometer) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[20] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[21]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[22] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[22]
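The marble analogy above can be checked with back-of-the-envelope arithmetic. The figures in this sketch (a marble of roughly 1 cm diameter, Earth's mean diameter of roughly 12,742 km) are illustrative round numbers of our choosing, not values from the cited source:

```python
# Back-of-the-envelope check of the nanometer-to-meter vs
# marble-to-Earth analogy, using rough illustrative numbers.
nanometer = 1e-9       # meters
meter = 1.0

marble = 1e-2          # meters: assumed ~1 cm marble diameter
earth = 1.2742e7       # meters: mean Earth diameter

ratio_nm = nanometer / meter    # 1e-9
ratio_marble = marble / earth   # ~7.8e-10

# The two ratios agree to within a factor of ~1.3, which is
# all the analogy claims.
print(f"{ratio_nm:.1e} vs {ratio_marble:.1e}")
```

The analogy holds to within an order of magnitude, which is as much precision as such comparisons are meant to convey.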

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[23] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[24]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[25]
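The surface-area-to-volume effect mentioned above is easy to quantify for an idealized spherical particle, where the ratio reduces algebraically to 3/r. The particle sizes below are arbitrary examples chosen for illustration:

```python
import math

def surface_to_volume(radius_m: float) -> float:
    """Surface-area-to-volume ratio of a sphere; simplifies to 3/r."""
    area = 4 * math.pi * radius_m ** 2          # m^2
    volume = (4 / 3) * math.pi * radius_m ** 3  # m^3
    return area / volume                        # m^-1

# Compare a 10 nm nanoparticle with a 1 mm grain of the same material:
nano = surface_to_volume(10e-9)   # ~3.0e8 per meter
macro = surface_to_volume(1e-3)   # ~3.0e3 per meter
print(nano / macro)               # the nanoparticle exposes ~100,000x
                                  # more surface per unit volume
```

This hundred-thousand-fold increase in exposed surface is why properties dominated by surface chemistry, such as catalysis, can change so dramatically at the nanoscale.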

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base-pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.
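As a toy illustration of molecular recognition through complementarity, the Watson–Crick pairing rules can be written as a lookup table. The function names here are our own, and real strand hybridization (antiparallel orientation, partial-mismatch binding) is far richer than this sketch:

```python
# Watson-Crick rules: each base is complementary to exactly one partner.
WATSON_CRICK = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the base-by-base complementary strand (orientation ignored)."""
    return "".join(WATSON_CRICK[base] for base in strand)

def binds(a: str, b: str) -> bool:
    """Two strands are mutually attractive only if every position pairs."""
    return len(a) == len(b) and all(WATSON_CRICK[x] == y for x, y in zip(a, b))

print(complement("GATTACA"))        # CTAATGT
print(binds("GATTACA", "CTAATGT"))  # True: fully complementary
print(binds("GATTACA", "CTAATGA"))  # False: one mismatch breaks recognition
```

The point of the sketch is the design principle in the text: because each component recognizes only its complement, a favored arrangement emerges automatically rather than being imposed from outside.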

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular scale biological analogies of traditional machine components demonstrated molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[26] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[27] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[28] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[29] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[30] and a nanoelectromechanical relaxation oscillator.[31] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[34]

These seek to arrange smaller components into more complex assemblies.

These seek to create smaller devices by using larger ones to direct their assembly.

These seek to develop components of a desired functionality without regard to how they might be assembled.

These subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified as 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological properties. As dimensionality decreases, the surface-to-volume ratio increases, meaning that lower-dimensional nanomaterials have a higher surface area than 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug-delivery and biosensor applications.

There are several important modern developments. The atomic force microscope (AFM) and the Scanning Tunneling Microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[52][53] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography or nanoimprint lithography were also developed. Lithography is a top-down fabrication technique where a bulk material is reduced in size to nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[54]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[52][53] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, such as John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

However, new therapeutic products, based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[55]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimated that over 800 manufacturer-identified nanotech products were publicly available, with new ones hitting the market at a pace of 3–4 per week.[17] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which includes titanium dioxide in sunscreen, cosmetics, surface coatings,[56] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[16]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[57] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[58] Also, to build structures for on chip computing with light, for example on chip optical quantum information processing, and picosecond transmission of information.[59]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[60] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[61]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[62] Platinum is used in both the reduction and the oxidation catalysts.[63] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst’s surface area that is exposed to the exhaust fumes is maximized, the efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[64]
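The project's reasoning, maximizing surface area for a fixed amount of platinum, can be illustrated with idealized spherical particles: splitting a fixed volume into N non-merging spheres multiplies the total surface area by the cube root of N. The quantities below are arbitrary illustrative values, not figures from the project:

```python
import math

def total_surface_area(total_volume_m3: float, n_particles: int) -> float:
    """Total surface area when a fixed volume is split into n equal,
    idealized spherical particles that do not merge."""
    v_each = total_volume_m3 / n_particles
    r = (3 * v_each / (4 * math.pi)) ** (1 / 3)   # radius of each sphere
    return n_particles * 4 * math.pi * r ** 2

V = 1e-9                              # 1 mm^3 of catalyst, arbitrary amount
one = total_surface_area(V, 1)        # a single lump
many = total_surface_area(V, 10**9)   # a billion nanoparticles

print(many / one)   # ~1000: area scales with the cube root of particle count
```

This is also why keeping the particles from merging matters: every coalescence event undoes part of that cube-root gain, wasting expensive platinum.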

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell's microenvironment to direct its differentiation down a suitable lineage.[65] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[66]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[67]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[68][69]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[70] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[71]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[72]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[73] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[74] Cambridge, Massachusetts in 2008 considered enacting a similar law,[75] but ultimately rejected it.[76] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[77] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly.

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[78] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[79] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[80][81]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[82]

A major study published more recently in Nature Nanotechnology suggests that some forms of carbon nanotubes, a poster child for the "nanotechnology revolution", could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said, "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully."[83] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for the exclusion of engineered nanoparticles from food.[84] A newspaper article reports that workers in a paint factory developed serious lung disease, and nanoparticles were found in their lungs.[85][86][87][88]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[89] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by "bolting on" nanotechnology to existing regulations, but there are clear gaps in these regimes.[90] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[91]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[92] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[93] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[94][95]

The Royal Society report[14] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on application, with participants in public deliberations more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[72]


Nineteen Eighty-Four – Wikipedia

Nineteen Eighty-Four, often published as 1984, is a dystopian novel published in 1949 by English author George Orwell.[2][3] The novel is set in the year 1984, when most of the world's population has fallen victim to perpetual war, omnipresent government surveillance and public manipulation.

In the novel, Great Britain (“Airstrip One”) has become a province of a superstate named Oceania. Oceania is ruled by the “Party”, who employ the “Thought Police” to persecute individualism and independent thinking.[4] The Party’s leader is Big Brother, who enjoys an intense cult of personality but may not even exist. The protagonist of the novel, Winston Smith, is a rank-and-file Party member. Smith is an outwardly diligent and skillful worker, but he secretly hates the Party and dreams of rebellion against Big Brother. Smith rebels by entering a forbidden relationship with fellow employee Julia.

As literary political fiction and dystopian science-fiction, Nineteen Eighty-Four is a classic novel in content, plot, and style. Many of its terms and concepts, such as Big Brother, doublethink, thoughtcrime, Newspeak, Room 101, telescreen, 2 + 2 = 5, and memory hole, have entered into common usage since its publication in 1949. Nineteen Eighty-Four popularised the adjective Orwellian, which describes official deception, secret surveillance, brazenly misleading terminology, and manipulation of recorded history by a totalitarian or authoritarian state.[5] In 2005, the novel was chosen by Time magazine as one of the 100 best English-language novels from 1923 to 2005.[6] It was awarded a place on both lists of Modern Library 100 Best Novels, reaching number 13 on the editor’s list, and 6 on the readers’ list.[7] In 2003, the novel was listed at number 8 on the BBC’s survey The Big Read.[8]

Orwell "encapsulate[d] the thesis at the heart of his unforgiving novel" in 1944: the implications of dividing the world up into zones of influence, which had been conjured by the Tehran Conference. Three years later, he wrote most of it on the Scottish island of Jura from 1947 to 1948 despite being seriously ill with tuberculosis.[9][10] On 4 December 1948, he sent the final manuscript to the publisher Secker and Warburg, and Nineteen Eighty-Four was published on 8 June 1949.[11][12] By 1989, it had been translated into 65 languages, more than any other novel in English until then.[13] The title of the novel, its themes, the Newspeak language and the author's surname are often invoked against control and intrusion by the state, and the adjective Orwellian describes a totalitarian dystopia that is characterised by government control and subjugation of the people.

Orwell’s invented language, Newspeak, satirises hypocrisy and evasion by the state: the Ministry of Love (Miniluv) oversees torture and brainwashing, the Ministry of Plenty (Miniplenty) oversees shortage and rationing, the Ministry of Peace (Minipax) oversees war and atrocity and the Ministry of Truth (Minitrue) oversees propaganda and historical revisionism.

The Last Man in Europe was an early title for the novel, but in a letter dated 22 October 1948 to his publisher Fredric Warburg, eight months before publication, Orwell wrote about hesitating between that title and Nineteen Eighty-Four.[14] Warburg suggested the latter, more commercial title as the main one.[15]

In the novel 1985 (1978), Anthony Burgess suggests that Orwell, disillusioned by the onset of the Cold War (1945-91), intended to call the book 1948. The introduction to the Penguin Books Modern Classics edition of Nineteen Eighty-Four reports that Orwell originally set the novel in 1980 but that he later shifted the date to 1982 and then to 1984. The introduction to the Houghton Mifflin Harcourt edition of Animal Farm and 1984 (2003) reports that the title 1984 was chosen simply as an inversion of the year 1948, the year in which it was being completed, and that the date was meant to give an immediacy and urgency to the menace of totalitarian rule.[16]

Throughout its publication history, Nineteen Eighty-Four has been either banned or legally challenged, as subversive or ideologically corrupting, like Aldous Huxley’s Brave New World (1932), We (1924) by Yevgeny Zamyatin, Darkness at Noon (1940) by Arthur Koestler, Kallocain (1940) by Karin Boye and Fahrenheit 451 (1953) by Ray Bradbury.[17] Some writers consider the Russian dystopian novel We by Zamyatin to have influenced Nineteen Eighty-Four,[18][19] and the novel bears significant similarities in its plot and characters to Darkness at Noon, written years before by Arthur Koestler, who was a personal friend of Orwell.[20]

The novel is in the public domain in Canada,[21] South Africa,[22] Argentina,[23] Australia,[24] and Oman.[25] It will be in the public domain in the United Kingdom, the EU,[26] and Brazil in 2021[27] (70 years after the author’s death), and in the United States in 2044.[28]

Nineteen Eighty-Four is set in Oceania, one of three inter-continental superstates that divided the world after a global war.

Smith's memories and his reading of the proscribed book, The Theory and Practice of Oligarchical Collectivism by Emmanuel Goldstein, reveal that after the Second World War, the United Kingdom became involved in a war fought in Europe, western Russia, and North America during the early 1950s. Nuclear weapons were used during the war, leading to the destruction of Colchester. London would also suffer widespread aerial raids, leading Winston's family to take refuge in a London Underground station. Britain fell into civil war, with street fighting in London, before the English Socialist Party, abbreviated as Ingsoc, emerged victorious and formed a totalitarian government in Britain. The British Commonwealth was absorbed by the United States to become Oceania.

Simultaneously, the Soviet Union conquered continental Europe and established the second superstate of Eurasia. The third superstate of Eastasia would emerge in the Far East after several decades of fighting. The three superstates wage perpetual war for the remaining unconquered lands of the world in "a rough quadrilateral with its corners at Tangier, Brazzaville, Darwin, and Hong Kong" through constantly shifting alliances. Although each of the three states is said to have sufficient natural resources, the war continues in order to maintain ideological control over the people.

However, because Winston barely remembers these events and because of the Party's manipulation of history, their continuity and accuracy are unclear. Winston himself notes that the Party has claimed credit for inventing helicopters, airplanes and trains, while Julia theorizes that the perpetual bombing of London is merely a false-flag operation designed to convince the populace that a war is occurring. If the official account were accurate, Smith's strengthening memories and the story of his family's dissolution suggest that the atomic bombings occurred first, followed by civil war featuring "confused street fighting in London itself" and the societal postwar reorganisation, which the Party retrospectively calls "the Revolution".

Most of the plot takes place in London, the “chief city of Airstrip One”, the Oceanic province that “had once been called England or Britain”.[29][30] Posters of the Party leader, Big Brother, bearing the caption “BIG BROTHER IS WATCHING YOU”, dominate the city (Winston states it can be found on nearly every house), while the ubiquitous telescreen (transceiving television set) monitors the private and public lives of the populace. Military parades, propaganda films, and public executions are said to be commonplace.

The class hierarchy of Oceania has three levels: the elite Inner Party, the middle-class Outer Party, and the working-class proles.

As the government, the Party controls the population with four ministries: the Ministries of Peace, Plenty, Truth, and Love.

The protagonist Winston Smith, a member of the Outer Party, works in the Records Department of the Ministry of Truth as an editor, revising historical records, to make the past conform to the ever-changing party line and deleting references to unpersons, people who have been “vaporised”, i.e., not only killed by the state but denied existence even in history or memory.

The story of Winston Smith begins on 4 April 1984: “It was a bright cold day in April, and the clocks were striking thirteen.” Yet he is uncertain of the true date, given the regime’s continual rewriting and manipulation of history.[31]

In the year 1984, civilization has been damaged by war, civil conflict, and revolution. Airstrip One (formerly Britain) is a province of Oceania, one of the three totalitarian super-states that rule the world. It is ruled by the "Party" under the ideology of "Ingsoc" and the mysterious leader Big Brother, who has an intense cult of personality. The Party stamps out anyone who does not fully conform to its regime using the Thought Police and constant surveillance, through devices such as telescreens (two-way televisions).

Winston Smith is a member of the middle class Outer Party. He works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history. Those who fall out of favour with the Party become “unpersons”, disappearing with all evidence of their existence removed. Winston revises past editions of The Times, while the original documents are destroyed by fire in a “memory hole”. He secretly opposes the Party’s rule and dreams of rebellion. He realizes that he is already a “thoughtcriminal” and likely to be caught one day.

While in a proletarian neighbourhood, he meets an antique shop owner called Mr. Charrington and buys a diary. He uses an alcove to hide it from the telescreen in his room, and writes thoughts criticising the Party and Big Brother. In the journal, he records his sexual frustration over Julia, a young woman who maintains the novel-writing machines at the ministry, to whom Winston is attracted but whom he suspects is an informant. He also suspects that his superior, an Inner Party official named O'Brien, is a secret agent for an enigmatic underground resistance movement known as the Brotherhood, a group formed by Big Brother's reviled political rival Emmanuel Goldstein.

The next day, Julia secretly hands Winston a note confessing her love for him. Winston and Julia begin an affair, an act of rebellion, as the Party insists that sex may only be used for reproduction. Winston realizes that she shares his loathing of the Party. They first meet in the country, and later in a rented room above Mr. Charrington's shop. During his affair with Julia, Winston remembers the disappearance of his family during the civil war of the 1950s and his terse relationship with his ex-wife Katharine. Winston also interacts with his colleague Syme, who is writing a dictionary for a revised version of the English language called Newspeak. After Syme admits that the true purpose of Newspeak is to reduce the capacity of human thought, Winston speculates that Syme will disappear. Not long after, Syme disappears and no one acknowledges his absence.

Weeks later, Winston is approached by O’Brien, who offers Winston a chance to join the Brotherhood. They arrange a meeting at O’Brien’s luxurious flat where both Winston and Julia swear allegiance to the Brotherhood. He sends Winston a copy of The Theory and Practice of Oligarchical Collectivism by Emmanuel Goldstein. Winston and Julia read parts of the book, which explains more about how the Party maintains power, the true meanings of its slogans and the concept of perpetual war. It argues that the Party can be overthrown if proles (proletarians) rise up against it.

Mr. Charrington is revealed to be an agent of the Thought Police. Winston and Julia are captured in the shop and imprisoned in the Ministry of Love. O’Brien reveals that he is loyal to the party, and part of a special sting operation to catch “thoughtcriminals”. Over many months, Winston is tortured and forced to “cure” himself of his “insanity” by changing his own perception to fit the Party line, even if it requires saying that “2 + 2 = 5”. O’Brien openly admits that the Party “is not interested in the good of others; it is interested solely in power.” He says that once Winston is brainwashed into loyalty, he will be released back into society for a period of time, before they execute him. Winston points out that the Party has not managed to make him betray Julia.

O'Brien then takes Winston to Room 101 for the final stage of re-education. The room contains each prisoner's worst fear, in Winston's case rats. As a wire cage holding hungry rats is fitted onto his face, Winston shouts "Do it to Julia!", thus betraying her. After being released, Winston meets Julia in a park. She says that she was also tortured, and each reveals having betrayed the other. Later, Winston sits alone in a café as Oceania celebrates a supposed victory over Eurasian armies in Africa, and realizes that "He loved Big Brother."

Ingsoc (English Socialism) is the predominant ideology and pseudophilosophy of Oceania, and Newspeak is the official language of official documents.

In London, the capital city of Airstrip One, Oceania's four government ministries are in pyramids (300 m high), the façades of which display the Party's three slogans. The ministries' names are the opposite (doublethink) of their true functions: "The Ministry of Peace concerns itself with war, the Ministry of Truth with lies, the Ministry of Love with torture and the Ministry of Plenty with starvation." (Part II, Chapter IX The Theory and Practice of Oligarchical Collectivism)

The Ministry of Peace supports Oceania’s perpetual war against either of the two other superstates:

The primary aim of modern warfare (in accordance with the principles of doublethink, this aim is simultaneously recognized and not recognized by the directing brains of the Inner Party) is to use up the products of the machine without raising the general standard of living. Ever since the end of the nineteenth century, the problem of what to do with the surplus of consumption goods has been latent in industrial society. At present, when few human beings even have enough to eat, this problem is obviously not urgent, and it might not have become so, even if no artificial processes of destruction had been at work.

The Ministry of Plenty rations and controls food, goods, and domestic production; every fiscal quarter, it publishes false claims of having raised the standard of living, when it has, in fact, reduced rations, availability, and production. The Ministry of Truth substantiates the Ministry of Plenty's claims by revising historical records to report numbers supporting the current, "increased rations".

The Ministry of Truth controls information: news, entertainment, education, and the arts. Winston Smith works in the Minitrue RecDep (Records Department), “rectifying” historical records to concord with Big Brother’s current pronouncements so that everything the Party says is true.

The Ministry of Love identifies, monitors, arrests, and converts real and imagined dissidents. In Winston's experience, the dissident is beaten and tortured, and, when near-broken, he is sent to Room 101 to face "the worst thing in the world" – until love for Big Brother and the Party replaces dissension.

The keyword here is blackwhite. Like so many Newspeak words, this word has two mutually contradictory meanings. Applied to an opponent, it means the habit of impudently claiming that black is white, in contradiction of the plain facts. Applied to a Party member, it means a loyal willingness to say that black is white when Party discipline demands this. But it means also the ability to believe that black is white, and more, to know that black is white, and to forget that one has ever believed the contrary. This demands a continuous alteration of the past, made possible by the system of thought which really embraces all the rest, and which is known in Newspeak as doublethink. Doublethink is basically the power of holding two contradictory beliefs in one’s mind simultaneously, and accepting both of them.

Three perpetually warring totalitarian super-states, Oceania, Eurasia, and Eastasia, control the world.[34]

The perpetual war is fought for control of the "disputed area" lying "between the frontiers of the super-states", which forms "a rough quadrilateral with its corners at Tangier, Brazzaville, Darwin and Hong Kong",[34] and Northern Africa, the Middle East, India and Indonesia are where the superstates capture and use slave labour. Fighting also takes place between Eurasia and Eastasia in Manchuria, Mongolia and Central Asia, and all three powers battle one another over various Atlantic and Pacific islands.

Goldstein’s book, The Theory and Practice of Oligarchical Collectivism, explains that the superstates’ ideologies are alike and that the public’s ignorance of this fact is imperative so that they might continue believing in the detestability of the opposing ideologies. The only references to the exterior world for the Oceanian citizenry (the Outer Party and the Proles) are Ministry of Truth maps and propaganda to ensure their belief in “the war”.

Winston Smith’s memory and Emmanuel Goldstein’s book communicate some of the history that precipitated the Revolution. Eurasia was formed when the Soviet Union conquered Continental Europe, creating a single state stretching from Portugal to the Bering Strait. Eurasia does not include the British Isles because the United States annexed them along with the rest of the British Empire and Latin America, thus establishing Oceania and gaining control over a quarter of the planet. Eastasia, the last superstate established, emerged only after “a decade of confused fighting”. It includes the Asian lands conquered by China and Japan. Although Eastasia is prevented from matching Eurasia’s size, its larger populace compensates for that handicap.

The annexation of Britain occurred about the same time as the atomic war that provoked civil war, but who fought whom in the war is left unclear. Nuclear weapons fell on Britain; an atomic bombing of Colchester is referenced in the text. Exactly how Ingsoc and its rival systems (Neo-Bolshevism and Death Worship) gained power in their respective countries is also unclear.

While the precise chronology cannot be traced, most of the global societal reorganization occurred between 1945 and the early 1960s. Winston and Julia once meet in the ruins of a church that was destroyed in a nuclear attack “thirty years” earlier, which suggests 1954 as the year of the atomic war that destabilised society and allowed the Party to seize power. It is stated in the novel that the “fourth quarter of 1983” was “also the sixth quarter of the Ninth Three-Year Plan”, which implies that the first quarter of the first three-year plan began in July 1958. By then, the Party was apparently in control of Oceania.
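The inference in the paragraph above rests on simple quarter arithmetic. As a back-of-the-envelope check (a sketch of the reasoning, not a computation from the novel itself): eight completed Three-Year Plans plus six quarters of the ninth span 102 quarters, and counting back 101 quarters from the fourth quarter of 1983 lands on the third quarter of 1958, i.e. July 1958.

```python
# If Q4 1983 is the sixth quarter of the Ninth Three-Year Plan,
# when did the First Three-Year Plan begin?
quarters_per_plan = 3 * 4                    # twelve quarters per plan
elapsed_quarters = 8 * quarters_per_plan + 6 # eight full plans, plus six quarters
year, quarter = 1983, 4                      # Q4 1983 is the 102nd quarter
back = elapsed_quarters - 1                  # step back 101 quarters to the first
year -= back // 4                            # 25 whole years
quarter -= back % 4                          # 1 remaining quarter
if quarter < 1:                              # wrap into the previous year if needed
    quarter += 4
    year -= 1
print(year, quarter)                         # 1958 3, i.e. July-September 1958
```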

In 1984, there is a perpetual war between Oceania, Eurasia and Eastasia, the superstates that emerged from the global atomic war. The Theory and Practice of Oligarchical Collectivism, by Emmanuel Goldstein, explains that each state is so strong it cannot be defeated, even with the combined forces of two superstates, despite changing alliances. To hide such contradictions, history is rewritten to explain that the (new) alliance always was so; the populaces are accustomed to doublethink and accept it. The war is not fought in Oceanian, Eurasian or Eastasian territory but in the Arctic wastes and in a disputed zone comprising the sea and land from Tangiers (Northern Africa) to Darwin (Australia). At the start, Oceania and Eastasia are allies fighting Eurasia in northern Africa and the Malabar Coast.

That alliance ends and Oceania, allied with Eurasia, fights Eastasia, a change occurring on Hate Week, dedicated to creating patriotic fervour for the Party’s perpetual war. The public are blind to the change; in mid-sentence, an orator changes the name of the enemy from “Eurasia” to “Eastasia” without pause. When the public are enraged at noticing that the wrong flags and posters are displayed, they tear them down; the Party later claims to have captured Africa.

Goldstein's book explains that the purpose of the unwinnable, perpetual war is to consume human labour and commodities so that the economy of a superstate cannot support economic equality, with a high standard of life for every citizen. By using up most of the produced objects like boots and rations, the proles are kept poor and uneducated and will neither realise what the government is doing nor rebel. Goldstein also details an Oceanian strategy of attacking enemy cities with atomic rockets before invasion but dismisses it as unfeasible and contrary to the war's purpose; despite the atomic bombing of cities in the 1950s, the superstates stopped it for fear that it would imbalance the powers. The military technology in the novel differs little from that of World War II, but strategic bomber aeroplanes are replaced with rocket bombs, helicopters are heavily used as weapons of war (they did not figure in World War II in any form but prototypes) and surface combat units have been all but replaced by immense and unsinkable Floating Fortresses, island-like contraptions concentrating the firepower of a whole naval task force in a single, semi-mobile platform (in the novel, one is said to have been anchored between Iceland and the Faroe Islands, suggesting a preference for sea lane interdiction and denial).

The society of Airstrip One and, according to "The Book", almost the whole world, lives in poverty: hunger, disease and filth are the norms. Ruined cities and towns are common: the consequence of the civil war, the atomic wars and the purportedly enemy (but possibly false flag) rockets. Social decay and wrecked buildings surround Winston; aside from the ministerial pyramids, little of London was rebuilt. Members of the Outer Party consume synthetic foodstuffs and poor-quality "luxuries" such as oily gin and loosely-packed cigarettes, distributed under the "Victory" brand. (That is a parody of the low-quality Indian-made "Victory" cigarettes, widely smoked in Britain and by British soldiers during World War II. They were smoked because it was easier to import them from India than to import American cigarettes from across the Atlantic because of the Battle of the Atlantic.)

Winston describes something as simple as the repair of a broken pane of glass as requiring committee approval that can take several years and so most of those living in one of the blocks usually do the repairs themselves (Winston himself is called in by Mrs. Parsons to repair her blocked sink). All Outer Party residences include telescreens that serve both as outlets for propaganda and to monitor the Party members; they can be turned down, but they cannot be turned off.

In contrast to their subordinates, the Inner Party upper class of Oceanian society reside in clean and comfortable flats in their own quarter of the city, with pantries well-stocked with foodstuffs such as wine, coffee and sugar, all denied to the general populace.[35] Winston is astonished that the lifts in O'Brien's building work, the telescreens can be switched off and O'Brien has an Asian manservant, Martin. All members of the Inner Party are attended to by slaves captured in the disputed zone, and "The Book" suggests that many have their own motorcars or even helicopters. Nonetheless, "The Book" makes clear that even the conditions enjoyed by the Inner Party are only "relatively" comfortable, and standards would be regarded as austere by those of the pre-revolutionary élite.[36]

The proles live in poverty and are kept sedated with alcohol, pornography and a national lottery whose winnings are never actually paid out; that is obscured by propaganda and the lack of communication within Oceania. At the same time, the proles are freer and less intimidated than the middle-class Outer Party: they are subject to certain levels of monitoring but are not expected to be particularly patriotic. They lack telescreens in their own homes and often jeer at the telescreens that they see. “The Book” indicates that is because the middle class, not the lower class, traditionally starts revolutions. The model demands tight control of the middle class, with ambitious Outer-Party members neutralised via promotion to the Inner Party or “reintegration” by the Ministry of Love, and proles can be allowed intellectual freedom because they lack intellect. Winston nonetheless believes that “the future belonged to the proles”.[37]

The standard of living of the populace is low overall. Consumer goods are scarce, and all those available through official channels are of low quality; for instance, despite the Party regularly reporting increased boot production, more than half of the Oceanian populace goes barefoot. The Party claims that poverty is a necessary sacrifice for the war effort, and "The Book" confirms that to be partially correct since the purpose of perpetual war consumes surplus industrial production. Outer Party members and proles occasionally gain access to better items in the market, which deals in goods that were pilfered from the residences of the Inner Party.

Nineteen Eighty-Four expands upon the subjects summarised in Orwell’s essay “Notes on Nationalism”[38] about the lack of vocabulary needed to explain the unrecognised phenomena behind certain political forces. In Nineteen Eighty-Four, the Party’s artificial, minimalist language ‘Newspeak’ addresses the matter.

In the book, Inner Party member O’Brien describes the Party’s vision of the future:

There will be no curiosity, no enjoyment of the process of life. All competing pleasures will be destroyed. But always – do not forget this, Winston – always there will be the intoxication of power, constantly increasing and constantly growing subtler. Always, at every moment, there will be the thrill of victory, the sensation of trampling on an enemy who is helpless. If you want a picture of the future, imagine a boot stamping on a human face – forever.

Part III, Chapter III, Nineteen Eighty-Four

O’Brien concludes: “The object of persecution is persecution. The object of torture is torture. The object of power is power.”

A major theme of Nineteen Eighty-Four is censorship, especially in the Ministry of Truth, where photographs are modified and public archives rewritten to rid them of “unpersons” (persons who are erased from history by the Party). On the telescreens, figures for all types of production are grossly exaggerated or simply invented to indicate an ever-growing economy, when the reality is the opposite. One small example of the endless censorship is Winston being charged with the task of eliminating a reference to an unperson in a newspaper article. He proceeds to write an article about Comrade Ogilvy, a made-up party member who displayed great heroism by leaping into the sea from a helicopter so that the dispatches he was carrying would not fall into enemy hands.

The inhabitants of Oceania, particularly the Outer Party members, have no real privacy. Many of them live in apartments equipped with two-way telescreens so that they may be watched or listened to at any time. Similar telescreens are found at workstations and in public places, along with hidden microphones. Written correspondence is routinely opened and read by the government before it is delivered. The Thought Police employ undercover agents, who pose as normal citizens and report any person with subversive tendencies. Children are encouraged to report suspicious persons to the government, and some denounce their parents. Citizens are controlled, and the smallest sign of rebellion, even something so small as a facial expression, can result in immediate arrest and imprisonment. Thus, citizens, particularly party members, are compelled to obedience.

“The Principles of Newspeak” is an academic essay appended to the novel. It describes the development of Newspeak, the Party’s minimalist artificial language meant to ideologically align thought and action with the principles of Ingsoc by making “all other modes of thought impossible”. (A linguistic theory about how language may direct thought is the Sapir–Whorf hypothesis.)

Whether or not the Newspeak appendix implies a hopeful end to Nineteen Eighty-Four remains a critical debate, as it is in Standard English and refers to Newspeak, Ingsoc, the Party etc., in the past tense: “Relative to our own, the Newspeak vocabulary was tiny, and new ways of reducing it were constantly being devised” (p. 422). Some critics (Atwood,[39] Benstead,[40] Milner,[41] Pynchon[42]) claim that for the essay’s author, both Newspeak and the totalitarian government are in the past.

Nineteen Eighty-Four uses themes from life in the Soviet Union and wartime life in Great Britain as sources for many of its motifs. Some time at an unspecified date after the first American publication of the book, producer Sidney Sheldon wrote to Orwell interested in adapting the novel to the Broadway stage. Orwell sold the American stage rights to Sheldon, explaining that his basic goal with Nineteen Eighty-Four was imagining the consequences of Stalinist government ruling British society:

[Nineteen Eighty-Four] was based chiefly on communism, because that is the dominant form of totalitarianism, but I was trying chiefly to imagine what communism would be like if it were firmly rooted in the English-speaking countries, and was no longer a mere extension of the Russian Foreign Office.[43]

The statement “2 + 2 = 5”, used to torment Winston Smith during his interrogation, was a communist party slogan from the second five-year plan, which encouraged fulfillment of the five-year plan in four years. The slogan was seen in electric lights on Moscow house-fronts, billboards and elsewhere.[44]

The switch of Oceania’s allegiance from Eastasia to Eurasia and the subsequent rewriting of history (“Oceania was at war with Eastasia: Oceania had always been at war with Eastasia. A large part of the political literature of five years was now completely obsolete”; ch 9) is evocative of the Soviet Union’s changing relations with Nazi Germany. The two nations were open and frequently vehement critics of each other until the signing of the 1939 Treaty of Non-Aggression. Thereafter, and continuing until the Nazi invasion of the Soviet Union in 1941, no criticism of Germany was allowed in the Soviet press, and all references to prior party lines stopped – including in the majority of non-Russian communist parties, who tended to follow the Russian line. Orwell had criticised the Communist Party of Great Britain for supporting the Treaty in his essays for Betrayal of the Left (1941). “The Hitler-Stalin pact of August 1939 reversed the Soviet Union’s stated foreign policy. It was too much for many of the fellow-travellers like Gollancz [Orwell’s sometime publisher] who had put their faith in a strategy of constructing Popular Front governments and the peace bloc between Russia, Britain and France.”[45]

The description of Emmanuel Goldstein, with a “small, goatee beard”, evokes the image of Leon Trotsky. The film of Goldstein during the Two Minutes Hate is described as showing him being transformed into a bleating sheep. This image was used in a propaganda film during the Kino-eye period of Soviet film, which showed Trotsky transforming into a goat.[46] Goldstein’s book is similar to Trotsky’s highly critical analysis of the USSR, The Revolution Betrayed, published in 1936.

The omnipresent images of Big Brother, a man described as having a moustache, bear resemblance to the cult of personality built up around Joseph Stalin.

The news in Oceania emphasised production figures, just as it did in the Soviet Union, where record-setting in factories (by “Heroes of Socialist Labor”) was especially glorified. The best known of these was Alexey Stakhanov, who purportedly set a record for coal mining in 1935.

The tortures of the Ministry of Love evoke the procedures used by the NKVD in their interrogations,[47] including the use of rubber truncheons, prisoners being forbidden to put their hands in their pockets, remaining in brightly lit rooms for days, torture through the use of provoked rodents, and the victim being shown a mirror after their physical collapse.

The random bombing of Airstrip One is based on the Buzz bombs and the V-2 rocket, which struck England at random in 1944–1945.

The Thought Police is based on the NKVD, which arrested people for random “anti-soviet” remarks.[48] The Thought Crime motif is drawn from Kempeitai, the Japanese wartime secret police, who arrested people for “unpatriotic” thoughts.

The confessions of the “Thought Criminals” Rutherford, Aaronson and Jones are based on the show trials of the 1930s, which included fabricated confessions by prominent Bolsheviks Nikolai Bukharin, Grigory Zinoviev and Lev Kamenev to the effect that they were being paid by the Nazi government to undermine the Soviet regime under Leon Trotsky’s direction.

The song “Under the Spreading Chestnut Tree” (“Under the spreading chestnut tree, I sold you, and you sold me”) was based on an old English song called “Go no more a-rushing” (“Under the spreading chestnut tree, Where I knelt upon my knee, We were as happy as could be, ‘Neath the spreading chestnut tree.”). The song was published as early as 1891. The song was a popular camp song in the 1920s, sung with corresponding movements (like touching your chest when you sing “chest”, and touching your head when you sing “nut”). Glenn Miller recorded the song in 1939.[49]

The “Hates” (Two Minutes Hate and Hate Week) were inspired by the constant rallies sponsored by party organs throughout the Stalinist period. These were often short pep-talks given to workers before their shifts began (Two Minutes Hate), but could also last for days, as in the annual celebrations of the anniversary of the October revolution (Hate Week).

Orwell fictionalized “newspeak”, “doublethink”, and “Ministry of Truth” as evinced by both the Soviet press and that of Nazi Germany.[50] In particular, he adapted Soviet ideological discourse constructed to ensure that public statements could not be questioned.[51]

Winston Smith’s job, “revising history” (and the “unperson” motif) are based on the Stalinist habit of airbrushing images of ‘fallen’ people from group photographs and removing references to them in books and newspapers.[53] In one well-known example, the Soviet encyclopaedia had an article about Lavrentiy Beria. When he fell in 1953, and was subsequently executed, institutes that had the encyclopaedia were sent an article about the Bering Strait, with instructions to paste it over the article about Beria.[54]

Big Brother’s “Orders of the Day” were inspired by Stalin’s regular wartime orders, called by the same name. A small collection of the more political of these has been published (together with his wartime speeches) in English as On the Great Patriotic War of the Soviet Union by Joseph Stalin.[55][56] Like Big Brother’s Orders of the Day, Stalin’s frequently lauded heroic individuals,[57] like Comrade Ogilvy, the fictitious hero Winston Smith invented to ‘rectify’ (fabricate) a Big Brother Order of the Day.

The Ingsoc slogan “Our new, happy life”, repeated from telescreens, evokes Stalin’s 1935 statement, which became a CPSU slogan, “Life has become better, Comrades; life has become more cheerful.”[48]

In 1940 Argentine writer Jorge Luis Borges published “Tlön, Uqbar, Orbis Tertius”, which described the invention by a “benevolent secret society” of a world that would seek to remake human language and reality along human-invented lines. The story concludes with an appendix describing the success of the project. Borges’ story addresses similar themes of epistemology, language and history to Nineteen Eighty-Four.[58]

During World War II, Orwell believed that British democracy as it existed before 1939 would not survive the war; the question was whether it would end via Fascist coup d’état from above or via Socialist revolution from below. Later, he admitted that events proved him wrong: “What really matters is that I fell into the trap of assuming that ‘the war and the revolution are inseparable’.”[59]

Nineteen Eighty-Four (1949) and Animal Farm (1945) share themes of the betrayed revolution, the person’s subordination to the collective, rigorously enforced class distinctions (Inner Party, Outer Party, Proles), the cult of personality, concentration camps, Thought Police, compulsory regimented daily exercise, and youth leagues. Oceania resulted from the US annexation of the British Empire to counter the Asian peril to Australia and New Zealand. It is a naval power whose militarism venerates the sailors of the floating fortresses, from which battle is given to recapturing India, the “Jewel in the Crown” of the British Empire. Much of Oceanic society is based upon the USSR under Joseph Stalin (Big Brother). The televised Two Minutes Hate is a ritual demonisation of the enemies of the State, especially Emmanuel Goldstein (viz Leon Trotsky). Altered photographs and newspaper articles create unpersons deleted from the national historical record, including even founding members of the regime (Jones, Aaronson and Rutherford) in the 1960s purges (viz the Soviet Purges of the 1930s, in which leaders of the Bolshevik Revolution were similarly treated). Something similar happened during the French Revolution, in which many of the original leaders were later put to death: Danton was executed by Robespierre, who later met the same fate himself.

In his 1946 essay “Why I Write”, Orwell explains that the serious works he wrote since the Spanish Civil War (1936–39) were “written, directly or indirectly, against totalitarianism and for democratic socialism”.[3][60] Nineteen Eighty-Four is a cautionary tale about revolution betrayed by totalitarian defenders previously proposed in Homage to Catalonia (1938) and Animal Farm (1945), while Coming Up for Air (1939) celebrates the personal and political freedoms lost in Nineteen Eighty-Four (1949). Biographer Michael Shelden notes Orwell’s Edwardian childhood at Henley-on-Thames as the golden country; being bullied at St Cyprian’s School as his empathy with victims; his life in the Indian Imperial Police in Burma and the techniques of violence and censorship in the BBC as capricious authority.[61]

Other influences include Darkness at Noon (1940) and The Yogi and the Commissar (1945) by Arthur Koestler; The Iron Heel (1908) by Jack London; 1920: Dips into the Near Future[62] by John A. Hobson; Brave New World (1932) by Aldous Huxley; We (1921) by Yevgeny Zamyatin, which he reviewed in 1946;[63] and The Managerial Revolution (1940) by James Burnham, predicting perpetual war among three totalitarian superstates. Orwell told Jacintha Buddicom that he would write a novel stylistically like A Modern Utopia (1905) by H. G. Wells.

Extrapolating from World War II, the novel’s pastiche parallels the politics and rhetoric at war’s end – the changed alliances at the beginning of the Cold War (1945–91); the Ministry of Truth derives from the BBC’s overseas service, controlled by the Ministry of Information; Room 101 derives from a conference room at BBC Broadcasting House;[64] the Senate House of the University of London, containing the Ministry of Information, is the architectural inspiration for the Minitrue; the post-war decrepitude derives from the socio-political life of the UK and the US, i.e., the impoverished Britain of 1948 losing its Empire despite newspaper-reported imperial triumph; and the war ally but peace-time foe, Soviet Russia, became Eurasia.

The term “English Socialism” has precedents in his wartime writings; in the essay “The Lion and the Unicorn: Socialism and the English Genius” (1941), he said that “the war and the revolution are inseparable…the fact that we are at war has turned Socialism from a textbook word into a realisable policy” because Britain’s superannuated social class system hindered the war effort and only a socialist economy would defeat Adolf Hitler. He believed that once the middle class grasped this, they too would abide the socialist revolution, and that only reactionary Britons would oppose it, limiting the force the revolutionaries would need to take power. An English Socialism would come about which “will never lose touch with the tradition of compromise and the belief in a law that is above the State. It will shoot traitors, but it will give them a solemn trial beforehand and occasionally it will acquit them. It will crush any open revolt promptly and cruelly, but it will interfere very little with the spoken and written word.”[65]

In the world of Nineteen Eighty-Four, “English Socialism” (or “Ingsoc” in Newspeak) is a totalitarian ideology unlike the English revolution he foresaw. Comparison of the wartime essay “The Lion and the Unicorn” with Nineteen Eighty-Four shows that he perceived a Big Brother regime as a perversion of his cherished socialist ideals and English Socialism. Thus Oceania is a corruption of the British Empire he believed would evolve “into a federation of Socialist states, like a looser and freer version of the Union of Soviet Republics”.[66]

When first published, Nineteen Eighty-Four was generally well received by reviewers. V. S. Pritchett, reviewing the novel for the New Statesman stated: “I do not think I have ever read a novel more frightening and depressing; and yet, such are the originality, the suspense, the speed of writing and withering indignation that it is impossible to put the book down.”[67] P. H. Newby, reviewing Nineteen Eighty-Four for The Listener magazine, described it as “the most arresting political novel written by an Englishman since Rex Warner’s The Aerodrome.”[68] Nineteen Eighty-Four was also praised by Bertrand Russell, E. M. Forster and Harold Nicolson.[68] On the other hand, Edward Shanks, reviewing Nineteen Eighty-Four for The Sunday Times, was dismissive; Shanks claimed Nineteen Eighty-Four “breaks all records for gloomy vaticination”.[68] C. S. Lewis was also critical of the novel, claiming that the relationship of Julia and Winston, and especially the Party’s view on sex, lacked credibility, and that the setting was “odious rather than tragic”.[69]

Nineteen Eighty-Four has been adapted for the cinema, radio, television and theatre at least twice each, as well as for other art media, such as ballet and opera.

The effect of Nineteen Eighty-Four on the English language is extensive; the concepts of Big Brother, Room 101, the Thought Police, thoughtcrime, unperson, memory hole (oblivion), doublethink (simultaneously holding and believing contradictory beliefs) and Newspeak (ideological language) have become common phrases for denoting totalitarian authority. Doublespeak and groupthink are both deliberate elaborations of doublethink, and the adjective “Orwellian” means similar to Orwell’s writings, especially Nineteen Eighty-Four. The practice of ending words with “-speak” (such as mediaspeak) is drawn from the novel.[70] Orwell is perpetually associated with 1984; in July 1984, an asteroid was discovered by Antonín Mrkos and named after Orwell.

References to the themes, concepts and plot of Nineteen Eighty-Four have appeared frequently in other works, especially in popular music and video entertainment. An example is the worldwide hit reality television show Big Brother, in which a group of people live together in a large house, isolated from the outside world but continuously watched by television cameras.

The book touches on the invasion of privacy and ubiquitous surveillance. From mid-2013 it was publicized that the NSA has been secretly monitoring and storing global internet traffic, including the bulk data collection of email and phone call data. Sales of Nineteen Eighty-Four increased by up to seven times within the first week of the 2013 mass surveillance leaks.[79][80][81] The book again topped the Amazon.com sales charts in 2017 after a controversy involving Kellyanne Conway using the phrase “alternative facts” to explain discrepancies with the media.[82][83][84][85]

The book also shows mass media as a catalyst for the intensification of destructive emotions and violence. Since the 20th century, news and other forms of media have been publicizing violence more often.[86][87]

In 2013, the Almeida Theatre and Headlong staged a successful new adaptation (by Robert Icke and Duncan Macmillan), which twice toured the UK and played an extended run in London’s West End. The play opened on Broadway in 2017.

In the decades since the publication of Nineteen Eighty-Four, there have been numerous comparisons to Aldous Huxley’s novel Brave New World, which had been published 17 years earlier, in 1932.[88][89][90][91] They are both predictions of societies dominated by a central government and are both based on extensions of the trends of their times. However, members of the ruling class of Nineteen Eighty-Four use brutal force, torture and mind control to keep individuals in line, whereas rulers in Brave New World keep the citizens in line through addictive drugs and pleasurable distractions.

In October 1949, after reading Nineteen Eighty-Four, Huxley sent a letter to Orwell arguing that rulers could stay in power more efficiently through a softer touch – allowing citizens to pursue pleasure and retain a false sense of freedom – than through brute force:

Within the next generation I believe that the world’s rulers will discover that infant conditioning and narco-hypnosis are more efficient, as instruments of government, than clubs and prisons, and that the lust for power can be just as completely satisfied by suggesting people into loving their servitude as by flogging and kicking them into obedience.[92]

Elements of both novels can be seen in modern-day societies, with Huxley’s vision being more dominant in the West and Orwell’s vision more prevalent with dictators in ex-communist countries, as is pointed out in essays that compare the two novels, including Huxley’s own Brave New World Revisited.[93][94][95][85]

Comparisons with other dystopian novels like The Handmaid’s Tale, Virtual Light, The Private Eye and Children of Men have also been drawn.[96][97]

War on Drugs

War on Drugs is an American term[6][7] usually applied to the U.S. federal government’s campaign of prohibition of drugs, military aid, and military intervention, with the stated aim being to reduce the illegal drug trade.[8][9] The initiative includes a set of drug policies that are intended to discourage the production, distribution, and consumption of psychoactive drugs that the participating governments and the UN have made illegal. The term was popularized by the media shortly after a press conference given on June 18, 1971, by President Richard Nixon – the day after publication of a special message from President Nixon to the Congress on Drug Abuse Prevention and Control – during which he declared drug abuse “public enemy number one”. That message to the Congress included text about devoting more federal resources to the “prevention of new addicts, and the rehabilitation of those who are addicted”, but that part did not receive the same public attention as the term “war on drugs”.[10][11][12] However, two years prior to this, Nixon had formally declared a “war on drugs” that would be directed toward eradication, interdiction, and incarceration.[13] Today, the Drug Policy Alliance, which advocates for an end to the War on Drugs, estimates that the United States spends $51 billion annually on these initiatives.[14]

On May 13, 2009, Gil Kerlikowske – the Director of the Office of National Drug Control Policy (ONDCP) – signaled that the Obama administration did not plan to significantly alter drug enforcement policy, but also that the administration would not use the term “War on Drugs”, because Kerlikowske considers the term to be “counter-productive”.[15] ONDCP’s view is that “drug addiction is a disease that can be successfully prevented and treated… making drugs more available will make it harder to keep our communities healthy and safe”.[16] One of the alternatives that Kerlikowske has showcased is the drug policy of Sweden, which seeks to balance public health concerns with opposition to drug legalization. The prevalence rates for cocaine use in Sweden are barely one-fifth of those in Spain, the biggest consumer of the drug.[17]

In June 2011, the Global Commission on Drug Policy released a critical report on the War on Drugs, declaring: “The global war on drugs has failed, with devastating consequences for individuals and societies around the world. Fifty years after the initiation of the UN Single Convention on Narcotic Drugs, and years after President Nixon launched the US government’s war on drugs, fundamental reforms in national and global drug control policies are urgently needed.”[18] The report was criticized by organizations that oppose a general legalization of drugs.[16]

The first U.S. law that restricted the distribution and use of certain drugs was the Harrison Narcotics Tax Act of 1914. The first local laws came as early as 1860.[19] In 1919, the United States passed the 18th Amendment, prohibiting the sale, manufacture, and transportation of alcohol, with exceptions for religious and medical use. In 1920, the United States passed the National Prohibition Act (Volstead Act), enacted to carry out the provisions in law of the 18th Amendment.

The Federal Bureau of Narcotics was established in the United States Department of the Treasury by an act of June 14, 1930 (46 Stat. 585).[20] In 1933, the federal prohibition for alcohol was repealed by passage of the 21st Amendment. In 1935, President Franklin D. Roosevelt publicly supported the adoption of the Uniform State Narcotic Drug Act. The New York Times used the headline “Roosevelt Asks Narcotic War Aid”.[21][22]

In 1937, the Marihuana Tax Act of 1937 was passed. Several scholars have claimed that the goal was to destroy the hemp industry,[23][24][25] largely as an effort of businessmen Andrew Mellon, Randolph Hearst, and the Du Pont family.[23][25] These scholars argue that with the invention of the decorticator, hemp became a very cheap substitute for the paper pulp that was used in the newspaper industry.[23][26] These scholars believe that Hearst felt this was a threat to his extensive timber holdings. Mellon, United States Secretary of the Treasury and the wealthiest man in America, had invested heavily in DuPont’s new synthetic fiber, nylon, and considered its success to depend on its replacement of the traditional resource, hemp.[23][27][28][29][30][31][32][33] However, there were circumstances that contradict these claims. One reason for doubt is that the new decorticators did not perform fully satisfactorily in commercial production.[34] Producing fiber from hemp was a labor-intensive process once harvest, transport, and processing were included. Technological developments reduced the labor required for hemp, but not enough to eliminate this disadvantage.[35][36]

On October 27, 1970, Congress passed the Comprehensive Drug Abuse Prevention and Control Act of 1970, which, among other things, categorized controlled substances based on their medicinal use and potential for addiction.[37] In 1971, two congressmen released an explosive report on the growing heroin epidemic among U.S. servicemen in Vietnam; ten to fifteen percent of the servicemen were addicted to heroin, and President Nixon declared drug abuse to be “public enemy number one”.[37][38]

Although Nixon declared “drug abuse” to be public enemy number one in 1971,[39] the policies that his administration implemented as part of the Comprehensive Drug Abuse Prevention and Control Act of 1970 were a continuation of drug prohibition policies in the U.S., which started in 1914.[37][40]

“The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. You understand what I’m saying? We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.” John Ehrlichman, to Dan Baum[41][42][43] for Harper’s Magazine[44] in 1994, about President Richard Nixon’s war on drugs, declared in 1971.[45][46]

In 1973, the Drug Enforcement Administration was created to replace the Bureau of Narcotics and Dangerous Drugs.[37]

The Nixon Administration also repealed the federal 2- to 10-year mandatory minimum sentences for possession of marijuana and started federal demand reduction programs and drug-treatment programs. Robert DuPont, the “Drug czar” in the Nixon Administration, stated it would be more accurate to say that Nixon ended, rather than launched, the “war on drugs”. DuPont also argued that it was the proponents of drug legalization who popularized the term “war on drugs”.[16]

In 1982, Vice President George H. W. Bush and his aides began pushing for the involvement of the CIA and U.S. military in drug interdiction efforts.[47]

The Office of National Drug Control Policy (ONDCP) was originally established by the National Narcotics Leadership Act of 1988,[48][49] which mandated a national anti-drug media campaign for youth, which would later become the National Youth Anti-Drug Media Campaign.[50] The director of ONDCP is commonly known as the Drug czar,[37] a position first established in 1989 under President George H. W. Bush[51] and raised to cabinet-level status by Bill Clinton in 1993.[52] These activities were subsequently funded by the Treasury and General Government Appropriations Act of 1998.[53][54] The Drug-Free Media Campaign Act of 1998 codified the campaign at 21 U.S.C. 1708.[55]

The Global Commission on Drug Policy released a report on June 2, 2011, alleging that “The War On Drugs Has Failed”. The commission was made up of 22 self-appointed members, including a number of prominent international politicians and writers. U.S. Surgeon General Regina Benjamin also released the first ever National Prevention Strategy.[56]

On May 21, 2012, the U.S. Government published an updated version of its Drug Policy.[57] The director of ONDCP stated at the same time that this policy was something different from the “War on Drugs”.

At the same meeting, a declaration along the same lines was signed by the representatives of Italy, the Russian Federation, Sweden, the United Kingdom and the United States: “Our approach must be a balanced one, combining effective enforcement to restrict the supply of drugs, with efforts to reduce demand and build recovery; supporting people to live a life free of addiction.”[59]

In March 2016 the International Narcotics Control Board stated that the International Drug Control treaties do not mandate a “war on drugs.”[60]

According to Human Rights Watch, the War on Drugs caused soaring arrest rates that disproportionately targeted African Americans due to various factors.[62] John Ehrlichman, an aide to Nixon, said that Nixon used the war on drugs to criminalize and disrupt black and hippie communities and their leaders.[63]

The present state of incarceration in the U.S. as a result of the war on drugs arrived in several stages. By 1971, drug prohibitions had been in force for more than 50 years (e.g., since 1914 and 1937) with only a very small increase in inmates per 100,000 citizens. During the first nine years after Nixon coined the expression “War on Drugs”, statistics showed only a minor increase in the total number imprisoned.

After 1980, the situation began to change. In the 1980s, while the number of arrests for all crimes had risen by 28%, the number of arrests for drug offenses rose 126%.[64] The result of increased demand was the development of privatization and the for-profit prison industry.[65] The US Department of Justice, reporting on the effects of state initiatives, has stated that, from 1990 through 2000, “the increasing number of drug offenses accounted for 27% of the total growth among black inmates, 7% of the total growth among Hispanic inmates, and 15% of the growth among white inmates.” In addition to prison or jail, the United States provides for the deportation of many non-citizens convicted of drug offenses.[66]

In 1994, the New England Journal of Medicine reported that the “War on Drugs” resulted in the incarceration of one million Americans each year.[67] In 2008, the Washington Post reported that of 1.5 million Americans arrested each year for drug offenses, half a million would be incarcerated.[68] In addition, one in five black Americans would spend time behind bars due to drug laws.[68]

Federal and state policies also impose collateral consequences on those convicted of drug offenses, such as denial of public benefits or licenses, that are not applicable to those convicted of other types of crime.[69] In particular, the passage of the 1990 Solomon-Lautenberg amendment led many states to impose mandatory driver’s license suspensions (of at least 6 months) for persons committing a drug offense, regardless of whether any motor vehicle was involved.[70][71] Approximately 191,000 licenses were suspended in this manner in 2016, according to a Prison Policy Initiative report.[72]

In 1986, the U.S. Congress passed laws that created a 100 to 1 sentencing disparity for the trafficking or possession of crack compared to penalties for trafficking of powder cocaine,[73][74][75][76] which has been widely criticized as discriminatory against minorities, mostly blacks, who were more likely to use crack than powder cocaine.[77] This 100:1 ratio had been required under federal law since 1986.[78] Persons convicted in federal court of possession of 5 grams of crack cocaine received a minimum mandatory sentence of 5 years in federal prison, while possession of 500 grams of powder cocaine carried the same sentence.[74][75] In 2010, the Fair Sentencing Act cut the sentencing disparity to 18:1.[77]
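The thresholds above imply the disparity directly: 500 grams of powder against 5 grams of crack is the 100:1 ratio, and the 2010 move to 18:1 corresponds to a crack threshold of roughly 28 grams. A back-of-the-envelope sketch (my own illustration; variable names are invented, and this is not a legal reference):

```python
# Quantities triggering the same 5-year mandatory minimum (from the text above).
CRACK_GRAMS = 5        # crack cocaine threshold under the 1986 law
POWDER_GRAMS = 500     # powder cocaine threshold, same sentence

ratio_1986 = POWDER_GRAMS / CRACK_GRAMS
print(f"1986 disparity: {ratio_1986:.0f}:1")   # 100:1

# The 2010 Fair Sentencing Act cut the ratio to 18:1, which (with the
# powder threshold held fixed) implies a crack threshold of about 28 g.
crack_grams_2010 = POWDER_GRAMS / 18
print(f"Implied crack threshold after 2010: ~{crack_grams_2010:.0f} g")
```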

According to Human Rights Watch, crime statistics show that, in the United States in 1999, African Americans were far more likely than non-minorities to be arrested for drug crimes, and received much stiffer penalties and sentences.[79]

Statistics from 1998 show that there were wide racial disparities in arrests, prosecutions, sentencing and deaths. African-American drug users accounted for 35% of drug arrests, 55% of convictions, and 74% of people sent to prison for drug possession crimes.[74] Nationwide, African-Americans were sent to state prisons for drug offenses 13 times more often than members of other races,[80] even though they reportedly comprised only 13% of regular drug users.[74]

Anti-drug legislation over time has also displayed an apparent racial bias. University of Minnesota Professor and social justice author Michael Tonry writes, “The War on Drugs foreseeably and unnecessarily blighted the lives of hundreds of thousands of young disadvantaged black Americans and undermined decades of effort to improve the life chances of members of the urban black underclass.”[81]

In 1968, President Lyndon B. Johnson decided that the government needed to make an effort to curtail the social unrest that blanketed the country at the time. He decided to focus his efforts on illegal drug use, an approach which was in line with expert opinion on the subject at the time. In the 1960s, it was believed that at least half of the crime in the U.S. was drug related, and this number grew as high as 90 percent in the next decade.[82] He created the Reorganization Plan of 1968 which merged the Bureau of Narcotics and the Bureau of Drug Abuse to form the Bureau of Narcotics and Dangerous Drugs within the Department of Justice.[83] The belief during this time about drug use was summarized by journalist Max Lerner in his celebrated[citation needed] work America as a Civilization (1957):

As a case in point we may take the known fact of the prevalence of reefer and dope addiction in Negro areas. This is essentially explained in terms of poverty, slum living, and broken families, yet it would be easy to show the lack of drug addiction among other ethnic groups where the same conditions apply.[84]

Richard Nixon became president in 1969, and did not back away from the anti-drug precedent set by Johnson. Nixon began orchestrating drug raids nationwide to improve his “watchdog” reputation. Lois B. Defleur, a social historian who studied drug arrests during this period in Chicago, stated that “police administrators indicated they were making the kind of arrests the public wanted”. Additionally, some of Nixon’s newly created drug enforcement agencies would resort to illegal practices to make arrests as they tried to meet public demand for arrest numbers. From 1972 to 1973, the Office of Drug Abuse and Law Enforcement performed 6,000 drug arrests in 18 months, the majority of those arrested being black.[85]

The next two Presidents, Gerald Ford and Jimmy Carter, responded with programs that were essentially a continuation of those of their predecessors. Shortly after Ronald Reagan became President in 1981, he delivered a speech on the topic. Reagan announced, “We’re taking down the surrender flag that has flown over so many drug efforts; we’re running up a battle flag.”[86] For his first five years in office, Reagan slowly strengthened drug enforcement by creating mandatory minimum sentencing and forfeiture of cash and real estate for drug offenses, policies far more detrimental to poor blacks than any other sector affected by the new laws.[citation needed]

Then, driven by the 1986 cocaine overdose of black basketball star Len Bias, Reagan was able to pass the Anti-Drug Abuse Act through Congress. This legislation appropriated an additional $1.7 billion to fund the War on Drugs. More importantly, it established 29 new mandatory minimum sentences for drug offenses. In the entire history of the country up until that point, the legal system had only seen 55 minimum sentences in total.[87] A major stipulation of the new sentencing rules included different mandatory minimums for powder and crack cocaine. At the time of the bill, there was public debate as to the difference in potency and effect of powder cocaine, generally used by whites, and crack cocaine, generally used by blacks, with many believing that “crack” was substantially more powerful and addictive. Crack and powder cocaine are closely related chemicals, crack being a smokeable, freebase form of powdered cocaine hydrochloride which produces a shorter, more intense high while using less of the drug. This method is more cost effective, and therefore more prevalent on the inner-city streets, while powder cocaine remains more popular in white suburbia. The Reagan administration began shoring up public opinion against “crack”, encouraging DEA official Robert Putnam to play up the harmful effects of the drug. Stories of “crack whores” and “crack babies” became commonplace; by 1986, Time had declared “crack” the issue of the year.[88] Riding the wave of public fervor, Reagan established much harsher sentencing for crack cocaine, handing down stiffer felony penalties for much smaller amounts of the drug.[89]

Reagan protégé and former Vice-President George H. W. Bush was next to occupy the Oval Office, and the drug policy under his watch held true to his political background. Bush maintained the hard line drawn by his predecessor and former boss, increasing narcotics regulation when the First National Drug Control Strategy was issued by the Office of National Drug Control in 1989.[90]

The next three presidents, Clinton, Bush and Obama, continued this trend, maintaining the War on Drugs as they inherited it upon taking office.[91] During this time of relative passivity by the federal government, it was the states that initiated controversial legislation in the War on Drugs. Racial bias manifested itself in the states through such controversial policies as the “stop and frisk” police practices in New York City and the “three strikes” felony laws enacted in California in 1994.[92]

In August 2010, President Obama signed the Fair Sentencing Act into law that dramatically reduced the 100-to-1 sentencing disparity between powder and crack cocaine, which disproportionately affected minorities.[93]

Commonly used illegal drugs include heroin, cocaine, methamphetamine, and marijuana.

Heroin is a highly addictive opiate. If caught selling or possessing heroin, a perpetrator can be charged with a felony, face two to four years in prison, and be fined up to $20,000.[94]

Crystal meth is composed of methamphetamine hydrochloride. It is marketed as either a white powder or in a solid (rock) form. The possession of crystal meth can result in a punishment varying from a fine to a jail sentence. As with other drug crimes, sentencing length may increase depending on the amount of the drug found in the possession of the defendant.[95]

Cocaine possession is illegal across the U.S., with the cheaper crack cocaine incurring even greater penalties. Possession means the accused knowingly has the drug on their person, or in a backpack or purse. For a first offense with no prior conviction, a person convicted of cocaine possession faces a maximum of one year in prison, a $1,000 fine, or both. With one prior conviction, whether for a narcotic or cocaine, the sentence is two years in prison, a $2,500 fine, or both. With two or more prior possession convictions, the person can be sentenced to 90 days in prison along with a $5,000 fine.[96]

Marijuana is the most popular illegal drug worldwide. The punishment for possessing it is less severe than for possession of cocaine or heroin, and in some U.S. states the drug is legal. Over 80 million Americans have tried marijuana. The Criminal Defense Lawyer article claims that, depending on the person’s age and the amount in their possession, they will be fined and may be able to plea bargain into a treatment program rather than going to prison. Convictions, and the amounts of marijuana that trigger them, differ from state to state.[97]

Some scholars have claimed that the phrase “War on Drugs” is propaganda cloaking an extension of earlier military or paramilitary operations.[9] Others have argued that large amounts of “drug war” foreign aid money, training, and equipment actually goes to fighting leftist insurgencies and is often provided to groups who themselves are involved in large-scale narco-trafficking, such as corrupt members of the Colombian military.[8]

From 1963 to the end of the Vietnam War in 1975, marijuana usage became common among U.S. soldiers in non-combat situations. Some servicemen also used heroin; many came home addicted, though they ended their heroin use after returning to the United States. In 1971, the U.S. military conducted a study of drug use among American servicemen and women. It found that daily usage rates for drugs on a worldwide basis were as low as two percent.[98] However, in the spring of 1971, two congressmen released an alarming report alleging that 15% of the servicemen in Vietnam were addicted to heroin. Marijuana use was also common in Vietnam. Soldiers who used drugs had more disciplinary problems. The frequent drug use had become an issue for the commanders in Vietnam; in 1971 it was estimated that 30,000 servicemen were addicted to drugs, most of them to heroin.[11]

From 1971 on, therefore, returning servicemen were required to take a mandatory heroin test. Servicemen who tested positive were not allowed to return home until they subsequently tested negative. The program also offered treatment for heroin addicts.[99]

Elliot Borin’s article “The U.S. Military Needs its Speed”, published in Wired on February 10, 2003, reports:

But the Defense Department, which distributed millions of amphetamine tablets to troops during World War II, Vietnam and the Gulf War, soldiers on, insisting that they are not only harmless but beneficial.

In a news conference held in connection with Schmidt and Umbach’s Article 32 hearing, Dr. Pete Demitry, an Air Force physician and a pilot, claimed that the “Air Force has used (Dexedrine) safely for 60 years” with “no known speed-related mishaps.”

The need for speed, Demitry added “is a life-and-death issue for our military.”[100]

One of the first anti-drug efforts in the realm of foreign policy was President Nixon’s Operation Intercept, announced in September 1969 and targeted at reducing the amount of cannabis entering the United States from Mexico. The effort began with an intense inspection crackdown that resulted in a near-shutdown of cross-border traffic.[101] Because the burden on border crossings was controversial in border states, the effort lasted only twenty days.[102]

On December 20, 1989, the United States invaded Panama as part of Operation Just Cause, which involved 25,000 American troops. Gen. Manuel Noriega, head of the government of Panama, had been giving military assistance to Contra groups in Nicaragua at the request of the U.S. which, in exchange, tolerated his drug trafficking activities, which they had known about since the 1960s.[103][104] When the Drug Enforcement Administration (DEA) tried to indict Noriega in 1971, the CIA prevented them from doing so.[103] The CIA, which was then directed by future president George H. W. Bush, provided Noriega with hundreds of thousands of dollars per year as payment for his work in Latin America.[103] When CIA pilot Eugene Hasenfus was shot down over Nicaragua by the Sandinistas, documents aboard the plane revealed many of the CIA’s activities in Latin America, and the CIA’s connections with Noriega became a public relations “liability” for the U.S. government, which finally allowed the DEA to indict him for drug trafficking, after decades of tolerating his drug operations.[103] The purpose of Operation Just Cause was to capture Noriega and overthrow his government; Noriega found temporary asylum in the Papal Nunciature and surrendered to U.S. soldiers on January 3, 1990.[105] He was sentenced by a court in Miami to 45 years in prison.[103]

As part of its Plan Colombia program, the United States government currently provides hundreds of millions of dollars per year of military aid, training, and equipment to Colombia,[106] to fight left-wing guerrillas such as the Revolutionary Armed Forces of Colombia (FARC-EP), which has been accused of being involved in drug trafficking.[107]

Private U.S. corporations have signed contracts to carry out anti-drug activities as part of Plan Colombia. DynCorp, the largest private company involved, was among those contracted by the State Department, while others signed contracts with the Defense Department.[108]

Colombian military personnel have received extensive counterinsurgency training from U.S. military and law enforcement agencies, including the School of the Americas (SOA). Author Grace Livingstone has stated that more Colombian SOA graduates have been implicated in human rights abuses than currently known SOA graduates from any other country. All of the commanders of the brigades highlighted in a 2001 Human Rights Watch report on Colombia were graduates of the SOA, including the III brigade in Valle del Cauca, where the 2001 Alto Naya Massacre occurred. US-trained officers have been accused of being directly or indirectly involved in many atrocities during the 1990s, including the Massacre of Trujillo and the 1997 Mapiripán Massacre.

In 2000, the Clinton administration initially waived all but one of the human rights conditions attached to Plan Colombia, considering such aid as crucial to national security at the time.[109]

The efforts of U.S. and Colombian governments have been criticized for focusing on fighting leftist guerrillas in southern regions without applying enough pressure on right-wing paramilitaries and continuing drug smuggling operations in the north of the country.[110][111] Human Rights Watch, congressional committees and other entities have documented the existence of connections between members of the Colombian military and the AUC, which the U.S. government has listed as a terrorist group, and that Colombian military personnel have committed human rights abuses which would make them ineligible for U.S. aid under current laws.[citation needed]

In 2010, the Washington Office on Latin America concluded that both Plan Colombia and the Colombian government’s security strategy “came at a high cost in lives and resources, only did part of the job, are yielding diminishing returns and have left important institutions weaker.”[112]

A 2014 report by the RAND Corporation, which was issued to analyze viable strategies for the Mexican drug war in light of successes experienced in Colombia, noted:

Between 1999 and 2002, the United States gave Colombia $2.04 billion in aid, 81 percent of which was for military purposes, placing Colombia just below Israel and Egypt among the largest recipients of U.S. military assistance. Colombia increased its defense spending from 3.2 percent of gross domestic product (GDP) in 2000 to 4.19 percent in 2005. Overall, the results were extremely positive. Greater spending on infrastructure and social programs helped the Colombian government increase its political legitimacy, while improved security forces were better able to consolidate control over large swaths of the country previously overrun by insurgents and drug cartels.

It also notes that, “Plan Colombia has been widely hailed as a success, and some analysts believe that, by 2010, Colombian security forces had finally gained the upper hand once and for all.”[113]

The Mérida Initiative is a security cooperation agreement between the United States, the government of Mexico, and the countries of Central America. It was approved on June 30, 2008, and its stated aim is combating the threats of drug trafficking and transnational crime. The Mérida Initiative appropriated $1.4 billion in a three-year commitment (2008 to 2010) to the Mexican government for military and law enforcement training and equipment, as well as technical advice and training to strengthen the national justice systems. The Mérida Initiative targeted many high-level government officials, but it failed to address the thousands of Central Americans who had to flee their countries due to the danger they faced every day because of the war on drugs; there is still no plan that addresses these people. No weapons are included in the plan.[114][115]

The United States regularly sponsors the spraying of large amounts of herbicides such as glyphosate over the jungles of Central and South America as part of its drug eradication programs. Environmental consequences resulting from aerial fumigation have been criticized as detrimental to some of the world’s most fragile ecosystems;[116] the same aerial fumigation practices are further credited with causing health problems in local populations.[117]

In 2012, the U.S. sent DEA agents to Honduras to assist security forces in counternarcotics operations. Honduras has been a major stop for drug traffickers, who use small planes and landing strips hidden throughout the country to transport drugs. The U.S. government made agreements with several Latin American countries to share intelligence and resources to counter the drug trade. DEA agents, working with other U.S. agencies such as the State Department, the CBP, and Joint Task Force-Bravo, assisted Honduras troops in conducting raids on traffickers’ sites of operation.[118]

The War on Drugs has been a highly contentious issue since its inception. A poll on October 2, 2008, found that three in four Americans believed that the War On Drugs was failing.[119]

At a meeting in Guatemala in 2012, three former presidents from Guatemala, Mexico and Colombia said that the war on drugs had failed and that they would propose a discussion on alternatives, including decriminalization, at the Summit of the Americas in April of that year.[120] Guatemalan President Otto Pérez Molina said that the war on drugs was exacting too high a price on the lives of Central Americans and that it was time to “end the taboo on discussing decriminalization”.[121] At the summit, the government of Colombia pushed for the most far-reaching change to drugs policy since the war on narcotics was declared by Nixon four decades prior, citing the catastrophic effects it had had in Colombia.[122]

Several critics have compared the wholesale incarceration of the dissenting minority of drug users to the wholesale incarceration of other minorities in history. Psychiatrist Thomas Szasz, for example, wrote in 1997, “Over the past thirty years, we have replaced the medical-political persecution of illegal sex users (‘perverts’ and ‘psychopaths’) with the even more ferocious medical-political persecution of illegal drug users.”[123]

Penalties for drug crimes among American youth almost always involve permanent or semi-permanent removal from opportunities for education and the stripping of voting rights, and later involve the creation of criminal records which make employment more difficult.[124] Thus, some authors maintain that the War on Drugs has resulted in the creation of a permanent underclass of people who have few educational or job opportunities, often as a result of being punished for drug offenses which in turn have resulted from attempts to earn a living in spite of having no education or job opportunities.[124]

According to a 2008 study published by Harvard economist Jeffrey A. Miron, the annual savings on enforcement and incarceration costs from the legalization of drugs would amount to roughly $41.3 billion, with $25.7 billion being saved among the states and over $15.6 billion accrued for the federal government. Miron further estimated at least $46.7 billion in tax revenue based on rates comparable to those on tobacco and alcohol ($8.7 billion from marijuana, $32.6 billion from cocaine and heroin, remainder from other drugs).[125]
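The Miron figures quoted above can be checked arithmetically: the state and federal savings sum to the $41.3 billion total, and the drug-specific tax estimates leave about $5.4 billion attributable to other drugs. A small sanity-check sketch (illustrative only; all dollar figures are taken from the study as cited above, in billions of USD):

```python
# Enforcement and incarceration savings (Miron 2008 estimates).
state_savings = 25.7
federal_savings = 15.6
total_savings = state_savings + federal_savings
print(f"Total savings: ${total_savings:.1f} billion")  # matches the cited $41.3 billion

# Tax revenue estimate: the remainder is attributed to drugs other than
# marijuana, cocaine, and heroin.
total_tax = 46.7
marijuana_tax = 8.7
cocaine_heroin_tax = 32.6
other_drugs_tax = total_tax - marijuana_tax - cocaine_heroin_tax
print(f"Implied revenue from other drugs: ${other_drugs_tax:.1f} billion")
```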

Low taxation in Central American countries has been credited with weakening the region’s response to drug traffickers. Many cartels, especially Los Zetas, have taken advantage of these nations’ limited resources. In 2010, tax revenue in El Salvador, Guatemala, and Honduras amounted to just 13.53% of GDP. By comparison, taxes in Chile and the U.S. were 18.6% and 26.9% of GDP respectively. However, direct taxes on income are very hard to enforce, and in some cases tax evasion is seen as a national pastime.[126]

The status of coca and coca growers has become an intense political issue in several countries, including Colombia and particularly Bolivia, where the president, Evo Morales, a former coca growers’ union leader, has promised to legalise the traditional cultivation and use of coca.[127] Indeed, legalization efforts have yielded some successes under the Morales administration when combined with aggressive and targeted eradication efforts. The country saw a 12 to 13% decline in coca cultivation[127] in 2011 under Morales, who has used coca growers’ federations to ensure compliance with the law rather than providing a primary role for security forces.[127]

The coca eradication policy has been criticised for its negative impact on the livelihood of coca growers in South America. In many areas of South America the coca leaf has traditionally been chewed and used in tea and for religious, medicinal and nutritional purposes by locals.[128] For this reason many insist that the illegality of traditional coca cultivation is unjust. In many areas the U.S. government and military has forced the eradication of coca without providing for any meaningful alternative crop for farmers, and has additionally destroyed many of their food or market crops, leaving them starving and destitute.[128]

The CIA, DEA, State Department, and several other U.S. government agencies have been alleged to have relations with various groups which are involved in drug trafficking.

Senator John Kerry’s 1988 U.S. Senate Committee on Foreign Relations report on Contra drug links concludes that members of the U.S. State Department “who provided support for the Contras are involved in drug trafficking… and elements of the Contras themselves knowingly receive financial and material assistance from drug traffickers.”[129] The report further states that “the Contra drug links include… payments to drug traffickers by the U.S. State Department of funds authorized by the Congress for humanitarian assistance to the Contras, in some cases after the traffickers had been indicted by federal law enforcement agencies on drug charges, in others while traffickers were under active investigation by these same agencies.”

In 1996, journalist Gary Webb published reports in the San Jose Mercury News, and later in his book Dark Alliance, detailing how Contras had been involved in distributing crack cocaine into Los Angeles whilst receiving money from the CIA.[citation needed] Contras used money from drug trafficking to buy weapons.[citation needed]

Webb’s premise regarding the U.S. government connection was attacked by the media at the time. It is now widely accepted that Webb’s main assertion of government “knowledge of drug operations, and collaboration with and protection of known drug traffickers” was correct.[130][not in citation given] In 1998, CIA Inspector General Frederick Hitz published a two-volume report[131] that, while seemingly refuting Webb’s claims of knowledge and collaboration in its conclusions, did not deny them in its body.[citation needed] Hitz went on to admit CIA improprieties in the affair in testimony to a House congressional committee. Mainstream media have since reversed their position on Webb’s work, acknowledging his contribution to exposing a scandal they had ignored.

According to Rodney Campbell, an editorial assistant to Nelson Rockefeller, during World War II, the United States Navy, concerned that strikes and labor disputes in U.S. eastern shipping ports would disrupt wartime logistics, released the mobster Lucky Luciano from prison, and collaborated with him to help the mafia take control of those ports. Labor union members were terrorized and murdered by mafia members as a means of preventing labor unrest and ensuring smooth shipping of supplies to Europe.[132]

According to Alexander Cockburn and Jeffrey St. Clair, in order to prevent Communist party members from being elected in Italy following World War II, the CIA worked closely with the Sicilian Mafia, protecting them and assisting in their worldwide heroin smuggling operations. The mafia was in conflict with leftist groups and was involved in assassinating, torturing, and beating leftist political organizers.[133]

In 1986, the US Defense Department funded a two-year study by the RAND Corporation, which found that the use of the armed forces to interdict drugs coming into the United States would have little or no effect on cocaine traffic and might, in fact, raise the profits of cocaine cartels and manufacturers. The 175-page study, “Sealing the Borders: The Effects of Increased Military Participation in Drug Interdiction”, was prepared by seven researchers, mathematicians and economists at the National Defense Research Institute, a branch of RAND, and was released in 1988. The study noted that seven prior studies in the past nine years, including one by the Center for Naval Research and the Office of Technology Assessment, had come to similar conclusions. Interdiction efforts, using current armed forces resources, would have almost no effect on cocaine importation into the United States, the report concluded.[135]

During the early-to-mid-1990s, the Clinton administration ordered and funded a major cocaine policy study, again by RAND. The RAND Drug Policy Research Center study concluded that $3 billion should be switched from federal and local law enforcement to treatment. The report said that treatment is the cheapest way to cut drug use, stating that drug treatment is twenty-three times more effective than the supply-side “war on drugs”.[136]

The National Research Council Committee on Data and Research for Policy on Illegal Drugs published its findings in 2001 on the efficacy of the drug war. The NRC Committee found that existing studies on efforts to address drug usage and smuggling, from U.S. military operations to eradicate coca fields in Colombia, to domestic drug treatment centers, have all been inconclusive, if the programs have been evaluated at all: “The existing drug-use monitoring systems are strikingly inadequate to support the full range of policy decisions that the nation must make…. It is unconscionable for this country to continue to carry out a public policy of this magnitude and cost without any way of knowing whether and to what extent it is having the desired effect.”[137] The study, though not ignored by the press, was ignored by top-level policymakers, leading Committee Chair Charles Manski to conclude, as one observer notes, that “the drug war has no interest in its own results”.[138]

In mid-1995, the US government tried to reduce the supply of methamphetamine precursors to disrupt the market of this drug. According to a 2009 study, this effort was successful, but its effects were largely temporary.[139]

During alcohol prohibition, the period from 1920 to 1933, alcohol use initially fell but began to increase as early as 1922. It has been extrapolated that even if prohibition had not been repealed in 1933, alcohol consumption would have quickly surpassed pre-prohibition levels.[140] One argument against the War on Drugs is that it uses similar measures as Prohibition and is no more effective.

In the six years from 2000 to 2006, the U.S. spent $4.7 billion on Plan Colombia, an effort to eradicate coca production in Colombia. The main result of this effort was to shift coca production into more remote areas and force other forms of adaptation. The overall acreage cultivated for coca in Colombia at the end of the six years was found to be the same, after the U.S. Drug Czar’s office announced a change in measuring methodology in 2005 and included new areas in its surveys.[141] Cultivation in the neighboring countries of Peru and Bolivia increased, an effect some describe as like squeezing a balloon.[142]

Richard Davenport-Hines, in his book The Pursuit of Oblivion,[143] criticized the efficacy of the War on Drugs by pointing out that

10 to 15% of illicit heroin and 30% of illicit cocaine is intercepted. Drug traffickers have gross profit margins of up to 300%. At least 75% of illicit drug shipments would have to be intercepted before the traffickers’ profits were hurt.
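The 75% figure follows from the stated margin: at a 300% gross margin each surviving unit returns four times its cost, so profit reaches zero only when three of every four shipments are seized. A minimal sketch of that break-even arithmetic (my own illustration, not Davenport-Hines’s calculation):

```python
# Break-even interception rate for a trafficker with gross margin m:
# revenue per surviving unit = cost * (1 + m); profit is zero when
# (1 - f) * cost * (1 + m) = cost,  i.e.  f = m / (1 + m).

def breakeven_interception(margin: float) -> float:
    """Fraction of shipments that must be seized to erase gross profit."""
    return margin / (1.0 + margin)

print(breakeven_interception(3.0))   # 300% margin -> 0.75, the 75% in the quote
print(breakeven_interception(1.0))   # a 100% margin would need only half seized
```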

Alberto Fujimori, president of Peru from 1990 to 2000, described U.S. foreign drug policy as “failed” on grounds that “for 10 years, there has been a considerable sum invested by the Peruvian government and another sum on the part of the American government, and this has not led to a reduction in the supply of coca leaf offered for sale. Rather, in the 10 years from 1980 to 1990, it grew 10-fold.”[144]

At least 500 economists, including Nobel Laureates Milton Friedman,[145] George Akerlof and Vernon L. Smith, have noted that reducing the supply of marijuana without reducing the demand causes the price, and hence the profits of marijuana sellers, to go up, according to the laws of supply and demand.[146] The increased profits encourage the producers to produce more drugs despite the risks, providing a theoretical explanation for why attacks on drug supply have failed to have any lasting effect. The aforementioned economists published an open letter to President George W. Bush stating “We urge…the country to commence an open and honest debate about marijuana prohibition… At a minimum, this debate will force advocates of current policy to show that prohibition has benefits sufficient to justify the cost to taxpayers, foregone tax revenues and numerous ancillary consequences that result from marijuana prohibition.”
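The economists’ supply-and-demand argument can be made concrete with a toy model: if demand for a prohibited drug is price-inelastic, removing part of the supply raises the market price more than proportionally, so total seller revenue rises. A hypothetical sketch (the demand curve, elasticity value, and baseline price and quantity are all invented for illustration):

```python
# Constant-elasticity demand: q = q0 * (p / p0) ** elasticity, inverted to
# give the market-clearing price for a given quantity supplied.

def price(quantity: float, elasticity: float = -0.5,
          q0: float = 100.0, p0: float = 10.0) -> float:
    """Price at which the market clears the given quantity."""
    return p0 * (quantity / q0) ** (1.0 / elasticity)

# With inelastic demand (|elasticity| < 1), seizing supply RAISES revenue.
for seized in (0.0, 0.25, 0.5):
    q = 100.0 * (1.0 - seized)
    p = price(q)
    print(f"seize {seized:.0%}: price {p:.2f}, seller revenue {p * q:.0f}")
```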

The declaration from the 2008 World Forum Against Drugs states that a balanced policy of drug abuse prevention, education, treatment, law enforcement, research, and supply reduction provides the most effective platform to reduce drug abuse and its associated harms, and calls on governments to consider demand reduction as one of their first priorities in the fight against drug abuse.[147]

Despite over $7 billion spent annually towards arresting[148] and prosecuting nearly 800,000 people across the country for marijuana offenses in 2005[citation needed] (FBI Uniform Crime Reports), the federally funded Monitoring the Future Survey reports about 85% of high school seniors find marijuana “easy to obtain”. That figure has remained virtually unchanged since 1975, never dropping below 82.7% in three decades of national surveys.[149] The Drug Enforcement Administration states that the number of users of marijuana in the U.S. declined between 2000 and 2005 even with many states passing new medical marijuana laws making access easier,[150] though usage rates remain higher than they were in the 1990s according to the National Survey on Drug Use and Health.[151]

ONDCP stated in April 2011 that there has been a 46 percent drop in cocaine use among young adults over the past five years, and a 65 percent drop in the rate of people testing positive for cocaine in the workplace since 2006.[152] At the same time, a 2007 study found that up to 35% of college undergraduates used stimulants not prescribed to them.[153]

A 2013 study found that prices of heroin, cocaine and cannabis had decreased from 1990 to 2007, but the purity of these drugs had increased during the same time.[154]

The War on Drugs is often called a policy failure.[155][156][157][158][159]

The legality of the War on Drugs has been challenged on four main grounds in the U.S.

Several authors believe that the United States’ federal and state governments have chosen wrong methods for combatting the distribution of illicit substances. Aggressive, heavy-handed enforcement funnels individuals through courts and prisons; instead of treating the cause of the addiction, the focus of government efforts has been on punishment. By making drugs illegal rather than regulating them, the War on Drugs creates a highly profitable black market. Jefferson Fish has edited scholarly collections of articles offering a wide variety of public health based and rights based alternative drug policies.[160][161][162]

In the year 2000, the United States drug-control budget reached 18.4 billion dollars,[163] nearly half of which was spent financing law enforcement while only one sixth was spent on treatment. In the year 2003, 53 percent of the requested drug control budget was for enforcement, 29 percent for treatment, and 18 percent for prevention.[164] The state of New York, in particular, designated 17 percent of its budget towards substance-abuse-related spending. Of that, a mere one percent was put towards prevention, treatment, and research.
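The dollar amounts implied by the 2000 budget figures above can be made explicit (approximations only, since "nearly half" and "one sixth" are the source's own rounded fractions):

```python
# Implied dollar amounts from the 2000 drug-control budget cited above.
# "Nearly half" and "one sixth" are the source's rounded fractions, so these
# are approximations, not exact appropriations.
budget_2000 = 18.4e9
enforcement = budget_2000 / 2      # "nearly half" on law enforcement
treatment = budget_2000 / 6        # "one sixth" on treatment
print(f"enforcement ~ ${enforcement/1e9:.1f}B, treatment ~ ${treatment/1e9:.1f}B")
# enforcement ~ $9.2B, treatment ~ $3.1B
```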

In a survey taken by the Substance Abuse and Mental Health Services Administration (SAMHSA), it was found that substance abusers who remain in treatment longer are less likely to resume their former drug habits. Of the people studied, 66 percent were cocaine users. After long-term in-patient treatment, only 22 percent returned to cocaine use; treatment had reduced the number of cocaine abusers by two-thirds.[163] By spending the majority of its money on law enforcement, the federal government had underestimated the true value of drug-treatment facilities and their benefit toward reducing the number of addicts in the U.S.

In 2004 the federal government issued the National Drug Control Strategy. It supported programs designed to expand treatment options, enhance treatment delivery, and improve treatment outcomes. For example, the Strategy provided SAMHSA with a $100.6 million grant to put towards their Access to Recovery (ATR) initiative. ATR is a program that provides vouchers to addicts to provide them with the means to acquire clinical treatment or recovery support. The project’s goals are to expand capacity, support client choice, and increase the array of faith-based and community based providers for clinical treatment and recovery support services.[165] The ATR program will also provide a more flexible array of services based on the individual’s treatment needs.

The 2004 Strategy additionally declared a significant 32 million dollar raise in the Drug Courts Program, which provides drug offenders with alternatives to incarceration. As a substitute for imprisonment, drug courts identify substance-abusing offenders and place them under strict court monitoring and community supervision, as well as provide them with long-term treatment services.[166] According to a report issued by the National Drug Court Institute, drug courts have a wide array of benefits, with only 16.4 percent of the nation’s drug court graduates rearrested and charged with a felony within one year of completing the program (versus the 44.1 percent of released prisoners who end up back in prison within one year). Additionally, enrolling an addict in a drug court program costs much less than incarcerating one in prison.[167] According to the Bureau of Prisons, the fee to cover the average cost of incarceration for Federal inmates in 2006 was $24,440.[168] The annual cost of receiving treatment in a drug court program ranges from $900 to $3,500. Drug courts in New York State alone saved $2.54 million in incarceration costs.[167]
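The cost comparison in the paragraph above reduces to simple per-person arithmetic, using the Bureau of Prisons and drug-court figures already cited:

```python
# Back-of-envelope comparison using the figures cited above.
incarceration_per_year = 24_440        # BOP average federal cost per inmate, 2006
drug_court_range = (900, 3_500)        # annual drug-court treatment cost range

savings_low = incarceration_per_year - drug_court_range[1]   # costliest treatment
savings_high = incarceration_per_year - drug_court_range[0]  # cheapest treatment
print(f"savings per diverted offender: ${savings_low:,} to ${savings_high:,} per year")
# savings per diverted offender: $20,940 to $23,540 per year
```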

Describing the failure of the War on Drugs, New York Times columnist Eduardo Porter noted:

Jeffrey Miron, an economist at Harvard who studies drug policy closely, has suggested that legalizing all illicit drugs would produce net benefits to the United States of some $65 billion a year, mostly by cutting public spending on enforcement as well as through reduced crime and corruption. A study by analysts at the RAND Corporation, a California research organization, suggested that if marijuana were legalized in California and the drug spilled from there to other states, Mexican drug cartels would lose about a fifth of their annual income of some $6.5 billion from illegal exports to the United States.[169]

Many believe that the War on Drugs has been costly and ineffective largely because inadequate emphasis is placed on treatment of addiction. The United States leads the world in both recreational drug usage and incarceration rates. 70% of men arrested in metropolitan areas test positive for an illicit substance,[170] and 54% of all men incarcerated will be repeat offenders.[171]

There are also programs in the United States to combat the public health risks faced by injecting drug users, such as needle exchange programs, which provide users with sterile needles in exchange for used ones to discourage needle sharing.


War on drugs – Wikipedia

Philippines War on Drugs | Human Rights Watch

Tilted election playing field in Turkey; European Court of Justice confirms rights of same-sex couples; Philippine police promoting abusers; Vietnam’s cyber security law; Nigerian military trying to smear Amnesty International; Paris names imprisoned Bahraini rights activist Nabeel Rajab an honorary citizen; Intimidation of journalists in the US; Brutal US treatment of refugees; and Russia’s World Cup amid Syria slaughter.


The War on Drugs (band) – Wikipedia

The War on Drugs is an American indie rock band from Philadelphia, Pennsylvania, formed in 2005. The band consists of Adam Granduciel (lyrics, vocals, guitar), David Hartley (bass), Robbie Bennett (keyboards), Charlie Hall (drums), Jon Natchez (saxophone, keyboards) and Anthony LaMarca (guitar).

Founded by close collaborators Granduciel and Kurt Vile, The War on Drugs released their debut studio album, Wagonwheel Blues, in 2008. Vile departed shortly after its release to focus on his solo career. The band’s second studio album Slave Ambient was released in 2011 to favorable reviews and extensive touring.

The band’s third album, Lost in the Dream, was released in 2014 following extensive touring and a period of loneliness and depression for primary songwriter Granduciel. The album was released to widespread critical acclaim and increased exposure. Previous collaborator Hall joined the band as its full-time drummer during the recording process, with saxophonist Natchez and additional guitarist LaMarca accompanying the band for its world tour. Signing to Atlantic Records, the six-piece band released their fourth album, A Deeper Understanding, in 2017, which won the Grammy Award for Best Rock Album at the 60th Annual Grammy Awards.

In 2003, frontman Adam Granduciel moved from Oakland, California, to Philadelphia, where he met Kurt Vile, who had also recently moved back to Philadelphia after living in Boston for two years.[2] The duo subsequently began writing, recording and performing music together.[3] Vile stated, “Adam was the first dude I met when I moved back to Philadelphia in 2003. We saw eye-to-eye on a lot of things. I was obsessed with Bob Dylan at the time, and we totally geeked-out on that. We started playing together in the early days and he would be in my band, The Violators. Then, eventually I played in The War On Drugs.”[4]

Granduciel and Vile began playing together as The War on Drugs in 2005. Regarding the band’s name, Granduciel noted, “My friend Julian and I came up with it a few years ago over a couple bottles of red wine and a few typewriters when we were living in Oakland. We were writing a lot back then, working on a dictionary, and it just came out and we were like “hey, good band name” so eventually when I moved to Philadelphia and got a band together I used it. It was either that or The Rigatoni Danzas. I think we made the right choice. I always felt though that it was the kind of name I could record all sorts of different music under without any sort of predictability inherent in the name”[5]

While Vile and Granduciel formed the backbone of the band, they had a number of accompanists early in the group’s career, before finally settling on a lineup that added Charlie Hall as drummer/organist, Kyle Lloyd as drummer and Dave Hartley on bass.[6] Granduciel had previously toured and recorded with The Capitol Years, and Vile has several solo albums.[7] The group gave away its Barrel of Batteries EP for free early in 2008.[8] Their debut LP for Secretly Canadian, Wagonwheel Blues, was released in 2008.[9]

Following the album’s release, and subsequent European tour, Vile departed from the band to focus on his solo career, stating, “I only went on the first European tour when their album came out, and then I basically left the band. I knew if I stuck with that, it would be all my time and my goal was to have my own musical career.”[4] Fellow Kurt Vile & the Violators bandmate Mike Zanghi joined the band at this time, with Vile noting, “Mike was my drummer first and then when The War On Drugs’ first record came out I thought I was lending Mike to Adam for the European tour but then he just played with them all the time so I kind of had to like, while they were touring a lot, figure out my own thing.”[10]

The lineup underwent several changes, and by the end of 2008, Kurt Vile, Charlie Hall, and Kyle Lloyd had all exited the group. At that time Granduciel and Hartley were joined by drummer Mike Zanghi, whom Granduciel also played with in Kurt Vile’s backing band, the Violators.

After recording much of the band’s forthcoming studio album, Slave Ambient, Zanghi departed from the band in 2010. Drummer Steven Urgo subsequently joined the band, with keyboardist Robbie Bennett also joining at around this time. Regarding Zanghi’s exit, Granduciel noted: “I loved Mike, and I loved the sound of The Violators, but then he wasn’t really the sound of my band. But you have things like friendship, and he’s down to tour and he’s a great guy, but it wasn’t the sound of what this band was.”[11]

Slave Ambient was released to favorable reviews in 2011.[citation needed]

In 2012, Patrick Berkery replaced Urgo as the band’s drummer.[12]

On December 4, 2013 the band announced the upcoming release of its third studio album, Lost in the Dream (March 18, 2014). The band streamed the album in its entirety on NPR’s First Listen site for a week before its release.[13]

Lost in the Dream was featured as the Vinyl Me, Please record of the month in August 2014. The pressing was a limited edition pressing on mint green colored vinyl.

In June 2015, The War on Drugs signed with Atlantic Records for a two-album deal.[14]

On Record Store Day, April 22, 2017, The War on Drugs released their new single “Thinking of a Place.”[15] The single was produced by frontman Granduciel and Shawn Everett.[16] On April 28, 2017, The War on Drugs announced a fall 2017 tour in North America and Europe and that a new album was imminent.[17] On June 1, 2017, a new song, “Holding On”, was released, and it was announced that the album would be titled A Deeper Understanding and released on August 25, 2017.[18]

The 2017 tour begins in September, opening in the band’s hometown, Philadelphia, and it concludes in November in Sweden.[19]

A Deeper Understanding was nominated for the International Album of the Year award at the 2018 UK Americana Awards.[20]

At the 60th Annual Grammy Awards on January 28, 2018, A Deeper Understanding won the Grammy for Best Rock Album.[21]

Granduciel and Zanghi are both former members of founding guitarist Vile’s backing band The Violators, with Granduciel noting, “There was never, despite what lazy journalists have assumed, any sort of falling out, or resentment”[22] following Vile’s departure from The War on Drugs. In 2011, Vile stated, “When my record came out, I assumed Adam would want to focus on The War On Drugs but he came with us in The Violators when we toured the States. The Violators became a unit, and although the cast does rotate, we’ve developed an even tighter unity and sound. Adam is an incredible guitar player these days and there is a certain feeling [between us] that nobody else can tap into. We don’t really have to tell each other what to play, it just happens.”

Both Hartley and Granduciel contributed to singer-songwriter Sharon Van Etten’s fourth studio album, Are We There (2014). Hartley performs bass guitar on the entire album, with Granduciel contributing guitar on two tracks.

Granduciel is currently producing the new Sore Eros album. They have been recording it in Philadelphia and Los Angeles on and off for the past several years.[4]

In 2016, The War on Drugs contributed a cover of “Touch of Grey” for a Grateful Dead tribute album called Day of the Dead. The album was curated by The National’s Aaron and Bryce Dessner.[19]

Current members

Former members


A Brief History of the Drug War | Drug Policy Alliance

This video from hip hop legend Jay Z and acclaimed artist Molly Crabapple depicts the drug war’s devastating impact on the Black community from decades of biased law enforcement.

The video traces the drug war from President Nixon to the draconian Rockefeller Drug Laws to the emerging aboveground marijuana market that is poised to make legal millions for wealthy investors doing the same thing that generations of people of color have been arrested and locked up for. After you watch the video, read on to learn more about the discriminatory history of the war on drugs.

Many currently illegal drugs, such as marijuana, opium, coca, and psychedelics have been used for thousands of years for both medical and spiritual purposes. So why are some drugs legal and other drugs illegal today? It’s not based on any scientific assessment of the relative risks of these drugs but it has everything to do with who is associated with these drugs.

The first anti-opium laws in the 1870s were directed at Chinese immigrants. The first anti-cocaine laws in the early 1900s were directed at black men in the South. The first anti-marijuana laws, in the Midwest and the Southwest in the 1910s and 20s, were directed at Mexican migrants and Mexican Americans. Today, Latino and especially black communities are still subject to wildly disproportionate drug enforcement and sentencing practices.

In the 1960s, as drugs became symbols of youthful rebellion, social upheaval, and political dissent, the government halted scientific research to evaluate their medical safety and efficacy.

In June 1971, President Nixon declared a war on drugs. He dramatically increased the size and presence of federal drug control agencies, and pushed through measures such as mandatory sentencing and no-knock warrants.

A top Nixon aide, John Ehrlichman, later admitted: “You want to know what this was really all about? The Nixon campaign in 1968, and the Nixon White House after that, had two enemies: the antiwar left and black people. You understand what I’m saying? We knew we couldn’t make it illegal to be either against the war or black, but by getting the public to associate the hippies with marijuana and blacks with heroin, and then criminalizing both heavily, we could disrupt those communities. We could arrest their leaders, raid their homes, break up their meetings, and vilify them night after night on the evening news. Did we know we were lying about the drugs? Of course we did.” Nixon temporarily placed marijuana in Schedule One, the most restrictive category of drugs, pending review by a commission he appointed, led by Republican Pennsylvania Governor Raymond Shafer.

In 1972, the commission unanimously recommended decriminalizing the possession and distribution of marijuana for personal use. Nixon ignored the report and rejected its recommendations.

Between 1973 and 1977, however, eleven states decriminalized marijuana possession. In January 1977, President Jimmy Carter was inaugurated on a campaign platform that included marijuana decriminalization. In October 1977, the Senate Judiciary Committee voted to decriminalize possession of up to an ounce of marijuana for personal use.

Within just a few years, though, the tide had shifted. Proposals to decriminalize marijuana were abandoned as parents became increasingly concerned about high rates of teen marijuana use. Marijuana was ultimately caught up in a broader cultural backlash against the perceived permissiveness of the 1970s.

The presidency of Ronald Reagan marked the start of a long period of skyrocketing rates of incarceration, largely thanks to his unprecedented expansion of the drug war. The number of people behind bars for nonviolent drug law offenses increased from 50,000 in 1980 to over 400,000 by 1997.

Public concern about illicit drug use built throughout the 1980s, largely due to media portrayals of people addicted to the smokeable form of cocaine dubbed crack. Soon after Ronald Reagan took office in 1981, his wife, Nancy Reagan, began a highly-publicized anti-drug campaign, coining the slogan “Just Say No.”

This set the stage for the zero tolerance policies implemented in the mid-to-late 1980s. Los Angeles Police Chief Daryl Gates, who believed that casual drug users “should be taken out and shot,” founded the DARE drug education program, which was quickly adopted nationwide despite the lack of evidence of its effectiveness. The increasingly harsh drug policies also blocked the expansion of syringe access programs and other harm reduction policies to reduce the rapid spread of HIV/AIDS.

In the late 1980s, a political hysteria about drugs led to the passage of draconian penalties in Congress and state legislatures that rapidly increased the prison population. In 1985, the proportion of Americans polled who saw drug abuse as the nation’s “number one problem” was just 2-6 percent. The figure grew through the remainder of the 1980s until, in September 1989, it reached a remarkable 64 percent, one of the most intense fixations by the American public on any issue in polling history. Within less than a year, however, the figure plummeted to less than 10 percent as the media lost interest. The draconian policies enacted during the hysteria remained, however, and continued to result in escalating levels of arrests and incarceration.

Although Bill Clinton advocated for treatment instead of incarceration during his 1992 presidential campaign, after his first few months in the White House he reverted to the drug war strategies of his Republican predecessors by continuing to escalate the drug war. Notoriously, Clinton rejected a U.S. Sentencing Commission recommendation to eliminate the disparity between crack and powder cocaine sentences.

He also rejected, with the encouragement of drug czar General Barry McCaffrey, Health Secretary Donna Shalala’s advice to end the federal ban on funding for syringe access programs. Yet, a month before leaving office, Clinton asserted in a Rolling Stone interview that “we really need a re-examination of our entire policy on imprisonment” of people who use drugs, and said that marijuana use “should be decriminalized.”

At the height of the drug war hysteria in the late 1980s and early 1990s, a movement emerged seeking a new approach to drug policy. In 1987, Arnold Trebach and Kevin Zeese founded the Drug Policy Foundation, describing it as “the loyal opposition to the war on drugs.” Prominent conservatives such as William Buckley and Milton Friedman had long advocated for ending drug prohibition, as had civil libertarians such as longtime ACLU Executive Director Ira Glasser. In the late 1980s they were joined by Baltimore Mayor Kurt Schmoke, Federal Judge Robert Sweet, Princeton professor Ethan Nadelmann, and other activists, scholars and policymakers.

In 1994, Nadelmann founded The Lindesmith Center as the first U.S. project of George Soros’ Open Society Institute. In 2000, the growing Center merged with the Drug Policy Foundation to create the Drug Policy Alliance.

George W. Bush arrived in the White House as the drug war was running out of steam, yet he allocated more money than ever to it. His drug czar, John Walters, zealously focused on marijuana and launched a major campaign to promote student drug testing. While rates of illicit drug use remained constant, overdose fatalities rose rapidly.

The era of George W. Bush also witnessed the rapid escalation of the militarization of domestic drug law enforcement. By the end of Bush’s term, there were about 40,000 paramilitary-style SWAT raids on Americans every year, mostly for nonviolent drug law offenses, often misdemeanors. While federal reform mostly stalled under Bush, state-level reforms finally began to slow the growth of the drug war.

Politicians now routinely admit to having used marijuana, and even cocaine, when they were younger. When Michael Bloomberg was questioned during his 2001 mayoral campaign about whether he had ever used marijuana, he said, “You bet I did and I enjoyed it.” Barack Obama also candidly discussed his prior cocaine and marijuana use: “When I was a kid, I inhaled frequently. That was the point.”

Public opinion has shifted dramatically in favor of sensible reforms that expand health-based approaches while reducing the role of criminalization in drug policy.

Marijuana reform has gained unprecedented momentum throughout the Americas. Alaska, California, Colorado, Nevada, Oregon, Maine, Massachusetts, Washington State, and Washington D.C. have legalized marijuana for adults. In December 2013, Uruguay became the first country in the world to legally regulate marijuana. In Canada, Prime Minister Justin Trudeau plans to legalize marijuana for adults by 2018.

In response to a worsening overdose epidemic, dozens of U.S. states passed laws to increase access to the overdose antidote, naloxone, as well as 911 Good Samaritan laws to encourage people to seek medical help in the event of an overdose.

Yet the assault on American citizens and others continues, with 700,000 people still arrested for marijuana offenses each year and almost 500,000 people still behind bars for nothing more than a drug law violation.

President Obama, despite supporting several successful policy changes, such as reducing the crack/powder sentencing disparity, ending the ban on federal funding for syringe access programs, and ending federal interference with state medical marijuana laws, did not shift the majority of drug policy funding to a health-based approach.

Now, the new administration is threatening to take us backward toward a 1980s style drug war. President Trump is calling for a wall to keep drugs out of the country, and Attorney General Jeff Sessions has made it clear that he does not support the sovereignty of states to legalize marijuana, and believes “good people don’t smoke marijuana.”

Progress is inevitably slow, and even with an administration hostile to reform there is still unprecedented momentum behind drug policy reform in states and localities across the country. The Drug Policy Alliance and its allies will continue to advocate for health-based reforms such as marijuana legalization, drug decriminalization, safe consumption sites, naloxone access, bail reform, and more.

We look forward to a future where drug policies are shaped by science and compassion rather than political hysteria.


War on Drugs | United States history | Britannica.com

War on Drugs, the effort in the United States since the 1970s to combat illegal drug use by greatly increasing penalties, enforcement, and incarceration for drug offenders.

The War on Drugs began in June 1971 when U.S. Pres. Richard Nixon declared drug abuse to be “public enemy number one” and increased federal funding for drug-control agencies and drug-treatment efforts. In 1973 the Drug Enforcement Administration was created out of the merger of the Office for Drug Abuse Law Enforcement, the Bureau of Narcotics and Dangerous Drugs, and the Office of Narcotics Intelligence to consolidate federal efforts to control drug abuse.

The War on Drugs was a relatively small component of federal law-enforcement efforts until the presidency of Ronald Reagan, which began in 1981. Reagan greatly expanded the reach of the drug war, and his focus on criminal punishment over treatment led to a massive increase in incarcerations for nonviolent drug offenses, from 50,000 in 1980 to 400,000 in 1997. In 1984 his wife, Nancy, spearheaded another facet of the War on Drugs with her “Just Say No” campaign, which was a privately funded effort to educate schoolchildren on the dangers of drug use. The expansion of the War on Drugs was in many ways driven by increased media coverage of, and resulting public nervousness over, the crack epidemic that arose in the early 1980s. This heightened concern over illicit drug use helped drive political support for Reagan’s hard-line stance on drugs. The U.S. Congress passed the Anti-Drug Abuse Act of 1986, which allocated $1.7 billion to the War on Drugs and established a series of mandatory minimum prison sentences for various drug offenses. A notable feature of mandatory minimums was the massive gap between the amounts of crack and of powder cocaine that resulted in the same minimum sentence: possession of five grams of crack led to an automatic five-year sentence while it took the possession of 500 grams of powder cocaine to trigger that sentence. Since approximately 80% of crack users were African American, mandatory minimums led to an unequal increase of incarceration rates for nonviolent black drug offenders, as well as claims that the War on Drugs was a racist institution.

Concerns over the effectiveness of the War on Drugs and increased awareness of the racial disparity of the punishments meted out by it led to decreased public support of the most draconian aspects of the drug war during the early 21st century. Consequently, reforms were enacted during that time, such as the legalization of recreational marijuana in a number of states and the passage of the Fair Sentencing Act of 2010 that reduced the discrepancy of crack-to-powder possession thresholds for minimum sentences from 100-to-1 to 18-to-1. While the War on Drugs is still technically being waged, it is waged at a much less intense level than during its peak in the 1980s.


Institute of Bioengineering and Nanotechnology

Professor Jonathan Clayden, School of Chemistry, University of Bristol, UK

Tuesday, January 23, 2018 9:00 am to 10:00 am

Discovery Theatrette, Level 4 The Matrix, 30 Biopolis Street, Biopolis

Abstract: Biology solves the problem of communicating information through cell membranes by means of conformationally switchable proteins, of which the most important are the G-protein coupled receptors (GPCRs). The lecture will describe the design and synthesis of dynamic foldamers as artificial mimics of GPCRs, with the ultimate aim of controlling function in the interior of an artificial vesicle. Techniques that allow detailed dynamic conformational information to be extracted both in solution and in the membrane phase will be described.

About the Speaker: Jonathan Clayden was born in Uganda in 1968, grew up in the county of Essex, in the East of England, and was an undergraduate at Churchill College, Cambridge. In 1992 he completed a PhD at the University of Cambridge with Dr Stuart Warren. After postdoctoral work with Professor Marc Julia at the École Normale Supérieure in Paris, he moved in 1994 to Manchester as a lecturer. In 2001 he was promoted to full professor, and in 2015 he moved to a position as Professor of Chemistry at the University of Bristol.

His research interests encompass various areas of synthesis and stereochemistry, particularly where conformation has a role to play: asymmetric synthesis, atropisomerism, organolithium chemistry, long-range stereocontrol. He has pioneered the field of dynamic foldamer chemistry for the synthesis of artificial molecules with biomimetic function.

He is a co-author of the widely used textbook Organic Chemistry, and his book Organolithiums: Selectivity for Synthesis was published by Pergamon in 2002.

He has received the Royal Society of Chemistry’s Meldola (1997) and Corday-Morgan (2003) medals, Stereochemistry Prize (2005), Hickinbottom Fellowship (2006) and Merck Prize (2011), and the Novartis Young European Investigator Award (2004). He held senior research fellowships from the Leverhulme Trust and the Royal Society in 2003-4 and 2009-10 and has held a Royal Society Wolfson Research Merit award and a European Research Council Advanced Investigator Grant (€2.5M).

This seminar is free and no registration is required.


Institute of Bioengineering and Nanotechnology

Molecular nanotechnology – Wikipedia

Molecular nanotechnology (MNT) is a technology based on the ability to build structures to complex, atomic specifications by means of mechanosynthesis.[1] This is distinct from nanoscale materials. Based on Richard Feynman’s vision of miniature factories using nanomachines to build complex products (including additional nanomachines), this advanced form of nanotechnology (or molecular manufacturing[2]) would make use of positionally-controlled mechanosynthesis guided by molecular machine systems. MNT would involve combining physical principles demonstrated by biophysics, chemistry, other nanotechnologies, and the molecular machinery of life with the systems engineering principles found in modern macroscale factories.

While conventional chemistry uses inexact processes to obtain inexact results, and biology exploits inexact processes to obtain definitive results, molecular nanotechnology would employ original definitive processes to obtain definitive results. The desire in molecular nanotechnology would be to balance molecular reactions in positionally-controlled locations and orientations to obtain desired chemical reactions, and then to build systems by further assembling the products of these reactions.

A roadmap for the development of MNT is an objective of a broadly based technology project led by Battelle (the manager of several U.S. National Laboratories) and the Foresight Institute.[3] The roadmap was originally scheduled for completion by late 2006, but was released in January 2008.[4] The Nanofactory Collaboration[5] is a more focused ongoing effort involving 23 researchers from 10 organizations and 4 countries that is developing a practical research agenda[6] specifically aimed at positionally-controlled diamond mechanosynthesis and diamondoid nanofactory development. In August 2005, a task force consisting of 50+ international experts from various fields was organized by the Center for Responsible Nanotechnology to study the societal implications of molecular nanotechnology.[7]

One proposed application of MNT is so-called smart materials. This term refers to any sort of material designed and engineered at the nanometer scale for a specific task. It encompasses a wide variety of possible commercial applications. One example would be materials designed to respond differently to various molecules; such a capability could lead, for example, to artificial drugs which would recognize and render inert specific viruses. Another is the idea of self-healing structures, which would repair small tears in a surface naturally in the same way as self-sealing tires or human skin.

A MNT nanosensor would resemble a smart material, involving a small component within a larger machine that would react to its environment and change in some fundamental, intentional way. A very simple example: a photosensor might passively measure the incident light and discharge its absorbed energy as electricity when the light passes above or below a specified threshold, sending a signal to a larger machine. Such a sensor would supposedly cost less and use less power than a conventional sensor, and yet function usefully in all the same applications for example, turning on parking lot lights when it gets dark.
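The threshold behavior described for such a sensor can be sketched as ordinary control logic. This is an illustrative sketch only: the function name, the lux figures, and the hysteresis band (a standard trick to avoid flicker when the light level hovers near the threshold) are all assumptions, not part of any MNT proposal.

```python
def lights_should_be_on(lux: float, currently_on: bool,
                        on_below: float = 40.0, off_above: float = 60.0) -> bool:
    """Threshold sensor with hysteresis: switch the lights on when it gets
    dark, off when it gets bright, and hold the current state inside the
    dead band between the two thresholds to avoid flicker."""
    if lux < on_below:
        return True
    if lux > off_above:
        return False
    return currently_on  # inside the dead band: keep the current state
```

A parking-lot controller would poll the sensor and feed each reading (plus the current lamp state) through this rule.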

While smart materials and nanosensors both exemplify useful applications of MNT, they pale in comparison with the complexity of the technology most popularly associated with the term: the replicating nanorobot.

MNT nanomanufacturing is popularly linked with the idea of swarms of coordinated nanoscale robots working together, a popularization of an early proposal by K. Eric Drexler in his 1986 discussions of MNT, but superseded in 1992. In this early proposal, sufficiently capable nanorobots would construct more nanorobots in an artificial environment containing special molecular building blocks.

Critics have doubted both the feasibility of self-replicating nanorobots and the feasibility of control if self-replicating nanorobots could be achieved: they cite the possibility of mutations removing any control and favoring reproduction of mutant pathogenic variations. Advocates address the first doubt by pointing out that the first macroscale autonomous machine replicator, made of Lego blocks, was built and operated experimentally in 2002.[8] While there are sensory advantages present at the macroscale compared to the limited sensorium available at the nanoscale, proposals for positionally controlled nanoscale mechanosynthetic fabrication systems employ dead reckoning of tooltips combined with reliable reaction sequence design to ensure reliable results, hence a limited sensorium is no handicap; similar considerations apply to the positional assembly of small nanoparts. Advocates address the second doubt by arguing that bacteria are (of necessity) evolved to evolve, while nanorobot mutation could be actively prevented by common error-correcting techniques. Similar ideas are advocated in the Foresight Guidelines on Molecular Nanotechnology,[9] and a map of the 137-dimensional replicator design space[10] recently published by Freitas and Merkle provides numerous proposed methods by which replicators could, in principle, be safely controlled by good design.
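The advocates' appeal to "common error-correcting techniques" can be illustrated with an ordinary checksum: a replicator that verifies its blueprint against a stored digest before copying cannot silently propagate a mutation. The following is a toy sketch; the blueprint string and function names are invented for illustration and do not come from any actual MNT design.

```python
import hashlib

def checksum(blueprint: bytes) -> str:
    """Digest of the blueprint, computed once for a known-good copy."""
    return hashlib.sha256(blueprint).hexdigest()

def replicate(blueprint: bytes, expected: str) -> bytes:
    """Copy the blueprint only if it still matches its stored checksum;
    a mutated (corrupted) blueprint is rejected rather than propagated."""
    if checksum(blueprint) != expected:
        raise ValueError("blueprint corrupted; replication aborted")
    return bytes(blueprint)  # return a fresh, verified copy

bp = b"ASSEMBLE DIAMONDOID STRUT"
digest = checksum(bp)          # recorded when the blueprint is trusted
copy = replicate(bp, digest)   # succeeds: intact blueprint is copied
mutated = b"ASSEMBLE DIAMONDOID STRUX"
# replicate(mutated, digest) would raise ValueError instead of copying
```

Real proposals would use stronger error-correcting codes that can also repair damage, but the rejection principle is the same.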

However, the concept of suppressing mutation raises the question: How can design evolution occur at the nanoscale without a process of random mutation and deterministic selection? Critics argue that MNT advocates have not provided a substitute for such a process of evolution in this nanoscale arena where conventional sensory-based selection processes are lacking. The limits of the sensorium available at the nanoscale could make it difficult or impossible to winnow successes from failures. Advocates argue that design evolution should occur deterministically and strictly under human control, using the conventional engineering paradigm of modeling, design, prototyping, testing, analysis, and redesign.

In any event, since 1992 technical proposals for MNT do not include self-replicating nanorobots, and recent ethical guidelines put forth by MNT advocates prohibit unconstrained self-replication.[9][11]

One of the most important applications of MNT would be medical nanorobotics or nanomedicine, an area pioneered by Robert Freitas in numerous books[12] and papers.[13] The ability to design, build, and deploy large numbers of medical nanorobots would, at a minimum, make possible the rapid elimination of disease and the reliable and relatively painless recovery from physical trauma. Medical nanorobots might also make possible the convenient correction of genetic defects, and help to ensure a greatly expanded lifespan. More controversially, medical nanorobots might be used to augment natural human capabilities. One study has reported that conditions such as tumors, arteriosclerosis, blood clots leading to stroke, accumulation of scar tissue and localized pockets of infection could possibly be addressed by employing medical nanorobots.[14][15]

Another proposed application of molecular nanotechnology is “utility fog”[16] in which a cloud of networked microscopic robots (simpler than assemblers) would change its shape and properties to form macroscopic objects and tools in accordance with software commands. Rather than modify the current practices of consuming material goods in different forms, utility fog would simply replace many physical objects.

Yet another proposed application of MNT would be phased-array optics (PAO).[17] However, this appears to be a problem addressable by ordinary nanoscale technology. PAO would use the principle of phased-array millimeter technology but at optical wavelengths. This would permit the duplication of any sort of optical effect but virtually. Users could request holograms, sunrises and sunsets, or floating lasers as the mood strikes. PAO systems were described in BC Crandall’s Nanotechnology: Molecular Speculations on Global Abundance in the Brian Wowk article “Phased-Array Optics.”[18]

Molecular manufacturing is a potential future subfield of nanotechnology that would make it possible to build complex structures at atomic precision.[19] Molecular manufacturing requires significant advances in nanotechnology, but once achieved could produce highly advanced products at low costs and in large quantities in nanofactories weighing a kilogram or more.[19][20] When nanofactories gain the ability to produce other nanofactories production may only be limited by relatively abundant factors such as input materials, energy and software.[20]
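The claim that self-copying nanofactories would leave production limited only by inputs amounts to exponential growth up to a resource cap, which a few lines make concrete. All figures here (doubling time, cap) are invented for illustration; nothing in the sources specifies them.

```python
def nanofactory_count(days: int, doubling_days: int = 7,
                      resource_cap: int = 10**6) -> int:
    """Nanofactories that build nanofactories double in number each
    doubling period, until an external limit (feedstock, energy)
    caps the population."""
    count = 1
    for _ in range(days // doubling_days):
        count = min(count * 2, resource_cap)
    return count
```

Ten doublings already yield over a thousand factories from one, which is why the limiting factors quickly become materials, energy and software rather than factory capacity.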

The products of molecular manufacturing could range from cheaper, mass-produced versions of known high-tech products to novel products with added capabilities in many areas of application. Some applications that have been suggested are advanced smart materials, nanosensors, medical nanorobots and space travel.[19] Additionally, molecular manufacturing could be used to cheaply produce highly advanced, durable weapons, which is an area of special concern regarding the impact of nanotechnology.[20] Being equipped with compact computers and motors these could be increasingly autonomous and have a large range of capabilities.[20]

According to Chris Phoenix and Mike Treder from the Center for Responsible Nanotechnology as well as Anders Sandberg from the Future of Humanity Institute molecular manufacturing is the application of nanotechnology that poses the most significant global catastrophic risk.[20][21] Several nanotechnology researchers state that the bulk of risk from nanotechnology comes from the potential to lead to war, arms races and destructive global government.[20][21][22] Several reasons have been suggested why the availability of nanotech weaponry may with significant likelihood lead to unstable arms races (compared to e.g. nuclear arms races): (1) A large number of players may be tempted to enter the race since the threshold for doing so is low;[20] (2) the ability to make weapons with molecular manufacturing will be cheap and easy to hide;[20] (3) therefore lack of insight into the other parties’ capabilities can tempt players to arm out of caution or to launch preemptive strikes;[20][23] (4) molecular manufacturing may reduce dependency on international trade,[20] a potential peace-promoting factor;[24] (5) wars of aggression may pose a smaller economic threat to the aggressor since manufacturing is cheap and humans may not be needed on the battlefield.[20]

Since self-regulation by all state and non-state actors seems hard to achieve,[25] measures to mitigate war-related risks have mainly been proposed in the area of international cooperation.[20][26] International infrastructure may be expanded giving more sovereignty to the international level. This could help coordinate efforts for arms control.[27] International institutions dedicated specifically to nanotechnology (perhaps analogously to the International Atomic Energy Agency IAEA) or general arms control may also be designed.[26] One may also jointly make differential technological progress on defensive technologies, a policy that players should usually favour.[20] The Center for Responsible Nanotechnology also suggest some technical restrictions.[28] Improved transparency regarding technological capabilities may be another important facilitator for arms-control.[29]

Grey goo is another catastrophic scenario, proposed by Eric Drexler in his 1986 book Engines of Creation,[30] analyzed by Freitas in “Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations”[31] and a recurring theme in mainstream media and fiction.[32][33] This scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nanotech experts including Drexler now discredit the scenario. According to Chris Phoenix, “So-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident”.[34] With the advent of nano-biotech, a different scenario called green goo has been forwarded. Here, the malignant substance is not nanobots but rather self-replicating biological organisms engineered through nanotechnology.

Nanotechnology (or molecular nanotechnology to refer more specifically to the goals discussed here) will let us continue the historical trends in manufacturing right up to the fundamental limits imposed by physical law. It will let us make remarkably powerful molecular computers. It will let us make materials over fifty times lighter than steel or aluminium alloy but with the same strength. We’ll be able to make jets, rockets, cars or even chairs that, by today’s standards, would be remarkably light, strong, and inexpensive. Molecular surgical tools, guided by molecular computers and injected into the blood stream could find and destroy cancer cells or invading bacteria, unclog arteries, or provide oxygen when the circulation is impaired.

Nanotechnology will replace our entire manufacturing base with a new, radically more precise, radically less expensive, and radically more flexible way of making products. The aim is not simply to replace today’s computer chip making plants, but also to replace the assembly lines for cars, televisions, telephones, books, surgical tools, missiles, bookcases, airplanes, tractors, and all the rest. The objective is a pervasive change in manufacturing, a change that will leave virtually no product untouched. Economic progress and military readiness in the 21st Century will depend fundamentally on maintaining a competitive position in nanotechnology.[35]

Despite the current early developmental status of nanotechnology and molecular nanotechnology, much concern surrounds MNT’s anticipated impact on economics[36][37] and on law. Whatever the exact effects, MNT, if achieved, would tend to reduce the scarcity of manufactured goods and make many more goods (such as food and health aids) manufacturable.

MNT should make possible nanomedical capabilities able to cure any medical condition not already cured by advances in other areas. Good health would be common, and poor health of any form would be as rare as smallpox and scurvy are today. Even cryonics would be feasible, as cryopreserved tissue could be fully repaired.

Molecular nanotechnology is one of the technologies that some analysts believe could lead to a technological singularity. Some feel that molecular nanotechnology would have daunting risks.[38] It conceivably could enable cheaper and more destructive conventional weapons. Also, molecular nanotechnology might permit weapons of mass destruction that could self-replicate, as viruses and cancer cells do when attacking the human body. Commentators generally agree that, in the event molecular nanotechnology were developed, its self-replication should be permitted only under very controlled or “inherently safe” conditions.

A fear exists that nanomechanical robots, if achieved, and if designed to self-replicate using naturally occurring materials (a difficult task), could consume the entire planet in their hunger for raw materials,[39] or simply crowd out natural life, out-competing it for energy (as happened historically when blue-green algae appeared and outcompeted earlier life forms). Some commentators have referred to this situation as the “grey goo” or “ecophagy” scenario. K. Eric Drexler considers an accidental “grey goo” scenario extremely unlikely and says so in later editions of Engines of Creation.

In light of this perception of potential danger, the Foresight Institute, founded by Drexler, has prepared a set of guidelines[40] for the ethical development of nanotechnology. These include the banning of free-foraging self-replicating pseudo-organisms on the Earth’s surface, at least, and possibly in other places.

The feasibility of the basic technologies analyzed in Nanosystems has been the subject of a formal scientific review by U.S. National Academy of Sciences, and has also been the focus of extensive debate on the internet and in the popular press.

In 2006, the U.S. National Academy of Sciences released the report of a study of molecular manufacturing as part of a longer report, A Matter of Size: Triennial Review of the National Nanotechnology Initiative.[41] The study committee reviewed the technical content of Nanosystems, and in its conclusion states that no current theoretical analysis can be considered definitive regarding several questions of potential system performance, and that optimal paths for implementing high-performance systems cannot be predicted with confidence. It recommends experimental research to advance knowledge in this area.

A section heading in Drexler’s Engines of Creation reads[42] “Universal Assemblers”, and the following text speaks of multiple types of assemblers which, collectively, could hypothetically “build almost anything that the laws of nature allow to exist.” Drexler’s colleague Ralph Merkle has noted that, contrary to widespread legend,[43] Drexler never claimed that assembler systems could build absolutely any molecular structure. The endnotes in Drexler’s book explain the qualification “almost”: “For example, a delicate structure might be designed that, like a stone arch, would self-destruct unless all its pieces were already in place. If there were no room in the design for the placement and removal of a scaffolding, then the structure might be impossible to build. Few structures of practical interest seem likely to exhibit such a problem, however.”

In 1992, Drexler published Nanosystems: Molecular Machinery, Manufacturing, and Computation,[44] a detailed proposal for synthesizing stiff covalent structures using a table-top factory. Diamondoid structures and other stiff covalent structures, if achieved, would have a wide range of possible applications, going far beyond current MEMS technology. An outline of a path was put forward in 1992 for building a table-top factory in the absence of an assembler. Other researchers have begun advancing tentative, alternative proposed paths [5] for this in the years since Nanosystems was published.

In 2004 Richard Jones wrote Soft Machines (nanotechnology and life), a book for lay audiences published by Oxford University Press. In this book he describes radical nanotechnology (as advocated by Drexler) as a deterministic/mechanistic idea of nano-engineered machines that does not take into account nanoscale challenges such as wetness, stickiness, Brownian motion, and high viscosity. He also explains soft nanotechnology or, more appropriately, biomimetic nanotechnology, which he argues is the way forward, if not the best way, to design functional nanodevices that can cope with all the problems at the nanoscale. One can think of soft nanotechnology as the development of nanomachines that use the lessons learned from biology on how things work, chemistry to precisely engineer such devices, and stochastic physics to model the system and its natural processes in detail.

Several researchers, including Nobel Prize winner Dr. Richard Smalley (1943-2005),[45] attacked the notion of universal assemblers, leading to a rebuttal from Drexler and colleagues,[46] and eventually to an exchange of letters.[47] Smalley argued that chemistry is extremely complicated, reactions are hard to control, and that a universal assembler is science fiction. Drexler and colleagues, however, noted that Drexler never proposed universal assemblers able to make absolutely anything, but instead proposed more limited assemblers able to make a very wide variety of things. They challenged the relevance of Smalley’s arguments to the more specific proposals advanced in Nanosystems. Also, Smalley argued that nearly all of modern chemistry involves reactions that take place in a solvent (usually water), because the small molecules of a solvent contribute many things, such as lowering binding energies for transition states. Since nearly all known chemistry requires a solvent, Smalley felt that Drexler’s proposal to use a high vacuum environment was not feasible. However, Drexler addresses this in Nanosystems by showing mathematically that well designed catalysts can provide the effects of a solvent and can fundamentally be made even more efficient than a solvent/enzyme reaction could ever be. It is noteworthy that, contrary to Smalley’s opinion that enzymes require water, “Not only do enzymes work vigorously in anhydrous organic media, but in this unnatural milieu they acquire remarkable properties such as greatly enhanced stability, radically altered substrate and enantiomeric specificities, molecular memory, and the ability to catalyse unusual reactions.”[48]

For the future, some means have to be found for MNT design evolution at the nanoscale which mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms combined with culling of the less-successful variants and reproduction of the more-successful variants, and macroscale engineering design also proceeds by a process of design evolution from simplicity to complexity as set forth somewhat satirically by John Gall: “A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and can not be patched up to make it work. You have to start over, beginning with a system that works.” [49] A breakthrough in MNT is needed which proceeds from the simple atomic ensembles which can be built with, e.g., an STM to complex MNT systems via a process of design evolution. A handicap in this process is the difficulty of seeing and manipulation at the nanoscale compared to the macroscale which makes deterministic selection of successful trials difficult; in contrast biological evolution proceeds via action of what Richard Dawkins has called the “blind watchmaker” [50] comprising random molecular variation and deterministic reproduction/extinction.
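The "blind watchmaker" process contrasted above can be made concrete with a toy version of Dawkins-style cumulative selection: random mutation plus a deterministic keep-if-no-worse rule, here hill-climbing toward a fixed target bit-string standing in for a "successful design". Everything in this sketch (target, alphabet, step bound) is illustrative.

```python
import random

def evolve(target: str, alphabet: str = "01", seed: int = 0,
           max_steps: int = 10_000):
    """Random variation with deterministic culling: mutate one random
    position, and keep the mutant only if it matches the target at
    least as well as its parent. Returns (final string, steps used)."""
    rng = random.Random(seed)
    parent = "".join(rng.choice(alphabet) for _ in target)
    score = sum(a == b for a, b in zip(parent, target))
    steps = 0
    while score < len(target) and steps < max_steps:
        i = rng.randrange(len(target))
        child = parent[:i] + rng.choice(alphabet) + parent[i + 1:]
        child_score = sum(a == b for a, b in zip(child, target))
        if child_score >= score:          # deterministic selection step
            parent, score = child, child_score
        steps += 1
    return parent, steps
```

The catch at the nanoscale, as the paragraph notes, is the selection step: this loop assumes each variant can be scored cheaply and reliably, which a limited nanoscale sensorium may not permit.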

At present in 2007 the practice of nanotechnology embraces both stochastic approaches (in which, for example, supramolecular chemistry creates waterproof pants) and deterministic approaches wherein single molecules (created by stochastic chemistry) are manipulated on substrate surfaces (created by stochastic deposition methods) by deterministic methods comprising nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive. Since the mid-1990s, thousands of surface scientists and thin film technocrats have latched on to the nanotechnology bandwagon and redefined their disciplines as nanotechnology. This has caused much confusion in the field and has spawned thousands of “nano”-papers in the peer-reviewed literature. Most of these reports are extensions of the more ordinary research done in the parent fields.

The feasibility of Drexler’s proposals largely depends, therefore, on whether designs like those in Nanosystems could be built in the absence of a universal assembler to build them and would work as described. Supporters of molecular nanotechnology frequently claim that no significant errors have been discovered in Nanosystems since 1992. Even some critics concede[51] that “Drexler has carefully considered a number of physical principles underlying the ‘high level’ aspects of the nanosystems he proposes and, indeed, has thought in some detail” about some issues.

Other critics claim, however, that Nanosystems omits important chemical details about the low-level ‘machine language’ of molecular nanotechnology.[52][53][54][55] They also claim that much of the other low-level chemistry in Nanosystems requires extensive further work, and that Drexler’s higher-level designs therefore rest on speculative foundations. Recent such further work by Freitas and Merkle [56] is aimed at strengthening these foundations by filling the existing gaps in the low-level chemistry.

Drexler argues that we may need to wait until our conventional nanotechnology improves before solving these issues: “Molecular manufacturing will result from a series of advances in molecular machine systems, much as the first Moon landing resulted from a series of advances in liquid-fuel rocket systems. We are now in a position like that of the British Interplanetary Society of the 1930s which described how multistage liquid-fueled rockets could reach the Moon and pointed to early rockets as illustrations of the basic principle.”[57] However, Freitas and Merkle argue [58] that a focused effort to achieve diamond mechanosynthesis (DMS) can begin now, using existing technology, and might achieve success in less than a decade if their “direct-to-DMS approach is pursued rather than a more circuitous development approach that seeks to implement less efficacious nondiamondoid molecular manufacturing technologies before progressing to diamondoid”.

To summarize the arguments against feasibility: First, critics argue that a primary barrier to achieving molecular nanotechnology is the lack of an efficient way to create machines on a molecular/atomic scale, especially in the absence of a well-defined path toward a self-replicating assembler or diamondoid nanofactory. Advocates respond that a preliminary research path leading to a diamondoid nanofactory is being developed.[6]

A second difficulty in reaching molecular nanotechnology is design. Hand design of a gear or bearing at the level of atoms might take a few to several weeks. While Drexler, Merkle and others have created designs of simple parts, no comprehensive design effort for anything approaching the complexity of a Model T Ford has been attempted. Advocates respond that it is difficult to undertake a comprehensive design effort in the absence of significant funding for such efforts, and that despite this handicap much useful design-ahead has nevertheless been accomplished with new software tools that have been developed, e.g., at Nanorex.[59]

In the latest report A Matter of Size: Triennial Review of the National Nanotechnology Initiative[41] put out by the National Academies Press in December 2006 (roughly twenty years after Engines of Creation was published), no clear way forward toward molecular nanotechnology could yet be seen, as per the conclusion on page 108 of that report: “Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths that might lead to systems which greatly exceed the thermodynamic efficiencies and other capabilities of biological systems cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal.” This call for research leading to demonstrations is welcomed by groups such as the Nanofactory Collaboration who are specifically seeking experimental successes in diamond mechanosynthesis.[60] The “Technology Roadmap for Productive Nanosystems”[61] aims to offer additional constructive insights.

It is perhaps interesting to ask whether or not most structures consistent with physical law can in fact be manufactured. Advocates assert that to achieve most of the vision of molecular manufacturing it is not necessary to be able to build “any structure that is compatible with natural law.” Rather, it is necessary to be able to build only a sufficient (possibly modest) subset of such structures, as is true, in fact, of any practical manufacturing process used in the world today, and is true even in biology. In any event, as Richard Feynman once said, “It is scientific only to say what’s more likely or less likely, and not to be proving all the time what’s possible or impossible.”[62]

There is a growing body of peer-reviewed theoretical work on synthesizing diamond by mechanically removing/adding hydrogen atoms [63] and depositing carbon atoms [64][65][66][67][68][69] (a process known as mechanosynthesis). This work is slowly permeating the broader nanoscience community and is being critiqued. For instance, Peng et al. (2006)[70] (in the continuing research effort by Freitas, Merkle and their collaborators) reports that the most-studied mechanosynthesis tooltip motif (DCB6Ge) successfully places a C2 carbon dimer on a C(110) diamond surface at both 300K (room temperature) and 80K (liquid nitrogen temperature), and that the silicon variant (DCB6Si) also works at 80K but not at 300K. Over 100,000 CPU hours were invested in this latest study. The DCB6 tooltip motif, initially described by Merkle and Freitas at a Foresight Conference in 2002, was the first complete tooltip ever proposed for diamond mechanosynthesis and remains the only tooltip motif that has been successfully simulated for its intended function on a full 200-atom diamond surface.

The tooltips modeled in this work are intended to be used only in carefully controlled environments (e.g., vacuum). Maximum acceptable limits for tooltip translational and rotational misplacement errors are reported in Peng et al. (2006) — tooltips must be positioned with great accuracy to avoid bonding the dimer incorrectly. Peng et al. (2006) reports that increasing the handle thickness from 4 support planes of C atoms above the tooltip to 5 planes decreases the resonance frequency of the entire structure from 2.0THz to 1.8THz. More importantly, the vibrational footprints of a DCB6Ge tooltip mounted on a 384-atom handle and of the same tooltip mounted on a similarly constrained but much larger 636-atom “crossbar” handle are virtually identical in the non-crossbar directions. Additional computational studies modeling still bigger handle structures are welcome, but the ability to precisely position SPM tips to the requisite atomic accuracy has been repeatedly demonstrated experimentally at low temperature,[71][72] or even at room temperature[73][74] constituting a basic existence proof for this capability.

Further research[75] to consider additional tooltips will require time-consuming computational chemistry and difficult laboratory work.

A working nanofactory would require a variety of well-designed tips for different reactions, and detailed analyses of placing atoms on more complicated surfaces. Although this appears a challenging problem given current resources, many tools will be available to help future researchers: Moore’s law predicts further increases in computer power, semiconductor fabrication techniques continue to approach the nanoscale, and researchers grow ever more skilled at using proteins, ribosomes and DNA to perform novel chemistry.


