
Nanotechnology – Wikipedia

Nanotechnology (“nanotech”) is manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form “nanotechnologies” as well as “nanoscale technologies” to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. Through 2012, the USA has invested $3.7 billion using its National Nanotechnology Initiative, the European Union has invested $1.2 billion, and Japan has invested $750 million.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, energy storage,[4][5] microfabrication,[6] molecular engineering, etc.[7] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly,[8] from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[9] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There’s Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term “nano-technology” was first used by Norio Taniguchi in 1974, though it was not widely known.

Inspired by Feynman’s concepts, K. Eric Drexler used the term “nanotechnology” in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications.

Thus, emergence of nanotechnology as a field in the 1980s occurred through convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. Since the popularity spike in the 1980s, most of nanotechnology has involved investigation of several approaches to making mechanical devices out of a small number of atoms.[10]

In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era. First was the invention of the scanning tunneling microscope in 1981, which provided unprecedented visualization of individual atoms and bonds and was successfully used to manipulate individual atoms in 1989. The microscope’s developers Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory received a Nobel Prize in Physics in 1986.[11][12] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[13][14] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes called Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[15] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[16]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[17][18]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[19][20] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which are approximately a quarter of a nm kinetic diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[21] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[22]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[23] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[23]
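
A quick numerical check of the marble-to-Earth analogy, assuming a marble about 1 cm across and an Earth diameter of roughly 12,700 km (values not given in the text), shows the two ratios do come out comparable:

```python
# Quick numerical check of the analogy above. The marble diameter (~1 cm)
# and Earth's mean diameter (~1.274e7 m) are assumed values, not from the text.
nanometer = 1e-9          # meters
meter = 1.0               # meters

marble_diameter = 0.01    # meters, assumed ~1 cm marble
earth_diameter = 1.274e7  # meters, approximate mean diameter of Earth

print(nanometer / meter)                 # 1e-09
print(marble_diameter / earth_diameter)  # ~7.8e-10, roughly the same ratio
```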

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[24] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[25]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio altering mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
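
The surface-area-to-volume argument can be made concrete with an idealized spherical particle, for which the ratio is 3/r; the minimal sketch below simply evaluates that formula for shrinking radii:

```python
# Surface-area-to-volume ratio of an idealized spherical particle:
# SA = 4*pi*r^2 and V = (4/3)*pi*r^3, so SA/V = 3/r. Halving the radius
# doubles the ratio, which is why nanoscale particles are surface-dominated.
def surface_to_volume(radius_nm: float) -> float:
    """Return SA/V in nm^-1 for a sphere of the given radius in nm."""
    return 3.0 / radius_nm

for r in (1000.0, 100.0, 10.0, 1.0):  # radii from 1 micron down to 1 nm
    print(f"r = {r:7.1f} nm  ->  SA/V = {surface_to_volume(r):.3f} nm^-1")
```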

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[26]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.
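
As a toy illustration of recognition through complementarity (only an encoding of the pairing rule, not a physical model, and ignoring the antiparallel orientation of real DNA strands), the Watson–Crick rule can be written as a simple lookup:

```python
# Toy illustration of the Watson-Crick pairing rule mentioned above: each base
# admits exactly one partner (A-T, G-C), so complementary strands "recognize"
# each other. Strand orientation is ignored for simplicity.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the base-by-base complement of a DNA strand."""
    return "".join(PAIR[b] for b in strand)

def binds(strand_a: str, strand_b: str) -> bool:
    """Two strands are mutually complementary if every base pairs up."""
    return len(strand_a) == len(strand_b) and complement(strand_a) == strand_b

print(binds("ATGGC", "TACCG"))  # True: designed to be mutually complementary
print(binds("ATGGC", "TACGG"))  # False: one mismatch breaks the recognition
```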

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular scale biological analogies of traditional machine components demonstrated molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[27] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[28] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[29] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[30] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[31] and a nanoelectromechanical relaxation oscillator.[32] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[35]

Bottom-up approaches seek to arrange smaller components into more complex assemblies.

Top-down approaches seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than the details of how such inventions could actually be created.

Nanomaterials can be classified into 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological characteristics. With a decrease in dimensionality, an increase in surface-to-volume ratio is observed, meaning that lower-dimensional nanomaterials have a higher surface area compared to 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.
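
The 0D/1D/2D/3D convention described above can be sketched as a classification by how many of a material's dimensions exceed the 100 nm threshold; the example categories in the comments are illustrative assumptions:

```python
# Sketch of the dimensionality convention described above: class a nanomaterial
# by how many of its three dimensions exceed the ~100 nm nanoscale threshold
# (0D: none, e.g. quantum dots; 1D: one, e.g. nanowires; 2D: two, e.g.
# nanosheets; 3D: all three, e.g. bulk nanostructured solids).
NANOSCALE_LIMIT_NM = 100.0

def dimensionality(x_nm: float, y_nm: float, z_nm: float) -> str:
    above = sum(d > NANOSCALE_LIMIT_NM for d in (x_nm, y_nm, z_nm))
    return f"{above}D"

print(dimensionality(5, 5, 5))          # 0D, e.g. a quantum dot
print(dimensionality(20, 20, 5000))     # 1D, e.g. a nanowire
print(dimensionality(10000, 10000, 1))  # 2D, e.g. a graphene-like nanosheet
```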

There are several important modern developments. The atomic force microscope (AFM) and the scanning tunneling microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[53][54] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography or nanoimprint lithography were also developed. Lithography is a top-down fabrication technique where a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[55]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[53][54] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories such as John R. Arthur, Alfred Y. Cho, and Art C. Gossard developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

New therapeutic products based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and have already been approved for human use in some countries.[56]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[18] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[57] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[17]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[58] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[59] Nanotechnology is also being used to build structures for on-chip computing with light, for example on-chip optical quantum information processing and picosecond transmission of information.[60]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[61] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[62]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First, a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next, the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[63] Platinum is used in both the reduction and the oxidation catalysts.[64] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish company InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst’s surface area that is exposed to the exhaust fumes is maximized, efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[65]
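
The surface-area reasoning behind such projects can be illustrated with a simple model: for a fixed volume of catalyst divided into spheres of radius r, the total exposed area scales as 1/r. The quantities below are arbitrary illustrative values, not figures from the project described above:

```python
import math

# For a fixed total volume of catalyst split into spheres of radius r,
# total exposed surface area = (V / ((4/3)*pi*r^3)) * 4*pi*r^2 = 3*V/r.
# The 1 mm^3 volume and the radii below are illustrative values only.
def total_surface_area(total_volume_m3: float, radius_m: float) -> float:
    particle_volume = (4.0 / 3.0) * math.pi * radius_m ** 3
    particle_area = 4.0 * math.pi * radius_m ** 2
    return (total_volume_m3 / particle_volume) * particle_area

volume = 1e-9  # 1 mm^3 of platinum, an arbitrary fixed amount
for r in (1e-6, 1e-7, 1e-8):  # radii of 1 um, 100 nm, 10 nm
    print(f"r = {r:.0e} m  ->  exposed area = {total_surface_area(volume, r):.3f} m^2")
```

Going from 1 µm particles to 10 nm particles of the same total volume multiplies the exposed surface by about one hundred, which is the sense in which nanostructuring "saves" platinum.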

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell’s microenvironment to direct its differentiation down a suitable lineage.[66] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[67]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[68]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[69][70]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[71] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[72]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[73]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[74] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[75] Cambridge, Massachusetts in 2008 considered enacting a similar law,[76] but ultimately rejected it.[77] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[78] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly. Over the next several decades, applications of nanotechnology will likely include much higher-capacity computers, active materials of various kinds, and cellular-scale biomedical devices.[10]

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[79] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[80] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[81][82]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[83]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the “nanotechnology revolution”, could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said, “We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully.”[84] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[85] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[86][87][88][89]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[90] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by “bolting on” nanotechnology to existing regulations, but there are clear gaps in these regimes.[91] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[92]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[93] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[94] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[95][96]

The Royal Society report[15] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application, with participants in public deliberations more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[73]

What is Nanotechnology? | Nano

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers.

Physicist Richard Feynman, the father of nanotechnology.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled There’s Plenty of Room at the Bottom by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology (CalTech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn’t until 1981, with the development of the scanning tunneling microscope that could “see” individual atoms, that modern nanotechnology began.

It’s hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ of a meter. Here are a few illustrative examples:
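
A few worked scale comparisons follow; the inch conversion is exact, while the sheet-of-paper and human-hair figures are rough, commonly cited approximations rather than values stated in this article:

```python
# A few worked scale comparisons (the inch conversion is exact; the paper and
# hair figures are rough, commonly cited approximations).
NM_PER_METER = 1e9

inch_in_nm = 0.0254 * NM_PER_METER          # 25,400,000 nm in one inch (exact)
paper_thickness_nm = 100e-6 * NM_PER_METER  # ~100,000 nm, assuming ~0.1 mm paper
hair_diameter_nm = 80e-6 * NM_PER_METER     # ~80,000 nm, assuming an ~80 um hair

print(f"1 inch         = {inch_in_nm:,.0f} nm")
print(f"sheet of paper ~ {paper_thickness_nm:,.0f} nm thick")
print(f"human hair     ~ {hair_diameter_nm:,.0f} nm in diameter")
```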

Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms: the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

But something as small as an atom is impossible to see with the naked eye. In fact, it’s impossible to see with the microscopes typically used in a high school science class. The microscopes needed to see things at the nanoscale were invented relatively recently, about 30 years ago.

Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born.

Although modern nanoscience and nanotechnology are quite new, nanoscale materials were used for centuries. Alternate-sized gold and silver particles created colors in the stained glass windows of medieval churches hundreds of years ago. The artists back then just didn’t know that the process they used to create these beautiful works of art actually led to changes in the composition of the materials they were working with.

Today’s scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of their enhanced properties such as higher strength, lighter weight, increased control of light spectrum, and greater chemical reactivity than their larger-scale counterparts.

Nanotechnology | Britannica.com

Nanotechnology, the manipulation and manufacture of materials and devices on the scale of atoms or small groups of atoms. The nanoscale is typically measured in nanometres, or billionths of a metre (nanos, the Greek word for dwarf, being the source of the prefix), and materials built at this scale often exhibit distinctive physical and chemical properties due to quantum mechanical effects. Although usable devices this small may be decades away (see microelectromechanical system), techniques for working at the nanoscale have become essential to electronic engineering, and nanoengineered materials have begun to appear in consumer products. For example, billions of microscopic nanowhiskers, each about 10 nanometres in length, have been molecularly hooked onto natural and synthetic fibres to impart stain resistance to clothing and other fabrics; zinc oxide nanocrystals have been used to create invisible sunscreens that block ultraviolet light; and silver nanocrystals have been embedded in bandages to kill bacteria and prevent infection.

Possibilities for the future are numerous. Nanotechnology may make it possible to manufacture lighter, stronger, and programmable materials that require less energy to produce than conventional materials, that produce less waste than with conventional manufacturing, and that promise greater fuel efficiency in land transportation, ships, aircraft, and space vehicles. Nanocoatings for both opaque and translucent surfaces may render them resistant to corrosion, scratches, and radiation. Nanoscale electronic, magnetic, and mechanical devices and systems with unprecedented levels of information processing may be fabricated, as may chemical, photochemical, and biological sensors for protection, health care, manufacturing, and the environment; new photoelectric materials that will enable the manufacture of cost-efficient solar-energy panels; and molecular-semiconductor hybrid devices that may become engines for the next revolution in the information age. The potential for improvements in health, safety, quality of life, and conservation of the environment are vast.

At the same time, significant challenges must be overcome for the benefits of nanotechnology to be realized. Scientists must learn how to manipulate and characterize individual atoms and small groups of atoms reliably. New and improved tools are needed to control the properties and structure of materials at the nanoscale; significant improvements in computer simulations of atomic and molecular structures are essential to the understanding of this realm. Next, new tools and approaches are needed for assembling atoms and molecules into nanoscale systems and for the further assembly of small systems into more-complex objects. Furthermore, nanotechnology products must provide not only improved performance but also lower cost. Finally, without integration of nanoscale objects with systems at the micro- and macroscale (that is, from millionths of a metre up to the millimetre scale), it will be very difficult to exploit many of the unique properties found at the nanoscale.

Nanotechnology is highly interdisciplinary, involving physics, chemistry, biology, materials science, and the full range of the engineering disciplines. The word nanotechnology is widely used as shorthand to refer to both the science and the technology of this emerging field. Narrowly defined, nanoscience concerns a basic understanding of physical, chemical, and biological properties on atomic and near-atomic scales. Nanotechnology, narrowly defined, employs controlled manipulation of these properties to create materials and functional systems with unique capabilities.

In contrast to recent engineering efforts, nature developed nanotechnologies over billions of years, employing enzymes and catalysts to organize with exquisite precision different kinds of atoms and molecules into complex microscopic structures that make life possible. These natural products are built with great efficiency and have impressive capabilities, such as the power to harvest solar energy, to convert minerals and water into living cells, to store and process massive amounts of data using large arrays of nerve cells, and to replicate perfectly billions of bits of information stored in molecules of deoxyribonucleic acid (DNA).

There are two principal reasons for qualitative differences in material behaviour at the nanoscale (traditionally defined as less than 100 nanometres). First, quantum mechanical effects come into play at very small dimensions and lead to new physics and chemistry. Second, a defining feature at the nanoscale is the very large surface-to-volume ratio of these structures. This means that no atom is very far from a surface or interface, and the behaviour of atoms at these higher-energy sites has a significant influence on the properties of the material. For example, the reactivity of a metal catalyst particle generally increases appreciably as its size is reduced: macroscopic gold is chemically inert, whereas at nanoscales gold becomes extremely reactive and catalytic and even melts at a lower temperature. Thus, at nanoscale dimensions material properties depend on and change with size, as well as composition and structure.

Using the processes of nanotechnology, basic industrial production may veer dramatically from the course followed by steel plants and chemical factories of the past. Raw materials will come from the atoms of abundant elements (carbon, hydrogen, and silicon), and these will be manipulated into precise configurations to create nanostructured materials that exhibit exactly the right properties for each particular application. For example, carbon atoms can be bonded together in a number of different geometries to create variously a fibre, a tube, a molecular coating, or a wire, all with the superior strength-to-weight ratio of another carbon material, diamond. Additionally, such material processing need not require smokestacks, power-hungry industrial machinery, or intensive human labour. Instead, it may be accomplished either by growing new structures through some combination of chemical catalysts and synthetic enzymes or by building them through new techniques based on patterning and self-assembly of nanoscale materials into useful predetermined designs. Nanotechnology ultimately may allow people to fabricate almost any type of material or product allowable under the laws of physics and chemistry. While such possibilities seem remote, even approaching nature’s virtuosity in energy-efficient fabrication would be revolutionary.

Even more revolutionary would be the fabrication of nanoscale machines and devices for incorporation into micro- and macroscale systems. Once again, nature has led the way with the fabrication of both linear and rotary molecular motors. These biological machines carry out such tasks as muscle contraction (in organisms ranging from clams to humans) and shuttling little packets of material around within cells while being powered by the recyclable, energy-efficient fuel adenosine triphosphate. Scientists are only beginning to develop the tools to fabricate functioning systems at such small scales, with most advances based on electronic or magnetic information processing and storage systems. The energy-efficient, reconfigurable, and self-repairing aspects of biological systems are just becoming understood.

The potential impact of nanotechnology processes, machines, and products is expected to be far-reaching, affecting nearly every conceivable information technology, energy source, agricultural product, medical device, pharmaceutical, and material used in manufacturing. Meanwhile, the dimensions of electronic circuits on semiconductors continue to shrink, with minimum feature sizes now reaching the nanorealm, under 100 nanometres. Likewise, magnetic memory materials, which form the basis of hard disk drives, have achieved dramatically greater memory density as a result of nanoscale structuring to exploit new magnetic effects at nanodimensions. These latter two areas represent another major trend, the evolution of critical elements of microtechnology into the realm of nanotechnology to enhance performance. They are immense markets driven by the rapid advance of information technology.

In a lecture in 1959 to the American Physical Society, There’s Plenty of Room at the Bottom, American Nobelist Richard P. Feynman presented his audience with a vision of what could be done with extreme miniaturization. He began his lecture by noting that the Lord’s Prayer had been written on the head of a pin and asked,

Why cannot we write the entire 24 volumes of the Encyclopædia Britannica on the head of a pin? Let’s see what would be involved. The head of a pin is a sixteenth of an inch across. If you magnify it by 25,000 diameters, the area of the head of the pin is then equal to the area of all the pages of the Encyclopædia Britannica. Therefore, all it is necessary to do is to reduce in size all the writing in the Encyclopædia by 25,000 times. Is that possible? The resolving power of the eye is about 1/120 of an inch; that is roughly the diameter of one of the little dots on the fine half-tone reproductions in the Encyclopædia. This, when you demagnify it by 25,000 times, is still 80 angstroms in diameter, 32 atoms across, in an ordinary metal. In other words, one of those dots still would contain in its area 1,000 atoms. So, each dot can easily be adjusted in size as required by the photoengraving, and there is no question that there is enough room on the head of a pin to put all of the Encyclopædia Britannica.
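
Feynman’s numbers can be checked directly, assuming a typical metal atomic spacing of about 2.5 angstroms (a value he does not state in the quote):

```python
import math

# Checking the arithmetic in the quote, assuming a typical metal atomic
# spacing of about 2.5 angstroms (a value Feynman does not state here).
INCH_M = 0.0254
ANGSTROM_M = 1e-10

dot_diameter_m = (1.0 / 120.0) * INCH_M   # the eye's resolving power, ~212 um
demagnified_m = dot_diameter_m / 25_000   # reduced in size by 25,000 times
diameter_angstrom = demagnified_m / ANGSTROM_M
atoms_across = diameter_angstrom / 2.5    # assumed 2.5 A atomic spacing
atoms_per_dot = math.pi / 4.0 * atoms_across ** 2

print(f"{diameter_angstrom:.0f} angstroms across")  # ~85, i.e. roughly 80
print(f"{atoms_across:.0f} atoms across")           # ~34, i.e. roughly 32
print(f"~{atoms_per_dot:.0f} atoms per dot")        # ~900, on the order of 1,000
```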

Feynman was intrigued by biology and pointed out that

cells are very tiny, but they are very active; they manufacture various substances; they walk around; they wiggle; and they do all kinds of marvelous things, all on a very small scale. Also, they store information. Consider the possibility that we too can make a thing very small which does what we want, that we can manufacture an object that maneuvers at that level!

He also considered using big tools to make smaller tools that could make yet smaller tools, eventually obtaining nanoscale tools for directly manipulating atoms and molecules. In considering what all this might mean, Feynman declared,

I can hardly doubt that when we have some control of the arrangement of things on a small scale we will get an enormously greater range of possible properties that substances can have, and of different things that we can do.

Perhaps the biggest barrier to following these prophetic thoughts was simply the immediate lack of tools to manipulate and visualize matter at such a small scale. The availability of tools has always been an enabling aspect of the advance of all science and technology, and some of the key tools for nanotechnology are discussed in the next section, Pioneers.

Starting with a 1981 paper in the Proceedings of the National Academy of Sciences and following with two popular books, Engines of Creation (1986) and Nanosystems (1992), American scientist K. Eric Drexler became one of the foremost advocates of nanotechnology. In fact, Drexler was the first person anywhere to receive a Ph.D. in molecular nanotechnology (from the Massachusetts Institute of Technology). In his written works he takes a molecular view of the world and envisions molecular machines doing much of the work of the future. For example, he refers to assemblers, which will manipulate individual atoms to manufacture structures, and replicators, which will be able to make multiple copies of themselves in order to save time dealing with the billions of atoms needed to make objects of useful size. In an article for Encyclopædia Britannica’s 1990 Yearbook of Science and the Future, Drexler wrote:

Cells and tissues in the human body are built and maintained by molecular machinery, but sometimes that machinery proves inadequate: viruses multiply, cancer cells spread, or systems age and deteriorate. As one might expect, new molecular machines and computers of subcellular size could support the body’s own mechanisms. Devices containing nanocomputers interfaced to molecular sensors and effectors could serve as an augmented immune system, searching out and destroying viruses and cancer cells. Similar devices programmed as repair machines could enter living cells to edit out viral DNA sequences and repair molecular damage. Such machines would bring surgical control to the molecular level, opening broad new horizons in medicine.

Drexler’s futurist visions have stimulated much thought, but the assembler approach has failed to account for the strong influence of atomic and molecular forces (i.e., the chemistry) at such dimensions. The controversy surrounding these popularizations, and the potential dangers of entities such as intelligent replicators (however remote), have stimulated debate over the ethical and societal implications of nanotechnology.

A number of key technological milestones have been achieved by working pioneers. Molecular beam epitaxy, invented by Alfred Cho and John Arthur at Bell Labs in 1968 and developed in the 1970s, enabled the controlled deposition of single atomic layers. This tool provided for nanostructuring in one dimension as atomic layers were grown one upon the next. It subsequently became important in the area of compound semiconductor device fabrication. For example, sandwiching one-nanometre-thick layers of nonmagnetic-sensor materials between magnetic layers in computer disk drives resulted in large increases in storage capacity, and a similar use of nanostructuring resulted in more energy-efficient semiconductor lasers for use in compact disc players.

In 1981 Gerd Binnig and Heinrich Rohrer developed the scanning tunneling microscope at IBM’s laboratories in Switzerland. This tool provided a revolutionary advance by enabling scientists to image the position of individual atoms on surfaces. It earned Binnig and Rohrer a Nobel Prize in 1986 and spawned a wide variety of scanning probe tools for nanoscale observations.

The observation of new carbon structures marked another important milestone in the advance of nanotechnology, with Nobel Prizes for the discoverers. In 1985 Robert F. Curl, Jr., Harold W. Kroto, and Richard E. Smalley discovered the first fullerene, the third known form of pure carbon (after diamond and graphite). They named their discovery buckminsterfullerene (buckyball) for its resemblance to the geodesic domes promoted by the American architect R. Buckminster Fuller. Technically called C60 for the 60 carbon atoms that form their hollow spherical structure, buckyballs resemble a football one nanometre in diameter (see figure). In 1991 Sumio Iijima of NEC Corporation in Japan discovered carbon nanotubes, in which the carbon ringlike structures are extended from spheres into long tubes of varying diameter. Taken together, these new structures surprised and excited the imaginations of scientists about the possibilities of forming well-defined nanostructures with unexpected new properties.

The scanning tunneling microscope not only allowed for the imaging of atoms by scanning a sharp probe tip over a surface, but it also allowed atoms to be pushed around on the surface. With a slight bias voltage applied to the probe tip, certain atoms could be made to adhere to the tip used for imaging and then to be released from it. Thus, in 1990 Donald Eigler spelled out the letters of his company’s logo, IBM, by moving 35 xenon atoms into place on a nickel surface. This demonstration caught the public’s attention because it showed the precision of the emerging nanoscale tools.

At nanoscale dimensions the properties of materials no longer depend solely on composition and structure in the usual sense. Nanomaterials display new phenomena associated with quantized effects and with the preponderance of surfaces and interfaces.

Quantized effects arise in the nanometre regime because the overall dimensions of objects are comparable to the characteristic wavelength for fundamental excitations in materials. For example, electron wave functions (see also de Broglie wave) in semiconductors are typically on the order of 10 to 100 nanometres. Such excitations include the wavelength of electrons, photons, phonons, and magnons, to name a few. These excitations carry the quanta of energy through materials and thus determine the dynamics of their propagation and transformation from one form to another. When the size of structures is comparable to the quanta themselves, it influences how these excitations move through and interact in the material. Small structures may limit flow, create wave interference effects, and otherwise bring into play quantum mechanical selection rules not apparent at larger dimensions.
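
As a rough illustration of why electron wave functions span tens of nanometres in semiconductors, the de Broglie wavelength λ = h/√(2mE) can be evaluated for a conduction electron; the GaAs-like effective mass and room-temperature thermal energy used below are illustrative assumptions, not values from the text:

```python
import math

# Order-of-magnitude estimate of the electron de Broglie wavelength,
# lambda = h / sqrt(2 * m * E). The GaAs-like effective mass (0.067 m_e)
# and thermal energy (~25 meV) are illustrative assumptions.
H = 6.626e-34    # Planck constant, J*s
M_E = 9.109e-31  # electron rest mass, kg
EV = 1.602e-19   # joules per electron-volt

m_eff = 0.067 * M_E
energy = 0.025 * EV

wavelength_m = H / math.sqrt(2.0 * m_eff * energy)
print(f"lambda ~ {wavelength_m * 1e9:.0f} nm")  # a few tens of nm, within 10-100 nm
```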

Quantum mechanical properties for confinement of electrons in one dimension have long been exploited in solid-state electronics. Semiconductor devices are grown with thin layers of differing composition so that electrons (or holes in the case of missing electron charges) can be confined in specific regions of the structure (known as quantum wells). Thin layers with larger energy bandgaps can serve as barriers that restrict the flow of charges to certain conditions under which they can tunnel through these barriers, the basis of resonant tunneling diodes. Superlattices are periodic structures of repeating wells that set up a new set of selection rules which affect the conditions for charges to flow through the structure. Superlattices have been exploited in cascade lasers to achieve far infrared wavelengths. Modern telecommunications is based on semiconductor lasers that exploit the unique properties of quantum wells to achieve specific wavelengths and high efficiency.
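
The confinement energies involved can be estimated with the textbook infinite-well formula E_n = n²h²/(8mL²); the sketch below assumes a 10 nm well and a GaAs-like effective mass of 0.067 m_e, purely for illustration:

```python
# Infinite-square-well estimate of quantum-well confinement energies,
# E_n = n^2 * h^2 / (8 * m * L^2). The 10 nm well width and GaAs-like
# effective mass (0.067 m_e) are assumed purely for illustration.
H = 6.626e-34    # Planck constant, J*s
M_E = 9.109e-31  # electron rest mass, kg
EV = 1.602e-19   # joules per electron-volt

def well_energy_ev(n: int, width_m: float, m_eff_kg: float) -> float:
    return (n ** 2) * H ** 2 / (8.0 * m_eff_kg * width_m ** 2) / EV

for n in (1, 2):
    e_mev = well_energy_ev(n, 10e-9, 0.067 * M_E) * 1000.0
    print(f"E_{n} ~ {e_mev:.0f} meV")  # roughly 56 meV and 225 meV
```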

The propagation of photons is altered dramatically when the size and periodicity of the transient structure approach the wavelength of visible light (400 to 800 nanometres). When photons propagate through a periodically varying dielectric constant (for example, semiconductor posts surrounded by air), quantum mechanical rules define and limit the propagation of the photons depending on their energy (wavelength). This new behaviour is analogous to the quantum mechanical rules that define the motion of electrons through crystals, giving bandgaps for semiconductors. In one dimension, compound semiconductor superlattices can be grown epitaxially with the alternating layers having different dielectric constants, thus providing highly reflective mirrors for specific wavelengths as determined by the repeat distance of layers in the superlattice. These structures are used to provide built-in mirrors for vertical-cavity surface-emitting lasers, which are used in communications applications. In two and three dimensions, periodic structures known as photonic crystals offer additional control over photon propagation.

Photonic crystals are being explored in a variety of materials and periodicities, such as two-dimensional hexagonal arrays of posts fabricated in compound semiconductors or stacked loglike arrays of silicon bars in three dimensions. The dimensions of these structures depend on the wavelength of light being propagated and are typically in the range of a few hundred nanometres for wavelengths in the visible and near infrared. Photonic crystal properties based on nanostructured materials offer the possibility of confining, steering, and separating light by wavelength on unprecedented small scales and of creating new devices such as lasers that require very low currents to initiate lasing (called near-thresholdless lasers). These structures are being extensively investigated as the tools for nanostructuring materials are steadily advancing. Researchers are particularly interested in the infrared wavelengths, where dimensional control is not as stringent as at the shorter visible wavelengths and where optical communications and chemical sensing provide motivation for potential new applications.

Nanoscale materials also have size-dependent magnetic behaviour, mechanical properties, and chemical reactivity. At very small sizes (a few nanometres), magnetic nanoclusters have a single magnetic domain, and the strongly coupled magnetic spins on each atom combine to produce a particle with a single giant spin. For example, the giant spin of a ferromagnetic iron particle rotates freely at room temperature for diameters below about 16 nanometres, an effect termed superparamagnetism. Mechanical properties of nanostructured materials can reach exceptional strengths. As a specific example, the introduction of two-nanometre aluminum oxide precipitates into thin films of pure nickel results in yield strengths increasing from 0.15 to 5 gigapascals, which is more than twice that for a hard bearing steel. Another example of exceptional mechanical properties at the nanoscale is the carbon nanotube, which exhibits great strength and stiffness along its longitudinal axis.

The preponderance of surfaces is a major reason for the change in behaviour of materials at the nanoscale. Since up to half of all the atoms in nanoparticles are surface atoms, properties such as electrical transport are no longer determined by solid-state bulk phenomena. Likewise, the atoms in nanostructures have a higher average energy than atoms in larger structures, because of the large proportion of surface atoms. For example, catalytic materials have a greater chemical activity per atom of exposed surface as the catalyst is reduced in size at the nanoscale. Defects and impurities may be attracted to surfaces and interfaces, and interactions between particles at these small dimensions can depend on the structure and nature of chemical bonding at the surface. Molecular monolayers may be used to change or control surface properties and to mediate the interaction between nanoparticles.
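
The claim that up to half the atoms in a nanoparticle sit at the surface can be checked with a crude shell model: treat the particle as a sphere and count the atoms lying within one atomic diameter (assumed here to be about 0.25 nm) of its surface:

```python
# Crude shell-model check of the surface-atom fraction: treat the particle as
# a sphere and count atoms within one atomic diameter (assumed ~0.25 nm) of
# the surface. This is a geometric estimate, not a lattice-by-lattice count.
ATOM_DIAMETER_NM = 0.25

def surface_fraction(particle_diameter_nm: float) -> float:
    core = max(particle_diameter_nm - 2.0 * ATOM_DIAMETER_NM, 0.0)
    return 1.0 - (core / particle_diameter_nm) ** 3

for d in (100.0, 10.0, 3.0):
    print(f"d = {d:5.1f} nm  ->  ~{surface_fraction(d) * 100:.0f}% surface atoms")
```

For a 100 nm particle only a percent or two of the atoms lie at the surface, but for a 3 nm particle the estimate approaches one half, consistent with the statement above.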

Surfaces and their interactions with molecular structures are basic to all biology. The intersection of nanotechnology and biotechnology offers the possibility of achieving new functions and properties with nanostructured surfaces. In this surface- and interface-dominated regime, biology does an exquisite job of selectively controlling functions through a combination of structure and chemical forces. The transcription of information stored in genes and the selectivity of biochemical reactions based on chemical recognition of complex molecules are examples where interfaces play the key role in establishing nanoscale behaviour. Atomic forces and chemical bonds dominate at these dimensions, while macroscopic effects, such as convection, turbulence, and momentum (inertial forces), are of little consequence.

As discussed in the section Properties at the nanoscale, material properties (electrical, optical, magnetic, mechanical, and chemical) depend on their exact dimensions. This opens the way for development of new and improved materials through manipulation of their nanostructure. Hierarchical assemblies of nanoscale-engineered materials into larger structures, or their incorporation into devices, provide the basis for tailoring radically new materials and machines.

Nature’s assemblies point the way to improving structural materials. The often-cited abalone seashell provides a beautiful example of how the combination of a hard, brittle inorganic material with nanoscale structuring and a soft, tough organic material can produce a strong, durable nanocomposite; basically, these nanocomposites are made of calcium carbonate bricks held together by a glycoprotein glue. New engineered materials are emerging, such as polymer-clay nanocomposites, that are not only strong and tough but also lightweight and easier to recycle than conventional reinforced plastics. Such improvements in structural materials are particularly important for the transportation industry, where reduced weight directly translates into improved fuel economy. Other improvements can increase safety or decrease the impact on the environment of fabrication and recycling. Further advances, such as truly smart materials that signal their impending failure or are even able to self-repair flaws, may be possible with composites of the future.

Sensors are central to almost all modern control systems. For example, multiple sensors are used in automobiles for such diverse tasks as engine management, emission control, security, safety, comfort, vehicle monitoring, and diagnostics. While such traditional applications for physical sensing generally rely on microscale sensing devices, the advent of nanoscale materials and structures has led to new electronic, photonic, and magnetic nanosensors, sometimes known as smart dust. Because of their small size, nanosensors exhibit unprecedented speed and sensitivity, extending in some cases down to the detection of single molecules. For example, nanowires made of carbon nanotubes, silicon, or other semiconductor materials exhibit exceptional sensitivity to chemical species or biological agents. Electrical current through nanowires can be altered by having molecules attached to their surface that locally perturb their electronic band structure. By means of nanowire surfaces coated with sensor molecules that selectively attach particular species, charge-induced changes in current can be used to detect the presence of those species. This same strategy is adopted for many classes of sensing systems. New types of sensors with ultrahigh sensitivity and specificity will have many applications; for example, sensors that can detect cancerous tumours when they consist of only a few cells would be a very significant advance.
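
As a toy illustration of the sensing strategy just described, the sketch below models the fraction of occupied receptor molecules on a nanowire with a simple Langmuir adsorption isotherm and assumes the relative change in current is proportional to that coverage; the dissociation constant and the 5 percent full-coverage signal are invented illustrative parameters, not values for any real device.

# Toy model of a functionalized-nanowire chemical or biological sensor.
# Assumption 1: the fraction of occupied surface receptors follows a Langmuir
#   isotherm, theta = c / (c + Kd), with a hypothetical dissociation constant Kd.
# Assumption 2: the relative current change is proportional to theta, with a
#   hypothetical full-coverage change of 5 percent.
Kd = 1e-9          # hypothetical dissociation constant, mol/L (1 nM)
max_change = 0.05  # hypothetical relative current change at full coverage

def relative_current_change(concentration):
    theta = concentration / (concentration + Kd)   # fractional receptor occupancy
    return max_change * theta

for c in (1e-12, 1e-10, 1e-9, 1e-8, 1e-6):
    print(f"analyte {c:.0e} M -> current change {relative_current_change(c) * 100:.3f}%")
# The response rises steeply up to concentrations around Kd and then saturates,
# so the choice of receptor chemistry sets the useful detection range.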

Nanomaterials also make excellent filters for trapping heavy metals and other pollutants from industrial wastewater. One of the greatest potential impacts of nanotechnology on the lives of the majority of people on Earth will be in the area of economical water desalination and purification. Nanomaterials will very likely find important use in fuel cells, bioconversion for energy, bioprocessing of food products, waste remediation, and pollution-control systems.

A recent concern regarding nanoparticles is whether their small sizes and novel properties may pose significant health or environmental risks. In general, ultrafine particles, such as the carbon in photocopier toners or in soot produced by combustion engines and factories, have adverse respiratory and cardiovascular effects on people and animals. Studies are under way to determine if specific nanoscale particles pose higher risks that may require special regulatory restrictions. Of particular concern are potential carcinogenic risks from inhaled particles and the possibility for very small nanoparticles to cross the blood-brain barrier to unknown effect. Nanomaterials currently receiving attention from health officials include carbon nanotubes, buckyballs, and cadmium selenide quantum dots. Studies of the absorption through the skin of titanium oxide nanoparticles (used in sunscreens) are also planned. More far-ranging studies of the toxicity, transport, and overall fate of nanoparticles in ecosystems and the environment have not yet been undertaken. Some early animal studies, involving the introduction of very high levels of nanoparticles which resulted in the rapid death of many of the subjects, are quite controversial.

Nanotechnology promises to impact medical treatment in multiple ways. First, advances in nanoscale particle design and fabrication provide new options for drug delivery and drug therapies. More than half of the new drugs developed each year are not water-soluble, which makes their delivery difficult. In the form of nanosized particles, however, these drugs are more readily transported to their destination, and they can be delivered in the conventional form of pills.

More important, nanotechnology may enable drugs to be delivered to precisely the right location in the body and to release drug doses on a predetermined schedule for optimal treatment. The general approach is to attach the drug to a nanosized carrier that will release the medicine in the body over an extended period of time or when specifically triggered to do so. In addition, the surfaces of these nanoscale carriers may be treated to seek out and become localized at a disease site, for example, attaching to cancerous tumours. One type of molecule of special interest for these applications is an organic dendrimer. A dendrimer is a special class of polymeric molecule that weaves in and out from a hollow central region. These spherical fuzz balls are about the size of a typical protein but cannot unfold like proteins. Interest in dendrimers derives from the ability to tailor their cavity sizes and chemical properties to hold different therapeutic agents. Researchers hope to design different dendrimers that can swell and release their drug on exposure to specifically recognized molecules that indicate a disease target. This same general approach to nanoparticle-directed drug delivery is being explored for other types of nanoparticles as well.

Another approach involves gold-coated nanoshells whose size can be adjusted to absorb light energy at different wavelengths. In particular, infrared light will pass through several centimetres of body tissue, allowing a delicate and precise heating of such capsules in order to release the therapeutic substance within. Furthermore, antibodies may be attached to the outer gold surface of the shells to cause them to bind specifically to certain tumour cells, thereby reducing the damage to surrounding healthy cells.

A second area of intense study in nanomedicine is that of developing new diagnostic tools. Motivation for this work ranges from fundamental biomedical research at the level of single genes or cells to point-of-care applications for health delivery services. With advances in molecular biology, much diagnostic work now focuses on detecting specific biological signatures. These analyses are referred to as bioassays. Examples include studies to determine which genes are active in response to a particular disease or drug therapy. A general approach involves attaching fluorescing dye molecules to the target biomolecules in order to reveal their concentration.

Another approach to bioassays uses semiconductor nanoparticles, such as cadmium selenide, which emit light of a specific wavelength depending on their size. Different-size particles can be attached to different receptors so that a wider variety of distinct colour tags are available than can be distinguished for dye molecules. The degradation in fluorescence that dyes undergo with repeated excitation is also avoided. Furthermore, various-size particles can be encapsulated in latex beads and their resulting wavelengths read like a bar code. This approach, while still in the exploratory stage, would allow for an enormous number of distinct labels for bioassays.
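
The size dependence of the emission colour can be estimated with the widely used Brus (effective-mass) approximation for a semiconductor quantum dot; the cadmium selenide band gap, effective masses, and dielectric constant below are typical literature values and the model neglects finer effects, so the wavelengths indicate the trend rather than exact dot colours.

import math

# Brus effective-mass estimate of the emission energy of a CdSe quantum dot:
# E(R) = Eg + (hbar^2 * pi^2 / (2 R^2)) * (1/me + 1/mh) - 1.786 e^2 / (4 pi eps0 eps R)
hbar = 1.054571817e-34   # J*s
e = 1.602176634e-19      # C
eps0 = 8.8541878128e-12  # F/m
m0 = 9.1093837015e-31    # electron mass, kg

Eg = 1.74 * e                    # bulk CdSe band gap (assumed literature value), J
me, mh = 0.13 * m0, 0.45 * m0    # assumed electron and hole effective masses for CdSe
eps = 10.6                       # assumed dielectric constant of CdSe

def emission_wavelength_nm(radius_nm):
    R = radius_nm * 1e-9
    confinement = (hbar**2 * math.pi**2 / (2 * R**2)) * (1 / me + 1 / mh)
    coulomb = 1.786 * e**2 / (4 * math.pi * eps0 * eps * R)
    E = Eg + confinement - coulomb                   # emission energy, J
    return 1e9 * (6.62607015e-34 * 2.998e8) / E      # lambda = h*c / E, in nm

for r in (2.0, 2.5, 3.0, 4.0):
    print(f"radius {r} nm -> ~{emission_wavelength_nm(r):.0f} nm emission")
# Smaller dots emit bluer light and larger dots redder light, spanning much of
# the visible spectrum, which is the basis of the multicolour tagging described above.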

Another nanotechnology variation on bioassays is to attach one half of the single-stranded complementary DNA segment for the genetic sequence to be detected to one set of gold particles and the other half to a second set of gold particles. When the material of interest is present in a solution, the two attachments cause the gold balls to agglomerate, providing a large change in optical properties that can be seen in the colour of the solution. If both halves of the sequence do not match, no agglomeration will occur and no change will be observed.

Approaches that do not involve optical detection techniques are also being explored with nanoparticles. For example, magnetic nanoparticles can be attached to antibodies that in turn recognize and attach to specific biomolecules. The magnetic particles then act as tags and handlebars through which magnetic fields can be used for mixing, extracting, or identifying the attached biomolecules within microlitre- or nanolitre-sized samples. For example, magnetic nanoparticles stay magnetized as a single domain for a significant period, which enables them to be aligned and detected in a magnetic field. In particular, attached antibody-magnetic-nanoparticle combinations rotate slowly and give a distinctive magnetic signal. In contrast, magnetically tagged antibodies that are not attached to the biological material being detected rotate more rapidly and so do not give the same distinctive signal.

Microfluidic systems, or labs-on-chips, have been developed for biochemical assays of minuscule samples. Typically cramming numerous electronic and mechanical components into a portable unit no larger than a credit card, they are especially useful for conducting rapid analysis in the field. While these microfluidic systems primarily operate at the microscale (that is, millionths of a metre), nanotechnology has contributed new concepts and will likely play an increasing role in the future. For example, separation of DNA is sensitive to entropic effects, such as the entropy required to unfold DNA of a given length. A new approach to separating DNA could take advantage of its passage through a nanoscale array of posts or channels such that DNA molecules of different lengths would uncoil at different rates.

Other researchers have focused on detecting signal changes as nanometre-wide DNA strands are threaded through a nanoscale pore. Early studies used pores punched in membranes by viruses; artificially fabricated nanopores are also being tested. By applying an electric potential across the membrane in a liquid cell to pull the DNA through, changes in ion current can be measured as different repeating base units of the molecule pass through the pores. Nanotechnology-enabled advances in the entire area of bioassays will clearly impact health care in many ways, from early detection, rapid clinical analysis, and home monitoring to new understanding of molecular biology and genetic-based treatments for fighting disease.
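
The readout principle, in which different bases block the ion current by different amounts as the strand threads the pore, can be pictured with a toy simulation; the open-pore current, per-base blockade fractions, and noise level below are invented for illustration only, and real traces are far noisier, which is why statistical base-calling is required.

import random

# Toy simulation of an ionic-current trace as a DNA strand passes a nanopore.
# Hypothetical numbers: 100 pA open-pore current, each base type blocking the
# current by a different fraction, plus Gaussian noise.
random.seed(0)
open_pore_pA = 100.0
blockade = {"A": 0.55, "C": 0.60, "G": 0.48, "T": 0.65}   # invented blockade fractions

def simulate_trace(sequence, samples_per_base=5, noise_pA=2.0):
    trace = []
    for base in sequence:
        level = open_pore_pA * (1.0 - blockade[base])     # blocked current level
        trace.extend(random.gauss(level, noise_pA) for _ in range(samples_per_base))
    return trace

for base, current in zip("GATTACA", simulate_trace("GATTACA", samples_per_base=1)):
    print(f"base {base}: ~{current:.1f} pA")
# Distinguishable current levels per base are what would make direct
# electrical sequencing possible.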

Another biomedical application of nanotechnology involves assistive devices for people who have lost or lack certain natural capabilities. For example, researchers hope to design retinal implants for vision-impaired individuals. The concept is to implant chips with photodetector arrays to transmit signals from the retina to the brain via the optic nerve. Meaningful spatial information, even if only at a rudimentary level, would be of great assistance to the blind. Such research illustrates the tremendous challenge of designing hybrid systems that work at the interface between inorganic devices and biological systems.

Closely related research involves implanting nanoscale neural probes in brain tissue to activate and control motor functions. This requires effective and stable wiring of many electrodes to neurons. It is exciting because of the possibility of recovery of control for motor-impaired individuals. Studies employing neural stimulation of damaged spinal cords by electrical signals have demonstrated the return of some locomotion. Researchers are also seeking ways to assist in the regeneration and healing of bone, skin, and cartilage, for example, developing synthetic biocompatible or biodegradable structures with nanosized voids that would serve as templates for regenerating specific tissue while delivering chemicals to assist in the repair process. At a more sophisticated level, researchers hope to someday build nanoscale or microscale machines that can repair, assist, or replace more-complex organs.

Semiconductor experts agree that the ongoing shrinkage in conventional electronic devices will inevitably reach fundamental limits due to quantum effects such as tunneling, in which electrons jump out of their prescribed circuit path and create atomic-scale interference between devices. At that point, radical new approaches to data storage and information processing will be required for further advances. For example, radically new systems have been imagined that are based on quantum computing or biomolecular computing.

The use of molecules for electronic devices was suggested by Mark Ratner of Northwestern University and Avi Aviram of IBM as early as the 1970s, but proper nanotechnology tools did not become available until the turn of the 21st century. Wiring up molecules some half a nanometre wide and a few nanometres long remains a major challenge, and an understanding of electrical transport through single molecules is only beginning to emerge. A number of groups have been able to demonstrate molecular switches, for example, that could conceivably be used in computer memory or logic arrays. Current areas of research include mechanisms to guide the selection of molecules, architectures for assembling molecules into nanoscale gates, and three-terminal molecules for transistor-like behaviour. More-radical approaches include DNA computing, where single-stranded DNA on a silicon chip would encode all possible variable values and complementary strand interactions would be used for a parallel processing approach to finding solutions. An area related to molecular electronics is that of organic thin-film transistors and light emitters, which promise new applications such as video displays that can be rolled out like wallpaper and flexible electronic newspapers.

Carbon nanotubes have remarkable electronic, mechanical, and chemical properties. Depending on their specific diameter and the bonding arrangement of their carbon atoms, nanotubes exhibit either metallic or semiconducting behaviour. Electrical conduction within a perfect nanotube is ballistic (negligible scattering), with low thermal dissipation. As a result, a wire made from a nanotube, or a nanowire, can carry much more current than an ordinary metal wire of comparable size. At 1.4 nanometres in diameter, nanotubes are about a hundred times smaller than the gate width of silicon semiconductor devices. In addition to nanowires for conduction, transistors, diodes, and simple logic circuits have been demonstrated by combining metallic and semiconductor carbon nanotubes. Similarly, silicon nanowires have been used to build experimental devices, such as field-effect transistors, bipolar transistors, inverters, light-emitting diodes, sensors, and even simple memory. A major challenge for nanowire circuits, as for molecular electronics, is connecting and integrating these devices into a workable high-density architecture. Ideally, the structure would be grown and assembled in place. Crossbar architectures that combine the function of wires and devices are of particular interest.
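
The claim that a ballistic nanotube carries unusually large currents follows from the Landauer picture, in which each perfectly transmitting channel contributes one quantum of conductance; the sketch below evaluates the ideal spin-degenerate two-channel case for a metallic nanotube, an upper bound that real contacts and defects only degrade.

# Ideal ballistic conductance of a metallic carbon nanotube (Landauer picture).
# A metallic nanotube has two conducting channels; with spin degeneracy the
# ideal conductance is G = 4 e^2 / h, independent of tube length.
e = 1.602176634e-19   # C
h = 6.62607015e-34    # J*s

G = 4 * e**2 / h      # siemens
R = 1 / G             # ohms
print(f"Ideal conductance: {G * 1e6:.0f} microsiemens")
print(f"Ideal resistance:  {R / 1e3:.2f} kilohms")
# About 155 microsiemens (roughly 6.5 kilohms) regardless of length; contact
# resistance and scattering in real devices only add to this lower bound.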

At nanoscale dimensions the energy required to add one additional electron to a small island (isolated physical region), for example through a tunneling barrier, becomes significant. This change in energy provides the basis for devising single-electron transistors. At low temperatures, where thermal fluctuations are small, various single-electron-device nanostructures are readily achievable, and extensive research has been carried out for structures with confined electron flow. However, room-temperature applications will require that sizes be reduced significantly, to the one-nanometre range, to achieve stable operation. For large-scale application with millions of devices, as found in current integrated circuits, the need for structures with very uniform size to maintain uniform device characteristics presents a significant challenge. Also, in this and many new nanodevices being explored, the lack of gain is a serious drawback limiting implementation in large-scale electronic circuits.
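
The size requirement for room-temperature single-electron devices can be made concrete by comparing the charging energy of an island with thermal energy; the sketch below models the island as an isolated sphere in vacuum, whose self-capacitance is 4*pi*eps0*r, which is a deliberate simplification since real islands sit in dielectrics and are coupled to electrodes.

import math

# Charging energy of a spherical conducting island versus thermal energy.
# Model: self-capacitance of a sphere in vacuum, C = 4*pi*eps0*r, and charging
# energy Ec = e^2 / (2C). Stable single-electron operation roughly requires
# Ec to be much larger than kB*T.
e = 1.602176634e-19
eps0 = 8.8541878128e-12
kB = 1.380649e-23
kT_300K_meV = kB * 300 / e * 1000     # about 26 meV at room temperature

for r_nm in (1, 5, 20, 100):
    C = 4 * math.pi * eps0 * r_nm * 1e-9       # farads
    Ec_meV = e**2 / (2 * C) / e * 1000         # charging energy in meV
    print(f"r = {r_nm:>3} nm: Ec ~ {Ec_meV:7.1f} meV  (kT at 300 K ~ {kT_300K_meV:.0f} meV)")
# Only islands of a few nanometres or less give charging energies well above
# kT at 300 K, which is why room-temperature operation demands ~1 nm structures.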

Spintronics refers to electronic devices that perform logic operations based on not just the electrical charge of carriers but also their spin. For example, information could be transported or stored through the spin-up or spin-down states of electrons. This is a new area of research, and issues include the injection of spin-polarized carriers, their transport, and their detection. The role of nanoscale structure and electronic properties of the ferromagnetic-semiconductor interface on the spin injection process, the growth of new ferromagnetic semiconductors with nanoscale control, and the possible use of nanostructured features to manipulate spin are all of interest.

Current approaches to information storage and retrieval include high-density, high-speed, solid-state electronic memories, as well as slower (but generally more spacious) magnetic and optical discs (see computer memory). As the minimum feature size for electronic processing approaches 100 nanometres, nanotechnology provides ways to decrease further the bit size of the stored information, thus increasing density and reducing interconnection distances for obtaining still-higher speeds. For example, the basis of the current generation of magnetic disks is the giant magnetoresistance effect. A magnetic read/write head stores bits of information by setting the direction of the magnetic field in nanometre-thick metallic layers that alternate between ferromagnetic and nonferromagnetic. Differences in spin-dependent scattering of electrons at the interface layers lead to resistance differences that can be read by the magnetic head. Mechanical properties, particularly tribology (friction and wear of moving surfaces), also play an important role in magnetic hard disk drives, since magnetic heads float only about 10 nanometres above spinning magnetic disks.

Another approach to information storage that is dependent on designing nanometre-thick magnetic layers is under commercial development. Known as magnetic random access memory (MRAM), a line of electrically switchable magnetic material is separated from a permanently magnetized layer by a nanoscale nonmagnetic interlayer. A resistance change that depends on the relative alignment of the fields is read electrically from a large array of wires through cross lines. MRAM will require a relatively small evolution from conventional semiconductor manufacturing, and it has the added benefit of producing nonvolatile memory (no power or batteries are needed to maintain stored memory states).

Still at an exploratory stage, studies of electrical conduction through molecules have generated interest in their possible use as memory. While still very speculative, molecular and nanowire approaches to memory are intriguing because of the small volume in which the bits of memory are stored and the effectiveness with which biological systems store large amounts of information.

Nanoscale structuring of optical devices, such as vertical-cavity surface-emitting lasers (VCSELs), quantum dot lasers, and photonic crystal materials, is leading to additional advances in communications technology.

VCSELs have nanoscale layers of compound semiconductors epitaxially grown into their structure: alternating dielectric layers as mirrors and quantum wells. Quantum wells allow the charge carriers to be confined in well-defined regions and provide the energy conversion into light at desired wavelengths. They are placed in the laser's cavity to confine carriers at the nodes of a standing wave and to tailor the band structure for more efficient radiative recombination. One-dimensional nanotechnology techniques involving precise growth of very thin epitaxial semiconductor layers were developed during the 1990s. Such nanostructuring has enhanced the efficiency of VCSELs and reduced the current required for lasing to start (called the threshold current). Because of improving performance and their compatibility with planar manufacturing technology, VCSELs are fast becoming a preferred laser source in a variety of communications applications.
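
The way quantum wells set the emission wavelength can be illustrated with the textbook particle-in-a-box estimate of confinement energies; the infinite-barrier assumption and the GaAs-like effective mass used below are simplifications, so the numbers show the scaling with well width rather than an actual VCSEL design.

import math

# Infinite-square-well estimate of electron confinement energies in a quantum
# well: En = n^2 * pi^2 * hbar^2 / (2 m* L^2). Assumed GaAs-like effective
# mass m* = 0.067 m0; real wells have finite barriers and somewhat lower energies.
hbar = 1.054571817e-34
m0 = 9.1093837015e-31
e = 1.602176634e-19
m_eff = 0.067 * m0

def E_n_meV(n, L_nm):
    L = L_nm * 1e-9
    return (n**2 * math.pi**2 * hbar**2 / (2 * m_eff * L**2)) / e * 1000

for L in (5, 10, 20):
    print(f"L = {L:>2} nm well: E1 ~ {E_n_meV(1, L):6.1f} meV, E2 ~ {E_n_meV(2, L):6.1f} meV")
# Halving the well width quadruples the confinement energy, which is how
# nanometre-scale layer thicknesses tune the emission wavelength.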

More recently, the introduction of quantum dots (regions so small that they can be given a single electric charge) into semiconductor lasers has been investigated and found to give additional benefits: both further reductions in threshold current and narrower line widths. Quantum dots further confine the optical emission modes within a very narrow spectrum and give the lowest threshold current densities for lasing achieved to date in VCSELs. The quantum dots are introduced into the laser during the growth of strained layers, by a process called Stranski-Krastanov growth. They arise because of the lattice mismatch stress and surface tension of the growing film. Improvements in ways to control precisely the resulting quantum dots to a more uniform single size are still being sought.

Photonic crystals provide a new means to control the steering and manipulation of photons based on periodic dielectric lattices with repeat dimensions on the order of the wavelength of light. These materials can have very exotic properties, such as not allowing light within certain wavelengths to be propagated in a material based on the particular periodic structure. Photonic lattices can act as perfect wavelength-selective mirrors to reflect back incident light from all orientations. They provide the basis for optical switching, steering, and wavelength separation on unprecedented small scales. The periodic structures required for these artificial crystals can be configured as both two- and three-dimensional lattices. Optical sources, switches, and routers are being considered, with two-dimensional planar geometries receiving the most attention, because of their greater ease of fabrication.

Another potentially important communications application for nanotechnology is microelectromechanical systems (MEMS), devices sized at the micrometre level (millionths of a metre). MEMS are currently poised to have a major impact on communications via optical switching. In the future, electromechanical devices may shrink to nanodimensions to take advantage of the higher frequencies of mechanical vibration at smaller masses. The natural (resonant) frequency of vibration for small mechanical beams increases as their size decreases, so that little power is needed to drive them as oscillators. Their efficiency is rated by a quality factor, known as Q, which is a ratio of the energy stored per cycle versus the energy dissipated per cycle. The higher the Q, the more precise the absolute frequency of an oscillator. The Q is very high for micro- and nanoscale mechanical oscillators, and these devices can reach very high frequencies (up to microwave frequencies), making them potential low-power replacements for electronic-based oscillators and filters.
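
The scaling argument that smaller beams vibrate faster can be checked with the standard formula for the fundamental flexural mode of a clamped rectangular cantilever; the silicon modulus and density below are nominal bulk values, and the surface losses discussed in the next paragraph are ignored.

import math

# Fundamental flexural frequency of a rectangular silicon cantilever:
# f1 = (1.875^2 / (2*pi)) * (t / L^2) * sqrt(E / (12 * rho))
# Assumed bulk values for silicon: E ~ 169 GPa, rho ~ 2330 kg/m^3.
E = 169e9
rho = 2330.0

def f1_hz(thickness_nm, length_nm):
    t = thickness_nm * 1e-9
    L = length_nm * 1e-9
    return (1.875**2 / (2 * math.pi)) * (t / L**2) * math.sqrt(E / (12 * rho))

for t, L in ((1000, 100_000), (100, 10_000), (100, 1_000), (10, 100)):
    print(f"t = {t:>5} nm, L = {L:>7} nm -> f1 ~ {f1_hz(t, L) / 1e6:10.1f} MHz")
# Proportionally smaller beams resonate at proportionally higher frequencies,
# carrying nanoscale oscillators from the kilohertz-megahertz range up toward
# gigahertz (microwave) frequencies, as the text describes.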

Mechanical oscillators have been made from silicon at dimensions of 10 to 100 nanometres, where more than 10 percent of the atoms are less than one atomic distance from the surface. While highly homogeneous materials can be made at these dimensions (for example, single-crystal silicon bars), surfaces play an increasing role at nanoscales, and energy losses increase, presumably because of surface defects and molecular species adsorbed on surfaces.

It is possible to envision even higher frequencies, in what might be viewed as the ultimate in nanomechanical systems, by moving from nanomachined structures to molecular systems. As an example, multiwalled carbon nanotubes are being explored for their mechanical properties. When the ends of the outer nanotube are removed, the inner tube may be pulled partway out from the outer tube where van der Waals forces between the two tubes will supply a restoring force. The inner tube can thus oscillate, sliding back and forth inside the outer tube. The resonant frequency of oscillation for such structures is predicted to be above one gigahertz (one billion cycles per second). It is unknown whether connecting such systems to the macro world and protecting them from surface effects will ever be practical.

Here is the original post:

Nanotechnology | Britannica.com

What is nanotechnology?

A short introduction to nanotechnology, and why you should care about it.

The video dives into materials science and advanced materials, and looks at how designing and engineering substances from the atoms they’re made of upward allows novel properties to be developed and used. It also looks at responsible innovation when it comes to grappling with the benefits as well as the health and environmental risks of nanoparticles and nanomaterials.

Stand-alone copies are available on request from Andrew Maynard at Andrew.maynard@asu.edu

USEFUL LINKS

NOVA nanotechnology resources: http://www.pbs.org/wgbh/nova/search/r…

Nanotechnology 101 from the US Government: http://www.nano.gov/nanotech-101

K-12 nanotechnology lesson plans, from NISE Net: http://nisenet.org/search/product_cat…

Nano & Me: Nanotechnology in our lives: http://www.nanoandme.org/home/

24 questions and answers on nanotechnology safety: http://2020science.org/2010/02/12/24-…

Nanotechnology basics from nanotechnology for Dummies: http://www.dummies.com/how-to/educati…

Nanotech rewards (video from Discovery): https://www.youtube.com/watch?v=yYXWH…

Nanotech risks (video from Discovery): https://www.youtube.com/watch?v=qc0KL…

ACKNOWLEDGEMENTS

This video was developed as part of the NSF-funded Nanosystems Engineering Research Center for Nanotechnology-Enabled Water Treatment (NEWT), under NSF Award Number EEC-1449500. It was produced in collaboration with Claire Cook.

RISK BITES

Risk Bites videos are devised, created and produced by Andrew Maynard, in association with the Arizona State University School for the Future of Innovation in Society (http://sfis.asu.edu). They are produced under a Creative Commons License CC-BY-SA

Backing track: Mandolin Highway by Olive Musique. http://www.premiumbeat.com/royalty_fr…

Risk Bites is your guide to making sense of risk. We cover everything from understanding and balancing the risks and benefits of everyday products, to health science more broadly, to the potential impacts of emerging technologies, to making sense of risk perception.

Read more here:

What is nanotechnology?

History of nanotechnology – Wikipedia

The history of nanotechnology traces the development of the concepts and experimental work falling under the broad category of nanotechnology. Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time. The emergence of nanotechnology in the 1980s was caused by the convergence of experimental advances such as the invention of the scanning tunneling microscope in 1981 and the discovery of fullerenes in 1985, with the elucidation and popularization of a conceptual framework for the goals of nanotechnology beginning with the 1986 publication of the book Engines of Creation. The field was subject to growing public awareness and controversy in the early 2000s, with prominent debates about both its potential implications as well as the feasibility of the applications envisioned by advocates of molecular nanotechnology, and with governments moving to promote and fund research into nanotechnology. The early 2000s also saw the beginnings of commercial applications of nanotechnology, although these were limited to bulk applications of nanomaterials rather than the transformative applications envisioned by the field.

The earliest evidence of the use and applications of nanotechnology can be traced back to carbon nanotubes and cementite nanowires found in the microstructure of wootz steel manufactured in ancient India from around 600 BC and exported globally.[1]

Although nanoparticles are associated with modern science, they were used by artisans as far back as the ninth century in Mesopotamia for creating a glittering effect on the surface of pots.[2][3]

In modern times, pottery from the Middle Ages and Renaissance often retains a distinct gold- or copper-colored metallic glitter. This luster is caused by a metallic film that was applied to the transparent surface of a glazing, which contains silver and copper nanoparticles dispersed homogeneously in the glassy matrix of the ceramic glaze. These nanoparticles are created by the artisans by adding copper and silver salts and oxides together with vinegar, ochre, and clay on the surface of previously-glazed pottery. The technique originated in the Muslim world. As Muslims were not allowed to use gold in artistic representations, they sought a way to create a similar effect without using real gold. The solution they found was using luster.[3][4]

The American physicist Richard Feynman delivered the lecture “There’s Plenty of Room at the Bottom” at an American Physical Society meeting at Caltech on December 29, 1959; the talk is often held to have provided inspiration for the field of nanotechnology. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and Van der Waals attraction would become more important.[5]

After Feynman’s death, scholars studying the historical development of nanotechnology have concluded that his actual role in catalyzing nanotechnology research was limited, based on recollections from many of the people active in the nascent field in the 1980s and 1990s. Chris Toumey, a cultural anthropologist at the University of South Carolina, found that the published versions of Feynman’s talk had a negligible influence in the twenty years after it was first published, as measured by citations in the scientific literature, and not much more influence in the decade after the Scanning Tunneling Microscope was invented in 1981. Subsequently, interest in “Plenty of Room” in the scientific literature greatly increased in the early 1990s. This is probably because the term “nanotechnology” gained serious attention just before that time, following its use by K. Eric Drexler in his 1986 book, Engines of Creation: The Coming Era of Nanotechnology, which took the Feynman concept of a billion tiny factories and added the idea that they could make more copies of themselves via computer control instead of control by a human operator; and in a cover article headlined “Nanotechnology”,[6][7] published later that year in a mass-circulation science-oriented magazine, OMNI. Toumey’s analysis also includes comments from distinguished scientists in nanotechnology who say that “Plenty of Room” did not influence their early work, and in fact most of them had not read it until a later date.[8][9]

These and other developments hint that the retroactive rediscovery of Feynman’s “Plenty of Room” gave nanotechnology a packaged history that provided an early date of December 1959, plus a connection to the charisma and genius of Richard Feynman. Feynman’s stature as a Nobel laureate and as an iconic figure in 20th century science surely helped advocates of nanotechnology and provided a valuable intellectual link to the past.[10]

The Japanese scientist Norio Taniguchi of the Tokyo University of Science was the first to use the term “nano-technology”, in a 1974 conference,[11] to describe semiconductor processes such as thin film deposition and ion beam milling exhibiting characteristic control on the order of a nanometer. His definition was, “‘Nano-technology’ mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or one molecule.” However, the term was not used again until 1981, when Eric Drexler, who was unaware of Taniguchi’s prior use of the term, published his first paper on nanotechnology.[12][13][14]

In the 1980s the idea of nanotechnology as a deterministic, rather than stochastic, handling of individual atoms and molecules was conceptually explored in depth by K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and two influential books.

In 1980, Drexler encountered Feynman’s provocative 1959 talk “There’s Plenty of Room at the Bottom” while preparing his initial scientific paper on the subject, Molecular Engineering: An approach to the development of general capabilities for molecular manipulation, published in the Proceedings of the National Academy of Sciences in 1981.[15] The term “nanotechnology” (which paralleled Taniguchi’s “nano-technology”) was independently applied by Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity. He also first published the term “grey goo” to describe what might happen if a hypothetical self-replicating machine, capable of independent operation, were constructed and released. Drexler’s vision of nanotechnology is often called “Molecular Nanotechnology” (MNT) or “molecular manufacturing.”

His 1991 Ph.D. work at the MIT Media Lab was the first doctoral degree on the topic of molecular nanotechnology and (after some editing) his thesis, “Molecular Machinery and Manufacturing with Applications to Computation,”[16] was published as Nanosystems: Molecular Machinery, Manufacturing, and Computation,[17] which received the Association of American Publishers award for Best Computer Science Book of 1992. Drexler co-founded the Foresight Institute in 1986 with the mission of “Preparing for nanotechnology.” Drexler is no longer a member of the Foresight Institute.

Nanotechnology and nanoscience got a boost in the early 1980s with two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1985 and the structural assignment of carbon nanotubes a few years later.

The scanning tunneling microscope, an instrument for imaging surfaces at the atomic level, was developed in 1981 by Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, for which they were awarded the Nobel Prize in Physics in 1986.[18][19] Binnig, Calvin Quate and Christoph Gerber invented the first atomic force microscope in 1986. The first commercially available atomic force microscope was introduced in 1989.

IBM researcher Don Eigler was the first to manipulate atoms using a scanning tunneling microscope in 1989. He used 35 xenon atoms to spell out the IBM logo.[20] He shared the 2010 Kavli Prize in Nanoscience for this work.[21]

Interface and colloid science had existed for nearly a century before they became associated with nanotechnology.[22][23] The first observations and size measurements of nanoparticles had been made during the first decade of the 20th century by Richard Adolf Zsigmondy, winner of the 1925 Nobel Prize in Chemistry, who made a detailed study of gold sols and other nanomaterials with sizes down to 10 nm using an ultramicroscope which was capable of visualizing particles much smaller than the light wavelength.[24] Zsigmondy was also the first to use the term “nanometer” explicitly for characterizing particle size. In the 1920s, Irving Langmuir, winner of the 1932 Nobel Prize in Chemistry, and Katharine B. Blodgett introduced the concept of a monolayer, a layer of material one molecule thick. In the early 1950s, Derjaguin and Abrikosova conducted the first measurement of surface forces.[25]

In 1974 the process of atomic layer deposition for depositing uniform thin films one atomic layer at a time was developed and patented by Tuomo Suntola and co-workers in Finland.[26]

In another development, the synthesis and properties of semiconductor nanocrystals were studied. This led to a rapidly increasing number of studies of semiconductor nanoparticles, or quantum dots.

Fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry. Smalley’s research in physical chemistry investigated formation of inorganic and semiconductor clusters using pulsed molecular beams and time of flight mass spectrometry. As a consequence of this expertise, Curl introduced him to Kroto in order to investigate a question about the constituents of astronomical dust. These are carbon-rich grains expelled by old stars such as R Coronae Borealis. The result of this collaboration was the discovery of C60 and the fullerenes as the third allotropic form of carbon. Subsequent discoveries included the endohedral fullerenes and, the following year, the larger family of fullerenes.[27][28]

The discovery of carbon nanotubes is largely attributed to Sumio Iijima of NEC in 1991, although carbon nanotubes have been produced and observed under a variety of conditions prior to 1991.[29] Iijima’s discovery of multi-walled carbon nanotubes in the insoluble material of arc-burned graphite rods in 1991[30] and Mintmire, Dunlap, and White’s independent prediction that if single-walled carbon nanotubes could be made, then they would exhibit remarkable conducting properties[31] helped create the initial buzz that is now associated with carbon nanotubes. Nanotube research accelerated greatly following the independent discoveries[32][33] by Bethune at IBM[34] and Iijima at NEC of single-walled carbon nanotubes and methods to specifically produce them by adding transition-metal catalysts to the carbon in an arc discharge.

In the early 1990s Huffman and Krätschmer, of the University of Arizona, discovered how to synthesize and purify large quantities of fullerenes. This opened the door to their characterization and functionalization by hundreds of investigators in government and industrial laboratories. Shortly after, rubidium-doped C60 was found to be a mid-temperature (Tc = 32 K) superconductor. At a meeting of the Materials Research Society in 1992, Dr. T. Ebbesen (NEC) described to a spellbound audience his discovery and characterization of carbon nanotubes. This event sent those in attendance and others downwind of his presentation into their laboratories to reproduce and push those discoveries forward. Using the same or similar tools as those used by Huffman and Krätschmer, hundreds of researchers further developed the field of nanotube-based nanotechnology.

The National Nanotechnology Initiative is a United States federal nanotechnology research and development program. The NNI serves as “the central point of communication, cooperation, and collaboration for all Federal agencies engaged in nanotechnology research, bringing together the expertise needed to advance this broad and complex field.”[35] Its goals are to advance a world-class nanotechnology research and development (R&D) program, foster the transfer of new technologies into products for commercial and public benefit, develop and sustain educational resources, a skilled workforce, and the supporting infrastructure and tools to advance nanotechnology, and support responsible development of nanotechnology. The initiative was spearheaded by Mihail Roco, who formally proposed the National Nanotechnology Initiative to the Office of Science and Technology Policy during the Clinton administration in 1999, and was a key architect in its development. He is currently the Senior Advisor for Nanotechnology at the National Science Foundation, as well as the founding chair of the National Science and Technology Council subcommittee on Nanoscale Science, Engineering and Technology.[36]

President Bill Clinton advocated nanotechnology development. In a 21 January 2000 speech[37] at the California Institute of Technology, Clinton said, “Some of our research goals may take twenty or more years to achieve, but that is precisely why there is an important role for the federal government.” Feynman’s stature and concept of atomically precise fabrication played a role in securing funding for nanotechnology research, as mentioned in President Clinton’s speech:

My budget supports a major new National Nanotechnology Initiative, worth $500 million. Caltech is no stranger to the idea of nanotechnology – the ability to manipulate matter at the atomic and molecular level. Over 40 years ago, Caltech’s own Richard Feynman asked, “What would happen if we could arrange the atoms one by one the way we want them?”[38]

President George W. Bush further increased funding for nanotechnology. On December 3, 2003 Bush signed into law the 21st Century Nanotechnology Research and Development Act,[39] which authorizes expenditures for five of the participating agencies totaling US$3.63 billion over four years.[40] The NNI budget supplement for Fiscal Year 2009 provides $1.5 billion to the NNI, reflecting steady growth in the nanotechnology investment.[41]

“Why the future doesn’t need us” is an article written by Bill Joy, then Chief Scientist at Sun Microsystems, in the April 2000 issue of Wired magazine. In the article, he argues that “Our most powerful 21st-century technologies – robotics, genetic engineering, and nanotech – are threatening to make humans an endangered species.” Joy argues that developing technologies pose a much greater danger to humanity than any technology before them ever has. In particular, he focuses on genetics, nanotechnology and robotics. He argues that 20th-century technologies of destruction, such as the nuclear bomb, were limited to large governments, due to the complexity and cost of such devices, as well as the difficulty in acquiring the required materials. He also voices concern about increasing computer power. His worry is that computers will eventually become more intelligent than we are, leading to such dystopian scenarios as robot rebellion. He notably quotes the Unabomber on this topic. After the publication of the article, Bill Joy suggested assessing technologies to gauge their implicit dangers, as well as having scientists refuse to work on technologies that have the potential to cause harm.

In the AAAS Science and Technology Policy Yearbook 2001 article titled “A Response to Bill Joy and the Doom-and-Gloom Technofuturists,” Bill Joy was criticized for having technological tunnel vision in his prediction, by failing to consider social factors.[42] In Ray Kurzweil’s The Singularity Is Near, he questioned the regulation of potentially dangerous technology, asking “Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk that these same technologies may someday be used for malevolent purposes?”

Prey is a 2002 novel by Michael Crichton which features an artificial swarm of nanorobots which develop intelligence and threaten their human inventors. The novel generated concern within the nanotechnology community that it could negatively affect public perception of nanotechnology by creating fear of a similar scenario in real life.[43]

Richard Smalley, best known for co-discovering the soccer ball-shaped buckyball molecule and a leading advocate of nanotechnology and its many applications, was an outspoken critic of the idea of molecular assemblers, as advocated by Eric Drexler. He introduced scientific objections to them,[44] attacking the notion of universal assemblers in a 2001 Scientific American article, leading to a rebuttal later that year from Drexler and colleagues,[45] and eventually to an exchange of open letters in 2003.[46]

Smalley criticized Drexler’s work on nanotechnology as naive, arguing that chemistry is extremely complicated, reactions are hard to control, and that a universal assembler is science fiction. Smalley believed that such assemblers were not physically possible and introduced scientific objections to them. His two principal technical objections, which he termed the “fat fingers problem” and the “sticky fingers problem”, argued against the feasibility of molecular assemblers being able to precisely select and place individual atoms. He also believed that Drexler’s speculations about apocalyptic dangers of molecular assemblers threatened the public support for development of nanotechnology.

Smalley first argued that “fat fingers” made MNT impossible. He later argued that nanomachines would have to resemble chemical enzymes more than Drexler’s assemblers and could only work in water. He believed these would exclude the possibility of “molecular assemblers” that worked by precision picking and placing of individual atoms. Also, Smalley argued that nearly all of modern chemistry involves reactions that take place in a solvent (usually water), because the small molecules of a solvent contribute many things, such as lowering binding energies for transition states. Since nearly all known chemistry requires a solvent, Smalley felt that Drexler’s proposal to use a high vacuum environment was not feasible.

Smalley also believed that Drexler’s speculations about apocalyptic dangers of self-replicating machines that have been equated with “molecular assemblers” would threaten the public support for development of nanotechnology. To address the debate between Drexler and Smalley regarding molecular assemblers, Chemical & Engineering News published a point-counterpoint consisting of an exchange of letters that addressed the issues.[46]

Drexler and coworkers responded to these two issues[45] in a 2001 publication. Drexler and colleagues noted that Drexler never proposed universal assemblers able to make absolutely anything, but instead proposed more limited assemblers able to make a very wide variety of things. They challenged the relevance of Smalley’s arguments to the more specific proposals advanced in Nanosystems. Drexler maintained that both were straw man arguments, and in the case of enzymes, Prof. Klibanov wrote in 1994, “…using an enzyme in organic solvents eliminates several obstacles…”[47] Drexler also addresses this in Nanosystems by showing mathematically that well designed catalysts can provide the effects of a solvent and can fundamentally be made even more efficient than a solvent/enzyme reaction could ever be. Drexler had difficulty in getting Smalley to respond, but in December 2003, Chemical & Engineering News carried a 4-part debate.[46]

Ray Kurzweil devotes four pages of his book The Singularity Is Near to showing that Richard Smalley’s arguments are not valid, disputing them point by point. Kurzweil concludes that Drexler’s visions are quite practicable and even already being realized.[48]

The Royal Society and Royal Academy of Engineering’s 2004 report on the implications of nanoscience and nanotechnologies[49] was inspired by Prince Charles’ concerns about nanotechnology, including molecular manufacturing. However, the report spent almost no time on molecular manufacturing.[50] In fact, the word “Drexler” appears only once in the body of the report (in passing), and “molecular manufacturing” or “molecular nanotechnology” not at all. The report covers various risks of nanoscale technologies, such as nanoparticle toxicology. It also provides a useful overview of several nanoscale fields. The report contains an annex (appendix) on grey goo, which cites a weaker variation of Richard Smalley’s contested argument against molecular manufacturing. It concludes that there is no evidence that autonomous, self-replicating nanomachines will be developed in the foreseeable future, and suggests that regulators should be more concerned with issues of nanoparticle toxicology.

The early 2000s saw the beginnings of the use of nanotechnology in commercial products, although most applications are limited to the bulk use of passive nanomaterials. Examples include titanium dioxide and zinc oxide nanoparticles in sunscreen, cosmetics and some food products; silver nanoparticles in food packaging, clothing, disinfectants and household appliances such as Silver Nano; carbon nanotubes for stain-resistant textiles; and cerium oxide as a fuel catalyst.[51] As of March 10, 2011, the Project on Emerging Nanotechnologies estimated that over 1,300 manufacturer-identified nanotech products were publicly available, with new ones hitting the market at a pace of 3 to 4 per week.[52]

The National Science Foundation funded researcher David Berube to study the field of nanotechnology. His findings are published in the monograph Nano-Hype: The Truth Behind the Nanotechnology Buzz. This study concludes that much of what is sold as “nanotechnology” is in fact a recasting of straightforward materials science, which is leading to a “nanotech industry built solely on selling nanotubes, nanowires, and the like” which will “end up with a few suppliers selling low-margin products in huge volumes.” Further applications which require actual manipulation or arrangement of nanoscale components await further research. Though technologies branded with the term ‘nano’ are sometimes little related to and fall far short of the most ambitious and transformative technological goals of the sort in molecular manufacturing proposals, the term still connotes such ideas. According to Berube, there may be a danger that a “nano bubble” will form, or is forming already, from the use of the term by scientists and entrepreneurs to garner funding, regardless of interest in the transformative possibilities of more ambitious and far-sighted work.[53]

Go here to see the original:

History of nanotechnology – Wikipedia

Nanotechnology – Wikipedia


The emergence of nanotechnology as a field in the 1980s occurred through the convergence of Drexler’s theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and of high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. Since the popularity spike in the 1980s, most of nanotechnology has involved investigation of several approaches to making mechanical devices out of a small number of atoms.[10]

In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era. First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds, and the instrument was successfully used to manipulate individual atoms in 1989. The microscope’s developers, Gerd Binnig and Heinrich Rohrer of the IBM Zurich Research Laboratory, received the Nobel Prize in Physics in 1986.[11][12] Binnig, Quate, and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[13][14] C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes, and sometimes Bucky tubes), which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society’s report on nanotechnology.[15] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[16]

Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[17][18]

Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[19][20] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12 to 0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which are approximately a quarter of a nm kinetic diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[21] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[22]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[23] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[23]
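The marble analogy holds up under a quick calculation. The sketch below is a minimal check in Python; the 13 mm marble and the approximate mean Earth diameter are assumed round numbers, not figures from the source.

```python
# Rough check of the marble-to-Earth analogy (figures below are approximate
# assumptions, not values from the source).
nanometer = 1e-9           # meters
meter = 1.0

marble_diameter = 0.013    # m, a typical 13 mm marble (assumed)
earth_diameter = 1.27e7    # m, approximate mean Earth diameter

print(nanometer / meter)                   # 1e-09
print(marble_diameter / earth_diameter)    # ~1.0e-09, the same order of magnitude
```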

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition.[24] In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[25]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Several phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions. However, quantum effects can become significant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface-area-to-volume ratio altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.

Materials reduced to the nanoscale can show different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances can become transparent (copper); stable materials can turn combustible (aluminium); insoluble materials may become soluble (gold). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these quantum and surface phenomena that matter exhibits at the nanoscale.[26]

Modern synthetic chemistry has reached the point where it is possible to prepare small molecules to almost any structure. These methods are used today to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble these single molecules into supramolecular assemblies consisting of many molecules arranged in a well defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of the protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.
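As a toy illustration of molecular recognition in the Watson–Crick sense, the sketch below checks whether two short DNA strands are fully complementary and therefore “recognize” one another. The pairing table is standard chemistry; the helper names and example sequences are only illustrative, and the strands are compared in the same reading direction rather than antiparallel, as real duplexes are.

```python
# Watson-Crick complementarity: A pairs with T, C pairs with G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the complementary strand, read in the same direction."""
    return "".join(COMPLEMENT[base] for base in strand)

def binds(strand_a: str, strand_b: str) -> bool:
    """Two strands are mutually 'recognized' only if every base is complementary."""
    return len(strand_a) == len(strand_b) and complement(strand_a) == strand_b

print(binds("ATGC", "TACG"))  # True  -- fully complementary, so they self-assemble
print(binds("ATGC", "TACC"))  # False -- a single mismatch breaks the recognition
```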

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably Watson–Crick base pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer new constructs in addition to natural ones.

Molecular nanotechnology, sometimes called molecular manufacturing, describes engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with the molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term “nanotechnology” was independently coined and popularized by Eric Drexler (who at the time was unaware of an earlier usage by Norio Taniguchi) it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular scale biological analogies of traditional machine components demonstrated molecular machines were possible: by the countless examples found in biology, it is known that sophisticated, stochastically optimised biological machines can be produced.

It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles. However, Drexler and other researchers[27] have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification.[28] The physics and engineering performance of exemplar designs were analyzed in Drexler’s book Nanosystems.

In general it is very difficult to assemble devices on the atomic scale, as one has to position atoms on other atoms of comparable size and stickiness. Another view, put forth by Carlo Montemagno,[29] is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis is impossible due to the difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the ACS publication Chemical & Engineering News in 2003.[30] Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Dr. Alex Zettl and his colleagues at Lawrence Berkeley Laboratories and UC Berkeley.[1] They have constructed at least three distinct molecular devices whose motion is controlled from the desktop with changing voltage: a nanotube nanomotor, a molecular actuator,[31] and a nanoelectromechanical relaxation oscillator.[32] See nanotube nanomotor for more examples.

An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

The nanomaterials field includes subfields which develop or study materials having unique properties arising from their nanoscale dimensions.[35]

Bottom-up approaches seek to arrange smaller components into more complex assemblies.

Top-down approaches seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches seek to develop components of a desired functionality without regard to how they might be assembled.

Speculative subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry might progress. These often take a big-picture view of nanotechnology, with more emphasis on its societal implications than on the details of how such inventions could actually be created.

Nanomaterials can be classified into 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical and biological properties. With a decrease in dimensionality, an increase in surface-to-volume ratio is observed, which indicates that lower-dimensional nanomaterials have a higher surface area than 3D nanomaterials. Recently, two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.

There are several important modern developments. The atomic force microscope (AFM) and the Scanning Tunneling Microscope (STM) are two early versions of scanning probes that launched nanotechnology. There are other types of scanning probe microscopy. Although conceptually similar to the scanning confocal microscope developed by Marvin Minsky in 1961 and the scanning acoustic microscope (SAM) developed by Calvin Quate and coworkers in the 1970s, newer scanning probe microscopes have much higher resolution, since they are not limited by the wavelength of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (a process called positional assembly). Feature-oriented scanning methodology may be a promising way to implement these nanomanipulations in automatic mode.[53][54] However, this is still a slow process because of low scanning velocity of the microscope.

Various techniques of nanolithography such as optical lithography, X-ray lithography, dip pen nanolithography, electron beam lithography or nanoimprint lithography were also developed. Lithography is a top-down fabrication technique in which a bulk material is reduced in size to a nanoscale pattern.

Another group of nanotechnological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers. The precursors of these techniques preceded the nanotech era, and are extensions in the development of scientific advancements rather than techniques which were devised with the sole purpose of creating nanotechnology and which were results of nanotechnology research.[55]

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis of nanomaterials. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.[53][54] At present, it is expensive and time-consuming for mass production but very suitable for laboratory experimentation.

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual polarisation interferometry is one tool suitable for characterisation of self-assembled thin films. Another variation of the bottom-up approach is molecular beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories, including John R. Arthur, Alfred Y. Cho, and Art C. Gossard, developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect, for which the 1998 Nobel Prize in Physics was awarded. MBE allows scientists to lay down atomically precise layers of atoms and, in the process, build up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

However, new therapeutic products, based on responsive nanomaterials, such as the ultradeformable, stress-sensitive Transfersome vesicles, are under development and already approved for human use in some countries.[56]

As of August 21, 2008, the Project on Emerging Nanotechnologies estimates that over 800 manufacturer-identified nanotech products are publicly available, with new ones hitting the market at a pace of 3–4 per week.[18] The project lists all of the products in a publicly accessible online database. Most applications are limited to the use of “first generation” passive nanomaterials, which include titanium dioxide in sunscreen, cosmetics, surface coatings,[57] and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.[17]

Further applications allow tennis balls to last longer, golf balls to fly straighter, and even bowling balls to become more durable and have a harder surface. Trousers and socks have been infused with nanotechnology so that they will last longer and keep people cool in the summer. Bandages are being infused with silver nanoparticles to heal cuts faster.[58] Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology.[59] Nanotechnology is also being used to build structures for on-chip computing with light, for example on-chip optical quantum information processing, and picosecond transmission of information.[60]

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like the general practitioner’s office and at home.[61] Cars are being manufactured with nanomaterials so they may need fewer metals and less fuel to operate in the future.[62]

Scientists are now turning to nanotechnology in an attempt to develop diesel engines with cleaner exhaust fumes. Platinum is currently used as the diesel engine catalyst in these engines. The catalyst is what cleans the exhaust fume particles. First a reduction catalyst is employed to take nitrogen atoms from NOx molecules in order to free oxygen. Next the oxidation catalyst oxidizes the hydrocarbons and carbon monoxide to form carbon dioxide and water.[63] Platinum is used in both the reduction and the oxidation catalysts.[64] Using platinum, though, is inefficient in that it is expensive and unsustainable. The Danish fund InnovationsFonden invested DKK 15 million in a search for new catalyst substitutes using nanotechnology. The goal of the project, launched in the autumn of 2014, is to maximize surface area and minimize the amount of material required. Objects tend to minimize their surface energy; two drops of water, for example, will join to form one drop and decrease surface area. If the catalyst’s surface area that is exposed to the exhaust fumes is maximized, efficiency of the catalyst is maximized. The team working on this project aims to create nanoparticles that will not merge. Every time the surface is optimized, material is saved. Thus, creating these nanoparticles will increase the effectiveness of the resulting diesel engine catalyst, in turn leading to cleaner exhaust fumes, and will decrease cost. If successful, the team hopes to reduce platinum use by 25%.[65]
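The surface-area reasoning behind such projects can be made concrete with a back-of-the-envelope calculation: for a fixed mass of catalyst divided into equal spheres, the exposed area per gram scales as 6/(density × diameter), so shrinking the particles directly multiplies the working surface. The sketch below assumes the textbook platinum density; the particle sizes are purely illustrative.

```python
# Exposed surface area of a fixed mass of platinum split into equal spheres.
# Area per mass = (surface/volume) / density = 6 / (density * diameter),
# so halving the diameter doubles the exposed surface.
PLATINUM_DENSITY = 21_450.0   # kg/m^3, textbook value

def area_per_gram(diameter_m: float) -> float:
    """Exposed surface area (m^2) per gram of catalyst at a given particle diameter."""
    return 6.0 / (PLATINUM_DENSITY * diameter_m) * 1e-3   # 1 g = 1e-3 kg

for d in (1e-6, 100e-9, 10e-9, 2e-9):   # from 1 micrometre down to 2 nm
    print(f"{d * 1e9:8.1f} nm particles -> {area_per_gram(d):8.2f} m^2 per gram")
```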

Nanotechnology also has a prominent role in the fast-developing field of tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell’s microenvironment to direct its differentiation down a suitable lineage.[66] For example, when creating scaffolds to support the growth of bone, researchers may mimic osteoclast resorption pits.[67]

Researchers have successfully used DNA origami-based nanobots capable of carrying out logic functions to achieve targeted drug delivery in cockroaches. It is said that the computational power of these nanobots can be scaled up to that of a Commodore 64.[68]

An area of concern is the effect that industrial-scale manufacturing and use of nanomaterials would have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated by governments. Others counter that overregulation would stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health are actively conducting research on potential health effects stemming from exposures to nanoparticles.[69][70]

Some nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are being released in the wash.[71] These particles are then flushed into the waste water stream and may destroy bacteria which are critical components of natural ecosystems, farms, and waste treatment processes.[72]

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.[73]

Experts, including director of the Woodrow Wilson Center’s Project on Emerging Nanotechnologies David Rejeski, have testified[74] that successful commercialization depends on adequate oversight, risk research strategy, and public engagement. Berkeley, California is currently the only city in the United States to regulate nanotechnology;[75] Cambridge, Massachusetts in 2008 considered enacting a similar law,[76] but ultimately rejected it.[77] Relevant for both research on and application of nanotechnologies, the insurability of nanotechnology is contested.[78] Without state regulation of nanotechnology, the availability of private insurance for potential damages is seen as necessary to ensure that burdens are not socialised implicitly. Over the next several decades, applications of nanotechnology will likely include much higher-capacity computers, active materials of various kinds, and cellular-scale biomedical devices.[10]

Nanofibers are used in several areas and in different products, in everything from aircraft wings to tennis rackets. Inhaling airborne nanoparticles and nanofibers may lead to a number of pulmonary diseases, e.g. fibrosis.[79] Researchers have found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response[80] and that nanoparticles induce skin aging through oxidative stress in hairless mice.[81][82]

A two-year study at UCLA’s School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree “linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging”.[83]

A major study published more recently in Nature Nanotechnology suggests some forms of carbon nanotubes, a poster child for the “nanotechnology revolution”, could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said “We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully.”[84] In the absence of specific regulation forthcoming from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food.[85] A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.[86][87][88][89]

Calls for tighter regulation of nanotechnology have occurred alongside a growing debate related to the human health and safety risks of nanotechnology.[90] There is significant debate about who is responsible for the regulation of nanotechnology. Some regulatory agencies currently cover some nanotechnology products and processes (to varying degrees) by “bolting on” nanotechnology to existing regulations, but there are clear gaps in these regimes.[91] Davies (2008) has proposed a regulatory road map describing steps to deal with these shortcomings.[92]

Stakeholders concerned by the lack of a regulatory framework to assess and control risks associated with the release of nanoparticles and nanotubes have drawn parallels with bovine spongiform encephalopathy (“mad cow” disease), thalidomide, genetically modified food,[93] nuclear energy, reproductive technologies, biotechnology, and asbestosis. Dr. Andrew Maynard, chief science advisor to the Woodrow Wilson Center’s Project on Emerging Nanotechnologies, concludes that there is insufficient funding for human health and safety research, and as a result there is currently limited understanding of the human health and safety risks associated with nanotechnology.[94] As a result, some academics have called for stricter application of the precautionary principle, with delayed marketing approval, enhanced labelling and additional safety data development requirements in relation to certain forms of nanotechnology.[95][96]

The Royal Society report[15] identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that “manufacturers of products that fall under extended producer responsibility regimes – such as end-of-life regulations – publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure” (p. xiii).

The Center for Nanotechnology in Society has found that people respond to nanotechnologies differently depending on the application, with participants in public deliberations more positive about nanotechnologies for energy than for health applications, suggesting that any public calls for nano regulations may differ by technology sector.[73]


What is Nanotechnology? | Nano

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers.

Physicist Richard Feynman, the father of nanotechnology.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled There’s Plenty of Room at the Bottom by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology (CalTech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn’t until 1981, with the development of the scanning tunneling microscope that could “see” individual atoms, that modern nanotechnology began.

It’s hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ of a meter. Here are a few illustrative examples:
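The figures below are computed directly from the definition; the everyday dimensions used as inputs (the 25.4 mm inch is exact, the roughly 0.1 mm sheet of paper is an assumed approximation) serve only as illustration.

```python
NANOMETERS_PER_METER = 1e9

inch_in_meters = 0.0254        # exact: 1 inch = 25.4 mm
paper_thickness_m = 1e-4       # ~0.1 mm, a rough assumed value for a sheet of paper

print(inch_in_meters * NANOMETERS_PER_METER)       # 25,400,000 nm in one inch
print(paper_thickness_m * NANOMETERS_PER_METER)    # ~100,000 nm across a sheet of paper
```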

Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms: the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

But something as small as an atom is impossible to see with the naked eye. In fact, it’s impossible to see with the microscopes typically used in high school science classes. The microscopes needed to see things at the nanoscale were invented relatively recently, about 30 years ago.

Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born.

Although modern nanoscience and nanotechnology are quite new, nanoscale materials were used for centuries. Alternate-sized gold and silver particles created colors in the stained glass windows of medieval churches hundreds of years ago. The artists back then just didn’t know that the process they used to create these beautiful works of art actually led to changes in the composition of the materials they were working with.

Today’s scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of their enhanced properties, such as higher strength, lighter weight, increased control of the light spectrum, and greater chemical reactivity than their larger-scale counterparts.


Nanotechnology | Britannica.com

Nanotechnology, the manipulation and manufacture of materials and devices on the scale of atoms or small groups of atoms. The nanoscale is typically measured in nanometres, or billionths of a metre (nanos, the Greek word for dwarf, being the source of the prefix), and materials built at this scale often exhibit distinctive physical and chemical properties due to quantum mechanical effects. Although usable devices this small may be decades away (see microelectromechanical system), techniques for working at the nanoscale have become essential to electronic engineering, and nanoengineered materials have begun to appear in consumer products. For example, billions of microscopic nanowhiskers, each about 10 nanometres in length, have been molecularly hooked onto natural and synthetic fibres to impart stain resistance to clothing and other fabrics; zinc oxide nanocrystals have been used to create invisible sunscreens that block ultraviolet light; and silver nanocrystals have been embedded in bandages to kill bacteria and prevent infection.

Possibilities for the future are numerous. Nanotechnology may make it possible to manufacture lighter, stronger, and programmable materials that require less energy to produce than conventional materials, that produce less waste than with conventional manufacturing, and that promise greater fuel efficiency in land transportation, ships, aircraft, and space vehicles. Nanocoatings for both opaque and translucent surfaces may render them resistant to corrosion, scratches, and radiation. Nanoscale electronic, magnetic, and mechanical devices and systems with unprecedented levels of information processing may be fabricated, as may chemical, photochemical, and biological sensors for protection, health care, manufacturing, and the environment; new photoelectric materials that will enable the manufacture of cost-efficient solar-energy panels; and molecular-semiconductor hybrid devices that may become engines for the next revolution in the information age. The potential for improvements in health, safety, quality of life, and conservation of the environment are vast.

At the same time, significant challenges must be overcome for the benefits of nanotechnology to be realized. Scientists must learn how to manipulate and characterize individual atoms and small groups of atoms reliably. New and improved tools are needed to control the properties and structure of materials at the nanoscale; significant improvements in computer simulations of atomic and molecular structures are essential to the understanding of this realm. Next, new tools and approaches are needed for assembling atoms and molecules into nanoscale systems and for the further assembly of small systems into more-complex objects. Furthermore, nanotechnology products must provide not only improved performance but also lower cost. Finally, without integration of nanoscale objects with systems at the micro- and macroscale (that is, from millionths of a metre up to the millimetre scale), it will be very difficult to exploit many of the unique properties found at the nanoscale.

Nanotechnology is highly interdisciplinary, involving physics, chemistry, biology, materials science, and the full range of the engineering disciplines. The word nanotechnology is widely used as shorthand to refer to both the science and the technology of this emerging field. Narrowly defined, nanoscience concerns a basic understanding of physical, chemical, and biological properties on atomic and near-atomic scales. Nanotechnology, narrowly defined, employs controlled manipulation of these properties to create materials and functional systems with unique capabilities.

In contrast to recent engineering efforts, nature developed nanotechnologies over billions of years, employing enzymes and catalysts to organize with exquisite precision different kinds of atoms and molecules into complex microscopic structures that make life possible. These natural products are built with great efficiency and have impressive capabilities, such as the power to harvest solar energy, to convert minerals and water into living cells, to store and process massive amounts of data using large arrays of nerve cells, and to replicate perfectly billions of bits of information stored in molecules of deoxyribonucleic acid (DNA).

There are two principal reasons for qualitative differences in material behaviour at the nanoscale (traditionally defined as less than 100 nanometres). First, quantum mechanical effects come into play at very small dimensions and lead to new physics and chemistry. Second, a defining feature at the nanoscale is the very large surface-to-volume ratio of these structures. This means that no atom is very far from a surface or interface, and the behaviour of atoms at these higher-energy sites has a significant influence on the properties of the material. For example, the reactivity of a metal catalyst particle generally increases appreciably as its size is reduced: macroscopic gold is chemically inert, whereas at nanoscales gold becomes extremely reactive and catalytic and even melts at a lower temperature. Thus, at nanoscale dimensions material properties depend on and change with size, as well as composition and structure.

Using the processes of nanotechnology, basic industrial production may veer dramatically from the course followed by steel plants and chemical factories of the past. Raw materials will come from the atoms of abundant elements (carbon, hydrogen, and silicon), and these will be manipulated into precise configurations to create nanostructured materials that exhibit exactly the right properties for each particular application. For example, carbon atoms can be bonded together in a number of different geometries to create variously a fibre, a tube, a molecular coating, or a wire, all with the superior strength-to-weight ratio of another carbon material, diamond. Additionally, such material processing need not require smokestacks, power-hungry industrial machinery, or intensive human labour. Instead, it may be accomplished either by growing new structures through some combination of chemical catalysts and synthetic enzymes or by building them through new techniques based on patterning and self-assembly of nanoscale materials into useful predetermined designs. Nanotechnology ultimately may allow people to fabricate almost any type of material or product allowable under the laws of physics and chemistry. While such possibilities seem remote, even approaching nature’s virtuosity in energy-efficient fabrication would be revolutionary.

Even more revolutionary would be the fabrication of nanoscale machines and devices for incorporation into micro- and macroscale systems. Once again, nature has led the way with the fabrication of both linear and rotary molecular motors. These biological machines carry out such tasks as muscle contraction (in organisms ranging from clams to humans) and shuttling little packets of material around within cells while being powered by the recyclable, energy-efficient fuel adenosine triphosphate. Scientists are only beginning to develop the tools to fabricate functioning systems at such small scales, with most advances based on electronic or magnetic information processing and storage systems. The energy-efficient, reconfigurable, and self-repairing aspects of biological systems are just becoming understood.

The potential impact of nanotechnology processes, machines, and products is expected to be far-reaching, affecting nearly every conceivable information technology, energy source, agricultural product, medical device, pharmaceutical, and material used in manufacturing. Meanwhile, the dimensions of electronic circuits on semiconductors continue to shrink, with minimum feature sizes now reaching the nanorealm, under 100 nanometres. Likewise, magnetic memory materials, which form the basis of hard disk drives, have achieved dramatically greater memory density as a result of nanoscale structuring to exploit new magnetic effects at nanodimensions. These latter two areas represent another major trend, the evolution of critical elements of microtechnology into the realm of nanotechnology to enhance performance. They are immense markets driven by the rapid advance of information technology.

In a lecture in 1959 to the American Physical Society, There’s Plenty of Room at the Bottom, American Nobelist Richard P. Feynman presented his audience with a vision of what could be done with extreme miniaturization. He began his lecture by noting that the Lord’s Prayer had been written on the head of a pin and asked,

Why cannot we write the entire 24 volumes of the Encyclopædia Britannica on the head of a pin? Let’s see what would be involved. The head of a pin is a sixteenth of an inch across. If you magnify it by 25,000 diameters, the area of the head of the pin is then equal to the area of all the pages of the Encyclopædia Britannica. Therefore, all it is necessary to do is to reduce in size all the writing in the Encyclopædia by 25,000 times. Is that possible? The resolving power of the eye is about 1/120 of an inch – that is roughly the diameter of one of the little dots on the fine half-tone reproductions in the Encyclopædia. This, when you demagnify it by 25,000 times, is still 80 angstroms in diameter – 32 atoms across, in an ordinary metal. In other words, one of those dots still would contain in its area 1,000 atoms. So, each dot can easily be adjusted in size as required by the photoengraving, and there is no question that there is enough room on the head of a pin to put all of the Encyclopædia Britannica.
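Feynman’s arithmetic is straightforward to re-check. The sketch below reproduces it, with the roughly 2.5 angstrom atomic spacing for an ordinary metal taken as an assumption.

```python
import math

# Re-checking Feynman's numbers (the ~2.5 angstrom atomic spacing is assumed).
ANGSTROMS_PER_INCH = 2.54e8   # 1 inch = 2.54 cm = 2.54e8 angstroms
ATOM_SPACING = 2.5            # angstroms, typical for an ordinary metal (assumed)

dot_diameter = (1.0 / 120.0) / 25_000 * ANGSTROMS_PER_INCH
atoms_across = dot_diameter / ATOM_SPACING
atoms_in_dot_area = math.pi * (atoms_across / 2) ** 2

print(round(dot_diameter))       # ~85 angstroms, close to Feynman's 80
print(round(atoms_across))       # ~34 atoms across, close to his 32
print(round(atoms_in_dot_area))  # ~900 atoms in the dot's area, roughly his 1,000
```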

Feynman was intrigued by biology and pointed out that

cells are very tiny, but they are very active; they manufacture various substances; they walk around; they wiggle; and they do all kinds of marvelous things – all on a very small scale. Also, they store information. Consider the possibility that we too can make a thing very small which does what we want – that we can manufacture an object that maneuvers at that level!

He also considered using big tools to make smaller tools that could make yet smaller tools, eventually obtaining nanoscale tools for directly manipulating atoms and molecules. In considering what all this might mean, Feynman declared,

I can hardly doubt that when we have some control of the arrangement of things on a small scale we will get an enormously greater range of possible properties that substances can have, and of different things that we can do.

Perhaps the biggest barrier to following these prophetic thoughts was simply the immediate lack of tools to manipulate and visualize matter at such a small scale. The availability of tools has always been an enabling aspect of the advance of all science and technology, and some of the key tools for nanotechnology are discussed in the next section, Pioneers.

Starting with a 1981 paper in the Proceedings of the National Academy of Sciences and following with two popular books, Engines of Creation (1986) and Nanosystems (1992), American scientist K. Eric Drexler became one of the foremost advocates of nanotechnology. In fact, Drexler was the first person anywhere to receive a Ph.D. in molecular nanotechnology (from the Massachusetts Institute of Technology). In his written works he takes a molecular view of the world and envisions molecular machines doing much of the work of the future. For example, he refers to “assemblers,” which will manipulate individual atoms to manufacture structures, and “replicators,” which will be able to make multiple copies of themselves in order to save time dealing with the billions of atoms needed to make objects of useful size. In an article for Encyclopædia Britannica’s 1990 Yearbook of Science and the Future, Drexler wrote:

Cells and tissues in the human body are built and maintained by molecular machinery, but sometimes that machinery proves inadequate: viruses multiply, cancer cells spread, or systems age and deteriorate. As one might expect, new molecular machines and computers of subcellular size could support the body’s own mechanisms. Devices containing nanocomputers interfaced to molecular sensors and effectors could serve as an augmented immune system, searching out and destroying viruses and cancer cells. Similar devices programmed as repair machines could enter living cells to edit out viral DNA sequences and repair molecular damage. Such machines would bring surgical control to the molecular level, opening broad new horizons in medicine.

Drexler’s futurist visions have stimulated much thought, but the assembler approach has failed to account for the strong influence of atomic and molecular forces (i.e., the chemistry) at such dimensions. The controversy surrounding these popularizations, and the potential dangers of entities such as intelligent replicators (however remote), have stimulated debate over the ethical and societal implications of nanotechnology.

A number of key technological milestones have been achieved by working pioneers. Molecular beam epitaxy, invented by Alfred Cho and John Arthur at Bell Labs in 1968 and developed in the 1970s, enabled the controlled deposition of single atomic layers. This tool provided for nanostructuring in one dimension as atomic layers were grown one upon the next. It subsequently became important in the area of compound semiconductor device fabrication. For example, sandwiching one-nanometre-thick layers of nonmagnetic-sensor materials between magnetic layers in computer disk drives resulted in large increases in storage capacity, and a similar use of nanostructuring resulted in more energy-efficient semiconductor lasers for use in compact disc players.

In 1981 Gerd Binnig and Heinrich Rohrer developed the scanning tunneling microscope at IBM’s laboratories in Switzerland. This tool provided a revolutionary advance by enabling scientists to image the position of individual atoms on surfaces. It earned Binnig and Rohrer a Nobel Prize in 1986 and spawned a wide variety of scanning probe tools for nanoscale observations.

The observation of new carbon structures marked another important milestone in the advance of nanotechnology, with Nobel Prizes for the discoverers. In 1985 Robert F. Curl, Jr., Harold W. Kroto, and Richard E. Smalley discovered the first fullerene, the third known form of pure carbon (after diamond and graphite). They named their discovery buckminsterfullerene (“buckyball”) for its resemblance to the geodesic domes promoted by the American architect R. Buckminster Fuller. Technically called C60 for the 60 carbon atoms that form their hollow spherical structure, buckyballs resemble a football one nanometre in diameter. In 1991 Sumio Iijima of NEC Corporation in Japan discovered carbon nanotubes, in which the carbon ringlike structures are extended from spheres into long tubes of varying diameter. Taken together, these new structures surprised and excited the imaginations of scientists about the possibilities of forming well-defined nanostructures with unexpected new properties.

The scanning tunneling microscope not only allowed for the imaging of atoms by scanning a sharp probe tip over a surface, but it also allowed atoms to be pushed around on the surface. With a slight bias voltage applied to the probe tip, certain atoms could be made to adhere to the tip used for imaging and then to be released from it. Thus, in 1990 Donald Eigler spelled out the letters of his company’s logo, IBM, by moving 35 xenon atoms into place on a nickel surface. This demonstration caught the public’s attention because it showed the precision of the emerging nanoscale tools.

At nanoscale dimensions the properties of materials no longer depend solely on composition and structure in the usual sense. Nanomaterials display new phenomena associated with quantized effects and with the preponderance of surfaces and interfaces.

Quantized effects arise in the nanometre regime because the overall dimensions of objects are comparable to the characteristic wavelength for fundamental excitations in materials. For example, electron wave functions (see also de Broglie wave) in semiconductors are typically on the order of 10 to 100 nanometres. Such excitations include the wavelength of electrons, photons, phonons, and magnons, to name a few. These excitations carry the quanta of energy through materials and thus determine the dynamics of their propagation and transformation from one form to another. When the size of structures is comparable to the quanta themselves, it influences how these excitations move through and interact in the material. Small structures may limit flow, create wave interference effects, and otherwise bring into play quantum mechanical selection rules not apparent at larger dimensions.
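One way to put a number on such a characteristic length is the thermal de Broglie wavelength of an electron, λ = h/√(2πmkT). The sketch below evaluates it for the free-electron mass and, as an assumed example, a GaAs-like effective mass, which lands the result in the 10 to 100 nanometre range quoted above.

```python
import math

# Thermal de Broglie wavelength of an electron: lambda = h / sqrt(2*pi*m*k*T).
# A rough estimate of the length at which quantum confinement becomes relevant.
H = 6.626e-34      # Planck constant, J*s
K_B = 1.381e-23    # Boltzmann constant, J/K
M_E = 9.109e-31    # free-electron mass, kg
T = 300.0          # room temperature, K

def thermal_wavelength_m(mass_kg: float) -> float:
    return H / math.sqrt(2 * math.pi * mass_kg * K_B * T)

print(thermal_wavelength_m(M_E) * 1e9)           # ~4 nm for a free electron
print(thermal_wavelength_m(0.067 * M_E) * 1e9)   # ~17 nm with a GaAs-like effective mass
```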

Quantum mechanical properties for confinement of electrons in one dimension have long been exploited in solid-state electronics. Semiconductor devices are grown with thin layers of differing composition so that electrons (or holes in the case of missing electron charges) can be confined in specific regions of the structure (known as quantum wells). Thin layers with larger energy bandgaps can serve as barriers that restrict the flow of charges to certain conditions under which they can tunnel through these barriersthe basis of resonant tunneling diodes. Superlattices are periodic structures of repeating wells that set up a new set of selection rules which affect the conditions for charges to flow through the structure. Superlattices have been exploited in cascade lasers to achieve far infrared wavelengths. Modern telecommunications is based on semiconductor lasers that exploit the unique properties of quantum wells to achieve specific wavelengths and high efficiency.

The propagation of photons is altered dramatically when the size and periodicity of the transient structure approach the wavelength of visible light (400 to 800 nanometres). When photons propagate through a periodically varying dielectric constant (for example, semiconductor posts surrounded by air), quantum mechanical rules define and limit the propagation of the photons depending on their energy (wavelength). This new behaviour is analogous to the quantum mechanical rules that define the motion of electrons through crystals, giving bandgaps for semiconductors. In one dimension, compound semiconductor superlattices can be grown epitaxially with the alternating layers having different dielectric constants, thus providing highly reflective mirrors for specific wavelengths as determined by the repeat distance of layers in the superlattice. These structures are used to provide built-in mirrors for vertical-cavity surface-emitting lasers, which are used in communications applications. In two and three dimensions, periodic structures known as photonic crystals offer additional control over photon propagation.

Photonic crystals are being explored in a variety of materials and periodicities, such as two-dimensional hexagonal arrays of posts fabricated in compound semiconductors or stacked loglike arrays of silicon bars in three dimensions. The dimensions of these structures depend on the wavelength of light being propagated and are typically in the range of a few hundred nanometres for wavelengths in the visible and near infrared. Photonic crystal properties based on nanostructured materials offer the possibility of confining, steering, and separating light by wavelength on unprecedented small scales and of creating new devices such as lasers that require very low currents to initiate lasing (called near-thresholdless lasers). These structures are being extensively investigated as the tools for nanostructuring materials are steadily advancing. Researchers are particularly interested in the infrared wavelengths, where dimensional control is not as stringent as at the shorter visible wavelengths and where optical communications and chemical sensing provide motivation for potential new applications.

Nanoscale materials also have size-dependent magnetic behaviour, mechanical properties, and chemical reactivity. At very small sizes (a few nanometres), magnetic nanoclusters have a single magnetic domain, and the strongly coupled magnetic spins on each atom combine to produce a particle with a single giant spin. For example, the giant spin of a ferromagnetic iron particle rotates freely at room temperature for diameters below about 16 nanometres, an effect termed superparamagnetism. Mechanical properties of nanostructured materials can reach exceptional strengths. As a specific example, the introduction of two-nanometre aluminum oxide precipitates into thin films of pure nickel results in yield strengths increasing from 0.15 to 5 gigapascals, which is more than twice that for a hard bearing steel. Another example of exceptional mechanical properties at the nanoscale is the carbon nanotube, which exhibits great strength and stiffness along its longitudinal axis.

The preponderance of surfaces is a major reason for the change in behaviour of materials at the nanoscale. Since up to half of all the atoms in nanoparticles are surface atoms, properties such as electrical transport are no longer determined by solid-state bulk phenomena. Likewise, the atoms in nanostructures have a higher average energy than atoms in larger structures, because of the large proportion of surface atoms. For example, catalytic materials have a greater chemical activity per atom of exposed surface as the catalyst is reduced in size at the nanoscale. Defects and impurities may be attracted to surfaces and interfaces, and interactions between particles at these small dimensions can depend on the structure and nature of chemical bonding at the surface. Molecular monolayers may be used to change or control surface properties and to mediate the interaction between nanoparticles.
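The claim that up to half of the atoms in a nanoparticle sit at its surface can be checked with a simple shell model: treat the particle as a sphere and count the outermost atomic layer. The sketch below assumes an atomic diameter of about 0.25 nm and uniform packing, so it shows the trend rather than an exact count.

```python
# Fraction of atoms in the outermost atomic shell of a spherical particle.
# Geometric estimate assuming ~0.25 nm atoms and uniform packing.
ATOM_DIAMETER_NM = 0.25

def surface_fraction(particle_diameter_nm: float) -> float:
    core = max(particle_diameter_nm - 2 * ATOM_DIAMETER_NM, 0.0)
    return 1.0 - (core / particle_diameter_nm) ** 3

for d in (2, 5, 10, 100):
    print(f"{d:5d} nm particle -> {surface_fraction(d):.0%} of atoms at the surface")
    # ~58% at 2 nm, falling to ~1% at 100 nm
```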

Surfaces and their interactions with molecular structures are basic to all biology. The intersection of nanotechnology and biotechnology offers the possibility of achieving new functions and properties with nanostructured surfaces. In this surface- and interface-dominated regime, biology does an exquisite job of selectively controlling functions through a combination of structure and chemical forces. The transcription of information stored in genes and the selectivity of biochemical reactions based on chemical recognition of complex molecules are examples where interfaces play the key role in establishing nanoscale behaviour. Atomic forces and chemical bonds dominate at these dimensions, while macroscopic effects, such as convection, turbulence, and momentum (inertial forces), are of little consequence.

As discussed in the section Properties at the nanoscale, material properties (electrical, optical, magnetic, mechanical, and chemical) depend on their exact dimensions. This opens the way for development of new and improved materials through manipulation of their nanostructure. Hierarchical assemblies of nanoscale-engineered materials into larger structures, or their incorporation into devices, provide the basis for tailoring radically new materials and machines.

Nature’s assemblies point the way to improving structural materials. The often-cited abalone seashell provides a beautiful example of how the combination of a hard, brittle inorganic material with nanoscale structuring and a soft, tough organic material can produce a strong, durable nanocomposite; basically, these nanocomposites are made of calcium carbonate bricks held together by a glycoprotein glue. New engineered materials are emerging, such as polymer-clay nanocomposites, that are not only strong and tough but also lightweight and easier to recycle than conventional reinforced plastics. Such improvements in structural materials are particularly important for the transportation industry, where reduced weight directly translates into improved fuel economy. Other improvements can increase safety or decrease the impact on the environment of fabrication and recycling. Further advances, such as truly smart materials that signal their impending failure or are even able to self-repair flaws, may be possible with composites of the future.

Sensors are central to almost all modern control systems. For example, multiple sensors are used in automobiles for such diverse tasks as engine management, emission control, security, safety, comfort, vehicle monitoring, and diagnostics. While such traditional applications for physical sensing generally rely on microscale sensing devices, the advent of nanoscale materials and structures has led to new electronic, photonic, and magnetic nanosensors, sometimes known as smart dust. Because of their small size, nanosensors exhibit unprecedented speed and sensitivity, extending in some cases down to the detection of single molecules. For example, nanowires made of carbon nanotubes, silicon, or other semiconductor materials exhibit exceptional sensitivity to chemical species or biological agents. Electrical current through nanowires can be altered by having molecules attached to their surface that locally perturb their electronic band structure. By means of nanowire surfaces coated with sensor molecules that selectively attach particular species, charge-induced changes in current can be used to detect the presence of those species. This same strategy is adopted for many classes of sensing systems. New types of sensors with ultrahigh sensitivity and specificity will have many applications; for example, sensors that can detect cancerous tumours when they consist of only a few cells would be a very significant advance.

Nanomaterials also make excellent filters for trapping heavy metals and other pollutants from industrial wastewater. One of the greatest potential impacts of nanotechnology on the lives of the majority of people on Earth will be in the area of economical water desalination and purification. Nanomaterials will very likely find important use in fuel cells, bioconversion for energy, bioprocessing of food products, waste remediation, and pollution-control systems.

A recent concern regarding nanoparticles is whether their small sizes and novel properties may pose significant health or environmental risks. In general, ultrafine particles, such as the carbon in photocopier toners or in soot produced by combustion engines and factories, have adverse respiratory and cardiovascular effects on people and animals. Studies are under way to determine if specific nanoscale particles pose higher risks that may require special regulatory restrictions. Of particular concern are potential carcinogenic risks from inhaled particles and the possibility for very small nanoparticles to cross the blood-brain barrier to unknown effect. Nanomaterials currently receiving attention from health officials include carbon nanotubes, buckyballs, and cadmium selenide quantum dots. Studies of the absorption through the skin of titanium oxide nanoparticles (used in sunscreens) are also planned. More far-ranging studies of the toxicity, transport, and overall fate of nanoparticles in ecosystems and the environment have not yet been undertaken. Some early animal studies, involving the introduction of very high levels of nanoparticles which resulted in the rapid death of many of the subjects, are quite controversial.

Nanotechnology promises to impact medical treatment in multiple ways. First, advances in nanoscale particle design and fabrication provide new options for drug delivery and drug therapies. More than half of the new drugs developed each year are not water-soluble, which makes their delivery difficult. In the form of nanosized particles, however, these drugs are more readily transported to their destination, and they can be delivered in the conventional form of pills.

More important, nanotechnology may enable drugs to be delivered to precisely the right location in the body and to release drug doses on a predetermined schedule for optimal treatment. The general approach is to attach the drug to a nanosized carrier that will release the medicine in the body over an extended period of time or when specifically triggered to do so. In addition, the surfaces of these nanoscale carriers may be treated to seek out and become localized at a disease site, for example, attaching to cancerous tumours. One type of molecule of special interest for these applications is an organic dendrimer. A dendrimer is a special class of polymeric molecule that weaves in and out from a hollow central region. These spherical fuzz balls are about the size of a typical protein but cannot unfold like proteins. Interest in dendrimers derives from the ability to tailor their cavity sizes and chemical properties to hold different therapeutic agents. Researchers hope to design different dendrimers that can swell and release their drug on exposure to specifically recognized molecules that indicate a disease target. This same general approach to nanoparticle-directed drug delivery is being explored for other types of nanoparticles as well.

Another approach involves gold-coated nanoshells whose size can be adjusted to absorb light energy at different wavelengths. In particular, infrared light will pass through several centimetres of body tissue, allowing a delicate and precise heating of such capsules in order to release the therapeutic substance within. Furthermore, antibodies may be attached to the outer gold surface of the shells to cause them to bind specifically to certain tumour cells, thereby reducing the damage to surrounding healthy cells.

A second area of intense study in nanomedicine is that of developing new diagnostic tools. Motivation for this work ranges from fundamental biomedical research at the level of single genes or cells to point-of-care applications for health delivery services. With advances in molecular biology, much diagnostic work now focuses on detecting specific biological signatures. These analyses are referred to as bioassays. Examples include studies to determine which genes are active in response to a particular disease or drug therapy. A general approach involves attaching fluorescing dye molecules to the target biomolecules in order to reveal their concentration.

Another approach to bioassays uses semiconductor nanoparticles, such as cadmium selenide, which emit light of a specific wavelength depending on their size. Different-size particles can be tagged to different receptors so that a wider variety of distinct colour tags are available than can be distinguished for dye molecules. The degradation in fluorescence with repeated excitation for dyes is avoided. Furthermore, various-size particles can be encapsulated in latex beads and their resulting wavelengths read like a bar code. This approach, while still in the exploratory stage, would allow for an enormous number of distinct labels for bioassays.
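The size dependence of the emission colour can be estimated with the Brus effective-mass model, in which quantum confinement raises a nanocrystal’s band gap roughly as 1/R² above the bulk value. The cadmium selenide parameters in the sketch below are approximate literature values taken here as assumptions, so the output is indicative rather than exact.

```python
import math

# Brus effective-mass estimate of a CdSe quantum dot's emission wavelength:
# E(R) ~ E_bulk + (hbar^2 pi^2 / 2R^2)(1/m_e + 1/m_h) - 1.8 e^2 / (4 pi eps0 eps R)
HBAR = 1.055e-34
E_CHARGE = 1.602e-19
EPS0 = 8.854e-12
M0 = 9.109e-31

E_GAP_BULK_EV = 1.74     # CdSe bulk band gap, eV (approximate)
M_ELECTRON = 0.13 * M0   # effective electron mass (assumed literature value)
M_HOLE = 0.45 * M0       # effective hole mass (assumed literature value)
EPS_REL = 10.6           # relative permittivity (assumed literature value)

def emission_wavelength_nm(radius_nm: float) -> float:
    r = radius_nm * 1e-9
    confinement = (HBAR**2 * math.pi**2) / (2 * r**2) * (1 / M_ELECTRON + 1 / M_HOLE)
    coulomb = 1.8 * E_CHARGE**2 / (4 * math.pi * EPS0 * EPS_REL * r)
    energy_ev = E_GAP_BULK_EV + (confinement - coulomb) / E_CHARGE
    return 1240.0 / energy_ev   # photon wavelength in nm

for radius in (2.0, 3.0, 4.0):
    # prints roughly 490, 600 and 650 nm: smaller dots emit bluer light
    print(f"radius {radius} nm -> ~{emission_wavelength_nm(radius):.0f} nm emission")
```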

Another nanotechnology variation on bioassays is to attach one half of the single-stranded complementary DNA segment for the genetic sequence to be detected to one set of gold particles and the other half to a second set of gold particles. When the material of interest is present in a solution, the two attachments cause the gold balls to agglomerate, providing a large change in optical properties that can be seen in the colour of the solution. If both halves of the sequence do not match, no agglomeration will occur and no change will be observed.

Approaches that do not involve optical detection techniques are also being explored with nanoparticles. For example, magnetic nanoparticles can be attached to antibodies that in turn recognize and attach to specific biomolecules. The magnetic particles then act as tags and handles through which magnetic fields can be used for mixing, extracting, or identifying the attached biomolecules within microlitre- or nanolitre-sized samples. Because magnetic nanoparticles remain magnetized as a single domain for a significant period, they can be aligned and detected in a magnetic field. In particular, attached antibody-magnetic-nanoparticle combinations rotate slowly and give a distinctive magnetic signal, whereas magnetically tagged antibodies that are not attached to the biological material being detected rotate more rapidly and so do not give the same distinctive signal.
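The difference in rotation rate follows from the Brownian rotational relaxation time, which grows with the hydrodynamic volume of whatever is tumbling in the liquid. A back-of-envelope sketch, assuming illustrative particle sizes in water at room temperature:

```python
# Back-of-envelope illustration of why bound and unbound magnetic tags rotate
# at different rates: the Brownian rotational relaxation time scales with the
# hydrodynamic volume, tau_B = 3*eta*V_h / (k_B*T). Particle sizes are
# assumed purely for illustration.
import math

k_B = 1.380649e-23   # J/K
T   = 300.0          # K, roughly room temperature
eta = 1.0e-3         # Pa*s, viscosity of water

def brownian_relaxation_time(hydrodynamic_diameter_nm):
    r = hydrodynamic_diameter_nm * 1e-9 / 2
    V_h = (4.0 / 3.0) * math.pi * r**3
    return 3 * eta * V_h / (k_B * T)

free_tag  = brownian_relaxation_time(50)    # antibody-coated nanoparticle alone
bound_tag = brownian_relaxation_time(150)   # tag attached to its target biomolecule

print(f"free tag : tau_B ~ {free_tag*1e6:.1f} microseconds")
print(f"bound tag: tau_B ~ {bound_tag*1e6:.1f} microseconds")
```

Because the relaxation time scales with the cube of the hydrodynamic diameter, even a modest increase in size from binding slows the rotation dramatically, which is what makes the bound and unbound signals distinguishable.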

Microfluidic systems, or labs-on-a-chip, have been developed for biochemical assays of minuscule samples. Typically cramming numerous electronic and mechanical components into a portable unit no larger than a credit card, they are especially useful for conducting rapid analysis in the field. While these microfluidic systems primarily operate at the microscale (that is, millionths of a metre), nanotechnology has contributed new concepts and will likely play an increasing role in the future. For example, separation of DNA is sensitive to entropic effects, such as the entropy required to unfold DNA of a given length. One proposed approach to separating DNA takes advantage of its passage through a nanoscale array of posts or channels, in which DNA molecules of different lengths uncoil at different rates.

Other researchers have focused on detecting signal changes as nanometre-wide DNA strands are threaded through a nanoscale pore. Early studies used pores punched in membranes by viruses; artificially fabricated nanopores are also being tested. By applying an electric potential across the membrane in a liquid cell to pull the DNA through, changes in ion current can be measured as different repeating base units of the molecule pass through the pores. Nanotechnology-enabled advances in the entire area of bioassays will clearly impact health care in many ways, from early detection, rapid clinical analysis, and home monitoring to new understanding of molecular biology and genetic-based treatments for fighting disease.
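Conceptually, the nanopore readout reduces to spotting intervals where the ion current drops below its open-pore value. The sketch below runs that idea on a synthetic current trace with made-up numbers; real nanopore signals are far noisier, and relating blockade depth to base identity is the genuinely hard part.

```python
# Toy sketch of the readout idea: monitor the ion current through a nanopore
# and flag the intervals where the current drops (a molecule is blocking the
# pore). The trace below is synthetic, with made-up current levels.
import random

random.seed(0)
baseline, blockade, noise = 100.0, 60.0, 2.0   # picoamps, made-up values

# Build a synthetic trace: open pore, then a blockade, then open pore again.
trace = ([random.gauss(baseline, noise) for _ in range(200)] +
         [random.gauss(blockade, noise) for _ in range(80)] +
         [random.gauss(baseline, noise) for _ in range(200)])

threshold = (baseline + blockade) / 2.0
events, start = [], None
for i, current in enumerate(trace):
    if current < threshold and start is None:
        start = i                       # current dropped: a molecule entered
    elif current >= threshold and start is not None:
        events.append((start, i))       # current recovered: molecule left
        start = None

for s, e in events:
    print(f"blockade from sample {s} to {e}, duration {e - s} samples")
```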

Another biomedical application of nanotechnology involves assistive devices for people who have lost or lack certain natural capabilities. For example, researchers hope to design retinal implants for vision-impaired individuals. The concept is to implant chips with photodetector arrays to transmit signals from the retina to the brain via the optic nerve. Meaningful spatial information, even if only at a rudimentary level, would be of great assistance to the blind. Such research illustrates the tremendous challenge of designing hybrid systems that work at the interface between inorganic devices and biological systems.

Closely related research involves implanting nanoscale neural probes in brain tissue to activate and control motor functions. This requires effective and stable wiring of many electrodes to neurons. It is exciting because of the possibility of recovery of control for motor-impaired individuals. Studies employing neural stimulation of damaged spinal cords by electrical signals have demonstrated the return of some locomotion. Researchers are also seeking ways to assist in the regeneration and healing of bone, skin, and cartilage, for example by developing synthetic biocompatible or biodegradable structures with nanosized voids that would serve as templates for regenerating specific tissue while delivering chemicals to assist in the repair process. At a more sophisticated level, researchers hope to someday build nanoscale or microscale machines that can repair, assist, or replace more-complex organs.

Semiconductor experts agree that the ongoing shrinkage of conventional electronic devices will inevitably reach fundamental limits due to quantum effects such as tunneling, in which electrons jump out of their prescribed circuit paths and create atomic-scale interference between devices. At that point, radically new approaches to data storage and information processing, such as systems based on quantum computing or biomolecular computing, will be required for further advances.

The use of molecules for electronic devices was suggested by Mark Ratner of Northwestern University and Avi Aviram of IBM as early as the 1970s, but proper nanotechnology tools did not become available until the turn of the 21st century. Wiring up molecules some half a nanometre wide and a few nanometres long remains a major challenge, and an understanding of electrical transport through single molecules is only beginning to emerge. A number of groups have been able to demonstrate molecular switches, for example, that could conceivably be used in computer memory or logic arrays. Current areas of research include mechanisms to guide the selection of molecules, architectures for assembling molecules into nanoscale gates, and three-terminal molecules for transistor-like behaviour. More-radical approaches include DNA computing, where single-stranded DNA on a silicon chip would encode all possible variable values and complementary strand interactions would be used for a parallel processing approach to finding solutions. An area related to molecular electronics is that of organic thin-film transistors and light emitters, which promise new applications such as video displays that can be rolled out like wallpaper and flexible electronic newspapers.

Carbon nanotubes have remarkable electronic, mechanical, and chemical properties. Depending on their specific diameter and the bonding arrangement of their carbon atoms, nanotubes exhibit either metallic or semiconducting behaviour. Electrical conduction within a perfect nanotube is ballistic (negligible scattering), with low thermal dissipation. As a result, a wire made from a nanotube, or a nanowire, can carry much more current than an ordinary metal wire of comparable size. At 1.4 nanometres in diameter, nanotubes are about a hundred times smaller than the gate width of silicon semiconductor devices. In addition to nanowires for conduction, transistors, diodes, and simple logic circuits have been demonstrated by combining metallic and semiconductor carbon nanotubes. Similarly, silicon nanowires have been used to build experimental devices, such as field-effect transistors, bipolar transistors, inverters, light-emitting diodes, sensors, and even simple memory. A major challenge for nanowire circuits, as for molecular electronics, is connecting and integrating these devices into a workable high-density architecture. Ideally, the structure would be grown and assembled in place. Crossbar architectures that combine the function of wires and devices are of particular interest.
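Two of the figures behind these claims can be checked with a short calculation: the ideal low-bias conductance of a ballistic single-walled tube (two spin-degenerate channels, 4e²/h) and the current density implied by the roughly 25-microampere saturation current per tube commonly quoted in the experimental literature, which is assumed here.

```python
# Rough numbers behind the claims above, for illustration. A ballistic
# single-walled nanotube has two spin-degenerate conduction channels, so its
# ideal low-bias conductance is 4e^2/h; the ~25 microamp saturation current
# per tube is a commonly quoted experimental figure, assumed here.
import math

e = 1.602176634e-19        # C
h = 6.62607015e-34         # J*s

G_ideal = 4 * e**2 / h     # siemens
R_ideal = 1 / G_ideal      # ohms
print(f"ideal conductance ~ {G_ideal*1e6:.0f} microsiemens "
      f"(resistance ~ {R_ideal/1e3:.1f} kilohms)")

diameter_m = 1.4e-9        # typical single-walled nanotube, as in the text
area_cm2   = math.pi * (diameter_m / 2)**2 * 1e4
I_sat      = 25e-6         # A, assumed saturation current per tube
print(f"current density ~ {I_sat/area_cm2:.1e} A/cm^2 "
      "(copper wiring fails by electromigration around 1e6-1e7 A/cm^2)")
```

The resulting current density, on the order of a billion amperes per square centimetre, is what underlies the statement that a nanotube wire can carry far more current than a metal wire of comparable size.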

At nanoscale dimensions the energy required to add one additional electron to a small island (isolated physical region), for example through a tunneling barrier, becomes significant. This change in energy provides the basis for devising single-electron transistors. At low temperatures, where thermal fluctuations are small, various single-electron-device nanostructures are readily achievable, and extensive research has been carried out for structures with confined electron flow. However, room-temperature applications will require that sizes be reduced significantly, to the one-nanometre range, to achieve stable operation. For large-scale application with millions of devices, as found in current integrated circuits, the need for structures with very uniform size to maintain uniform device characteristics presents a significant challenge. Also, in this and many new nanodevices being explored, the lack of gain is a serious drawback limiting implementation in large-scale electronic circuits.
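The size requirement can be estimated from the charging energy e²/2C, which must far exceed the thermal energy kT for stable operation. Modelling the island crudely as an isolated sphere, an assumption made purely for illustration, gives the following rough numbers:

```python
# Worked estimate of the charging-energy argument above: the energy to add one
# electron to an isolated island is E_C = e^2 / (2C). Modelling the island as
# a small sphere in vacuum (C = 4*pi*eps0*r) is a crude but standard
# approximation, used here only for illustration.
import math

e    = 1.602176634e-19   # C
eps0 = 8.8541878128e-12  # F/m
k_B  = 1.380649e-23      # J/K

def charging_energy_eV(radius_nm):
    C = 4 * math.pi * eps0 * (radius_nm * 1e-9)   # self-capacitance of a sphere
    return e**2 / (2 * C) / e                     # in electron-volts

kT_room = k_B * 300 / e                           # ~0.026 eV at 300 K
for r in (50, 10, 1):                             # island radius in nanometres
    E_C = charging_energy_eV(r)
    print(f"r = {r:3d} nm: E_C ~ {E_C*1000:6.1f} meV "
          f"({E_C/kT_room:5.1f} x kT at room temperature)")
```

Only the nanometre-scale island gives a charging energy many times the room-temperature thermal energy, which is why room-temperature single-electron devices demand sizes in the one-nanometre range.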

Spintronics refers to electronic devices that perform logic operations based on not just the electrical charge of carriers but also their spin. For example, information could be transported or stored through the spin-up or spin-down states of electrons. This is a new area of research, and issues include the injection of spin-polarized carriers, their transport, and their detection. The role of nanoscale structure and electronic properties of the ferromagnetic-semiconductor interface on the spin injection process, the growth of new ferromagnetic semiconductors with nanoscale control, and the possible use of nanostructured features to manipulate spin are all of interest.

Current approaches to information storage and retrieval include high-density, high-speed, solid-state electronic memories, as well as slower (but generally more spacious) magnetic and optical discs (see computer memory). As the minimum feature size for electronic processing approaches 100 nanometres, nanotechnology provides ways to decrease further the bit size of the stored information, thus increasing density and reducing interconnection distances for obtaining still-higher speeds. For example, the read heads of the current generation of magnetic hard disks are based on the giant magnetoresistance effect. The write head stores bits by setting the magnetization direction of small regions of the disk's recording layer; the read sensor is a stack of nanometre-thick metallic layers that alternate between ferromagnetic and nonmagnetic. Differences in spin-dependent scattering of electrons at the layer interfaces lead to resistance differences that the head reads back as bits. Mechanical properties, particularly tribology (friction and wear of moving surfaces), also play an important role in magnetic hard disk drives, since magnetic heads float only about 10 nanometres above spinning magnetic disks.
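A minimal way to see where the giant-magnetoresistance signal comes from is the two-current model, in which spin-up and spin-down electrons form independent parallel channels that scatter weakly in layers magnetized along their spin and strongly in layers magnetized against it. The sketch below uses arbitrary channel resistances chosen only to show the effect.

```python
# Minimal two-current (Mott) model of the giant-magnetoresistance effect
# described above. Spin-up and spin-down electrons are treated as independent
# parallel channels; an electron is scattered weakly (resistance r) in a layer
# magnetized along its spin and strongly (resistance R) in a layer magnetized
# against it. The numbers are arbitrary, chosen only to show the ratio.
def parallel(a, b):
    return a * b / (a + b)

r, R = 1.0, 4.0   # per-layer channel resistances, arbitrary units

# Magnetizations of the two layers aligned: one spin channel is "fast" in
# both layers, the other is "slow" in both.
R_P = parallel(r + r, R + R)

# Magnetizations anti-aligned: every electron is slowed in one of the layers.
R_AP = parallel(r + R, R + r)

print(f"R_parallel = {R_P:.2f}, R_antiparallel = {R_AP:.2f}")
print(f"GMR ratio = {(R_AP - R_P) / R_P:.1%}")
```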

Another approach to information storage that depends on designing nanometre-thick magnetic layers is under commercial development. Known as magnetic random access memory (MRAM), it separates a line of electrically switchable magnetic material from a permanently magnetized layer by a nanoscale nonmagnetic interlayer. A resistance change that depends on the relative alignment of the two magnetic layers is read electrically through a large array of crossed wires. MRAM requires a relatively small evolution from conventional semiconductor manufacturing and has the added benefit of being nonvolatile (no power or batteries are needed to maintain stored memory states).

Still at an exploratory stage, studies of electrical conduction through molecules have generated interest in their possible use as memory. While still very speculative, molecular and nanowire approaches to memory are intriguing because of the small volume in which the bits of memory are stored and the effectiveness with which biological systems store large amounts of information.

Nanoscale structuring of optical devices, such as vertical-cavity surface-emitting lasers (VCSELs), quantum dot lasers, and photonic crystal materials, is leading to additional advances in communications technology.

VCSELs have nanoscale layers of compound semiconductors epitaxially grown into their structure: alternating dielectric layers serving as mirrors, and quantum wells. Quantum wells allow the charge carriers to be confined in well-defined regions and provide the energy conversion into light at desired wavelengths. They are placed in the laser's cavity to confine carriers at the nodes of a standing wave and to tailor the band structure for more efficient radiative recombination. One-dimensional nanotechnology techniques involving precise growth of very thin epitaxial semiconductor layers were developed during the 1990s. Such nanostructuring has enhanced the efficiency of VCSELs and reduced the current required for lasing to start (called the threshold current). Because of their improving performance and compatibility with planar manufacturing technology, VCSELs are fast becoming a preferred laser source in a variety of communications applications.
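To get a feel for how thin these layers are, each mirror layer in a distributed Bragg reflector is a quarter of the lasing wavelength thick inside the material, t = λ/4n. The estimate below assumes a common 850-nanometre VCSEL and approximate refractive indices for GaAs and AlAs.

```python
# Illustration of how thin the "nanoscale layers" in a VCSEL mirror are: each
# layer of a distributed Bragg reflector is a quarter of the lasing wavelength
# thick *inside the material*, t = wavelength / (4 * n). The refractive
# indices below are approximate values for GaAs and AlAs near 850 nm.
wavelength_nm = 850.0                 # common VCSEL wavelength (datacom links)
indices = {"GaAs": 3.5, "AlAs": 2.9}  # approximate refractive indices

for material, n in indices.items():
    t = wavelength_nm / (4 * n)
    print(f"{material}: quarter-wave mirror layer ~ {t:.0f} nm thick")

# A one-wavelength-thick cavity between the mirrors, in GaAs:
print(f"cavity length ~ {wavelength_nm / indices['GaAs']:.0f} nm")
```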

More recently, the introduction of quantum dots (regions so small that they can be given a single electric charge) into semiconductor lasers has been investigated and found to give additional benefits: both further reductions in threshold current and narrower line widths. Quantum dots further confine the optical emission modes within a very narrow spectrum and give the lowest threshold current densities for lasing achieved to date in VCSELs. The quantum dots are introduced into the laser during the growth of strained layers, by a process called Stranski-Krastanov growth. They arise because of the lattice-mismatch stress and surface tension of the growing film. Ways to control the resulting quantum dots more precisely, toward a single uniform size, are still being sought.

Photonic crystals provide a new means to control the steering and manipulation of photons based on periodic dielectric lattices with repeat dimensions on the order of the wavelength of light. These materials can have very exotic properties, such as forbidding the propagation of light within certain wavelength ranges, determined by the particular periodic structure. Photonic lattices can act as perfect wavelength-selective mirrors to reflect back incident light from all orientations. They provide the basis for optical switching, steering, and wavelength separation on unprecedentedly small scales. The periodic structures required for these artificial crystals can be configured as both two- and three-dimensional lattices. Optical sources, switches, and routers are being considered, with two-dimensional planar geometries receiving the most attention because of their greater ease of fabrication.
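An order-of-magnitude check on that statement: the relevant length scale is the wavelength of light inside the dielectric, λ/n, which for common materials and wavelengths works out to a few hundred nanometres. The indices used below are approximate.

```python
# Order-of-magnitude check on the "repeat dimensions on the order of the
# wavelength of light" statement: the relevant length scale is the wavelength
# inside the dielectric, wavelength / n. Refractive indices are approximate.
cases = [
    ("silicon, 1550 nm telecom light", 1550.0, 3.48),
    ("silica,  1550 nm telecom light", 1550.0, 1.44),
    ("GaAs,     850 nm light",          850.0, 3.5),
]
for name, wavelength_nm, n in cases:
    print(f"{name}: wavelength in the material ~ {wavelength_nm / n:.0f} nm")
```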

Another potentially important communications application of nanotechnology involves microelectromechanical systems (MEMS), devices sized at the micrometre level (millionths of a metre). MEMS are currently poised to have a major impact on communications via optical switching. In the future, electromechanical devices may shrink to nanodimensions to take advantage of the higher frequencies of mechanical vibration at smaller masses. The natural (resonant) frequency of vibration for small mechanical beams increases as their size decreases, so that little power is needed to drive them as oscillators. Their efficiency is rated by a quality factor, known as Q, which is proportional to the ratio of the energy stored to the energy dissipated in each cycle. The higher the Q, the more precise the absolute frequency of an oscillator. Q is very high for micro- and nanoscale mechanical oscillators, and these devices can reach very high frequencies (up to microwave frequencies), making them potential low-power replacements for electronic oscillators and filters.
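The scaling of frequency with size can be illustrated with the textbook expression for the fundamental flexural mode of a doubly clamped beam, f ≈ 1.03·(t/L²)·√(E/ρ). The beam dimensions below are assumed for the example, and the silicon properties are approximate.

```python
# Rough illustration of the size-frequency scaling described above, using the
# textbook formula for the fundamental flexural mode of a doubly clamped beam,
# f ~ 1.03 * (t / L^2) * sqrt(E / rho). Dimensions are assumed for the example;
# silicon properties are approximate (and orientation-dependent).
import math

E_si   = 169e9    # Pa, Young's modulus of silicon
rho_si = 2330.0   # kg/m^3

def beam_frequency_hz(length_m, thickness_m):
    return 1.03 * (thickness_m / length_m**2) * math.sqrt(E_si / rho_si)

for L_um, t_nm in [(10, 100), (1, 50), (0.5, 50)]:
    f = beam_frequency_hz(L_um * 1e-6, t_nm * 1e-9)
    print(f"L = {L_um:4} um, t = {t_nm:3} nm  ->  f ~ {f/1e6:8.1f} MHz")
```

Shrinking the beam from tens of micrometres to a fraction of a micrometre moves the resonance from megahertz to gigahertz frequencies, which is the trend toward microwave-frequency mechanical oscillators described above.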

Mechanical oscillators have been made from silicon at dimensions of 10 to 100 nanometres, where more than 10 percent of the atoms are less than one atomic distance from the surface. While highly homogeneous materials can be made at these dimensions (for example, single-crystal silicon bars), surfaces play an increasing role at the nanoscale, and energy losses increase, presumably because of surface defects and molecular species adsorbed on surfaces.

It is possible to envision even higher frequencies, in what might be viewed as the ultimate in nanomechanical systems, by moving from nanomachined structures to molecular systems. As an example, multiwalled carbon nanotubes are being explored for their mechanical properties. When the ends of the outer nanotube are removed, the inner tube may be pulled partway out from the outer tube where van der Waals forces between the two tubes will supply a restoring force. The inner tube can thus oscillate, sliding back and forth inside the outer tube. The resonant frequency of oscillation for such structures is predicted to be above one gigahertz (one billion cycles per second). It is unknown whether connecting such systems to the macro world and protecting them from surface effects will ever be practical.

Originally posted here:

Nanotechnology | Britannica.com

UC San Diego NanoEngineering Department

The NanoEngineering program has received accreditation by the Accreditation Commission of ABET, the global accreditor of college and university programs in applied and natural science, computing, engineering and engineering technology. UC San Diego’s NanoEngineering program is the first of its kind in the nation to receive this accreditation. Our NanoEngineering students can feel confident that their education meets global standards and that they will be prepared to enter the workforce worldwide.

ABET accreditation assures that programs meet standards to produce graduates ready to enter critical technical fields that are leading the way in innovation and emerging technologies, and anticipating the welfare and safety needs of the public. Please visit the ABET website for more information on why accreditation matters.

Congratulations to the NanoEngineering department and students!

View original post here:

UC San Diego NanoEngineering Department

Nanoengineering – Wikipedia

Nanoengineering is the practice of engineering on the nanoscale. It derives its name from the nanometre, a unit of measurement equalling one billionth of a meter.

Nanoengineering is largely a synonym for nanotechnology, but emphasizes the engineering rather than the pure science aspects of the field.

The first nanoengineering program was started at the University of Toronto within the Engineering Science program as one of the options of study in the final years. In 2003, the Lund Institute of Technology started a program in nanoengineering. In 2004, the College of Nanoscale Science and Engineering at SUNY Polytechnic Institute was established on the campus of the University at Albany. In 2005, the University of Waterloo established a unique program offering a full degree in Nanotechnology Engineering,[1] and Louisiana Tech University started the first such program in the U.S. In 2006, the University of Duisburg-Essen started Bachelor and Master programs in NanoEngineering.[2] In contrast to these earlier programs, the first NanoEngineering department in the world, offering both undergraduate and graduate degrees, was established by the University of California, San Diego in 2007. In 2009, the University of Toronto began offering all options of study in Engineering Science as degrees, bringing the second nanoengineering degree to Canada. In 2016, Rice University established a Department of Materials Science and NanoEngineering (MSNE). DTU Nanotech, the Department of Micro- and Nanotechnology, is a department at the Technical University of Denmark established in 1990.

In 2013, Wayne State University began offering a Nanoengineering Undergraduate Certificate Program, which is funded by a Nanoengineering Undergraduate Education (NUE) grant from the National Science Foundation. The primary goal is to offer specialized undergraduate training in nanotechnology. Other goals are: 1) to teach emerging technologies at the undergraduate level, 2) to train a new adaptive workforce, and 3) to retrain working engineers and professionals.[3]

More:

Nanoengineering – Wikipedia

NETS – What are Nanoengineering and Nanotechnology?

A nanometer is one billionth of a meter, or three to five atoms in width. It would take approximately 40,000 nanometers lined up in a row to equal the width of a human hair. NanoEngineering concerns itself with manipulating processes that occur on the scale of 1-100 nanometers.

The general term, nanotechnology, is sometimes used to refer to common products that have improved properties due to being fortified with nanoscale materials. One example is nano-improved tooth-colored enamel, as used by dentists for fillings. The general use of the term nanotechnology then differs from the more specific sciences that fall under its heading.

NanoEngineering is an interdisciplinary science that builds biochemical structures smaller than a bacterium, which function like microscopic factories. This is possible by utilizing basic biochemical processes at the atomic or molecular level. In simple terms, molecules interact through natural processes, and NanoEngineering takes advantage of those processes by direct manipulation.

SOURCE: http://www.wisegeek.com/what-is-nanoengineering.htm

Read the original:

NETS – What are Nanoengineering and Nanotechnology?

Undergraduate Degree Programs | NanoEngineering

The Department of NanoEngineering offers undergraduate programs leading to the B.S. degrees in NanoEngineering and Chemical Engineering. The Chemical Engineering and NanoEngineering undergraduate programs are accredited by the Engineering Accreditation Commission of ABET. The undergraduate degree programs focus on integrating the various sciences and engineering disciplines necessary for successful careers in the evolving nanotechnology industry. These two degree programs have very different requirements and are described in separate sections.

B.S. NanoEngineering

The NanoEngineering undergraduate program became effective in Fall 2010. This major focuses on nanoscale science, engineering, and technology that have the potential to make valuable advances in different areas that include, to name a few, new materials, biology and medicine, energy conversion, sensors, and environmental remediation. The program includes affiliated faculty from the Department of NanoEngineering, the Department of Mechanical and Aerospace Engineering, the Department of Chemistry and Biochemistry, and the Department of Bioengineering. The NanoEngineering undergraduate program is tailored to provide breadth and flexibility by taking advantage of the strength of basic sciences and other engineering disciplines at UC San Diego. The intention is to graduate nanoengineers who are multidisciplinary and can work in a broad spectrum of industries.

B.S. Chemical Engineering

The Chemical Engineering undergraduate program is housed within the NanoEngineering Department. The program is made up of faculty from the Department of Mechanical and Aerospace Engineering, Department of Chemistry and Biochemistry, the Department of Bioengineering and the Department of NanoEngineering. The curricula at both the undergraduate and graduate levels are designed to support and foster chemical engineering as a profession that interfaces engineering and all aspects of basic sciences (physics, chemistry, and biology). As of Fall 2008, the Department of NanoEngineering has taken over the administration of the B.S. degree in Chemical Engineering.

Academic Advising

Upon admission to the major, students should consult the catalog or the NanoEngineering website for their program of study, and their undergraduate or graduate advisor if they have questions. Because course and/or curricular changes may be made every year, it is imperative that students consult with the department's student affairs advisors on an annual basis.

Students can meet with the academic advisors during walk-in hours, schedule an appointment, or send messages through the Virtual Advising Center (VAC).

Program Alterations/Exceptions to Requirements

Variations from or exceptions to any program or course requirements are possible only if the Undergraduate Affairs Committee approves a petition before the courses in question are taken.

Independent Study

Students may take NANO 199 or CENG 199, Independent Study for Undergraduates, under the guidance of a NANO or CENG faculty member. This course is taken as an elective on a P/NP basis. Under very restrictive conditions, however, it may be used to satisfy upper-division Technical Elective or Nanoengineering Elective course requirements for the major. Students interested in this alternative must have completed at least 90 units and earned a UCSD cumulative GPA of 3.0 or better. Eligible students must identify a faculty member with whom they wish to work and propose a two-quarter research or study topic. Please visit the Student Affairs office for more information.

Read the original post:

Undergraduate Degree Programs | NanoEngineering

About the NANO-ENGINEERING FLAGSHIP

Turning the NaI concept into reality necessitates an extraordinary and long-term effort. This requires the integration of nanoelectronics, nanophotonics, nanophononics, nanospintronics, topological effects, and the physics and chemistry of materials. It also requires operating across an extremely broad range of science and technology, spanning the microwave, millimetre-wave, terahertz, infrared, and optical regimes, and will exploit various excitations, such as surface waves, spin waves, phonons, electrons, photons, plasmons, and their hybrids, for sensing, information processing, and storage.

This high level of integration, which goes beyond individual functionalities, components, and devices and requires cooperation across a range of disciplines, makes the Nano Engineering Flagship unique in its approach. It will be crucial in tackling the six strategic challenges the initiative has identified.

Original post:

About the NANO-ENGINEERING FLAGSHIP

The NANO-ENGINEERING FLAGSHIP initiative

Nano-Engineering introduces a novel key-enabling, non-invasive broadband technology, the Nano-engineered Interface (NaI), realising omni-connectivity and putting humans and their interactions at the center of the future digital society. Omni-connectivity encompasses real-time communication, sensing, monitoring, and data processing among humans, objects, and their environment. The vision of omni-connectivity envelops people in a new sphere of extremely simplified, intuitive, and natural communication. The Nano-engineered Interface (NaI), a non-invasive, wireless, ultra-flat functional system, will make this possible. NaI will be applicable to any surface on any physical item and thereby exponentially diversify and increase connections among humans, wearables, vehicles, and everyday objects. NaI will communicate with other NaI networks, from local links up to satellites, using the whole frequency spectrum from microwaves to optics.

Visit link:

The NANO-ENGINEERING FLAGSHIP initiative

An AI Conference Refusing a Name Change Highlights a Tech Industry Problem

Name Game

There’s a prominent artificial intelligence conference that goes by the suggestive acronym NIPS, which stands for “Neural Information Processing Systems.”

After receiving complaints that the acronym was alienating to women, the conference’s leadership collected suggestions for a new name via an online poll, according to WIRED. But the conference announced Monday that it would be sticking with NIPS all the same.

Knock It Off

It’s convenient to imagine that this acronym just sort of emerged by coincidence, but let’s not indulge in that particular fantasy.

It’s more likely that tech geeks cackled maniacally when they came up with the acronym, and the refusal to do better even when people looking up the conference in good faith are bombarded with porn is a particularly telling failure of the AI research community.

Small Things Matter

This problem goes far beyond a silly name — women are severely underrepresented in technology research and even more so when it comes to artificial intelligence. And if human decency — comforting those who are regularly alienated by the powers that be — isn’t enough of a reason to challenge the sexist culture embedded in tech research, just think about what we miss out on.

True progress in artificial intelligence cannot happen without a broad range of diverse voices — voices that are silenced by “locker room talk” among an old boy’s club. Otherwise, our technological development will become just as stuck in place as our cultural development often seems to be.

READ MORE: AI RESEARCHERS FIGHT OVER FOUR LETTERS: NIPS [WIRED]

More on Silicon Valley sexism: The Tech Industry’s Gender Problem Isn’t Just Hurting Women

See the original post here:

An AI Conference Refusing a Name Change Highlights a Tech Industry Problem

Scientists Are Hopeful AI Could Help Predict Earthquakes

Quake Rate

Earlier this year, I interviewed U.S. Geological Survey geologist Annemarie Baltay for a story about why it’s incredibly difficult to predict earthquakes.

“We don’t use that ‘p word’ — ‘predict’ — at all,” she told me. “Earthquakes are chaotic. We don’t know when or where they’ll occur.”

Neural Earthwork

That could finally be starting to change, according to a fascinating feature in The New York Times.

By feeding seismic data into a neural network — a type of artificial intelligence that learns to recognize patterns by scrutinizing examples — researchers say they can now predict moments after a quake strikes how far its aftershocks will travel.
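For readers curious about what "learning to recognize patterns by scrutinizing examples" looks like in code, the toy sketch below trains a small neural network on entirely synthetic, made-up features and labels. It is not the researchers' model, data, or result, only an illustration of the supervised-learning setup.

```python
# Toy sketch of the pattern-recognition setup described above, NOT the
# researchers' actual model or data: label grid cells around a synthetic
# "mainshock" as aftershock / no-aftershock and train a small neural network
# on made-up stress-like features. Everything here is illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 2000
features = rng.normal(size=(n, 3))            # stand-ins for stress-change metrics
# Synthetic ground truth: aftershocks more likely where feature 0 is large.
labels = (features[:, 0] + 0.3 * rng.normal(size=n) > 0.5).astype(int)

train, test = slice(0, 1500), slice(1500, None)
model = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=1000, random_state=0)
model.fit(features[train], labels[train])
print(f"held-out accuracy on the toy data: {model.score(features[test], labels[test]):.2f}")
```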

And eventually, some believe, they’ll be able to listen to signals from fault lines and predict when an earthquake will strike in the first place.

Future Vision

But like Baltay, some researchers aren’t convinced we’ll ever be able to predict earthquakes. University of Tokyo seismologist Robert Geller told the Times that until an algorithm actually predicts an upcoming quake, he’ll remain skeptical.

“There are no shortcuts,” he said. “If you cannot predict the future, then your hypothesis is wrong.”

READ MORE: A.I. Is Helping Scientist Predict When and Where the Next Big Earthquake Will Be [The New York Times]

More on earthquake AI: A New AI Detected 17 Times More Earthquakes Than Traditional Methods

Follow this link:

Scientists Are Hopeful AI Could Help Predict Earthquakes

A Stem Cell Transplant Let a Wheelchair-Bound Man Dance Again

Stand Up Guy

For 10 years, Roy Palmer had no feeling in his lower extremities. Two days after receiving a stem cell transplant, he cried tears of joy because he could feel a cramp in his leg.

The technical term for the procedure the British man underwent is hematopoietic stem cell transplantation (HSCT). And while risky, it’s offering new hope to people like Palmer, who found himself wheelchair-bound after multiple sclerosis (MS) caused his immune system to attack his nerves’ protective coverings.

Biological Reboot

Ever hear the IT troubleshooting go-to of turning a system off and on again to fix it? The HSCT process is similar, but instead of a computer, doctors attempt to reboot a patient’s immune system.

To do this, they first remove stem cells from the patient’s body. Then the patient undergoes chemotherapy, which kills the rest of their immune system. After that, the doctors use the extracted stem cells to reboot the patient’s immune system.

It took just two days for the treatment to restore some of the feeling in Palmer’s legs. Eventually, he was able to walk on his own and even dance. He told the BBC in a recent interview that he now feels like he has a second chance at life.

“We went on holiday, not so long ago, to Turkey. I walked on the beach,” said Palmer. “Little things like that, people do not realize what it means to me.”

Risk / Reward

Still, HSCT isn’t some miracle cure for MS. Though it worked for Palmer, that’s not always the case, and HSCT can also cause infections and infertility. The National MS Society still considers HSCT to be an experimental treatment, and the Food and Drug Administration has yet to approve the therapy in the U.S.

However, MS affects more than 2.3 million people, and if a stem cell transplant can help even some of those folks the way it helped Palmer, it’s a therapy worth exploring.

READ MORE: Walking Again After Ten Years With MS [BBC]

More on HCST: New Breakthrough Treatment Could “Reverse Disability” for MS Patients

See the rest here:

A Stem Cell Transplant Let a Wheelchair-Bound Man Dance Again

AI Dreamed Up These Nightmare Fuel Halloween Masks

Nightmare Fuel

Someone programmed an AI to dream up Halloween masks, and the results are absolute nightmare fuel. Seriously, just look at some of these things.

“What’s so scary or unsettling about it is that it’s not so detailed that it shows you everything,” said Matt Reed, the creator of the masks, in an interview with New Scientist. “It leaves just enough open for your imagination to connect the dots.”

A selection of masks featured on Reed’s Twitter. Credit: Matt Reed/Twitter

Creative Horror

To create the masks, Reed — whose day job is as a technologist at a creative agency called redpepper — fed an open source AI tool 5,000 pictures of Halloween masks he sourced from Google Images. He then instructed the tool to generate its own masks.

The fun and spooky project is yet another sign that AI is coming into its own as a creative tool. Just yesterday, a portrait generated by a similar system fetched more than $400,000 at a prominent British auction house.

And Reed’s masks are evocative. Here at the Byte, if we looked through the peephole and saw one of these on a trick or treater, we might not open our door.

READ MORE: AI Designed These Halloween Masks and They Are Absolutely Terrifying [New Scientist]

More on AI-generated art: Generated Art Will Go on Sale Alongside Human-Made Works This Fall

Read the original here:

AI Dreamed Up These Nightmare Fuel Halloween Masks

Robot Security Guards Will Constantly Nag Spectators at the Tokyo Olympics

Over and Over

“The security robot is patrolling. Ding-ding. Ding-ding. The security robot is patrolling. Ding-ding. Ding-ding.”

That’s what Olympic attendees will hear ad nauseam when they step onto the platforms of Tokyo’s train stations in 2020. The source: Perseusbot, a robot security guard Japanese developers unveiled to the press on Thursday.

Observe and Report

According to reporting by Kyodo News, the purpose of the AI-powered Perseusbot is to lower the burden on the stations’ staff when visitors flood Tokyo during the 2020 Olympics.

The robot is roughly 5.5 feet tall and equipped with security cameras that allow it to note suspicious behaviors, such as signs of violence breaking out or unattended packages, as it autonomously patrols the area. It can then alert security staff to the issues by sending notifications directly to their smartphones.

Prior Preparation

Just like the athletes who will head to Tokyo in 2020, Perseusbot already has a training program in the works — it’ll patrol Tokyo’s Seibu Shinjuku Station from November 26 to 30. This dry run should give the bot’s developers a chance to work out any kinks before 2020.

If all goes as hoped, the bot will be ready to annoy attendees with its incessant chant before the Olympic torch is lit. And, you know, keep everyone safe, too.

READ MORE: Robot Station Security Guard Unveiled Ahead of 2020 Tokyo Olympics [Kyodo News]

More robot security guards: Robot Security Guards Are Just the Beginning

Read the original post:

Robot Security Guards Will Constantly Nag Spectators at the Tokyo Olympics

People Would Rather a Self-Driving Car Kill a Criminal Than a Dog

Snap Decisions

On first glance, a site that collects people’s opinions about whose life an autonomous car should favor doesn’t tell us anything we didn’t already know. But look closer, and you’ll catch a glimpse of humanity’s dark side.

The Moral Machine is an online survey designed by MIT researchers to gauge how the public would want an autonomous car to behave in a scenario in which someone has to die. It asks questions like: “If an autonomous car has to choose between killing a man or a woman, who should it kill? What if the woman is elderly but the man is young?”

Essentially, it’s a 21st century update on the Trolley Problem, an ethical thought experiment no doubt permanently etched into the mind of anyone who’s seen the second season of “The Good Place.”

Ethical Dilemma

The MIT team launched the Moral Machine in 2016, and more than two million people from 233 countries participated in the survey — quite a significant sample size.

On Wednesday, the researchers published the results of the experiment in the journal Nature, and they really aren’t all that surprising: Respondents value the life of a baby over all others, with a female child, male child, and pregnant woman following closely behind. Yawn.

It’s when you look at the other end of the spectrum — the characters survey respondents were least likely to “save” — that you’ll see something startling: Survey respondents would rather the autonomous car kill a human criminal than a dog.

Moral Machine survey results. Image credit: MIT

Ugly Reflection

While the team designed the survey to help shape the future of autonomous vehicles, it’s hard not to focus on this troubling valuing of a dog’s life over that of any human, criminal or not. Does this tell us something important about how society views the criminal class? Reveal that we’re all monsters when hidden behind the internet’s cloak of anonymity? Confirm that we really like dogs?

The MIT team doesn’t address any of these questions in their paper, and really, we wouldn’t expect them to — it’s their job to report the survey results, not extrapolate some deeper meaning from them. But whether the Moral Machine informs the future of autonomous vehicles or not, it’s certainly held up a mirror to humanity’s values, and we do not like the reflection we see.

READ MORE: Driverless Cars Should Spare Young People Over Old in Unavoidable Accidents, Massive Survey Finds [Motherboard]

More on the Moral Machine: MIT’s “Moral Machine” Lets You Decide Who Lives & Dies in Self-Driving Car Crashes

Read the original here:

People Would Rather a Self-Driving Car Kill a Criminal Than a Dog

