50th anniversary of Oregon Beach Bill – Coos Bay World

OREGON COAST: Oregon beachgoers have unfettered access to the state's 362 miles of coastline thanks to a landmark piece of legislation signed in 1967.

Fifty years ago today, Gov. Tom McCall signed the Oregon Beach Bill.

The bill established public ownership along Oregon's coastline, from the water up to 16 feet above the low-tide mark.

The anniversary of the legislation has caused some to reflect on the importance of the area's beaches.

Fawn Custer, volunteer coordinator for the Oregon Shores Conservation Coalition's CoastWatch program, said public access to the beach is a situation that's essentially unique to Oregon. As such, she said, it's important to be diligent about protecting Oregon's access to its beaches.

"We want our beaches to be for the public, for us and not for some private entity that just happens to have a little bit more money," Custer said. "It's up to us to say no, you know what, we like having our public beaches."

She said the purpose of the CoastWatch program is to have people keep an eye on the beach and document changes.

"The plus of all of that is the fact that as conservation groups we're able to be on the forefront of maintaining that beach bill," Custer said.

The volunteer coordinator grew up on the East Coast where many beaches have privately-owned sections.

"You're on the beach and all of a sudden you come onto a fence," Custer said. "You're not able to walk the whole beach."

Marty Giles with Wavecrest Discoveries had a similar story.

She remembered visiting North Carolina and having to drive for miles along the road to find a spot that had public access.

"I could smell the ocean, hear it, but couldn't access it because it was privately owned," Giles said.

In Oregon, she said, residents have the ability to drive toward the ocean and access whatever beach they end up at. Giles said it's not that way everywhere.

"You talk to people from other areas and they don't have the same sense of you just go west and find a beach," she said.

Giles said you can judge the importance of public access to the beach in two different realms.

The first is the business realm.

"We live in a place where those natural resources are so readily available and they're a key part of tourism," Giles said. "All our businesses in the region benefit from the tourism industry."

The second realm is personal benefit.

She said people visit the Oregon Coast to come to the beach, but there's also a direct benefit that all Oregonians enjoy.

"Access to natural resources is a benefit to our quality of life," Giles said.

She said that's one of the things that draws and keeps people in the area.

It's certainly that way for Giles.

"My best appreciation is that boundary between land and sea and to witness the inherent processes that are within that boundary," she said.

Giles likes to visit Sunset Bay State Park to enjoy some of the beauty the coast has to offer. For her, the Beach Bill anniversary is a reminder to appreciate that beauty.

"The point of the fifty years is giving us an opportunity to remind ourselves of really remarkable resources that we need to appreciate fully, use responsibly and need to defend," Giles said.


Thousands of Pounds of Trash At Tahoe Beaches After 4th – KTVN

Crowds flocked to Lake Tahoe to celebrate the Fourth of July Tuesday, but unfortunately, they left the evidence behind.

The trash-covered beaches have become a yearly occurrence. So much so that it's become a tradition for some groups to celebrate on the 4th and then assemble volunteers to clean up after everyone else on the 5th.

"When we walked out here this morning, there were broken floaties, lawn chairs left over, corn on the cobs just left, beer cans, everything," Keep Tahoe Blue Natural Resources Associate Savannah Rudroff said during this year's clean-up.

Volunteers found pizza boxes, wine bottles, old inner tubes, and lots of cigarette butts on Lake Tahoe's beaches Wednesday morning. They're just part of the more than 300 good Samaritans who dedicate their fifth of July every year to keeping Tahoe clean.

This year, 321 volunteers picked up more than 1,678 pounds of trash on 5.64 miles of beach. While the mess is frustrating, the mood was pretty positive. They said they're just grateful that this many people want to help.

"It's totally amazing that we have so many people participating in this clean-up, to get all of that off our beaches, and we can continue to enjoy them as they are," Rudroff said. "It's super important that we get all of the trash off of our beaches so that we can maintain the scenic quality of Lake Tahoe, but also the water quality of Lake Tahoe, and we can protect our wildlife that lives here."

The clean-ups are organized by Keep Tahoe Blue. If you couldn't join in the clean-up but still want to help, the group accepts donations on its website.


Nonprofit wants the disabled, elderly closer to the water at NJ beaches – New Jersey 101.5 FM Radio

Seaside Heights beach access ramp (Photo provided by Beach Days for All)

All towns along New Jersey's coast are required to offer some type of handicapped access to their public beaches. But getting anywhere near the water is a different story.

A newly created nonprofit out of Beachwood aims to increase access for those with disabilities, one town at a time.

"They have the access ramp, which ends right below the dunes," said Jessica Krill, founder of Beach Days for All, whose work at a children's hospital has introduced her to several families with special-needs children.

"And then once you hit the beach, it then becomes nonfunctional for a person with a disability," she said, noting regular wheelchairs can't trudge through the sand.

Beach wheelchairs can do the trick, Krill said, but most do not recline for those who cannot sit upright. They're also a large expense.

Starting with Seaside Park, where she was born and raised, Krill hopes to get officials on board with installing a mat that reaches approximately 40 feet from the water, where the disabled and elderly can feel the cool ocean breeze and feel like part of the crowd.

Krill said it's important to note this is not a fight for her cause. She hopes to partner with municipalities, and if they're not amenable to the idea, she'll move on to the next town.

A number of shore towns have mats installed for handicapped beach access from the road. Belmar entered the 2017 summer season with mats at several beach entrances.


star | astronomy | Britannica.com

Star, any massive self-luminous celestial body of gas that shines by radiation derived from its internal energy sources. Of the tens of billions of trillions of stars composing the observable universe, only a very small percentage are visible to the naked eye. Many stars occur in pairs, multiple systems, and star clusters. The members of such stellar groups are physically related through common origin and are bound by mutual gravitational attraction. Somewhat related to star clusters are stellar associations, which consist of loose groups of physically similar stars that have insufficient mass as a group to remain together as an organization.

This article describes the properties and evolution of individual stars. Included in the discussion are the sizes, energetics, temperatures, masses, and chemical compositions of stars, as well as their distances and motions. The myriad other stars are compared to the Sun, strongly implying that our star is in no way special.

With regard to mass, size, and intrinsic brightness, the Sun is a typical star. Its approximate mass is 2 × 10³⁰ kg (about 330,000 Earth masses), its approximate radius 700,000 km (430,000 miles), and its approximate luminosity 4 × 10³³ ergs per second (or equivalently 4 × 10²³ kilowatts of power). Other stars often have their respective quantities measured in terms of those of the Sun.

The table lists data pertaining to the 20 brightest stars, or, more precisely, stellar systems, since some of them are double (binary stars) or even triple stars. Successive columns give the name of the star, its brightness expressed in visual magnitude and spectral type (see below Classification of spectral types), the distance from Earth in light-years (a light-year is the distance that light waves travel in one Earth year: 9.46 trillion km, or 5.88 trillion miles), and the visual luminosity in terms of that of the Sun. All the primary stars (designated as the A component in the table) are intrinsically as bright as or brighter than the Sun; some of the companion stars are fainter.

Many stars vary in the amount of light they radiate. Stars such as Altair, Alpha Centauri A and B, and Procyon A are called dwarf stars; their dimensions are roughly comparable to those of the Sun. Sirius A and Vega, though much brighter, also are dwarf stars; their higher temperatures yield a larger rate of emission per unit area. Aldebaran A, Arcturus, and Capella A are examples of giant stars, whose dimensions are much larger than those of the Sun. Observations with an interferometer (an instrument that measures the angle subtended by the diameter of a star at the observer's position), combined with parallax measurements (which yield a star's distance; see below Determining stellar distances), give sizes of 12 and 22 solar radii for Arcturus and Aldebaran A. Betelgeuse and Antares A are examples of supergiant stars. The latter has a radius some 300 times that of the Sun, whereas the variable star Betelgeuse oscillates between roughly 300 and 600 solar radii. Several of the stellar class of white dwarf stars, which have low luminosities and high densities, also are among the brightest stars. Sirius B is a prime example, having a radius one-thousandth that of the Sun, which is comparable to the size of Earth. Also among the brightest stars are Rigel A, a young supergiant in the constellation Orion, and Canopus, a bright beacon in the Southern Hemisphere often used for spacecraft navigation.


The Sun's activity is apparently not unique. It has been found that stars of many types are active and have stellar winds analogous to the solar wind. The importance and ubiquity of strong stellar winds became apparent only through advances in spaceborne ultraviolet and X-ray astronomy as well as in radio and infrared surface-based astronomy.

X-ray observations that were made during the early 1980s yielded some rather unexpected findings. They revealed that nearly all types of stars are surrounded by coronas having temperatures of one million kelvins (K) or more. Furthermore, all stars seemingly display active regions, including spots, flares, and prominences much like those of the Sun (see sunspot; solar flare; solar prominence). Some stars exhibit starspots so large that an entire face of the star is relatively dark, while others display flare activity thousands of times more intense than that on the Sun.

The highly luminous hot, blue stars have by far the strongest stellar winds. Observations of their ultraviolet spectra with telescopes on sounding rockets and spacecraft have shown that their wind speeds often reach 3,000 km (roughly 2,000 miles) per second, while losing mass at rates up to a billion times that of the solar wind. The corresponding mass-loss rates approach and sometimes exceed one hundred-thousandth of a solar mass per year, which means that one entire solar mass (perhaps a tenth of the total mass of the star) is carried away into space in a relatively short span of 100,000 years. Accordingly, the most luminous stars are thought to lose substantial fractions of their mass during their lifetimes, which are calculated to be only a few million years.

Ultraviolet observations have proved that to produce such great winds the pressure of hot gases in a corona, which drives the solar wind, is not enough. Instead, the winds of the hot stars must be driven directly by the pressure of the energetic ultraviolet radiation emitted by these stars. Aside from the simple realization that copious quantities of ultraviolet radiation flow from such hot stars, the details of the process are not well understood. Whatever is going on, it is surely complex, for the ultraviolet spectra of the stars tend to vary with time, implying that the wind is not steady. In an effort to understand better the variations in the rate of flow, theorists are investigating possible kinds of instabilities that might be peculiar to luminous hot stars.


Observations made with radio and infrared telescopes as well as with optical instruments prove that luminous cool stars also have winds whose total mass-flow rates are comparable to those of the luminous hot stars, though their velocities are much lower, about 30 km (20 miles) per second. Because luminous red stars are inherently cool objects (having a surface temperature of about 3,000 K, or half that of the Sun), they emit very little detectable ultraviolet or X-ray radiation; thus, the mechanism driving the winds must differ from that in luminous hot stars. Winds from luminous cool stars, unlike those from hot stars, are rich in dust grains and molecules. Since nearly all stars more massive than the Sun eventually evolve into such cool stars, their winds, pouring into space from vast numbers of stars, provide a major source of new gas and dust in interstellar space, thereby furnishing a vital link in the cycle of star formation and galactic evolution. As in the case of the hot stars, the specific mechanism that drives the winds of the cool stars is not understood; at this time, investigators can only surmise that gas turbulence, magnetic fields, or both in the atmospheres of these stars are somehow responsible.

Strong winds also are found to be associated with objects called protostars, which are huge gas balls that have not yet become full-fledged stars in which energy is provided by nuclear reactions (see below Star formation and evolution). Radio and infrared observations of deuterium (heavy hydrogen) and carbon monoxide (CO) molecules in the Orion Nebula have revealed clouds of gas expanding outward at velocities approaching 100 km (60 miles) per second. Furthermore, high-resolution, very-long-baseline interferometry observations have disclosed expanding knots of natural maser (coherent microwave) emission of water vapour near the star-forming regions in Orion, thus linking the strong winds to the protostars themselves. The specific causes of these winds remain unknown, but if they generally accompany star formation, astronomers will have to consider the implications for the early solar system. After all, the Sun was presumably once a protostar too.

Distances to stars were first determined by the technique of trigonometric parallax, a method still used for nearby stars. When the position of a nearby star is measured from two points on opposite sides of Earth's orbit (i.e., six months apart), a small angular (artificial) displacement is observed relative to a background of very remote (essentially fixed) stars. Using the radius of Earth's orbit as the baseline, the distance of the star can be found from the parallactic angle, p. If p = 1″ (one second of arc), the distance of the star is 206,265 times Earth's distance from the Sun, namely, 3.26 light-years. This unit of distance is termed the parsec, defined as the distance of an object whose parallax equals one arc second. Therefore, one parsec equals 3.26 light-years. Since parallax is inversely proportional to distance, a star at 10 parsecs would have a parallax of 0.1″. The nearest star to Earth, Proxima Centauri (a member of the triple system of Alpha Centauri), has a parallax of 0.7716″, meaning that its distance is 1/0.7716, or 1.296, parsecs, which equals 4.23 light-years. The parallax of Barnard's star, the next closest after the Alpha Centauri system, is 0.5483″, so that its distance is nearly 6 light-years. Errors of such parallaxes are now typically 0.001″. Thus, measurements of trigonometric parallaxes are useful for only the nearby stars within a few thousand light-years. In fact, of the approximately 100 billion stars in the Milky Way Galaxy (also simply called the Galaxy), only about 2.5 million are close enough to have their parallaxes measured with useful accuracy. For more distant stars, indirect methods are used; most of them depend on comparing the intrinsic brightness of a star (found, for example, from its spectrum or other observable property) with its apparent brightness.
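The parallax arithmetic above is simple enough to sketch in code. This Python snippet (the function name is illustrative, not from any library) reproduces the Proxima Centauri figures:

```python
# Trigonometric parallax: distance in parsecs is the reciprocal of the
# parallax angle in arc seconds; one parsec is about 3.26 light-years.

LY_PER_PARSEC = 3.26

def distance_from_parallax(p_arcsec):
    """Return (parsecs, light-years) for a given parallax in arc seconds."""
    parsecs = 1.0 / p_arcsec
    return parsecs, parsecs * LY_PER_PARSEC

# Proxima Centauri, parallax 0.7716 arc seconds:
pc, ly = distance_from_parallax(0.7716)
print(round(pc, 3), round(ly, 2))  # ~1.296 parsecs, ~4.2 light-years
```

The same reciprocal relation gives Barnard's star: 1/0.5483 ≈ 1.82 parsecs, or nearly 6 light-years.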

Only three stars, Alpha Centauri, Procyon, and Sirius, are both among the 20 nearest and among the 20 brightest stars (see above). Ironically, most of the relatively nearby stars are dimmer than the Sun and are invisible without the aid of a telescope. By contrast, some of the well-known bright stars outlining the constellations have parallaxes as small as the limiting value of 0.001″ and are therefore well beyond several hundred light-years' distance from the Sun. The most luminous stars can be seen at great distances, whereas the intrinsically faint stars can be observed only if they are relatively close to Earth.

Although the lists of the brightest and the nearest stars pertain to only a very small number of stars, they nonetheless serve to illustrate some important points. The stars listed fall roughly into three categories: (1) giant stars and supergiant stars having sizes of tens or even hundreds of solar radii and extremely low average densities (in fact, several orders of magnitude less than that of water [one gram per cubic centimetre]); (2) dwarf stars having sizes ranging from 0.1 to 5 solar radii and masses from 0.1 to about 10 solar masses; and (3) white dwarf stars having masses comparable to that of the Sun but dimensions appropriate to planets, meaning that their average densities are hundreds of thousands of times greater than that of water.

These rough groupings of stars correspond to stages in their life histories (see below Later stages of evolution). The second category is identified with what is called the main sequence (see below Hertzsprung-Russell diagram) and includes stars that emit energy mainly by converting hydrogen into helium in their cores. The first category comprises stars that have exhausted the hydrogen in their cores and are burning hydrogen within a shell surrounding the core. The white dwarfs represent the final stage in the life of a typical star, when most available sources of energy have been exhausted and the star has become relatively dim.

The large number of binary stars and even multiple systems is notable. These star systems exhibit scales comparable in size to that of the solar system. Some, and perhaps many, of the nearby single stars have invisible (or very dim) companions detectable by their gravitational effects on the primary star; this orbital motion of the unseen member causes the visible star to wobble in its motion through space. Some of the invisible companions have been found to have masses on the order of 0.001 solar mass or less, which is in the range of planetary rather than stellar dimensions. Current observations suggest that they are genuine planets, though some are merely extremely dim stars (sometimes called brown dwarfs). Nonetheless, a reasonable inference that can be drawn from these data is that double stars and planetary systems are formed by similar evolutionary processes.

Accurate observations of stellar positions are essential to many problems of astronomy. Positions of the brighter stars can be measured very accurately in the equatorial system (the coordinates of which are called right ascension [α, or RA] and declination [δ, or DEC] and are given for some epoch, for example, 1950.0 or, currently, 2000.0). Fainter stars are measured by using photographic plates or electronic imaging devices (e.g., a charge-coupled device, or CCD) with respect to the brighter stars, and finally the entire group is referred to the positions of known external galaxies (see galaxy). These distant galaxies are far enough away to define an essentially fixed, or immovable, system, whereas in the Milky Way the positions of both bright and faint stars are affected over relatively short periods of time by galactic rotation and by their own motions through the Galaxy.

Accurate measurements of position make it possible to determine the movement of a star across the line of sight (i.e., perpendicular to the observer), called its proper motion. The amount of proper motion, denoted by μ (in arc seconds per year), divided by the parallax of the star and multiplied by a factor of 4.74 equals the tangential velocity, VT, in kilometres per second in the plane of the celestial sphere.

The motion along the line of sight (i.e., toward the observer), called radial velocity, is obtained directly from spectroscopic observations. If λ is the wavelength of a characteristic spectral line of some atom or ion present in the star, and λL the wavelength of the same line measured in the laboratory, then the difference Δλ, or λ − λL, divided by λL equals the radial velocity, VR, divided by the velocity of light, c, namely, Δλ/λL = VR/c. Shifts of a spectral line toward the red end of the electromagnetic spectrum (i.e., positive VR) indicate recession, and those toward the blue end (negative VR) indicate approach (see Doppler effect; redshift). If the parallax is known, measurements of μ and VR enable a determination of the space motion of the star. Normally, radial velocities are corrected for Earth's rotation and for its motion around the Sun, so that they refer to the line-of-sight motion of the star with respect to the Sun.
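The Doppler relation can be sketched numerically as follows (a minimal sketch using the non-relativistic approximation Δλ/λL = VR/c; the wavelengths in the example are hypothetical, and the function name is made up):

```python
C_KM_PER_S = 299_792.458  # speed of light in km/s

def radial_velocity(lambda_observed, lambda_lab):
    """VR in km/s from the Doppler shift of one spectral line.
    Positive = recession (redshift); negative = approach (blueshift)."""
    return C_KM_PER_S * (lambda_observed - lambda_lab) / lambda_lab

# Hypothetical example: a line with laboratory wavelength 6563.0 angstroms
# observed at 6563.5 angstroms is redshifted, giving recession at ~23 km/s.
print(round(radial_velocity(6563.5, 6563.0), 1))
```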

Consider a pertinent example. The proper motion of Alpha Centauri is about 3.5 arc seconds per year, which, at a distance of 4.4 light-years, means that this star moves 0.00007 light-year in one year. It thus has a projected velocity in the plane of the sky of 22 km per second. (One kilometre is about 0.62 mile.) As for motion along the line of sight, Alpha Centauri's spectral lines are slightly blueshifted, implying a velocity of approach of about 20 km per second. The true space motion, equal to the square root of (22² + 20²), or about 30 km per second, suggests that this star will make its closest approach to the Sun (at three light-years' distance) some 280 centuries from now.
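The Alpha Centauri numbers can be checked in a few lines; the parallax value below (~0.75″, consistent with the 4.4-light-year distance given above) is an assumed round figure:

```python
import math

def tangential_velocity(mu_arcsec_per_yr, parallax_arcsec):
    """VT = 4.74 * mu / p, in km/s (the standard conversion factor)."""
    return 4.74 * mu_arcsec_per_yr / parallax_arcsec

v_t = tangential_velocity(3.5, 0.75)  # ~22 km/s in the plane of the sky
v_r = 20.0                            # ~20 km/s velocity of approach
space_motion = math.hypot(v_t, v_r)   # sqrt(22^2 + 20^2) ~ 30 km/s
print(round(v_t), round(space_motion))
```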

Stellar brightnesses are usually expressed by means of their magnitudes, a usage inherited from classical times. A star of the first magnitude is about 2.5 times as bright as one of the second magnitude, which in turn is some 2.5 times as bright as one of the third magnitude, and so on. A star of the first magnitude is therefore 2.5⁵, or 100, times as bright as one of the sixth magnitude. The magnitude of Sirius, which appears to an observer on Earth as the brightest star in the sky (save the Sun), is −1.4. Canopus, the second brightest, has a magnitude of −0.7, while the faintest star normally seen without the aid of a telescope is of the sixth magnitude. Stars as faint as the 30th magnitude have been measured with modern telescopes, meaning that these instruments can detect stars about four billion times fainter than can the human eye alone.

The scale of magnitudes comprises a geometric progression of brightness. Magnitudes can be converted to light ratios by letting ln and lm be the brightnesses of stars of magnitudes n and m; the logarithm of the ratio of the two brightnesses then equals 0.4 times the difference between them; i.e., log(lm/ln) = 0.4(n − m). Magnitudes are actually defined in terms of observed brightness, a quantity that depends on the light-detecting device employed. Visual magnitudes were originally measured with the eye, which is most sensitive to yellow-green light, while photographic magnitudes were obtained from images on old photographic plates, which were most sensitive to blue light. Today, magnitudes are measured electronically, using detectors such as CCDs equipped with yellow-green or blue filters to create conditions that roughly correspond to those under which the original visual and photographic magnitudes were measured. Yellow-green magnitudes are still often designated V magnitudes, but blue magnitudes are now designated B. The scheme has been extended to other magnitudes, such as ultraviolet (U), red (R), and near-infrared (I). Other systems vary the details of this scheme. All magnitude systems must have a reference, or zero, point. In practice, this is fixed arbitrarily by agreed-upon magnitudes measured for a variety of standard stars.
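The magnitude-to-brightness conversion amounts to a one-liner (a sketch; the function name is illustrative):

```python
def brightness_ratio(n, m):
    """Ratio l_m / l_n for stars of magnitudes n and m: 10 ** (0.4 * (n - m)).
    Smaller magnitude means brighter, so n > m gives a ratio above 1."""
    return 10 ** (0.4 * (n - m))

print(brightness_ratio(6, 1))  # 1st magnitude vs 6th: 100x brighter
print(brightness_ratio(2, 1))  # a single magnitude step: ~2.512x
```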

The actually measured brightnesses of stars give apparent magnitudes. These cannot be converted to intrinsic brightnesses until the distances of the objects concerned are known. The absolute magnitude of a star is defined as the magnitude it would have if it were viewed at a standard distance of 10 parsecs (32.6 light-years). Since the apparent visual magnitude of the Sun is −26.75, its absolute magnitude corresponds to a diminution in brightness by a factor of (2,062,650)² and is, using logarithms, −26.75 + 2.5 log(2,062,650)², or −26.75 + 31.57 = 4.82. This is the magnitude that the Sun would have if it were at a distance of 10 parsecs, an object still visible to the naked eye, though not a very conspicuous one and certainly not the brightest in the sky. Very luminous stars, such as Deneb, Rigel, and Betelgeuse, have absolute magnitudes of −7 to −9, while one of the faintest known stars, the companion to the star with the catalog name BD +4°4048, has an absolute visual magnitude of +19, which is about a million times fainter than the Sun. Many astronomers suspect that large numbers of such faint stars exist, but most of these objects have so far eluded detection.
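That logarithmic juggling is easier to follow as the standard distance-modulus formula, M = m − 5·log10(d/10 parsecs), sketched here with the Sun's figures from the text:

```python
import math

def absolute_magnitude(apparent_mag, distance_parsecs):
    """Magnitude the object would show at the standard distance of 10 parsecs."""
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)

# The Sun: apparent magnitude -26.75 at a distance of 1 astronomical unit,
# i.e. 1/206,265 parsec.
sun_abs = absolute_magnitude(-26.75, 1 / 206_265)
print(round(sun_abs, 2))  # ~4.82, matching the value derived in the text
```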

Stars differ in colour. Most of the stars in the constellation Orion visible to the naked eye are blue-white, most notably Rigel (Beta Orionis), but Betelgeuse (Alpha Orionis) is a deep red. In the telescope, Albireo (Beta Cygni) is seen as two stars, one blue and the other orange. One quantitative means of measuring stellar colours involves a comparison of the yellow (visual) magnitude of the star with its magnitude measured through a blue filter. Hot, blue stars appear brighter through the blue filter, while the opposite is true for cooler, red stars. In all magnitude scales, one magnitude step corresponds to a brightness ratio of 2.512. The zero point is chosen so that white stars with surface temperatures of about 10,000 K have the same visual and blue magnitudes. The conventional colour index is defined as the blue magnitude, B, minus the visual magnitude, V; the colour index, B − V, of the Sun is thus +5.47 − 4.82 = +0.65.

Problems arise when only one colour index is observed. If, for instance, a star is found to have, say, a B − V colour index of 1.0 (i.e., a reddish colour), it is impossible without further information to decide whether the star is red because it is cool or whether it is really a hot star whose colour has been reddened by the passage of light through interstellar dust. Astronomers have overcome these difficulties by measuring the magnitudes of the same stars through three or more filters, often U (ultraviolet), B, and V (see UBV system).

Observations of stellar infrared light also have assumed considerable importance. In addition, photometric observations of individual stars from spacecraft and rockets have made possible the measurement of stellar colours over a large range of wavelengths. These data are important for hot stars and for assessing the effects of interstellar attenuation.

The measured total of all radiation at all wavelengths from a star is called a bolometric magnitude. The corrections required to reduce visual magnitudes to bolometric magnitudes are large for very cool stars and for very hot ones, but they are relatively small for stars such as the Sun. A determination of the true total luminosity of a star affords a measure of its actual energy output. When the energy radiated by a star is observed from Earth's surface, only that portion to which the energy detector is sensitive and that can be transmitted through the atmosphere is recorded. Most of the energy of stars like the Sun is emitted in spectral regions that can be observed from Earth's surface. On the other hand, a cool dwarf star with a surface temperature of 3,000 K has an energy maximum on a wavelength scale at 10,000 angstroms (Å) in the far-infrared, and most of its energy cannot therefore be measured as visible light. (One angstrom equals 10⁻¹⁰ metre, or 0.1 nanometre.) Bright, cool stars can be observed at infrared wavelengths, however, with special instruments that measure the amount of heat radiated by the star. Corrections for the heavy absorption of the infrared waves by water and other molecules in Earth's air must be made unless the measurements are made from above the atmosphere.

The hotter stars pose more difficult problems, since Earth's atmosphere extinguishes all radiation at wavelengths shorter than 2900 Å. A star whose surface temperature is 20,000 K or higher radiates most of its energy in the inaccessible ultraviolet part of the electromagnetic spectrum. Measurements made with detectors flown in rockets or spacecraft extend the observable wavelength region down to 1000 Å or lower, though most radiation of distant stars is extinguished below 912 Å, a region in which absorption by neutral hydrogen atoms in intervening space becomes effective.

To compare the true luminosities of two stars, the appropriate bolometric corrections must first be added to each of their absolute magnitudes. The ratio of the luminosities can then be calculated.

A star's spectrum contains information about its temperature, chemical composition, and intrinsic luminosity. Spectrograms secured with a slit spectrograph consist of a sequence of images of the slit in the light of the star at successive wavelengths. Adequate spectral resolution (or dispersion) might show the star to be a member of a close binary system, in rapid rotation, or to have an extended atmosphere. Quantitative determination of its chemical composition then becomes possible. Inspection of a high-resolution spectrum of the star may reveal evidence of a strong magnetic field.

Spectral lines are produced by transitions of electrons within atoms or ions. As the electrons move closer to or farther from the nucleus of an atom (or of an ion), energy in the form of light (or other radiation) is emitted or absorbed. The yellow D lines of sodium (see D-lines) or the H and K lines of ionized calcium (seen as dark absorption lines) are produced by discrete quantum jumps from the lowest energy levels (ground states) of these atoms. The visible hydrogen lines (the so-called Balmer series; see spectral line series), however, are produced by electron transitions within atoms in the second energy level (or first excited state), which lies well above the ground level in energy. Only at high temperatures are sufficient numbers of atoms maintained in this state by collisions, radiations, and so forth to permit an appreciable number of absorptions to occur. At the low surface temperatures of a red dwarf star, few electrons populate the second level of hydrogen, and thus the hydrogen lines are dim. By contrast, at very high temperatures (for instance, that of the surface of a blue giant star) the hydrogen atoms are nearly all ionized and therefore cannot absorb or emit any line radiation. Consequently, only faint dark hydrogen lines are observed. The characteristic features of ionized metals such as iron are often weak in such hotter stars because the appropriate electron transitions involve higher energy levels that tend to be more sparsely populated than the lower levels. Another factor is that the general fogginess, or opacity, of the atmospheres of these hotter stars is greatly increased, resulting in fewer atoms in the visible stellar layers capable of producing the observed lines.

The continuous (as distinct from the line) spectrum of the Sun is produced primarily by the photodissociation of negatively charged hydrogen ions (H−), i.e., atoms of hydrogen to which an extra electron is loosely attached. In the Sun's atmosphere, when H− is subsequently destroyed by photodissociation, it can absorb energy at any of a whole range of wavelengths and thus produce a continuous range of absorption of radiation. The main source of light absorption in the hotter stars is the photoionization of hydrogen atoms, both from ground level and from higher levels.

The physical processes behind the formation of stellar spectra are well enough understood to permit determinations of temperatures, densities, and chemical compositions of stellar atmospheres. The star studied most extensively is, of course, the Sun, but many others also have been investigated in detail.

The general characteristics of the spectra of stars depend more on temperature variations among the stars than on their chemical differences. Spectral features also depend on the density of the absorbing atmospheric matter, and density in turn is related to a star's surface gravity. Dwarf stars, with great surface gravities, tend to have high atmospheric densities; giants and supergiants, with low surface gravities, have relatively low densities. Hydrogen absorption lines provide a case in point. Normally, an undisturbed atom radiates a very narrow line. If its energy levels are perturbed by charged particles passing nearby, it radiates at a wavelength near its characteristic wavelength. In a hot gas, the range of disturbance of the hydrogen lines is very high, so that the spectral line radiated by the whole mass of gas is spread out considerably; the amount of blurring depends on the density of the gas in a known fashion. Dwarf stars such as Sirius show broad hydrogen features with extensive wings where the line fades slowly out into the background, while supergiant stars, with less-dense atmospheres, display relatively narrow hydrogen lines.

Most stars are grouped into a small number of spectral types. The Henry Draper Catalogue and the Bright Star Catalogue list spectral types from the hottest to the coolest stars (see Harvard classification system). These types are designated, in order of decreasing temperature, by the letters O, B, A, F, G, K, and M. This group is supplemented by R- and N-type stars (today often referred to as carbon, or C-type, stars) and S-type stars. The R-, N-, and S-type stars differ from the others in chemical composition; also, they are invariably giant or supergiant stars. With the discovery of brown dwarfs (objects that form like stars but do not shine through thermonuclear fusion), the system of stellar classification has been expanded to include spectral types L and T.

The spectral sequence O through M represents stars of essentially the same chemical composition but of different temperatures and atmospheric pressures. This simple interpretation, put forward in the 1920s by the Indian astrophysicist Meghnad N. Saha, has provided the physical basis for all subsequent interpretations of stellar spectra. The spectral sequence is also a colour sequence: the O- and B-type stars are intrinsically the bluest and hottest; the M-, R-, N-, and S-type stars are the reddest and coolest.

In the case of cool stars of type M, the spectra indicate the presence of familiar metals, including iron, calcium, magnesium, and also titanium oxide molecules (TiO), particularly in the red and green parts of the spectrum. In the somewhat hotter K-type stars, the TiO features disappear, and the spectrum exhibits a wealth of metallic lines. A few especially stable fragments of molecules such as cyanogen (CN) and the hydroxyl radical (OH) persist in these stars and even in G-type stars such as the Sun. The spectra of G-type stars are dominated by the characteristic lines of metals, particularly those of iron, calcium, sodium, magnesium, and titanium.

The behaviour of calcium illustrates the phenomenon of thermal ionization. At low temperatures a calcium atom retains all of its electrons and radiates a spectrum characteristic of the neutral, or normal, atom; at higher temperatures collisions between atoms and electrons and the absorption of radiation both tend to detach electrons and to produce singly ionized calcium atoms. At the same time, these ions can recombine with electrons to produce neutral calcium atoms. At high temperatures or low electron pressures, or both, most of the atoms are ionized. At low temperatures and high densities, the equilibrium favours the neutral state. The concentrations of ions and neutral atoms can be computed from the temperature, the density, and the ionization potential (namely, the energy required to detach an electron from the atom).
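The computation described in the last sentence is governed by the Saha ionization equation. The sketch below estimates the ratio of singly ionized to neutral calcium at two temperatures; the function name, the electron density of 10¹⁹ m⁻³, and the unit partition-function ratio are illustrative assumptions, not values from this article.

```python
import math

# Physical constants (SI)
K_B = 1.380649e-23      # Boltzmann constant, J/K
M_E = 9.1093837e-31     # electron mass, kg
H   = 6.62607015e-34    # Planck constant, J s
EV  = 1.602176634e-19   # joules per electron volt

def saha_ratio(temp_k, n_e, chi_ev, g_ratio=1.0):
    """Ratio of singly ionized to neutral atoms, n(II)/n(I), from the
    Saha equation, for electron density n_e in m^-3.  g_ratio is the
    partition-function ratio, taken as 1 here for simplicity."""
    thermal = (2.0 * math.pi * M_E * K_B * temp_k / H**2) ** 1.5
    return (2.0 * g_ratio / n_e) * thermal * math.exp(-chi_ev * EV / (K_B * temp_k))

# Calcium (ionization potential 6.11 eV) at a cool and a solar-like
# temperature, with an assumed electron density of 1e19 m^-3:
for t in (3500.0, 6000.0):
    r = saha_ratio(t, n_e=1e19, chi_ev=6.11)
    print(f"T = {t:6.0f} K  ->  n(Ca II)/n(Ca I) = {r:.3g}")
```

Even this rough estimate reproduces the article's point: at 3500 K calcium is overwhelmingly neutral, while at 6000 K the ionized form dominates.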

The absorption line of neutral calcium at 4227 Å is thus strong in cool M-type dwarf stars, in which the pressure is high and the temperature is low. In the hotter G-type stars, however, the lines of ionized calcium at 3968 and 3933 Å (the H and K lines) become much stronger than any other feature in the spectrum.

In stars of spectral type F, the lines of neutral atoms are weak relative to those of ionized atoms. The hydrogen lines are stronger, attaining their maximum intensities in A-type stars, in which the surface temperature is about 9,000 K. Thereafter, these absorption lines gradually fade as the hydrogen becomes ionized.

The hot B-type stars, such as Epsilon Orionis, are characterized by lines of helium and of singly ionized oxygen, nitrogen, and neon. In very hot O-type stars, lines of ionized helium appear. Other prominent features include lines of doubly ionized nitrogen, oxygen, and carbon and of trebly ionized silicon, all of which require more energy to produce.

In the more modern system of spectral classification, called the MK system (after the American astronomers William W. Morgan and Philip C. Keenan, who introduced it), luminosity class is assigned to the star along with the Draper spectral type. For example, the star Alpha Persei is classified as F5 Ib, which means that it falls about halfway between the beginning of type F (i.e., F0) and of type G (i.e., G0). The Ib suffix means that it is a moderately luminous supergiant. The star Pi Cephei, classified as G2 III, is a giant falling between G0 and K0 but much closer to G0. The Sun, a dwarf star of type G2, is classified as G2 V. A star of luminosity class II falls between giants and supergiants; one of class IV is called a subgiant.

Temperatures of stars can be defined in a number of ways. From the character of the spectrum and the various degrees of ionization and excitation found from its analysis, an ionization or excitation temperature can be determined.

A comparison of the V and B magnitudes (see above Stellar colours) yields a B − V colour index, which is related to the colour temperature of the star. The colour temperature is therefore a measure of the relative amounts of radiation in two more or less broad wavelength regions, while the ionization and excitation temperatures pertain to the temperatures of strata wherein spectral lines are formed.
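As a rough illustration of how a B − V index maps to a temperature, one published blackbody-based approximation (Ballesteros' formula, brought in here as an assumption, not a relation stated in this article) can be sketched as:

```python
def colour_temperature(b_minus_v):
    """Ballesteros' blackbody approximation relating the B - V colour
    index to a temperature in kelvins.  Real stars are not perfect
    blackbodies, so this is an estimate only."""
    return 4600.0 * (1.0 / (0.92 * b_minus_v + 1.7)
                     + 1.0 / (0.92 * b_minus_v + 0.62))

# The Sun's B - V index is about 0.65:
print(f"{colour_temperature(0.65):.0f} K")
```

For the Sun's B − V of about 0.65 this returns a temperature close to the accepted solar value near 5,800 K.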

Provided that the angular size of a star can be measured (see below Stellar radii) and that the total energy flux received at Earth (corrected for atmospheric extinction) is known, the so-called brightness temperature can be found.

The effective temperature, Teff, of a star is defined in terms of its total energy output and radius. Thus, since σT⁴eff is the rate of radiation per unit area for a perfectly radiating sphere (σ being the Stefan-Boltzmann constant) and if L is the total radiation (i.e., luminosity) of a star considered to be a sphere of radius R, such a sphere (called a blackbody) would emit a total amount of energy equal to its surface area, 4πR², multiplied by its energy per unit area. In symbols, L = 4πR²σT⁴eff. This relation defines the star's equivalent blackbody, or effective, temperature.
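Inverting this relation gives the effective temperature from a measured luminosity and radius. A minimal sketch using standard solar values (the function name is ours):

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temperature(luminosity_w, radius_m):
    """Invert L = 4*pi*R^2 * sigma * Teff^4 for Teff (kelvins)."""
    return (luminosity_w / (4.0 * math.pi * radius_m**2 * SIGMA)) ** 0.25

# The Sun: L ~ 3.828e26 W, R ~ 6.957e8 m
t_sun = effective_temperature(3.828e26, 6.957e8)
print(f"Solar effective temperature ~ {t_sun:.0f} K")
```

The result, close to 5,770 K, matches the figure of about 5,800 K quoted below for Sun-like stars.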

Since the total energy radiated by a star cannot be directly observed (except in the case of the Sun), the effective temperature is a derived quantity rather than an observed one. Yet, theoretically, it is the fundamental temperature. If the bolometric corrections are known, the effective temperature can be found for any star whose absolute visual magnitude and radius are known. Effective temperatures are closely related to spectral type and range from about 40,000 K for hot O-type stars, through 5,800 K for stars like the Sun, to about 300 K for brown dwarfs.

Masses of stars can be found directly only from binary systems and only if the scale of the orbits of the stars around each other is known. Binary stars are divided into three categories, depending on the mode of observation employed: visual binaries, spectroscopic binaries, and eclipsing binaries.

Visual binaries can be seen as double stars with the telescope. True doubles, as distinguished from apparent doubles caused by line-of-sight effects, move through space together and display a common space motion. Sometimes a common orbital motion can be measured as well. Provided that the distance to the binary is known, such systems permit a determination of stellar masses, m1 and m2, of the two members. The angular size, a″, of the orbit (more accurately, of its semimajor axis) can be measured directly, and, with the distance known, the true dimensions of the semimajor axis, a, can be found. If a is expressed in astronomical units, which is given by a″ (measured in seconds of arc) multiplied by the distance in parsecs, and the period, P, also measured directly, is expressed in years, then the sum of the masses of the two orbiting stars can be found from an application of Kepler's third law (see Kepler's laws of planetary motion). (An astronomical unit is the average distance from Earth to the Sun, 149,597,870.7 km [92,955,807.3 miles].) In symbols, (m1 + m2) = a³/P² in units of the Sun's mass. For example, for the binary system 70 Ophiuchi, P is 87.8 years, and the distance is 5.0 parsecs; thus, a is 22.8 astronomical units, and m1 + m2 = 1.56 solar masses. From a measurement of the motions of the two members relative to the background stars, the orbit of each star has been determined with respect to their common centre of gravity. The mass ratio, m2/(m1 + m2), is 0.42; the individual masses for m1 and m2, respectively, are then 0.90 and 0.66 solar mass.
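The 70 Ophiuchi numbers above can be reproduced directly from Kepler's third law and the measured mass ratio. A sketch (the helper name is ours, and the 4.56-arcsecond angular semimajor axis is simply the 22.8 AU quoted above divided by the 5.0-parsec distance):

```python
def binary_mass_sum(a_arcsec, distance_pc, period_yr):
    """Sum of the two masses, in solar units, from Kepler's third law:
    (m1 + m2) = a^3 / P^2, with a in AU and P in years."""
    a_au = a_arcsec * distance_pc      # semimajor axis in astronomical units
    return a_au**3 / period_yr**2

# 70 Ophiuchi: angular semimajor axis ~4.56", distance 5.0 pc, period 87.8 yr
total = binary_mass_sum(4.56, 5.0, 87.8)
m2 = 0.42 * total                      # from the measured mass ratio
m1 = total - m2
print(f"m1 + m2 = {total:.2f} solar masses; m1 = {m1:.2f}, m2 = {m2:.2f}")
```

The computed sum comes out near 1.5 solar masses, consistent (to rounding) with the 1.56 quoted in the text.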

The star known as 61 Cygni was the first whose distance was measured (via parallax by the German astronomer Friedrich W. Bessel in the mid-19th century). Visually, 61 Cygni is a double star separated by 83.2 astronomical units. Its members move around one another with a period of 653 years. It was among the first stellar systems thought to contain a potential planet, although this has not been confirmed and is now considered unlikely. Nevertheless, since the 1990s a variety of discovery techniques have confirmed the existence of more than 500 planets orbiting other stars (see below Binaries and extrasolar planetary systems).

Spectroscopic binary stars are found from observations of radial velocity. At least the brighter member of such a binary can be seen to have a continuously changing periodic velocity that alters the wavelengths of its spectral lines in a rhythmic way; the velocity curve repeats itself exactly from one cycle to the next, and the motion can be interpreted as orbital motion. In some cases, rhythmic changes in the lines of both members can be measured. Unlike visual binaries, most spectroscopic binaries do not permit the semimajor axes or the individual masses to be found, since the angle between the orbit plane and the plane of the sky cannot be determined. If spectra from both members are observed, mass ratios can be found. If one spectrum alone is observed, only a quantity called the mass function can be derived, from which a lower limit to the stellar masses can be calculated. If a spectroscopic binary is also observed to be an eclipsing system, the inclination of the orbit and often the values of the individual masses can be ascertained.

An eclipsing binary consists of two close stars moving in an orbit so placed in space in relation to Earth that the light of one can at times be hidden behind the other. Depending on the orientation of the orbit and sizes of the stars, the eclipses can be total or annular (in the latter, a ring of one star shows behind the other at the maximum of the eclipse) or both eclipses can be partial. The best known example of an eclipsing binary is Algol (Beta Persei), which has a period (interval between eclipses) of 2.9 days. The brighter (B8-type) star contributes about 92 percent of the light of the system, and the eclipsed star provides less than 8 percent. The system contains a third star that is not eclipsed. Some 20 eclipsing binaries are visible to the naked eye.

The light curve for an eclipsing binary displays magnitude measurements for the system over a complete light cycle. The light of the variable star is usually compared with that of a nearby (comparison) star thought to be fixed in brightness. Often, a deep, or primary, minimum is produced when the component having the higher surface brightness is eclipsed. It represents the total eclipse and is characterized by a flat bottom. A shallower secondary eclipse occurs when the brighter component passes in front of the other; it corresponds to an annular eclipse (or transit). In a partial eclipse neither star is ever completely hidden, and the light changes continuously during an eclipse.

The shape of the light curve during an eclipse gives the ratio of the radii of the two stars and also one radius in terms of the size of the orbit, the ratio of luminosities, and the inclination of the orbital plane to the plane of the sky.

If radial-velocity curves are also available (i.e., if the binary is spectroscopic as well as eclipsing), additional information can be obtained. When both velocity curves are observable, the size of the orbit as well as the sizes, masses, and densities of the stars can be calculated. Furthermore, if the distance of the system is measurable, the brightness temperatures of the individual stars can be estimated from their luminosities and radii. All of these procedures have been carried out for the faint binary Castor C (two red-dwarf components of the six-member Castor multiple star system) and for the bright B-type star Mu Scorpii.

Close stars may reflect each others light noticeably. If a small, high-temperature star is paired with a larger object of low surface brightness and if the distance between the stars is small, the part of the cool star facing the hotter one is substantially brightened by it. Just before (and just after) secondary eclipse, this illuminated hemisphere is pointed toward the observer, and the total light of the system is at a maximum.

The properties of stars derived from eclipsing binary systems are not necessarily applicable to isolated single stars. Systems in which a smaller, hotter star is accompanied by a larger, cooler object are easier to detect than are systems that contain, for example, two main-sequence stars (see below Hertzsprung-Russell diagram). In such an unequal system, at least the cooler star has certainly been affected by evolutionary changes, and probably so has the brighter one. The evolutionary development of two stars near one another does not exactly parallel that of two well-separated or isolated ones.

Eclipsing binaries include combinations of a variety of stars ranging from white dwarfs to huge supergiants (e.g., VV Cephei), which would engulf Jupiter and all the inner planets of the solar system if placed at the position of the Sun.

Some members of eclipsing binaries are intrinsic variables, stars whose energy output fluctuates with time (see below Variable stars). In many such systems, large clouds of ionized gas swirl between the stellar members. In others, such as Castor C, at least one of the faint M-type dwarf components might be a flare star, one in which the brightness can unpredictably and suddenly increase to many times its normal value (see below Peculiar variables).

Near the Sun, most stars are members of binaries, and many of the nearest single stars are suspected of having companions. Although some binary members are separated by hundreds of astronomical units and others are contact binaries (stars close enough for material to pass between them), binary systems are most frequently built on the same scale as that of the solar system, namely, on the order of about 10 astronomical units. The division in mass between two components of a binary seems to be nearly random. A mass ratio as small as about 1:20 could occur about 5 percent of the time, and under these circumstances a planetary system comparable to the solar system is able to form.

The formation of double and multiple stars on the one hand and that of planetary systems on the other seem to be different facets of the same process. Planets are probably produced as a natural by-product of star formation. Only a small fraction of the original nebula matter is likely to be retained in planets, since much of the mass and angular momentum is swept out of the system. Conceivably, as many as 100 million stars could have bona fide planets in the Milky Way Galaxy.

Individual planets around other stars (i.e., extrasolar planets) are very difficult to observe directly because a star is always much brighter than its attendant planet. Jupiter, for example, would be only one-billionth as bright as the Sun and appear so close to it as to be undetectable from even the nearest star. If candidate stars are treated as possible spectroscopic binaries, however, then one may look for a periodic change in the star's radial velocity caused by a planet swinging around it. The effect is very small: even Jupiter would cause a change in the apparent radial velocity of the Sun of only about 10 metres (33 feet) per second spread over Jupiter's orbital period of about 12 years at best. Current techniques using very large telescopes to study fairly bright stars can measure radial velocities with a precision of a few metres per second, provided that the star has very sharp spectral lines, such as is observed for Sun-like stars and stars of types K and M. This means that at present the radial-velocity method normally can detect only massive Jupiter-like extrasolar planets. Planets like Earth, 300 times less massive, would cause too small a change in radial velocity to be detectable presently. Moreover, the closer the planet is to its parent star, the greater and quicker the velocity swing, so that detection of giant planets close to a star is favoured over planets farther out. And, because B- and A-type stars do not have spectral lines that allow precise velocity measurements, this method cannot reveal anything about their having planets. Finally, even when a planet is detected, the usual spectroscopic binary problem of not knowing the angle between the orbit plane and that of the sky allows only a minimum mass to be assigned to the planet.
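The quoted figure of about 10 metres per second can be checked with a simple two-body estimate: the star's speed about the common centre of mass is the planet's orbital speed scaled by the mass ratio. A sketch under circular-orbit, edge-on assumptions (the constants are standard; the function name is ours):

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
M_JUP = 1.898e27         # Jupiter mass, kg
AU = 1.496e11            # astronomical unit, m

def reflex_velocity(m_star_kg, m_planet_kg, a_m):
    """Approximate radial-velocity semi-amplitude of the star: the
    planet's circular orbital speed scaled by the mass ratio."""
    v_planet = math.sqrt(G * m_star_kg / a_m)   # planet's orbital speed
    return v_planet * m_planet_kg / (m_star_kg + m_planet_kg)

# Jupiter at roughly 5.2 AU from a one-solar-mass star:
v = reflex_velocity(M_SUN, M_JUP, 5.2 * AU)
print(f"Sun's reflex velocity due to Jupiter ~ {v:.1f} m/s")
```

The estimate lands close to the article's 10 m/s figure, and it also shows why close-in giants are easier to find: shrinking a raises v_planet and hence the reflex signal.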

One exception to this last problem is HD 209458, a seventh-magnitude G0 V star about 150 light-years away with a planetary object orbiting it every 3.5 days. Soon after the companion was discovered in 1999 by its effect on the star's radial velocity, it also was found to be eclipsing the star, meaning that its orbit is oriented almost edge-on toward Earth. This fortunate circumstance, as well as observations of spectral lines in the planet's atmosphere, allowed determination of the planet's mass and radius: 0.64 and 1.38 times those of Jupiter, respectively. These numbers imply that the planet is even more of a giant than Jupiter itself. What was unexpected is its proximity to the parent star, more than 100 times closer than Jupiter is to the Sun, raising the question of how a giant gaseous planet that close can survive the star's radiation. The fact that many other extrasolar planets have been found to have orbital periods measured in days rather than years, and thus to be very close to their parent stars, suggests that the HD 209458 case is not unusual. There are also some confirmed cases of planets around supernova remnants called pulsars, although whether the planets preceded the supernova explosions that produced the pulsars or were acquired afterward remains to be determined.

The first extrasolar planets were discovered in 1992. More than 500 extrasolar planets were known by the early years of the 21st century, with more such discoveries being added regularly. (For additional information on extrasolar planets and systems, see extrasolar planet; planet; solar system: Studies of other solar systems.)

In addition to the growing evidence for existence of extrasolar planets, space-based observatories designed to detect infrared radiation have found more than 100 young nearby stars (including Vega, Fomalhaut and Beta Pictoris) to have disks of warm matter orbiting them. This matter is composed of myriad particles mostly about the size of sand grains and might be taking part in the first stage of planetary formation.

The mass of most stars lies within the range of 0.3 to 3 solar masses. The star with the largest mass determined to date is R136a1, a giant of about 265 solar masses that had as much as 320 solar masses when it was formed. There is a theoretical upper limit to the masses of nuclear-burning stars (the Eddington limit), which limits stars to no more than a few hundred solar masses. On the low mass side, most stars seem to have at least 0.1 solar mass. The theoretical lower mass limit for an ordinary star is about 0.075 solar mass, for below this value an object cannot attain a central temperature high enough to enable it to shine by nuclear energy. Instead, it may produce a much lower level of energy by gravitational shrinkage. If its mass is not much below the critical 0.075 solar mass value, it will appear as a very cool, dim star known as a brown dwarf. Its evolution is simply to continue cooling toward eventual extinction. At still somewhat lower masses, the object would be a giant planet. Jupiter, with a mass roughly 0.001 that of the Sun, is just such an object, emitting a very low level of energy (apart from reflected sunlight) that is derived from gravitational shrinkage.

Brown dwarfs were late to be discovered, the first unambiguous identification having been made in 1995. It is estimated, however, that hundreds must exist in the solar neighbourhood. An extension of the spectral sequence for objects cooler than M-type stars has been constructed, using L for warmer brown dwarfs, T for cooler ones, and Y for the coolest. The presence of methane in the T brown dwarfs and of ammonia in the Y brown dwarfs emphasizes their similarity to giant planets. (For additional discussion of the topic, see eclipse: Eclipsing binary stars.)

Angular sizes of bright red giant and supergiant stars were first measured directly during the 1920s, using the principle of interference of light. Only bright stars with large angular size can be measured by this method. Provided the distance to the star is known, the physical radius can be determined.

Eclipsing binaries also provide extensive data on stellar dimensions. The timing and duration of the eclipses yield the sizes of the stars relative to the size of the orbit, and so analyzing the light curves of eclipsing binaries can be a useful means of determining the dimensions of either dwarf or giant stars. Members of close binary systems, however, are sometimes subject to evolutionary effects, mass exchange, and other disturbances that change the details of their spectra.

A more recent method, called speckle interferometry, has been developed to reproduce the true disks of red supergiant stars and to resolve spectroscopic binaries such as Capella. The speckle phenomenon is a rapidly changing interference-diffraction effect seen in a highly magnified diffraction image of a star observed with a large telescope.

If the absolute magnitude of a star and its temperature are known, its size can be computed. The temperature determines the rate at which energy is emitted by each unit of area, and the total luminosity gives the total power output. Thus, the surface area of the star and, from it, the radius of the object can be estimated. This is the only way available for estimating the dimensions of white dwarf stars. The chief uncertainty lies in choosing the temperature that represents the rate of energy emission.
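The radius estimate described above follows directly from inverting the blackbody relation L = 4πR²σT⁴. A sketch (the white-dwarf luminosity and temperature below are illustrative assumed values, not measurements of any particular star):

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26         # solar luminosity, W

def radius_from_luminosity(luminosity_w, temp_k):
    """Invert L = 4*pi*R^2 * sigma * T^4 for the radius R in metres."""
    return math.sqrt(luminosity_w / (4.0 * math.pi * SIGMA * temp_k**4))

# Illustrative white-dwarf values: L ~ 0.03 L_sun at ~25,000 K.
r = radius_from_luminosity(0.03 * L_SUN, 25000.0)
print(f"Radius ~ {r / 1000:.0f} km")
```

With these assumed inputs the radius comes out at a few thousand kilometres, i.e., roughly Earth-sized, which is the characteristic white-dwarf result this method yields.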

Main-sequence stars range from very luminous objects to faint M-type dwarf stars, and they vary considerably in their surface temperatures, their bolometric (total) luminosities, and their radii. Moreover, for stars of a given mass, a fair spread in radius, luminosity, surface temperature, and spectral type may exist. This spread is produced by stellar evolutionary effects and tends to broaden the main sequence. Masses are obtained from visual and eclipsing binary systems observed spectroscopically. Radii are found from eclipsing binary systems, from direct measurements in a few favourable cases, by calculations, and from absolute visual magnitudes and temperatures.

Average values for radius, bolometric luminosity, and mass are meaningful only for dwarf stars. Giant and subgiant stars all show large ranges in radius for a given mass. Conversely, giant stars of very nearly the same radius, surface temperature, and luminosity can have appreciably different masses.

Some of the most important generalizations concerning the nature and evolution of stars can be derived from correlations between observable properties and from certain statistical results. One of the most important of these correlations concerns temperature and luminosity, or, equivalently, colour and magnitude.

When the absolute magnitudes of stars, or their intrinsic luminosities on a logarithmic scale, are plotted in a diagram against temperature or, equivalently, against the spectral types, the stars do not fall at random on the diagram but tend to congregate in certain restricted domains. Such a plot is usually called a Hertzsprung-Russell diagram, named for the early 20th-century astronomers Ejnar Hertzsprung of Denmark and Henry Norris Russell of the United States, who independently discovered the relations shown in it. As is seen in the diagram, most of the congregated stars are dwarfs lying closely around a diagonal line called the main sequence. These stars range from hot, O- and B-type, blue objects at least 10,000 times brighter than the Sun down through white A-type stars such as Sirius to orange K-type stars such as Epsilon Eridani and finally to M-type red dwarfs thousands of times fainter than the Sun. The sequence is continuous; the luminosities fall off smoothly with decreasing surface temperature; the masses and radii decrease but at a much slower rate; and the stellar densities gradually increase.

The second group of stars to be recognized was a group of giants (such objects as Capella, Arcturus, and Aldebaran), which are yellow, orange, or red stars about 100 times as bright as the Sun and have radii on the order of 10–30 million km (about 6–20 million miles, or 15–40 times as large as the Sun). The giants lie above the main sequence in the upper right portion of the diagram. The category of supergiants includes stars of all spectral types; these stars show a large spread in intrinsic brightness, and some even approach absolute magnitudes of −7 or −8. A few red supergiants, such as the variable star VV Cephei, exceed in size the orbit of Jupiter or even that of Saturn, although most of them are smaller. Supergiants are short-lived and rare objects, but they can be seen at great distances because of their tremendous luminosity.

Subgiants are stars that are redder and larger than main-sequence stars of the same luminosity. Many of the best known examples are found in close binary systems where conditions favour their detection.

The white dwarf domain lies about 10 magnitudes below the main sequence. These stars are in the last stages of their evolution (see below End states of stars).

The spectrum-luminosity diagram has numerous gaps. Few stars exist above the white dwarfs and to the left of the main sequence. The giants are separated from the main sequence by a gap named for Hertzsprung, who in 1911 became the first to recognize the difference between main-sequence and giant stars. The actual concentration of stars differs considerably in different parts of the diagram. Highly luminous stars are rare, whereas those of low luminosity are very numerous.

The spectrum-luminosity diagram applies to the stars in the galactic spiral arm in the neighbourhood of the Sun and represents what would be obtained if a composite Hertzsprung-Russell diagram were constructed combining data for a large number of the star groups called open (or galactic) star clusters, as, for example, the double cluster h and χ Persei, the Pleiades, the Coma cluster, and the Hyades. It includes very young stars, a few million years old, as well as ancient stars perhaps as old as 10 billion years.

By contrast, another Hertzsprung-Russell diagram exhibits the type of temperature-luminosity, or colour-magnitude, relation characteristic of stars in globular clusters, in the central bulge of the Galaxy, and in elliptical external galaxies, namely, of the so-called stellar Population II (see Populations I and II). (In addition to these oldest objects, Population II includes other very old stars that occur between the spiral arms of the Galaxy and at some distance above and below the galactic plane.) Because these systems are very remote from the observer, the stars are faint, and their spectra can be observed only with difficulty. As a consequence, their colours rather than their spectra must be measured. Since the colours are closely related to surface temperature and therefore to spectral types, equivalent spectral types may be used, but it is stellar colours, not spectral types, that are observed in this instance (see colour-magnitude diagram).

The differences between the two Hertzsprung-Russell diagrams are striking. In the second there are no supergiants, and, instead of a domain at an absolute magnitude of about 0, the giant stars form a branch that starts high and to the right at about −3.5 for very red stars and flows in a continuous sequence until it reaches an absolute magnitude of about 0. At that point the giant branch splits: a main band of stars, all about the same colour, proceeds downward (i.e., to fainter stars) to a magnitude of about +3 and then connects to the main sequence at about +4 by way of a narrow band. The main sequence of Population II stars extends downward to fainter, redder stars in much the same way as in the spiral-arm Population I stars. (Population I is the name given to the stars found within the spiral arms of the Milky Way system and other galaxies of the same type. Containing stars of all ages, from those in the process of formation to defunct white dwarfs, Population I stars are, nonetheless, always associated with the gas and dust of the interstellar medium.) The main sequence ends at about spectral type G, however, and does not extend up through the A, B, and O spectral types, though occasionally a few such stars are found in the region normally occupied by the main sequence.

The other band of stars formed from the split of the giant branch is the horizontal branch, which falls near magnitude +0.6 and fills the aforementioned Hertzsprung gap, extending to increasingly blue stars beyond the RR Lyrae stars (see below Variable stars), which are indicated by the crosshatched area in the diagram. Among these blue hot stars are found novas and the nuclei of planetary nebulas, the latter so called because their photographic image resembles that of a distant planet. Not all globular clusters show identical colour-magnitude diagrams, which may be due to differences in the cluster ages or other factors. (For a discussion of other aspects of colour-magnitude diagrams for star clusters, see star cluster: Globular cluster.)

The shapes of the colour-magnitude diagrams permit estimates of globular-cluster ages. Stars more massive than about 1.3 solar masses have evolved away from the main sequence at a point just above the position occupied by the Sun. The time required for such a star to exhaust the hydrogen in its core is about 5 to 6 billion years, and the cluster must be at least as old. More ancient clusters have been identified. In the Galaxy, globular clusters are all very ancient objects, having ages within a few billion years of the average of 11 billion years. In the Magellanic Clouds, however, clusters exist that resemble globular ones, but they contain numerous blue stars and therefore must be relatively young.
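
The turnoff argument can be made quantitative with the standard rough scaling for main-sequence lifetime, t ≈ 10 × (M/M_sun)^(-2.5) billion years. The following is a minimal sketch; the 10-billion-year normalization and the -2.5 exponent are textbook approximations assumed here, not figures from the text:

```python
# Approximate main-sequence lifetime from stellar mass, using the
# rough scaling t ~ 10 Gyr * (M / M_sun)**-2.5. Both the 10 Gyr
# normalization and the -2.5 exponent are standard approximations
# assumed for illustration.

def main_sequence_lifetime_gyr(mass_solar):
    """Approximate hydrogen-burning lifetime, in billions of years."""
    return 10.0 * mass_solar ** -2.5

# The 1.3-solar-mass turnoff quoted for old globular clusters:
print(f"{main_sequence_lifetime_gyr(1.3):.1f} Gyr")  # 5.2 Gyr
```

The result is consistent with the minimum cluster age estimated above.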

Open clusters in the spiral arms of the Galaxy (extreme Population I) tell a somewhat different story. A colour-magnitude diagram can be plotted for a number of different open clusters (for example, the double cluster h and χ Persei, the Pleiades, Praesepe, and M67), with the main feature distinguishing the clusters being their ages. The young cluster h and χ Persei, which is a few million years old, contains stars ranging widely in luminosity. Some stars have already evolved into the supergiant stage (in such a diagram the top of the main sequence is bent over). The stars of luminosity 10,000 times greater than that of the Sun have already largely depleted the hydrogen in their cores and are leaving the main sequence.

The brightest stars of the Pleiades cluster, aged about 100 million years, have begun to leave the main sequence and are approaching the critical phase when they will have exhausted all the hydrogen in their cores. There are no giants in the Pleiades. Presumably, the cluster contained no stars as massive as some of those found in h and χ Persei.

The cluster known as Praesepe, or the Beehive, at an age of 790 million years, is older than the Pleiades. All stars much more luminous than the first magnitude have begun to leave the main sequence; there are some giants. The Hyades, about 620 million years old, displays a similar colour-magnitude array. These clusters contain a number of white dwarfs, indicating that the initially most luminous stars have already run the gamut of evolution. In a very old cluster such as M67, which is 4.5 billion years old, all of the bright main-sequence stars have disappeared.

The colour-magnitude diagrams for globular and open clusters differ quantitatively because the latter show a wider range of ages and differ in chemical composition. Most globular clusters have smaller metal-to-hydrogen ratios than do open clusters or the Sun. The gaps between the red giants and blue main-sequence stars of the open clusters (Population I) often contain unstable stars such as variables. The Cepheid variable stars, for instance, fall in these gaps (see below Variable stars).

The giant stars of the Praesepe cluster are comparable to the brightest stars in M67. The M67 giants have evolved from the main sequence near an absolute magnitude of +3.5, whereas the Praesepe giants must have masses about twice as great as those of the M67 giants. Giant stars of the same luminosity may therefore have appreciably different masses.

Of great statistical interest is the relationship between the luminosities of the stars and their frequency of occurrence. The naked-eye stars are nearly all intrinsically brighter than the Sun, but the opposite is true for the known stars within 20 light-years of the Sun. The bright stars are easily seen at great distances; the faint ones can be detected only if they are close. Only if stars of magnitude +11 were a billion times more abundant than stars of magnitude -4 could they be observed to some fixed limit of apparent brightness.
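
The "billion times" figure follows directly from the magnitude scale: 5 magnitudes is a factor of 100 in brightness, so a 15-magnitude spread is a factor of 10^6; a survey limited by apparent brightness therefore reaches the fainter stars only to 1/1,000 the distance, i.e., 1/10^9 the volume. A quick check of that arithmetic:

```python
# Check the abundance factor implied by a fixed limit of apparent
# brightness: how much more common must magnitude +11 stars be than
# magnitude -4 stars to be observed equally often?
dm = 11 - (-4)                            # 15 magnitudes apart
brightness_ratio = 10 ** (dm / 2.5)       # 5 mag = factor of 100
distance_ratio = brightness_ratio ** 0.5  # flux falls as 1/d**2
volume_ratio = distance_ratio ** 3        # surveyed volume goes as d**3

print(f"{brightness_ratio:.0e}, {volume_ratio:.0e}")  # 1e+06, 1e+09
```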

The luminosity function depends on population type. The luminosity function for pure Population II differs substantially from that for pure Population I. There is a small peak near absolute magnitude +0.6, corresponding to the horizontal branch for Population II, and no stars as bright as absolute magnitude -5. The luminosity function for pure Population I is evaluated best from open star clusters, the stars in such a cluster being at about the same distance. The neighbourhood of the Sun includes examples of both Populations I and II.

A plot of mass against bolometric luminosity for visual binaries for which good parallaxes and masses are available shows that, for stars with masses comparable to that of the Sun, the luminosity, L, varies as a power, 3 + β, of the mass, M. This relation can be expressed (in solar units) as L = M^(3 + β). The power differs for substantially fainter or much brighter stars.
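
The relation says luminosity grows as mass raised to a power 3 + β. As a numerical illustration in solar units, taking β = 0.5 as an assumed representative value near one solar mass:

```python
# Mass-luminosity relation for unevolved main-sequence stars,
# L = M**(3 + beta) in solar units. beta = 0.5 is an illustrative
# assumption; the power changes for much fainter or brighter stars.

def luminosity_solar(mass_solar, beta=0.5):
    return mass_solar ** (3 + beta)

print(luminosity_solar(1.0))  # 1.0 -- the Sun, by construction
print(luminosity_solar(2.0))  # ~11.3 -- twice the mass, ~11x the light
```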

This mass-luminosity correlation applies only to unevolved main-sequence stars. It fails for giants and supergiants and for the subgiant (dimmer) components of eclipsing binaries, all of which have changed considerably during their lifetimes. It does not apply to any stars in a globular cluster not on the main sequence, or to white dwarfs that are abnormally faint for their masses.

The mass-luminosity correlation, predicted theoretically in the early 20th century by the English astronomer Arthur Eddington, is a general relationship that holds for all stars having essentially the same internal density and temperature distributions, i.e., for what are termed the same stellar models.

Continue reading here:

star | astronomy | Britannica.com

Astronomy Nights | Astronomy | Mesa Community College

Mesa Community College invites the public to explore the amazing Universe in our state-of-the-art Planetarium! Our Astronomy Nights are the first Friday of the month during the Spring and Fall semesters. Planetarium shows run from 6:00-10:00 PM, typically every 30 minutes. Tickets are FREE and available on a first come, first served basis.

"Wonders of the Universe" highlights astronomical sights like the Big Bang and colliding galaxies in the far-off Universe, then brings the audience on a tour of the sights in our amazing Milky Way galaxy.

"Stars of the Pharaohs" illustrates the interconnected relationship between the beliefs of the ancient Egyptian people and the starry sky above. Learn about their temples and tombs and how they understood and revered the stars and their motions.

Note: This show runs on a 45-minute schedule, so there will be fewer showings on this night. We will try to accommodate as many guests as possible.

On Monday, August 21, 2017, a total solar eclipse will be visible from the U.S. mainland for the first time since 1979! The path of the Moon's shadow, known as the path of totality, will stretch from Oregon on the west coast to South Carolina on the east coast.

Details of the eclipse timing and path and information on safe viewing can be found at NASA's Eclipse website and Eclipse2017.org.

Here in Arizona, the Moon will not completely cover the Sun. But the Moon will still cover a maximum of 70% of the Sun at 10:34 AM as viewed from Mesa. We are working on plans for an eclipse-viewing party at MCC. We will post additional information over the summer as the eclipse day draws near.

Stay tuned!

Please note that tickets are FREE and first come, first served only. You are welcome to pick up tickets after 5:30 PM at the Planetarium entrance. Other activities are available for visitors waiting for their show time.

Groups of 25 or more are encouraged to click on the "Schedule a Private Visit" button on the left side of this page to schedule a custom show at the Planetarium.

During Astronomy Nights, we also offer telescope viewing of the Moon, planets, and other celestial sights. Unlike the planetarium shows, telescope viewing will only occur if the skies are clear. The line starts 10 minutes before show time.

The planetarium is wheelchair accessible.

Astronomy Nights are held at the Physical Science Building (PS 15) just east of Dobson Road, on Planetarium Way between the Dobson-US 60 interchange and Southern Avenue. Please see the map for our location on MCC's Southern & Dobson campus.

The Planetarium is located on the south side of the Physical Science Building and is labeled with "PLANETARIUM" in large black letters which are visible from Dobson Road and Planetarium Way.

Free parking is available in the lots south of the Physical Science Building. Please avoid parking in the spaces marked "EMPLOYEES" until after 6:00 PM.

Brown dwarfs are as plentiful as stars – Astronomy Magazine

It seems that for every star that ignites, there may be a failed star.

A recent study by researchers at the University of Toronto found that the Milky Way may be home to 100 billion brown dwarfs, which matches the projected head count of 100 billion stars in our galaxy.

A brown dwarf is a so-called failed star because it never ignites in such a way as to fuse hydrogen into helium, which creates the hot, bright engines we know as stars. Instead, brown dwarfs fuse deuterium, a heavier isotope of hydrogen, if they fuse anything at all. They typically are gaseous objects of about 13 Jupiter masses or above, and form like stars rather than planets. (Most planets start as a rocky body before gathering envelopes of gas.)
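
The 13-Jupiter-mass threshold mentioned above, together with the conventional hydrogen-burning limit of roughly 80 Jupiter masses (a standard figure assumed here, not one given in the article), sorts objects into planets, brown dwarfs, and stars:

```python
# Classify an object by mass: below ~13 Jupiter masses nothing fuses
# (planet); between ~13 and ~80, deuterium can fuse but hydrogen
# cannot (brown dwarf); above ~80, hydrogen ignites (star). The
# 80-Jupiter-mass limit is a standard value assumed for illustration.

def classify(mass_jupiters):
    if mass_jupiters < 13:
        return "planet"
    if mass_jupiters < 80:
        return "brown dwarf"
    return "star"

print(classify(1))     # planet -- Jupiter itself
print(classify(30))    # brown dwarf
print(classify(1047))  # star -- about one solar mass
```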

The researchers performed an extensive survey of RCW 38, an ultra-dense star-forming cluster around 5,500 light-years away. Most stars that form in the region live fast, gain mass, and die young in a supernova explosion. But within the cluster, the researchers found the same ratio of brown dwarfs as in five other surveyed clusters going back to 2006, many without the same extreme conditions as RCW 38. In other words, there seems to be a fairly uniform distribution of brown dwarfs across the galaxy, regardless of environment.

"We've found a lot of brown dwarfs in these clusters. And whatever the cluster type, the brown dwarfs are really common," Alex Scholz, an astronomer at the University of St. Andrews, said in a press release. "Brown dwarfs form alongside stars in clusters, so our work suggests there are a huge number of brown dwarfs out there."

The bare minimum estimate is that there are 25 billion brown dwarfs in the galaxy. But because brown dwarfs are hard to detect (some are frigid and emit no light at all), that number climbs higher and higher. The third-closest stellar system to us, Luhman 16, consists of two brown dwarfs. Despite being only 6.5 light-years away, the pair went undiscovered until 2013. In fact, of the 40 closest stars (loosely termed), 15 are brown dwarfs, and all but one were discovered this century.

Further studies of brown dwarfs and low-mass stars could help determine what causes some stars to thrive and others to fail. In the meantime, we're not mad. We're just disappointed.

Astronomers find two classes of gas giant planets – Astronomy Magazine

According to the NASA Exoplanet Archive, astronomers have found 3,498 confirmed exoplanets as of June 29, 2017. Of those planets, 679 have measured masses, and 281 have masses greater than 300 times that of Earth (Jupiter's mass is nearly 318 times that of Earth). As more planets circling other stars are discovered, astronomers are now hoping to use the increased statistics to understand how those planets form in the first place. And recent work has now found evidence for at least two formation mechanisms behind the growth of giant planets in extrasolar systems.

The work, published July 3 in Astronomy & Astrophysics, focuses on data gathered by a team at the Instituto de Astrofísica e Ciências do Espaço (IA) in Porto, Portugal. Based on information about both the exoplanets that have been discovered and the stars around which they circle, the team at IA found evidence for two types of giant planets, each with its own formation scenario.

"Our team, using public exoplanet data, obtained interesting observational evidence that giant planets such as Jupiter and its larger mass cousins, several thousand times more massive than the Earth (of which we do not have an example in the Solar System), form in different environments, and make two distinct populations," said Vardan Adibekyan of IA and Universidade do Porto, a co-author on the paper, in a press release.

These populations are divided by planetary mass: The first population is lower-mass giant planets less than four times the mass of Jupiter; the second is giant planets ranging from four to 20 Jupiter masses.

The team found that the lower-mass gas giants form around metal-rich stars via a process called core accretion, during which a rocky or icy core forms first and then attracts gas from the surrounding protoplanetary disk to form a gas giant. (In astronomers' parlance, any element heavier than helium is considered a metal; our Sun is considered a relatively metal-rich star.)

Alternatively, higher-mass gas giants seem to form via instabilities that occur in the protoplanetary disk, without first developing a core. Instead, these instabilities cause portions of the disk to condense into giant planets. These planets are also more likely to form around more massive but metal-poor stars.
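
The division described above can be sketched directly, using the study's 4- and 20-Jupiter-mass boundaries:

```python
# Assign a giant planet to one of the two proposed populations:
# lower-mass giants (< 4 Jupiter masses), linked to core accretion
# around metal-rich stars, and higher-mass giants (4-20 Jupiter
# masses), linked to disk instability around metal-poor stars.

def giant_population(mass_jupiters):
    if mass_jupiters < 4:
        return "lower-mass giant (core accretion)"
    if mass_jupiters <= 20:
        return "higher-mass giant (disk instability)"
    return "outside the study's giant-planet range"

print(giant_population(1))   # lower-mass giant (core accretion)
print(giant_population(10))  # higher-mass giant (disk instability)
```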

"The result now published suggests that both mechanisms may be at play, the first forming the lower mass planets, and the other one responsible for the formation of the higher mass ones," said Nuno Cardoso Santos of IA and the Faculdade de Ciências da Universidade do Porto, who led the research.

The fact that more than one formation scenario exists affects the type of planets we expect to see, as well as where we expect to see them. Furthermore, determining how planets form and the environmental factors that play a role during this process will help astronomers and planetary scientists better understand how our own solar system formed. Both current and future missions, including GAIA, TESS, and JWST, will continue to provide constraints and insight on planetary formation throughout the galaxy.

Rolling with the shutter – SYFY WIRE (blog)

A couple of weeks ago, I posted an article about a very weird video effect I saw when I was in a small airplane: The propeller looked like it was in several pieces, with parts of it apparently hovering off the plane. This is obviously not something physically happening to the propeller, but is instead an artifact, an effect occurring inside the camera.

In my explanation, I said it was due to two effects: shutter roll and aliasing. Shutter roll has to do with how the digital detector rapidly scans the scene row by row, which can cause weird warped distortions in rapidly moving objects. Aliasing is when the video frame rate of the camera beats, or resonates, with a cyclic motion in the scene (like a wheel spinning). Although I don't say so explicitly in the article, I wound up implying that aliasing was the bigger of the two effects.
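
The aliasing half of the explanation is just stroboscopic sampling: the camera records the propeller's angle once per frame, so rotation is only seen modulo one revolution per frame. Here's a small sketch of that folding (generic sampling math, not anything from the videos themselves):

```python
# Apparent motion of a spinning object sampled at a camera's frame
# rate. Per-frame advances are only distinguishable modulo one
# revolution, and the eye picks the smallest consistent motion, so
# fast rotations fold back -- the wagon-wheel (aliasing) effect.

def apparent_rev_per_frame(true_rev_per_s, fps):
    advance = (true_rev_per_s / fps) % 1.0  # revolutions per frame
    if advance > 0.5:
        advance -= 1.0  # alias to the nearest apparent motion
    return advance

# A propeller at 29 rev/s filmed at 30 frames/s seems to crawl backward:
print(round(apparent_rev_per_frame(29, 30), 3))  # -0.033
```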

Here's the video:

In the video, I actually didn't mention shutter roll for the simple reason that at the time it slipped my mind! Mea culpa. That's one reason I wrote the article: so I could add that in.

But my friend and fellow science communicator Destin, who makes the fantastic Smarter Every Day video series, has (with the help of another friend, Henry Reich of Minute Physics) just put out a new video that explains rolling shutter extremely well. I mean, like very very well. The footage is simply stunning, and you really should watch this whole thing, because it's so cool:

How about that? I've seen a lot of these effects before, but the guitar string and coin spin were new to me. Henry's animations really bring home how the scanning of the shutter stretches out or compresses the motions of objects in cameras.

They also put together a behind-the-scenes video with more technical details for those of you who, like me, love to dig into the bits (haha) of digital imaging:

So, the weird distortion is due to rolling shutter, and the multiple dissociated propeller blades are due, in part, to aliasing (note how when he changes the scan rate you see a different number of phantom blades).

At one point, near the beginning of that video, Destin says quite rightly that Henry is a wizard. He really drives home how this works.

I was surprised to feel a strong pang of nostalgia watching the second video. After I got my degree, I worked at NASA's Goddard Space Flight Center helping to calibrate a camera that was being built to go onboard the Hubble Space Telescope. Called STIS, for the Space Telescope Imaging Spectrograph, it was an incredibly advanced machine, with three separate detectors and a vast array of filters and spectroscopic settings. My job was to understand its performance: Literally, photons go in one end, and data (brightness, color info, and more) come out the other. What happens in between? If you want to fully understand what you're seeing in the images and spectra, you have to know what's happening inside the camera.

I used software (IDL, for those of you fluent in ancient languages) to do this analysis, and many times those of us working on this had to dream up odd ways of taking the data and manipulating it so we could understand it better. Watching Henry work reminded me strongly of that, and I'll admit it made me smile. The first idea I came up with to show the rolling shutter effect would have worked, but would've also been inefficient. Henry's method using a temporal gradient mask is way more efficient. Even as I write these words a part of my brain is chewing over how I'd do this in IDL.

You can take the programmer away from Hubble, but you can't take the programmer out of the brain.

So, I apologize for my first article not being more clear on how this works, and I'm delighted to be able to showcase Destin's and Henry's work here. And the point I made in the article remains the same: Seeing is not believing, and what you see is never, ever what you really get. Cameras change what's really happening, inevitably, and if you don't understand how, you'll be at the mercy of those who are trying to fool you when they say, "The camera doesn't lie."

Because oh my, yes it does.

Scientists Discover ‘Doubly-Charmed’ Subatomic Particle – Seeker

European scientists said Thursday they have discovered a new subatomic particle containing a never-before-seen combination of quarks, the most basic building blocks of matter.

The particle, a baryon dubbed Xicc++, contains two heavy "charm" quarks and one "up" quark, and has about four times the mass of a more familiar baryon, the proton.

The particle is predicted in the Standard Model of particle physics, and its discovery was "not a shock," said Matthew Charles of the LPNHE physics lab in Paris.

He is one of about 800 scientists to attach their names to the discovery by the Large Hadron Collider (LHC) of the European Organization for Nuclear Research (CERN).

The collider is most famous for discovering the Higgs boson, which confers mass on matter.

The new particle is the first seen with two such heavy quarks, said the team.

There are six types of quark, with exotic names such as "charm," "strange," and "beauty."

The "charm," "top," and "bottom" quarks are the heaviest types.

Quarks make up baryons such as protons and neutrons that comprise most of the mass in the known universe.

Baryons gather together in atoms, which form the molecules that constitute matter.

"This type of particle, these doubly-charmed baryons... they've been quite elusive," Charles told AFP.

They existed only briefly in the early Universe, and none are left today. To produce them in the lab requires an extreme concentration of energy, such as can be generated by the new, upgraded LHC.

The Xicc++ is an unstable baryon, said Charles. It lives for "a very small fraction of a second" before decaying into other, lighter particles.

Its discovery will allow scientists to continue testing the Standard Model of physics, the mainstream theory of the fundamental particles that make up matter and the forces that govern them.

It does not, however, explain dark matter, or why there is more matter than anti-matter in the universe.

Critically, the model is incompatible with Einstein's theory of general relativity: the force of gravity as we know it does not seem to work at the subatomic quantum scale.

"A big part of our work as a field is trying to put our finger on the place where the Standard Model breaks down," to eventually find alternative explanations, said Charles.

"We're testing things in as many different places as we can," he said. "One of the things we... will be able to do with particles like this is to use them... for making further tests."

New "Dark Universe" Telescope Detects Optical Signals of … – The Daily Galaxy (blog)

A state-of-the-art telescope for detecting optical signatures of gravitational waves - built and operated by an international research collaboration, led by the University of Warwick - has been officially launched.

GOTO is an autonomous, intelligent telescope, which will search for unusual activity in the sky, following alerts from gravitational wave detectors - such as the Advanced Laser Interferometer Gravitational-Wave Observatory (Adv-LIGO), which recently secured the first direct detections of gravitational waves. Gravitational waves are ripples in the fabric of space-time, created when massive bodies - particularly black holes and neutron stars - orbit each other and merge at very high speeds.

These waves radiate through the Universe at the speed of light, and analysing them heralds a new era in astrophysics, giving astronomers vital clues about the bodies from which they originated as well as long-awaited insight into the nature of gravity itself.

First predicted over a century ago by Albert Einstein, they have only been directly detected in the last two years, and astronomers' next challenge is to associate the signals from these waves with signatures in the electromagnetic spectrum, such as optical light.

This is GOTO's precise aim: to locate optical signatures associated with the gravitational waves as quickly as possible, so that astronomers can study these sources with a variety of telescopes and satellites before they fade away.

GOTO is a significant project for the Monash-Warwick Alliance, through which the construction of the telescope was partially funded. The Alliance combines the exceptional research and teaching capabilities of two world-class universities to meet the challenges of the 21st century.

Dr Danny Steeghs, from Warwick's Astronomy and Astrophysics Group, is leading the project. He comments: "After all the hard work put in by everyone, I am delighted to see the GOTO telescopes in operational mode at the Roque de los Muchachos observatory. We are all excited about the scientific opportunities it will provide." Dr. Duncan Galloway, from the School of Physics & Astronomy at Monash University, comments:

"GOTO is very significant for the Monash Centre for Astrophysics. We've invested strongly in gravitational wave astronomy over the last few years, leading up to the first detection announced last year, and the telescope project represents a fundamentally new observational opportunity.

"It's really satisfying seeing a research collaboration that we've build over many years coming to fruition in such an exciting way, and we couldn't have got here without the support of the Alliance and the participating universities."

GOTO is the latest addition to the University of Warwick's astronomical facility at La Palma, which includes the SuperWASP Exoplanet discovery camera - the most successful ground-based exoplanet discovery project in existence.

The Daily Galaxy via University of Warwick

Is Artificial Intelligence Over-Hyped? – MediaPost Communications

Worldwide spending on cognitive and artificial intelligence (AI) systems is predicted to increase 59.3% year-over-year to reach $12.5 billion by the end of 2017, according to an International Data Corporation (IDC) spending guide. That number is forecast to almost quadruple by 2020, when spending on AI is predicted to reach more than $46 billion.
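
Those two figures imply a steep compound annual growth rate, which is easy to sanity-check (treating $12.5 billion as the end-of-2017 level and $46 billion as the 2020 forecast, per IDC):

```python
# Implied compound annual growth rate (CAGR) from IDC's figures:
# $12.5B at the end of 2017 growing to a forecast $46B in 2020.
spend_2017 = 12.5  # billions of dollars
spend_2020 = 46.0
years = 2020 - 2017

cagr = (spend_2020 / spend_2017) ** (1 / years) - 1
print(f"{cagr:.0%}")  # 54%
```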

If personalization was the marketing buzzword of 2016, then 2017 is the year of artificial intelligence. As more cloud vendors tout their own AI systems, however, could AI be over-hyped?

Joe Stanhope, vice president and principal analyst at Forrester, says a cultural dissonance exists with AI, thanks to science fiction, and that many people have a preconceived notion about what artificial intelligence really is.

"A lot of people are talking a big game about AI and how it will change the world, but today it's only applied in extremely discrete ways," says Stanhope. "There's a lot of hype around it."

Stanhope says this overexposure creates a dissonance, compounded by marketers trust issues with AI.

"Marketers have a right to be skeptical about artificial intelligence," says Stanhope, adding that it is imperative that they begin to educate themselves about AI, since it is highly complex and difficult to understand without a doctoral degree in statistics, math or engineering.

Stanhope recommends that marketers become "educated about AI techniques and algorithms to develop a functional understanding of how it works." By educating themselves, marketers can be more critical of vendors' AI-driven applications.

"AI gets thrown out quite a bit, but marketers need to get to the point where they can ask, 'What can your AI do for me now?'" says Stanhope. "You need to be able to ask, and they [vendors] need to be able to define and validate that question."

Although it may not be as exciting as changing the world, Stanhope says there are very realistic applications for AI today. Humans have become the bottleneck in marketing, he says, and AI has the potential to make marketers and marketing better.

"AI is an efficiency play," says Stanhope, describing how it helps marketers manage data, experiment with segmentation, and takes out the human drudgery of menial tasks. He recommends that marketers dip their toes in AI by applying it to one existing use case first, and then broadening the scope when new use cases become available and trust is built.

"You're not turning over the whole marketing team to a computer," says Stanhope.

Stanhope also recommends that AI marketers investigate whether their ESP offers some sort of AI function, as it is easier to evaluate an add-on solution than find a completely new product.

A ‘Neurographer’ Puts the Art in Artificial Intelligence – WIRED

ttKaC,C/U,b; 5h Um(E6Yg:|XMm%Bh&"&VLG1jm'*@5xOpoJSZ;?H~a:/jbqnVu6q nuzCDwnc=Kntz5xY.LP*+kr#a+{J+vPetQE1Y#y!6r)Cxzn!$c7 E;gVlZvXEV}CNA$rx{1`ao!0ISTp"$NE(QV9BFh !)M &^K(,C3u^k[W=VQ T#Ez,*}:xZ #MZZvZ$KBE+w&rm(v6{ph$/`cKE$rL f( 6! &Bi%qW1WPO2VzPEuX.?NBA_n^?Jn9InZ5Q/MWQ2[^k_plu-Ze)_2PDQfh$h+nYim"}6GRHa f@zE[oag.*nO(e,dV rxcxC r/v(Gx&A6$oRtQ:F9567=v8D a!}Z[866JE:%+$#3S]5&gTm-Vta&/sTXIg|TAzb_RCLpZ"+L<5

Go here to see the original:

A 'Neurographer' Puts the Art in Artificial Intelligence - WIRED

I, Alexa: Should we give artificial intelligence human rights? – Digital Trends

A few years ago, the subject of AI personhood and legal rights for artificial intelligence would have been something straight out of science fiction. In fact, it was.

Douglas Adams' second Hitchhiker's Guide to the Galaxy book, The Restaurant at the End of the Universe, tells the story of a futuristic smart elevator called the Sirius Cybernetics Corporation Happy Vertical People Transporter. This artificially intelligent elevator works by predicting the future, so it can appear on the right floor to pick you up before you even know you want to get on, thereby eliminating all the tedious chatting, relaxing, and making friends that people were previously forced to do whilst waiting for elevators.

The ethics question, Adams explains, comes when the intelligent elevator becomes bored of going up and down all day, and instead decides to experiment with moving from side to side as a sort of existential protest.

We don't yet have smart elevators, although judging by the kind of lavish headquarters tech giants like Google and Apple build for themselves, that may just be because they've not bothered sharing them with us yet. In fact, as we've documented time and again at Digital Trends, the field of AI is currently making possible a bunch of things we never thought realistic in the past, such as self-driving cars or Star Trek-style universal translators.

Have we also reached the point where we need to think about rights for AIs?

It's pretty clear to everyone that artificial intelligence is getting closer to replicating the human brain inside a machine. At a low-resolution level, we currently have artificial neural networks with more neurons than creatures like honey bees and cockroaches, and they're getting bigger all the time.

Higher up the food chain are large-scale projects aimed at creating more biofidelic algorithms, designed to replicate the workings of the human brain rather than simply being inspired by the way we lay down memories. Then there are projects designed to upload consciousness into machine form, or efforts like the so-called OpenWorm project, which sets out to recreate the connectome (the wiring diagram of the central nervous system) of the tiny hermaphroditic roundworm Caenorhabditis elegans, the only connectome of a living creature humanity has so far fully mapped.

In a 2016 survey of 175 industry experts, the median expert expected human-level artificial intelligence by 2040, and 90 percent expected it by 2075.

Before we reach that goal, as AI surpasses animal intelligence, we'll have to begin to consider how AIs compare to animals, and the kind of rights we might afford them through ethical treatment. Thinking that it's cruel to force a smart elevator to move up and down may not turn out to be too far-fetched; a few years back, English technology writer Bill Thompson wrote that any attempt to develop AI coded not to hurt us "reflects our belief that an artificial intelligence is and always must be at the service of humanity rather than being an autonomous mind."

The most immediate question we face, however, concerns the legal rights of an AI agent. Simply put, should we consider granting them some form of personhood?

This is not as ridiculous as it sounds, nor does it suggest that AIs have graduated to a particular status in our society. Instead, it reflects the complex reality of the role that they play and will continue to play in our lives.

At present, our legal system largely assumes that we are dealing with a world full of non-smart tools. We may talk about the importance of gun control, but we still hold a person who shoots someone with a gun responsible for the crime, rather than the gun itself. If the gun explodes on its own as the result of a faulty part, we blame the company which made the gun for the damage caused.

So far, this thinking has largely been extrapolated to cover the world of artificial intelligence and robotics. In 1984, the owners of a U.S. company called Athlone Industries wound up in court after their robotic pitching machines for batting practice turned out to be a little too vicious. The case is memorable chiefly because of the judge's proclamation that the suit be brought against Athlone rather than the batting bot, because "robots cannot be sued."

This argument held up in 2009, when a U.K. driver was directed by his GPS system to drive along a narrow cliffside path, resulting in him being trapped and having to be towed back to the main road by police. While he blamed the technology, a court found him guilty of careless driving.

There are multiple differences between AI technologies of today (and certainly the future) and yesterday's tech, however. Smart devices like self-driving cars or robots won't just be used by humans but deployed by them, after which they act independently of our instructions. Smart devices, equipped with machine learning algorithms, gather and analyze information by themselves and then make their own decisions. That can make it difficult to blame the creators of the technology, too.

As David Vladeck, a law professor at Georgetown University in Washington, D.C., has pointed out in one of the few in-depth case studies on this subject, the sheer number of individuals and firms that participate in the design, modification, and incorporation of an AI's components can make it tough to identify who the responsible party is. That counts double when you're talking about black-boxed AI systems that are inscrutable to outsiders.

Vladeck has written: "Some components may have been designed years before the AI project had even been conceived, and the components' designers may never have envisioned, much less intended, that their designs would be incorporated into any AI system, much less the specific AI system that caused harm. In such circumstances, it may seem unfair to assign blame to the designer of a component whose work was far removed in both time and geographic location from the completion and operation of the AI system. Courts may hesitate to say that the designer of such a component could have foreseen the harm that occurred."

Awarding an AI the status of a legal entity wouldn't be unprecedented. Corporations have long held this status, which is why a corporation can own property or be sued, rather than this having to be done in the name of its CEO or executive board.

Although it hasn't been tested, Shawn Bayern, a law professor at Florida State University, has pointed out that AI may technically already have this status, thanks to a loophole that allows it to be put in charge of a limited liability company, thereby making it a legal person. This might also occur for tax reasons, should a proposal like Bill Gates' robot tax ever be taken seriously on a legal level.

It's not without controversy, however. Granting AIs this status would stop creators from being held responsible when an AI carries out an action its creator did not explicitly sanction. But it could also encourage companies to be less diligent with their AI tools, since they could technically fall back on the excuse that those tools acted outside their wishes.

There is also no way to punish an AI, since punishments like imprisonment or death mean nothing to it.

"I'm not convinced that this is a good thing, certainly not right now," Dr. John Danaher, a law professor at NUI Galway in Ireland, told Digital Trends about legal personhood for AI. "My guess is that for the foreseeable future this will largely be done to provide a liability shield for humans and to mask anti-social activities."

It is a compelling area of examination, however, because it doesn't rely on any benchmarks being achieved in terms of ever-subjective consciousness.

"Today, corporations have legal rights and are considered legal persons, whereas most animals are not," Yuval Noah Harari, author of Sapiens: A Brief History of Humankind and Homo Deus: A Brief History of Tomorrow, told us. "This is even though corporations clearly have no consciousness, no personality, and no capacity to experience happiness and suffering, whereas animals are conscious entities."

"Irrespective of whether AI develops consciousness, there might be economic, political, and legal reasons to grant it personhood and rights in the same way that corporations are granted personhood and rights. Indeed, AI might come to dominate certain corporations, organizations, and even countries. This is a path only seldom discussed in science fiction, but I think it is far more likely to happen than the kind of Westworld and Ex Machina scenarios that dominate the silver screen."

At present, these topics still smack of science fiction, but, as Harari points out, they may not stay that way for long. Given how AIs are used in the real world, and the very real attachments people form with them, questions such as who is responsible if an AI causes a person's death, or whether a human can marry his or her AI assistant, are surely ones that will be grappled with during our lifetimes.

"The decision to grant personhood to any entity largely breaks down into two sub-questions," Danaher said. "Should that entity be treated as a moral agent, and therefore be held responsible for what it does? And should that entity be treated as a moral patient, and therefore be protected against certain interferences and violations of its integrity? My view is that AIs shouldn't be treated as moral agents, at least not for the time being. But I think there may be cases where they should be treated as moral patients. I think people can form significant attachments to artificial companions and that consequently, in many instances, it would be wrong to reprogram or destroy those entities. This means we may owe duties to AIs not to damage or violate their integrity."

In other words, we shouldn't necessarily allow companies to sidestep the question of responsibility when it comes to the AI tools they create. As AI systems are rolled out into the real world in everything from self-driving cars to financial traders to autonomous drones and robots in combat situations, it's vital that someone is held accountable for what they do.

At the same time, it's a mistake to think of AI as having the same relationship with us that we enjoyed with previous non-smart technologies. There's a learning curve here, and if we're not yet technologically at the point where we need to worry about cruelty to AIs, that doesn't mean it's the wrong question to ask.

So stop yelling at Siri when it mishears you and asks whether you want it to search the web, alright?

See original here:

I, Alexa: Should we give artificial intelligence human rights? - Digital Trends

Is artificial intelligence a (job) killer? – HuffPost

There's no shortage of dire warnings about the dangers of artificial intelligence these days.

Modern prophets, such as physicist Stephen Hawking and investor Elon Musk, foretell the imminent decline of humanity. With the advent of artificial general intelligence and self-designed intelligent programs, new and more intelligent AI will appear, rapidly creating ever smarter machines that will, eventually, surpass us.

When we reach this so-called AI singularity, our minds and bodies will be obsolete. Humans may merge with machines and continue to evolve as cyborgs.

Is this really what we have to look forward to?

AI, a scientific discipline rooted in computer science, mathematics, psychology, and neuroscience, aims to create machines that mimic human cognitive functions such as learning and problem-solving.

Since the 1950s, it has captured the public's imagination. But, historically speaking, AI's successes have often been followed by disappointments caused, in large part, by the inflated predictions of technological visionaries.

In the 1960s, one of the founders of the AI field, Herbert Simon, predicted that "machines will be capable, within twenty years, of doing any work a man can do." (He said nothing about women.)

Marvin Minsky, a neural network pioneer, was more direct. "Within a generation," he said, "the problem of creating artificial intelligence will substantially be solved."

But it turns out that Niels Bohr, the early 20th-century Danish physicist, was right when he (reportedly) quipped that "prediction is very difficult, especially about the future."

Today, AI's capabilities include speech recognition, superior performance at strategic games such as chess and Go, self-driving cars, and revealing patterns embedded in complex data.

These talents have hardly rendered humans irrelevant.

But AI is advancing. The most recent AI euphoria was sparked in 2009 by much faster training of deep neural networks.

Deep neural networks are large collections of connected computational units called artificial neurons, loosely analogous to the neurons in our brains. To train such a network to "think," scientists provide it with many solved examples of a given problem.

Suppose we have a collection of medical-tissue images, each coupled with a diagnosis of cancer or no-cancer. We would pass each image through the network, asking the connected neurons to compute the probability of cancer.

We then compare the network's responses with the correct answers, adjusting the connections between neurons with each failed match. We repeat the process, fine-tuning all along, until most responses match the correct answers.

Eventually, this neural network will be ready to do what a pathologist normally does: examine images of tissue to predict cancer.

This is not unlike how a child learns to play a musical instrument: she practices and repeats a tune until she perfects it. The knowledge is stored in the neural network, but explaining its mechanics is not easy.
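Stripped to its essentials, that train-compare-adjust loop fits in a few lines. The sketch below is a toy illustration, not a real diagnostic model: a single artificial neuron trained by gradient descent, with made-up feature vectors standing in for tissue images.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, epochs=2000, lr=0.5, seed=0):
    """examples: list of (features, label), where label 1 = cancer, 0 = no cancer."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in examples[0][0]]
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            # forward pass: the network's probability of cancer
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # compare the response with the correct answer
            # adjust each connection a little toward a better match
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
```

Real networks stack millions of such units in layers and propagate the error backwards through all of them, but the loop (predict, compare, nudge, repeat) is the same.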

Networks with many layers of neurons (hence the name "deep" neural networks) only became practical when researchers started using many parallel processors on graphics chips for their training.

Another condition for the success of deep learning is the availability of large sets of solved examples. Mining the internet, social networks, and Wikipedia, researchers have created large collections of images and text, enabling machines to classify images, recognise speech, and translate language.

Already, deep neural networks are performing these tasks nearly as well as humans.

But their good performance is limited to certain tasks.

Scientists have seen no improvement in AI's understanding of what images and text actually mean. If we showed a Snoopy cartoon to a trained deep network, it could recognise the shapes and objects (a dog here, a boy there) but would not decipher its significance, or see the humour.

We also use neural networks to suggest better writing styles to children. Our tools suggest improvements in form, spelling, and grammar reasonably well, but are helpless when it comes to logical structure, reasoning, and the flow of ideas.

Current models do not even understand the simple compositions of 11-year-old schoolchildren.

AI's performance is also restricted by the amount of available data. In my own AI research, for example, I apply deep neural networks to medical diagnostics, which has sometimes resulted in slightly better diagnoses than in the past, but nothing dramatic.

In part, this is because we do not have large collections of patients' data to feed the machine. And the data hospitals currently collect cannot capture the complex psychophysical interactions causing illnesses like coronary heart disease, migraines, or cancer.

So, fear not, humans. Febrile predictions of AI singularity aside, we're in no immediate danger of becoming irrelevant.

AI's capabilities drive science fiction novels and movies and fuel interesting philosophical debates, but we have yet to build a single self-improving program capable of general artificial intelligence, and there's no indication that intelligence could be infinite.

Deep neural networks will, however, indubitably automate many jobs. AI will take our jobs, jeopardising the existence of manual labourers, medical diagnosticians and, perhaps someday, to my regret, computer science professors.

Robots are already conquering Wall Street. Research shows that artificial intelligence agents could lead some 230,000 finance jobs to disappear by 2025.

In the wrong hands, artificial intelligence can also cause serious danger. New computer viruses can detect undecided voters and bombard them with tailored news to swing elections.

Already, the United States, China, and Russia are investing in autonomous weapons using AI in drones, battle vehicles, and fighting robots, leading to a dangerous arms race.

Now that's something we should probably be nervous about.

Marko Robnik-Šikonja, Associate Professor of Computer Science and Informatics, University of Ljubljana

This article was originally published on The Conversation. Read the original article.

Read more here:

Is artificial intelligence a (job) killer? - HuffPost

What Is Artificial Intelligence, Really? – HuffPost UK

In popular media, "Artificial Intelligence" is by turns godlike, monstrous, uncannily human and a hoax; it inspires both awe and deep suspicion - it's unnatural.

Researchers who actually develop AI technologies - like those at PROWLER.io - prefer narrower, more useful terms like Machine Learning (ML) and decision theory. They're wary of the catch-all phrase "Artificial Intelligence", in part because human intelligence is itself largely artificial, an encoded system of man-made concepts, rules of thumb, recipes, customs, laws, even whole cultures. Humans have always used thinking tools, rules and systems to keep chaos at bay. Turn off the traffic lights in central London and you'll soon see how far "natural" intelligence gets us in a complex system.

Following suit, AI has traditionally made decisions using painstakingly coded "if x then y" rules that sometimes appear intelligent. This works well in narrow, predictable, static environments like relatively simple games and machines but not in big, surprising, dynamic ones like cities, where trying to dictate every decision is madness.

The smartest complex systems are in fact made of well-coordinated autonomous individuals making their own decisions. That's how bee colonies and free societies work. In Machine Learning, those individuals are "agents": statistical entities that operate intelligently within computer models of environments like games, self-driving cars, and smart cities.

PROWLER.io's agents get their smarts from three core technologies:

Probabilistic models: Powerful statistical tools can generate flexible models of virtual or physical environments. Agents operate both in and on those models, effectively programming themselves and updating the models as they go along. No model is perfect; uncertainties and hidden relationships abound. One powerful statistical model, Gaussian Processes, can help estimate, account for and even reduce uncertainty, allowing the system to uncover hidden relations between events.
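As a concrete (and heavily simplified) illustration of that last point, the snippet below computes a Gaussian Process posterior from scratch with NumPy: where the model has observed data, its predictive uncertainty collapses; far from the data, it stays high. The kernel choice and parameters here are arbitrary assumptions for the demo, not anything from PROWLER.io's stack.

```python
import numpy as np

def rbf(a, b, length_scale=1.0):
    # squared-exponential kernel: similarity decays with distance
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_obs, y_obs, x_new, noise=1e-6):
    # condition a zero-mean GP prior on the observed points
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(x_obs, x_new)
    mean = Ks.T @ np.linalg.solve(K, y_obs)
    cov = rbf(x_new, x_new) - Ks.T @ np.linalg.solve(K, Ks)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std  # a prediction plus a per-point uncertainty estimate

x_obs = np.array([0.0, 1.0, 2.0])
y_obs = np.sin(x_obs)
mean, std = gp_posterior(x_obs, y_obs, np.array([1.0, 10.0]))
# std at the observed point x=1.0 is tiny; at x=10.0, far from all data,
# it stays near the prior's value of 1.0
```

That per-point uncertainty is what lets an agent know not just what it predicts, but how much to trust each prediction.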

Reinforcement Learning (RL): Agents can learn by acting in useful ways that are then reinforced numerically, much as a dog learns to sit when rewarded (reinforced) with a treat. Over time, the agent teaches itself to imitate, plan and perform sequences of actions, all without being given explicit instructions.
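A minimal version of that reward loop is tabular Q-learning. The toy below is my own illustration, not PROWLER.io code: an agent learns to walk a five-cell corridor to a goal, where the only feedback is a reward of 1 on arrival, the numerical equivalent of the dog's treat.

```python
import random

N, GOAL = 5, 4  # five-cell corridor, goal (the "treat") at the far right

def step(state, action):
    # action 0 moves left, action 1 moves right
    nxt = max(0, state - 1) if action == 0 else min(N - 1, state + 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

def train(episodes=500, lr=0.5, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N)]  # value estimate per (state, action)
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # mostly exploit the best-known action, sometimes explore
            if rng.random() < eps:
                action = rng.randrange(2)
            else:
                action = 0 if q[state][0] > q[state][1] else 1
            nxt, reward, done = step(state, action)
            # reinforce: nudge the estimate toward reward + discounted future value
            q[state][action] += lr * (reward + gamma * max(q[nxt]) - q[state][action])
            state = nxt
    return q
```

After training, the higher-valued action in every cell is "right": the agent has taught itself a plan without ever being given explicit instructions.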

Multi-agent Systems (MAS): Agents can also cooperate and compete using strategies adapted from game theory that benefit both themselves and the system as a whole. This helps them infer what other agents are doing and adjust for the often surprising, irrational behaviour of humans. The result is a safe, efficient, multi-agent system that is smarter than the sum of its parts.

The possible applications of machine decision making are virtually limitless, but let's focus on three examples:

Gaming: ML will soon tackle the thorniest problem in gaming: maintaining player interest. The key here is offering an optimal level of challenge, between getting bored when the game is too easy and frustrated when it's too hard. The next generation of ML will open up whole new classes of games with dynamic, evolving characters and storylines that adjust to each player's style of play and provide personalized interactions. Really smart zombies, anyone? Development costs and time to market will plummet when testing is handled by teams of humans working with agents, who'll do the boring, repetitive jobs a thousand times faster than manual testers.

Autonomous Vehicles: Get used to it; self-driving cars will increasingly take over our roads. Jaguar Land Rover is already testing a vehicle that is "nearly self driving" in city conditions. "If x then y" rules are a non-starter here: you can't program or script a vehicle to avoid ice patches, stray dogs or pedestrians. Put simply, probabilistic models will help a car "understand" itself and its environment, reinforcement learning will teach it to drive, and multi-agent systems will ensure it safely shares the road with other drivers, human and AI. Just as in gaming, ML can provide simulated environments where new technologies can be safely trained, tested and examined by regulators.

Smart cities: Our increasingly complex cities need to get a lot smarter. ML systems will help regulators identify weak points like terror targets or fire hazards and ensure first responders intervene promptly. Well before construction begins on projects like the new runway at Heathrow, ML-driven simulations will help planners design and test changes to infrastructure while taking into account the impacts of weather, pollution, people and vehicle traffic.

All this is but a small glimpse of the foreseeable future of Machine learning. It's the next few steps in a history of human intelligence that's always been driven by artificial information, technology and culture, by what we create as much as by what we are.

See more here:

What Is Artificial Intelligence, Really? - HuffPost UK

The Robots are Coming: Is AI the Future of Biotech? – Labiotech.eu (blog)

AI, or artificial intelligence, has taken root in biotech. In this article, a contributor explores its newfound niches in the industry.

Artificial intelligence (AI) and machine learning (ML) have become ubiquitous in tech startups, fueled largely by the increasing availability and amount of data and cheaper, more powerful computers. Now, if you are a new tech startup, ML or AI capabilities represent your minimum ticket to enter the industry. Over the past few years, AI and ML have started to peek their heads into the realm of biotech, due to an analogous transformation of biotech data.

We are beginning to see partnerships form between Big Pharma and biotech startups that employ AI and ML for drug discovery and other purposes. Positive results have already come out of joint projects, notably the delay in the onset of motor neuron disease in an efficacy study conducted by SITraN on a drug candidate proposed by BenevolentBIO.

With these results in mind, we must ask ourselves: what is the role of AI and ML in biotech now, and what will it be in the future?

Diagnostic assays today are usually developed once and only updated when there is a significant paradigm shift. Because of this, there are missed opportunities to improve the assay when the true results of previous diagnoses become known. However, ML techniques can immediately use the true result to improve the diagnostic test. This means that the more diagnostic tests that are run, the more accurate the test can become.
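A toy sketch of that idea, with entirely made-up numbers: a one-biomarker classifier that starts from rough calibration guesses and folds each confirmed diagnosis back into its class means, so later predictions get better. A real assay would involve far more than a running mean; this only shows the update loop.

```python
class OnlineDiagnostic:
    """Hypothetical one-biomarker test that refines itself as
    confirmed (true) results arrive."""

    def __init__(self, pos_guess=1.0, neg_guess=0.0):
        # initial calibration guesses for the two classes
        self.mean = {True: pos_guess, False: neg_guess}
        self.count = {True: 1, False: 1}

    def predict(self, value):
        # positive if the reading is closer to the positive-class mean
        return abs(value - self.mean[True]) < abs(value - self.mean[False])

    def record_truth(self, value, truth):
        # fold the confirmed diagnosis back in via an incremental running mean
        self.count[truth] += 1
        self.mean[truth] += (value - self.mean[truth]) / self.count[truth]

test = OnlineDiagnostic()
before = test.predict(1.2)        # with the rough guesses: flagged positive
for v in (5.0, 5.0, 5.0):
    test.record_truth(v, True)    # confirmed positives actually read near 5
for v in (0.5, 0.5, 0.5):
    test.record_truth(v, False)   # confirmed negatives read near 0.5
after = test.predict(1.2)         # the same reading is now classified negative
```

The more confirmed results the test sees, the better its class estimates become, which is exactly the property a one-shot, frozen assay lacks.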

Currently, the most obvious implementation of ML techniques for diagnostics lies in genetic analysis. Sophia Genetics, the Swiss startup founded in 2011, exemplifies the state of the art. They intake a biopsy or blood sample from the patient, process the sample, and then analyze the data with their powerful analytical AI algorithms.

In Sophia Genetics' case, the data analysis takes a few days with its platform, rather than several months like the current standard. While speed is clearly a benefit, the long-term advantage is that the machine learning algorithm behind the AI analysis enables the diagnostic process to become smarter with each iteration.

Besides genetic analysis, ML techniques can be used in any diagnostic that can be digitized, allowing the algorithm to determine the correct features to embed into its final decision-making process. DNAlytics demonstrates another use of ML in diagnostics, using the advanced computations to help diagnose rheumatoid arthritis.

Tedious tasks done in the lab, such as designing constructs for gene editing or data analysis, are slowly being handed over to AI programs as well, as a sort of secretarial work. Desktop Genetics has created a novel platform that uses AI to design gene editing constructs for CRISPR. Their gene editing platform follows the entire process, from selecting proper sgRNA molecules to analyzing the data of the experiment.

The power of AI allows them to more quickly and effectively construct CRISPR libraries that may be needed for a single experiment or an entire lab. Especially for people who do not have much experience working with CRISPR, this platform is valuable not only to expedite the process from designing to conducting an experiment but also to ensure that the guides are as effective as they can be, improving the efficacy of gene editing.

For scientists who want quicker and/or easier data analysis, there are startups focused on using AI to look at many types of data. H2O.ai is an open-source platform on which people can analyze data using thousands of different statistical analysis models. While H2O.ai is industry-agnostic, there are a few startups focused specifically on healthcare and biotech data, alleviating the burden of data analysis from healthcare providers.

More and more data is being generated, but not all of it can currently be used, much less used appropriately. These startups aim to reduce the bottleneck at data analysis to take advantage of the rich datasets that exist.

Arguably, the most exciting advances in biotech using AI and ML have been in drug discovery. Current drug discovery economics are unsustainable, with costs now averaging over $2.5B and 12 years of trials for a single drug. The low-hanging fruit have already been picked, yet new approaches have not risen to reach the higher-hanging ones.

However, AI and ML hope to be the solution that Big Pharma has been looking for. The computing technologies promise to make drug discovery cheaper and quicker, effectively making the time needed for lead discovery a small fraction of what it is today. Partnerships are already forming between young startups and pharma giants, and we should expect more to come at an increasing rate.

Several approaches exist for startups to make these advances happen. Some startups are focused on leveraging the increasing amount of genetic data and cheap sequencing to approach drug discovery from a genetics standpoint. Others are employing computer vision to analyze images of cells that have been treated with drug compounds, which eliminates the need for scores of PhDs to painstakingly peer into a microscope and screen for compounds of interest.

A few companies are taking a structure-based approach to drug discovery, using ML to find small molecules that could provide therapeutic benefits based on known target structures. Lastly, startups like BenevolentBIO use AI to pore over the vast, existing scientific data. With those results, they can make use of previously conducted studies to better inform future experiments and clue researchers into possible missteps in previous trials or even better designations for drugs.

With AI and ML seeping into more and more parts of biotech, what will the future bring? Lab assistant startups and diagnostics are trying to make healthcare providers and scientists more effective at their job, and I foresee the incorporation of tech making the pie bigger for almost everyone in these spaces.

For drug discovery, there seems to be a less symbiotic relationship at play with their customers. The startups act as Drug-Candidates-as-a-Service (DCaaS) companies, selling their findings to those who have the capital, both financial and human, to push the candidates further down the research pipeline.

Yet, aside from IBM's Watson initiative, large companies seem content to outsource this lead discovery step in R&D. Are they short-sighted? What happens to current behemoths when these small startups creep further into the R&D pipeline, conducting their own clinical trials and eventually selling the drugs they find themselves?

If we assume that the infusion of AI and ML into biotech will only increase, there seem to be only two outcomes: large companies with cash to spare will start acquiring these startups early and embedding the computational techniques into their current R&D structure, or current market leaders will slowly lose their grip on drug development to tech-enabled biotechs and become content producing generic drugs, at best.

With all of the good aspects of AI in biotech, there are a few challenges that could put a damper on progress. Most notably, the large volumes of data are often stored in disparate or incompatible formats, making it difficult to consolidate results and draw upon the entire wealth of data. Furthermore, data privacy is also a concern, particularly for companies using cloud computing to analyze patient-derived data, but at least in the US trailblazers have already jumped this hurdle.

Overall, AI and ML are coming into biotech and are here to stay. What exactly will happen is still up for debate, and AI biotech companies are still being formed, with good reason. The future of biotech is being written at this moment. The question is: who is writing it, and what are they writing?

Michael Snyder is an MBA candidate at the Stanford Graduate School of Business and was formerly a bioengineering researcher at EPFL and Boston University.


See the article here:

The Robots are Coming: Is AI the Future of Biotech? - Labiotech.eu (blog)

XCOR Aerospace lays off remaining employees – SpaceNews

XCOR Aerospace suspended work on the Lynx, a two-seat reusable suborbital spaceplane, in 2016. Credit: XCOR Aerospace

WASHINGTON – XCOR Aerospace, a company developing rocket engines and a suborbital spaceplane, has laid off its remaining employees but is continuing efforts to raise funding to maintain at least some of its projects.

In a statement provided to SpaceNews July 5, Michael Blum, a member of the company's board of directors who is also serving as acting chief executive, said some critical employees would be retained as contractors as the company attempts to stay alive.

"Due to adverse financial conditions XCOR had to terminate all employees as of 30 June 2017," Blum said in the statement. "XCOR management will retain critical employees on a contract basis to maintain the company's intellectual property and is actively seeking other options that would allow it to resume full employment and activity."

Blum did not disclose how many employees were laid off or how many would be kept on as contractors. In May 2016, XCOR laid off nearly half of its 50 to 60 employees as it devoted its resources to a liquid hydrogen/liquid oxygen engine the company was developing under contract with United Launch Alliance.

At that time, XCOR said it was suspending work on Lynx, a two-seat suborbital spaceplane the company had been working on for several years to serve the space tourism and research markets. Company officials said earlier this year that XCOR had not entirely abandoned the Mark 1 prototype vehicle that had been under construction at its Mojave, California, facility.

"Although we have advanced the program with much of our recent efforts, completion of the prototype is funding dependent," Marco Martinez-Venturi, head of astronaut relations at the company, told SpaceNews in March.

With its employees laid off, company sources say management and investors are working to save at least some of XCOR's products, keeping the company from folding entirely.

The company is also without a permanent chief executive. Jay Gibson, hired as chief executive in March 2015, left the company at the end of June. The Trump administration nominated Gibson June 16 to be the Deputy Chief Management Officer at the Department of Defense.

Blum, the acting chief executive, formerly was chief financial officer and a co-founder of Firefly Space Systems, a company that was seeking to develop a small launch vehicle. That company furloughed its staff in September 2016 after a planned funding round fell through. In March, it announced the sale of virtually all of its assets.

XCOR's decision to lay off its remaining employees could also jeopardize a $10 million financial incentive package it received in 2012 to move the company to Midland, Texas. Brent Hilliard, chairman of the board of the Midland Development Corporation, which provided the incentive package, told the Midland Reporter-Telegram that the board will meet with XCOR July 6 to discuss the company's status.

See original here:

XCOR Aerospace lays off remaining employees - SpaceNews

Aerospace and Defense: Who are the Digital Frontrunners? – IndustryWeek

Which aerospace and defense firms are investing in the digitization of their businesses? According to a new study by the Boston Consulting Group (BCG), the answer is just about everyone. But simply spending money on digital capabilities doesn't make you a leader in this rapidly changing sector.

"How much a company invests in digital generally does not influence whether it is a digital frontrunner or a digital follower," says Greg Mallory, a BCG senior partner and a coauthor of the report. "Rather, the frontrunners in the race to extract value from digital are those A&D companies that define an enterprise-wide digital vision to guide their investment decisions across functions and that establish the right supporting structures, roles, and culture."

The study of 110 senior executives and managers at A&D companies found that nearly 100% of the respondents reported their digital investments were yielding positive results. These leaders appeared to regard digital investments as a low-risk move, as nearly half (45%) were willing to invest without a short-term business case.

Respondents fell into two groups, frontrunners or followers, on the basis of their self-reported success, BCG stated, in using digital in three categories: to improve operations, increase revenues and drive innovation. Among all respondents, 81% invested in digital to improve operations, while 49% did so to increase revenue and 52% to innovate. Of the companies that invested in all three categories, 58% are considered frontrunners.

While the level of spending did not generally correlate to being a digital leader, BCG found that 41% of frontrunners had higher spending levels on operations, while 25% had lower spending.

BCG said there are eight technology drivers reshaping aerospace and defense:

Companies are generally investing in similar technologies, BCG found. "3D printing for prototyping, simulation-based design, predictive analytics, and real-time monitoring are the most commonly implemented technologies," the firm stated.

Frontrunners apply digital technologies more broadly across their functions. They implement a digital technology in an average of 13 functions, compared with only nine functions for followers. BCG reported that a company is four times more likely to be a frontrunner in operations if it applies digital across the life cycle of the product, from initial program management to aftermarket and sustainment activities.

To support this broader implementation, BCG found digital frontrunners are developing an organization to support this transition. This includes a digital organization placed in the corporate offices or in business units, or as a shared service. The biggest differentiator between frontrunners and followers, BCG reported, is having a chief digital officer (CDO), although few companies have created this leadership position.

Both A&D frontrunners and followers ranked culture as the top challenge in adopting digital. After that, frontrunners cited finding the right digital solution, demonstrating the benefits of digital and identifying the right technology provider or partner. Followers said demonstrating the benefits of digital was their second most important challenge.

BCG warned that there would not necessarily be a lasting advantage for today's A&D frontrunners.

"As more companies adopt similar approaches and digital tools become cheaper, the first-mover advantage erodes," the study stated. "To stay ahead as competitors catch up, companies must ensure that their digital strategies continuously evolve."

Originally posted here:

Aerospace and Defense: Who are the Digital Frontrunners? - IndustryWeek

Minister questioned a second time in Israel Aerospace Industries corruption probe – The Jerusalem Post

MK Haim Katz (Likud). (photo credit: Marc Israel Sellem)

Labor and Social Services Minister Haim Katz was questioned for a second time at Lahav 433 headquarters in Lod on Thursday as a suspect in an ongoing corruption probe into Israel's largest state-owned aviation manufacturer, Israel Aerospace Industries, police said.

Katz is suspected of ethical violations and threats, according to police. The suspicions reportedly involve Katz, a member of the Likud Party and former chairman of the National Workers Union at IAI, allegedly threatening people to join the Likud and vote for him in party primaries.

Katz was questioned from Thursday morning until shortly before 3:00 p.m. Last week he was questioned for around five hours. He denies any wrongdoing.

The IAI investigation became public in March, after 13 people were arrested on suspicion of corruption, offenses which included aggravated fraud, money laundering, theft by a public servant, fraud and breach of trust. Later in March, Katz's son Yair, who is chairman of the engineering sector of the IAI workers committee, was arrested on suspicion of granting benefits to IAI employees in exchange for them joining the Likud Party.

Retired IDF Brig.-Gen. Amal Asad was also among those arrested, on suspicion of receiving bribes from businessmen at technology company DruzeNet to further the company's interests with IAI.

Asad denies any wrongdoing.

Attorney Illan Bombach, who represents Yair Katz, told The Jerusalem Post in March that his client denied the allegations against him and was cooperating fully with the police. Bombach argued that the investigation was motivated by interests seeking to overthrow Katz and that his client has no control over IAI employees' futures. Furthermore, Bombach argued, police provided no evidence of any correlation between Likud membership and career advancement at the IAI.

The arrests in what is termed Case 630 came after an extensive, nearly yearlong undercover investigation by the Lahav 433 anti-fraud unit in cooperation with the Tax Authority and the Ministry of Defense Security Authority, or Malmab, an internal investigation branch of the Defense Ministry, accompanied by the State Attorney's Office Economic Department.

"The investigation comprised a large number of sub-allegations raising suspicion of corruption offenses including aggravated fraud, money laundering, theft by a public servant, fraud and breach of trust," the police's Intelligence and Investigations Division said in a statement last week.

According to police, the investigation raised the suspicion of "systematic criminal behavior and deep corruption" seemingly commonplace in Israel Aerospace Industries.


Read more:

Minister questioned a second time in Israel Aerospace Industries corruption probe - The Jerusalem Post

Britain’s aerospace sector fears crash landing – The New European

PUBLISHED: 11:14 06 July 2017

Angela Jameson

The Rolls-Royce XWB engine assembly line at the company's aero engine factory in Derby.

PA Archive/PA Images


Some people go to Paris to buy beautiful clothes, others go to stock up on jets.

There was some good news out of this month's Paris Air Show for the British aerospace sector, with UK manufacturers picking up about £13 billion worth of orders at the biennial event on an airfield in the Parisian suburbs. It was a welcome boost for the sector at a time when the Brexit storm clouds are looking ominous.

The UK's aerospace industry is bigger than you might imagine, but also very complex and important, in that every job supports at least four more in a long supply chain.

Now the sector, which has grown hugely over the past 20 years, must face a reckoning. The UK's biggest manufacturers are Airbus, employing 12,000, Rolls-Royce and BAE Systems.

Other companies that play a significant role in both the commercial aerospace and defence sectors include Boeing, Cobham, Meggitt, GKN, Lockheed Martin, L3, Raytheon, Leidos, Babcock International and Northrop Grumman. In Belfast, there is Bombardier, which employs 6,000. Besides these firms, there are more than 2,400 small and medium-sized businesses.

Total employment in the sector is about 250,000 jobs, and sales are estimated to be worth in excess of £31 billion to the UK economy, with productivity having grown 19% since 2010.

Even more importantly, 90% of aerospace products manufactured in the UK are exported, providing a boost to our woeful trade deficit with the rest of the world. In fact, the UK is the second largest player in the aerospace world after the USA.

France, Germany and Spain would love to have a bigger slice of the aerospace cake and are just waiting for Brexit to provide that opportunity. So why is Brexit such a potentially devastating blow to the health of British aerospace? There is no escaping the fact that UK manufacturers are fully intertwined in a global market. Some Rolls-Royce parts cross a border several times before they even reach an aircraft.

The aerospace industry's big worries are fears over customs controls, concerns over skills shortages due to immigration policies, and the need to ensure that the UK remains a member of the European Aviation Safety Agency, which certifies the safety of aircraft products for sale and shapes standards for new markets, like drones.

Remaining a member of EASA is hugely important. If the UK were forced to go it alone and create a separate regulatory regime, the additional costs would be crippling for UK companies but also would deter overseas companies from investing here.

Not reaching a deal with the EU would have significant commercial consequences for UK industry, raising the costs of doing business, reducing our influence and damaging the UK's reputation as one of the best places in the world to develop new technology and create high value jobs.

Under WTO rules, the aerospace industry is exempt from tariffs but there are fears that EU-based companies would do all in their power to encourage governments to find loopholes that would raise the cost of production for UK manufacturers.

Ahead of the Paris Air Show last week, the aerospace industry started talking tough. Airbus laid down its minimum criteria for the UK government and said 110,000 jobs hinge on a successful Brexit deal.

The chief executive of Rolls-Royce pointed out that he was speaking for an extensive supply chain when calling for something as close as possible to the status quo on the cross-border movement of parts.

However, there was anger from the industry's leading executives that only junior ministers were drafted in, at the last minute, to speak to the British delegation in Paris. "A dereliction of duty and symptomatic of the moral and leadership vacuum at the heart of government," one executive told the Sunday Times.

If the Government is beginning to understand that it needs a new peace accord with business, then stepping up to protect industries like aerospace is key.

Original post:

Britain's aerospace sector fears crash landing - The New European