International Yoga Day: Treat yourself to these spiritual retreats – Outlook India

New Delhi [India], June 21: When was the last time you came back from a vacation with that feeling of giddiness? No, we are not talking about the times when you were 'high' in the hills!

This International Yoga Day, ixigo brings you the top destinations for a spiritual retreat: a journey that soothes your soul and body and offers that rare feeling of enlightenment.

1. Sri Aurobindo Ashram, Pondicherry - It's supposedly one of the best places in India if you wish to practice internal yoga. One of the most ancient forms of yoga, internal yoga might lead you to an intrinsically spiritual experience. It aims to bring self-awareness to the practitioner on both the mental and the spiritual plane.

2. Sivananda Yoga Vedanta Dhanwantari Ashram, Thiruvananthapuram - This one is more for beginners, for it starts with a basic understanding of meditation, focusing on both practice and theory. A typical course runs for six days, with a 90-minute class each day. Apart from the basic meditation practices, the chanting of mantras to ease focus while meditating is one of the main features of this place.

3. Phool Chatti Ashram, Rishikesh - Nestled deep in the Shivaliks at Rishikesh, this one offers beginner and intermediate courses in yoga. Courses usually run for seven days, with two sessions a year: one in spring and one in autumn. The courses here include yoga asanas (postures), meditation, breathing (pranayama), cleansing, chanting, yoga principles, mauna (silence), pooja (worship), kirtan (sacred singing) and more, while letting you experience the life of an ashram.

4. Ananda in the Himalayas, Narendranagar - One of the most sought-after spas in the country, Ananda is also known for its yoga retreats. A group of yoga experts will guide you through the practices of hatha yoga and Satyananda Yoga. The focus of the programmes here is to help you attain a work-life balance. The forms of meditation you can practice at Ananda include Psychic Sleep, Chakra Cleansing, Inner Silence, Chidaksha Meditation and more.

5. Yoga Vidya Spiritual Retreats, Kochi - This one started as an initiative by a group of yoga teachers to bring the benefits of yoga to the masses, while also allowing them the rich experience of ashram life. While trained yoga practitioners help you bring your body and soul in sync, you can pick from a number of programmes designed for beginners as well as experts, depending on your proficiency in the practice.

6. Ayurveda Yoga Meditation Resort, Coonoor - Ayurveda, yoga, reflexology, acupressure, or just strolling around amidst natural bounty, this one offers it all. Add to it the panchakarma detoxification programme, a weight-loss programme and advanced therapies to deal with depression, and you have a perfect getaway from the din of cities. There are some five packages to choose from, and you get to stay amidst tea plantations.

7. Tushita Meditation Centre, McLeodganj - Apart from guided meditation and spiritual discourses, you can also opt for sessions on Buddhism. Located in McLeodganj, the home in exile of the Dalai Lama in India, the centre offers the added treat of "movie days" and more, when you can join the other followers for interactive sessions. You can choose from personal and group retreats. Giving you company here are the usually snow-laden Dhauladhars.

8. Osho Meditation Resort, Pune - Situated at Koregaon Park in Pune, Osho Meditation Resort is a hub of meditation. There are a number of workshops, courses, massage options and the much-followed Daily Meditation Program. Spread over an area of 28 acres, it's a perfect retreat for the people of nearby Mumbai. If not meditation, you can simply stay here or go for a swim in the campus pool. People from more than 100 countries come here to meditate.

9. Dhamma Giri, Igatpuri - If Vipassana is what you are aiming to indulge in, Dhamma Giri is just the place for you. The course runs for a minimum of ten days and follows the teaching of S.N. Goenka, one of the pioneers in the field of Vipassana. Considered to be one of the best meditation forms for self-awareness, Vipassana is also one of the toughest. The meals served here are strictly vegetarian, and you can choose between dormitories and single- or double-occupancy rooms for your stay.


There are horrors on the space station in ‘Life,’ now on DVD – LA Daily News

NEW FILMS

Life

Everybody Loves Somebody

Wilson

Railroad Tigers

Altitude

TELEVISION

Colony: Season Two

Incorporated: Season One

This Beautiful Fantastic

Workaholics: The Final Season

The sci-fi action film Life, directed by Daniel Espinosa (Safe House), is set on the International Space Station. After a probe digs up a sample of Mars for analysis, there's excitement about the possibility that this may be the first evidence of life beyond Earth.

So it's no surprise that they do find life (there wouldn't be a movie otherwise), but this isn't E.T. What ensues takes us into Alien horror territory.

Life has a solid cast, led by Ryan Reynolds, Jake Gyllenhaal and Rebecca Ferguson, and some nifty special effects. It even has a few interesting ideas involving the freaky possibilities of extraterrestrial life, but ultimately the film looks more familiar than wondrous.



Alaskan builds virtual reality tour of International Space Station – KTUU.com

ANCHORAGE, AK - Alaskan Academy Award winner Ben Grossman has created a virtual reality tour of the International Space Station.

Grossman, who grew up in Delta Junction and attended UAF before moving to Los Angeles to pursue a career in film-making, created the virtual reality tour in partnership with NASA.

He explained that his company Magnopus went to the Johnson Space Center and proposed to NASA that it put a 360-degree camera in the International Space Station.

"The idea was to give people down on Earth the feeling of floating around in space," said Grossman.

To prepare for the pitch, the team at Magnopus created a virtual reality tour of the ISS instead of making pictures, 3D models or artworks, said Grossman.

NASA approved the pitch and suggested that Magnopus should release the virtual reality tour as a stand-alone thing.

Grossman agreed, and with the support of Oculus Rift his company started developing the experience.

Over six months, Grossman, his team of artists and a number of astronauts developed the virtual reality tour, which includes participation in tutorials, spacewalks and even docking a SpaceX ship.

Grossman explained that for many of the astronauts who had been to the ISS, there was an element of homesickness about not being able to return to space.

Grossman then went on to tell KTUU that he had to decide what image would be seen through the famed Cupola window of the ISS.

Being Alaskan, Grossman decided on Anchorage, but he was careful to include elements of Denmark, Africa and Italy to hide his hand.

Grossman also confirmed that he is currently working on a major international film that is created in virtual reality and exported to be viewed in cinemas.

He said the production, slated for release in 2019, is the first of its type in the world and that he would be able to give more information later this year.

Grossman could not confirm a set timeline for when the 360-degree camera would be in place on the International Space Station.



Kepler discovers 10 Earth-like exoplanets, 219 planet candidates – SpaceFlight Insider

Laurel Kornfeld

June 21st, 2017

NASA's Kepler space telescope team has identified 219 new planet candidates, 10 of which are near-Earth-size and in the habitable zone of their star. Image & Caption Credit: NASA / JPL-Caltech

NASA's Kepler mission has released its most extensive list of exoplanet findings: a total of 219 planet candidates, of which ten are probably Earth-like and occupy their stars' habitable zones, where temperatures allow liquid water to exist on the surface.

The eighth and final data release of the original, four-year Kepler mission brings the total number of exoplanet candidates found by the telescope to 4,034. Of these, 2,335 have been confirmed to be planets, and 50 are located in habitable zones and are roughly Earth-sized.

Launched in March 2009, Kepler spent four years observing stars in the constellation Cygnus using the transit method, which involves searching for the regular dimming of a star's light as orbiting planets transit, or pass in front of, the star. However, the failure of the second of the spacecraft's four reaction wheels in May 2013 put an end to that part of its mission, which had been expected to be extended until 2016.
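
As a rough illustration of the transit method (a back-of-the-envelope sketch, not part of the SpaceFlight Insider article): for a dark planet crossing a uniformly bright stellar disk, the fractional dimming is roughly the square of the planet-to-star radius ratio, which is why Earth-size planets are far harder to detect than Jupiter-size ones.

# Minimal sketch of the transit-depth estimate (assumes a central transit of a dark
# planet across a uniformly bright stellar disk; limb darkening is ignored).
R_SUN_KM = 695_700.0
R_EARTH_KM = 6_371.0
R_JUPITER_KM = 69_911.0

def transit_depth(planet_radius_km, star_radius_km=R_SUN_KM):
    """Fractional drop in starlight while the planet is in front of the star."""
    return (planet_radius_km / star_radius_km) ** 2

print(f"Earth-size planet, Sun-like star:   {transit_depth(R_EARTH_KM):.6f} (~84 parts per million)")
print(f"Jupiter-size planet, Sun-like star: {transit_depth(R_JUPITER_KM):.4f} (~1 percent)")

The tiny depth for an Earth analog is one reason Kepler needed years of continuous, high-precision photometry.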

Kepler's latest findings, announced at a news conference on Monday, June 19, at NASA's Ames Research Center in California, also revealed the existence of two divergent categories of small planets. The first are gaseous worlds with no known solid surfaces, dubbed mini-Neptunes, while the second are rocky planets sometimes described as super-Earths.

Researchers using data from the W. M. Keck Observatory and NASA's Kepler mission have discovered a gap in the distribution of planet sizes, indicating that most planets discovered by Kepler so far fall into two distinct size classes: the rocky Earths and super-Earths (similar to Kepler-452b), and the mini-Neptunes (similar to Kepler-22b). This histogram shows the number of planets per 100 stars as a function of planet size relative to Earth. Image & Caption Credit: NASA / Ames / Caltech / University of Hawaii (B. J. Fulton)

One group of scientists used the W. M. Keck Observatory in Hawaii to observe 1,300 stars hosting 2,000 of the planets Kepler found, with the goal of obtaining precise measurements of the planets' sizes. Their studies confirmed the presence of the two distinct small-planet types.

Rocky planets appear to have a size limit at around 75 percent larger than the Earth. For reasons not well understood, some small planets accumulate hydrogen and helium, swelling out to become gaseous worlds with heavy atmospheres and no known solid surfaces. The latter are not good locations to search for life.

"We like to think of this study as classifying planets in the same way that biologists identify new species of animals. Finding two distinct groups of exoplanets is like discovering mammals and lizards make up distinct branches of a family tree," explained Benjamin Fulton, lead author of the second study and a doctoral student at the University of Hawaii.

This diagram illustrates how planets are assembled and sorted into two distinct size classes. First, the rocky cores of planets are formed from smaller pieces. Then, the gravity of the planets attracts hydrogen and helium gas. Finally, the planets are baked by the starlight and lose some gas. At a certain mass threshold, planets retain the gas and become gaseous mini-Neptunes; below this threshold, the planets lose all their gas, becoming rocky super-Earths. Image & Caption Credit: NASA / Kepler / Caltech (R. Hurt)

Particular attention was given to one approximately Earth-sized world discovered orbiting in the habitable zone of its Sun-like star. Designated KOI-7711, this planet candidate resembles Earth in both its orbit and size, but the composition of its atmosphere and its ability to host liquid water on its surface are unknown.

In the Solar System, Venus, Earth, and Mars are all located in the habitable zone, yet only Earth is capable of supporting life, the scientists noted.

"There's a lot we don't know about this planet," Kepler scientist Susan Mullally said, noting it is premature to refer to it as an Earth twin.

In the past, planets discovered by Kepler were initially thought to be habitable only to later be found inhospitable due to phenomena such as bombardment by stellar flares.

Highlighted are new planet candidates from the eighth Kepler planet candidate catalog that are less than twice the size of Earth and orbit in the star's habitable zone, the range of distances from a star where liquid water could pool on the surface of an orbiting planet. The dark green area represents an optimistic estimate for the habitable zone, while the brighter green area represents a more conservative estimate for the habitable zone. The candidates are plotted as a function of their star's surface temperature on the vertical axis and by the amount of energy the planet candidate receives from its host star on the horizontal axis. Brighter yellow circles show new planet candidates in the eighth catalog, while pale yellow circles show planet candidates from previous catalogs. Blue circles represent candidates that have been confirmed as planets by follow-up observations. The sizes of the colored disks indicate the sizes of these exoplanets relative to one another and to the images of Earth, Venus, and Mars, placed on this diagram for reference. Note that the new candidates tend to be around stars more similar to the Sun (around 5,800 Kelvin), representing progress in finding planets that are similar to the Earth in size and temperature and that orbit Sun-like stars. Image & Caption Credits: NASA / Ames Research Center / Wendy Stenzel

To address the possibilities of both false positives and failure to identify actual planets, the Kepler team analyzed the data by combining it with software simulations that added false signals and deliberate misses of known planets. Mixing real data with simulated information accurately predicted both overcounts and undercounts.
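
The injection approach described above can be illustrated with a toy sketch (purely illustrative, with made-up numbers and a hypothetical vetting threshold; this is not the Kepler team's actual pipeline): injected synthetic transits measure how many real planets the vetting recovers, while injected false signals measure how many impostors slip through.

import random

def vetting_pass(signal_to_noise, threshold=7.1):
    """Toy vetting step: accept a candidate if its signal-to-noise clears the threshold."""
    return signal_to_noise >= threshold

random.seed(0)
# Injected synthetic transits: genuine planet-like signals with a spread of strengths.
injected_transits = [random.uniform(4.0, 15.0) for _ in range(10_000)]
# Injected false signals: noise artifacts dressed up to look like weak candidates.
injected_artifacts = [random.gauss(5.0, 1.5) for _ in range(10_000)]

completeness = sum(map(vetting_pass, injected_transits)) / len(injected_transits)
contamination = sum(map(vetting_pass, injected_artifacts)) / len(injected_artifacts)
print(f"Completeness (injected transits recovered): {completeness:.1%}")
print(f"Contamination (false signals accepted):     {contamination:.1%}")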

"This carefully measured catalog is the foundation for directly answering one of astronomy's most compelling questions: how many planets like Earth are in the galaxy?" said SETI research scientist and catalog study lead author Susan Thompson.

"The Kepler data set is unique, as it is the only one containing a population of these near-Earth analogs: planets with roughly the same size and orbit as Earth," said Mario Perez, a Kepler scientist in the Astrophysics Division of NASA's Science Mission Directorate. "Understanding their frequency in the galaxy will help inform the design of future NASA missions to directly image another Earth."

There are 4,034 planet candidates now known with the release of the eighth Kepler planet candidate catalog. Of these, 2,335 have been confirmed as planets. The blue dots show planet candidates from previous catalogs, while the yellow dots show new candidates from the eighth catalog. New planet candidates continue to be found at all periods and sizes due to continued improvement in detection techniques. Notably, 10 of these new candidates are near-Earth-size and at long orbital periods, where they have a chance of being rocky with liquid water on their surface. Image & Caption Credit: NASA / Ames Research Center / Wendy Stenzel

None of the new data comes from the revamped K2 mission, for which the Kepler telescope is searching other parts of the sky beyond Cygnus. K2's most recent discovery, of over 100 exoplanets, was announced in July 2016.

The search for other exoplanets in general, and another Earth in particular, will continue with NASA's Transiting Exoplanet Survey Satellite (TESS) and with the James Webb Space Telescope, both scheduled for launch in 2018.

Thompson, Fulton, Perez, and Courtney Dressing, a NASA Sagan fellow at Caltech, participated in the news conference.

All exoplanet candidates and confirmed planets are listed in NASA's Exoplanet Archive online.

The population of exoplanets detected by the Kepler mission (yellow dots) compared to those detected by other surveys using various methods: radial velocity (light blue dots), transit (pink dots), imaging (green dots), microlensing (dark blue dots), and pulsar timing (red dots). For reference, the horizontal lines mark the sizes of Jupiter, Neptune, and Earth, all of which are displayed on the right side of the diagram. The colored ovals denote different types of planets: hot Jupiters (pink), cold gas giants (purple), ocean worlds and ice giants (blue), rocky planets (yellow), and lava worlds (green). The shaded gray triangle at the lower right marks the exoplanet frontier that will be explored by future exoplanet surveys. Kepler has discovered a remarkable quantity of exoplanets and significantly advanced the edge of the frontier. Image & Caption Credit: NASA / Ames Research Center / Natalie Batalha / Wendy Stenzel


Laurel Kornfeld is an amateur astronomer and freelance writer from Highland Park, NJ, who enjoys writing about astronomy and planetary science. She studied journalism at Douglass College, Rutgers University, and earned a Graduate Certificate of Science from Swinburne University's Astronomy Online program. Her writings have been published online in The Atlantic, Astronomy magazine's guest blog section, the UK Space Conference, the 2009 IAU General Assembly newspaper, The Space Reporter, and newsletters of various astronomy clubs. She is a member of the Cranford, NJ-based Amateur Astronomers, Inc. Especially interested in the outer solar system, Laurel gave a brief presentation at the 2008 Great Planet Debate held at the Johns Hopkins University Applied Physics Lab in Laurel, MD.


Space flight bill could see Scotland become ‘thriving hub’ for the industry – Herald Scotland

A space flight bill to be included in the Queen's Speech could see Scotland become "a thriving hub" for the industry, according to the Secretary of State for Scotland.

New powers would see the launch of satellites from the UK for the first time, horizontal flights to the edge of space for scientific experiments and the creation of spaceports across the UK.

A number of Scottish sites have expressed an interest in the project, including Prestwick, Machrihanish and Stornoway.

Secretary of State for Scotland David Mundell said: "This new legislation on space ports will be a giant leap forward for Scotland's ambitious space and satellite sector.

"It will give each of our potential spaceports a fantastic opportunity to establish Scotland as a thriving hub for commercial spaceflight.

"By capitalising on our existing scientific expertise, a Scottish spaceport would create new, skilled jobs and drive economic growth."

More than 38,000 jobs rely on the UK's space industry, which is worth £13.7 billion to the economy.

The global market for launching satellites is estimated to be worth £25 billion over the next 20 years.

New powers would cover a wide range of new spaceflight activities, including vertically launched rockets, spaceplanes, satellite operation, spaceports and other technologies.

It would take the UK into the commercial space age by enabling small satellite launch and spaceflight from UK spaceports.

And it will create new opportunities for the UK's scientific community to carry out research in a microgravity environment by giving British scientists easier access to space.

The legislation comes as part of a series of bills aimed at infrastructure.

Theresa May is preparing for the Queen's Speech on Wednesday but has not yet struck a deal with the Democratic Unionist Party to prop up her minority government.

The event sets out the Prime Minister's legislative plans for the coming year.


Martian crater provides reminder of Apollo 16 mission – SpaceFlight Insider

Jim Sharkey

June 20th, 2017

NASA's Opportunity Mars rover passed near this small, relatively fresh Martian crater in April 2017, during the 45th anniversary of the Apollo 16 mission to the Moon. Image Credit: NASA/JPL-Caltech/Cornell Univ./Arizona State Univ.

During the 45th anniversary of Apollo 16's voyage to the Moon, NASA's Opportunity Mars rover drove by a relatively young Martian crater, providing a connection between the two missions. The feature was informally named Orion Crater by the Opportunity mission team, in honor of the Apollo 16 lunar module Orion.

Orion was the name of the Lunar Module that carried astronauts John Young and Charles Duke to and from their lunar landing site on the Descartes Highlands in 1972. Orion also happens to be the name of NASA's newest crew-rated spaceship, which the space agency is developing to send crews on missions beyond Earth orbit.

Opportunity's Panoramic Camera (Pancam) took images of Orion Crater on April 26, 2017. The crater is about 90 feet (27 meters) wide and estimated to be no older than 10 million years.

"It turns out that Orion Crater is almost exactly the same size as Plum Crater on the Moon, which John Young and Charles Duke explored on the first of the three moonwalks they took while investigating the lunar surface using their lunar rover," said Opportunity science-team member and SpaceFlight Insider contributor Jim Rice, of the Planetary Science Institute, Tucson, Arizona.

Rice sent the Pancam mosaic image of Orion Crater to Apollo 16 Lunar Module Pilot Charles Duke, to which the Apollo astronaut responded with much enthusiasm.

"This is fantastic. What a great job! I wish I could be standing on the rim of Orion like I was standing on the rim of Plum Crater 45 years ago," Duke said.

Apollo 16 astronaut Charles Duke standing on the rim of Plum crater, which is 131 feet (40 meters) in diameter and about 33 feet (10 meters) deep. Photo Credit: NASA


Jim Sharkey is a lab assistant, writer and general science enthusiast who grew up in Enid, Oklahoma, the hometown of Skylab and Shuttle astronaut Owen K. Garriott. As a young Star Trek fan he participated in the letter-writing campaign which resulted in the space shuttle prototype being named Enterprise. While his academic studies have ranged from psychology and archaeology to biology, he has never lost his passion for space exploration. Jim began blogging about science, science fiction and futurism in 2004. Jim resides in the San Francisco Bay area and has attended NASA Socials for the Mars Science Laboratory Curiosity rover landing and the NASA LADEE lunar orbiter launch.


Are You Ready? NASA Webcast Marks 2 Months to Total Solar Eclipse – Space.com

In just two months, a total solar eclipse will sweep across the United States. On Aug. 21, 2017, observers along the eclipse's 70-mile (113-kilometer) path will see the sun slowly vanish behind the moon, turning midday into twilight and revealing the hidden layers of the sun's atmosphere.

"Most people never get to see a total solar eclipse, so this is an amazing opportunity for Americans," Angela Speck told Space.com by email. Speck, a researcher at the University of Missouri, is a member of the American Astronomical Society (AAS) eclipse team.

Today, which happens to be the summer solstice - or first day of summer - in the Northern Hemisphere, NASA is webcasting two news conferences featuring experts on eclipse science, as well as eclipse safety, travel and traffic information. The first briefing, which will focus on anticipated crowd sizes and traffic levels on Aug. 21, will run from 1 p.m. to 2 p.m. EDT (1700 to 1800 GMT). The second briefing will focus on eclipse science, and will go from 2:30 p.m. to 3:30 p.m. EDT (1830 to 1930 GMT). You can watch the webcasts on NASA TV or here on Space.com.

This so-called Great American Eclipse will be the first total solar eclipse to touch the continental United States since 1979, and the first to cross from coast to coast since 1918. Not until 2024 will another total solar eclipse cross the continental U.S., though that eclipse will be visible only east of Texas. [Amazing Solar Eclipse Pictures from Around the World]

For the August total solar eclipse, however, NASA estimates that most Americans live within a two-day drive of the path.

The total solar eclipse of Aug. 21, 2017 will cross the U.S. from coast to coast.

A solar eclipse occurs when the sun appears to pass behind the moon as seen from Earth. In a total eclipse, the sun vanishes completely behind the Earth's lunar companion, while less than 100 percent of the sun's disk is hidden during a partial eclipse. Totality (the period when the sun is completely hidden) is short, lasting only a few brief minutes. But it can be quite stunning for those fortunate enough (or with foresight enough) to be within the path of totality.

Once the sun disappears, its atmosphere becomes visible. Although the hot gases that make up the solar corona are always present, the hotter central disk of the star usually outshines this outer region. During totality, however, with the disk blocked, the atmosphere's tangle of streamers and loops becomes visible.

The disappearance of the body of the sun can cause the temperature to drop. Animals may behave as though it is nighttime, and signals from radio stations can bounce through the atmosphere differently. Observers may be able to view the approaching shadow of the eclipse in the minutes before totality.

Onlookers observe a partial solar eclipse in Glasgow in 2015, while wearing solar viewing glasses. When the moon completely covers the sun's disk, it is safe to remove solar viewing glasses.

During the partial phase, when the moon appears to "take a bite" out of the sun, viewers can use pinhole cameras to watch the moon's progress. But a tree will work as well, said Rick Fienberg, a veteran eclipse chaser and press officer for the AAS.

"During the partial phases, especially when the sun has been reduced to a crescent, look under a tree," Fienberg said. "You'll see the ground dappled with little crescent suns projected by the spaces between the leaves!"

Both Speck and Fienberg stressed that first-time viewers should not focus on shooting photographs but should instead spend the brief time of the eclipse enjoying the rare event. In addition to the corona, observers should look at the sunset colors and unusual light on the horizon, as well as the stars and planets that will be visible.

"Just take it in as much as you can," Speck said.

You're not completely out of luck if you can't make it to the eclipse path. "You will still have a partial solar eclipse if you're anywhere within North America," Fienberg said.

As long as the sun is visible, observers should view the event only through designated eclipse glasses. Observers viewing a partial eclipse will need to leave their viewing glasses on during the entire event, as the sun will always be visible.

For those in the path of totality, during the 2 to 3 minutes of totality, when the body of the sun is hidden from view, it is safe to remove your eclipse glasses.

"It is perfectly safe to view the totally eclipsed sun without any filters," Fienberg said. "In fact, if you leave your filters on, you won't see anything at all during totality."

REMEMBER: Looking directly at the sun, even when it is partially covered by the moon, can cause serious eye damage or blindness. NEVER look at a partial solar eclipse without proper eye protection. See our complete guide to find out how to view the eclipse safely.

Many eclipse enthusiasts, along with plenty of first-time viewers, have known about the Great American Eclipse for years and have made plans accordingly, as evidenced by the fact that most hotels along the path are booked. That poses a challenge for anyone hoping to make a last-minute reservation.

Campgrounds can provide another option. A day-trip to the path of totality may be an option, but because the shadow of the total eclipse is within driving distance for millions of people, NASA anticipates that traffic will be heavy. Travelers should budget plenty of extra time to reach their destinations the day of the eclipse. If you haven't made your travel plans yet, you should probably make them now.

While the eclipse itself is certainly worth the trip, Fienberg recommended locating an eclipse party for an enhanced experience.

"Organized eventswill have knowledgeable experts to guide your experience and, usually, [will] provide solar-eclipse viewers that meet the ISO international safety standard," he said.

Many cities in the path are also hosting multiday festivals to accompany the eclipse. If you're planning a multiple-day trip to see the eclipse, take a look at our state-by-state guide to find other attractions near your observing location.

While some events will distribute eclipse glasses so that observers can study the sun during its gradual disappearance, if you have your own glasses, you won't have to fear missing the event. Both NASA and the AAS list several certified vendors who sell inexpensive, quality eclipse glasses, and the AAS offers a guide to help you determine if your glasses are safe for eclipse viewing. The primary qualification is that the glasses meet the ISO 12312-2 (sometimes written as ISO 12312-2:2015) international standard; this information should be printed on the filters.

Weather can also throw a wrench into your plans, so be sure to check the forecast in the days leading up to the eclipse.

"Fortunately, mid-August weather is usually pretty good across most of the country," Fienberg said. "But, on average, it's better in the Northwest than in the Southeast, so many of the most die-hard eclipse chasers are setting a course for Oregon, Idaho, Wyoming or Nebraska.

Although you may make preparations well in advance of the eclipse, sometimes things don't work out. For example, the weather may be cloudy. Bad traffic or other events may keep you from reaching your intended destination on the day of the eclipse. It is important to have a backup plan for viewing the event, such as a second site if the weather is bad.

Don't worry about picking the perfect site to watch the event. Locations that fall inside the shadow of the moon should be more or less equal.

"As long as one is on the path of totality and has a clear line of sight to the sun anywhere will do," Speck said. Line of sight depends on location; a spot surrounded by mountains may give you a smaller glimpse of the sky than a site on the plains.

Still, the closer you are to the center of the path, the longer totality will last, Fienberg said, with a maximum of about 2 minutes and 40 seconds in southern Illinois and western Kentucky.

Parents can help engage their children with books about eclipses. Check your local library for eclipse events; a map of libraries known to have events planned can be found online here.

NASA is affiliated with multiple eclipse events along the path of totality, where the sun will vanish completely behind the moon. The agency has created a map showing the locations of these events, including spots where the agency will film its eclipse program that will broadcast online and on NASA TV. Some of these events span multiple days or even the week before the eclipse.

If you aren't fortunate enough to view the complete eclipse, you might be able to catch other events hosted by the space agency at sites around the United States where only a partial eclipse occurs. Most of these occur in cities where the space agency already has a presence, including Houston and Washington, D.C.

The AAS has compiled a list of events pertaining to the eclipse on the society's website. Some events, like lectures on eclipse science and history, are already taking place around the country.

Viewers from around the world will also be able to tune in to NASA's live webcast of the total solar eclipse, with footage from multiple locations inside the path of totality.

"This will probably be the most-watched total solar eclipse in history," Fienberg said.

Follow Nola Taylor Redd on Twitter @NolaTRedd or Google+. Follow us at @Spacedotcom, Facebook or Google+. Originally published on Space.com.


NASA developing supersonic aircraft to create soft thump rather than disruptive sonic boom – The Bakersfield Californian

If you're old enough to remember the European Concorde, the last commercial supersonic transport, which ended service in 2003, you may know that while the plane could fly at twice the speed of sound, it was loud and expensive to operate.

Now NASA Armstrong in eastern Kern County is working with private contractors on a new concept that would make those pesky, disruptive sonic booms sound more like your neighbor closing his car door.

Recent research has shown it is possible for a supersonic airplane to be shaped in such a way that the shock waves it forms when flying faster than the speed of sound can generate a sound at ground level so quiet it will hardly be noticed by the public, if at all.

"NASA is working hard to make flight greener, safer and quieter, all while developing aircraft that travel faster, and building an aviation system that operates more efficiently," NASA Administrator Charles Bolden said last year when the space agency awarded a contract to Lockheed Martin in Palmdale to complete a preliminary design for Quiet Supersonic Technology, or QueSST for short.

Now the first stage is nearly complete, said Matt Kamlet, a public affairs specialist at NASA Armstrong.

"We are meeting with Lockheed this week to go over the final design and concept," Kamlet said.

Developing, building and flight-testing a quiet supersonic aircraft could ultimately enable the industry and the Federal Aviation Administration to open supersonic travel to the flying public.

But before that happens, a full-size flight demonstration aircraft will have to be built, and will have to prove itself in flight tests that will likely begin in eastern Kern before being tried out over communities not accustomed to hearing sonic booms.

"Our unique aircraft design is shaped to separate the shocks and expansions associated with supersonic flight, dramatically reducing the aircraft's loudness," Peter Iosifidis, QueSST program manager at Lockheed Martin Skunk Works in Palmdale, said in a statement. "Our design reduces the airplane's noise signature to more of a heartbeat instead of the traditional sonic boom that's associated with current supersonic aircraft in flight today."

Why should the average person care?

"A five- or six-hour flight from Los Angeles to New York would be cut in half,"Kamlet said.

Meanwhile, seeing the sleek, single-pilot demonstration jet streaking through the skies overhead may be its own reward.

If the research and development stays on schedule, Kamlet said, that could happen as early as 2021.


Image: NASA Mars Reconnaissance Orbiter views rover climbing Mount Sharp – Phys.Org

June 21, 2017 The feature that appears bright blue at the center of this scene is NASA's Curiosity Mars rover amid tan rocks and dark sand on Mount Sharp, as viewed by the HiRISE camera on NASA's Mars Reconnaissance Orbiter on June 5, 2017. Credit: NASA/JPL-Caltech/Univ. of Arizona

Using the most powerful telescope ever sent to Mars, NASA's Mars Reconnaissance Orbiter caught a view of the Curiosity rover this month amid rocky mountainside terrain.

The car-size rover, climbing up lower Mount Sharp toward its next destination, appears as a blue dab against a background of tan rocks and dark sand in the enhanced-color image from the orbiter's High Resolution Imaging Science Experiment (HiRISE) camera. The exaggerated color, showing differences in Mars surface materials, makes Curiosity appear bluer than it really looks.

The image was taken on June 5, 2017, two months before the fifth anniversary of Curiosity's landing near Mount Sharp on Aug. 5 PDT (Aug. 6, 2017, EDT and Universal Time).

When the image was taken, Curiosity was partway between its investigation of active sand dunes lower on Mount Sharp, and "Vera Rubin Ridge," a destination uphill where the rover team intends to examine outcrops where hematite has been identified from Mars orbit.

The rover's location that day is shown at https://mars.nasa.gov/multimedia/images/2017/curiositys-traverse-map-through-sol-1717 as the point labeled 1717. Images taken that day by Curiosity's Mast Camera (Mastcam) are at https://mars.nasa.gov/msl/multimedia/raw/?s=1717&camera=MAST%5F.


More information: For more information about NASA's Mars Reconnaissance Orbiter, visit mars.nasa.gov/mro/. For more information about NASA's Mars Science Laboratory Project and Curiosity, visit mars.nasa.gov/msl/.


Nanotechnology Medical Devices Industry Report 2017 to 2022 – Equity Insider (press release)

The Global Nanotechnology Medical Devices Market report tracks major market events including product launches, technological developments, mergers & acquisitions, and the innovative business strategies adopted by key market players. Along with strategically analyzing the key micro-markets, the report also focuses on industry-specific drivers, restraints, opportunities and challenges in the Nanotechnology Medical Devices market. This research report offers an in-depth analysis of the market size (revenue), market share, major market segments, different geographic regions, key market players, and premium industry trends, with a forecast for the next five years.

Top companies operating in the global Nanotechnology Medical Devices market profiled in the report are Stryker, 3M, Smith & Nephew, Mitsui Chemicals, Dentsply International, St. Jude Medical, AAP Implantate, PerkinElmer, Affymetrix, and Starkey Hearing Technologies.

This report segments the global Nanotechnology Medical Devices market on the basis of types, Active Implantable Devices, Biochips, Implantable Materials, Medical Textiles and Wound Dressings, Others. On the basis of application, the global Nanotechnology Medical Devices market is segmented into Therapeutic Applications, Diagnostic Applications, Research Applications. For comprehensive understanding of market dynamics, the global Nanotechnology Medical Devices market is analyzed across key geographies namely North America, Europe, China, Japan, Southeast Asia, India. Each of these regions is analyzed on basis of market findings across major countries in these regions for a macro-level understanding of the market.

Inquire for sample at :

https://www.marketinsightsreports.com/reports/062012235/2017-global-nanotechnology-medical-devices-market-professional-survey-report/inquiry

The report contains 15 chapters that examine the global Nanotechnology Medical Devices market in depth:

Chapter 1, to describe Nanotechnology Medical Devices Introduction, product scope, market overview, market opportunities, market risk, market driving force;

Chapter 2, to analyze the top manufacturers of Nanotechnology Medical Devices, with sales, revenue, and price of Nanotechnology Medical Devices, in 2016 and 2017;

Chapter 3, to display the competitive situation among the top manufacturers, with sales, revenue and market share in 2016 and 2017;

Chapter 4, to show the global market by regions, with sales, revenue and market share of Nanotechnology Medical Devices, for each region, from 2012 to 2017;

Chapters 5, 6, 7, 8 and 9, to analyze the key regions, with sales, revenue and market share by key countries in these regions;

Chapters 10 and 11, to show the market by type and application, with sales market share and growth rate by type and application, from 2012 to 2017;

Chapter 12, Nanotechnology Medical Devices market forecast, by regions, type and application, with sales and revenue, from 2017 to 2022;

Chapter 13, 14 and 15, to describe Nanotechnology Medical Devices sales channel, distributors, traders, dealers, Research Findings and Conclusion, appendix and data source.

Avail the complete report of this research with TOC and List of Figures at:

https://www.marketinsightsreports.com/reports/062012235/2017-global-nanotechnology-medical-devices-market-professional-survey-report

This report provides an in-depth analysis of the Nanotechnology Medical Devices market and provides market size (US$ million) and compound annual growth rate (CAGR, %) for the forecast period 2017-2022, considering 2016 as the base year. It elucidates potential revenue opportunities across different segments and explains the attractive investment proposition matrix for this market. This study also provides key insights about market drivers, restraints, opportunities, new product launches, approvals, regional outlook, and competitive strategies adopted by the leading players. It profiles leading players in the global Nanotechnology Medical Devices market based on the following parameters: company overview, financial performance, product portfolio, geographical presence, distribution strategies, key developments and strategies, and future plans; the key companies covered are listed above. Insights from this report would allow marketers and management authorities of companies to make informed decisions with respect to their future product launches, market expansion, and marketing tactics. The global Nanotechnology Medical Devices market report caters to various stakeholders in this industry, including investors, device manufacturers, distributors and suppliers of Nanotechnology Medical Devices equipment, government organizations, research and consulting firms, new entrants, and financial analysts. The various strategy matrices used in analyzing the Nanotechnology Medical Devices market would provide stakeholders with vital inputs to make strategic decisions accordingly.

The vast market research data included in the study is the result of extensive primary and secondary research activities. Surveys, personal interviews, and inputs from industry experts form the crux of primary research activities and data collected from trade journals, industry databases, and reputable paid sources form the basis of secondary research. The report also includes a detailed qualitative and quantitative analysis of the market, with the help of information collected from market participants operating across key sectors of the market value chain. A separate analysis of macro- and micro-economic aspects, regulations, and trends influencing the overall development of the market is also included in the report.


Nanotechnology Engineers Struggle to Match Cell Performance – Discovery Institute

Cells make mechanical work look easy, but imitating what they do is very, very hard for intelligent designers of the human kind.

Moving Parts

Nanotechnologists are taking baby steps toward imitating what cells do all the time. In our bodies, each muscle cell uses myosin motors on actin filaments to generate tightly regulated pushes and pulls. While individually small, these physical forces add up to allow a competition weightlifter to hoist several times his body weight over his head. In other parts of our cells, kinesin motors walk along microtubules, delivering cargo where it is needed (see our animation). These moving parts are tightly regulated, directional and efficient. Now try building a molecular motor that can do these things.

In "Gearing up molecular rotary motors" in Science Magazine, Massimo Baroncini and Alberto Credi take a look at progress being made by nano-engineers trying to imitate the molecular motors in cells. They have a long, long way to go.

Machines and motors based on synthetic small molecules are realized by a bottom-up approach to nanotechnology and could exploit molecular motion in one of two ways. The first generates macroscopic work by collecting the actions of many nanodevices organized in an array that provides spatial and temporal control of the motion activated by an energy supply. This approach mimics myosin motor proteins in skeletal muscles. The second route uses the energy-consuming directed movement of individual molecular machines to perform a task at the nanoscale, mimicking kinesin-based transport. Both cases mechanically couple an active component (the molecular machine) with nearby passive components and, ultimately, with the surrounding environment. On page 964 of this issue, Štacko et al. report the synchronous transmission of a photoactivated directional motion from a synthetic molecular motor to a coupled rotor. This demonstration takes an important step forward toward more complex mechanical functions with artificial nanoscale devices. [Emphasis added.]

Baroncini and Credi try to put a positive spin on the artificial motors, but a look at the figures and the actual accomplishments in the article shows the products to be pitiful contenders. One is two molecules joined by a bond that spins uncontrollably and randomly. A team made progress by adding a kind of brake to it. Another model shows a design with a molecular paddle that can be rotated in one direction with light, but what does it actually do? Very little. It was considered a major accomplishment to keep the rotating part oriented to the stator.

While we applaud every bit of progress in this very challenging arena, the real lesson is what they are learning about design requirements.

Leaving aside the stereochemical considerations required for a detailed understanding of the coupling mechanism, the key message conveyed by the study of Štacko et al. is that the transmission of motion relies on an appropriate tuning of the energy barriers associated with the different rotary motions. Another important requirement is the presence of diagnostic elements that enable the unambiguous experimental identification of the structures involved in the operation cycle. Both goals have been achieved by means of an ingenuous [sic? naïve? artless?] molecular design.

When we ask just how ingenious the design is, we finally hear an admission that cells do things far, far better than we can. The ending paragraph says:

An important feature of the present system compared with previous examples of controlled movements transferred within synthetic molecular devices is that the rotation generated by the motor is unidirectional, continuous, and autonomous (that is, it takes place under steady experimental conditions as long as light energy is available). Such extremely valuable properties are preserved upon transmission of motion. In living organisms, tasks ranging from signal transduction to motility are carried out by propagating molecular movements via mechanical connections. Although we are still far from reaching similar goals with artificial systems, the field of molecular machines is rapidly progressing, and elements now exist for taking up the challenge of making sophisticated nanoscale devices by coupling mechanical parts.

In other words, just getting a molecule to spin in one direction has been a major challenge. It's going to take a long time before they add the signaling system and the mechanical connections to make their motors actually do some useful work.

Controlled Access

Another paper in Science explores how cells control what goes into and out of the cell membrane. Active transport is key to maintaining the cell far from equilibrium, which is part of what it means to be alive. Natural diffusion would bring everything to equilibrium, and life would stop. Nanotechnologists would like to be able to control passage of molecules, so they look to cells to learn their tricks.

What they find is an optimization problem. It's about "Maximizing the right stuff: The trade-off between membrane permeability and selectivity," say five researchers from Korea and America. Engineers are always looking for ways to improve filters for things like desalination, dialysis, sterile filtration, food processing, dehydration of air and other industrial, medical, and environmental applications. A good filter can't be too picky or it will slow down, but it can't be too permissive or bad things will leak through. These constraints often work against each other. For instance, how would you design a filter that could let tennis balls pass, but keep ping-pong balls out? Try that for an engineering challenge! Yet some membrane channels in living cells succeed at something similar on the atomic scale:

Unlike synthetic membranes, biological membranes exhibit both high permeability and high selectivity. For example, the potassium ion channel in cell membranes is thousands of times more permeable to potassium than sodium ions, despite the smaller ionic (i.e., crystallographic) size of sodium, and exhibits permeation rates (~10^8 ions/s) approaching the diffusion limit.

One can't just adjust the pore sizes in a filter, obviously, to achieve that kind of performance. A good filter needs other ways to discriminate between objects trying to get through. So far, though, human-designed filters that are sufficiently discriminating operate orders of magnitude more slowly than biological filters.

All membranes exhibit a trade-off between permeability (i.e., how fast molecules pass through a membrane material) and selectivity (i.e., to what extent the desired molecules are separated from the rest). However, biological membranes such as aquaporins and ion channels are both highly permeable and highly selective. Separation based on size difference is common, but there are other ways to either block one component or enhance transport of another through a membrane.

By imitating the selectivity filters in cells, design features from biological membranes have been applied to break the permeability-selectivity trade-off. Cells show that you can let your tennis balls through and keep the ping-pong balls out. Once again, though, humans remain far behind in this kind of engineering challenge. You almost hear a sense of awe in their jargon:

Biological membranes, such as potassium ion channels and aquaporins (Fig. 2, E and F), have extremely high selectivity-permeability combinations, which has stimulated recent efforts aimed at (i) direct incorporation of such structures into membranes, (ii) theoretical studies aimed at understanding optimal structures (Fig. 2G) that might yield high permeability and selectivity, or (iii) synthetic membrane structures that mimic or are inspired by one or more elements of biological membranes. So far, incorporation of, for example, aquaporins into membranes has been done via assimilation of aquaporins into vesicles and integration of the resulting vesicles into membranes, but there are no successful, reproducible studies demonstrating that this strategy can produce highly selective membranes. Thus, much remains uncertain about their ability to be processed into the large-scale, defect-free structures required for practical applications or whether they can maintain adequate transport and selectivity properties upon exposure to complex, real-world feed mixtures for extended periods of time.

Translation: We can't even borrow cell technology so far, let alone imitate it. And even if we did, could it work for extended periods of time in complex, real-world situations?

The lengthy review paper describes case after case of needs banging against requirements. How can engineers use carbon nanotubes, graphene and other trendy materials to achieve cellular performance? If they could, all kinds of wonderful applications could be in our futures: better desalination plants, improved batteries, water purification, food processing, distillation, and a host of medical devices for separating blood, performing dialysis, delivering drugs and much more. Cells make it look so easy. The authors don't have much to say about how Darwinian evolution achieved such high performance. Like, nothing.

The authors spend the last three paragraphs discussing the outlook for future progress. "Molecular-level design and insight, including advanced simulation and modeling, will be critical for breakthroughs going forward," they say. Yet our fundamental understanding of filtration at the molecular level remains extremely rudimentary.

In short, the best answers will come through biomimetics: imitating how cells do it. Cells have set a very high bar. The future of science, both for theoretical understanding and application, is focused on intelligent design. Without coming out and saying the banned phrase, these papers show it. Now if they can get their molecular machines to assemble from other molecular machines following coded instructions, and to reproduce themselves, they'll really be onto something.

Image: From "The Workhorse of the Cell: Kinesin," via Discovery Institute.


Freshwater from Salt Water Using Only Solar Energy – Controlled Environments Magazine

A federally funded research effort to revolutionize water treatment has yielded an off-grid technology that uses energy from sunlight alone to turn salt water into fresh drinking water. The desalination system, which uses a combination of membrane distillation technology and light-harvesting nanophotonics, is the first major innovation from the Center for Nanotechnology Enabled Water Treatment (NEWT), a multi-institutional engineering research center based at Rice University.

NEWT's nanophotonics-enabled solar membrane distillation technology, or NESMD, combines tried-and-true water treatment methods with cutting-edge nanotechnology that converts sunlight to heat. The technology is described online this week in the Proceedings of the National Academy of Sciences.

More than 18,000 desalination plants operate in 150 countries, but NEWT's desalination technology is unlike any other used today.

"Direct solar desalination could be a game changer for some of the estimated 1 billion people who lack access to clean drinking water," said Rice scientist and water treatment expert Qilin Li, a corresponding author on the study. "This off-grid technology is capable of providing sufficient clean water for family use in a compact footprint, and it can be scaled up to provide water for larger communities."

The oldest method for making freshwater from salt water is distillation. Salt water is boiled, and the steam is captured and run through a condensing coil. Distillation has been used for centuries, but it requires complex infrastructure and is energy inefficient due to the amount of heat required to boil water and produce steam. More than half the cost of operating a water distillation plant is for energy.

An emerging technology for desalination is membrane distillation, where hot salt water is flowed across one side of a porous membrane and cold freshwater is flowed across the other. Water vapor is naturally drawn through the membrane from the hot to the cold side, and because the seawater need not be boiled, the energy requirements are less than they would be for traditional distillation. However, the energy costs are still significant because heat is continuously lost from the hot side of the membrane to the cold.
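
To see why the hot-side temperature matters so much, here is a back-of-the-envelope sketch (not from the article; the temperatures are illustrative assumptions): membrane distillation flux scales roughly with the difference in water vapor pressure between the hot and cold sides, and vapor pressure rises steeply with temperature.

import math

def water_vapor_pressure_kpa(temp_c):
    """Approximate saturation vapor pressure of water via the Antoine equation (1-100 C)."""
    a, b, c = 8.07131, 1730.63, 233.426        # Antoine constants for water (mmHg, deg C)
    p_mmhg = 10 ** (a - b / (c + temp_c))
    return p_mmhg * 0.133322                    # convert mmHg to kPa

t_hot, t_cold = 60.0, 20.0                      # illustrative feed and distillate temperatures
driving_force_kpa = water_vapor_pressure_kpa(t_hot) - water_vapor_pressure_kpa(t_cold)
print(f"Vapor pressure at {t_hot:.0f} C: {water_vapor_pressure_kpa(t_hot):.1f} kPa")
print(f"Vapor pressure at {t_cold:.0f} C: {water_vapor_pressure_kpa(t_cold):.1f} kPa")
print(f"Approximate driving force across the membrane: {driving_force_kpa:.1f} kPa")

Because flux tracks this pressure difference, any heat that leaks from the hot side to the cold side directly erodes the driving force, which is the energy penalty the article describes.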

"Unlike traditional membrane distillation, NESMD benefits from increasing efficiency with scale," said Rice's Naomi Halas, a corresponding author on the paper and the leader of NEWT's nanophotonics research efforts. "It requires minimal pumping energy for optimal distillate conversion, and there are a number of ways we can further optimize the technology to make it more productive and efficient."

NEWT's new technology builds upon research in Halas' lab to create engineered nanoparticles that harvest as much as 80 percent of sunlight to generate steam. By adding low-cost, commercially available nanoparticles to a porous membrane, NEWT has essentially turned the membrane itself into a one-sided heating element that alone heats the water to drive membrane distillation.

"The integration of photothermal heating capabilities within a water purification membrane for direct, solar-driven desalination opens new opportunities in water purification," said Yale University's Menachem "Meny" Elimelech, a co-author of the new study and NEWT's lead researcher for membrane processes.

In the PNAS study, researchers offered proof-of-concept results based on tests with an NESMD chamber about the size of three postage stamps and just a few millimeters thick. The distillation membrane in the chamber contained a specially designed top layer of carbon black nanoparticles infused into a porous polymer. The light-capturing nanoparticles heated the entire surface of the membrane when exposed to sunlight. A thin half-millimeter-thick layer of salt water flowed atop the carbon-black layer, and a cool freshwater stream flowed below.

Li, the leader of NEWT's advanced treatment test beds at Rice, said the water production rate increased greatly by concentrating the sunlight. The intensity got up to 17.5 kilowatts per meter squared when a lens was used to concentrate sunlight by 25 times, and the water production increased to about 6 liters per meter squared per hour.

Li said NEWT's research team has already made a much larger system that contains a panel that is about 70 centimeters by 25 centimeters. Ultimately, she said, NEWT hopes to produce a modular system where users could order as many panels as they needed based on their daily water demands.

"You could assemble these together, just as you would the panels in a solar farm," she said. Depending on the water production rate you need, you could calculate how much membrane area you would need. For example, if you need 20 liters per hour, and the panels produce 6 liters per hour per square meter, you would order a little over 3 square meters of panels.
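That sizing rule of thumb is simple arithmetic, sketched below. The function name is invented for illustration, not NEWT's, and the default production rate is the roughly 6 liters per square meter per hour reported for the concentrated-sunlight test:

```python
def panel_area_needed(demand_l_per_h, production_l_per_m2_h=6.0):
    """Membrane area (m^2) needed to meet an hourly freshwater demand.

    Illustrative only: assumes the ~6 L/m^2/h rate reported for the
    concentrated-sunlight test and steady output while the sun shines.
    """
    return demand_l_per_h / production_l_per_m2_h

# The example from the article: 20 liters per hour -> a little over 3 m^2.
print(panel_area_needed(20))  # ~3.33 square meters
```

For reference, the reported 17.5 kilowatts per meter squared at 25 times concentration also implies roughly 0.7 kilowatts per meter squared of unconcentrated sunlight, in line with typical peak solar irradiance at ground level.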

Established by the National Science Foundation in 2015, NEWT aims to develop compact, mobile, off-grid water-treatment systems that can provide clean water to millions of people who lack it and make U.S. energy production more sustainable and cost-effective. NEWT, which is expected to leverage more than $40 million in federal and industrial support over the next decade, is the first NSF Engineering Research Center (ERC) in Houston and only the third in Texas since NSF began the ERC program in 1985. NEWT focuses on applications for humanitarian emergency response, rural water systems and wastewater treatment and reuse at remote sites, including both onshore and offshore drilling platforms for oil and gas exploration.

Li is Rice's professor of civil and environmental engineering, chemical and biomolecular engineering, and materials science and nanoengineering. Halas is Rice's Stanley C. Moore Professor of Electrical and Computer Engineering and professor of chemistry, bioengineering, physics and astronomy, and materials science and nanoengineering. Elimelech is Yale's Roberto C. Goizueta Professor of Environmental and Chemical Engineering.

Additional study co-authors include Pratiksha Dongare, Alessandro Alabastri, Seth Pedersen, Katherine Zodrow, Nathaniel Hogan, Oara Neumann, Jinjian Wu, Tianxiao Wang and Peter Nordlander, all of Rice, and Akshay Deshmukh of Yale University.

SOURCE: Rice University

See the article here:

Freshwater from Salt Water Using Only Solar Energy - Controlled Environments Magazine

A Meeting of Minds and Computers: What Are the Costs of Using Technology to Merge Humans with Machines? – Religion Dispatches

There has been a lot of talk recently about the Singularity: the idea that we're rapidly approaching a threshold event in history when artificial intelligence will transcend human intelligence, and the resulting transformation will lead to a new form of existence utterly different from anything that has come before. Discussions of the Singularity, however, sometimes miss the fact that there are very different ways it could happen, with different levels of likelihood.

One version that has received significant press lately is the emergence of a superhuman artificial intelligence (AI). Last year AlphaGo, an AI system from the Google-owned lab DeepMind, used deep learning techniques to teach itself Go, a game far more complex than chess, and then trounced world champion Lee Sedol. Prominent scientists, Stephen Hawking included, warn that the rise of self-organized machine intelligence could be the greatest existential threat facing humanity.

At the other end of the optimism spectrum, futurist Raymond Kurzweil dreams of immortality by downloading his mind and re-uploading it to new hardware after his death, a prospect he believes is closer than most people imagine, setting its date at 2045 in his bestseller The Singularity Is Near. Kurzweil's ideas are gaining traction: he is a director of engineering at Google, and his Singularity University boasts a faculty of some of Silicon Valley's leading entrepreneurs. But his vision may contain a fatal flaw: the human brain cannot be split, like a computer, between hardware and software. Rather, neuroscientists point out that a neuron's biophysical makeup is intrinsically linked to its computations; the information doesn't exist separately from its material construction.

Will humans get there first?

There is, however, another kind of Singularity that doesn't rely on a leap of faith. Instead, it's a predictable outcome of technological enhancements already being designed and implemented. It relies on the very fact that makes Kurzweil's version unlikely: that human consciousness is embedded in a physical network of neurons. In this form of Singularity, human nervous systems across the world could connect with each other through the internet, permitting a new type of human superorganism to emerge.

Last month, Facebook and Elon Musk separately announced investments in technologies that could lead humanity to this outcome. Facebook announced plans for a "silent speech interface" using neural signal receptors that could allow users to type words into their smartphone using only their thoughts.

Billionaire Elon Musk raised the ante even further, announcing a new company, Neuralink Corporation, that aims to merge human brains with computers, with the ultimate goal of enabling what Musk calls "consensual telepathy." In this scenario, you would be able to share your thoughts and feelings with another person through a neural-computer interface. As in his other ventures, Musk is acting according to what he perceives as a grand vision for humanity. In his view, there is a race to the Singularity between humans and AI, and he wants humans to get there first, thus becoming active participants in the post-Singularity world rather than useless bystanders as AI takes over.

Speculative as these ideas may appear, the first steps down this path have already been taken. Hundreds of thousands of profoundly deaf people now hear through neural implants that use electrodes to convert sound into electrical signals delivered directly to the cochlear nerve. Patients suffering from Parkinson's disease can control their tremors through deep brain stimulation, which sends electrical pulses that modulate the brain's neural activity. Brain-controlled prosthetics are being developed to allow paralyzed patients to move artificial fingers, legs, and even cursors on a computer screen.

Currently this requires complicated surgery. However, Miguel Nicolelis, a pioneer in brain-machine interfaces, believes that by around 2030 noninvasive methods could enable people to communicate regularly with their computers using thought. Scientists have already programmed brain scanners to literally see an image someone has in their mind.

Human Superorganism?

The implications of an advanced neural-computer interface are so enormous that they challenge the imagination. People could use the interface to share ideas with each other merely by thinking them and transmitting those thoughts through a network. But the potential extends far beyond mere conceptual sharing. As the technology improved, you'd be able to share your emotions, feeling tones, and physical sensations with others. Emotional responses to public events could be uploaded and spread over the internet. Intimate relationships would be utterly transformed.

It's easy to see how the boundary lines between an individual, the computer interface, and the rest of humanity might become blurred. Once a critical mass of people is connected, would they eventually begin to identify more as a group of interconnected thoughts and sensations than as individuals? Unlike Kurzweil's Singularity, this version would not provide immortality to any of us. It might, however, cause the emergence of a new, collective entity, a self-organized human intelligence that incorporates and reflects each of the billions of individuals comprising it, in much the same way that an ant colony, sometimes referred to as a superorganism, demonstrates a collective intelligence far beyond the limitations of each individual ant.

Many people might recoil from this vision, fearing the loss of individuality it might entail. However, this wouldn't necessarily be the case. The urge to connect with each other is one of humanity's defining characteristics. Our most crucial invention, language, is essentially a vehicle to transcend each person's cognitive isolation. Another uniquely human capability, music, permits us to share emotional experiences in a meaningful and exquisite way. From this perspective, it's reasonable to see the emergence of a human superorganism as another evolutionary stage that could profoundly enrich, rather than detract from, the intrinsic experience of being human.

Or a swarm of programmed drones?

However, this vision could be hijacked by the same forces that are already steering the internet into disturbing territory. Data privacy concerns, already paramount, have been exacerbated by the recent decision of the U.S. Congress to permit internet service providers to sell detailed usage data without customers' permission. How much greater would this concern be if our very thoughts and feelings could be used as marketing fodder?

Even with the limitations of current technology, social media innovators have found ways to manipulate our hormonal responses. Pioneers in the field of captology (from the acronym CAPT, or Computers As Persuasive Technology) have learned to use what they call "hot triggers," such as the thumbs-up icon or "Like" statistics, to spark micro-doses of endorphins in our brains, causing subconscious addictive behavioral loops. It is hard to imagine the power these manipulations would hold over us if they had direct access to our brains.

Then there are concerns about what kind of forces an internet of emotions might unleash. Many observers have voiced apprehension about our post-factual world roiled by fake news stories that spread like wildfire across the internet. If raw emotions could be transmitted across humanity like a tidal wave without requiring even false facts to back them up, what would this do to the political makeup of a future neurally connected society?

The issues that are hotly debated today have implications not just for how the internet will develop in the near future, but quite possibly for how humanity will evolve. Might we one day share the unimaginable experience of being part of a human superorganism while retaining individual autonomy? Or will we simply become programmed drones thinking were making our own choices that ultimately are driven by the objectives of corporate shareholders and unscrupulous politicians?

One thing is clear: discussions about the Singularity cannot be left to a few pioneering think tanks and billionaire entrepreneurs. The implications of these new developments are enormous. The future direction of humanity may well be at stake.

More here:

A Meeting of Minds and Computers: What Are the Costs of Using Technology to Merge Humans with Machines? - Religion Dispatches

Wearables: a glimpse into the future – Gadgets & Wearables

While some have viewed the merging of technology and physiology as fantasy, reality is showing that this is no longer just science fiction. On the contrary, the beginnings of this process are well underway. We are, however, still in the toddler stage of this evolution.

Many people found the first wave of wearables came up short. Entry-level price points are high, and accuracy is not as great as it could be. It is no wonder that surveys show there is a 30% return rate and high product abandonment after six months.

But what does the future hold?

Search the internet, and you will find a plethora of articles attempting to predict where technology will lead us in the next few decades. One name that stands apart, however, is Ray Kurzweil.

Bill Gates calls Ray Kurzweil "the best person I know at predicting the future of artificial intelligence."

So who is this man?

He has received 20 honorary doctorates, been awarded honors from three U.S. presidents, and has authored seven books (five of which have been bestsellers).

Essential reading: Top fitness trackers and health gadgets for 2017

Ray Kurzweil is the principal inventor of many technologies ranging from the first CCD flatbed scanner to the first print-to-speech reading machine for the blind. He is also the chancellor and co-founder of Singularity University, and the guy employed by Google to direct its artificial intelligence development.

While he hasn't been precisely right in every single prediction, his track record in making forecasts is stunningly good.

So what does the future hold according to Ray? It makes for exciting and scary reading.

All of this sounds incredible, but think back 10 or 20 years. Just as Ray's predictions sound like science fiction today, the internet, the iPhone and Google would have seemed like science fiction not too long ago.

Ray's predictions are a byproduct of the power of Moore's Law.

Moore's Law contends that as components get smaller, products gain efficiency and become more powerful. Moore's Law is part of a continuum of exponential expansion of computational power that extends back hundreds of years: in addition to accurately charting the progress of semiconductor technology from 1960 until now, it also describes the computing technologies that came before, back to the abacus and beyond.
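To illustrate the kind of exponential curve being described, here is a minimal sketch assuming the common formulation that computing capability doubles roughly every two years; the doubling period and baseline are assumptions for illustration, not figures from the article:

```python
def projected_capability(years_from_now, baseline=1.0, doubling_period_years=2.0):
    """Capability relative to today, assuming a fixed doubling period (illustrative)."""
    return baseline * 2 ** (years_from_now / doubling_period_years)

for years in (10, 20, 30):
    print(f"{years} years out: {projected_capability(years):,.0f}x today")
# 10 years out: 32x, 20 years out: 1,024x, 30 years out: 32,768x
```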

What this means is that you can think of current wearables as the Osborne Executive portable computer strapped to your wrist.

Consider the comparison: an Osborne Executive portable computer from 1982, with a Zilog Z80 4 MHz CPU, versus a 2007 Apple iPhone with a 412 MHz ARM11 CPU. The Executive weighs 100 times as much, has nearly 500 times the volume, costs approximately 10 times as much (adjusted for inflation), and has about 1/100th the clock frequency of the smartphone.

As devices become smaller, faster and more feature-packed, jewelry-like gadgets will increase. This will be followed by conductive fabrics and sensor-clad smart garments, which we are already seeing.

Essential reading: Moving away from the wrist – the best smart clothing

As hard as it is to believe, ultimately wearables will go much further, even into ingestible technology. By the early 2020s, it is likely that we will start to rely on embedded devices: technology that is physically implanted into our bodies.

There is little doubt that human beings are increasingly merging with technology. Computers are no longer just for our desks and pockets. They are now proudly displayed on our bodies and will one day be merged with them. The innovations that will enable this are inevitable and already well underway!


See the original post here:

Wearables: a glimpse into the future - Gadgets & Wearables

Cloud-based Research Informatics: Improving Collaboration, Increasing Agility and Reducing Operating Costs – Technology Networks

To address rising cost and risk pressures, improve innovation and focus on core competencies, many science-based organizations are moving collaborative relationships beyond traditional boundaries and creating flexible networks of researchers. Some are in-house; others are with industry and academic partners, research institutes, consortia and contract research organizations (CROs). Over time, these externalized networks are increasing in size and complexity. Many combine numerous partners with diverse objectives involving single or multiple research projects that, in some cases, can tie up more than 50% of a commissioning organization's IT budget. Internet-based collaboration solutions such as email, SharePoint, VPN, Citrix and other data exchange mechanisms often introduce security challenges, incompatible data formats and the need to prepare and curate files manually. These difficulties can reduce productivity, decrease data quality, lengthen project timelines and increase failures.

With these challenges in mind, combined with an enormous pressure to reduce their informatics footprint, organizations are turning to cloud-based solutions as a scalable, secure, state-of-the-art environment for research collaboration. With cloud adoption significantly enhancing collaborative projects, increasing operational agility and lowering total cost of ownership, cloud computing has become a valuable and viable solution today; however, organizations are often uncertain about the best way to evaluate, select and implement a cloud collaboration platform.

To learn more about cloud-based research informatics and the benefits and challenges of adopting this technology, we spoke to Ton van Daelen, senior product director for collaborative sciences at Dassault Systèmes BIOVIA.

What data challenges do scientific organizations face in the modern, collaboration-driven world?

Science-based organizations across diverse industry sectors (e.g., life sciences, consumer packaged goods, energy/process/utilities and industrial equipment) are radically reinventing themselves by embracing globalization, innovating with outside partners and focusing on operational excellence. Externalized projects introduce substantial challenges that are typically not encountered in internal projects. How do you set up an IT infrastructure that supports external parties? How do you communicate effectively with partners in different geographies and time zones? How do you securely share data and reduce the amount of time required to clean up and standardize collaborator data? How do you secure the IP of different parties and share project data in real time?

Faced with these challenges, external projects often do not meet their original expectations, or worse, they fail completely. This is a huge risk factor as organizations rely more than ever on external partners to advance discovery initiatives. External collaborations require internal staff to radically change the way they access, manage and interpret research data, which can disrupt established workflows and require significant retraining. As this is not often feasible, access to collaboration data tends to be managed by a few gatekeepers, further reducing collaboration effectiveness.

Data exchange with external partners is often manual and therefore prone to error, and all collaborations bring their own sets of data representations. Highly-paid scientists can spend up to 50 percent of their time manually processing and checking collaborator data. Errors can go unnoticed for weeks or months, resulting in significant project delays and IP risk. Reliable, automated procedures for tech transfer, data standardization and data transfer to legacy databases are expensive to implement and maintain and often require highly skilled software developers.

What are the key benefits of cloud-based data management? How much of an impact can implementing this kind of system have?

Cloud-based data management provides web and mobile-accessible applications for uploading, processing, storing, searching and analyzing structured and unstructured scientific records. Most importantly, the cloud delivers improved agility and lowers total cost of ownership (TCO) for organizations tasked with responding quickly to changing business needs while also lowering costs. Being in the cloud means you can quickly and easily set up a robust collaboration system, accessible anywhere, anytime, with minimal IT support, and you only pay for what you need and use.

In a cloud environment:

Scientists can rapidly access and share research data, including experiments, chemical structures, assays and other test results, which significantly accelerates informed decisions based on the most complete and current information available. No time is lost on data transformation and interpretation. Data originators ensure that their data resides in the system as intended. The ability to annotate data provides context for other scientists using the information.

Research managers continuously gain visibility into projects along with the ability to understand the latest results across all partnering organizations. As a result, they can better schedule project tasks to maximize process efficiency. Data access is secure, controlled and not reliant on the partner, helping to protect IP and the organization's investment in collaborations.

IT organizations benefit from low IT cost of ownership resulting from the pay-as-you-go model of a hosted system. As collaboration networks evolve, the cloud system makes it easy for sponsoring organizations to quickly spin up and spin down partner engagements with the data they provide securely partitioned in the system. A highly configurable cloud system can support the easy definition of sites, projects and user roles, as well as the definition of data access and upload permissions for each of these. A cloud system certified to the ISO-27001 industry standard helps to ensure that confidential and proprietary information remains secure in the cloud.
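As a purely hypothetical illustration of the kind of site, project and role configuration described above, a sponsoring organization's access rules might look something like the sketch below; none of the names reflect BIOVIA's actual schema or API:

```python
# Hypothetical access-control configuration for a hosted collaboration system.
# Site, project, role and permission names are invented for illustration.
collaboration_config = {
    "sites": ["sponsor_hq", "cro_partner_a"],
    "projects": {
        "assay_screening": {
            "members": {
                "sponsor_hq": {"role": "owner", "permissions": ["read", "upload", "annotate", "admin"]},
                "cro_partner_a": {"role": "contributor", "permissions": ["read", "upload"]},
            },
            # Partner data stays partitioned per site; only the sponsor sees everything.
            "data_partitioning": "per-site",
        }
    },
}

def can_upload(config, project, site):
    """Check whether a given site may upload data to a given project."""
    member = config["projects"][project]["members"].get(site)
    return bool(member) and "upload" in member["permissions"]

print(can_upload(collaboration_config, "assay_screening", "cro_partner_a"))  # True
```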

Cloud-based technologies are completely new to many people in science. How can scientific organizations be confident that they are selecting a solution that will meet their requirements?

The cloud is now a proven solution used by many companies of all sizes and in many industries. Solutions like Salesforce.com, Workday, SAP SuccessFactors and Concur Technologies have been handling confidential information in the cloud for millions of users over many years. The best way to deploy a cloud solution (and deal with cloud skeptics) is to create a strong cloud vision and build a compelling cloud strategy that aligns with organizational needs. Simplify governance issues by assessing impactful use cases, current workflows/processes, security risks and the highest priority functions to be moved to the cloud. Additionally, carefully consider the cultural changes and new policies that will be needed to streamline governance and align employees with the new cloud environment. Your cloud provider should be able to guide you through the process of provisioning and managing users and groups across the cloud apps your organization needs. This includes the process of creating and managing groups, controlling who has access to apps, enabling self-signup, managing password requirements and the many other business process changes that come with a move to the cloud.

A few figures:

Worldwide spending on public cloud services will grow at a 19.4 percent compound annual growth rate (CAGR) from nearly $70B in 2015 to more than $141B in 2019 [1] (see the quick check after these figures).

More than 171,000 paid attendees from 83 countries attended Dreamforce in 2016. [2]

Morgan Stanley predicts Microsoft cloud products will be 30 percent of revenue by 2018. [1]

By 2020, penetration of software as a service (SaaS) versus traditional software deployment will be over 25 percent. Packaged software will shrink to 10 percent of new enterprise installations. [1]
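The first figure is easy to sanity-check: compounding the 2015 baseline at the stated rate for four years reproduces the 2019 estimate, as the short calculation below shows (it uses only the numbers quoted above):

```python
baseline_2015 = 70.0   # $B, "nearly $70B in 2015"
cagr = 0.194           # 19.4 percent compound annual growth rate
years = 2019 - 2015
projected_2019 = baseline_2015 * (1 + cagr) ** years
print(f"${projected_2019:.0f}B")  # ~$142B, consistent with "more than $141B"
```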

Implementing a cloud-based data management system is a big step for any organization. How can they effectively measure the success of adopting this technology once it is all up and running?

Organizations can expect a number of benefits to materialize over time: time and cost savings through data exchange and communication automation, a lowered TCO resulting from infrastructure in the cloud, and shorter project timelines and improved efficiency with cloud agility, making it possible for organizations to adapt to changing business environments by spinning collaborations up and down quickly in the cloud. Key Performance Indicators (KPIs) our customers typically use are 1) scientist productivity, 2) IT spending per user/application, 3) time spent implementing new software applications and 4) average time to start up, run and close down a project.

References

1. Louis Columbus, Roundup of Cloud Computing Forecasts and Market Estimates, 2016

https://www.forbes.com/sites/louiscolumbus/2016/03/13/roundup-of-cloud-computing-forecasts-and-market-estimates-2016/#192bad382187

2. https://www.salesforce.com/blog/2016/10/dreamforce-16-by-the-numbers.html

Continued here:

Cloud-based Research Informatics: Improving Collaboration, Increasing Agility and Reducing Operating Costs - Technology Networks

6 great Android features missing from iOS 11 – PCWorld

Call me a flip-flopper, but the new features in iOS 11 have me thinking of jumping back to iOS after switching to Android barely a year ago.

Indeed, the new version of iOS brings such enticing features as a revamped App Store, a customizable Control Center, and drag-and-drop for iPad users, plus such catch-up features as one-handed typing and easy person-to-person payments.

But returning to iOS would mean leaving behind many Android features I've grown to love, from the ability to set up multiple user profiles to one-touch Google searches on whatever's onscreen at a given moment.

Read on for six awesome Android features that iOS 11 has yet to match, starting with...

Given all the innovations coming to the iPad courtesy of iOS 11, from the ability to drag and drop elements from one side of the split screen to the other to the new, persistent app dock, you'd think Apple would toss in a feature that's been standard on Android for years: user profiles, perfect for letting family members in a one-iPad household create their own personal iPad spaces.

If you've been waiting for Android-like user profiles to arrive on iOS, bad news: they're still missing in iOS 11.

For whatever reason, though (privacy concerns, perhaps?), Apple has yet again passed on adding user profiles to the iPhone or iPad. That means if you share your iPad with your toddler or teenager, you're sharing all your iPad data, too, including your e-mail, your open browser tabs, your Facebook app, everything.

Android has really spoiled me with its "automatic rules" for Do Not Disturb mode. With automatic rules, you can set up multiple Do Not Disturb schedules for weeknights, weekends, meetings, and any other scenarios you dream up. For example, I have Do Not Disturb set to turn itself off early (as in 6 a.m.) on weekday mornings, while on weekends, Do Not Disturb keeps things quiet until about 8.

Android's "automatic rules" let you create multiple Do Not Disturb schedules, as opposed to the single Do Not Disturb schedule in iOS 11.

In iOS 11, though, Do Not Disturb mode still lets you set only a single schedule, meaning you can't set Do Not Disturb to give you more quiet time on weekends or during meetings.

Yes, the new "Do Not Disturb While Driving" feature (which automatically silences notifications whenever your iPhone senses you're driving) is a nice innovation, but it's too bad iOS 11 didn't catch up to Android's Do Not Disturb features.

As with previous versions of Apple's mobile software, iOS 11 lets you perform quick web searches on selected text via Spotlight, iOS's universal search feature. That's helpful if you want a deep search on a narrow selection of text, but sometimes I'm looking for a broader search of everything on my screen.

Android's "screen search" feature lets you do a one-tap Google search on everything that's on your screen, a feat that iOS 11 has yet to master.

Here's where Android's Screen Search feature comes in handy. With a single tap of the What's on my screen button in Google Assistant, Android will scan the entire screen and return any relevant search results, handy if you want a quick, 360-degree cheat sheet on a news article or web page. Pretty neat, and there's no real equivalent on iOS, not even once iOS 11 arrives.

Here's an Android feature I'd sorely miss even though I know it's more cosmetic than anything else. The "Clear All" button on Android's Overview screen instantly closes all your open app windows, leaving you with a soothing "No recent items" message when you tap the Overview button again. For a neat-freak like me, tapping the Clear All button never gets old.

The "Clear All" button on Android's Overview screen might be the feature that Android-to-iOS 11 switchers miss the most.

On iOS, though (and yes, this includes iOS 11), there's no easy way to clear out the massive stack of app windows on the multitasking screen, forcing you to flick up on dozens of individual windows until the coast is clear.

Now, I'm sure iOS comes with marvelous under-the-hood tools that manage the resources used by your apps and automatically suspend those that have been sitting untouched in the background for too long.

Still, though, I know it'll kill me the first time my thumb reaches for the non-existent Clear All button on my new iPhone 8 (assuming I actually make the big leap).

You've probably heard about the new storage-saving features in iOS 11, particularly when it comes to the storage-hogging Photos app.

iCloud Photo Library will help shave some of the storage space consumed by your iPhone snapshots, but Google Photos for Android can wipe all local pictures and videos, perfect for keeping photo storage to an absolute minimum.

For example, Apple announced support for a new image format (HEIF, for "High Efficiency Image Format") that can halve the amount of storage gobbled up by your snapshots.

Also coming in iOS 11: shortcuts that do a better job of recommending storage-saving features like iCloud Photo Library, which uploads all your pictures and videos to the cloud and then automatically pares down the number of images sitting on your iPhone or iPad.

Those are worthwhile improvements, but here's something I'd sorely miss if I went back to iOS: the "free up space" feature in Android's Photos app, which instantly zaps each and every local snapshot and video stored on your handset.

Thanks to the "free up space" feature, photos take up less than 100MB of space on my 16GB Nexus 5X. On the other hand, the Photos app on my old iPhone 6 consumes a ridiculous 17GB of storage, even with iCloud Photo Library turned on (and yes, with the Optimize iPhone Storage option enabled).

Bonus tip: The iOS version of Google Photos has a "free up space" feature just like its Android counterpart, meaning you could clear up tons of storage space on your iPhone or iPad by uploading your photos to Google and then using the "free up space" option to delete your local copies. Keep in mind, though, that if you're using Google Photos and iCloud Photo Library at the same time, wiping your local images and videos with Google Photo's "free up space" feature will also delete those photos from iCloud, so make sure all your local image files are safely backed up first.

As with the latest version of Google Keyboard for Android, iOS 11 will bring symbol shortcuts to letter keys on the iPad keyboard, handy for saving a few keystrokes when you need to type a number key, an ampersand, or another common symbol.

Thanks to iOS 11, symbol shortcuts on letter keys (shown here on Google Keyboard for Android phones) are finally coming to the iPad; not so for iPhone, unfortunately.

That's a welcome change, but unfortunately, iOS 11's so-called "QuickType" keyboard is only coming to iPad, not iPhone. Now, you could argue that the iPhone keypad is too small for symbol shortcuts, but the shortcuts on Google Keyboard work just fine on my five-inch Nexus 5X.

Read the original:

6 great Android features missing from iOS 11 - PCWorld

Fish as Medicine for Rheumatoid Arthritis – New York Times


Eating fish may help reduce the joint pain and swelling of rheumatoid arthritis, a new study has found.

Researchers studied 176 people in a larger health study who had had physical exams and blood tests and filled out food frequency questionnaires that indicated their consumption of various types of non-fried fish.

The study, in Arthritis Care & Research, categorized the participants into groups by fish consumption: less than one serving a month, one a month, one to two a week, and more than two a week. To rate the severity of symptoms, they used a disease activity score that assigns a number based on the degree of swelling and pain.

After controlling for race, sex, body mass index, smoking, education, fish oil supplement use, duration of rheumatoid arthritis symptoms and other health and behavioral characteristics, they found the average disease activity score in each group declined as fish intake increased.
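To make "controlling for" concrete, here is a minimal sketch of a covariate-adjusted comparison in Python. It is illustrative only: the data are synthetic, the column names are invented, and the study's actual analysis used its own disease activity score and covariate set.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data (invented, purely for illustration): one row per
# participant, mirroring the kinds of covariates the study adjusted for.
rng = np.random.default_rng(0)
n = 176
df = pd.DataFrame({
    "fish_group": rng.choice(["<1/month", "1/month", "1-2/week", ">2/week"], size=n),
    "bmi": rng.normal(27, 4, size=n),
    "smoker": rng.choice([0, 1], size=n),
    "fish_oil": rng.choice([0, 1], size=n),
})
# Fake disease-activity scores so the example runs end to end.
df["disease_activity"] = rng.normal(3.5, 1.0, size=n)

# Linear model: disease activity as a function of fish-intake group,
# adjusted for covariates (a stand-in for the study's adjustment set).
model = smf.ols(
    "disease_activity ~ C(fish_group) + bmi + C(smoker) + C(fish_oil)",
    data=df,
).fit()
print(model.params)  # adjusted group effects relative to the reference category
```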

The lead author, Dr. Sara K. Tedeschi, an associate physician at Brigham and Women's Hospital in Boston, said that this is an observational study and does not prove cause and effect.

Still, the observed reduction in pain and swelling from the lowest to the highest fish-intake group is clinically significant. The magnitude of the effect, she said, is large: about one-third of the expected magnitude of the standard drug treatment of rheumatoid arthritis with methotrexate.

See the rest here:

Fish as Medicine for Rheumatoid Arthritis - New York Times

George T. Hare, 86, chose medicine over baseball – Philly.com


George T. Hare threw a mean pitch and was a cleanup hitter back in the day. When he had to choose between professional baseball and going to medical school, however, he picked the latter, as he felt a strong calling to become a doctor.

Dr. Hare, 86, of Cherry Hill, formerly of Haddonfield, died of heart failure on Saturday, June 17, at his home.

Building a specialty in geriatric medicine, Dr. Hare was diligent in making sure his patients were comfortable, said daughter Patricia.

"He loved medicine more than anything else," she said. "He not only treated the patient, but he treated the family as well."

Joseph Costabile, a surgeon at Cooper University Health Care who worked with Dr. Hare, called him "100 percent passionate about patient care." He was a tough doctor who checked on his patients to make sure his orders had been carried out properly.

"If you didn't take good care of his patients, there would be hell to pay," Costabile said.

George T. Hare with his wife, JoAnn.

In 1948, Dr. Hare graduated from Haddon Heights High School, where he played baseball, basketball, and football. In 1991, he was inducted into the school's Sports Hall of Fame.

After high school, he attended Gettysburg College, where he played basketball and baseball. In basketball, he was an All-State honorable mention and All-Little Three player, according to Gettysburg's website.

In college baseball, Dr. Hare was both a top starting pitcher and the cleanup hitter in the batting order. He hit safely in 14 games as a sophomore and later hit numerous home runs, according to the website. Dr. Hare is listed as hitting .295 as a sophomore and over .400 as a senior, switching between the mound and left field. He pitched in defeats against Navy and the Universities of Delaware and Pittsburgh. He was inducted into the Gettysburg Sports Hall of Fame in 1995.

Dr. Hare turned down an offer by the Cleveland Indians in order to continue his education.

He majored in biology at Gettysburg, receiving his bachelor's degree in 1952.

After Gettysburg, he attended New York Medical College, receiving his medical degree in 1956. He did his residency at Thomas Jefferson University Hospital in Philadelphia, and interned at what is now Cooper University Hospital in Camden.

During his medical career, which spanned more than 50 years in New Jersey and Pennsylvania, Dr. Hare held numerous leadership positions, including medical director of the Camden County Health Services Center and head of the division of geriatric medicine that he created at Cooper. He also was part of a homebound program in geriatric medicine in Camden, and helped establish New Jersey's first palliative care unit in a long-term care facility, his family said.

Dr. Hare was a member of the Camden County Medical Society, serving as secretary and participating with the group at the national level. He served as a member of the Gettysburg Board of Fellows, and was in the Army Reserve Medical Corps, reaching the rank of major.

His daughter said her father had an innate understanding of the elderly, a trait she believes was passed down by his parents, who were kind and generous. He was deeply devoted to his family and was a dedicated physician who touched countless lives, she said.

In his spare time, Dr. Hare golfed, played the slot machines in Atlantic City, and loved his pets.

In addition to his daughter, Dr. Hare is survived by his wife, JoAnn; son Tom; five grandchildren; and three great-grandchildren. He was preceded in death by sons John and David.

A viewing for Dr. Hare will be from 10 to 10:45 a.m. Friday, June 23, at St. Mary's Episcopal Church, 501 Green St., Haddon Heights. A memorial service will follow at 11. Interment will be at Locustwood Cemetery in Cherry Hill.

Donations may be made to the Voorhees Animal Orphanage, 419 Cooper Rd., Voorhees, N.J. 08043, or vaonj.org.

Published: June 20, 2017 3:01 AM EDT | Updated: June 20, 2017 6:22 PM EDT


Read the original here:

George T. Hare, 86, chose medicine over baseball - Philly.com

Liberty Twp. trustees want Millikin Road interchange to be priority – Hamilton Journal News

LIBERTY TWP.

Liberty Twp. Trustee Tom Farrell says he is prepared to do battle with the state over a new Millikin Road interchange he and his fellow trustees consider a top priority, even over fixing Liberty Way.

The trustees met Tuesday afternoon with their consultant and Butler County Engineer Greg Wilkens to go over their purpose and need statement for the Ohio Department of Transportation. They were told they really don't have the traffic and safety data to support the interchange at Interstate 75.

"The purpose and need has to have as its foundation congestion and safety before economic development. It has to," consultant Andy Shahan told the trustees. "The data has to show that."

The township needs the interchange to open up its northeastern edge, which is ripe for commercial development. But Farrell said it is about more than just economic development: the township needs to build an entire transportation network in that area, which includes extending Cox Road north to Millikin Road.

He said the county and the township capitulated when the Liberty Way interchange at Ohio 129 was built and now they have a problem interchange that needs fixing. He said they either had to agree to the states plan or lose funding.

"It was basically build it their way or not at all. We did the right thing and built it," Farrell said. "I want to do the right thing this time and I want to fight about it and I want to make the right business decision. This is silly, we're spending this kind of money on Liberty Way as opposed to putting in Millikin to support this area for generations."

Wilkens recently got the county commissioners' concurrence to pay a consulting firm almost $1 million to fine-tune plans for fixing Liberty Way. Rough estimates on that project are $30 million to $40 million.

Trustee Steve Schramm said there is a logical business reason for putting Millikin ahead of the Liberty Way project. If Millikin were done first, development would come and bring in revenue to help pay for the Liberty Way project.

"How much life span can you buy by getting Millikin Road in, bleeding traffic off (Liberty Way) and letting the income stream from Millikin pick up some of that load from Liberty going forward?" Schramm said. "The other way around, we don't have the income stream to offset Liberty."

Wilkens said the state is a lot more receptive to what locals want than it used to be, but it is still going to be a hard sell.

"I understand what you're driving at, but I don't think you're ever going to get recognition from ODOT and the feds on that. They recognize the problem at Liberty; they are on the property at Liberty," he said. "To say that by putting Millikin up I'm going to resolve part of this problem, I don't think you'll ever convince them."

Trustee President Christine Matacic said she also believes if they can show the state all the traffic that is currently using Liberty Way and Ohio 63 and growth projections in both Butler and Warren counties on that stretch of Interstate 75, they could make a better case for the Millikin Road interchange.

"We need those numbers, not just for Liberty Twp. but Mason, Deerfield, Turtlecreek, Monroe, Fairfield Twp. and West Chester Twp.," she said. "Because they're all surrounding us and they're all utilizing these pathways that are being created as well as the business centers that are being created."

Shahan told the trustees his team could use some of their institutional knowledge in strengthening the data and they will be back probably within a month with a new statement to help convince the state Millikin is a must-do project.

Original post:

Liberty Twp. trustees want Millikin Road interchange to be priority - Hamilton Journal News

Athletics lead division after beating Liberty twice – Lee’s Summit Journal

Shane Boyer of the Midwest A's, who slid safely into second base during a recent game against Sabetha, had four hits and drove in three runs in the first of two victories last weekend over the Liberty Monarchs.

JIM ARNOLD Special to the Journal

The Midwest Athletics took over first place in the Mid-Plains League's East Division with two close victories over the Liberty Monarchs last weekend.

The A's started the series with an 8-6 victory over the former division leaders June 15 at Liberty High School. Midwest broke a 6-6 tie in the top of the ninth inning with an RBI single from Luke Knight and an error to win the game.

Michael Briggs' two-run double in the first inning helped the A's take a 3-0 lead. The A's had a 4-0 lead before the Monarchs tied it with four runs in the bottom of the fourth and tied it again with two in the seventh after the A's scored twice in the sixth.

Shane Boyer went four for five with three RBIs and Elijah Dilday went four for five and scored two runs for the A's, who cranked out 17 hits off two Liberty pitchers.

In the second game June 16, Briggs hit a two-run homer during a three-run seventh inning that broke a 5-5 tie and lifted the A's to an 8-5 victory. Midwest trailed 4-0 after the first and trailed 5-3 before tying the game with two runs in the sixth.

Tyler Perkins went three for five and drove in three runs, while Connor Sorge went two for four with two RBIs for the Athletics. Dalen Blair picked up the win with three scoreless innings in relief of starter Dalton Spurgeon.

The A's, 9-3, maintained their new lead in the standings after blowing a seven-run lead and losing to the Rossville Rattlers 12-10 Monday night in Rossville, Kan.

Midwest scored five runs on four hits in the first inning and held a 10-3 lead until Rossville struck for nine runs on seven hits in the seventh. Two of the Rattlers' hits were home runs.

Nate Vossman hit a three-run homer in the first for the A's, and Dusty Stroup added a two-run shot in the second.

The A's will host the Baldwin City Blues for a three-game series Thursday through Saturday at Belton High School's Al Hamra Stadium. All three games will start at 5 p.m.

Visit link:

Athletics lead division after beating Liberty twice - Lee's Summit Journal