Life and Love in the Uncanny Valley | Visual Science


David Hanson’s robots are by now somewhat familiar faces, including his Einstein robot currently being used as a research tool at Javier Movellan’s Machine Perception Lab at UCSD, and the punk rock conversationalist Joey Chaos. A less familiar face is that of Bina Rothblatt, the blonde at the end of the table in the above photograph. Bina is a robot commissioned by Sirius Satellite Radio inventor Martine Rothblatt to look like her beloved wife. Take that, uncanny valley!

Photographer Timothy Archibald and I worked closely on this project with the idea of creating portraits, and maybe a kind of family portrait, of the Hanson robots. After flying to Dallas, Texas, to shoot Hanson and his robots at his home and workshop, Archibald wrote to me.

“Here is a big house in a Texas suburb that looks normal on the outside. On the inside it is a robot-making company made up of a floating array of 9-12 employees sculpting things, working on the electrical stuff and writing code for software…taking over the living room, den, kitchen, etc. On the upstairs level is where Hanson, his wife and 3 year old live. They are in month three of this arrangement. There is no down time. People trickle in at 11:00 AM and stay until 1-3 AM every day, including weekends. They are cranking right now, trying to hit deadlines with The Android Portrait of Bina Rothblatt as well as a potential consumer robot called ZENO. Curiously, Hanson’s son is also named Zeno. There is a story on how that came to be, of course…”

To see more photography from this story, check out DISCOVER magazine’s May 2010 issue on newsstands now.

Katherine Batiste of Hanson Robotics works at a computer while “An Android Portrait Of Bina Rothblatt” sits on the table.



Happy 20th anniversary, Hubble! | Bad Astronomy

Tomorrow marks the 20th anniversary of the launch of the Hubble Space Telescope. I spent ten years of my life working on that magnificent machine, from using observations of a supernova for my PhD, all the way to helping test, calibrate, and eventually use STIS, a camera put on Hubble in 1997.

Last year, I published Ten Things You Don’t Know About Hubble, and I don’t think I can really add much to it here. I also have a lot of new readers since then, so I’ll simply repost it now as my tip o’ the dew shield to the world’s most famous observatory.

Introduction

On April 24, 1990, the Space Shuttle Discovery roared into space, carrying on board a revolution: The Hubble Space Telescope. It was the largest and most sensitive optical-light telescope ever launched into space, and while it suffered initially from a focusing problem, it would soon return some of the most amazing and beautiful astronomical images anyone had ever seen.

Hubble was designed to be periodically upgraded, and even as I write this, astronauts are in the Space Shuttle Atlantis installing two new cameras, fixing two others, and replacing a whole slew of Hubble’s parts. This is the last planned mission, ever, to service the venerable ’scope, so what better time to talk about it?

Plus, it’s arguably the world’s most famous telescope (it’s probably the only one people know by name), and yet I suspect that there are lots of things about it that might surprise you. So I present to you Ten Things You Don’t Know About the Hubble Space Telescope, part of my Ten Things series. I know, my readers are smart, savvy, exceptionally good-looking, and well-versed in things astronomical. Whenever I do a Ten Things post some goofball always claims they knew all ten. But I am extremely close to being 100% positive that no one who reads this blog will know all ten things here (unless they’ve used Hubble themselves). I have one or two big surprises in this one, including some of my own personal interactions with the great observatory!

Ten Things You Don’t Know About Hubble

 


Good teachers help students to realise their genetic potential at reading | Not Exactly Rocket Science

Genetic studies suggest that genes have a big influence on a child’s reading ability. Twins, for example, tend to share similar reading skills regardless of whether they share the same teacher. On the other hand, other studies have found that the quality of teaching that a child receives also has a big impact on their fluency with the written word. How can we make sense of these apparently conflicting results? Which is more important for a child’s ability to read: the genes they inherit from their parents, or the quality of the teaching they receive?

According to a new study, the answer, perhaps unsurprisingly, is both. Genes do have a strong effect on a child’s reading ability, but good teaching is vital for helping them to realise that potential. In classes with poor teachers, all the kids suffer regardless of the innate abilities bestowed by their genes. In classes with excellent teachers, the true variation between the children becomes clearer and their genetic differences come to the fore. Only with good teaching do children with the greatest natural abilities reach their true potential.

This study demonstrates yet again how tired the “nature versus nurture” debate is. As I wrote about recently in New Scientist, nature and nurture are not conflicting forces, but partners that work together to influence our behaviour.

This latest choreography of genes and environment was decoded by Jeanette Taylor from Florida State University. She studied over 800 pairs of Florida twins in the first and second grades. Of the pairs, 280 are identical twins who share 100% of their DNA, and 526 are non-identical twins who share just 50% of their DNA. Twin studies like these are commonly used to understand the genetic influences on behaviour. If a trait is strongly affected by genes, the variation within pairs of identical twins should be less pronounced than within pairs of non-identical ones.

Florida just happens to collect data on the reading skills of its young children, using a test called the Oral Reading Fluency (ORF) test. The twins’ scores told Taylor how good they were at reading, and the improvement in the scores of their classmates told her how good their teachers were. Crunching the numbers, Taylor found that genes influenced around half of the variation in reading scores (47%), while shared environments (like a common household) accounted for 37% and non-shared environments accounted for 16%.
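The arithmetic behind that kind of decomposition can be sketched with Falconer's classic formula, which compares how strongly identical (MZ) and fraternal (DZ) twins correlate on a trait. The twin correlations below are illustrative values chosen so that the split comes out at the reported 47%/37%/16%; they are not numbers from the paper, which fits a more sophisticated model.

```python
# Falconer's ACE decomposition from twin correlations.
# r_mz and r_dz below are illustrative, chosen to reproduce the
# study's reported 47% / 37% / 16% split; the paper itself uses a
# more elaborate multilevel model.

def ace_decomposition(r_mz, r_dz):
    """Estimate additive-genetic (A), shared-environment (C) and
    non-shared-environment (E) variance components from the trait
    correlations of identical (MZ) and fraternal (DZ) twin pairs."""
    a = 2 * (r_mz - r_dz)   # heritability: MZ pairs share twice the DZ genetic overlap
    c = r_mz - a            # shared environment: MZ similarity that genes can't explain
    e = 1 - r_mz            # non-shared environment (plus measurement error)
    return a, c, e

a, c, e = ace_decomposition(r_mz=0.84, r_dz=0.605)
print(f"genes: {a:.0%}, shared env: {c:.0%}, non-shared env: {e:.0%}")
```

Doubling the MZ-DZ gap works because identical twins share twice the genetic overlap of fraternal twins, so whatever extra similarity MZ pairs show beyond that doubled gap gets attributed to the environment they share.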

Genes are clearly important, but teaching mattered too. At the highest echelons of teaching quality, genes explained around 70% of the variance in reading scores. At the lowest troughs, they only accounted for around 30%.

Taylor confirmed the effect of teaching quality in a couple of different ways. She took a sample of 42 pairs of identical twins and found that those whose reading skills were below average did indeed have poorer teachers than those with above-average skills. She also looked at 216 pairs of identical twins, where each twin had a different teacher. Among these children, the difference in quality between their teachers strongly predicted the difference in their reading abilities.

These results are somewhat different to previous genetic studies, which found that around 65% of the variation in children’s reading skills can be explained by genetic factors. These same studies have suggested that outside influences, like family and school, are far less important – the genes are at the wheel, and the environment is in the backseat shouting instructions.

But Taylor says that the twins in these earlier studies often came from similar and wealthy backgrounds. If they all get similar educations, that would mask the effect of teaching. So she deliberately set out to recruit twins from a wide variety of ethnic groups and social backgrounds. A third were Hispanic, a third were white, and around a quarter were black. Half of the children came from families that qualified for free lunches on the grounds of low income.

There are many caveats to the study, which Taylor herself lists. The reading improvements of a classroom may reflect the school, students or resources, as well as the quality of teaching. You might see different results if you used different measures of teaching quality (like class observations), or of reading skill. The effect of teaching quality might also be different in higher education, or in richer schools.

Nonetheless, Taylor’s work does demonstrate that poor teaching constricts genetic variation in reading ability so that it never germinates. Only in the light of quality teaching does that variation bloom. Teachers should be pleased with the result, for, as Taylor says, “Reading will not develop optimally in the absence of effective instruction.” However, putting really good teachers into a classroom won’t magically turn all the students into literary Jedis, and (contrary to what some parents expect) it won’t benefit all students equally.

I wrote about something similar in my New Scientist piece – a variant of the MAOA gene can lead to aggressive behaviour, but only in people who were raised in abusive environments. Again, the environment sets the stage in which genetic actors can express themselves.

Reference: Science 10.1126/science.1186149

Image: by Tostie14


Nearly 100% Out-of-Africa in the past 100,000 years | Gene Expression

Since I’ve been talking about the possibility of admixture with “archaics” (I’m starting to think the term is a bit too H. sapiens sapiens-centric, is the Neandertal genome turning out to have more ancestral alleles?) I thought I’d point to a paper out in PLoS ONE which reiterates the basic fact that the overwhelming genetic evidence today suggests a massive demographic expansion from an African population within the last 100,000 years. Study after study has supported this contention since the mid-1980s. The question is whether this is the exclusive component of modern human genetic ancestry, which is a somewhat more extreme scenario. In any case, the paper is Formulating a Historical and Demographic Model of Recent Human Evolution Based on Resequencing Data from Noncoding Regions:

Our results support a model in which modern humans left Africa through a single major dispersal event occurring ~60,000 years ago, corresponding to a drastic reduction of ~5 times the effective population size of the ancestral African population of ~13,800 individuals. Subsequently, the ancestors of modern Europeans and East Asians diverged much later, ~22,500 years ago, from the population of ancestral migrants. This late diversification of Eurasians after the African exodus points to the occurrence of a long maturation phase in which the ancestral Eurasian population was not yet diversified.

They took 213 individuals, a little over half from diverse African groups, and the other half split evenly between Europeans and East Asians, and sequenced 20 distinct noncoding autosomal regions of the genome. ~27 kilobases per person. The noncoding part is important because they are trying to look at neutral regions of the genome, not subject to natural selection (this is obviously an approximation, as there is some evidence that even noncoding regions may have some selective value). The variation is what you’d expect, Africans more varied than non-Africans, and the two Eurasian populations are distinct from each other, but less so than either is from the Africans. Lots of statistics ensue, and an “Approximate Bayesian Computation (ABC) analysis.” I’ll cut to the chase, the highest probability model is illustrated in panel A of figure 4. Expansion out of Africa ~60,000 years ago, major bottleneck, a ~40,000 year interregnum where there was a relatively unified Eurasian population genetically, and then a separation between East and West Eurasians ~20,000 years ago.
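The "Approximate Bayesian Computation" step can be illustrated with a toy rejection sampler: propose parameters from a prior, simulate data, and keep only the proposals whose simulated summary statistic lands close to the observed one. The sketch below infers a single scaled mutation parameter from an invented count of segregating sites; the actual study simulates full demographic scenarios (bottlenecks, splits, migration) and compares many summary statistics, so every number here is made up for illustration.

```python
import math
import random

random.seed(42)

S_OBS = 30        # "observed" number of segregating sites (invented)
TOLERANCE = 2     # accept a simulation if |S_sim - S_OBS| <= TOLERANCE
N_SIMS = 50_000

def poisson_draw(lam):
    """Knuth's algorithm: one Poisson(lam) variate using only the stdlib."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def simulate_segregating_sites(theta):
    # Under the infinite-sites model, the number of segregating sites is
    # roughly Poisson with mean proportional to theta; a single Poisson
    # draw stands in for a full coalescent simulation here.
    return poisson_draw(theta)

accepted = []
for _ in range(N_SIMS):
    theta = random.uniform(1.0, 100.0)   # flat prior on theta
    if abs(simulate_segregating_sites(theta) - S_OBS) <= TOLERANCE:
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
print(f"accepted {len(accepted)} of {N_SIMS} simulations; "
      f"posterior mean theta ~ {posterior_mean:.1f}")
```

The accepted proposals approximate the posterior distribution, so the model whose simulations most often match the data wins, which is how the paper ranks its competing demographic scenarios.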

[Figure 4 from the paper: the candidate demographic models.]

I’ll stipulate that I haven’t dug deep into the statistics, nor would I really comprehend all the details if I did spend a weekend on it. But I’m rather skeptical of the 40,000 year period of a common Eurasian population. Reading the text where they discuss this finding it seems clear that it was surprising to the authors, and I’m not sure how convinced they are about it either. It is interesting that the second most probable scenario, B, is a simultaneous expansion out of Africa by two different groups which led to East and West Eurasians. That makes me a little less confident about the details on Eurasian demographic history overall than I’d already been. They do note that some Y chromosomal data imply that all Eurasian populations may have derived from a Central Asian group, and the settlement of Europe ~35,000 years ago may actually indicate population replacement (though it’s pretty clear that many Central Asian groups have been heavily admixed recently with the incursion of Turks from Mongolia in historical time overlain upon an Iranian substrate, and some sequencing of ancient Cro-Magnon mtDNA shows that their haplogroup is still found in many Eurasian, and even New World, populations). Perhaps there is a more complicated story to be told about the replacement of early modern H. sapiens sapiens by later H. sapiens sapiens. I wouldn’t discount it, but one analysis does not push me to consider this at all likely. Additionally, they obviously couldn’t test the “two wave” Out-of-Africa model, whereby there was a southern migration which skirted the Indian Ocean along with a northern wave which pushed into Central Asia; they didn’t have any “southern” Eurasian samples. Also, I do want to make a note of the fact that they had a lot more parameters in their model than I’m mentioning, including migration between the two Eurasian groups.

But let’s jump to the conclusion and highlight a portion which is relevant to what I’ve been discussing on this weblog over the past few days. As I observe above they constructed scenarios with different parameters to see which fit the data best, and one of those parameters was interbreeding with older hominin groups in Eurasia. Here’s what they say in the discussion:

For those historical and demographic parameters that have been previously studied, our co-estimations are in agreement with previous reports, highlighting the general accuracy of our estimates. For example, our estimation of the replacement rate of archaic hominids by modern humans, although indicating that the introgression of archaic material into the gene pool of modern humans has been minimal, did not rule out the presence of minor archaic admixture of other hominids in modern humans in agreement with previous observations…However, it is important to emphasize that our inferences are based on non-coding neutral regions of the genome and that adaptive introgression from archaic to modern humans may have occurred to a greater extent…Indeed, in contrast to neutral alleles, adaptive variants may attain high frequencies by natural selection after minimal genetic introgression. Future studies comparing coding-sequence variation in modern humans and extinct hominids (e.g. Neanderthals) should help to answer this question.

Their models don’t offer any plausible scenarios where more than 1% of the sequence which they analyzed was derived from populations which were not from the recent Out-of-Africa movement. But, they do specifically say that they lose power to ascertain whether there was admixture at levels below 1%. At some point in the medium-term future, when we have a fair amount of ancient DNA from Neandertals sequenced, as well as a lot of genomes of modern human beings, if we still don’t find any evidence for alleles which have introgressed from other lineages which had long been separated, the time for hedging may be over. But at this point there’s still some wiggle room. What I’m wondering, though, is how the University of New Mexico group found lots of evidence of introgressed lineages when other groups have not. Granted, they had 10 times as many individuals and more diverse populations, but presumably far less of the genome. If there was admixture which we could detect, in light of the nearly two decades of this sort of stuff, I assumed it would be cases of adaptive introgression. Here a very low level of admixture could still lead to the increase in frequency of a haplotype which bears the hallmarks of having been in a distinct population from H. sapiens sapiens for long periods of time (like haplogroup D for the microcephalin gene). In other words, I assumed that evidence of introgression would be a story of genetics & natural selection and not genomics & admixture. For instance, particular metabolism genes and the like, which the newly arrived African populations might have picked up just as they’d eventually develop their own adaptations from mutation or extant variation if they didn’t admix. I guess 614 microsatellites may not count as genomics, but if adaptive introgression on a few select genes was how we’d detect interbreeding between native Eurasian groups and the Africans, this is not a method I’d expect to turn up any evidence of that.

Citation: Laval G, Patin E, Barreiro LB, Quintana-Murci L (2010). Formulating a Historical and Demographic Model of Recent Human Evolution Based on Resequencing Data from Noncoding Regions. PLoS ONE 5(4): e10284. doi:10.1371/journal.pone.0010284

Jeff Hanley Openly Defies White House Policy

NASA program: Ares I backers work to save rocket despite White House wishes, Orlando Sentinel

"The day after President Obama visited Kennedy Space Center last week to unveil his new vision for NASA, the manager of the moon program that Obama wants to kill told his team to draw up plans in case Obama fails to win congressional support. In an e-mail sent April 16, NASA's Constellation program manager, Jeff Hanley, instructed his managers to "prioritize" all the resources they have at their disposal under this year's budget to plan for test flights of prototypes of the troubled Ares I rocket that Obama aims to cancel. Hanley also orders them to look at ways to shrink the Constellation program in such a way that it can fit in a tighter rocket-development budget backed by the White House. The move comes as some members of Congress have pledged to stop Obama and save Ares. "This direction," Hanley wrote in the e-mail, "remains consistent with ... policy to continue program execution and planning in the event that the program or parts thereof will continue beyond [this financial] year."

20 Years of Hubble Science

Hubble Space Telescope Celebrates 20 Years of Discovery

"As the Hubble Space Telescope achieves the major milestone of two decades on orbit, NASA and the Space Telescope Science Institute, or STScI, in Baltimore are celebrating Hubble's journey of exploration with a stunning new picture and several online educational activities. There are also opportunities for people to explore galaxies as armchair scientists and send personal greetings to Hubble for posterity."

Lockheed Martin-Built Hubble Space Telescope Marks 20 Years of Astronomical Discovery

"NASA's Hubble Space Telescope (HST), built and integrated at the Lockheed Martin Space Systems facility in Sunnyvale, was launched 20 years ago aboard Space Shuttle Discovery, on April 24, 1990, ushering in a new golden age of astronomy. HST was released by the crew into Earth orbit the next day and the universe hasn't looked the same since."

Mineta Praises Obama NASA Plan

Former Secretary of Transportation Mineta Praises Obama's NASA Plan For Jump-Starting Commercial Spaceflight

"Norman Mineta, who served as Secretary of Transportation under President George W. Bush and as Secretary of Commerce under President Bill Clinton, and who represented Silicon Valley in Congress for more than 20 years, has published an op-ed stating, "With Russia, China and India close on our heels, the only way we can maintain our hard-won leadership in space transportation is by employing America's unique entrepreneurial strength. Obama's new plan for NASA does exactly that."

Critics With Inconsistent Arguments

Obama should rethink NASA's space program, editorial, Washington Post

"... with the cancellation of Ares I, the administration wants to rely on private companies to develop vehicles to get passengers to low-Earth orbit. These "space taxis" would stretch current capabilities, but the private sector could play an important, and potentially cost-effective, role. It is odd for those who accuse this administration of wanting to take over the private sector to blast this initiative."

Shuttle backers say space station needs safety net

"[Senator] Hutchison's scenario "says you have to protect against something that's extremely unlikely," said John Logsdon, a space historian and former director of the Space Policy Institute at George Washington University. "I think it is trying to make an argument in support of a relatively parochial position of keeping the shuttle flying." Retirement of the shuttle fleet would have no impact on crew safety, said former shuttle astronaut John Grunsfeld. "We don't rely on the shuttle as a rescue vehicle in the event of a problem on space station -- that's exactly why we have Soyuz that are docked up there all the time," he said."

Pneumatic Belt Fed Nerf Grenade Launcher

From OhGizmo!:

Sometimes, 500 Nerf darts per minute is just not enough. Sometimes, you just need heavier artillery. Enter the Mk 19 Nerf grenade launcher, a faithful reproduction of the Mk 19 grenade launcher used by the military. Like the real thing, the Nerf Mk 19 is belt fed, it…

Nanodevice Powered by Motion

From Discovery News - Top Stories:

Every move you make, every step you take, you can generate electricity. By cramming 20,000 nanowires into three square centimeters, scientists from Georgia Tech have created the world's first device powered solely by piezoelectric materials. A…

I Hate Computers: Confessions Of A Sysadmin

From CrunchGear:

I often wonder if plumbers reach a point in their career, after cleaning clogged drain after clogged drain, that they begin to hate plumbing. They hate pipes. They hate plumber's putty. They hate all the tricks they've learned over the years, and they hate the nee…

One Step Closer to Understanding Dark Energy

From Discovery News - Top Stories:

According to the most precise cosmological models to date, dark energy is a mysterious repulsive "force" that makes up the majority of the total energy contained within the universe. And yet, we have no idea what it actually IS. The Lawrence Berke…

Non-Stop Bullet Train Concept

From Neatorama:

A virtual high-five goes out to the engineer(s) that came up with this idea. As wonderfully fast as the bullet train is, all that stopping and starting takes time, which adds up. "A mere 5 min stop per station (elderly passengers cannot be hurried) will result in a…

Former Secretary of Transportation Mineta Praises Obama’s NASA Plan For Jump-Starting Commercial Spaceflight

Norman Mineta, who served as Secretary of Transportation under President George W. Bush and as Secretary of Commerce under President Bill Clinton, and who represented Silicon Valley in Congress for more than 20 years, has published an op-ed stating, “With Russia, China and India close on our heels, the only way we can maintain our hard-won leadership in space transportation is by employing America’s unique entrepreneurial strength. Obama’s new plan for NASA does exactly that.”

Mineta’s op-ed in Silicon Valley’s San Jose Mercury-News, titled “Time to Bring Silicon Valley Spirit to Space Industry,” can be read at http://www.mercurynews.com/opinion/ci_14929987 .

Mineta, a longtime Silicon Valley leader both in Congress and as mayor of San Jose, stated, “As President Barack Obama outlined in a historic speech last week, NASA will now partner with commercial space companies to bring that Silicon Valley spirit to all of NASA and breathe new life into the space industry.” Mineta added, “When I was secretary of transportation, I had final authority for more than 40 FAA-licensed commercial rocket launches. Safety is something I take very seriously, and I would not be advocating for expanded commercial space flight if I didn’t believe it would be safe.”

A bipartisan figure, Mineta became only the fourth person to be a member of Cabinet under two Presidents from different political parties when he became Secretary of Transportation for President Bush after being Secretary of Commerce for President Clinton.

In the op-ed, Mineta stated, “While the Atlas and Delta rockets have extensive track records, it is not just the established companies that will compete in this new industry. Having spent two decades representing Silicon Valley in Congress, I say it’s long overdue to bring in entrepreneurs to this sector, with all their fresh ideas, private investment and new business approaches.”

The President’s new plan has also been endorsed by other public figures such as New Mexico Governor Bill Richardson, former Congressman Newt Gingrich, and James Cameron, who served on the NASA Advisory Council from 2003 to 2005. Newspaper editorial board endorsements of the new NASA plan include The New York Times, Boston Globe, The Economist, Nature, Philadelphia Inquirer, Pittsburgh Post-Gazette, Tampa Tribune, and the Chicago Tribune.

NASA Satellite Data Helps Everyone Breathe a Little Easier

Haze blanketed Beijing, China, on January 18, 2010, when the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite captured this image.

Feeling a little ill? Step outside for some fresh air.

But before you do, you may want to check the latest NASA data about what, exactly, is in the air we breathe.

NASA-funded scientists and medical researchers are working together to tackle the problems of public health associated with bad air quality. Bad air quality can contribute to and aggravate asthma, bronchitis, high blood pressure, and stroke -- to name a few. Air quality-related health problems result in hospital visits that cost taxpayers millions of dollars annually.

› RAND study: Air pollution costs $193 million in hospital visits

NASA is using data intended for weather and climate research to help pinpoint how environmental factors such as aerosol levels in the atmosphere impact cardiovascular health. Aerosols are solid and liquid particles suspended in the atmosphere; they can occur naturally or be emitted by human activities such as burning fossil fuels.

Scientists measure aerosols, also called particulate matter (PM), by their size. The smallest particles -- less than 2.5 microns in diameter (PM2.5) -- are the worst for human health because they can make their way into the lungs or bloodstream and exacerbate cardiovascular problems, especially in very young and elderly populations.

The ability to detect these microscopic particles (often found in smoke and haze) is helping public health researchers better document the health risks for the general population and specifically at-risk populations.

Dr. Yang Liu, a researcher at Emory University, first realized that NASA satellite data could enhance public health tracking while attending a 2007 NASA workshop where scientists from the Centers for Disease Control and Prevention (CDC) presented an overview of a newly formed tracking network.

The National Environmental Public Health Tracking Network was created in 2002 as a cooperative program to find and document links between environmental hazards, such as aerosols, and diseases. The network uses ground-based air pollution data provided by the Environmental Protection Agency (EPA), and disease information from the CDC to monitor and distribute information about environmental hazards and disease trends, as well as develop a strategy to combat these trends.

Since the workshop, Dr. Liu has been working with NASA to integrate data from two instruments, the Multi-angle Imaging SpectroRadiometer (MISR) (onboard the NASA Terra satellite) and the Moderate Resolution Imaging Spectroradiometer (MODIS) (onboard NASA's Terra and Aqua satellites) into the tracking network. Both MISR and MODIS are used to monitor tropospheric aerosols.

"NASA satellites allow faster observations with a wider view to increase our understanding of the connections between PM2.5 and illnesses," said Liu. "We can essentially provide more timely estimates of harmful aerosol concentrations."

Smog in downtown Atlanta, June 2009.

Until recently, ground-based air quality monitoring has been the only data source for estimating exposure to aerosols. However, even in the U.S., the networks are spread out and the coverage is limited by high operating costs. Using NASA satellite information, federal, state, and local agencies will be better prepared to develop and evaluate effective public health actions.

Liu explains that "Satellites have both wide spatial coverage and long mission lives, so a satellite measuring the quantity of small aerosol particles over a larger area can supplement ground-based measurements and do so over a longer period of time."

NASA's contribution to public health does not stop there, however. NASA also has been working with researchers at the University of Alabama at Birmingham (UAB) to determine how atmospheric conditions contribute to cardiovascular disease in African Americans. Past research has shown that group to have a higher risk of contracting cardiovascular disease, hypertension, and other environmentally related diseases.

UAB has been working for six years on a public health study called Reasons for Geographic and Racial Differences in Stroke (REGARDS). Funded by the National Institutes of Health, REGARDS researchers recorded blood pressure, took blood samples, and asked detailed health questions of more than 30,000 people, particularly African Americans, between January 2003 and October 2007. The study focused on the so-called 'Stroke Belt', the area in the southeastern U.S. where the incidence of stroke is 1.5 times the national average.

The REGARDS program is now working with colleagues at NASA to integrate satellite data on temperature, humidity, particulate matter in the air, and other environmental elements, to understand the connections between the atmosphere and human health.

"We can merge the REGARDS data with our data from MODIS," said Mohammed Al-Hamden, a co-lead on the project and a scientist at NASA's Marshall Space Flight Center. "We examine the statistical relationships between these diseases and the air quality and climate where these people live. With the wide spatial coverage of satellite measurements, we can better help health officials with environmental alerts and health recommendations."

Bill Crosson, the other NASA lead on the REGARDS project, says the value of integrating NASA data is "that the data comes quickly and more frequently -- daily instead of weekly -- so we can provide it to the people who really need it."

The regional study has been so successful that it has recently expanded to the entire nation, with the information that NASA provides being integrated into a CDC database of public health records, called the Wide-ranging Online Data for Epidemiological Research (WONDER). NASA and UAB researchers are expanding the subject of the study along with its geographic range. Researchers are now exploring the connection between harmful particulate matter and cognitive decline, including memory, attention span, and reading and listening comprehension.

With these two NASA-sponsored projects, public health officials are improving air quality forecasts, preparing hospitals for air quality-related health problems, and perhaps preventing health problems in the future by warning the public about the potentially harmful effects of aerosols.



Deadly Fungus Aided by Climate Change

Yuk! A deadly fungus is thought to be spreading up north because of climate change.

Potentially Deadly Fungus spreading in US, Canada

* Fungus is a unique genetic strain
* Climate change may be helping its spread

According to NPR:

“A rare and dangerous fungal infection named Cryptococcus gattii has been quietly spreading from British Columbia southward to the U.S. Pacific Northwest. And it’s changing as it goes.

Researchers have discovered that a unique strain of the bug has emerged recently in Oregon, and already has spread widely there, sickening humans and animals.

So far, over the past 11 years there have been about 220 cases reported in British Columbia. Since 2004, doctors in Washington and Oregon have reported about 50 cases. Among the total 270 cases, 40 people have died from overwhelming infections of the lungs and brain.”

You might want to put that vacation to British Columbia or Seattle on hold. According to the fungus’s FAQ pdf linked to above:

Until recently, C. gattii was only found in certain subtropical and tropical environments. In 1999 it emerged on Vancouver Island, British Columbia (BC), Canada. Between 1999 and 2006, 176 cases were reported in BC. C. gattii has been isolated from native tree species on Vancouver Island and from the surrounding soil and air, primarily from the east coast of Vancouver Island. Cases have also occurred on the lower BC mainland. The exact geographic distribution of the fungus is not known, and may be expanding.

And from Reuters – A potentially deadly strain of fungus is spreading among animals and people in the northwestern United States and the Canadian province of British Columbia, researchers reported on Thursday.

The airborne fungus, called Cryptococcus gattii, usually only infects transplant and AIDS patients and people with otherwise compromised immune systems, but the new strain is genetically different, the researchers said.

“This novel fungus is worrisome because it appears to be a threat to otherwise healthy people,” said Edmond Byrnes of Duke University in North Carolina, who led the study.

“The findings presented here document that the outbreak of C. gattii in Western North America is continuing to expand throughout this temperate region,” the researchers said in their report, published in the Public Library of Science journal PLoS Pathogens at http://dx.plos.org/10.1371/journal.ppat.1000850.

“Our findings suggest further expansion into neighboring regions is likely to occur and aim to increase disease awareness in the region.” The new strain appears to be unusually deadly, with a mortality rate of about 25 percent among the 21 U.S. cases analyzed, they said. “From 1999 through 2003, the cases were largely restricted to Vancouver Island,” the report reads.

“Between 2003 and 2006, the outbreak expanded into neighboring mainland British Columbia and then into Washington and Oregon from 2005 to 2009. Based on this historical trajectory of expansion, the outbreak [...]