A Nano-Wiretap: Scientists Use Nanowires to Spy on a Cell’s Inner Life | 80beats


Meet the cyborg cell. By attaching probes with nano-hairpin connectors to living cells, researchers have measured electrical currents from inside. They hope the probes will provide a useful way to monitor cells’ health.

A team at Harvard University conducted the study, which appears in Science. Though other probes can measure the currents in electrical impulse-producing cells–such as beating heart cells–none has let researchers measure from inside the cell itself. The probes designed in this study allowed researchers to successfully measure the electric pulses of beating cultured chicken heart cells.

One of the team’s challenges was getting the wires to kink into the hairpin shape–a difficult maneuver using traditional nanowire-making techniques. They found that if they stopped the wire’s growth as it formed, they could force it to bend.

The business end of the transistor sits on the pin’s bent tip and penetrates the cell. The two arms of the hairpin, which serve as electrical contacts, do not penetrate the cell deeply, so minimising damage. In general, it is difficult to control the shape of nanowires, which are grown gradually on a substrate. But last year, [Charles M.] Lieber’s team reported that if you stop and restart this growth process, you can introduce a 120º kink. By kinking their wire twice in quick succession, the team created the sharp hairpin bend that they required. [Nature News]

The team also camouflaged the probe’s tip (which can be smaller than the diameter of a virus particle) with a lipid coating–fooling the cell into letting it inside.

The Harvard cell probes, described today in the journal Science, are three-dimensional, V-shaped silicon nanowires with transistors at their tips. They’re flexible and coated with two layers of lipid molecules, just as a cell is. When the transistor tip, which is about the size of a virus, encounters a cell, the cell pulls it inside. Lieber’s group found that the tips can also be removed gently, with no ill effects to the cell. They’ve used the transistor probes to take electrical measurements in single cells and are now using them to measure electrical activity in the groups of adjacent cells that form tissues. [Technology Review]

The team hopes that eventually such devices will prove useful in medical monitoring, and is planning tests to see if similar devices will work with neurons.

Related content:
80beats: 2 New Nanotech Super Powers: Desalinating Sea Water and Treating Cancer
80beats: Are Carbon Nanotubes a Super Fertilizer?
80beats: Golden Nanocages Could Deliver Cancer Drugs to Tumors
80beats: Nanoparticle “Smart Bomb” Could Stop Cancer’s Spread

Images: Science / AAAS


Retracted Study: Biblical Woman Had Flu, Not Demonic Possession | Discoblog

Though it might work for The Da Vinci Code, apparently citing the Bible doesn’t fly in a scientific journal. Virology Journal apologized yesterday for publishing a paper titled “Influenza or not influenza: Analysis of a case of high fever that happened 2000 years ago in Biblical time,” which attempts to diagnose “a woman with high fever cured by our Lord Jesus Christ.”

Yesterday, journal editor Robert F. Garry apologized for the paper’s publication and announced that Virology Journal will retract the piece. The blog Retraction Watch, where we found this story, posted a response from the paper’s lead author, Ellis Hon:

“As an article for debate, there was no absolute right or wrong answer, and the article was only meant for thought provocation. Neither was it meant to be a debate on the concept of miracles. My only focus at the time of writing was ‘what had caused the fever and debilitation’ that was cured by Jesus.”

The piece, which appeared in the journal’s “Case Report” section, had a reference list including The Holy Bible (New King James Version) and the Fahrenheit temperature scale. The authors cite the cure’s speed and the woman’s quick recovery in making their diagnosis.

The Bible describes that when Jesus touched the woman, the fever retreated instantaneously. This implies that the disease was probably not a severe acute bacterial infection (such as septicemia) or subacute endocarditis that would not resolve instantaneously.

Playing it safe, the authors also note that other possibilities could include drug fever and poisoning, but demonic possession is definitely out:

One final consideration that one might have is whether the illness was inflicted by a demon or devil. The Bible always tells if an illness is caused by a demon or devil (Matthew 9:18-25, 12:22, 9:32-33; Mark 1:23-26, 5:1-15, 9:17-29; Luke 4:33-35, 8:27-35, 9:38-43, 11:14). The victims often had what sounded like a convulsion when the demon was cast out. In our index case, demonic influence is not stated, and the woman had no apparent convulsion or residual symptomatology.

Related content:
Discoblog: Super-Size Me, Jesus: Last Suppers in Paintings Have Gotten Bigger
Discoblog: What Kind of Peer-Review Would Jesus Want?
Discoblog: The Science of Virgin Birth
Discoblog: Man, Pronounced Dead, Spontaneously Comes Back to Life

Image: flickr / Dan Germain


Crowdsourced Science Success: Einstein@Home Participants Find a Pulsar | 80beats

August has been quite the success story for the use of crowdsourcing—farming out work to willing humans or bored computers—to make scientific discoveries. Last week a study showed how citizen scientists helped unravel the structure of proteins by playing a video game. Today, a study in Science documents a newly discovered pulsar—newly discovered by the computers of amateurs, that is.

This find was the first of its kind for Einstein@Home, a project that uses the downtime of a network of volunteer computers to hunt for gravity waves and radio signals. (The more famous SETI@Home uses computer free time to seek out alien signals.) The idling PC of Chris and Helen Colvin of Ames, Iowa, detected the signature of the pulsar now called J2007 for short, which was confirmed by a computer in Germany owned by Daniel Gebhardt.

Pulsars are interesting objects: These rotating neutron stars emit a beam of radiation that is only visible to us when the beam flashes past Earth, a phenomenon called the lighthouse effect. And the newfound pulsar is a particularly unusual specimen.

It turns out that J2007, located in the Milky Way in the constellation Vulpecula, is not just any radio pulsar. Most pulsars are neutron stars that spin on their axis about once a second and have strong magnetic fields. In contrast, the J2007 neutron star spins 41 times a second, and has a weak magnetic field. This type of fast-spinning pulsar is usually associated with a binary-star system — but J2007 seems to be sitting out in space by itself [MSNBC].

Initially, program director Bruce Allen says, Einstein@Home sought gravity waves: ripples in space caused by massive objects, theorized in the early 20th century by Einstein. But after finding no significant signals, Allen turned part of the program over to seeking radio signals from pulsars, hoping to give the volunteers something to get excited about.

When the pulsar was spotted, Allen says, it took a while to get the excitement across to the Colvins. He tells MSNBC that they thought his emails were spam, so he had to send a certified letter to get their attention. Once he did, though, the Iowa couple was walking on air.

Helen Colvin says that she and Chris loaded Einstein@Home after a period of running SETI@Home because it seemed more likely to get a result some day. Still, she is surprised to own the computer that actually found something. “It was a bit like winning the lottery,” she says. “The odds aren’t in your favour” [Nature].

Follow DISCOVER on Twitter and Facebook.

Related Content:
DISCOVER: Gravity Wave Sky
DISCOVER: The 14 Best Ways To Use Your Computer’s Spare Time
80beats: NASA Invites You To “Be a Martian” And Explore the Red Planet’s Terrain
80beats: Computers Exploit Human Brainpower to Decipher Faded Texts
80beats: Crowdsourced Astronomy Project Discovers “Green Pea” Galaxies
80beats: Crowdsourced Science: 5 Ways You Can Help the Hive-Mind

Image: Wikimedia Commons (Crab Nebula pulsar)


The Next 10 Years of Astronomy | Cosmic Variance

The US astronomical community is anxiously awaiting tomorrow’s press conference on the release of the “Astro2010 Decadal Survey”. Now, the astronomical community puts out press releases all the time, but almost all are about communicating scientific results or images to the general public. Tomorrow’s is different. What we learn will shape the next ten years of investment in astronomical infrastructure, and set the course of much of scientific innovation in the ten years after that.

For close to half a century, the astronomical community has gone through an extremely productive exercise in navel gazing, producing exhaustive reports once a decade to lay out our priorities as a field. These reports are the result of a year-long process of consultation, analysis, and lobbying. Through the National Academy of Sciences, the community organizes a series of committees to evaluate every aspect of US astronomical research. They try to identify scientific areas that are ripe for breakthroughs, and then to match these areas with specific technological investments in astronomical tools (primarily telescopes, but also, increasingly, computational and theoretical resources). The committees then do their best to rank these investments into a prioritized list.

The process of making a prioritized list is relatively horrific, since it involves choices between extremely different, non-overlapping projects. For example, if you’ve spent your life understanding optical and near-infrared spectra of galaxies, you’ll be rooting for a gigantic ground based telescope — most competing projects will be of little utility for your research. However, as a field, we are forced to face up to the fact that the best way to move forward on an astrophysical topic is not necessarily the one we, as individuals, have chosen to pursue. We also have to recognize that what interests us personally may not be the most important question in the field. For example, I’m a nearby galaxy kind of girl, but I’d be a fool not to recognize that extrasolar planets are far more “ripe” for dramatic results. Finally, accepting these facts is not equally easy for all individuals, and many people are willing to go to the mattresses for their preferred outcome. One hopes for good behavior, but people will be people.

The reason the process is so high-stakes is that the ranking that comes out of the Decadal Survey is taken very, very seriously. The upper administration of NASA and the National Science Foundation take these recommendations as commandments (i.e. don’t bother seeking funding for the satellite telescope that was ranked 15th). Even more seriously, congressional staffers read these reports, making Congress extremely unlikely to finance anything but a top ranked project. (The few times that earmarks have been laid out for specific projects, it’s been Seriously Frowned Upon by the community, and by any administrator who has based their planning on the ranked list.) Frankly, this is great, even if it’s hard. We wouldn’t want anyone else to make these decisions but us, as hard as it is to sometimes see your favorite project nudged out by something you are far less interested in.

So, the big things to look for in the news tomorrow are the first-ranked ground-based project (i.e. NSF funded) and the first-ranked space-based project (NASA funded). In the current funding climate, and with the growing costs of building competitive facilities, the community is unlikely to get more than one major initiative rolling — if that. This decadal report is unlikely to make the mistakes of the last one, which can best be described as the equivalent of asking a 3-year-old whether they’d prefer a bathtub full of ice cream or a pony. This round, much more attention was paid to cost, so that the committee could make realistic decisions.

Frankly, it’s a bit of a scary time. The situation reminds me a bit too much of the Superconducting Super Collider. The funding levels needed to make big advances are at a point where we really can’t afford more than one major initiative a decade. That puts us in the unfortunate position of having a single point of failure. Say we back one big project. Suppose that project goes over budget (as cutting-edge facilities frequently do) to the point where it gets cancelled, 10-15 years from now. Then we’re left with nothing, and young astronomers start looking for jobs in Europe.


Omniscient Being Could Solve Any Rubik’s Cube in 20 Moves | 80beats

Scientists have cranked through the numbers and determined that no matter how you mangle a Rubik’s Cube, if you’re doing it right you can theoretically solve the puzzle in 20 moves or fewer. By doing it right, we mean doing it like a supercomputer: Researchers tapped Google’s spare computing power to burn through the Cube’s 43,252,003,274,489,856,000 starting positions.
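That 43-quintillion figure isn’t arbitrary: it falls out of a standard counting argument over the cube’s corner and edge pieces. A quick sketch in Python (our own illustration, not the team’s code):

```python
# Standard count of reachable Rubik's Cube configurations (illustrative).
from math import factorial

corner_perms = factorial(8)   # 8 corner pieces can be arranged 8! ways
corner_twists = 3 ** 7        # each corner has 3 twists; the last is fixed by the rest
edge_perms = factorial(12)    # 12 edge pieces
edge_flips = 2 ** 11          # each edge has 2 flips; the last is fixed by the rest

# Corner and edge permutations must share the same parity, removing a factor of 2.
total = corner_perms * corner_twists * edge_perms * edge_flips // 2
print(f"{total:,}")  # 43,252,003,274,489,856,000
```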

Even given Google’s processing power, the team–which included a mathematician, a Google engineer, a math teacher, and a programmer–could not solve the problem using brute force alone. They had to take all the starting positions and divide them into more manageable chunks: 2.2 billion smaller groups called “cosets,” which Google’s computers could work through simultaneously.

“The primary breakthrough was figuring out a way to solve so many positions, all at once, at such a fast rate,” says Tomas Rokicki, a programmer from Palo Alto, California, who has spent 15 years searching for the minimum number of moves guaranteed to solve any configuration of the Rubik’s cube. [New Scientist]

Though smaller, each of these 2.2 billion groups still contained about 20 billion starting positions.

The subproblems were small enough to fit in the memory of a modern PC. But it would take an Intel four-core, 2.8-GHz Nehalem chip-based desktop computer 1.1 billion seconds, or about 35 years, to perform the calculation. So the team turned to the impressive computing power that Google has to solve the problem. (Google won’t disclose exactly what kind of computing resources it offered to the group.) [Wired]
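The quoted figures hang together arithmetically; here is a quick back-of-envelope check (our own, not the team’s code):

```python
# Sanity-check the numbers quoted above (our arithmetic, not the team's code).
total_positions = 43_252_003_274_489_856_000
cosets = 2_200_000_000                          # "2.2 billion smaller groups"

print(f"{total_positions / cosets:.3g}")        # ~1.97e10: "about 20 billion" per coset

seconds = 1.1e9                                 # quoted single-desktop runtime
print(f"{seconds / (365.25 * 24 * 3600):.1f}")  # ~34.9: "about 35 years"
```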

Mathematicians have slowly whittled down the minimum number of moves guaranteed to solve the Cube from any starting position–the so-called “God’s number”–since the Cube’s creation in 1974 by the Hungarian Erno Rubik. Many believed for some time that 20 was the answer, but no one had run through the starting positions to prove it.

[Team member Morley] Davidson said this was “pure religion” as no-one had managed to crunch their way through all configurations. “We were secretly hoping in our tests that there would be one that required 21,” he said. [BBC]

Still, the researchers suggest, don’t expect the confidence boost of a known 20-move maximum to help you solve the Cube any faster.

There are many different algorithms, varying in complexity and number of moves required, but those that can be memorized by a mortal typically require more than forty moves. [Team website]

Related content:
80beats: Has the Devilish Math Problem “P vs NP” Finally Been Solved?
80beats: Brilliant & Reclusive Russian Mathematician Doesn’t Need Your Prize Money
80beats: Can a Google Algorithm Predict Nobel Prize Winners?
Discoblog: Book-Balancing, Rubik’s Cube-Solving, Pi-Reciting Geek Girl Goes Viral
Discoblog: A Rubik’s Cube Could Tell Us Which Arm Is an Octopus’ Favorite

Image: flickr / kirtaph


“Living Library” of Fruit Plants May Fall to Russian Bulldozers | 80beats

The Pavlovsk Experimental Station, near St. Petersburg, Russia, was founded in the 1920s. About 90 percent of the plants grown there occur nowhere else, making the collection an island of agricultural biodiversity. And the station soon may be knocked over to make way for a housing development.

The station’s operators at the Vavilov Institute of Plant Industry lost a court ruling this week, so the land upon which all those plants sit will be given to the Russian Housing Development Foundation. The plant scientists bought themselves an extra month with an instant appeal, but the situation looks grim.

“We expected to lose,” agrees Cary Fowler, executive director of the Global Crop Diversity Trust in Rome, who has spent months campaigning against the station’s destruction. “Our real hope lies with President Medvedev and Prime Minister Putin, who could both override the decision of the courts. At least the higher appeal will give us time to mobilize more people and hopefully get through the gates of the Kremlin,” he adds [Nature].

Pavlovsk station is home to more than 5,000 different varieties of plants, including kinds of berry, cherry, and pear that exist no place else. But with Fowler’s hopes for the station’s salvation dimming, the pertinent question seems to be: Why not just move?

The problem, he says, is that we’re talking about whole plants and not just seeds. Fowler says the plants in the collection are difficult to grow and can’t be stored in a seed bank, like the famous one now operating in the Svalbard Islands of Norway that stores the seeds of important crops in case of Armageddon. Workers would have to uproot the plants, and Fowler says there’s no other suitable location in Russia, so the station’s keepers would have to try to move their collection abroad—if that’s even possible.

Most of the unique fruit plant strains do not reproduce asexually, and are pollinated by other strains, so their seeds do not necessarily yield adult plants that mirror the characteristics of the parent plant…. ”It’s a valuable and unique collection of strains, and its loss would be a serious blow to agriculture,” agreed Peter Raven, director of the Missouri Botanical Garden [The Scientist].

Expect the Pavlovsk scientists to keep up the fight as long as they can: They have a history of taking their plants personally.

The Pavlovsk facility earned a special place in Russian history during the World War II siege of the city, then called Leningrad, when 12 scientists chose to starve to death rather than eat the precious seeds [Los Angeles Times].

Related Content:
DISCOVER: The Numbers on Seeds, From the Largest to the Oldest to the Safest
DISCOVER: The “Doomsday Vault” Stores Seeds for a Global Agricultural Reboot
DISCOVER: The Banks That Prevent–Rather Than Cause–Global Crises
DISCOVER: Beautiful Images of Strange Fruits (photo gallery)
80beats: “Methuselah Seed” Sprouts After 2,000 Years

Image: Wikimedia Commons (N.I. Vavilov, institute founder)


Pursue the Perseids tonight! | Bad Astronomy

The next couple of nights bring us one of the best meteor showers of the year: the Perseids. It peaks around mid-August — this year the peak is tonight, Thursday August 12 — when the Earth plows through the debris from the comet Swift-Tuttle. This year should be pretty good, as the Moon sets early, and won’t interfere with seeing fainter meteors.

If you don’t think the Perseids will be cool, then watch this:

Love it! Want more info?

Back in 2007 I wrote up a brief guide on how to observe the Perseids, and it’s still pretty much apropos of the shower this year (just replace "Sunday" with whatever day you’re observing). The most important things: the later you go out, the better since the shower really peaks after midnight; you need a clear view of as much of the sky as possible; and you don’t need any equipment, but I recommend a lounge chair to lie back on.

Other sites are covering this as well, of course:

- Wanna chat about the meteors? NASA is hosting a live chat Thursday night/Friday morning (Aug 12/13)!
- Universe Today
- Tom’s Astronomy Blog
- Astropixie

So get out there and enjoy the shower!


Show Off Your Research & Win Our Book! | The Intersection

A neat contest from the good folks at New Voices for Research:

Stop, wherever you are. Quick, grab your phone or closest camera and take a picture of what’s in front of you. Send it to hbenson at researchamerica.org to enter the Mystery Lab contest.

What is the Mystery Lab contest? A chance to show off your research. Monday through Thursday next week, the four most creative images submitted will be posted in the order they were received. New Voices readers will be asked to guess what field of research is being represented in each photo (biology, chemistry, marine science, physics, mathematics, etc.).

How to participate: Send an image of your work with the general field you work in as the subject to hbenson at researchamerica.org or via Twitter @NV4Research. Then encourage your friends to guess each day next week.

How to win: The winning entry will be determined using the following formula:
Number of guesses x total daily visitors

The prize: The submitter of the winning entry will receive a copy of Unscientific America: How scientific illiteracy threatens our future by Chris Mooney and Sheril Kirshenbaum. If the winner already has a copy of this fabulous book, we’ll work something out.

Send your entries as soon as possible to be considered! Help us see what real science looks like.


The Next Decade of US Ground Based Astronomy | Cosmic Variance

On to the ground-based (i.e. NSF funded) recommendations (for large, new projects — i.e., not including on-going investments in ALMA; there are a number of interesting medium scale projects recommended, but I probably won’t have time to get to them).

First priority was the Large Synoptic Survey Telescope (LSST) — a multi-color, multi-cadence survey of the sky with an 8m-class telescope. As my colleague and LSST Project Scientist Zeljko Ivezic puts it, “LSST will make a movie of the sky,” which, you have to admit, is pretty cool. When you think about discovery space in astronomy, the largest gains come when you move into new regimes. We’ve largely run out of new wavelength regimes, but the time-variable regime has not yet been explored in a large-scale systematic way (although PanSTARRS and the Las Cumbres Observatory will certainly be making headway). In addition, the co-adds of all the epochs will produce an 8m-telescope version of the 2.5m Sloan Digital Sky Survey (SDSS) imaging, which is a good thing. All data is non-proprietary, and can be used by anyone.

Second priority is a “Mid-Scale Innovations Program” — basically, a ground-based equivalent of the NASA Explorer program. The decadal survey committee reviewed a wealth of scientifically compelling medium-size projects. These don’t rise to the level of building giant new facilities, and are typically seeking funding for an instrument and a dedicated multi-year survey on existing facilities. The report recommends that there be a review and funding mechanism for such projects, which have the capability of responding nimbly to scientific and technological changes.

Third priority is contributing to the development of a 30m-class ground-based optical/near-IR telescope (a “Giant Segmented Mirror Telescope”; GSMT). Such a telescope would be essential for carrying out spectroscopy of the sources found at the limits of 8m-class telescope imaging: when you take a source detected in an image and disperse its light into a spectrum, each wavelength bin receives only a small fraction of the photons, so you need a bigger aperture to reach the same signal-to-noise that imaging gets with all the wavelengths collected together. There are currently two large US programs that are well underway (TMT and GMT), using private funding. For these programs to have enough money to be built and operated, an investment of Federal money is required. This money would also guarantee some degree of access for the larger US community, but probably significantly less than 50%. The report recommends that involvement should be at least a 25% share. However, they argue that there is only money enough to invest in one, and the community had better pick one as soon as possible, rather than letting both go forward.
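To put rough numbers on that aperture argument (our own back-of-envelope scaling, not figures from the report):

```python
# Rough photon-budget scaling for imaging vs. spectroscopy (illustrative only).
bins = 1000                      # wavelength bins in a moderate-resolution spectrum
per_bin_flux = 1.0 / bins        # each bin gets ~1/1000 of the broadband photons
area_gain = (30.0 / 8.0) ** 2    # collecting-area ratio of a 30m vs. an 8m mirror

print(f"flux per spectral bin: {per_bin_flux:.1%} of the broadband photons")
print(f"a 30m collects {area_gain:.0f}x more photons than an 8m")
# The ~14x area gain recovers only part of the ~1000x dispersion penalty,
# which is why spectroscopy of sources at the limit of 8m imaging needs
# a much larger aperture (plus long exposures).
```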

The fourth priority is participation in the “Atmospheric Cerenkov Telescope Array” (ACTA), to detect and characterize the highest-energy gamma rays. Recent years have seen the detection of TeV gamma rays, which places strong constraints on particle acceleration at the highest energy scales; a new array would greatly expand the chances of fully understanding the origin of these high-energy events. Rather than funding a separate US initiative, however, the report recommends joining an existing European project (CTA), in spite of the fact that the US would be a minor partner.

Reactions to the Ground-Based Recommendations:

Perhaps the biggest surprise was the drop in the GSMT from (1) its prioritization in the previous report, and (2) its prioritization in the actual optical/IR subcommittee (see Table B.1). The justification was that LSST was a much lower risk in terms of cost and technology, and, as in the space recommendations, pragmatism ruled the day. The committee was quite strong in their support for GSMT as a project, and pointed out that the combination with LSST is highly synergistic — LSST provides the targets, and GSMT tells you what they are. However, the pie was simply not big enough to give everyone a slice. In addition, if you can only dish out one slice of pie, you want it to feed the largest number of people — LSST made a strong case that a much larger fraction of the US community could make use of the data.

Personally, I’m very sympathetic to this view. There are scientific advances that come because you have new facilities pushing into new territory, and GSMT has this in spades. However, there are also scientific advances that come about because you have the largest number of very clever brains thinking about how to exploit a given data set. Taking SDSS as a model, a ridiculously large fraction of the ridiculously large number of SDSS-related papers had absolutely nothing to do with anything in the “black book” of science justifications used to obtain funding for SDSS. You take good data, you let smart people work with it, and you’ll get science you never anticipated. I’m optimistic that LSST could work the same way, with the caveat that the scientific impact may well be blunted without a wide scale investment in spectroscopy (which SDSS had, and which LSST lacks). I very much hope that a 30m gets built, but not to the point where I’d be comfortable leveraging all public large ground-based investment over the next 10 years for a 25% share of a telescope. (Full disclosure: I am not at an institution that would have private 30m access, and am at one that has made early and ongoing investments in LSST. So, my perspective is undoubtedly shaped somewhat by viewing GSMT projects as a potential “outside” user. I do my best to be fair, but I’ve pretty much shaped my scientific research around the premise that I won’t have exclusive access to large aperture telescopes.)

I am also really pleased to see the “Mid-Scale Innovations” recommendation. I think this is a smart way to make sure we can take advantage of rapidly changing fields. When something like dark energy or extrasolar planets shows up on the scene, it’s great to have a mechanism in place to take advantage of new opportunities. In addition, it’s a smart way to skim the low-hanging fruit, so that larger missions have a better understanding of what the scientific requirements really are — for example, you’d design a very different dark energy mission if you knew that w is nearly equal to -1 than if you had no idea of its value.

The other noticeable lack here is a call for US participation in the Square Kilometer Array. (The panel did recommend some radio projects in the medium-scale category.) However, if you look at Figure 4-8 (which I found fascinating and surprising), fewer than 10% of the members of the American Astronomical Society (AAS) categorize themselves as “Observational Radio” astronomers. I’d presume this would grow in response to investment in ALMA, but the community is clearly not enormous.

So, my take on the ground-based recommendations is that the committee did pretty well at making hard choices. And the choices were indeed hard, and are going to be rightfully hard to swallow in many cases.


The New Happy Meal: A Burger With a Side of Cholesterol Meds | Discoblog

Just in time for the release of Denny’s new Fried Cheese Melt, a study in The American Journal of Cardiology asks whether cholesterol-reducing drugs called statins might pair well with fast food. A side of McStatins with that Big Mac, the study suggests, could decrease the risk of cardiovascular disease.

Using data from previous studies, researchers in the UK compared the decreased risk of cardiovascular disease from taking statins to the increased risk caused by sucking down a cheeseburger and milkshake. From their abstract:

“The risk reduction associated with the daily consumption of most statins, with the exception of pravastatin, is more powerful than the risk increase caused by the daily extra fat intake associated with a 7-oz hamburger (Quarter Pounder®) with cheese and a small milkshake.”

The study’s authors note that the pills certainly won’t counteract all of the ill effects of unhealthy eating choices–diners would still have to contend with the calories and the sodium. Instead they compare taking the medications with other protections from risky behaviors, like wearing a helmet while motorcycling.

Reuters reports that some physicians are worried that the combination will give fast food lovers the wrong message, and may make diners think that they can avoid the negative consequences of their date with the deep fryer. But study coauthor Darrel P. Francis’ statement to CBS News makes you wonder if the currently prescription-only drugs such as Lipitor and Crestor might soon be available next to the ketchup pump:

“It’s ironic that people are free to take as many unhealthy condiments in fast food outlets as they like, but statins, which are beneficial to heart health, have to be prescribed.”

Related content:
Discoblog: Fast Food News: It Boosts Impatience, and What Trumps KFC’s Double Down?
Discoblog: Fast Food Joints Lie About Calories (Denny’s, We’re Looking at You)
Discoblog: Grand Engineering Challenge of Our Era: A Non-Lethal Hot Dog
Discoblog: How Deep Fryer Grease Can Become an Energy-Saving Coating for Your Roof

Image: flickr / Marshall Astor — Food Pornographer


New Antibiotic-Resistant Superbug Found: Should Everybody Panic? | 80beats

The antibiotic-resistant superbug that emerged in South Asia appears to have claimed its first life. According to doctors who treated a man in Belgium, he went to a hospital in Pakistan after a car accident, and there he picked up the bacterial infection. While the man died back in June, his doctors announced today that he carried the superbug.

This new health scare intensified this week after researchers published a study in The Lancet Infectious Diseases characterizing “a new antibiotic resistance mechanism” in the U.K., India, and Pakistan. How bad is this “mechanism?”

It’s bad:

The problem isn’t a particular kind of bacteria. It’s a gene that encodes an enzyme called New Delhi metallo-beta-lactamase-1 (NDM-1). Bacteria that carry it aren’t bothered by traditional antibiotics, or even by the drugs known as carbapenems that are deployed against antibiotic-resistant microbes.

The NDM-1 gene is a special worry because it is found in plasmids — DNA structures that can easily be copied and then transferred promiscuously among different types of bacteria. These include Escherichia coli, the commonest cause of urinary tract infections, and Klebsiella pneumoniae, which causes lung and wound infections and is generated mainly in hospitals [AFP].

It’s no worse than what we had before:

Yes, NDM-1 is scary. But it’s not unprecedented.

There are numerous strains of antibiotic-resistant germs, and although they have killed many patients in hospitals and nursing homes, none have yet lived up to the “superbug” and “flesh-eating bacteria” hyperbole that greets the discovery of each new one. “They’re all bad,” said Dr. Martin J. Blaser, chairman of medicine at New York University Langone Medical Center. “Is NDM-1 more worrisome than MRSA? It’s too early to judge” [The New York Times].

The Los Angeles Times contends that the superbug is not a serious problem for the West—at least not yet. The people in the most danger are patients in hospitals, The New York Times says—especially ones with weaker immune systems.

Could it happen anywhere?

India and Pakistan have developed some excellent hospitals and surgeons that provide medical care and surgical procedures, especially elective procedures, more cheaply than they are available in the West. But the overuse of antibiotics among the larger population leads to the development of resistance, and those organisms can make their way into even the best hospitals [Los Angeles Times].

Indians, however—cognizant of the bad press that comes with having a medical scare named after one’s capital city—have fought back against the idea that their practices brought on the emergence of a superbug.

“Several superbugs are surviving in nature and they have been reported from countries like Greece, Israel, the U.S., Britain, Brazil,” and elsewhere, V. M. Katoch, director general of the Indian Council of Medical Research in New Delhi told India Real Time. “It’s unfortunate that this new bug, which is an environmental thing, has been attached to a particular country” [Wall Street Journal].

Related Content:
DISCOVER: Promising Antibiotic Could Spawn the Next Superbug
80beats: Tiny Nanotech “Diving Boards” Test the Killing Power of Antibiotics
80beats: Non-Lethal Antibiotics Could Fight “Superbugs”
80beats: Red Meat Acts as Trojan Horse for Toxic Attacks By E. Coli

Image: USDA


Life in the Shadow of Coal | Visual Science

This photograph was made in Cheshire, a small town in southeast Ohio along the Ohio River. To the left you can see the Gavin Power Plant, a coal-fired plant that provides electricity to Ohio, Appalachia, and the greater Northeast. The coal that is burned at Gavin Power comes in part from mountaintop removal mines in Appalachia. In 2002, shortly after federal health experts confirmed that the blue sulphuric clouds from the plant endangered the residents of the town, the American Electric Power company bought the entire village of Cheshire for $20 million, and the residents agreed to relocate. Photographer Daniel Shea has been working on a long-term project to explore the social and environmental impacts of coal in Southeast Ohio and Appalachia.

About taking this photo, Shea writes: “I had driven by this house repeatedly over the course of a week. Sometimes I would stop and photograph it, but it never felt right. This is a residential street in a very small town, and people take notice of strangers. I received really dirty looks the first time I photographed it; it was noon and the sun and heat were unbearable. Eventually I made it back to the house at 5 AM. At that time you couldn’t see the stacks, so I parked outside, waiting until they were barely visible an hour later. I knew while taking it that it would be ‘the’ picture from Plume. This is a fairly atypical landscape — I assume that the house belongs to a manager at the power plant as there isn’t a lot of money in the region; it’s mostly rural poverty.”

Happy Friday the 13th

Oh, yay! It’s Friday the 13th, that traditional “bad luck” day in Western culture. I’m sure we all know people who genuinely dread Friday the 13th, having been raised to believe something awful was going to happen to them. Perhaps the day makes YOU uncomfortable, for whatever reason.

To celebrate, I thought it would be interesting to take a quick look at a few of the astronomy-related superstitions floating around out there. I imply no negative connotation with the word “superstition”, and I don’t intend to point fingers and laugh at anybody; after all, one person’s “superstition” is another person’s “religion”. For blog purposes, a “superstition” is a belief or collection of beliefs that is without support of direct, scientific evidence. Notice — that doesn’t mean “wrong”; it means it’s a philosophy rather than a science.

Are you ready?  Jumping right on it, with:

FRIDAY THE 13TH: The belief that a specific day (calendar position) can be unlucky.  The roots of this one are right on top of Christianity.  According to Christian canon, Christ was crucified on Friday, the satanic “mass” is supposedly enacted with 13 participants (one lead, twelve actors), Noah’s flood began on Friday, there were 13 present at the Last Supper, and Adam and Eve fell from Grace on a Friday.  It’s the combination of Friday and 13 that supposedly makes the day unlucky.  Fear of the number 13 (triskaidekaphobia) is a widespread phobia, by the way.

Interesting Nugget: The Apollo 13 mission was launched on April 11, 1970 (4 + 11 + 70 = 85, 8+5=13).  It was launched from pad #39 (3 x 13 = 39).  The explosion took place at 1:13 pm (1313 hours in military time) on April 13th.

Gorgeous Full Moon by Luc Viatour, October 7, 2006, Hamois, Belgium http://www.lucnix.be

THE FULL MOON: There is supposed to be an increase in violence and irrational behavior during the full moon.  Also, to a lesser degree, during the “new” moon (the ‘dark’ of the moon).  Modern studies are contradictory, at best.  Usually they do not show a link between behavior and phases of the moon, except for the link associated with increase/decrease in the landscape brightness (i.e., hunting).

Interesting Nugget: Each full moon is given a name, usually associated with some activity occurring at or around the time of the phase.  Some of the names are:  Pink Moon, Harvest Moon, Cold Moon, Worm Moon, and Blood Moon.

WISH ON A FALLING STAR: The belief that if you can make your wish before the “falling star” fades, your wish will come true. Of course, there are no stars really falling; I think everyone by now accepts that falling (or “shooting”) stars are meteors. A whimsical, pleasant superstition. (Remember, by the way: the falling rock is a “meteoroid”, the streak of light it makes is the “meteor”, and a “meteorite” is what’s left over after it hits.)

Interesting Nugget: Meteors are theorized to make noise.  Aside from the occasional sonic boom heard in association with meteors, a “hissing”, “crackling” noise has often been reported.  Sometimes the sounds are intercepted by unusual transducers, like your hair.

COMETS ARE HERALDS OF MISFORTUNE: Misunderstood for almost all of their known history, comets were believed to be the heralds of misfortune.  Not just personal misfortune, either.  Perceived as balls of fire thrown at us from the Heavens, comets were believed to foreshadow epic misfortunes such as drought, flood, and widespread disease (as in the plagues).

Interesting Nugget: Comets were viewed as important heralds of great misfortune, so the job of predicting their advent was given to the Chinese astrologers.  China took their astrology seriously; incorrectly predicting a comet or eclipse (or failing to predict one at all) was likely to cost the astrologer his life.

Superstition Mountains, AZ, image by Doug Dolde

Now it’s your turn!  What’s your favorite astronomy-related superstition?  Why?  I find it interesting to read about old superstitions and rituals.  Most of the time the origins are in man trying to make sense of the universe around him.  We were looking for the “cause” and the “effect”.  We were trying to answer the eternal question… why?

UC Berkeley Halts Its Freshman DNA Testing Project | 80beats

Is it medicine, or is it not?

In May, the University of California, Berkeley unveiled its “Bring Your Genes To Cal” program. The idea was that Berkeley’s 5,500 or so incoming freshmen would have the option to have their DNA tested for three particular characteristics: their metabolism of folate, tolerance of lactose, and metabolism of alcohol. Though the program was limited, it raised privacy hackles. And now the State of California has ruled: This is a medical test, and Cal can’t do it unless it’s in a clinical setting.

Mark Schlissel, UC Berkeley’s dean of biological sciences and an architect of the DNA program, said he disagreed with the state Department of Public Health’s ruling that the genetic testing required advance approval from physicians and should be done only by specially-licensed clinical labs, not by university technicians. The campus could not find labs willing to do the work and probably could not afford it anyway, Schlissel said. He also contended that the project deserved an exemption from those rules because it was an educational exercise [Los Angeles Times].

If you’ve been following this summer’s stories about personal DNA tests, this probably sounds familiar. The simmering question of whether or not they are “medical,” and therefore how they ought to be regulated, began to reach a boil after Walgreens announced its intention to sell DNA tests in its brick-and-mortar stores. In June, the FDA stepped in and said the tests are medical, and therefore it has the right to make the rules for them.

The Cal program’s leaders say they got about 700 samples returned as part of the voluntary program before the state clamped down on it. Because of the state’s ruling, the university can’t return individual results to students, but the researchers can analyze the samples to present the entire group as a data set—and then they must incinerate them.

Revealing himself to be a true teacher at heart, Berkeley geneticist Jasper Rine says the short-lived program was a success anyway because it provided a learning experience.

“Most of the benefit of this program has already been had,” Rine told reporters – “Every single student who opened the envelope had to make a judgment for themselves” regarding whether or not to get tested. The two biologists also said that the Berkeley program would undoubtedly raise wider issues of how universities around the country use genetic information, for both educational and research purposes [Nature].

A teachable moment may be all the Berkeley scientists get for a while. The experiment has shown that even a testing program with a limited scope can’t escape vexing questions about handing over the keys to individuals’ DNA.

Critics had raised questions about how the genetic information, even seemingly innocuous, could be misinterpreted or misused. For example, students who learn they metabolize alcohol well may mistakenly think they can overindulge without consequence [San Francisco Chronicle].

And given the authority relationship between a university and its students, allowing the school access to its students’ DNA invites the possibility of abuse, no matter how benign the intention or how detailed the legal papers.

Related Content:
Not Exactly Rocket Science: How I Got My Genes Tested
Discoblog: Welcome, UC Berkeley Freshmen! Now Hand Over Your DNA Samples
80beats: Government Sting Operation Finds Problems With Personal Genetics Tests
80beats: FDA: We’re Going To Regulate Those Personal Genetics Tests, After All
80beats: 5 Reasons Walgreen’s Selling Personal DNA Tests Might Be a Bad Idea

Image: Wikimedia Commons


The Next Decade of US Space Astronomy | Cosmic Variance

So, the Decadal Survey (“Astro2010”) results are out. I missed the webcast (which I heard was of pretty sketchy quality), but read Roger Blandford’s slides, and have skimmed or read a reasonable fraction of the preliminary report. Here’s my summary and first reactions, broken down by regime. Steinn has also been blogging a running commentary of his reactions here.

Space Missions:

The top recommendation for a space mission is “WFIRST” — basically a 1.5m wide-field IR imager in space, with low-resolution spectroscopy capabilities. This concept is the latest realization of what was previously known as “JDEM” (the “Joint Dark Energy Mission”, itself an expanded and reconstructed version of “SNAP”, the Supernova (SN) Acceleration Probe). The goal would be to use some combination of high-redshift SNe, baryon acoustic oscillations (BAO), and weak lensing to constrain the parameters of dark energy. The committee recommended that the mission allow for a general observer (GO) program (thank goodness) and have a component dedicated to exoplanet discovery through microlensing (really? this isn’t an area I follow, and I hadn’t heard much about it. UPDATE: from the comments, Andy Gould has a white paper pointing out that the weak lensing requirements are essentially identical to what’s needed for a microlensing-based planet search. Basically, you get it for free if you decide to pursue weak lensing. However, they did not take Andy’s recommendation that the dark energy mission not pursue 3 independent techniques in one satellite.)

The next recommendation is for a mixed portfolio of smaller satellite missions. These “Explorer”-class missions have historically been hugely successful — WMAP, GALEX, etc — but have been squeezed out recently by funding limitations and pressure from flagship mission development (JWST) and operations.

The third recommendation is for continued development of LISA, an orbiting interferometric gravitational wave detector. LISA is a really nifty project — one that I was not innately that interested in, but that became more and more compelling the more I learned about it. Co-blogger Daniel has thought a lot about LISA, and maybe we can get him to talk some more about it.

Reactions to the Space Recommendations:

Overall: These were hard choices, and reading the report, it’s clear that a huge amount of weight was given to cost, feasibility, and competitiveness. IXO, the next-generation flagship X-ray mission, dropped compared to its previous ranking, largely because the committee found it to be technologically and financially risky (“The Survey Committee also found IXO technologies to be too immature at present for accurate cost and risk assessment”). They instead flagged IXO as ripe for money for “technological development”, so that it’s ready to go for the next report. The Space Interferometry Mission (SIM, or SIMlite) dropped completely out, in large part due to cost vs. scientific return.

The real bummer about these recommendations is that entire subfields of US astronomy are pretty much shut out of the only environment where they can operate. X-ray, UV, and high-resolution astronomy (outside of IR and radio) are fundamentally space-based enterprises, and when Chandra and HST shut down, there will be nothing left, and nothing in the pipeline for a decade or more. The good times continue to roll if you’re an infrared astronomer (consider the series of Spitzer, WISE, JWST, and now WFIRST), but entire communities are going to be gutted. I do think that IXO will eventually get a start, because it’s a strong mission, but are there going to be any X-ray astronomers left when it starts getting data?

WFIRST: It will be interesting to see how this plays out, because two of the three dark energy techniques are going to make a fair bit of progress over the next decade, even without this mission — two of the three new gigundo Hubble Multicycle Treasury programs will have a significant high-redshift SN component, and ground-based BAO surveys like BigBOSS are viable candidates for completion within a 10yr timescale. I’m sure discovery space will be left, but it will be interesting to see where we are in 10 years. There is also a highly ranked ESA mission with very similar capabilities. The only way it makes sense to go forward with WFIRST is if the projects somehow merge.

Explorer Missions: There will definitely be broad community support for this recommendation. For certain wavelength regimes, this will be the only game in town. UV astronomers can probably make some real progress here, because there are huge gains that can be made by increases in detector efficiency, rather than by larger apertures, which are expensive to build and launch. High-resolution questions can’t be addressed through the Explorer program, since you really need large baselines that are inaccessible at this cost limit (large baseline = big mirrors or interferometry = expensive). Not sure what can be done in the X-ray, but hard to go from Chandra or XMM down to what’s available through this approach.

LISA: I think LISA is pretty cool. I would have thought that the technological challenges for LISA are comparable to those that IXO faces, but I’ll sensibly assume that the committee spent infinitely more time evaluating this issue than I have. Of the two, LISA probably has more pure discovery space potential. We at least know something about x-rays from space, but we know close to nothing about gravitational radiation from space.

Ok, I gotta try to do some actual science today before I tackle the rest of the recommendations…more later.


Two weeks, two geeks: Mythbusters edition | Bad Astronomy

I’ve told you about Geek A Week before: artist Len Peralta is interviewing one Alpha Nerd every week, and drawing a cool trading card for each. I was honored to be included, and I also mentioned that my friend Brea Grant was one as well.

Len recently interviewed two other friends of mine: Adam Savage, and Grant Imahara. Yeah, two of the Mythbusters! You can grab Adam’s interview here, and Grant’s here.

Grant’s interview was interesting to me because I didn’t know his history. Well, I do now after hearing him talk about it, and he’s done a lot of very cool stuff. I really enjoyed his interview; he has an attitude about a lot of things he’s done (and is still doing) that I find simpatico. It was fun to listen to.

Adam’s interview may surprise you. Most people know him through the TV show, of course, but you only get a glimpse of who he is there. If you’ve seen him talk at Dragon*Con or TAM, you know he is a man of deep intelligence, and dare I say wisdom. He is passionate, and — largely self-taught — has depth to his musings.

While I found myself laughing as I listened to Adam, I also found myself thinking about what he was saying. For example, when asked what quality he values most in other people, he said, "the ability to adapt to change". That’s an excellent answer, I must say. If everyone could adapt to change willingly and rapidly, how much better would the world be? Creationism would disappear overnight, as would most forms of denialism.

His other answers are similarly thoughtful and interesting. When it was over, I was really goofily proud of him. If you’re a Mythbusters fan, I strongly urge you to listen to both those interviews. You’ll like ‘em.


Found: One of Neptune’s Asteroid Stalkers | 80beats

Astronomers have confirmed it: Neptune has a stalker. They have spotted, for the first time, an asteroid follower that keeps a fairly constant distance behind the planet in its orbit around the sun. And there may be many more.

Asteroid 2008 LC18 can’t help itself. It’s caught in a balancing game between the gravitational tugs of the sun and Neptune and the effects of its own orbital motion. The conflicting pulls cause the asteroid not to orbit Neptune or crash into it, but instead to follow the planet from a little distance behind (about 60 degrees back along its path).

Neptune has five of these pits–called Lagrangian points (see diagram below the fold)–but the spots ahead of and behind the planet, researchers say, are best for asteroid-trapping, since the hold is particularly stable in those two places. Researchers have previously spotted several asteroids in front of the planet (again, by about 60 degrees), but this is the first time they’ve found one following it. The findings appeared online yesterday in Science.

Spotting followers in the place behind the planet was particularly difficult because astronomers’ line of sight to where these special Lagrangian asteroids–called Trojans–roam overlapped with the center of the Milky Way. Says lead author Scott Sheppard in a Carnegie Institution press release:

“The L4 and L5 Neptune Trojan stability regions lie about 60 degrees ahead of and behind the planet, respectively. Unlike the other three Lagrangian points, these two areas are particularly stable, so dust and other objects tend to collect there. We found 3 of the 6 known Neptune Trojans in the L4 region in the last several years, but L5 is very difficult to observe because the line-of-sight of the region is near the bright center of our galaxy.” [Carnegie Institution]
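The geometry behind those two stable spots is simple to check: L4 and L5 sit on the planet’s own orbit, 60 degrees ahead of and behind it, so the Sun, the planet, and a Trojan form an equilateral triangle. A quick sketch (our own illustration, with Neptune’s orbit treated as circular):

```python
# L4/L5 geometry check (illustrative; assumes a circular orbit).
import math

r = 30.1  # Neptune's orbital radius in AU, roughly

def point_on_orbit(angle_deg):
    """Heliocentric position of a point on the orbit at the given angle."""
    theta = math.radians(angle_deg)
    return (r * math.cos(theta), r * math.sin(theta))

planet = point_on_orbit(0.0)
l4 = point_on_orbit(+60.0)   # leading Lagrangian point
l5 = point_on_orbit(-60.0)   # trailing point, where 2008 LC18 sits

print(f"planet-L4: {math.dist(planet, l4):.1f} AU, "
      f"planet-L5: {math.dist(planet, l5):.1f} AU, "
      f"Sun-L5: {math.hypot(*l5):.1f} AU")
# All three distances come out to ~30.1 AU: the equilateral-triangle
# configuration that keeps Trojans dynamically stable.
```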

Using the 8.2-meter Japanese Subaru telescope in Hawaii, astronomers spotted the L5 Trojan by waiting for dust clouds in our galaxy to block out the light from the galaxy’s center. They determined its orbit using Carnegie’s 6.5-meter Magellan telescopes in Chile, and they believe finding 2008 LC18 means the planet may have many Trojan asteroids. Sheppard believes they’d be easier to spot if Neptune weren’t so far away.

“We believe Neptune Trojans outnumber the Jupiter Trojans and the main-belt asteroids between Mars and Jupiter,” Scott Sheppard of the Carnegie Institution in Washington, D.C., told Space.com. “If Neptune was where the main-belt was, we’d know thousands of these objects.” [Space.com]

Sheppard also notes that Neptune’s newfound Trojan has a slightly tilted orbit, which may mean that the planet captured the asteroid when it was in a different location and trajectory–lending support to the theory that the giant planets don’t have the same orbits that they started with in the early solar system.

“The Neptune Trojans can tell us a lot about how the giant planets formed,” Sheppard said…. “People have been saying that the giant planets were very chaotic,” that they formed closer together, in unusual orbits, and migrated outward, Sheppard said. The Neptune Trojans’ odd orbits “supports that theory.” [National Geographic]

Related content:
80beats: Study: Uranus & Neptune Have Seas of Diamond—With Diamond Icebergs
80beats: Did Galileo Spot Neptune Two Centuries Before Its “Discovery?”
80beats: The Earth’s Oldest Diamonds May Show Evidence of Earliest Life
Bad Astronomy: A New Ring Around Uranus
Bad Astronomy: Did Herschel See the Rings of Uranus?

Images: NASA & Scott Sheppard


Do You Speak Brain? Try Studying These Neurons-on-a-Chip | Science Not Fiction

The neurons of a patient suffering from Alzheimer’s.

You may not be consciously aware of it, but at any given time your brain is playing host to billions of simultaneous conversations (and no, I’m not talking about those voices). I speak, of course, of the conversations between your neurons—the incessant neural jabbering that makes it possible for you to move your limbs, learn, remember, and feel pain. Every time we experience a new sensation or form a memory, millions of electrical and chemical signals are propagated across dense networks of axons and jump from one synapse to the next, building new neuronal connections or strengthening existing ones. And they are constantly changing—forming and reforming associations with other neurons in response to how the brain perceives and processes new bits of information.

Despite being central to our understanding of how the brain functions, these neural chats remain largely a mystery to scientists. What exactly are the individual neurons “saying” to each other? And how do these electrical and chemical “messages” become translated into actions, memories, or a range of other complex behaviors? To help decipher these discussions, a team of researchers from the University of Calgary led by bioengineer Naweed Syed has built a silicon microchip embedded with large networks of brain cells. The idea is to get the brain cells to “talk” to the millimeter-square chip—and then have the chip talk to the scientists through a computer interface.

Syed’s team demonstrated that it was possible to fuse neuronal networks to a microchip in 2004 when they created the original “brain on a chip,” the first bionic hybrid technology of its kind. The neurochip stimulates the cells and the resulting chatter—the activity of the neurons at the level of the ion channels and synaptic ends—can be recorded with a computer. At the time, Syed and his colleagues used the chips to eavesdrop on snail neurons, which are large (4 to 10 times larger than human neurons) and thus easier to cultivate than other animal brain cells.

The new version also relies on snail cells but is automated—a major improvement that means just about anybody can now learn how to grow the cells on the chips properly. Furthermore, the new chips offer much higher resolution and are more accurate. Whereas the first neurochips only enabled scientists to monitor the chatter between two brain cells, the new and improved models allow them to listen in on entire networks and pick up on all the minute neural exchanges.

Aside from giving researchers unprecedented access to the brain’s innermost workings, the hope is that this technology will pave the way for new drugs to treat neurodegenerative disorders like Parkinson’s and advanced prostheses that better mimic normal human motion by communicating directly with the brain. Over the coming months, Syed and his team plan on cultivating the neurons of a group of epileptic patients on their chips in order to study the cells’ dysfunctional activity.

People who suffer from epilepsy are wracked by frequent seizures, which are brought on by unusual and excessive neuronal chatter. By homing in on the defective ion channels that trigger these abnormal signals, Syed believes that his chips will yield crucial insights into the disease and lead to more effective treatments. If proven successful, the same model could be applied to other brain disorders, eventually eliminating the need to test drugs directly on patients—or at least providing a good pilot study before moving on to patients—and thus greatly accelerating the pace of research and development. It’s the same principle as the lung-on-a-chip, which scientists hope will lead to new drug-testing protocols that obviate the need for animal subjects.

It’s certainly not hard to see the appeal of these technologies. Everyone can get behind the idea of faster drug-development cycles and more finely tuned treatments, especially if it means that no humans or animals will be harmed in the process. In several years, after they become more sophisticated and ubiquitous, these neurochips could give a big boost to the fight against brain disorders, which are some of the trickiest puzzles in medicine.

Image: Yale