Daily Archives: May 28, 2017

Astronomy Guide to the rest of the Memorial Day Weekend – AccuWeather.com (blog)

Posted: May 28, 2017 at 8:18 am


Memorial Day weekend is here! If you have plans to relax outside this weekend, there are some cool things to check out in the night sky. We've been talking ...



Study: Female Astronomers are Cited Less Frequently – The Atlantic

Posted: at 8:18 am

The citations found at the end of research papers serve several purposes, like providing background on the current work and giving proper credit where it's due. They can also, according to a new study, reveal decades' worth of trends in whole fields of science.

A trio of researchers has waded through more than half a century of research published in astronomy journals and found that studies authored by women receive 10 percent fewer citations than similar studies written by men.

Neven Caplar of the Swiss university ETH Zurich and his colleagues analyzed more than 149,700 papers published between 1950 and 2015 in five journals: Astronomy & Astrophysics, The Astrophysical Journal, Monthly Notices of the Royal Astronomical Society, Nature and Science. They made sure that the papers being cited matched up in variables unrelated to gender, like the lead author's seniority in the field, the institutions they wrote from, the total number of authors on the paper, the number of references, the year and journal in which it was published, and the specific field of study. They say their findings, published Friday in Nature Astronomy, quantify the effect of gender bias in citations within astronomy research.

If there were no gender bias in astronomy research and only these factors mattered, the researchers' analysis predicts that men would actually receive 4 percent fewer citations than women. So their actual results were surprising, to the algorithms at least. In the context of history, their findings are not surprising at all.
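The study's full methodology is more involved than can be shown here, but the general shape of such an analysis, predicting citations from gender-blind features and then comparing actual counts against the prediction, can be sketched in a few lines of Python. The dataset, column names, and model choice below are illustrative assumptions, not the paper's actual code or data format.

```python
# Sketch of a citation-gap analysis: fit a model on gender-blind features,
# then compare each group's actual citations with the model's expectation.
# "papers.csv" and all column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

papers = pd.read_csv("papers.csv")
features = ["author_seniority", "n_authors", "n_references",
            "year", "journal_id", "field_id"]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(papers[features], papers["citations"])
papers["expected"] = model.predict(papers[features])

# A persistent shortfall of actual vs. expected citations for one group
# is the kind of signal the study describes.
print(papers.groupby("lead_author_gender")[["citations", "expected"]].mean())
```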

Since the late 1990s, women in the United States have earned nearly 60 percent of all bachelor's degrees, but about half of all degrees in science and engineering fields, according to the National Science Foundation. The number of women receiving degrees in science is on the rise, but women remain outnumbered in many of these fields, particularly in physics, engineering, and computer science. In 2013, an analysis of more than 8 million papers in the fields of natural sciences, social sciences, and humanities showed that men are more likely to be listed as lead authors. So it follows that with fewer women getting degrees, becoming researchers and professors, contributing to papers, and then leading papers, there are fewer women to cite.

Some of the gender disparity can be attributed to the nature of the workforce. Most science professionals got their degrees in the last 40 years, and those people tend to be disproportionately male and white, National Science Foundation statistics show. A 2014 report on an annual meeting of the American Astronomical Society found that although the gender ratio of speakers matched that of the audience, more men than women asked questions of the participants. The researchers in this study interpreted this observation to be a product of the workforce. "More senior scientists may be more likely to ask questions," they wrote, and senior scientists are usually men. Another survey of participants at a National Astronomy Meeting, organized by Britain's Royal Astronomical Society, made similar observations about question-askers. A 2016 survey of more than 13,000 requests for use of the European Southern Observatory over eight years found that female applicants had significantly lower chances of getting telescope time. The study attributed this result to the effects of seniority; only 34 percent of the women applying were professionally employed astronomers, compared to 53 percent of the men.

Critics of the effect described in the Nature Astronomy study could argue that researchers seek to use the best sources in their work, regardless of gender. Any perceived preference for male-led work surely must be unintentional. But research has shown that when gender is taken out of consideration, potential implicit biases fade away and the scales balance. In 2001, the journal Behavioral Ecology started using a double-blind review that masked the genders of the authors being evaluated. This led to a significant increase in female first-authored papers, a pattern not observed in a very similar journal that provides reviewers with author information, according to a paper that examined the policy. No negative effects could be identified. A similar effect has been found in hiring. In a 2012 study, researchers simulated an application process for a laboratory-manager job, randomly assigning the applicant a male or a female name. The participants, faculty members at a research university, were given identical credentials for the applicant. Yet the participants, both male and female faculty, rated the male applicant as significantly more competent than his female counterpart. Even scientists, some of the loudest advocates for objectivity, are not immune to deeply rooted differences in the perception of men and women.

The Nature Astronomy study does have some encouraging findings. The number of astronomy papers authored by women has increased over the last 50 years, and the citation gap between female-led and male-led papers has shrunk, the researchers write. They found that back in the 1950s and 1960s, men received between 50 percent and 100 percent more citations than women did.

The average number of citations in a paper has also increased, from about 10 in the 1960s to about 60 today, providing room for more authors to be recognized and credited, male or female. But the disparity persists, in astronomy and likely elsewhere, and even in the very study that examined it. Of the 19 authors cited in the paper, just six are women.


Juno results offer tantalizing hints of Jupiter’s secrets – Astronomy Magazine

Posted: at 8:18 am

Researchers also studied the planet's immense magnetic field and found that close to the planet it was much stronger than expected, clocking in at 7.766 Gauss, about ten times stronger than Earth's. Their measurements also found lots of magnetic complexity near Jupiter's outermost layers, which supports the hypothesis that the world's magnetic field is being driven by the swirling liquid hydrogen layer beneath the clouds. A full mapping of the magnetic field awaits data from further Juno orbits.

"Juno is giving us a view of the magnetic field close to Jupiter that we've never had before," said Jack Connerney, Juno's deputy principal investigator. "Already, we see that the magnetic field looks lumpy: It is stronger in some places and weaker in others. This uneven distribution suggests that the field might be generated by dynamo action closer to the surface, above the layer of metallic hydrogen. Every flyby we execute gets us closer to determining where and how Jupiter's dynamo works."

Understanding its magnetic field will add another piece to the puzzle of Jupiter's interior. While planetary scientists assume it to be mostly hydrogen, the true composition, density and structure remain unknown. Scientists assume that the crushing pressures create a large layer of metallic hydrogen in the planet's interior with a rocky core beneath, but definitive evidence is still lacking. Juno is also taking gravitational measurements as it orbits, which should give us more information about the interior as additional data becomes available.

In addition to looking below Jupiter's clouds, the researchers wanted to see what happens above them, where charged particles from both the sun and within Jupiter interact with its magnetic field, creating huge auroras. Juno first encountered the shroud of particles last summer when it passed through the bow shock, a sort of shock wave created when Jupiter's magnetic field shunts particles from the solar wind aside. The bow shock seems to have been moving outward as Juno passed through it, the researchers say.


Jupiter surprises in first trove of data from NASA's Juno mission – Astronomy Now Online

Posted: at 8:18 am

This image shows Jupiter's south pole, as seen by NASA's Juno spacecraft from an altitude of 32,000 miles (52,000 kilometres). Image credit: NASA/JPL-Caltech/SwRI/MSSS/Betsy Asher Hall/Gervasio Robles

The first months of observations of the solar system's biggest planet from NASA's Juno spacecraft have revealed huge swirling polar cyclones, previously undetected structures and motions beneath Jupiter's distinctive clouds, and the first evidence for what lies at the core of the gas giant, scientists said Thursday.

There was plenty scientists did not know about the planet when the Juno spacecraft left Earth in 2011, and the probe has sought answers to questions about Jupiter's interior, magnetic field, auroras and radiation belts, and used a visible light camera to capture the first direct views of the poles.

"The general theme of our discoveries is really how different Jupiter looks from what we expected," said Scott Bolton, Juno's principal investigator at the Southwest Research Institute in San Antonio. "Juno, in many ways, is looking inside Jupiter for the first time, close-up and personal."

Since Juno arrived at its destination July 4, 2016, to wrap up a five-year interplanetary trip, the spacecraft, built and operated by Lockheed Martin, has circled Jupiter six times in an oval-shaped loop that extends a few million miles at its farthest point. Each lap takes more than 53 days, and Juno speedily skirts within 3,000 miles (5,000 kilometres) of Jupiter's cloud tops at closest approach.

Juno's science instruments collect most of their data when the orbiter is near Jupiter, taking pictures, measuring plasma and electrons, and probing deep inside the planet to find out what is hidden under its cloudy veneer.

Many scientists thought Jupiter was relatively boring and uniform inside before Juno arrived, Bolton said.

"For decades, scientists have assumed this, that if we drop below the cloud tops, below where the sunlight reaches, that pretty much Jupiter was all uniform inside, and it really didn't matter where you looked, it would all look the same," Bolton said Thursday. "And what we're finding is anything but that is the truth. It's very different and very complex."

Juno's microwave radiometer, an instrument similar to those aboard climate satellites looking down on Earth, gathers sounding measurements to peer below the red-orange tapestry of Jupiter's cloud tops.

The radiometer is tuned to six wavelengths, detecting thermal radiation emitted from different layers of the atmosphere, from the storm clouds and jet streams to as deep as 300 miles, or about 500 kilometers.

Going into Juno's mission, scientists expected Jupiter's atmosphere to be relatively uniform deeper than 60 miles, or 100 kilometers. Instead, Juno's microwave radiometer discovered a belt of ammonia around Jupiter's equator, and variations in ammonia abundance at other latitudes, extending deep into the planet's atmosphere.

"This was completely unexpected," Bolton said. "You have a deep band of ammonia that goes from the top of Jupiter as deep as we can see. It goes down to 350 kilometres (217 miles) because that's the limit of where we're looking."

The ammonia band may penetrate even deeper inside Jupiter, Bolton said.

"What this is telling us is that Jupiter is not very well-mixed," Bolton said. "It's not all uniform inside. The idea that, once you drop below the sunlight, everything would all be uniform, boring and mixed up was completely wrong. It's actually very different depending on where you look."

The findings suggest more ammonia farther down in Jupiter's atmosphere, and the ammonia detections appear to have no relationship with the zones and belts of clouds visible in pictures from space.

"That's really going to force us to rethink not only how Jupiter works, but how do we explore Saturn, Uranus and Neptune if they are highly variable like this?" Bolton said.

Other parts of Juno's scientific sensor suite are mapping Jupiter's gravity field to learn about the heart of the planet.

"When we went to go measure the gravity field, what we were really looking for was the core, whether there was a compact core or no core," Bolton said. "Instead, what we found was that it really looks fuzzy. There may be a core there, but it's very big, and it may be partially dissolved. We're studying that, but that came as a big surprise to us that there was no core."

Theories about Jupiter's core before Juno arrived predominantly predicted the planet either had a small, dense rocky core between one and 10 times as massive as Earth, or no core at all, scientists said.

"Most scientists were in one camp or the other, and what we found was really neither was true," Bolton said. "There may be a little bit of a compact core, but there may be layers there, and there seems to be a fuzzy core that may be much larger than anybody had anticipated."

"The gravity data that we've gotten thus far is not really consistent with just a small compact core or zero core, but it is somewhat consistent with a large fuzzy core that may be partially dissolved," Bolton said. "It's also consistent, maybe, with some deep motions, or zonal winds and things like that dictating the interior of Jupiter's dynamics, which are very different than historically models have assumed."

Jupiter's intense magnetic field, the strongest of any planet in the solar system, has also been interrogated by Juno, which has a magnetometer mounted at the end of one of the craft's three solar array wings.

Jack Connerney, Juno's deputy principal investigator at NASA's Goddard Space Flight Center in Maryland, described the magnetometer as a fancy compass that can measure the direction and strength of Jupiter's magnetic field.

Juno has come closer to Jupiter than any mission before, and proximity yields better magnetic field measurements, Connerney said.

"What we found in our first few passes is that the magnetic field was both stronger than we expected where we expected it to be strong, and it was weaker than we expected where we expected it to be weak," Connerney said. "In other words, it evidenced a dramatic spatial variation that we were not quite aware of previously."

The fluctuations detected by Juno suggest the spacecraft is unexpectedly close to the magnetic field's source, or dynamo.

Scientists thought the magnetic field might be generated in a global pool of liquid metallic hydrogen in Jupiter's middle layer, somewhere between the center of the planet and the atmosphere. Squeezed at extreme pressure, the deep layer of hydrogen is liquefied and conducts electricity.

The magnetic field expands outward from Jupiter and is blown back by the solar wind like a comet's tail. The magnetic field bubble, called a magnetosphere, is similar to the one around Earth, but Jupiter's is so immense it would be the size of the full moon in the sky, if it were visible to the naked eye.

Juno's observations might mean that the dynamo is above that metallic hydrogen region, Connerney said, perhaps in an envelope of molecular hydrogen.

An infrared camera and ultraviolet spectrometer aboard the Juno spacecraft have been looking at Jupiter's powerful polar auroras, producing another set of observations that surprised scientists.

It turns out some of the auroral light emissions seem to be produced by electrons streaming out of Jupiter's atmosphere, not by charged particles riding field lines into the planet, as is the case with Earth's auroras. One of Juno's instruments, an electron detector, found particles moving upward as the orbiter soared over Jupiter's south pole.

According to Connerney, the electrons are probably drawn out of the planet along the same field lines that scientists thought would carry particles into Jupiter.

"As they're leaving, they collide with hydrogen molecules and excite ultraviolet emissions," Connerney said. "It's a 180-degree turnabout from the way we were thinking about those emissions prior to the Juno observations."

NASA's Cassini spacecraft, in the final months of its mission, is now orbiting Saturn on a trajectory similar to Juno's. Bolton said scientists are eager to use observations from the two craft to compare the solar system's two largest planets.

"Cassini doesn't have the exact same kind of instruments we have, and of course, we're tuned to do this interior research, but it has a lot of great instruments that can learn a lot about the interior and other things that it can do close-up," said Bolton, who is also a member of the Cassini science team.

"We're both trying to figure out our data from our own planets at the moment, but eventually we will compare, and of course, that's the key to scientific advancement, comparative study," Bolton said. "So being able to compare Cassini's measurements at Saturn and Juno's measurements at Jupiter, we will really be able to advance our understanding of how these giant planets work."

Juno's camera has scanned Jupiter during each pass over the planet's poles, catching dozens of swirling storms in the act, some the size of Earth.

The Juno team relies on amateur observers and image processors logged in to the mission's website to crunch raw views from JunoCam and create colorful mosaics.

"What you see are incredible, complex features," Bolton said. "These cyclones and anticyclones all over the poles. That wasn't really expected."

"The bluish hue is probably real," he said of one south pole mosaic. "And the biggest feature is that Jupiter, from the poles, doesn't look anything like it does from the equator."

"Our usual picture of Jupiter has zones and belts, the Great Red Spot, and you see these stripes, and that's the Jupiter we've all known and grown to love," Bolton said. "When you look from the pole, it looks totally different. If you looked at this picture, and somebody had shown it to you a few years ago, I don't think anybody would have guessed this is Jupiter."

Mission managers tacked on the JunoCam imager to the spacecraft's instrument package after NASA selected Juno for development in 2005. JunoCam was not originally part of the Juno mission, but officials added the camera as a public outreach tool.

Scientists said JunoCam's imagery adds context to their data analysis work, but it also engages a broader community of professional and amateur scientists, space enthusiasts and artists.

"The contributions of the amateurs are essential," said Candy Hansen, Juno co-investigator at the Planetary Science Institute in Tucson, Arizona. "I cannot understate how important the contributions are. We don't have a way to plan our data without the contributions of the amateur astronomers."

"We don't have a big image processing team, so we are completely relying on the help of our citizen scientists," Hansen said.

JunoCam collects images in strips as the spacecraft spins on its main axis, and contributors stitch the strips together to make pictures.

"What I find the most phenomenal of all is that this takes real work," Hansen said. "When you download a JunoCam image and process it, it's not something you do in five minutes. The pictures that we get that people upload back onto our site, they've invested hours and hours of their own time, and then generously returned that to us."

Hansen said JunoCam has spotted tiny features suspended above Jupiter's main cloud deck that look like squall lines on Earth. The clouds are dwarfed by Jupiter's enormous scale, but they actually stretch around 30 miles, or 50 kilometers, across, she said.

"I keep saying (they're tiny), but they're really not tiny at all," Hansen said. "They're up above the cloud deck at a pressure level where the temperature is going to be very cold, so what you're seeing is most likely ice crystals of water ice and ammonia ice."

Juno's next close-up encounter with Jupiter is set for July 11, when the orbiter will pass above the Great Red Spot for the first time.

"The discoveries made by Juno so far are making us rethink how giant planets work, not just in our own solar system, but giant planets are really important throughout the galaxy and the Universe," Bolton said.

"We're getting the first really close-up and personal look at Jupiter, and we're seeing that a lot of our ideas were incorrect, and maybe naive, that it's very complex, and there are a lot of deep motions going on," he added.

NASA decided in February to forgo an engine burn that would have moved Juno into a tighter, 14-day orbit around Jupiter, after engineers detected a problem with check valves inside the craft's propulsion system last year.

Juno's mission will last until at least February 2018, enough time to make 11 science orbits around Jupiter, instead of the 32 laps originally planned. But NASA could extend the mission another three years to give Juno more flybys near Jupiter.

"There's a theme here. There are motions going on just beneath the clouds that we see with the microwaves, and there may be very deep winds and deep motions going on that we see with the gravity field (sensors)," Bolton said. "It's hard to say yet, but more data will tell us how deep those really go. We're just at the beginning of this mission, where eventually we're going to map out that planet."


Follow Stephen Clark on Twitter: @StephenClark1.


Microsoft’s weapon in high-stakes cloud-computing battle with Amazon? Freebies – The Seattle Times

Posted: at 8:18 am

Microsoft isn't banking on snazzy marketing or technical chops alone to make its Azure service a winner in the critical cloud-computing market. It's also offering freebies, betting that discounts and free technical support today will produce paying customers down the line.

DefinedCrowd, a Seattle software startup, had a choice to make when it was developing its first product last year: build on the cloud-computing foundation offered by the dominant Amazon.com, or on Microsoft's upstart competitor?

For founder Daniela Braga, the competing services seemed about even in terms of features. On price, Amazon's tools were a bit cheaper than Microsoft's. And more developers were comfortable working with Amazon Web Services (AWS), the cloud-computing pioneer and now the market's largest player.

But Microsoft held the trump card: an offer of $500,000 in credits to spend on Microsoft's Azure cloud services over three years, a benefit DefinedCrowd had earned by participating in a Microsoft startup program. That kind of sum can pay for the entire technology-infrastructure cost of getting a software company's first products off the ground.

"That was kind of hard to refuse," said Aya Zook, business-development manager with DefinedCrowd, which makes tools to train software how to recognize speech or images.

The startup would build its software on Microsoft's Azure.

Microsoft has staked its future on the cloud, the range of on-demand computing power and software services bundled into Azure and other products.

But Microsoft isn't banking on snazzy marketing or technical chops alone to make Azure a winner. The technology giant is also offering bargains and freebies, including discounts to large businesses, free trial offers to all comers, and grants of cash for startups and nonprofits that try the service.

The programs are part of a broader, companywide effort to gain market share. The bet is that discounts and free technical support today will produce paying customers down the line, ideally bringing in thousands of dollars a year to Azure and boosting awareness of Microsoft's offering in a highly competitive market.

It's an old tactic for a company that has long had plenty of cash to work with. Exactly where Microsoft has deployed that money to lure software developers offers a window into the company's shifting priorities over the years.

In the midst of its unsuccessful smartphone push a few years ago, Microsoft was shelling out a reported $100,000 (and up) to application makers who built tools for Windows Phone. Before that, Microsoft made similar deals to get developers and corporate partners interested in Bing, the fledgling search engine. And to a generation of technologists years ago, Microsoft offered ample support to get businesses to plug into the new Windows Server.

Those programs have yielded mixed results, said Michael Cherry, who worked at Microsoft in the late 1990s, and today tracks the company with analysis firm Directions on Microsoft.

"Grants to use products don't tend to make a big difference on their own," he said. "But when you can add feet on the ground to help a developer that had a problem? They'll be loyal to you forever."

For Microsoft, the cloud is the priority today.

It was the focus of the company's recent Build developer show in Seattle, where the company kicked off the proceedings by staking out a virtual claim to the city, and the market.

A promotional video showed the Space Needle topped by a flag with the Microsoft logo on one side, and "Cloud City" on the other. Never mind that Amazon, with a much bigger cloud-market share than Redmond-based Microsoft, has its headquarters just a few blocks away from the landmark.

When choosing between Amazon and Microsoft, Braga concedes she had a soft spot for Microsoft. A linguist and speech-software expert originally from Portugal, she had spent seven years at the company. Zook, her colleague, is a fellow Microsoft alum.

"We're ex-Microsoft people," she said. "It's an environment that we're comfortable with."

Still, she said, "There are a lot of incentives, and pressure, to go on AWS."

Amazon, which pioneered the business of selling software and developer tools delivered over the internet, built its lead in that market, in part, by touting an easy-to-use product that offered room to experiment without paying. Adding to the appeal, technologists say, was the absence of complex, negotiated software-licensing deals of the sort Microsoft relies on.

A free tier of AWS services, introduced in 2010, can add up to thousands of dollars a year, a benefit available to all customers regardless of size. The company has bolstered that in recent years with credits aimed at researchers and educators, as well as standard startup grants ranging from $15,000 to $100,000.

The combination, on top of a technologically impressive set of products, has given AWS an enviable list of customers at the cutting edge of technology, including Netflix, Airbnb and Slack.

To counter AWS' lead, No. 2 Microsoft has brought to bear what some see as its greatest asset: a giant base of corporate customers, and a sales force of tens of thousands built to sell to them.

In contract talks with corporate customers of Windows, Office and other software, Microsoft recently has been offering discounts on those products in exchange for a commitment to buy thousands of dollars' worth of Azure cloud-computing services, according to consultants who advise those companies.

The company has also lent customers its own engineers.

Mojio, a Vancouver, B.C.-based software maker, participated in a Microsoft program called BizSpark, essentially a boot camp for technology startups eager for Microsoft's counsel and connections. The program comes with complimentary Microsoft software and, in the last two years, up to $120,000 in cash to use on Azure over two years (though some companies, including Mojio, have received larger grants).

Mojio, which builds software for connected cars, had just about run out of free Azure credits when it caught its big break: a deal with wireless carrier T-Mobile.

Mojio signed on to supply some of the technology behind the Bellevue company's new car-mounted Wi-Fi hot spot and diagnostic data-gathering tool. The product went live the Friday before Thanksgiving.

By Monday, Mojio was in crisis mode.

The stream of data being thrown off by the hot spots and into Mojio's systems, built on Microsoft's Azure, pushed them to the breaking point. So many customers were using the tools that the software built to digest the data slowed to a crawl.

"It wasn't clear whether it was an architecture issue, whether it was a bug," said Mojio chief executive Kenny Hawk. "The volume came faster than any of us had predicted."

Hawk, worried that he was watching his startup implode, called in a big favor.

A friend, a former Microsoft board member whom he declined to name, agreed to put in a call to Microsoft Chief Executive Satya Nadella, asking for help on behalf of tiny Mojio, which then employed fewer than 15 people.

"Literally within a couple hours there were (Microsoft) people working on it," Hawk said.

The next day, Microsoft engineers arrived in Vancouver. They would work side by side with Mojio's staff for the next three days to retool the software to handle a larger workload.

Hawk is grateful for the help, but has no illusions: Microsoft isn't a charity.

The company, he says, is probably hopeful that Mojio, which outgrew its free allotment of Microsoft tools, would eventually become a major buyer of them.

"It wasn't just that we were nice people, or that we'd been a part of BizSpark," Hawk said. "They see how big the connected car market will be. Having a core customer in that space is strategic."

Corey Sanders, who leads a Microsoft team building Azure infrastructure services that compete with Amazon, wasn't involved with the Mojio rescue and hadn't heard the story. Still, the scale of Microsoft's response didn't surprise him.

"In the competitive cloud market, every customer matters," he said. "Every product is critical."


Cloud computing will change the nature of hospital IT shops – Healthcare IT News

Posted: at 8:18 am

Start putting the puzzle pieces together and a clear picture emerges of hospitals implementing more and more cloud services in the immediate future.

The freshest of those pieces, IDC's Cloud in Healthcare 2.0, said that hospitals are acquiring a taste for buying IT via the pay-as-you-go model and its operational-expenditure approach, rather than purchasing technology the old-fashioned way, as a capital expenditure.

"The use of cloud computing as an increasingly business-critical technology is quickly changing how healthcare organizations and payers evaluate, procure, and deploy IT assets," IDC analysts wrote.


Earlier this month, HIMSS Analytics research director Brendan FitzGerald said that data-intensive trends such as precision medicine and population health will demand more robust infrastructure than what hospitals have in place to support EHRs today. Moving forward, then, more and more hospitals will turn to infrastructure-as-a-service offerings from Amazon, IBM, Google, Microsoft and others.

Smart CIOs should be thinking about the best ways to coordinate cloud vendors and infrastructure instead of applying an asset-centric view toward managing IT resources, IDC added, so they can ultimately deliver either cost-savings, innovation or both.

Hospitals should also be taking inventory of how many and exactly which cloud services various lines of business have tapped. While that may sound simple, the Internet Security Threat Report Symantec published late last month found that CIOs thought their users had about 30 or 40 cloud apps but, instead, enterprises have 928 already.

IDC said that cloud computing will become the main platform for analytics and big data, as well as mobile and internet of things tools. As those and other emerging technologies, such as cognitive computing, 3D printing and robotics spark digital transformation, CIOs and IT departments will have big opportunities to drive innovations in the cloud that they otherwise could not.

But the cloud model will also force them to evolve.

"IT departments will operate in an environment that has a centralized operating model where they focus on service delivery and more predictable expenditures," the IDC analysts wrote. "Cloud will enable an IT department to have a line of business point of focus because daily operations and services are acquired instead of managed internally."

Twitter: @SullyHIT | Email the writer: tom.sullivan@himssmedia.com



Baidu to leverage cloud computing, artificial intelligence, in effort to ramp up behavioural analysis – South China Morning Post

Posted: at 8:18 am

Chinese internet giant Baidu says it plans to leverage advanced cloud computing to analyse the online data of millions of its users to help companies improve their marketing campaigns.

The Chinese search engine giant, which has real-time search data on more than 700 million internet users, is able to analyse individual users through its cloud arm's artificial intelligence (AI), big data and cloud computing technologies, Yin Shiming, vice president and general manager of Baidu Cloud Computing, said in Shenzhen.

"AI is bringing in new ways of thinking for many traditional industries," said Yin, who cited the recent battle between AlphaGo and Chinese Go master Ke Jie as supporting his view that the development of AI technology has stepped up.

"Our Marketing Cloud, backed by Baidu Cloud's data and technology, is not just saving resources and costs, but making marketing easier," Yin said.

Despite challenges from other local search brands such as Sogou and Qihoo 360, Baidu's dominance in online search has hardly wavered over the years; it accounts for about 75 per cent of the search market.

Baidu's mobile app is ranked as the seventh most popular in China, with 244.3 million active mobile users as of the end of March, according to Beijing-based research agency Analysys.

Currently over 70 per cent of newly emerged marketing strategies are AI-driven, according to Tang Jin, a deputy general manager of Baidu Cloud Computing. "But tonnes of data on the internet is ignored without being interpreted properly. To achieve precise marketing for commercial institutions, we need to understand user behaviour on the internet first," said Tang.

The scale of the cloud computing industry in the mainland is forecast to grow to 430 billion yuan (US$62.75 billion) in 2019 from 150 billion yuan in 2015, according to the Ministry of Industry and Information Technology.

Baidu launched its AI platform for commercial users in Beijing in November. The system is powered by cloud computing technologies that include perception, machine learning and deep learning. More than 30,000 enterprises from various sectors are reportedly employing Baidu's cloud services.

The company is actively pushing for a transition from its traditional search-engine business to an AI-led company to broaden its revenue channels after a reported 10.6 per cent drop in net profit during the first quarter ended March 31.

Baidu aims to intensify efforts in applying AI technology to improve existing products and accelerate the development of AI-enabled new businesses for higher revenue growth in the coming quarters, vice president Lu Qi said after the quarterly results in late April.

Baidu in January appointed Lu, a leading AI expert and former Microsoft Corp executive, as its chief operating officer in a bid to bolster its efforts in AI.


Cray Takes the Plunge into Cloud Computing – TOP500 News

Posted: at 8:17 am

Cray is now offering its Urika-GX supercomputer for rent. One of the last HPC system vendors to give cloud computing a whirl, the company's initial foray into supercomputer-as-a-service will target life science customers looking for compute cycles on something more sophisticated than a traditional cluster.

To make its cloud business fly, Cray is partnering with Markley, a cloud infrastructure provider based in Boston, Massachusetts. Markley is a fairly typical cloud company, offering services like collocation, utility storage, disaster recovery, and so on. The company promises 100 percent uptime.

Cray's entrance into the cloud came about as a result of a beta trial of the Urika-GX supercomputer by a research institute located outside of Boston. According to Ted Slater, who heads up the healthcare and life sciences unit at Cray, genomic researchers there were doing variant analysis, studying cell mutations associated with disease. Identifying those mutations can often lead to effective diagnoses and treatments.

Slater says the researchers were able to realize a five-fold speed-up on their variant analysis runs, compared to the HPC clusters they were using. That allowed them to analyze more data and ask more interesting questions. In fact, the faster turn-around time sped up the whole workflow, including software development of the genomic codes.

It's not too surprising that a Urika-GX could outrun a conventional HPC cluster, given its customized design, in particular its use of the Aries interconnect to speed inter-node communications. It's also important to know that the Urika-GX is an extremely flexible platform for analytics, says Slater.

The system comes with a complete software stack tuned for analytics applications, especially graph analytics. That includes the low-level Cray Graph Engine, the popular statistical programming language R, and distributed programming frameworks like Hadoop and Spark. Application libraries can be added as needed.

Thanks to the five-fold performance improvement, the research institute was sold on the Urika-GX, but they preferred to rent rather than buy. After all, says Fred Kohout, Cray's senior vice president of products and chief marketing officer, who wouldn't want to use a Cray?

Although the cloud offering will initially be confined to the Urika-GX system and life science types, Kohout says they're already considering ways to expand the business. "We're going to continue to look at other industries and other parts of the Cray portfolio as they make sense," says Kohout. "And we'll roll those out in the months ahead."

If you happen to be attending the Bio-IT World Conference and Expo in Boston this week (May 23-25), Cray and Markley will be on hand to talk about their cloud computing venture. If you miss the event, the two companies will be conducting a live webinar on the new service on June 13 at 10:00 am PDT.


Doped Diamonds Push Practical Quantum Computing Closer to Reality – Motherboard

Posted: at 8:17 am

A large team of researchers from MIT, Harvard University, and Sandia National Laboratories has scored a major advance toward building practical quantum computers. The work, which is described in the current Nature Communications, offers a new pathway toward using diamonds as the foundation for optical circuits: computer chips based on manipulating light rather than electric current, basically.

Pushing beyond the quantum computing hype and, perhaps, misinformation, we're still faced with a largely theoretical technology. Engineering a real quantum computer is hard because it should be hard. What we're attempting to do is harness a highly strange, and even more fragile, property of the quantum world, which is the ability of particles to occupy seemingly contradictory physical states: up and down, left and right, is and isn't.

If we could just have that property in the same sense that we can have a basic electronic component like a transistor, we'd be set. But maintaining and manipulating qubits, the units of information consisting of simultaneous contradictory particle states, is really hard. Just looking at a quantum system means disrupting it, and, if that system happened to be encoding information, the information is lost.

The almost-perfect lattice structure of atoms in a diamond offers a promising foundation for a quantum circuit. Here, a qubit is stored in a "defect" within the diamond. Every so often within the neatly ordered confines of a diamond, an atom will be missing. Into this vacancy, another atom might sneak in to replace the missing carbon atom. This diamond defect may in turn have some free electrons associated with it, and it's among these particles that information is stored (while information is transmitted around the diamond as photons, or light particles).

Crucially, this little swarm of electrons naturally emits light particles that are able to mirror the quantum superposition (the particle or particle system in multiple states). This is then a way of retrieving information from the qubit without disturbing it.

The challenge is in finding and implementing the ideal replacement for the carbon atom in the diamond lattice. This replacement is known as a dopant. This is where the new study comes in.

The most-studied dopant for diamond-defect optical circuits is nitrogen. It's stable enough to maintain the requisite quantum superposition, but is limited in the frequencies of light that it can emit. It's like having a perfect encryption system that can nonetheless only represent like a quarter of the alphabet.

The dopant explored in the new research is silicon. Silicon atoms embedded into a diamond lattice are able to emit much narrower wavelength bands. It's like they have a higher resolution. But the cost of being able to represent information with more precision is more precarious quantum states. Consequently, the diamonds have to be kept at very near absolute-zero temperature. Nitrogen states, meanwhile, can withstand heat up to about four degrees above absolute zero. In either case, we're not exactly talking about quantum laptops.

The researchers were able to implant silicon defects into diamonds via a two-step process involving first blasting the diamond with a laser to create vacancies and then heating the diamond way up to the point that the vacancies start to move around the lattice and bond with silicon atoms. The result is a lattice with an impressively large number of embedded silicon atoms that are exactly where they should be within the structure.

The result is a promising pathway toward reliable fabrication of "efficient light-matter interfaces based on semiconductor defects coupled to nanophotonic devices." The stuff of a quantum computer, in other words.


IBM to Sell Use of Its New 17-Qubit Quantum Computer over the Cloud – All About Circuits

Posted: at 8:17 am

IBM has created a 17-qubit quantum computer and is making plans to timeshare the machine with other companies via cloud computing. While this is an important step, it isn't quite enough to make quantum computers truly competitive compared to supercomputers. What will it take to bring quantum computing into the commercial realmand how long until we get there?

Classical computing has been around for many years and has completely transformed the human race. Near instant communication between any two individuals used to be a dream. The idea of large calculations being done faster than you can blink was unimaginable. The concept of free information and education was too much for any University to handle.

But it comes as no surprise that, now that these concepts are a reality, we've become dependent on them. This dependence places pressure on the industry to produce more powerful devices with every passing year. This was not an issue in the past since silicon devices were easy to scale down. But, with transistor gates as small as one atom thick, shrinking may no longer be possible. Silicon, the building block of modern semiconductors, is already being phased out by Intel, and future devices using feature sizes of 7nm and smaller will instead be made from materials such as Indium-Gallium-Arsenide (InGaAs).

One solution for increasing computational power is the use of quantum computers (though their creation isn't likely to lead to faster consumer devices). A common classical application relies on control flow, discrete mathematics, and IO handling. A quantum computer, however, is designed to solve statistical problems and scenarios which involve large amounts of data. The best way to understand it is to compare a classical processor (such as an i7) to an imaginary quantum processor (an iQ7, for example). The i7 could add 1000 numbers together much faster than the iQ7, but the iQ7 could solve a game (such as checkers) much faster than the i7 due to the number of possible moves that the game possesses.

So why are quantum computers so good at parallel data crunching?

A classical computer is made up of transistors which handle two possible states: on (1) and off (0). With each additional bit, the number of distinct states that can be represented doubles: n bits can represent 2^n states. For example, four bits can represent one of 16 possible states and eight bits one of 256 possible states.
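That exponential growth is easy to check for yourself; a couple of lines of Python reproduce the article's numbers:

```python
# n classical bits can represent 2**n distinct states.
for n in (1, 4, 8):
    print(f"{n} bits -> {2**n} states")
# 1 bits -> 2 states
# 4 bits -> 16 states
# 8 bits -> 256 states
```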

By comparison, a quantum bit, or qubit, can hold three kinds of states: on (1), off (0), and a superposition state. While the on and off states behave in an identical manner to classical bits, the superposition is what drives quantum computation. A superposition is a weighted combination of 0 and 1, allowing four qubits to represent all 16 different states at the same time, where each of those 16 states has a complex amplitude reflecting its probability of being observed.
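To make the bookkeeping concrete, here is a toy state-vector simulation in plain NumPy. This is an illustration of the math, not how a real quantum device is programmed: a four-qubit register is just 16 complex amplitudes, and measuring it picks one of the 16 classical states with probability given by the squared magnitude of its amplitude.

```python
import numpy as np

n = 4                                    # qubits in the register
dim = 2**n                               # 16 basis states

# Random complex amplitudes, normalized so the probabilities sum to 1.
state = np.random.randn(dim) + 1j * np.random.randn(dim)
state /= np.linalg.norm(state)

probs = np.abs(state) ** 2               # probability of observing each state
print(len(state), probs.sum())           # 16 amplitudes, probabilities sum to 1.0

# A measurement collapses the register to a single classical outcome.
outcome = np.random.choice(dim, p=probs)
print(f"measured state: {outcome:0{n}b}")
```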


So it's pretty obvious that quantum computing provides many advantages over classical computers for complex, parallel data processing. While such tasks are not commonly found in the everyday device, they are almost too common in many different industries, including financial data processing, insurance, scientific models, oil reserves, and research. Currently, supercomputers are used for such parallel data processing but, if a quantum option were available, it's a safe bet that each of these sectors would do anything to get one.

This has been one of the major drives in quantum computer technology with many companies trying to produce such a machine. For example, D-Wave Systems have their series of specialized quantum annealing processors, while many other researchers and companies are trying to find methods of producing universal quantum gates.

However, IBM has just taken the lead with their 17-qubit quantum computer.

What makes the IBM quantum computer a game changer is that it is a universal quantum computer, as opposed to a highly specialized device. Most other quantum systems currently available are of the annealing persuasion, which is good for optimization problems but not for other quantum problems such as database searches. The IBM machine, however, can be configured to execute just about any quantum problem.

IBM has decided to sell time on the computers to businesses and researchers alike through its IBM Q program, accessed via the internet (i.e., over the cloud). This will allow developers and researchers to create a quantum program anywhere around the world and then have it executed with the press of a button.
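As a rough illustration of what "create a quantum program and have it executed" can look like, here is a minimal sketch using Qiskit, the open-source SDK associated with IBM's quantum program. Treat it as an assumption-laden example rather than IBM's exact cloud interface: it uses a pre-1.0 Qiskit API and runs on a local simulator, not the 17-qubit machine.

```python
# Minimal two-qubit program: entangle a pair of qubits and sample them.
# Illustrative only; local simulator, pre-1.0 Qiskit API.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into an equal superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out as classical bits

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                # roughly {'00': ~512, '11': ~512}
```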

IBM made strides with its previous 5-qubit quantum computer, and this 17-qubit machine is obviously another milestone. However, many say that even a 17-qubit computer is not good enough, because classical computers can still process the same information in a smaller time frame. In fact, it has been stated that classical computers can model quantum computers up to 50 qubits in size. This means that, for a quantum computer to become better at solving quantum-related problems than a classical computer, it has to contain at least 50 qubits. Of course, this assumes that such quantum computer simulations on classical computers do not improve.
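One way to see why roughly 50 qubits is the often-quoted crossover: a brute-force classical simulation stores one complex amplitude per basis state, and the state count doubles with every added qubit. A quick back-of-the-envelope memory estimate in Python, assuming 16-byte complex128 amplitudes:

```python
# Memory needed to hold a full n-qubit state vector at 16 bytes/amplitude.
def state_vector_gib(n_qubits: int) -> float:
    return 2**n_qubits * 16 / 2**30      # GiB

for n in (17, 30, 50):
    print(f"{n} qubits -> {state_vector_gib(n):,.3f} GiB")
# 17 qubits -> 0.002 GiB (about 2 MiB)
# 30 qubits -> 16.000 GiB
# 50 qubits -> 16,777,216.000 GiB (about 16 PiB)
```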

So Google is ambitiously planning to release a 49-qubit quantum computer by the end of this year. Considering the size difference between the IBM machine and the proposed Google machine, however, it's likely safe to assume that Google's machine may not be entirely universal.

It's safe to say that quantum computers, despite becoming increasingly more powerful, are still very far away from being commercially available. IBM's cloud-based scheme, however, does technically place quantum computing into the commercial realm.

Supercomputers are still very powerful compared to quantum computers, and their cost-to-performance ratio makes them highly economical. But, unlike fusion power (which is always 20 years away), quantum computers really could make their debut when either IBM or Google releases the world's first 50-qubit computer.
