The Prometheus League
Breaking News and Updates
Monthly Archives: June 2017
ALMA Observes Massive Protostar in Kleinmann-Low Nebula – Sci-News.com
Posted: June 12, 2017 at 8:42 pm
A team of astronomers has determined how the gas flow from a massive infant star is launched. The researchers used the Atacama Large Millimeter/submillimeter Array (ALMA) to observe the 10-solar-mass protostar Orion KL Source I in the Kleinmann-Low Nebula and obtained clear evidence of rotation in the outflow.
Artist's impression of Orion KL Source I. The massive protostar is surrounded by a disk of gas and dust. The outflow is launched from the surface of the outer disk. Image credit: ALMA / ESO / NAOJ / NRAO.
Stars form from gas and dust floating in interstellar space, but astronomers do not yet fully understand how the massive stars seen in space can form.
One key issue is gas rotation. The parent cloud rotates slowly in the initial stage and the rotation becomes faster as the cloud shrinks due to self-gravity.
Stars formed in such a process should have very rapid rotation, but this is not the case. The stars observed in the Universe rotate more slowly.
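The spin-up described here follows directly from conservation of angular momentum. As a quick back-of-the-envelope check (the symbols are mine, not the article's), for a collapsing cloud of fixed mass $M$, radius $R$ and angular speed $\Omega$:

```latex
J \sim M R^2 \Omega = \mathrm{const}
\quad\Longrightarrow\quad
\Omega \propto \frac{1}{R^2}
```

So a cloud that contracts by a factor of 100 in radius would have to spin 10,000 times faster unless angular momentum is carried away, which is exactly the problem the rotating outflow is thought to solve.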
How is the rotational momentum dissipated? One possible scenario involves the gas emanating from protostars.
If the gas outflow rotates, it can carry rotational momentum away from the system.
Astronomers have tried to detect the rotation of the outflow to test this scenario and understand its launching mechanism.
In a few cases signatures of rotation have been found, but it has been difficult to resolve clearly, especially around massive protostars.
Orion KL Source I observed with ALMA. The massive protostar is located in the center and surrounded by a gas disk (red). A bipolar gas outflow is ejected from the protostar (blue). Image credit: ALMA / ESO / NAOJ / NRAO / Hirota et al.
Dr. Tomoya Hirota, an astronomer at the National Astronomical Observatory of Japan (NAOJ) and SOKENDAI, and colleagues observed a protostar called Orion KL Source I in the Kleinmann-Low Nebula, the most active part of the Orion Nebula complex.
Thanks to its proximity and ALMA's advanced capabilities, Dr. Hirota and co-authors were able to reveal the nature of the outflow from Orion KL Source I.
"We have clearly imaged the rotation of the outflow. In addition, the result gives us important insight into the launching mechanism of the outflow," Dr. Hirota said.
The new ALMA observations beautifully illustrate the rotation of the outflow: it rotates in the same direction as the gas disk surrounding the star; this strongly supports the idea that the outflow plays an important role in dissipating the rotational energy.
Furthermore, ALMA clearly shows that the outflow is launched not from the vicinity of Orion KL Source I itself, but rather from the outer edge of the disk. This morphology agrees well with the magnetocentrifugal disk wind model.
The findings appear today in the journal Nature Astronomy.
_____
Hirota et al. Disk-Driven Rotating Bipolar Outflow in Orion Source I. Nature Astronomy, published online June 12, 2017
Posted in Astronomy
The Controversy Over the Alien ‘Wow!’ Signal Is Astronomy’s Greatest Beef – Motherboard
Posted: at 8:42 pm
The origin of the notorious Wow! Signal, a 72-second-long astronomical anomaly that some scientists first thought to have been a signal from extraterrestrial life, has been a constant source of speculation for alien hunters ever since it was recorded in 1977 by Ohio State University's Big Ear radio telescope.
Was it a radio signal sent by E.T. or just something more mundanely human?
A new scientific paper, published in the Journal of the Washington Academy of Sciences, claims to have finally nailed it, sparking a flurry of press coverage proclaiming the mystery solved. According to the study's author, Antonio Paris of St. Petersburg College in Florida, the answer lies in a passing comet called 266P/Christensen (only discovered in 2006) that caused the 1420 MHz 'Wow!' radio signal detected some forty years ago.
Paris' research claims to have confirmed that comets emit a 1420 MHz signal. Thus, he goes on to argue, this is likely what the telescope picked up when the comet passed in front of the area of the sky it was pointed at.
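For context, 1420 MHz is the emission frequency of neutral hydrogen, the band radio astronomers and SETI searches monitor closely. Its wavelength is a one-line check (a quick sketch, not a calculation from the article):

```python
c = 299_792_458      # speed of light, m/s
f = 1420.405751e6    # neutral-hydrogen line frequency, Hz

wavelength = c / f   # metres
print(f"{wavelength * 100:.1f} cm")  # ≈ 21.1 cm, the famous "21 cm line"
```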
But Paris' conclusion doesn't have fellow astronomers convinced.
"There are some problems with the analysis, which doesn't use many of the standard things one would do in radio astronomy," Chris Lintott, professor of astrophysics at the University of Oxford, told Motherboard.
"The paper appears in a journal that I hadn't heard of before Paris published his Wow/comet ideas in thereit may be peer reviewed, but it's not part of the astronomical mainstream and so I'd be worried about the quality of that review."
So between interrogative tweets and doubting Reddit threads, Motherboard reached out to Paris to ask what's really going on.
"I have received over 500 emails this week about the Wow paper. About 99.99 percent appear [to be] positive reaction from the public and the scientific community," Paris told Motherboard in an email. "A handful, however, were from those who are still skeptical, mostly from the SETI (Search for Extraterrestrial Intelligence) community. I suspect that SETI, who has used the Wow signal as a source of revenue, is nervous."
SETI, for its part, says such claims are preposterous: "We haven't made any money on the Wow signal whatsoever," Seth Shostak, senior astronomer for the SETI Institute, told Motherboard. "The charge we're making a lot of money off of it is bizarre. It's a bogus claim."
Shostak says he finds Paris's paper hard to believe based on what other astronomers have observed.
"The Ohio State radio telescope has two receivers on it. If it observes something in the sky, it's always reobserved 70 seconds later with a second feed," he said. "With Wow, it found it in one feed but it doesn't find it in the second. It's disappeared. A comet doesn't disappear in a minute. It doesn't move across the sky in a minute. It barely moves at all."
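Shostak's argument can be illustrated with rough numbers. In this sketch the comet's distance and relative transverse speed are illustrative assumptions, not values from the article; the point is only that any plausible choice gives motion of a few arcseconds over 70 seconds, tiny compared with a radio telescope's beam:

```python
import math

AU = 1.496e11                     # astronomical unit, metres
ARCSEC_PER_RAD = math.degrees(1) * 3600

distance = 2 * AU                 # assumed comet distance from Earth
transverse_speed = 30e3           # assumed relative transverse speed, m/s

angular_speed = transverse_speed / distance        # radians per second
motion_70s = angular_speed * 70 * ARCSEC_PER_RAD   # arcseconds moved in 70 s
print(f"{motion_70s:.1f} arcsec")                  # only a few arcseconds
```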
Paris said he had even received a phone call from a technician who helped build the Big Ear telescope who was excited that "the mystery has been solved." But Paris' research is still in the firing line.
"The claimed detectioneven if it's realis much, much weaker than the Wow signal, and lasts for longer. So at best the paper shows that comets are detectable in the radionot that they're capable of the kind of burst that produced the Wow signal," Lintott told Motherboard. "Saying 'The Wow signal might be a comet if comets do something we haven't seen them doing' seems not very exciting."
A Reddit user, also claiming to be a radio astronomer, posted a lengthy takedown of Paris' paper over the weekend, arguing, "This paper was also just really, really, really short on details that a radio astronomer would want, to the point where it likely wouldn't have passed a referee at a 'regular' journal."
The Wow! signal. Its name was inspired by the reaction of astronomer Jerry R. Ehman, who discovered the anomaly in August 1977. Image: Big Ear Radio Observatory and North American AstroPhysical Observatory (NAAPO).
But Paris has kicked back against what he calls an "emotional" response. "I am not in the business of responding to emotions," he told Motherboard. "There are too many people 'excited' or 'upset' about this project. Emotions should not have any part in science."
Shostak said that the comet explanation "would be an interesting result if true," but that the data just doesn't back it up: "You may as well say it's due to ghosts or due to reality television or something. If the explanation doesn't fit the data, you have to be a little suspect."
To that end, Lintott has spent the weekend putting together a public list of questions for Paris to answer about his paper, including contributions from other astronomers. The search for extraterrestrial life continues.
Posted in Astronomy
Māori Astronomy exhibition double-finalist at NZ Museum Awards – Māori Television
Posted: at 8:42 pm
Waikato Museum exhibition "Te Whānau Mārama: The Heavenly Bodies" was a double-finalist at the New Zealand Museum Awards.
Curated by Dr Rangi Matamua, Dr Hemi Whaanga, Dr Ann Hardy and Hohepa Tuahine from the University of Waikato, the exhibition shines the spotlight on Māori astronomy and how it is being revitalised.
Supported by taonga, photographs and kōrero, Te Whānau Mārama opens the door on tuning into the stars and a better understanding of the history and meaning of Matariki.
Dr Rangi Matamua (Tūhoe) is an associate professor at the University of Waikato.
He states, "The right time to look for Matariki is at the end of June or the beginning or middle of July. That's Pipiri according to the Māori calendar. This year, it's from July 17 to July 20, when the moon is in the Tangaroa phase in the month of Pipiri."
Matamua maintains that Māori astronomy is not practised as widely as it once was.
"I want this system of knowledge of astronomy to be revitalised in our modern world. We have forgotten how to read the stars. However, the knowledge is still there today."
The exhibition was a finalist in the Excellence: Taonga Māori and Most Innovative Use of Te Reo Māori categories.
It incorporates Māori legend, tradition, architecture, music and history to convey the spoken and written language within the exhibition.
Although the exhibition was not a winner, Matamua maintains that the overall aim is the dissemination and revival of traditional Māori knowledge.
"The hope is for Māori to return to the environment, to the origin of Māori language and philosophy, governing principles and protocols, everything that came from the environment."
The Te Whānau Mārama: The Heavenly Bodies exhibition is on at Waikato Museum and runs until 13 July 2018.
Posted in Astronomy
Why Cloud Computing Is The Most Disruptive IT Force – CXOToday.com
Posted: at 8:41 pm
Cloud computing is one of the most disruptive forces facing the Information Technology sector. This statement is not without justification. Let's cast a glance at the enormity of the phenomenon. According to the Bain & Company research report "The Changing Faces of the Cloud," the global cloud IT market revenue is projected to increase to $390 billion in 2020 from $180 billion, translating into a compound annual growth rate (CAGR) of around 17%.
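The quoted ~17% CAGR is easy to verify. The sketch below assumes the $180 billion figure is the 2015 base (the article does not state the base year):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` periods."""
    return (end / start) ** (1 / years) - 1

# $180B (assumed 2015) → $390B (2020), i.e. five compounding years
rate = cagr(180, 390, 5)
print(f"{rate:.1%}")  # ~16.7%, consistent with the "around 17%" in the report
```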
The report also points out that cloud demand accounted for 70% of related IT market growth in 2015. Moreover, 48 out of Fortune Global 50 companies have announced plans for cloud adoption for a range of IT applications. The cloud market will continue to gain momentum as businesses shift from legacy systems to cloud-based ones, with an increasingly higher number of organizations pursuing digital business strategies.
"Cloud computing represents the biggest IT industry disruption in several years. CIOs will need to put their heads together in order to help their companies successfully and safely navigate the cloud journey," says Shashank Dixit, CEO of Deskera, a global leader in cloud tech.
While premises-based IT software and tools have their own advantages, the global trend favours cloud-based applications, since they offer more connectivity and functionality than legacy systems. Moreover, enterprises are naturally gravitating towards the cloud as the technology is reasonably reliable and affordable, and provides them access to new and emergent technologies as well as high-end skills. The cloud boom is also propelled by the fact that enterprises are trying to improve performance and productivity over the long term. Looking at the tremendous response to cloud services, several IT companies are designing applications meant solely for pure cloud play.
"The overall global public cloud market will mature, and its growth rate will slightly slow from 17.2% in 2016 to a 15.2% increase in 2020," says Sid Nag, research director at Gartner. "While Brexit and other growth challenges exist, some segments, such as financial SaaS applications and the PaaS user markets, will still see strong growth through 2020. As buyers intensify and increase IaaS activity, they will be getting more for their investment: ongoing enhancement of performance, more memory, more storage for the same money (which will drive increases in consumption) and increased automation in traditional IT outsourcing (ITO) delivery," Nag added.
In this fast-paced world of technology, enterprises must leverage technology to stay ahead of the competition, and they must choose wisely. There is a huge market waiting to be explored, particularly as the reach of the Internet (including both 3G and 4G services) and the levels of automation and digitization rise exponentially.
[Disclaimer: The views expressed in this article are solely those of the authors and do not necessarily represent or reflect the views of Trivone Media Network or CXOToday.]
Posted in Cloud Computing
China’s cloud industry moving to new era with emergence of unicorns – TechNode (blog)
Posted: at 8:41 pm
Just a few years ago, billion-level funding would be beyond the imagination of Chinese cloud computing companies. But now it is becoming more and more tangible as the market matures.
QingCloud, a leading player in the field, is announcing the largest-ever funding round in the industry so far. The cloud computing platform made it public that it has secured Series D funding worth RMB 1.08 billion (around US$158 million). The current round adds to a US$2 million Series A in 2012, a US$20 million Series B in 2013 and US$100 million in 2016. The company confirmed to TechNode that it has IPO plans, but declined to offer more details. The firm is reportedly removing its VIE structure to prepare for a local listing.
The massive round comes from a consortium of investors, including China Merchants Securities International and China Merchants Zhiyuan Capital Investment (two wholly-owned subsidiaries of China's top securities trading and brokerage firm, China Merchants Securities), Riverhead Capital Investment Management, CICC Jiatai Fund and China Oceanwide Holdings Group. Existing investors Lightspeed China Partners and Bluerun Ventures also participated.
QingCloud founding team (L-R): Spencer Lin, Richard Huang, Reno Gan (Image credit: QingCloud)
QingCloud's funding isn't an isolated case. It marks the latest in a series of venture investments in this sector, which has recently bumped several companies in the vertical to unicorn status.
Two companies in the arena received similar-sized backings in June alone. Cloud and big data solution provider Dt Dream received an RMB 750 million A round led by Alibaba and Everbright Industry Capital Management. Another Alibaba-backed cloud computing startup, Cloudcare, received a C round of nearly RMB 1 billion led by FOSUN Group and Sequoia Capital China. Tencent-backed UCloud completed an RMB 960 million Series D round earlier this year.
Among the companies that have landed billion-level RMB funding, Dt Dream is the only one that has announced unicorn status with an over US$1 billion valuation. This may shed light on the valuations of the other companies, which have received similar-sized or larger funding.
Behind the investment frenzy is the huge potential of this market. A report from research institute CCID shows that China's cloud computing market surged 41.7% year-on-year to RMB 279.7 billion in 2016, forecasting that this figure would reach RMB 570.64 billion by 2019 with an annual growth rate of over 20%.
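The CCID numbers imply a specific compound growth rate, which can be checked against the "over 20%" forecast:

```python
# RMB 279.7B in 2016 → forecast RMB 570.64B in 2019: three compounding years
implied_rate = (570.64 / 279.7) ** (1 / 3) - 1
print(f"{implied_rate:.1%}")  # ~26.8%, comfortably above the 20% floor cited
```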
The emergence of several unicorns over a relatively short period of time signifies a deeper change in the market. In line with the "second-half era" proposition put forward by Meituan-Dianping CEO Wang Xing, the cloud computing startup pointed out that China's cloud computing market is also entering a special transition point for a new period. While cloud computing platforms were used only for non-core businesses by financial clients like banks, insurance, and securities companies in the first-half era, the technology will find wider application in the new era.
Co-founded by IBM alumni Richard Huang, Reno Gan, and Spencer Lin, the company launched the QingCloud platform in July 2013. It now operates 24 data centers, of which 10 are run independently and 14 through partnerships, providing services to over 70,000 enterprise clients.
Emma Lee is a Shanghai-based tech writer covering startups and tech happenings in China and Asia in general. Reach her at lixin@technode.com.
Posted in Cloud Computing
The Risks and Perquisites of Cloud Computing – DATAQUEST
Posted: at 8:41 pm
Cloud technology is catching up in India and the considerations for adopting it have evolved too. Customers today are looking at deploying public and private cloud capabilities in one infrastructure. In fact, according to a recent Gartner report, the public cloud services market in India is estimated to grow at 38% in 2017 to $1.81 billion.
Infrastructure as a service (IaaS), projected to grow 49.2% in 2017, will be the biggest growth driver, followed by software as a service (SaaS) at 33% and, lastly, platform as a service (PaaS) at 32.1%. This trend is a significant indicator that the migration of applications and workloads from on-premises data centres to the cloud, along with the development of cloud-ready and cloud-native applications, is triggering immense growth. While adoption may be on the rise, it also comes with challenges, and companies need to consider aspects such as security, migration to new technologies, and training for resources.
With this, the debate surrounding the security of cloud computing, specifically whether data is more secure in the cloud or not, has for the most part been settled. A growing number of organizations now view the cloud as secure, and in many cases more so than an on-premises deployment. Beyond that, as each of the public cloud vendors points out, security in the cloud is a shared responsibility, with the organization, as the application owner, being responsible for protecting applications, the OS, supporting infrastructure and other assets running in the cloud.
From a security standpoint, public cloud vendors' management consoles are a key weak point and consequently an attractive target for an attacker, often via a phishing attempt. As such, it's important to lock down and secure privileged credentials in a digital vault to protect the management console. The enterprise's responsibilities, specifically the functions above the hypervisor, include securing the privileged credentials used by applications and scripts accessing other applications and assets, such as the enterprise's customer database.
Unfortunately, these credentials are all too often hardcoded. This is a particularly troubling vulnerability, as there can be a large number of hardcoded credentials used throughout cloud and hybrid environments. Hard-coding and embedding credentials in code or a script can initially make them easy to use, but this represents a significant vulnerability because attackers or malicious insiders can also easily access them, especially if the credentials are in clear text. Even worse, when credentials are hard-coded or locally stored, they are nearly impossible to rotate, making them a static and easy target for attackers.
The Risk Is Real
As part of the DevOps process, developers often share source code they've developed on code repositories such as GitHub. While it's part of the DevOps process, it's an all too common example of how embedded passwords and credentials can become public if they're hardcoded. Even if the code is only saved in the enterprise's internal code repositories, those passwords and credentials can easily be accessed by other developers and used either inadvertently or maliciously. It also becomes difficult, if not impossible, to fully identify which applications or scripts are interacting with other applications and other enterprise assets.
In the past, these mistakes might not have been so risky, exploitable and damaging within an on-premises environment. However, in a cloud environment, because of the rapid pace of change, the ability to quickly scale and the tools being used, these vulnerabilities are amplified and can pose unacceptable levels of risk.
To minimize risk and follow best practices, enterprises should avoid hardcoding passwords and credentials used by applications and scripts and instead secure credentials in a digital vault and rotate them according to policy. With this approach, just like with human users, enterprises can assign unique credentials to each application, code image or script, and then track, monitor and control access. IT administrators will know which applications access resources such as a customer database. Also, when an application or script is retired, the administrator or script can simply turn off the credentials.
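The contrast between a hardcoded credential and a vault-managed one can be sketched in a few lines. This is a minimal illustration; the environment-variable name and delivery mechanism (a vault agent injecting the secret at runtime) are assumptions, not details from the article:

```python
import os

# BAD (illustrative): a credential hardcoded in the source is visible to anyone
# who can read the code, and is nearly impossible to rotate:
# DB_PASSWORD = "s3cret-in-clear-text"

# BETTER: the credential is injected at runtime (e.g. by a vault agent), so the
# code contains no secret and the vault can rotate it according to policy.
def get_db_password() -> str:
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        raise RuntimeError("DB_PASSWORD was not provided by the vault agent")
    return password
```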
A core business benefit of cloud is elasticity: the ability to easily and instantaneously scale up and scale down the number of compute instances or virtual servers to meet the needs of the business at a specific point in time. With on-demand cloud computing, the business only pays for the compute, storage and other resources it uses. No human intervention is required. The cloud automation tools are either built in as a capability of the public cloud vendors' offerings, such as AWS Auto Scaling, or come as part of the orchestration and automation tools used with DevOps, such as Puppet, Chef, Ansible, etc.
On-demand computing in the cloud, enabled by the automation tools, is a huge business benefit, but it also presents challenges and new potential risks: when new application instances are created and launched, they need privileges and credentials to access resources. The automation tools can provide the credentials, but these credentials also need to be secured.
Consequently, when a new application instance is created, as the compute environment dynamically scales, a best practice is to immediately secure the permissions and credentials assigned to the new instance in the secure digital vault. This ensures that the credentials can immediately be monitored, managed, secured and rotated according to policy. When the compute instances are retired, the associated credentials can also be removed. This is achieved with integrations between the various automation tools and the secure digital vault.
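The lifecycle just described can be sketched as a pair of hooks. The in-memory dict standing in for the secure digital vault and all names below are illustrative assumptions, not an API from the article:

```python
import secrets

vault: dict[str, str] = {}   # stand-in for the secure digital vault

def on_instance_launch(instance_id: str) -> str:
    """Issue a unique credential to a newly scaled-up compute instance and
    record it centrally so it can be monitored, rotated and audited."""
    credential = secrets.token_urlsafe(32)
    vault[instance_id] = credential
    return credential

def on_instance_terminate(instance_id: str) -> None:
    """Revoke the credential as soon as the compute instance is retired."""
    vault.pop(instance_id, None)
```

In a real deployment these hooks would be wired into the scaling tool (e.g. a launch/terminate lifecycle event) so credentials never outlive the instances they belong to.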
Whether the enterprise is fully in the cloud with IaaS or PaaS or is migrating to the cloud, it is critical to ensure applications, scripts and other assets use secure passwords and privileged credentials to access other applications and assets in the cloud.
Posted in Cloud Computing
Terry Crews Is On Crackdown 3 Trailer, No Cloud Computing For Single Player – EconoTimes
Posted: at 8:41 pm
Crackdown 3. (Image: BagoGames/Flickr)
Crackdown 3 is one of the most highly anticipated games in the Xbox One's lineup, not least of all because it's one of the few exclusive titles coming to the marginalized console. Microsoft released a trailer for the game that comes with the obligatory explosions and a considerable selection of firearms. It also featured gaming community favorite Terry Crews. Unfortunately, it's not all good news, especially on the single-player front.
The last time Crackdown 3 made an official appearance was back in 2015, when Microsoft provided a demo of the game. The recent trailer did a good job of making it up to fans, consisting of many things that went bang and boom. Retaining its cell-shaded, neon theme, it's still the Crackdown of old, The Verge reports.
Opening the trailer is movie star Terry Crews, the Old Spice spokesman himself. After a brief yet intense monologue, viewers are shown some gameplay, which includes a ton of jumping with jetpacks and blowing people away.
The game is scheduled for launch on November 7th for the Xbox One and Windows. This makes for a relatively short waiting period before gamers can start knocking down buildings in multiplayer. Speaking of which, this is where the bad news comes in.
Back in 2014, Microsoft announced that the game would feature cloud computing aspects in order to make the environment destructible. All well and good, but the company recently clarified that this was only for the multiplayer mode.
For single-player, gamers will not be able to enjoy as much of the destruction. Then again, the game might more than make up for that by absolutely slamming players with a huge amount of content and enemies to destroy as a member of the elite forces that cracks down on crime.
What's more, the game is coming out at the same time as the newly unveiled Xbox One X, Kotaku reports. Crackdown 3 would be a great testbed for bringing out the full power of the console.
Posted in Cloud Computing
Are Enterprises Ready to Take a Quantum Leap? – IT Business Edge
Posted: at 8:40 pm
The exciting landscape of modern life has been built with the aid of powerful computers. They have done dazzling things, from making the trains run on time to helping to build skyscrapers. Now, imagine a discontinuity in computing in which these capabilities are suddenly expanded and enhanced by orders of magnitude.
You won't have to imagine too much longer. It is in the process of happening. The fascinating thing is that this change is based on quantum science, which is completely counter-intuitive and not fully understood, even by those who are harnessing it.
Today's computers are binary, meaning that they are based on bits that represent either a 1 or a 0. As fast as they go, this is a basic, physical gating factor that limits how much work they can do in a given amount of time. The next wave of computers uses quantum bits, called qubits, which can simultaneously represent a 1 and a 0. This property, the root of the mysteries that even scientists refer to as "quantum weirdness," allows the computers to do computations in parallel instead of sequentially. Not surprisingly, this greatly expands the ability of this class of computers.
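The "simultaneously 1 and 0" behaviour can be made concrete with a toy simulation. This sketch is purely illustrative (a real qubit also carries a complex phase that it ignores): a qubit is a pair of amplitudes, and measurement collapses it to 0 or 1 with probabilities given by the squared amplitudes:

```python
import math
import random

# Equal superposition: amplitudes for |0> and |1>, with squares summing to 1.
alpha = beta = 1 / math.sqrt(2)

def measure(alpha: float, beta: float) -> int:
    """Measurement collapses the state: 0 with probability alpha^2, else 1."""
    return 0 if random.random() < alpha ** 2 else 1

outcomes = [measure(alpha, beta) for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))  # close to 0.5: each outcome about half the time
```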
The details of how quantum computers operate are more or less impossible to understand. A couple of related points are clear, however. The first is that harnessing the power of quantum mechanics to create incredibly powerful machines is not a pipe dream: companies such as IBM, Microsoft and Google, as well as startups and universities, don't sink billions of dollars into flights of fancy.
The second point is that the payoff is here, or at least quite near. The world of computing won't instantaneously change once quantum actions are proven. There is still a long road to being fully commercialized, surpassing classical approaches and, finally, living up to the most extravagant promise.
In late May, Microsoft and Purdue University announced research on quantum computing that focuses on one of the key challenges, which is the extraordinarily fragile nature of the qubits. Indeed, the subject of the research is a good example of the amazing complexity of the field and how far it has to go.
In quantum mechanics, the mere act of looking at the system makes it choose between the 1 and the 0 and exit the quantum state. The task of the Microsoft/Purdue research is to develop topological qubits that are stable enough to function in the real world.
In essence, according to Professor Michael Manfra, the university's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, stability increases as the quantum properties are spread out.
"The quantum variable that houses information is really a property of the quantum system as [a] whole," he wrote to IT Business Edge in response to emailed questions. "More particles may be needed to define the qubit, but this complexity has an advantage: While a local disturbance or perturbation can flip an individual spin, it is much less likely to change the state of the entire quantum system that comprises a topological qubit. Therefore these topological qubits are expected to be more robust. They do not couple well to the commonly occurring noise in the environment."
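Topological encoding itself can't be reproduced in a few lines, but the intuition in Manfra's answer, that information spread across many particles is harder for a local disturbance to corrupt, has a loose classical analogy in a repetition code. The sketch below illustrates distributed robustness only; it is not the actual topological scheme:

```python
import random

def encode(bit, n=5):
    # Store one logical bit redundantly across n physical bits.
    return [bit] * n

def noisy(bits, p):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit.
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
p, trials = 0.1, 100_000

# Unprotected: a single stored bit is corrupted with probability p (~10%).
single_errors = sum(noisy([0], p) != [0] for _ in range(trials))

# Protected: the logical bit fails only if a majority of its bits flip.
logical_errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))

print(single_errors / trials)   # roughly 0.1
print(logical_errors / trials)  # roughly 0.009 -- an order of magnitude better
```

Spreading the information out makes any single local flip far less likely to change the stored value, which is the flavor of robustness the topological qubit aims for.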
Preparing for the Quantum Future
There is an angle to all of this that is refreshingly straightforward and accessible, however: Great change is coming and companies need to prepare for quantum computing. Indeed, even assuming that the high-profile changes are down the road a bit, they will be massive when they do arrive.
The bottom line is that planners need to think about quantum computing. A logical first step in assessing the impact is identifying the tasks it will most likely perform. In response to emailed questions, Jerry Chow, the manager of Experimental Quantum Computing for IBM, told IT Business Edge that four areas likely to be affected are business optimization (in areas such as the supply chain, logistics, modeling financial data and risk analysis); materials and chemistry; artificial intelligence; and cloud security.
Things may not be quite as clear cut, however. David Schatsky, a managing director at Deloitte LLP, told IT Business Edge, in response to emailed questions, that risk management, investment portfolio design, trading strategies, and the design of transportation and communications networks will be affected. Quantum computing, he wrote, could be disruptive in cryptography, drug design, energy, nano-engineering and research.
That's an almost intimidating list. However, Schatsky prefaced it with a disclaimer: Quantum computing will entirely transform some kinds of work and have negligible impact on others. The truth is, researchers don't yet know all the types of problems quantum computing may be good for.
There Is Still Time to Prepare
Luckily, planners have time. Quantum computing will be a massive change, but a gradual one. "It makes sense to think of quantum computing as a new segment of the supercomputer market, which is a small fraction of overall IT spending," Schatsky wrote. "Annual supercomputer server sales total about $11 billion globally by some estimates. I suspect quantum computing revenues will be a very small fraction of that for years to come. So I'm not sure it's going to become common anytime soon."
Though it clearly will be quite a while before people are buying quantum computers on Amazon, organizations need to be thinking about quantum computing today. The power of quantum computing is so extreme, especially when coupled with artificial intelligence and other emerging techniques, it is clear that all of that time must be put to good use.
IBM's Chow said that quantum-driven platforms such as Watson will be able to find patterns that are buried too deeply for classical computers. "This will open new frontiers for discovery," he wrote.
It is a new age, not a new computer.
Corporations should ask: How do I learn about quantum computing to get a feel for where it might make a difference? Now is the time to realize its enormous potential, and this is a field ripe for innovation and exploration that goes beyond simply an end application. Becoming quantum-ready is about participating in a revolution within computing. People need to learn the details well enough to open their minds to what could be possible.
And, eventually, quantum mechanics may go beyond computing.
"In general terms, I believe the development of quantum technologies is inevitable; quantum computing is perhaps just the most visible example," Manfra wrote. "It is not hard to imagine that certain businesses in which innovation may be enhanced by dramatic improvement in computational capabilities will need to have long-term plans which exploit quantum machines once they become available."
Carl Weinschenk covers telecom for IT Business Edge. He writes about wireless technology, disaster recovery/business continuity, cellular services, the Internet of Things, machine-to-machine communications and other emerging technologies and platforms. He also covers net neutrality and related regulatory issues. Weinschenk has written about the phone companies, cable operators and related companies for decades and is senior editor of Broadband Technology Report. He can be reached at cweinsch@optonline.net and via twitter at @DailyMusicBrk.
Excerpt from:
Are Enterprises Ready to Take a Quantum Leap? - IT Business Edge
Posted in Quantum Computing
Comments Off on Are Enterprises Ready to Take a Quantum Leap? – IT Business Edge
Microsoft and Purdue work on scalable topological quantum computer – Next Big Future
Posted: at 8:40 pm
In 2016, Purdue University and Microsoft signed a five-year agreement to develop a useable quantum computer. Purdue is one of four international universities in the collaboration. Michael Manfra, Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, professor of materials engineering and professor of electrical and computer engineering, will lead the effort at Purdue to build a robust and scalable quantum computer by producing what scientists call a topological qubit.
The team assembled by Microsoft will work on a type of quantum computer that is expected to be especially robust against interference from its surroundings, a situation known in quantum computing as decoherence. The scalable topological quantum computer is theoretically more stable and less error-prone.
"One of the challenges in quantum computing is that the qubits interact with their environment and lose their quantum information before computations can be completed," Manfra says. "Topological quantum computing utilizes qubits that store information non-locally and the outside noise sources have less effect on the qubit, so we expect it to be more robust."
(Photo caption: Michael Manfra, who will lead the effort at Purdue to build a robust and scalable quantum computer by producing a topological qubit. Purdue University photo/Rebecca Wilcox)
Arxiv Topological Quantum Computation
The theory of quantum computation can be constructed from the abstract study of anyonic systems. In mathematical terms, these are unitary topological modular functors. They underlie the Jones polynomial and arise in Witten-Chern-Simons theory. The braiding and fusion of anyonic excitations in quantum Hall electron liquids and 2D-magnets are modeled by modular functors, opening a new possibility for the realization of quantum computers. The chief advantage of anyonic computation would be physical error correction: an error rate scaling like e^(-αℓ), where ℓ is a length scale and α is some positive constant. In contrast, the presumptive qubit model of quantum computation, which repairs errors combinatorially, requires a fantastically low initial error rate (about 10^-4) before computation can be stabilized.
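The scaling claim is concrete enough to plug in numbers. Taking an assumed α = 1 (the abstract leaves the constant unspecified), the exponential e^(-αℓ) falls below the 10^-4 error rate cited for the combinatorial qubit model at a modest length scale:

```python
import math

alpha = 1.0        # assumed value; the abstract only says "some positive constant"
threshold = 1e-4   # rough initial error rate cited for the qubit model

# Find the smallest length scale L at which e^(-alpha*L) beats the threshold.
L = 1
while math.exp(-alpha * L) >= threshold:
    L += 1

print(L, math.exp(-alpha * L))  # L = 10, rate about 4.5e-05
```

The point of the exponential is that robustness improves geometrically as the system grows, rather than requiring the error rate to be engineered down from the start.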
Manfra says that the most exciting challenge associated with building a topological quantum computer is that the Microsoft team must simultaneously solve problems of material science, condensed matter physics, electrical engineering and computer architecture.
"This is why Microsoft has assembled such a diverse set of talented people to tackle this large-scale problem," Manfra says. "No one person or group can be expert in all aspects."
Purdue and Microsoft entered into an agreement in April 2016 that extends their collaboration on quantum computing research, effectively establishing Station Q Purdue, one of the Station Q experimental research sites that work closely with two Station Q theory sites.
Purdues role in the project will be to grow and study ultra-pure semiconductors and hybrid systems of semiconductors and superconductors that may form the physical platform upon which a quantum computer is built. Manfras group has expertise in a technique called molecular beam epitaxy, and this technique will be used to build low dimensional electron systems that form the basis for quantum bits, or qubits.
The work at Purdue will be done in the Birck Nanotechnology Center in the university's Discovery Park, as well as in the Department of Physics and Astronomy. The Birck facility houses the multi-chamber molecular beam epitaxy system, in which three fabrication chambers are connected under ultra-high vacuum. It also contains clean-room fabrication and the necessary materials characterization tools.
See the original post here:
Microsoft and Purdue work on scalable topological quantum computer - Next Big Future
Posted in Quantum Computing
Comments Off on Microsoft and Purdue work on scalable topological quantum computer – Next Big Future
From the Abacus to Supercomputers to Quantum Computers – Duke Today
Posted: at 8:40 pm
If using quantum mechanics to compute problems that are unsolvable with today's fastest supercomputers sounds outrageously ambitious, that's because it is. There are many experts who say that it can't be done.
But that's not stopping Jungsang Kim, professor of electrical and computer engineering at Duke University, from pursuing the impossible. A pioneer in translating theoretical quantum physics into physical hardware, Kim has been engineering the components for a quantum computer at Duke for more than a decade.
And he's starting to sniff the finish line.
"We've put together and demonstrated all of the individual components needed to build a large, scalable quantum computer," said Kim. "We are convinced that within the next few years we could turn this technology into much more sophisticated quantum computers with the potential to solve problems considered impossible today."
Imagine a computer trying to put together a jigsaw puzzle. Because computer code is binary (either a piece fits or it doesn't), the most efficient method would be to pick a piece at random and attempt to attach every other available piece until one fits. Today's computers would then take that two-piece unit and repeat the entire process over and over until the puzzle is completed.
Even with today's supercomputers, this process would take a long time because it must be done sequentially. Quantum computers, however, have the advantage of occupying many different states at the same time.
Now imagine a quantum computer with enough qubits (individual pieces of memory analogous to today's transistors) to assign one to each puzzle piece. Thanks to quantum mechanics, all possible configurations are stored in a quantum memory, which is manipulated in a very careful way so that all the non-answers fade away very quickly and all the real answers emerge in a systematic way. This allows the quantum computer to converge on a solution much more efficiently than a classical computer.
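The fade-away-the-non-answers behavior described above is, in essence, Grover's search algorithm. A toy statevector simulation (plain numpy, with the "correct" configuration chosen arbitrarily for the demo) shows the answer's probability being amplified over a handful of iterations:

```python
import numpy as np

n = 3           # qubits
N = 2**n        # 8 possible "puzzle configurations"
marked = 5      # the one correct answer (arbitrary choice for this demo)

# Start in an equal superposition over all N configurations.
state = np.full(N, 1 / np.sqrt(N))

# The oracle flips the sign of the answer's amplitude.
oracle = np.ones(N)
oracle[marked] = -1

# Grover's algorithm needs only ~ (pi/4) * sqrt(N) iterations.
iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state *= oracle                    # mark the answer
    state = 2 * state.mean() - state   # invert about the mean (diffusion)

probs = state**2
print(int(np.argmax(probs)), probs[marked])  # the marked item dominates
```

After just two iterations on eight candidates, the marked configuration carries over 90% of the measurement probability, while the non-answers have largely faded away.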
"Nobel Laureate Bill Phillips said that using quantum principles to compute is as different from classical computing as a classical supercomputer is from an abacus," said Kim. "There are, however, several different ways that one might achieve this. Our group has focused on approaches using individually trapped ions."
The qubits in Kim's quantum computer are individually trapped ions: atoms with electrons stripped away to give them a positive electric charge. That charge allows researchers to suspend the atoms in an electromagnetic field in an ultra-high vacuum. Kim and his colleagues then use precise lasers to manipulate their quantum states.
The method is promising. Kim and colleague Christopher Monroe at the University of Maryland have secured more than $60 million in grants to transition these ideas into large, scalable quantum computers. And they're not alone; big companies such as Google, IBM, Microsoft and Intel are starting to make big investments as well.
With the potential to revolutionize industries such as materials design, pharmaceutical discovery and security encryption, the race is on. And Kim and his colleagues are the only ones betting on trapped ions, having started a company called IonQ to pursue commercialization of the technology.
"Our collaboration actually has a small-qubit quantum computer that's very generally programmable," said Kim. "We think we know how to take this system and turn it into a much bigger system that is reliable, stable and much more scalable. We've come to a point where we believe that even commercially viable systems can be put together."
More:
From the Abacus to Supercomputers to Quantum Computers - Duke Today
Posted in Quantum Computing
Comments Off on From the Abacus to Supercomputers to Quantum Computers – Duke Today