The Prometheus League
Breaking News and Updates
Monthly Archives: June 2017
Astronomy club to host full moon viewing – The Oracle
Posted: June 1, 2017 at 11:10 pm
SPECIAL TO THE ORACLE
From the twinkling lights of the stars to the glow of a full moon, students have the opportunity to enjoy the heavens with the astronomy club at Riverfront Park.
All students are welcome as prior knowledge of astronomy is not required.
"It doesn't matter if you're a physics major, it doesn't matter if you have an astronomy minor, it doesn't matter if you're not a (science, technology, engineering and mathematics) major," said Kyle Denny, a junior majoring in physics and president of the astronomy club. "You could be anything and you could come join the astronomy club. It is open to anyone who just wants to connect and learn about the universe and appreciate it."
"We do a lot of events, too, when planets are in opposition," said Kami Malestein, a junior majoring in physics and astronomy club vice president.
The astronomy club hosts many activities such as stargazing, full moon watching and eclipse viewing. Students with telescopes or binoculars are encouraged to bring them.
One of Denny's favorite events was Mercury's transit in May of last year.
"We watched the planet Mercury go in front of the sun," Denny said. "We had a great turnout for that one."
The number of students at an event varies from five to 10 people on stormy nights to a hundred for occurrences such as the Mercury transit.
An even larger turnout is expected for the upcoming solar eclipse on Aug. 21; the viewing location on campus is yet to be determined.
Although it won't be a total eclipse visible over USF, there will be a partial phase. "Eighty percent of the sun will be blocked out, and it'll be on the very first day of school," Denny said. "People are going to stop by and wonder what's going on with the sun, so they get a chance to look at the sun in a really spectacular event."
Most of the clubs events take place at Withlacoochee River Park or Riverfront Park, with transportation through students driving themselves or joining a carpooling list.
The astronomy club is one of a few clubs that remain active during the summer. Their next event is scheduled for the next full moon, June 9, at Riverfront Park.
"The times that there are not thunderstorms, the Milky Way is nice and prominent in the night sky. You can see it from horizon to horizon," Denny said. "It's a really inspiring experience. So, the summer is probably the best time to really look up at the night sky and really appreciate it."
Posted in Astronomy
UVI To Host Two Astronomy Conferences Showcasing Major Discoveries – VI Consortium (press release)
Posted: at 11:10 pm
The University of the Virgin Islands College of Science and Mathematics, together with the Etelman Observatory, is organizing two astronomy conferences this summer. The first, "Generation-GW: Diving into Gravitational Waves," will take place June 5-9. The second, "Unveiling the Physics Behind Extreme AGN Variability," will take place July 11-14. Both conferences will facilitate discussions about crucial breakthroughs in the field of astronomy over the last few years.
"We are establishing a legacy, and these events will improve the recruitment of Virgin Islands students to study physics and astronomy at UVI," said Dr. Antonino Cucchiara, assistant professor of physics. "The conferences will also demonstrate how research and activities undertaken at UVI can benefit the community."
The scientific breakthrough to be discussed by international astrophysicists at the June conference is gravitational waves. Widely considered the greatest discovery of 21st-century astronomy, this phenomenon describes ripples in the curvature of space-time that propagate at the speed of light, outward from their source.
The other discovery to be discussed by more than 50 astronomers at the July conference is fast variable Active Galactic Nuclei (AGN). "The center of every galaxy has a supermassive black hole which is millions of times heavier than our sun. Everything that gets too close to it or falls in is destroyed," explained Cucchiara. That destruction produces energy that is observable in optical, X-ray and gamma-ray radiation, producing an AGN. The July conference will focus on fast variable AGNs, whose radiation changes quickly in time and which are therefore difficult to observe in detail.
Both conferences will include an undergraduate mentoring component with question and answer sessions, as well as a talk that will be open to the public. The public talk for the June conference is set for 7 p.m. on Thursday, June 8, in the Administration and Conference Center (ACC). It will feature Professor Alberto Sesana from the University of Birmingham in the United Kingdom, and Professor Jillian Bellovary from Queensborough Community College in New York. The public talk in July will also be held on a Thursday; details to be announced.
"UVI and the Etelman Observatory are establishing a path forward to become an astronomy research hub," said Cucchiara. "It is important for us to involve not just UVI physics faculty, but also international partners, undergraduate researchers and federal agencies." Eight UVI students will be at the National Aeronautics and Space Administration (NASA) working on a variety of projects, from building the new generation of microsatellites, to studying planets around other stars, to studying the most powerful stellar explosions known in the universe. Some of these projects relate to research currently being pursued at UVI, representing the strong connection between the two institutions.
Posted in Astronomy
Bad Astronomy | Astronomers may have seen a star collapse directly … – Blastr
Posted: at 11:10 pm
[Artist's conception of a black hole with material swirling around it in an accretion disk, and also a jet of matter blasting away from it. Until recently, it was thought that a star had to go supernova to create a black hole, but evidence is mounting that it may not. Credit: NASA/JPL-Caltech]
One of the basic truisms in astronomy is that, when a massive star ends its life, it goes out with a bang. A big one. A supernova.
This titanic explosion is triggered when the star runs out of nuclear fuel in its core. The core collapses in a heartbeat, and the energy generated in that collapse is so immense that it blows the outer layers off. This explosion is so colossal it can outshine an entire galaxy! In the meantime, the collapsed core can form an exotic neutron star, or may even squeeze itself down into a black hole.
Now, I've skipped some steps there, but that's the general picture (if you want more, check out my Crash Course Astronomy episode on high-mass stars and supernovae). If you want a black hole, you have to blow up a massive star.
Except, maybe not. It turns out there's a loophole that could allow a star to bypass the supernova part. It collapses directly down to a black hole without the explosion. Some energy is released, but not much compared to a supernova, and in the end what you get is a now-you-see-it-now-you-don't situation: The star is there, and then suddenly ... it isn't.
The idea of a failed supernova is an interesting theoretical astrophysical problem, and one scientists have been working on for a while now. But there's been a new and exciting development: Astronomers now think they've seen one!
The star in question is called N6946-BH1, and it was found in a very cool survey specifically designed to look for failed supernovae. Using the Large Binocular Telescope in Arizona, 27 galaxies all within about 30 million light-years of Earth were observed over and over again. Each image was painstakingly compared to the others to look for transients: objects that have changed brightness. Even using rather stringent criteria, thousands were found; stars change brightness for a lot of reasons, but most are not due to them going supernova ... or, in this case, failing to supernova.
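The comparison step of such a survey can be sketched in miniature. This is not the team's actual pipeline; the function name, the 0.5-magnitude threshold, and the toy catalog below are all invented for illustration:

```python
# Illustrative sketch (not the survey's real code): flag "transients" by
# comparing an object's brightness, in magnitudes, across repeated epochs.
# The 0.5-magnitude threshold and the catalog data are made up.

def find_transients(catalog, threshold_mag=0.5):
    """Return IDs of objects whose brightness changed by more than
    `threshold_mag` between their brightest and faintest epochs."""
    flagged = []
    for obj_id, mags in catalog.items():
        if max(mags) - min(mags) > threshold_mag:
            flagged.append(obj_id)
    return flagged

catalog = {
    "star_a": [18.1, 18.1, 18.2, 18.1],   # steady -> not flagged
    "star_b": [17.0, 17.1, 21.5, 22.0],   # fades drastically -> flagged
}
print(find_transients(catalog))  # ['star_b']
```

The real work, of course, is in the "stringent criteria" the article mentions: weeding out the thousands of ordinary variables that a simple threshold like this would also catch.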
Eventually, the number of interesting objects was whittled down to just 15. Six of them turned out to be run-of-the-mill exploding stars (if the titanic explosion of a few octillion tons of star screaming outward at a substantial fraction of the speed of light can be called ho-hum), but nine of them turned out to be more interesting.
Of these, all but one were likely unusual events, like two stars merging, which can cause a very big (and very pretty) eruption, but again falls short of the outcome of a massive star dying. When all was said and done, after searching 27 galaxies for seven years, only one object was left: N6946-BH1.
In earlier images, the star is there, clearly seen in the galaxy NGC 6946, a lovely face-on spiral galaxy roughly 20 million light-years away (and one that has had no fewer than 10 recorded supernovae in the past century; by coincidence, one was seen just this year). Then, in later images, it's gone. Like, gone: disappeared. Poof.
If it had exploded as a supernova, it would've been seen in the images. Instead, in 2009, it briefly got somewhat brighter, glowing at about a million times brighter than the Sun; then it faded so much it was only about 2% of its previous (that is, pre-collapse) brightness by 2015. And yes, in human terms, a million times the Sun's luminosity is terrifyingly bright, but in terms of a supernova, it's barely worth mentioning; a typical one will shine many billions of times brighter than the Sun! So this was, at best, a bit of a pop.
So, how do we know it wasn't some sort of weird supernova, maybe obscured by lots of dust in the host galaxy? This material is dark and opaque, and can completely block the light from even a normal supernova. Follow-up observations using the Spitzer Space Telescope should reveal that, because infrared light can pierce through the dust. Spitzer did see some IR light from the event, roughly 2,000 to 3,000 times the Sun's luminosity. Again, that's a lot, but nowhere near what you'd expect from a supernova. Even a stellar merger would produce more than that.
It really looks like what's left is what the astronomers had been looking for all along: a failed supernova.
If true, this is very interesting, indeed. Why? Because of physics.
It takes a massive star to explode; it has to have enough pressure in the core (caused by the mass of the star above it squeezing down on it) to fuse successively heavier elements over time. First, hydrogen fuses into helium. Then, when that runs out, helium is fused into carbon, and so on, until the core builds up iron. When iron fuses, it doesn't release energy; it absorbs it. That's a big problem, because it's that release of fusion energy that holds the star up (in a similar fashion that hot air causes a balloon to expand). Once the star tries to fuse iron, the core collapses. If the core has a mass up to about 2.8 times the mass of the Sun, it forms a neutron star, but if it has more, it forms a black hole.
And in general, either way, the core collapse triggers the supernova in the outer layers, and kaboom.
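The remnant rule described above (a core up to about 2.8 solar masses becomes a neutron star; anything heavier becomes a black hole) can be written as a toy classifier. Only the 2.8 figure comes from the text; the function and its names are illustrative, and real outcomes depend on details (rotation, metallicity, fallback) that this cartoon ignores:

```python
# Toy classifier for the fate of a collapsed stellar core, using the
# ~2.8 solar-mass threshold quoted in the article. Purely illustrative.

NEUTRON_STAR_MAX = 2.8  # solar masses (approximate figure from the text)

def remnant(core_mass_solar):
    """Return the expected compact remnant for a given core mass."""
    if core_mass_solar <= NEUTRON_STAR_MAX:
        return "neutron star"
    return "black hole"

print(remnant(1.4))  # neutron star (a typical pulsar-like core)
print(remnant(5.0))  # black hole
```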
But that's where this gets funny. It may not always happen that way. For a range of core masses, theoretical calculations show that the explosion may stall. The outer layers get a decent kick, but not a huge one. They blow off, but it's a more gentle event than the unfettered violence of a supernova.
That depends on a lot of factors, actually, but it tends to happen when the total star mass is roughly 25 times that of the Sun. Looking at the observations of N6946-BH1, that's just about the mass it had.
And there's more. We see lots of high-mass stars being born in galaxies, but there aren't enough supernovae seen to account for them all. That implies failed supernovae happen relatively often.
Also, when we look at the masses of neutron stars and black holes, we find there's a gap between them; the lowest-mass black holes are still considerably more massive than the highest-mass neutron stars. If all these compact objects formed from regular supernovae, you'd expect there to be a smooth transition. That's because, in a supernova, a lot of the material in the star still lingers near the core, and that can fall back on the newly formed neutron star. If there's enough, the neutron star will then collapse to form a low-mass black hole. So you'd expect to see lots of black holes right at the lower mass limit. But we don't.
Ah, but in the failed supernova scenario, there's a lot more material left over; there wasn't enough energy in the event to blow away all the outer layers. This comes crashing back down and adds its mass to the neutron star, making a far more massive black hole. So, in reality, the existence of failed supernovae explains a lot of different phenomena.
And now, very likely, we've seen one! More observations would be nice, though. For example, a newly formed black hole should emit lots of X-rays as material heats up before falling in. If we see those X-rays, that would go a long way in understanding what we're seeing.
And again, this is the first one that we've seen. Given the number of supernovae that were detected in the survey, it implies that something like 14% of all high-mass star deaths result in failed supernovae. If that's the case, then we need more eyes on the sky looking for these events. Supernovae are what create and distribute elements literally vital to our existence: iron, calcium and more. Without them, you and I would literally not exist.
In my opinion, that makes these events very much worthy of our study. Even when they fail.
Posted in Astronomy
Cloud computing takes off as top new discipline on campus – Education Dive
Posted: at 11:09 pm
Indranil Gupta, an associate professor in the Department of Computer Science at the University of Illinois Urbana-Champaign, recalled the first time he offered a free Coursera online class on Cloud Computing Concepts in the spring of 2015. In the first class, Gupta said, Coursera registered a total of 179,000 enrollees from 198 countries.
"That shows you how much interest there is," he said. "It seems like every single country has some students who are interested."
Gupta's assessment matches numerous reports that interest in cloud computing among students has skyrocketed and that such courses are becoming commonplace in computer science departments throughout the nation. However, a recent report by Clutch, a Washington, D.C.-based B2B research firm, found that there were still concerns among universities and professors regarding the cost of teaching cloud computing. Riley Planko, a content developer at Clutch who authored the report, noted that while individual courses and certification programs were increasingly available, undergraduate and master's programs were still developing.
"For the cost, there was definitely optimism. There's potential, with regulation and learning how to manage this, that it's something that can be more under control by the university," she said. "It's still a young field. It's only been around in its true power for a couple of years."
Higher education institutions have been interested in storing data on cloud servers for several years, and as the Clutch report indicates, cloud computing skills are in high demand by corporations, and increasingly, public institutions (LinkedIn found that knowledge in cloud computing was the most desirable skill in job applicants among employers, according to the report).
Kevin McDonald, the founder and managing director of GreyStaff Group, LLC, also teaches a cloud computing course in the Technology Management master's program at Georgetown University's School of Continuing Studies. He said the sea change cloud computing brought to public and private industry is now benefitting individual startups. By eliminating the need for expensive server infrastructure and IT staff, new companies can significantly cut their upfront costs, building their entire infrastructure in the cloud. It is an opportunity McDonald echoes in his course, with teams visualizing and building a phone app within a matter of weeks before presenting it to the class; some have even sought investors for their creations.
"It's a total revolution under our feet, so as we've developed the program, we've tried to keep it in the real world," he said, marveling at the fact that students "come up with an idea, and go through a startup and are able to present to a venture capitalist within six weeks."
Gupta agreed there was an ongoing transition among higher education institutions on how to offer cloud computing courses integrated into disciplines, instead of in isolation, and he detailed a Master of Computer Science in Data Science currently offered by UIUC. The MCS-DS is an online program with $19,200 tuition, offering students the ability to proceed at their own pace, and Gupta's Coursera class in Cloud Computing Systems is integrated into the degree.
Gupta said that while there is always a period of transition during which professors in a particular discipline may wonder whether a new facet of the discipline should be integrated or is merely temporary, he was optimistic about how quickly computer science had warmed to introducing cloud computing and big data into curricula.
"Cloud computing as it is today is new, but many of the systems in cloud computing have been around for decades," he said. "Many of the building blocks have been around for a long time; it's just that it's become more available and accessible to students."
Gupta also said the imposing costs of accessing cloud storage for student use could be alleviated by partnering with companies that offer free or reduced-price resources for students, citing that Amazon Web Services ran a program for several years that would offer $100 worth of credit for proposed research projects.
The company currently offers AWS Educate for institutions, educators and even individual students, touting access to company technology, training resources and open-source content for educational use. Much of UIUC's work, Gupta said, was done with Microsoft Azure due to a mutual partnership. He said students benefitted from the cloud space, while industries could see benefits once students enter the workforce.
"Companies want students who are more familiar with the state of the technology, so they need as little training as possible when they join," he said. "They know that all our students are smart; it's whether they have the necessary skills or need extra training. If Microsoft has students use Microsoft Azure courses, they're kind of already training them."
McDonald, who is also the author of "Above the Clouds: Managing Risk in the World of Cloud Computing," said government, after some lag time, was catching up to private industry in the adoption of cloud technology. The Federal Cloud First initiative, instituted in 2010 by the Obama administration, had led to the closure of more than 3,000 data centers as of April 2016, with a goal of closing 5,203 federal data centers in total by 2019, almost half of the 2010 number.
He said cloud computing, like many burgeoning computer science fields, was increasingly viewed as interdisciplinary, asserting that while the School of Continuing Studies valued the technical processes inherent in cloud computing, the increased accessibility of cloud storage for novice users lowered the complexity barrier for interested students.
Its gotten to that level of simplicity where we dont need to worry about that unless were turning out system engineers, he said. Thats always been the philosophy for this program since day one.
In addition to cost concerns, Planko's report found that some professors expressed concern about how to appropriately teach cloud computing in a rapidly changing field, and also noted that a lack of necessary staff at universities could be a hindrance.
Nevertheless, the report concluded that it would be worthwhile for colleges and universities to at least consider the topic for future implementation in their curricula.
Posted in Cloud Computing
Box CEO Aaron Levie: Artificial intelligence to revolutionize cloud computing – MarketWatch
Posted: at 11:09 pm
Box Inc. is accomplishing its current goal of generating cash from its cloud-software business, and Chief Executive Aaron Levie has plans for more changes down the road, including an artificial-intelligence effort.
After reporting fiscal first-quarter earnings Wednesday afternoon, Levie chatted with MarketWatch for about 10 minutes about the path Box has traveled since its 2015 initial public offering, where the enterprise online-storage company goes from here, and how his sneaker game has changed. The interview has been edited for length and clarity.
MarketWatch: Since the IPO, Box has been able to maintain solid revenue growth, but the last two quarters you have generated positive free cash flow for the first time, which you had targeted. Is that the biggest change for the company financially since going public, and what else has been important so far?
Aaron Levie: I think that's a very key point. I would say that, overall, we've been building a cloud content-management platform for a little over a decade, and what's starting to happen is larger and larger enterprises are adopting Box as their core system of record for securing and managing and governing and organizing their corporate information. We're seeing customers basically do larger transactions with us, buying more seats of the service for their user base, and add on additional products like our Box governance capabilities and some of our advanced security technology.
So basically what's happening: we're continuing to move more and more upmarket, we're getting more efficient over time with our sales force, and we're growing a larger base of customers, which obviously produces a larger recurring revenue base, which then drives more efficiency from an operational standpoint, and thus generates free cash flow. So I think what's happening is, as you see deal sizes go up and transactions go up and our own internal productivity improve, you're seeing the economics of the business really kind of start to take hold. This is obviously what we had always been building into the business model, but it wasn't always as clear, like when we first went public, that this is what it was going to turn into. I think that's what is starting to happen within the numbers.
MW: A question provided by a person who tweets about Box even more than you, Alex Wilhelm from CrunchBase: How does positive free cash flow impact the business and how do you balance revenue growth with the focus on cash generation?
AL: It hasn't been any kind of significant change as much as just our own evolution as a company. We're now around 1,600 employees, we operate around the world, we have 74,000 customers, so there's a whole bunch of things where, as we scale up as an organization (not the least of which is going public), we have just become more operationally rigorous. So as we're scaling up, it makes sense to ensure we have a sustainable business model that doesn't require outside capital, which is why the cash flow elements of the business are so incredibly important. But it hasn't restricted our growth; we're just making sure that we execute as effectively as possible and that we're driving that growth in as efficient a way as possible. I think that's what you're starting to see show up in the business. I don't think we're trading off that much from a top-line standpoint, but ultimately we're building a much healthier organization and a much healthier business.
MW: What's the next milestone beyond cash generation? Is actual GAAP profitability ahead? You've discussed $1 billion in annual revenue; is there a target year for that? Are there other serious financial goals?
AL: Yeah, we are on a path to $1 billion in revenue over the next few years; that's probably the most significant next major medium-term milestone, so obviously this year's financial metrics are going to be incredibly important to ensuring we're on that path. We guided to more than $500 million in revenue this year, so the $1 billion mark is the next significant material milestone that we kind of have a flag in the ground on.
MW: When you went public, you talked a lot about how Box was capitalizing on the transition to cloud and mobile, and said that kind of major transformative change in tech happens every 10-15 years. Do you see another of those changes on the way?
AL: Yeah, I think the most significant technology we're seeing is artificial intelligence. We think that the impact of AI within the enterprise is going to be enormous, and we're quite excited about some upcoming announcements we have that will at least point to where Box will be going in the space. I obviously can't reveal too much, but needless to say, we think that AI is going to be substantially powerful for the future of work, and we want to make sure we're embedding intelligent experiences into everything we do and everything we build at Box.
MW: Any big changes in your sneaker game since the IPO? You using your cash to move up to some limited edition Yeezys or anything?
AL: No, getting pretty boring on the sneaker front, unfortunately. I'm becoming a little more post-IPO in my sneaker choices. Still sneakers, but less, let's say, colorful.
Box shares have gained 35% in 2017, while the S&P 500 has gained 8%.
Posted in Cloud Computing
Will Amazon’s Web Services Business Get Hurt by Cloud Computing Commodification? – HuffPost
Posted: at 11:09 pm
"Will the profitability of AWS (Amazon Web Services) decrease over time (to near zero) because the service is basically a commodity?" originally appeared on Quora: the place to gain and share knowledge, empowering people to learn from others and better understand the world.
Answer by Mathew Lodge, San Francisco tech executive, on Quora:
The premise of the question is flawed: Amazon Web Services is nothing like a commodity. I do expect that the profitability of AWS will decline at some point due to competitive intensity, specifically from Microsoft Azure and Google Cloud Platform, but that really isn't the same thing, and it isn't happening yet.
For over nine years now there's been a narrative about AWS that says an IaaS cloud is just a convenient place where you can run some virtual machines on demand. The saying "The cloud is just computers that belong to someone else" embodies this idea. And because one rented virtual machine is much like another, the theory goes, a VM service like AWS is just a commodity, like other fungible on-demand services such as electricity.
Peddlers of this narrative felt emboldened when AWS kept cutting VM prices in the days before we could see any financials about AWS. Surely this constant price erosion was evidence of the commodity nature of AWS?
There are two problems with this narrative:
From the outset of AWS, Amazon was building itself a new platform for building and deploying distributed applications. While it intended to eventually use this platform for Amazon.com, it fully intended to sell it to other people too. AWS was never spare capacity not being used by the retail site; that is an enduring myth that just won't die[1].
The death blow to the commodity narrative should have come when Amazon started breaking out AWS's financials in April 2015. Amazon revealed a breathtakingly profitable business with a balance sheet that looks totally unlike a commodity service, while also demonstrating a 49% growth rate that most multi-billion-dollar businesses only ever get to dream about[2].
AWS's EBITDA (earnings before interest, taxes, depreciation and amortization) margin is about 50%. For comparison, the best EBITDA margin Rackspace ever achieved as a hosting/cloud provider was 28%. So AWS is nearly twice as profitable as one of its most efficiently run public-company predecessors in the hosting/cloud business. [I am using EBITDA because it's the best way to compare the profitability of capital-intensive businesses[3].]
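As a quick sanity check on the "nearly twice as profitable" claim, one can divide the two margins quoted above. This is just arithmetic on the article's own numbers, not additional financial data:

```python
# EBITDA margins quoted in the text: AWS ~50%, Rackspace's best 28%.
aws_margin = 0.50
rackspace_margin = 0.28

# Ratio of the two margins; ~1.79, i.e. close to "nearly twice".
ratio = aws_margin / rackspace_margin
print(round(ratio, 2))  # 1.79
```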
Azure and Google Cloud Platform have incredibly competitive basic compute services. Google's compute service is vastly more flexible than AWS's. Yet neither is badly denting AWS's growth rate. Why? Because the code running inside of that VM needs to actually get stuff done, and AWS has a very broad and increasingly deep set of complementary services that software developers can tap into.
Angela Zhang does a great job of explaining how well AWS does this, and how unlike a commodity AWS is, in her answer. Stan Hanks articulates why switching costs are high for the millions already using AWS, and for the millions of new users who don't want to screw things up by choosing the wrong cloud platform.
AWS and Microsoft are battling for control of the next great app platform.
Many people have been surprised that after years of brutally battling all-comers for server operating system revenue share with Windows Server, Microsoft has embraced Linux and done everything it can possibly do to encourage development of cloud apps on Linux on Azure.
Why the sudden change of heart? Satya Nadella realized before many others that the battle for app developer mindshare was slipping away from the OS to the cloud, and specifically to the API of the cloud the app ran on. When your app's dependencies are all on cloud services provided by an IaaS like AWS, winning the OS battle doesn't win you much if developers just go run the app on AWS.
This question originally appeared on Quora - the place to gain and share knowledge, empowering people to learn from others and better understand the world.
Read the original: Will Amazon's Web Services Business Get Hurt by Cloud Computing Commodification? - HuffPost
Posted in Cloud Computing
Mary Meeker: Healthcare technology is booming thanks to cloud computing and wearables – SiliconANGLE (blog)
Posted: at 11:09 pm
Kleiner Perkins Caufield & Byers partner and longtime tech analyst Mary Meeker released her annual Internet Trends Report Wednesday, and more than anything else, she pointed to a transformation of health thanks to big data and cloud computing.
The report, which is highly regarded in the tech community for its insights into trends and predictions, dedicated 31 pages to "Healthcare @ The Digital Inflection Point" and came up with some striking stats about how technology and the Internet are transforming the sector.
At the top of the list, and perhaps the most remarkable number, is the way data is helping develop new medicines. Meeker said the digitization of medical data means that medical knowledge now doubles every 3.5 years, versus doubling only every 50 years as of 1950. Meeker added that the increased availability of health data is helping to accelerate clinical trials and encouraging collaboration with the scientific community as well.
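To see what that shorter doubling time compounds to, here is a quick back-of-the-envelope sketch; the 35-year horizon is an arbitrary illustration, not a figure from the report:

```python
# Knowledge that doubles every d years grows by a factor of 2**(t/d) over t years.
def growth_multiple(years: float, doubling_time: float) -> float:
    return 2 ** (years / doubling_time)

horizon = 35  # illustrative horizon in years (hypothetical choice)
fast = growth_multiple(horizon, 3.5)   # today's 3.5-year doubling time
slow = growth_multiple(horizon, 50.0)  # the 50-year doubling time of 1950

print(fast)  # 1024.0, i.e. 2**10: ten doublings in 35 years
print(slow)  # about 1.62, i.e. 2**0.7: not even one doubling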
That data accumulation, which Meeker describes as "Digitally Native Health-Related Data Sets," comes from many sources: not only from medical establishments themselves but directly from consumers, thanks to the proliferation of wearable devices. According to her numbers, global wearable shipments hit 102 million in 2016, a figure five times higher than in 2014, and a remarkable 25 percent of Americans now own a wearable device, with more likely to buy one in the future.
That data requires sharing, and some companies have earned more trust than others in handling it. Google Inc. was trusted by 60 percent of those polled to handle health data, while Microsoft Corp. and Samsung Electronics Co. Ltd. were not far behind at 56 percent and 54 percent, respectively. At the other end, consumers didn't trust Amazon.com Inc. and IBM Corp. nearly as much, with the two companies trusted by only 39 percent and 37 percent of people, respectively.
The surge in wearables has also been matched by a surge in health-related apps, with downloads hitting more than 1.2 billion in 2016. The types of apps were split across the health spectrum, with the most popular, fitness, sitting at 36 percent followed by disease and treatment at 24 percent and lifestyle and stress at 17 percent.
All the advances in healthcare technology wouldn't have been possible without growing cloud computing support. The cloud got its own section, with the report noting that "Cloud Adoption = Reaching New Heights + Creating New Opportunities."
Although traditional data center spending still accounted for the majority of global information technology infrastructure spend in 2016, the type of spending is changing. Private and public cloud infrastructure accounted for 37 percent of total spending last year, versus 23 percent in 2013. Going forward, Meeker cites a survey indicating that many enterprises are considering cloud adoption, with 57 percent of respondents saying they planned to run apps on Amazon Web Services alone, and growing support for Microsoft's Azure at 37 percent.
Posted in Cloud Computing
Comments Off on Mary Meeker: Healthcare technology is booming thanks to cloud computing and wearables – SiliconANGLE (blog)
Movers: Amazon’s Stock Price Hits $1000 – New York Times
Amazon is making itself indispensable on a number of fronts, most notably e-commerce and cloud computing. It is also expanding into areas like artificial intelligence and entertainment services.
Posted in Cloud Computing
Research collaborative pursues advanced quantum computing – Phys.Org
May 31, 2017, by Steve Tally

Image caption: Purdue University and Microsoft Corp. have signed a five-year agreement to develop a useable quantum computer. Purdue is one of four international universities in the collaboration. Michael Manfra, Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, professor of materials engineering and professor of electrical and computer engineering, will lead the effort at Purdue to build a robust and scalable quantum computer by producing what scientists call a "topological qubit." Credit: Purdue University photo/Rebecca Wilcox
"If this project is successful it will cause a revolution in computing."
That's the forecast of Michael Manfra, Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, Professor of Materials Engineering and Professor of Electrical and Computer Engineering, on a new long-term enhanced collaboration between Purdue and Microsoft Corp. to build a robust and scalable quantum computer by producing what scientists call a "topological qubit."
Purdue President Mitch Daniels noted that Purdue was home to the first computer science department in the United States, and said this partnership and Manfra's work place the university at the forefront of quantum computing.
"Someday quantum computing will move from the laboratory to actual daily use, and when it does, it will signal another explosion of computing power like that brought about by the silicon chip," Daniels says. "It's thrilling to imagine Purdue at the center of this next leap forward."
In the computers that we currently use every day, information is encoded in an either/or binary system of bits, what are commonly thought of as 1s and 0s. These computers are based on silicon transistors, which, like a light switch, can only be in either an on or off position.
With quantum computers, information is encoded in qubits, which are quantum units of information. With a qubit, however, this physical state isn't just 0 or 1, but can also be a linear combination of 0 and 1. Because of a strange phenomenon of quantum mechanics called "superposition," a qubit can be in both states at the same time.
This characteristic is essential to quantum computation's potential power, allowing for solutions to problems that are intractable using classical architectures.
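The either/or bit versus superposition distinction above can be sketched numerically. The following is a toy classical simulation of a single qubit's measurement statistics (the Born rule), not how real quantum hardware is programmed:

```python
import math
import random

random.seed(0)  # make the demo deterministic

# A qubit state is a pair of amplitudes (a, b) for the states |0> and |1>,
# normalized so that a**2 + b**2 == 1 (real amplitudes, for simplicity).
# Measurement collapses it: outcome 0 with probability a**2, else outcome 1.
def measure(a: float, b: float) -> int:
    assert abs(a * a + b * b - 1.0) < 1e-9, "state must be normalized"
    return 0 if random.random() < a * a else 1

# A classical bit is (1, 0) or (0, 1); an equal superposition is:
a = b = 1 / math.sqrt(2)

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a, b)] += 1
print(counts)  # roughly 5,000 each: a 50/50 split over many measurements
```

A classical bit always yields the same answer; the superposed state only reveals its amplitudes statistically, which hints at why quantum algorithms are designed around interference rather than simply reading qubits out.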
Advocates of quantum computing believe this never-before-seen technology will create a new global "quantum economy."
The team assembled by Microsoft will work on a type of quantum computer that is expected to be especially robust against interference from its surroundings, a situation known in quantum computing as "decoherence." The "scalable topological quantum computer" is theoretically more stable and less error-prone.
"One of the challenges in quantum computing is that the qubits interact with their environment and lose their quantum information before computations can be completed," Manfra says. "Topological quantum computing utilizes qubits that store information 'non-locally,' and the outside noise sources have less effect on the qubit, so we expect it to be more robust."
Manfra says that the most exciting challenge associated with building a topological quantum computer is that the Microsoft team must simultaneously solve problems of materials science, condensed matter physics, electrical engineering and computer architecture.
"This is why Microsoft has assembled such a diverse set of talented people to tackle this large-scale problem," Manfra says. "No one person or group can be expert in all aspects."
Purdue and Microsoft entered into an agreement in April 2016 that extends their collaboration on quantum computing research, effectively establishing "Station Q Purdue," one of the "Station Q" experimental research sites that work closely with two "Station Q" theory sites.
The new, multi-year agreement extends that collaboration, and includes Microsoft employees being embedded in Manfra's research team at Purdue.
Manfra's group at Station Q Purdue will collaborate with Redmond, Washington-based Microsoft team members, as well as a global experimental group established by Microsoft including experimental groups at the Niels Bohr Institute at the University of Copenhagen in Denmark, TU Delft in the Netherlands, and the University of Sydney, Australia. They are also coupled to the theorists at Microsoft Station Q in Santa Barbara. All groups are working together to solve quantum computing's biggest challenges.
"What's exciting is that we're doing the science and engineering hand-in-hand, at the same time," Manfra says. "We are lucky to be part of this truly amazing global team."
Mathematician and Fields Medal recipient Michael Freedman leads Microsoft's Station Q in Santa Barbara working on quantum computing.
"There is another computing planet out there, and we, collectively, are going to land on it. It really is like the old days of physical exploration and much more interesting than locking oneself in a bottle and traveling through space. We will find an amazing unseen world once we have general purpose programmable quantum computers," Freedman says. "Michael Manfra and Purdue University will be a key collaborator on this journey. I'm not interested in factoring numbers, but solving chemistry and materials science problems, and most ambitiously machine intelligence. Curiously, we need great materials science and transport physics Mike Manfra's work to build the systems we will use to do quantum computing and, thus, to usher in the next era of materials science."
Purdue's role in the project will be to grow and study ultra-pure semiconductors and hybrid systems of semiconductors and superconductors that may form the physical platform upon which a quantum computer is built. Manfra's group has expertise in a technique called molecular beam epitaxy, and this technique will be used to build low-dimensional electron systems that form the basis for quantum bits, or qubits.
The work at Purdue will be done in the Birck Nanotechnology Center in the university's Discovery Park, as well as in the Department of Physics and Astronomy. The Birck facility houses the multi-chamber molecular beam epitaxy system, in which three fabrication chambers are connected under ultra-high vacuum. It also contains clean-room fabrication and the necessary materials characterization tools. Low-temperature measurements of the materials' electronic properties will be conducted in the Department of Physics and Astronomy.
Suresh Garimella, executive vice president for research and partnerships, and Purdue's Goodson Distinguished Professor of Mechanical Engineering, says the tools and laboratories found in Discovery Park have enabled Purdue to become a world leader in several areas.
"Combining these world-leading facilities with our incredibly talented and knowledgeable faculty, such as Professor Manfra, has placed Purdue at the forefront of research and development of nanotechnology, nanoelectronics, next-generation silicon transistor-based electronics, and quantum computing. To have Purdue contribute to the construction of the world's first quantum computer would be a dream come true for us," he says.
See original here: Research collaborative pursues advanced quantum computing - Phys.Org
Posted in Quantum Computing
Here’s how we can achieve mass-produced quantum computers … – ScienceAlert
Still waiting patiently for quantum computing to bring about the next revolution in digital processing power? We might now be a little closer, with a discovery that could help us build quantum computers at mass scale.
Scientists have refined a technique using diamond defects to store information, adding silicon to make the readouts more accurate and suitable for use in the quantum computers of the future.
To understand how the new process works, you need to go back to the basics of the quantum computing vision: small particles kept in a state of superposition, where they can represent 1, 0, or a combination of the two at the same time.
These quantum bits, or qubits, can process calculations on a much grander scale than the bits in today's computer chips, which are stuck representing either 1 or 0 at any one time.
Getting particles in a state of superposition long enough for us to actually make use of them has proved to be a real challenge for scientists, but one potential solution is through the use of diamond as a base material.
The idea is to use tiny atomic defects inside diamonds to store qubits, and then pass around data at high speed using light in optical circuits, rather than electrical circuits.
Diamond-defect qubits rely on a missing carbon atom inside the diamond lattice which is then replaced by an atom of some other element, like nitrogen. The free electrons created by this defect have a magnetic orientation that can be used as a qubit.
So far so good, but our best efforts haven't been accurate enough to be useful, because of the broad spectrum of frequencies in the light emitted. That's where the new research comes in.
Scientists added silicon to the qubit creation process, which emits a much narrower band of light, and supplies the precision that quantum computing requires.
At the moment, these silicon qubits don't keep their superposition as well, but the researchers are hopeful this can be overcome by reducing their temperature to a fraction of a degree above absolute zero.
"The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it," says one of the team, Dirk Englund from MIT. "We're almost there with this. These emitters are almost perfect."
In fact, the researchers produced defects within 50 nanometres of their ideal locations on average, which is about one thousandth the size of a human hair.
Being able to etch defects with this kind of precision means the process of building optical circuits for quantum computers then becomes more straightforward and feasible.
If the team can improve on the promising results so far, diamonds could be the answer to our quantum computing needs: they also naturally emit light in a way that means qubits can be read without having to alter their states.
You still won't be powering up a quantum laptop anytime soon, but we're seeing real progress in the study of the materials and techniques that might one day bring this next-generation processing power to the masses.
The research has been published in Nature Communications.
Excerpt from: Here's how we can achieve mass-produced quantum computers ... - ScienceAlert
Posted in Quantum Computing