The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Charden
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Monthly Archives: May 2017
Citizen scientists are invited to help find supernovae – Astronomy Magazine
Posted: May 17, 2017 at 2:28 am
If you've ever wanted to find supernovae, now's your chance. The Australian National University (ANU) is inviting citizen scientists to join the search for the bright, exploding stars.
Supernovae are the bright explosions that mark the end of a star's life and can shine brighter than entire galaxies. They are incredibly useful for researchers, who use the bright light from the explosion as a form of measurement.
"Using exploding stars as markers all across the universe, we can measure how the universe is growing and what it's doing," ANU astrophysicist and co-lead researcher Dr. Brad Tucker said in a press release. "We can then use that information to better understand dark energy, the cause of the universe's acceleration."
To get involved with the study, all any interested citizen scientist has to do is search images from the SkyMapper telescope, a 1.3-meter telescope at the ANU Siding Spring Observatory, on a website called Zooniverse.org and mark any differences they see in the images. From there, the researchers will check over the marked images and see what they found.
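Under the hood, this kind of transient search is classic difference imaging: subtract a reference exposure from a new one and flag anything that brightened sharply. The sketch below is a toy illustration of the idea, not SkyMapper's actual pipeline; the arrays and threshold are hypothetical.

```python
import numpy as np

def transient_candidates(reference, new, n_sigma=5.0):
    """Flag pixels that brightened well above the noise between exposures."""
    diff = new.astype(float) - reference.astype(float)
    noise = np.std(diff)                        # crude background-noise estimate
    return np.argwhere(diff > n_sigma * noise)  # (row, col) pixel positions

# Toy data: a flat, noisy sky in which one new bright source appears.
rng = np.random.default_rng(42)
ref = rng.normal(100.0, 5.0, size=(64, 64))
new = ref + rng.normal(0.0, 5.0, size=(64, 64))
new[32, 32] += 200.0                            # the "supernova"
print(transient_candidates(ref, new))           # should report [[32 32]]
```

Volunteers do the same comparison by eye on Zooniverse, which remains far better than automated cuts at rejecting artifacts such as satellite trails and cosmic rays.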
Dr. Tucker said the team is studying an overwhelmingly large amount of sky, so the help would achieve things that would take scientists working alone years to do.
The volunteer help isn't without glory, though. Co-lead researcher Dr. Anais Möller from the ANU Research School of Astronomy and Astrophysics said, "The first people who identify an object that turns out to be a supernova will be publicly recognised as co-discoverers."
Dr. Tucker said the team plans to use this information to gather measurements of the universe as well as have a better understanding of supernovae.
See more here:
Citizen scientists are invited to help find supernovae - Astronomy Magazine
Posted in Astronomy
Astronomy on Tap just one of the fun Tuesday things to do – Austin American-Statesman
Posted: at 2:28 am
7:30 to 9:30 p.m. May 16. Free. The North Door, 502 Brushy St. ndvenue.com.
With a pint of beer in hand, travel out of this world to outer space with another in the casual cosmic talks from local scientists. This month's Astronomy on Tap brings you three insightful discussions about ice on Mars, merging galaxies and the Hubble deep field from Cassie Stuurman, Chao-Ling Hung and Mark Dickinson. Plus, the 31st edition of the series will feature trivia, giveaways and even telescopes that will be on hand for anyone to look for exciting orbs in the night sky, weather permitting.
2. Joyce Howell at Wally Workman Gallery
10 a.m. to 5 p.m. Tuesdays-Saturdays through May 27. 1202 W. Sixth St. 512-472-7428, wallyworkmangallery.com.
Wally Workman is opening their fifth show with Texas abstract painter Howell, who lives and works in Kingsland on the Colorado River. The setting provides ample opportunity to observe color changes relating to atmosphere, temperature, wind, time of day and season. Howell believes that even the most pastoral scene, when observed carefully, is riotous in color, texture and pattern. Those elements come to life in her current body of work displayed at the gallery.
3. Julia Mickenberg at BookWoman
6 p.m. May 16. Free. 5501 N. Lamar Blvd. ebookwoman.com.
The University of Texas professor will give a reading of her forthcoming book, American Girls in Red Russia: Chasing the Soviet Dream, while you enjoy appetizers provided by Russian House of Austin. The book chronicles a forgotten counterpoint to the story of the Lost Generation (those who came of age during and just after World War I): that Russian revolutionary ideology attracted many women, including suffragists, reformers, journalists and artists, as well as curious travelers.
4. Georgetown Art Centers Made for You and Me
10 a.m. to 6 p.m. Tuesday-Saturday, 1 to 5 p.m. Sunday through June 4. Free. 816 S. Main St., Georgetown. 512-930-2583, georgetownartcentertx.org.
Austin-based artist James Tisdale's newest body of work, a series of Southern Gothic sculptures on display at the center, takes a look at the social and political issues scattered across the American landscape. These issues, created from our past, follow us to this day and stretch from coast to coast. While Tisdale is influenced by all that he sees and hears, his historical art influences range widely, from the figurative works of the Renaissance to the personally powerful folk art of the South.
Read the original:
Astronomy on Tap just one of the fun Tuesday things to do - Austin American-Statesman
Posted in Astronomy
Cloud computing – Simple English Wikipedia, the free encyclopedia
Posted: at 2:27 am
In computer science, cloud computing describes a type of outsourcing of computer services, similar to the way in which electricity supply is outsourced. Users can simply use it: they do not need to worry about where the electricity comes from, or how it is made or transported. Every month, they pay for what they consumed.
The idea behind cloud computing is similar: the user can simply use storage, computing power, or specially crafted development environments, without having to worry about how these work internally. Cloud computing is usually Internet-based computing. The cloud is a metaphor for the Internet, based on how the Internet is drawn in computer network diagrams, which means it is an abstraction hiding the complex infrastructure of the Internet.[1] It is a style of computing in which IT-related capabilities are provided as a service,[2] allowing users to access technology-enabled services from the Internet ("in the cloud")[3] without knowledge of, or control over, the technologies behind these servers.[4]
According to a paper published by IEEE Internet Computing in 2008 "Cloud Computing is a paradigm in which information is permanently stored in servers on the Internet and cached temporarily on clients that include computers, laptops, handhelds, sensors, etc."[5]
Cloud computing is a general concept that utilizes software as a service (SaaS), such as Web 2.0 and other technology trends, all of which depend on the Internet for satisfying users' needs. For example, Google Apps provides common business applications online that are accessed from a web browser, while the software and data are stored on the Internet servers.
Cloud computing is often confused with related ideas such as grid computing, utility computing and autonomic computing:
Many cloud computing deployments are powered by grids, have autonomic characteristics and are billed like utilities, but cloud computing can be seen as a natural next step from the grid-utility model.[8] Some successful cloud architectures have little or no centralised infrastructure or billing systems including peer-to-peer networks like BitTorrent and Skype.[9]
The majority of cloud computing infrastructure currently consists of reliable services delivered through data centers that are built on computer and storage virtualization technologies. The services are accessible anywhere in the world, with The Cloud appearing as a single point of access for all the computing needs of consumers. Commercial offerings need to meet the quality of service requirements of customers and typically offer service level agreements.[10] Open standards and open source software are also critical to the growth of cloud computing.[11]
As customers generally do not own the infrastructure or know all details about it, they are mainly accessing or renting it, so they can consume resources as a service and pay for what they actually use rather than for capacity they do not need. Many cloud computing providers use the utility computing model, which is analogous to how traditional public utilities like electricity are consumed, while others bill on a subscription basis. By sharing consumable and "intangible" computing power between multiple "tenants", utilization rates can be improved (as servers are not left idle), which can reduce costs significantly while increasing the speed of application development.
A side effect of this approach is that "computer capacity rises dramatically" as customers do not have to engineer for peak loads.[12] Adoption has been enabled by "increased high-speed bandwidth" which makes it possible to receive the same response times from centralized infrastructure at other sites.
Cloud computing is being driven by providers including Google, Amazon.com, and Yahoo! as well as traditional vendors including IBM, Intel,[13] Microsoft[14] and SAP.[15] It can be adopted by all kinds of users, be they individuals or large enterprises. Most internet users are currently using cloud services, even if they do not realize it. Webmail, for example, is a cloud service, as are Facebook and Wikipedia and contact list synchronization and online data backups.
The Cloud[16] is a metaphor for the Internet,[17] or more generally components and services which are managed by others.[1]
The underlying concept dates back to 1960 when John McCarthy expressed his opinion that "computation may someday be organized as a public utility" and the term Cloud was already in commercial use in the early 1990s to refer to large ATM networks.[18] By the turn of the 21st century, cloud computing solutions had started to appear on the market,[19] though most of the focus at this time was on Software as a service.
Amazon.com played a key role in the development of cloud computing when upgrading their data centers after the dot-com bubble and providing access to their systems by way of Amazon Web Services in 2002 on a utility computing basis. They found the new cloud architecture resulted in significant internal efficiency improvements.[20]
2007 saw increased activity, including Google, IBM and a number of universities starting large-scale cloud computing research projects,[21] around the time the term started gaining popularity in the mainstream press. It was a hot topic by mid-2008, and numerous cloud computing events had been scheduled.[22]
In August 2008 Gartner observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" and that the "projected shift to cloud computing will result in dramatic growth in IT products in some areas and in significant reductions in other areas".[23]
Clouds cross many country borders and "may be the ultimate form of globalisation".[24] As such it is the subject of complex geopolitical issues, whereby providers must satisfy many legal restrictions in order to deliver service to a global market. This dates back to the early days of the Internet, where libertarian thinkers felt that "cyberspace was a distinct place calling for laws and legal institutions of its own"; author Neal Stephenson envisaged this as a tiny island data haven in his science-fiction classic novel Cryptonomicon.[24]
Although there have been efforts to match the legal environment (such as US-EU Safe Harbor), providers like Amazon Web Services usually deal with international markets (typically the United States and European Union) by deploying local infrastructure and allowing customers to select their countries.[25] However, there are still concerns about security and privacy for individuals at various governmental levels (for example, the USA PATRIOT Act, the use of national security letters, and Title II of the Electronic Communications Privacy Act, the Stored Communications Act).
In March 2007, Dell applied to trademark the term "cloud computing" in the United States. It received a "Notice of Allowance" in July 2008, which was subsequently canceled on August 6, resulting in a formal rejection of the trademark application less than a week later.
In November 2007, the Free Software Foundation released the Affero General Public License (abbreviated as Affero GPL and AGPL), a version of GPLv3 designed to close a perceived legal loophole associated with free software designed to be run over a network, particularly software as a service. Under the AGPL, application service providers are required to release any changes they make to AGPL-licensed open source code.
Cloud architecture[26] is the systems architecture of the software systems involved in the delivery of cloud computing (e.g. hardware, software) as designed by a cloud architect who typically works for a cloud integrator. It typically involves multiple cloud components communicating with each other over application programming interfaces (usually web services).[27]
This is very similar to the Unix philosophy of having multiple programs doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.
Cloud architecture extends to the client where web browsers and/or software applications are used to access cloud applications.
Cloud storage architecture is loosely coupled: metadata operations are centralized, enabling the data nodes to scale into the hundreds, each independently delivering data to applications or users.
A cloud application leverages The Cloud model of software architecture, often eliminating the need to install and run the application on the customer's own computer, thus reducing software maintenance, ongoing operations, and support.
A cloud client is computer hardware and/or computer software which relies on The Cloud for application delivery, or which is specifically designed for delivery of cloud services, and which is in either case essentially useless without a Cloud.[33]
Cloud infrastructure (e.g. Infrastructure as a service) is the delivery of computer infrastructure (typically a platform virtualization environment) as a service.[41]
A cloud platform (e.g. Platform as a service), the delivery of a computing platform and/or solution stack as a service,[42] facilitates deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers.[43]
A cloud service (e.g. Web Service) is "software system[s] designed to support interoperable machine-to-machine interaction over a network"[44] which may be accessed by other cloud computing components, software (e.g. Software plus services) or end users directly.[45]
Cloud storage is the delivery of data storage as a service (including database-like services), often billed on a utility computing basis (e.g. per gigabyte per month).[46]
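To make the per-gigabyte-per-month model concrete, here is a minimal sketch of how such a bill accrues. The rates are hypothetical placeholders, not any real provider's price list:

```python
def monthly_storage_cost(gb_stored, price_per_gb_month=0.02,
                         gb_egress=0.0, price_per_gb_egress=0.09):
    """Estimate a utility-style storage bill for one month.

    Rates here are hypothetical; real providers publish their own
    tiered prices for storage and data transfer.
    """
    return gb_stored * price_per_gb_month + gb_egress * price_per_gb_egress

# 500 GB stored, 50 GB downloaded this month:
print(f"${monthly_storage_cost(500, gb_egress=50):.2f}")  # $14.50
```

The point of the model is that the bill tracks actual consumption, so idle capacity costs the customer nothing.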
Traditional storage vendors have recently begun to offer their own flavor of cloud storage, sometimes in conjunction with their existing software products (e.g. Symantec's Online Storage for Backup Exec). Others focus on providing a new kind of back-end storage optimally designed for delivering cloud storage (EMC's Atmos), categorically known as Cloud Optimized Storage.
A cloud computing provider or cloud computing service provider owns and operates cloud computing systems to serve someone else. Usually this requires building and managing new data centers. Some organisations get some of the benefits of cloud computing by becoming "internal" cloud providers and servicing themselves, though they do not benefit from the same economies of scale and still have to engineer for peak loads. The barrier to entry is also significantly higher, with capital expenditure required, and billing and management create some overhead. However, significant operational efficiency and quickness advantages can be achieved even by small organizations, and server consolidation and virtualization rollouts are already in progress.[47] Amazon.com was the first such provider, modernising its data centers which, like most computer networks, were using as little as 10% of their capacity at any one time just to leave room for occasional spikes. This allowed small, fast-moving groups to add new features faster and more easily, and Amazon went on to open it up to outsiders as Amazon Web Services in 2002 on a utility computing basis.[20]
The companies listed in the Components section are providers.
A user is a consumer of cloud computing.[33] The privacy of users in cloud computing has become of increasing concern.[48][49] The rights of users are also an issue, which is being addressed via a community effort to create a bill of rights (currently in draft).[50][51]
A vendor sells products and services that facilitate the delivery, adoption and use of cloud computing.[52]
A cloud standard is one of a number of existing (typically lightweight) open standards that have facilitated the growth of cloud computing.[57]
Read the rest here:
Cloud computing - Simple English Wikipedia, the free encyclopedia
Posted in Cloud Computing
How telecom is shifting its strategy to support cloud computing – SiliconANGLE (blog)
Posted: at 2:27 am
Cloud computing has fundamentally expanded the realm of possibilities organizations can accomplish with technology. While a lot of focus has been placed on cloud technology and data architecture advancements, the underlying telecommunications infrastructure is also seeing a shift in strategies to support the latest trends in cloud computing.
Cisco Systems, Inc., known for its hardware infrastructure deployments, is helping drive this shift. Ian Wells (pictured, left), distinguished engineer, cloud and platform services at Cisco Systems Inc., and Jerome Tollet (pictured, right), distinguished engineer, Chief Technology and Architecture Office at Cisco Systems, are two of the company's team members spearheading this initiative.
Wells and Tollet spoke with host Stu Miniman (@stu) and guest host John Troyer (@jtroyer) of theCUBE, SiliconANGLE Media's mobile live streaming studio, during OpenStack Summit in Boston, Massachusetts. They discussed their technical perspectives on virtualization and cloud computing. (*Disclosure below.)
Of all the advances in telecommunications infrastructure, the most important technology for advancing cloud computing is Network Function Virtualization, according to Tollet. "NFV is becoming a first-class citizen for this community. At the beginning, people were kind of ignoring NFV; it was all about cloud. Now it's becoming quite the opposite," he said.
NFV is the term used to describe the virtualization of functions that have historically been physical hardware, used for things like intrusion detection and routing. As the adoption rate for NFV rises, so does the demand for more features, which can create bottlenecks in development.
"On the networking side, it's always, 'I'd like more functionality.' You'll hear people talk about service chaining. MPLS [Multiprotocol Label Switching] comes up quite regularly, which is really integration with the rest of the service provider network," Wells said. "We have a ways to go to really address the kind of general-purpose model that would suit everyone."
Tollet also brought up a very interesting point about the redundancy and overhead associated with a completely virtualized system.
"Think in terms of two containers sitting on the same virtual compute node. Why do you need to create a packet? Why do you need to do crypto? Why do you need to do virtual LAN when the two applications are sitting on the same compute node?" Tollet said. "We have imported into the virtual world all of the concepts we have used in the physical world ... now I think we can do something a bit more efficient."
Watch the complete video interview below, and be sure to check out more of SiliconANGLE's and theCUBE's independent editorial coverage of OpenStack Summit 2017 Boston. (* Disclosure: Cisco Systems Inc. sponsored this OpenStack Summit segment on SiliconANGLE Media's theCUBE. Neither Cisco Systems nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)
Read the rest here:
How telecom is shifting its strategy to support cloud computing - SiliconANGLE (blog)
Posted in Cloud Computing
Cloud Computing puts in work for Preakness before deluge – Daily Racing Form
Posted: at 2:27 am
Michael Amoruso
Cloud Computing ran second to J Boys Echo in the Grade 3 Gotham Stakes.
ELMONT, N.Y. - Before Mother Nature poured buckets of water on Long Island on Saturday morning, trainer Chad Brown was able to get in Cloud Computing's last major workout before next Saturday's 142nd Preakness Stakes at Pimlico.
With steady rain falling at Belmont Park shortly after 5:30 a.m. Saturday, Cloud Computing worked a half-mile in 48.56 seconds over the training track that would be considered fast. Working by himself, Cloud Computing went in splits of 12.43 seconds for the opening eighth, 24.47 for the quarter, and got his final quarter in 24.09 without too much encouragement from exercise rider Peter Roman. He galloped out five furlongs in 1:01.69.
"The horse breezed well," Brown said. "He went a good half, out five. I thought he did it real, real well. He's moving sound and strong. I'm real happy with this horse. If he comes out of it in good shape, we'll be on to Maryland."
Cloud Computing, a son of Maclean's Music owned by Seth Klarman and William Lawrence, has a win from three starts. That victory came going six furlongs over Aqueduct's inner track on Feb. 11. He wheeled back three weeks later in the Grade 3 Gotham and ran a respectable second behind J Boys Echo after chasing a fairly hot pace. Cloud Computing then had a wide trip when the inside part of Aqueduct's main track was the preferred spot, finishing third behind Irish War Cry and Battalion Runner in the Grade 2 Wood Memorial.
Cloud Computing did earn enough points to run in the Kentucky Derby, but Brown and his owners felt it was too much too soon for the horse. Cloud Computing needed a chip removed from a front ankle last summer, which is why he didn't run at age 2.
"I feel very comfortable that we gave him the six weeks from the Wood," Brown said. "I see a horse that's really doing well."
Javier Castellano will ride Cloud Computing in the Preakness. Brown said he anticipates shipping Cloud Computing to Baltimore on Tuesday.
Term of Art works
At Santa Anita on Saturday morning, Term of Art, one of the outsiders in the Preakness, worked six furlongs in 1:13.80 while outfitted in blinkers, which trainer Doug O'Neill said he will wear in the second leg of the Triple Crown.
Term of Art has scored both of his wins in blinkers, but he has not worn them in his last three starts, including a seventh-place finish in his last start, the Santa Anita Derby.
On Saturday, with exercise rider Amir Cedeno up, Term of Art worked by himself.
"He worked great," O'Neill said. "The track was demanding safe but slow. I'm very happy. We know he's a longshot, but he's doing well."
O'Neill said Term of Art would fly to Baltimore from California on Tuesday. He will be the only horse O'Neill has at Pimlico next weekend.
O'Neill won the Preakness in 2012 with I'll Have Another, his first Kentucky Derby winner. Last year, he finished third in the Preakness with Derby winner Nyquist.
Always Dreaming, the Derby winner, has been at Pimlico since last Tuesday and on Saturday galloped 1 1/4 miles on a sloppy, sealed track. The wet track forced the postponement of a scheduled work for Royal Mo, who traveled with Always Dreaming to Pimlico on Tuesday. He could work Sunday or wait until Monday.
Gunnevera, seventh in the Derby, was scheduled to arrive at Pimlico on Saturday after a van ride from Churchill Downs.
additional reporting by Jay Privman
More here:
Cloud Computing puts in work for Preakness before deluge - Daily Racing Form
Posted in Cloud Computing
Benefit-risk ‘tipping point’ for cloud computing now passed, says … – Out-Law.com
Posted: at 2:27 am
The Depository Trust & Clearing Corporation (DTCC), which already hosts some applications in the cloud, said cloud computing has now "moved past a tipping point," offering greater benefits and fewer risks than traditional outsourcing arrangements.
Financial services and technology law expert Luke Scanlon of Pinsent Masons described DTCC's move as a sign that the barriers that dissuade many financial firms from utilising cloud-based solutions are diminishing.
"The DTCC, after a period of testing and detailed analysis, have here highlighted that some of the traditional reasoning as to why cloud services present significant risk, such as concerns around security, is no longer valid," Scanlon said.
"In 2017 we are certainly seeing a maturing of the discussion and more and more of a focus on the few remaining regulatory sticking points to cloud adoption, together with the practical concerns around achieving the levels of availability necessary to operate the core systems of financial institutions and utilities, liability and exit arrangements," he said.
In a new white paper it has published, which contained its strategy to leverage the cloud, the DTCC explained why it will move more of its applications and services into the cloud.
"DTCC has been leveraging cloud services for almost five years and believes the cloud represents a viable alternative to corporate data centres," it said. "The maturation, expanded offerings and enormous scale of the technology, resolve the privacy and security challenges of cyber-threats, potential flash crash type market disruptions and the cost challenges facing many financial firms today."
"DTCC believes cloud computing has moved past a tipping point, prompting the firm to pursue a strategy of building a cloud ecosystem with partner vendors that support best practices and standards. DTCC is taking this step because it is confident that the security, scalability, resiliency, recoverability and cost of applications in the cloud are better than almost any private enterprise could achieve on its own," it said.
"DTCC also believes that business services, delivered by applications written to take advantage of the infinite resources, resiliency, and global reach of the cloud, have a significant advantage over legacy applications using traditional models in private data centres. We believe that gap will continue to widen over time," the firm said.
DTCC said it plans to work with regulators to ensure that its cloud-based operations are compliant with "the highest and strictest levels of recommended controls and best practices" it is subject to.
Earlier this year, seven main hurdles to banks' adoption of cloud-based services were highlighted in a joint report by Pinsent Masons and UK banking industry body the BBA.
Continued here:
Benefit-risk 'tipping point' for cloud computing now passed, says ... - Out-Law.com
Posted in Cloud Computing
Boston schools CIO Mark Racine takes hybrid approach to cloud computing – EdScoop News (press release) (registration) (blog)
Posted: at 2:27 am
The district is also developing a single sign-on platform to better integrate applications and data.
With nearly 60,000 students and a mix of traditional, charter and pilot schools in his district, CIO Mark Racine is always looking for ways to make educational technology go farther for the faculty and families of Boston Public Schools.
Like many CIOs, Racine has his eye on cloud computing as the future of data management.
But with limited funding preventing an immediate full-on move to the cloud, Racine and his infrastructure team are still banking on a hybrid approach, he said in a recent interview with EdScoop. The approach provides scaling opportunities to relieve stress on the network, especially at certain high-traffic points during the school year.
He likened it to the 1-800-Flowers approach, "the way flower companies will need to scale up for Valentine's Day, and then come back inside," Racine said.
"We would move to the cloud tomorrow if we could," he said.
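In a hybrid deployment, that "scale up, then come back inside" pattern boils down to renting cloud capacity only for the overflow above what the on-premises gear can serve. A minimal sketch of the decision, with entirely hypothetical capacity numbers rather than the district's actual tooling:

```python
def burst_plan(expected_load, on_prem_capacity, cloud_unit_capacity):
    """Decide how many cloud instances to rent for a traffic spike.

    Hypothetical illustration of hybrid 'cloudbursting': serve steady
    load on-premises and rent cloud capacity only for the overflow.
    """
    overflow = max(0, expected_load - on_prem_capacity)
    return -(-overflow // cloud_unit_capacity)  # ceiling division

# Back-to-school spike: 12,000 req/s vs 8,000 on-prem, 500 per cloud node.
print(burst_plan(12_000, 8_000, 500))  # -> 8 extra cloud instances
```

Once the spike passes, the rented instances are released and the steady-state load returns entirely to the on-premises infrastructure.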
Among other edtech initiatives, Racine said he and his 50-member IT team have also invested heavily in single sign-on technology, geared towards increasing connectivity across the district.
The technology is also aimed at building toward greater data integration. The platform "will take authentication to all kinds of different learning apps, and allow us to take our Ed-Fi database and scale that data to all educational platforms as well," he said.
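At its core, single sign-on means one authentication event yields a credential that many applications can independently verify. Real deployments use standards such as SAML or OpenID Connect; the hand-rolled HMAC token below is only a toy to show the shape of the idea, and every name and key in it is hypothetical:

```python
import hashlib, hmac, json, time

SECRET = b"district-sso-signing-key"  # hypothetical shared signing secret

def issue_token(user_id):
    """Issue a signed SSO assertion after one successful login."""
    payload = json.dumps({"sub": user_id, "exp": time.time() + 3600})
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload, sig

def verify_token(payload, sig):
    """Any app holding the secret can check the signature and expiry."""
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                      # tampered or forged token
    claims = json.loads(payload)
    return claims if claims["exp"] > time.time() else None  # expired?

payload, sig = issue_token("student-1234")
print(verify_token(payload, sig))  # one login, accepted by every app
```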
"When an educational technology platform is working well in a classroom or school, we want to be able to bring that up to 130 buildings," he said.
Another big initiative underway for Boston Public Schools, according to Racine, is finding the best way to support the district's school choice program.
Boston schools offer parents the flexibility to walk into a family resource center, explore all the schools that are available to them, learn about the educational programming that's in that building, and then be able to make a choice on where they want to send their child.
The ultimate goal, as Racine says, is to "eliminate the amount of lost learning time" through the process of integrating technology into school choice programs.
Ryan Johnston contributed to this report.
Posted in Cloud Computing
Achieving compliance in the cloud – CSO Online
Posted: at 2:27 am
More and more organizations are moving towards cloud technologies for scalability, cost reduction, and new service offerings. In this short article we will review cloud basics and look at auditing for compliance challenges in the cloud.
"Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models." - The NIST SP 800-145 definition of cloud computing
Let's review the deployment models:
Public Cloud - Cloud computing services from vendors that can be accessed across the internet or a private network, using systems in one or more data centers, shared among multiple customers, with varying degrees of data privacy control.
Private Cloud - Computing architectures modeled after Public Clouds, yet built, managed, and used internally by an enterprise; uses a shared services model with variable usage of a common pool of virtualized computing resources. Data is controlled within the enterprise.
Hybrid Cloud - A mix of vendor cloud services, internal cloud computing architectures, and classic IT infrastructure, forming a hybrid model that uses the best-of-breed technologies to meet specific needs.
Community Cloud - The cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (for example, mission, objectives, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party, and may exist on-premise or off-premise.
Service Delivery Models:
Infrastructure as a Service (IaaS) - Delivers computer infrastructure, typically a platform virtualization environment, as a service. Service is typically billed on a utility computing basis, according to the amount of resources consumed.
Platform as a Service (PaaS) - Delivers a computing platform as a service. It facilitates deployment of applications while limiting or reducing the cost and complexity of buying and managing the underlying hardware and software layers.
Software as a Service (SaaS) - Delivers software as a service over the internet, avoiding the need to install and run the application on the customer's own computers and simplifying maintenance and support.
So now that we have reviewed the basics of deployment and service delivery, what does it mean to be compliant in the cloud versus on a traditional perimeter-based corporate network? We also have to consider the business sector and its compliance model, and sometimes these are mixed. For example, in healthcare it's HIPAA compliance we are trying to achieve, in the credit card retail environment it's PCI DSS, and in government it's FISMA or the NIST Cybersecurity Framework we must meet. Of course, healthcare uses credit cards, creating mixed compliance obligations.
It's important to know where the responsibility lies when working in the cloud. As we move from IaaS to PaaS and finally to SaaS, the cloud vendor is responsible for more. In SaaS the vendor delivers everything; in IaaS it delivers the least, so the rest is your responsibility. The more the vendor provides, the more control you give up, as the sketch below illustrates.
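One way to keep this straight is a simple responsibility matrix. The split below is a common textbook approximation of the shared-responsibility model, not any vendor's contractual terms:

```python
# Who manages each layer under each service model (customer vs. vendor).
# A common textbook approximation; actual contracts vary by provider.
LAYERS = ["data", "application", "runtime", "OS", "virtualization",
          "servers", "storage", "networking"]

RESPONSIBILITY = {
    "IaaS": {l: ("customer" if l in ("data", "application", "runtime", "OS")
                 else "vendor") for l in LAYERS},
    "PaaS": {l: ("customer" if l in ("data", "application") else "vendor")
             for l in LAYERS},
    "SaaS": {l: ("customer" if l == "data" else "vendor") for l in LAYERS},
}

for model, split in RESPONSIBILITY.items():
    owned = [l for l, who in split.items() if who == "customer"]
    print(f"{model}: customer still manages {', '.join(owned)}")
```

Note that even in SaaS the customer never sheds responsibility for its own data, which is why compliance obligations follow the data rather than the deployment model.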
Some real challenges in working with a cloud environment include understanding its scope, determining whether your current risk assessment approach still works in the cloud, and maintaining audit trails in the cloud.
The key is to take a risk-based approach and recognize that cloud-based risk is different. For example, the concept of a perimeter no longer makes sense in a multi-tenant environment. In service delivery risk alone, we must evaluate virtualization risk, SaaS risk, PaaS risk, and IaaS risk.
Then we need to look at deployment risk, business model risk and security risk just to name a few.
What we really need now is a map; this is getting confusing, right? Deloitte's Cloud Computing Risk Intelligence Map is very helpful here.
Take a look at data management in the cloud risk map. Notice that for data usage we have a lack of clear ownership of cloud-generated data, and unauthorized access or inappropriate use of sensitive data, personal data or intellectual property. These are real-world issues with cloud computing, because you don't have full control, especially in a SaaS environment. At the same time, you must be able to apply the deployment and service delivery models to your actual compliance framework, such as HIPAA, PCI DSS or FISMA.
SOC 1 is for service organization assessments that impact financial reporting, so let's look at SOC 2 and SOC 3. SOC 2 is geared towards technology companies and allows the incorporation of other frameworks into the SOC 2 report. A SOC 2 assessment uses the Trust Services Principles (TSP) framework from the American Institute of Certified Public Accountants (AICPA) to evaluate a service organization's internal controls against the prescribed set of Common Criteria found in the TSPs.
SOC 2 assessments cover a wide range of controls, including operational, technical and information security controls. SOC 3, also known as SysTrust/WebTrust or Trust Services, is broad-based and also comes from the AICPA. We are really talking about e-commerce compliance here: SOC 3 covers e-commerce web servers and the systems that interconnect with and support e-commerce business platforms.
Trust Services are a set of professional attestation and advisory services based on a core set of principles and criteria that address the risks and opportunities of IT-enabled systems and privacy programs. Practitioners use these principles and related criteria in the performance of Trust Services engagements.
In cloud environments, multiple parties' data and services can exist on a single physical platform running virtual services for its customers. This creates several problems for security, compliance and audit, including:
- Limited ability to control data and applications
- Limited knowledge and no visibility into the degree of segmentation and security controls between those collocated virtual resources
- Audit and control of data in the public cloud, with no visibility into the provider's systems and controls
Even in a private cloud that is privately managed, multi-tenancy is enacted at many layers, including storage, application, database, operating platform and hypervisor-based infrastructure. In other words, shared hosts, data centers and networks can potentially exist between the same and different organizations or internal business units. As such, it is critical that network segmentation is created securely, with the ability to monitor any anomalies that may occur across virtual network boundaries.
The auditee, in this case the cloud provider or consumer, is required to produce compliance reports to prove that its security measures are protecting its assets from being compromised. Several open source and commercial tools exist in the market, including security information and event management (SIEM) and GRC tools, that enable generation of compliance reports on a periodic and/or on-demand basis.
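As a toy illustration of what such tooling automates, the sketch below rolls normalized security events up into a per-control pass/fail summary. The event format and control names are hypothetical, not any real SIEM's schema or API:

```python
from collections import Counter
from datetime import date

# Hypothetical normalized event records, as a SIEM might store them.
events = [
    {"control": "access-control", "result": "pass"},
    {"control": "access-control", "result": "fail"},
    {"control": "encryption-at-rest", "result": "pass"},
    {"control": "audit-logging", "result": "pass"},
]

def compliance_report(events):
    """Roll raw events up into per-control pass/fail counts."""
    tally = Counter((e["control"], e["result"]) for e in events)
    report = {}
    for (control, result), n in tally.items():
        report.setdefault(control, {})[result] = n
    return report

print(f"Compliance summary for {date.today()}:")
for control, counts in sorted(compliance_report(events).items()):
    status = "OK" if counts.get("fail", 0) == 0 else "ATTENTION"
    print(f"  {control}: {counts} -> {status}")
```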
In cloud environments it's important to know what differs from an onsite local computing environment, to establish who holds which responsibility, and to capture this in a service-level agreement and system security plan. Nothing can be assumed. The fact that you are sharing a cloud environment to gain growth and on-demand scalability means you must understand the issues related to sharing.
Just like renting out a room in your house changes your security and privacy, so too does sharing cloud computing resources. The NIST and Cloud Security Alliance standards are mandatory for managing the ever-changing and complex cloud environment. In both local and cloud environments we are managing risk; in the cloud it is more complex, shared and dynamic.
For further reading on cloud virtual machine issues, I recommend the paper "TenantGuard: Scalable Runtime Verification of Cloud-Wide, VM-Level Network Isolation."
Further reading: NIST SP 800-53, NIST SP 800-144, NIST SP 800-30, Deloitte Cloud Computing Risk Intelligence Map, ZCloud, Cloud Security Alliance Cloud Controls Matrix, ISACA Cloud Computing Audit Program, FedRAMP (Federal Risk and Authorization Management Program).
References: SANS, Deloitte.
Posted in Cloud Computing
Quantum Computers Sound Great, But Who’s Going to Program Them? – TrendinTech
Posted: at 2:27 am
While everyone's in a rush to get to the end of the quantum computer race, has anyone really given a moment's thought as to who will actually program these machines? The idea of achieving quantum supremacy came after Google unveiled its new quantum chip design, and is all about creating a device that can perform a calculation impossible for a conventional computer to carry out.
Quantum computers should have no trouble outperforming conventional computers because they work on the basis of qubits. Unlike the bits that run conventional computers, which are either a 0 or a 1, qubits can be both at the same time, a phenomenon known as superposition. Demonstrating supremacy outright would take thousands of qubits, which just isn't possible right now. So instead Google is planning to compare the computer's ability to simulate the behavior of a random arrangement of quantum circuits, and estimates it will take around 50 qubits to outdo the most powerful of computers.
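In standard textbook notation (nothing specific to Google's chip), a qubit's superposition and the exponential growth of the joint state can be written as:

```latex
% One qubit: a normalized combination of the basis states |0> and |1>.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert\alpha\rvert^2 + \lvert\beta\rvert^2 = 1 .
\]
% An n-qubit register carries one complex amplitude per bit string, so a
% classical simulation must track 2^n numbers; at n = 50 that is already
% about 10^15 amplitudes, which is why ~50 qubits is the supremacy target.
\[
  \lvert \Psi \rangle = \sum_{x \in \{0,1\}^n} c_x \lvert x \rangle,
  \qquad 2^{50} \approx 1.1 \times 10^{15} .
\]
```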
IBM is getting ready to release the world's first commercial universal quantum computing service later this year, which will give users the chance to connect to one of its quantum computers via the cloud for a fee. But there are still many hurdles to overcome before this technology becomes mainstream. One of them is that programming a quantum computer is much harder than programming a conventional computer. So, who's going to program them?
There are a number of quantum simulators available now that will help users get familiar with quantum computing, but a simulator is not the real thing, and real hardware is likely to behave very differently. MIT physicist Isaac Chuang said, "The real challenge is whether you can make your algorithm work on real hardware that has imperfections." It will take time for any computer programmer to learn the skills needed for quantum computing, but until the systems have been developed, what will they learn on?
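To make "quantum simulator" concrete: at its core, a simulator just tracks the vector of amplitudes and applies gates as matrices. The minimal single-qubit sketch below (plain NumPy, not any vendor's toolkit) puts a qubit into an equal superposition with a Hadamard gate and samples measurement outcomes:

```python
import numpy as np

# Single-qubit statevector simulator: state = [amp(|0>), amp(|1>)].
state = np.array([1.0, 0.0])                  # start in |0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = H @ state                             # equal superposition

probs = np.abs(state) ** 2                    # Born rule: |amplitude|^2
samples = np.random.default_rng(0).choice([0, 1], size=1000, p=probs)
print(probs)                  # [0.5 0.5]
print(np.bincount(samples))   # roughly 500 zeros and 500 ones
```

The catch Chuang points to is that this vector grows as 2^n with the number of qubits, and a noiseless simulation says nothing about the imperfections of real hardware.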
This is one of the reasons for the push to make quantum devices more accessible. D-Wave released its qbsolv and QMASM tools earlier this year in an attempt to bring more people into the realm of quantum computing. If the tools are available, more people will be tempted to have a go, and budding quantum computer scientists will be born. As Google's researchers wrote in a statement, "If early quantum-computing devices can offer even a modest increase in computing speed or power, early adopters will reap the rewards."
View post:
Quantum Computers Sound Great, But Who's Going to Program Them? - TrendinTech
Posted in Quantum Computing
D-Wave Closes $50M Facility to Fund Next Generation of Quantum Computers – Marketwired (press release)
Posted: at 2:27 am
BURNABY, BC--(Marketwired - May 16, 2017) - D-Wave Systems Inc., the leader in quantum computing systems and software, today announced that it has received new capital in the form of convertible notes from the Public Sector Pension Investment Board ("PSP Investments"). PSP Investments funded US$30 million at closing, with an additional US$20 million available at D-Wave's option upon the achievement of certain milestones. This facility brings D-Wave's total funding to approximately US$200 million. The new capital is expected to enable D-Wave to deploy its next-generation quantum computing system with more densely-connected qubits, as well as platforms and products for machine learning applications.
"This commitment from PSP Investments is a strong validation of D-Wave's leadership in quantum computing," said Vern Brownell, CEO of D-Wave. "While other organizations are researching quantum computing and building small prototypes in the lab, the support of our customers and investors enables us to deliver quantum computing technology for real-world applications today. In fact, we've already demonstrated practical uses of quantum computing with innovative companies like Volkswagen. This new investment provides a solid base as we build the next generation of our technology."
This latest funding comes on the heels of significant momentum for D-Wave, with a number of milestones already achieved in 2017.
About D-Wave Systems Inc. D-Wave is the leader in the development and delivery of quantum computing systems and software, and the world's only commercial supplier of quantum computers. Our mission is to unlock the power of quantum computing for the world. We believe that quantum computing will enable solutions to the most challenging national defense, scientific, technical, and commercial problems. D-Wave's systems are being used by some of the world's most advanced organizations, including Lockheed Martin, Google, NASA Ames, USRA, USC, Los Alamos National Laboratory, and Temporal Defense Systems. With headquarters near Vancouver, Canada, D-Wave's U.S. operations are based in Palo Alto, CA and Hanover, MD. D-Wave has a blue-chip investor base including PSP Investments, Goldman Sachs, Bezos Expeditions, DFJ, In-Q-Tel, BDC Capital, Growthworks, 180 Degree Capital Corp., International Investment and Underwriting, and Kensington Partners Limited. For more information, visit: http://www.dwavesys.com.
Posted in Quantum Computing