Daily Archives: June 9, 2017

When art and astronomy mix – Astronomy Magazine

Posted: June 9, 2017 at 1:50 pm

It can be hard to visualize something you can't see, so when NASA announced the TRAPPIST-1 system, they knew they needed to get some great artists to visually represent the amazing new system.

Robert Hurt, a visualization scientist at Caltech's IPAC Center with a Ph.D. in astrophysics, and Tim Pyle, a multimedia producer with a background in Hollywood special effects, came together to create visualizations of the TRAPPIST-1 system.

The seven-planet system discovered by NASA's Spitzer Space Telescope has three Earth-size planets in its habitable zone. As no telescope is powerful enough to photograph our distant neighbors yet, the two were tasked with creating realistic renderings of what they might look like.

"For the public, the value of this is not just giving them a picture of something somebody made up," Douglas Hudgins, a program scientist for the Exoplanet Exploration Program at NASA Headquarters, said in a press release. "These are real, educated guesses of how something might look to human beings. An image is worth a thousand words."

Hurt and Pyle worked with data from telescopes and consulted the discovery team at NASA as they went along. TRAPPIST-1b was inspired by Jupiter's moon Io. Pyle based the design of TRAPPIST-1h, the most distant and mysterious planet in the system, on two more of Jupiter's moons, Ganymede and Europa.

"When we're doing these artist's concepts, we're never saying, 'This is what these planets actually look like,'" Pyle said. "We're doing plausible illustrations of what they could look like, based on what we know so far. Having this wide range of seven planets actually let us illustrate almost the whole breadth of what would be plausible. This was going to be this incredible interstellar laboratory for what could happen on an Earth-sized planet."

Based on the possibility that the planets are tidally locked, Hurt put an ice cap on TRAPPIST-1c's dark side. Hurt also took a little creative liberty, putting water on the dayside of TRAPPIST-1d, one of the three planets in the habitable zone. Scientists originally wanted him to depict an eyeball world, where the side facing the host star would be hot and dry, the side facing away would be icy, and the ring in between would have water. But Hurt tried to convince them his design would be the better bet.

"Then I kind of pushed back and said, 'If it's on the dark side, no one can look at it and understand we're saying there's water there,'" Hurt said.

After the disagreement, the team compromised, allowing water to be seen on the dayside.

Ultimately, the team's main goal was to get the public excited about science and give them more information about what these planets might look like.

Read more:

When art and astronomy mix - Astronomy Magazine

Posted in Astronomy | Comments Off on When art and astronomy mix – Astronomy Magazine

Ingredient of life found around infant Sun-like stars – Astronomy Now Online

Posted: at 1:50 pm

This image shows the spectacular region of star formation where methyl isocyanate was found. The insert shows the molecular structure of this chemical. Credit: ESO/Digitized Sky Survey 2/L. Calçada

ALMA has observed stars like the Sun at a very early stage in their formation and found traces of methyl isocyanate, a chemical building block of life. This is the first ever detection of this prebiotic molecule towards solar-type protostars, the sort from which our Solar System evolved. The discovery could help astronomers understand how life arose on Earth.

Two teams of astronomers have harnessed the power of the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile to detect the prebiotic complex organic molecule methyl isocyanate in the multiple star system IRAS 16293-2422. One team was co-led by Rafael Martín-Doménech at the Centro de Astrobiología in Madrid, Spain, and Víctor M. Rivilla at the INAF-Osservatorio Astrofisico di Arcetri in Florence, Italy; and the other by Niels Ligterink at the Leiden Observatory in the Netherlands and Audrey Coutens at University College London, United Kingdom.

"This star system seems to keep on giving! Following the discovery of sugars, we've now found methyl isocyanate. This family of organic molecules is involved in the synthesis of peptides and amino acids, which, in the form of proteins, are the biological basis for life as we know it," explain Niels Ligterink and Audrey Coutens.

ALMA's capabilities allowed both teams to observe the molecule at several different and characteristic wavelengths across the radio spectrum. They found the unique chemical fingerprints located in the warm, dense inner regions of the cocoon of dust and gas surrounding young stars in their earliest stages of evolution. Each team identified and isolated the signatures of the complex organic molecule methyl isocyanate. They then followed this up with computer chemical modelling and laboratory experiments to refine our understanding of the molecule's origin.

IRAS 16293-2422 is a multiple system of very young stars, around 400 light-years away in a large star-forming region called Rho Ophiuchi in the constellation of Ophiuchus (the Serpent Bearer). The new results from ALMA show that methyl isocyanate gas surrounds each of these young stars.

Earth and the other planets in our Solar System formed from the material left over after the formation of the Sun. Studying solar-type protostars can therefore open a window to the past for astronomers and allow them to observe conditions similar to those that led to the formation of our Solar System over 4.5 billion years ago.

Rafael Martín-Doménech and Víctor M. Rivilla, lead authors of one of the papers, comment: "We are particularly excited about the result because these protostars are very similar to the Sun at the beginning of its lifetime, with the sort of conditions that are well suited for Earth-sized planets to form. By finding prebiotic molecules in this study, we may now have another piece of the puzzle in understanding how life came about on our planet."

Niels Ligterink is delighted with the supporting laboratory results: "Besides detecting molecules we also want to understand how they are formed. Our laboratory experiments show that methyl isocyanate can indeed be produced on icy particles under very cold conditions that are similar to those in interstellar space. This implies that this molecule, and thus the basis for peptide bonds, is indeed likely to be present near most new young solar-type stars."

Go here to read the rest:

Ingredient of life found around infant Sun-like stars - Astronomy Now Online

Posted in Astronomy | Comments Off on Ingredient of life found around infant Sun-like stars – Astronomy Now Online

The TRAPPIST-1 system may have formed pebble-by-pebble … – Astronomy Magazine

Posted: at 1:50 pm

The TRAPPIST-1 system looks more like Jupiter and its moons than like our own solar system. Seven planets orbit in an elaborate synchronous dance around a star only slightly larger than Jupiter. Those seven planets are packed within a 3 million-mile span, and all of them fall between the size of Mars and that of a rocky planet slightly larger than Earth. Oh, and at least three of the planets are potentially habitable.

And now, a group of University of Amsterdam professors believe they know how it formed. And they think it happened rock-by-rock.

Solar systems typically form from nebulae: as gas accumulates and clumps, it forms a star, which then helps gravitationally shape planets. But in a small system like TRAPPIST-1, the planets have to stay close in order to remain gravitationally bound. In the TRAPPIST system, the first batch of planets formed from leftover material, clumps of dirt and ice, then migrated outward. At a certain point they reached the boundary where water sublimes into vapor, and water accumulated onto that ice and rock. Eventually there was enough material to smoosh into a proto-planet, which then migrated closer to the star.

The end result? Seven icy, Earth-sized worlds. Chris Ormel, lead author of the paper recently accepted to the journal Astronomy & Astrophysics, said in a press release, "We have been working on pebble aggregation and sweep-up by planets for a long time and were also developing a new ice-line model. Thanks to the discovery of Trappist-1 we can compare our model with reality."

This method of accumulation also helps explain why the system seems to have no giant planets like Neptune or Uranus, at least that we know of. It's not yet known whether such a mechanism creates atmospheres on the planets or how it might affect habitability in the system.

Read the rest here:

The TRAPPIST-1 system may have formed pebble-by-pebble ... - Astronomy Magazine

Posted in Astronomy | Comments Off on The TRAPPIST-1 system may have formed pebble-by-pebble … – Astronomy Magazine

Edge Computing Is New Cloud Computing Tech Investors Should Track – GuruFocus.com

Posted: at 1:50 pm

The cloud computing industry is still in its early stages of adoption. In 2016, the Infrastructure as a Service (IaaS) segment recorded just $22 billion in annual revenues. Considering the hundreds of billions of dollars the IT industry spends every year, it is very clear that IaaS still has a long way to go.

The Software as a Service segment is a bit older, but the model has now become the preferred method for software delivery. Microsoft has done an excellent job of ditching its old annual licensing model for SaaS, and the success of Office 365, its lead SaaS product, is ample validation of that. Oracle is targeting $10 billion in annual revenues from SaaS over the next few years.

With these and thousands of other companies in the fray, the SaaS segment is expected to continue its double-digit growth over the next several years.

The cloud software market reached $48.8 billion in revenue in 2014, representing a 24.4% year-over-year growth rate. IDC expects cloud software will grow to surpass $112.8 billion by 2019 at a compound annual growth rate (CAGR) of 18.3%. SaaS delivery will significantly outpace traditional software product delivery, growing nearly five times faster than the traditional software market and becoming a significant growth driver to all functional software markets. By 2019, the cloud software model will account for $1 of every $4.59 spent on software. (IDC)
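
Those figures are internally consistent: compounding the 2014 base at the stated CAGR lands almost exactly on IDC's 2019 projection. A quick back-of-the-envelope check in Python:

```python
# Sanity check of IDC's projection: $48.8B in 2014 compounded at an
# 18.3% CAGR over five years (2014 -> 2019).
base_2014 = 48.8   # cloud software revenue, billions of dollars
cagr = 0.183
years = 5

projected_2019 = base_2014 * (1 + cagr) ** years
print(f"Projected 2019 revenue: ${projected_2019:.1f}B")  # ~$113.1B vs. IDC's $112.8B
```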

As businesses around the world slowly started warming up to the idea of third party-managed infrastructure services (IaaS) and software products delivered over the cloud (SaaS), the segment has piqued the interests of all the major tech players. Early entrants Amazon (NASDAQ:AMZN) and Microsoft (NASDAQ:MSFT) have already crossed $13 billion in trailing 12-month revenues while IBM has, so far, kept pace. Google is still working toward getting its bona fides in the cloud game by building datacenters and increasing features and services while Oracle is slowly working on its IaaS portfolio as well.

But Amazon and Microsoft, the lead players in the cloud story, have now made it clear that they are already on their way to embracing the next level of cloud. Microsoft's CEO Satya Nadella made a huge announcement during the recent Microsoft Build 2017 developer conference that the company's cloud strategy is moving toward edge computing:

It has been barely four years since Microsoft CEO Satya Nadella announced the company's "Mobile First, Cloud First" strategy. Instead of basking in the glory of newfound success in cloud, Nadella has now announced that the time has come to move on from "Mobile First, Cloud First" toward a more cloud-focused "Intelligent Cloud, Intelligent Edge" strategy. (1reddrop)

Amazon made its edge computing/IoT-focused software, Amazon Greengrass, publicly available Thursday, making it loud and clear that they, too, are in the race to move computing closer to the edge.

But most investors in the stock market have barely begun to discover cloud computing, so here's a little primer on the new wave of cloud computing.

What is edge computing?

In cloud computing, the processing power is always centralized. Data has to travel from a device to servers, where it gets processed; the output is then pushed back to the device. Edge computing moves these heavy processing tasks, or as many of them as possible, closer to the point of origin, hence the word "edge."

This reduces the distance data needs to travel, thereby reducing latency and cutting reliance on internet connections. The result is faster, more reliable decision making at the edge.

Among its application areas is artificial intelligence, or AI: edge computing completely transforms the way AI can be applied to various scenarios.

That's probably an oversimplified description of edge computing, but it's enough at this point to understand that the "edge" part of the equation takes the "computing" part of cloud computing away from massive data centers and brings it closer to the connected devices themselves.
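
To make that concrete, here is a minimal, hypothetical sketch of the pattern in Python (all names are illustrative): the edge device reduces a stream of raw sensor readings to a compact summary locally and ships only the summary to the cloud, rather than sending every reading over the network for central processing.

```python
import random

def read_sensor():
    """Stand-in for a real sensor read on the edge device."""
    return random.gauss(20.0, 2.0)

def summarize(readings):
    """Edge-side processing: reduce raw data to a compact summary."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }

def send_to_cloud(payload):
    """Stand-in for the upload; in a pure cloud model every raw reading
    would cross this slower, less reliable link to be processed."""
    print("uploading:", payload)

readings = [read_sensor() for _ in range(1000)]
send_to_cloud(summarize(readings))  # one small payload instead of 1,000 raw ones
```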

There are several obvious advantages to adopting an edge computing model over a traditional cloud computing one, but the segment itself is in very early stages of its development. Edge computing also needs a robust IoT and AI device ecosystem to make its impact felt in full force. By moving in early on this new paradigm in cloud computing, Amazon and Microsoft, the top two cloud companies, have once again moved the cloud goal posts and significantly raised the bar.

Their competitors now have to risk investing sufficient resources, time and money if they want to keep Amazon and Microsoft in check. Considering that Google and Oracle are only now finding their footing in the cloud computing segment itself, it is going to be extremely difficult for them to keep expanding their cloud offerings while also working on edge computing and IoT technologies.

By putting a clear moat around their businesses, Amazon and Microsoft are further differentiating themselves in the now-crowded cloud computing space. Microsoft is even going so far as to redefine its very vision for the future of cloud computing and the direction that the company's cloud push is going to take.

Why do investors need to know this? Because these are the moves that will take Microsoft and Amazon from their annual cloud revenues of $13 billion to twice, thrice that and beyond. It's not something of which a serious tech investor can afford to be unaware.

Disclosure: I have no positions in the stock mentioned above and no intention to initiate a position in the next 72 hours.

Sangara Narayanan

See original here:

Edge Computing Is New Cloud Computing Tech Investors Should Track - GuruFocus.com

Posted in Cloud Computing | Comments Off on Edge Computing Is New Cloud Computing Tech Investors Should Track – GuruFocus.com

Virtualization admin? Pivot — pivot now — to a cloud computing career – TechTarget

Posted: at 1:50 pm

For those virtualization admins hiding under a virtual rock regarding cloud, I have news for you. Your job isn't safe. No one can put the cloud genie back in the bottle. Cloud computing is here to stay, and virtualization admins need to shift focus to keep up with tomorrow's jobs.

The move to cloud is already happening at all levels, from the smallest through to the largest businesses. Cloud and microservices mark a new iteration of change that is as disruptive as the original arrival of virtualization with VMware -- if not more so.

Virtualization has two phases: consolidation and abstraction.

In the beginning, virtualization's goal was more efficient use of underutilized hardware. Rarely do servers consume all the resources allocated to them. Virtualization admins could reclaim these lost resources and vastly reduce wasted capacity.

In phase two, virtualization developed advanced functions such as live storage motion or migration, high availability and fault tolerance. These virtualization capabilities address the issues that arise when several machines share one physical piece of hardware. Automation arrived and made server deployment simple and straightforward.

I argue that this virtualization adoption curve peaked a few years ago -- we are now moving to the next iteration, and you'll need to follow a cloud computing career path to come along.

Even once-conservative technology adopters, such as financial institutions, are jumping on board with the third wave of virtualization.

There is a thirst to cut costs, and automation allows massive cost cuts. There will be job losses. No virtualization admin should think it will never happen to them. You are fooling yourself. Fewer staff means fewer medical plans and pensions to support. It is not hard to see why the cloud appeals to the bottom line.

There will not be enough cloud computing careers to go around based on old virtualization working practices, such as in a phase one scenario.

Consider virtual machine orchestration. In early-phase virtualization environments, VMs still required some level of administration action, such as deployment from a template, to accompany automated steps. Tools such as VMware vRealize Automation or Platform9's managed vSphere stack enable an approved user to request a VM, customized to their specifications, and have it deployed within 10 minutes with no administrator interactions. Larger companies used to have several virtualization admins whose jobs purely entailed VM creation and deployment. Within a year or two, that job role disappeared.

Virtual machines are now moving to cattle status, meaning they're disposable commodities. To scale applications, organizations adopt automation tools that deploy new VMs. It's quicker to deploy another instance of a machine than to troubleshoot a broken one.

DevOps does away with manual work; manual deployment is the exact opposite of how DevOps is supposed to work. A key tenet of DevOps is that tasks performed more than once in the same way should be scripted so the IT platform does the action itself.
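
A trivial, hypothetical illustration of that tenet in Python: a task that used to be done by hand becomes an idempotent script, safe to run any number of times, so the platform can perform the action itself.

```python
from pathlib import Path

APP_DIR = Path("/opt/myapp")      # hypothetical install location
CONFIG = APP_DIR / "app.conf"

def ensure_deployed():
    """Create the app directory and a default config only if they are missing."""
    APP_DIR.mkdir(parents=True, exist_ok=True)
    if not CONFIG.exists():
        CONFIG.write_text("mode=production\n")

ensure_deployed()  # running it a second time changes nothing -- that is the point
```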

Platform as a service reduces the workload. Workloads that used to be custom-built and based on infrastructure as a service are now provided as a service for consumption by developers and companies. Examples include the larger cloud vendors offering secure and highly available database hosting that organizations consume without any effort to build and manage the underlying database infrastructure. Little to no database admin input required. No server admin required either.

The complexity hasn't gone away -- it has just changed. Management complexity moved from the VMs to orchestration and scaling. Virtualization elements such as high availability and disaster recovery (DR) lost importance, while the IT industry turned its attention to microservices that are scalable, redundant and can be spun up and down at will. Automation means little to no hands-on intervention. For example, you can spin up a cloud infrastructure from a single PowerShell script.
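
The article's example is a PowerShell script; the same idea in a hedged Python sketch using AWS's boto3 library (the AMI ID is a placeholder, and already-configured credentials are assumed):

```python
import boto3

# Connect to EC2 in a chosen region; credentials come from the environment.
ec2 = boto3.resource("ec2", region_name="us-east-1")

# Launch a VM with no hands-on intervention.
instances = ec2.create_instances(
    ImageId="ami-XXXXXXXX",   # placeholder image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
print("launched:", instances[0].id)
```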

Classic DR locations are now costly relics. Cloud affects virtualization in secondary ways. For example, businesses are used to having one primary data center and one DR setup in another data center. Given a relatively modern application set, the entire company infrastructure can restart in the cloud in the event of a disaster. Modern DR management products, such as Zerto and Acronis, eliminate the costly secondary data center, allowing businesses to prepopulate and configure DR setups in the cloud.

This is the reality for virtualization admins, and the only future is in a cloud computing career. Over time, more applications are built cloud-first to save money from the start; the old, immovable on-site applications go the way of pagers and typewriters.

The reality is that most virtualization admin roles as we know them will vastly shrink or become outmoded over the next decade. A virtual data center requires far fewer staff, and with automation and scripting, a single administrator can manage massive numbers of servers.

There is still time to retool and get on a cloud computing career path. Virtualization admins are luckier than most. While the technology itself may change, these administrators have skills that easily translate to the popular cloud and DevOps arena.

This doesn't mean becoming a code guru or programmer, but a virtualization admin will need a deep understanding of architectures and tools such as Docker for containerization, Chef for configuration management and Kubernetes for container orchestration to become a DevOps admin. Learn multiple scripting languages and investigate hyper-converged infrastructure for cloud hosting.

The warning signs are there, fellow admins. It is just a case of doing something about it while you still can.

Visit link:

Virtualization admin? Pivot -- pivot now -- to a cloud computing career - TechTarget

Posted in Cloud Computing | Comments Off on Virtualization admin? Pivot — pivot now — to a cloud computing career – TechTarget

The benefits of cloud computing, Rust 1.18, and intelligent tracking prevention in WebKit SD Times news digest … – SDTimes.com

Posted: at 1:50 pm

The cloud is no longer an afterthought; it is a competitive advantage. According to a new Insight-sponsored report by Harvard Business Review Analytic Services, businesses are turning to the cloud for agility, data capabilities, and customer and user experiences, as well as cost savings.

"A company's IT environment should work for them by enabling them to both run and innovate. Large and small to mid-sized companies need to focus on managing and modernizing their IT infrastructure, so that it becomes a transformative part of their business that can directly improve results," said David Lewerke, director of the hybrid cloud consulting practice at Insight. "While we knew there were a number of benefits, we wanted to better understand from respondents exactly how cloud systems were impacting their business outcomes."

The report found 42% of respondents use a hybrid cloud approach, 40% host their systems in a private cloud, and 13% host in a public cloud. Other cited benefits of cloud adoption included faster time to market, the ability to manage security, and the ability to mitigate risk.

Rust 1.18 released

The latest version of the systems programming language Rust has been released with new improvements, cleanups and features. The biggest changes in Rust 1.18 include an update to the latest edition of The Rust Programming Language book, which is being written in the open on GitHub. Version 1.18 features the first draft of the second edition, with 19 of its 20 chapters drafted.

Other features include an expansion of the pub keyword (the new pub(restricted) visibility forms such as pub(crate)), library stabilizations, and Cargo features.

More information is available here.

WebKit's intelligent tracking prevention feature

WebKit, an open source web browser engine, is introducing a new feature to limit cross-site tracking. The Intelligent Tracking Prevention feature limits cookies and other website data so that users can once again trust that privacy-sensitive data about their web activity stays private.

"The success of the web as a platform relies on user trust. Many users feel that trust is broken when they are being tracked and privacy-sensitive data about their web activity is acquired for purposes that they never agreed to," John Wilander, security engineer for WebKit, wrote in a post.

Continue reading here:

The benefits of cloud computing, Rust 1.18, and intelligent tracking prevention in WebKit SD Times news digest ... - SDTimes.com

Posted in Cloud Computing | Comments Off on The benefits of cloud computing, Rust 1.18, and intelligent tracking prevention in WebKit SD Times news digest … – SDTimes.com

Growing Patent Claim Risks in Cloud Computing – Lexology (registration)

Posted: at 1:50 pm

This blog develops the themes of our February piece on cloud availability risks from software patent claims. It shows how the patent cloudscape is changing; how PAEs are increasingly active in Europe as well as in the USA; and how CSPs are starting to respond in their contract terms.

With increasingly recognised benefits of security, flexibility and reliability, cloud computing continues to carry all before it. Its aggregation of massive processing power also heralds deep, connected and transformative innovation in our daily lives. Intellectual property (IP) is at the centre of this wave of innovation, and an increasingly fierce battleground, as the current high-profile dispute between Alphabet's Waymo and Uber over core autonomous vehicle technology shows.[1]

You might think that the cloud, built and running on shared environments and public standards, would be a safe space from intrusive IP disputes. But the evidence is mounting that the cloud is proving attractive for PAEs (Patent Assertion Entities, businesses who litigate their patents but generally don't otherwise use their patented technology). And whilst cloud users are increasingly aware of the importance of security and privacy, cloud IP risks are now equally important but still somewhat overlooked: many enterprises don't yet have complete clarity on their IP litigation strategy or IP innovation strategy, especially in a global context.

There are persuasive reasons for cloud customers to focus more on patent risks. PwC, in its most recent (May 2017) Patent Litigation Study,[2] notes that damages awards for PAEs are almost four times greater than for other patent claimants and that damages awards at trial in patent disputes continue to rise.

Europe is quickly becoming a key jurisdiction for patent enforcement: the European Patent Office granted 96,000 patents in 2016,[3] up 40% from 2015, and the Unitary Patent along with EU-wide injunctions will soon be a reality.[4]

The cloud computing patent landscape is also developing rapidly. Cloud patent families are well-known in areas such as file storage and protocols, but other areas like Fintech[5] are also growing quickly.

PAEs are acquiring cloud computing patents at a rapid pace according to IPlytics, an IP intelligence provider,[6] who note that:

PAEs often acquire patents in technological areas that will likely become strategically important for future markets.

This is borne out in a European Commission report on PAEs in Europe[7] which (on page 26) cites findings that:[8]

PAEs are overwhelmingly involved in the litigation of German and UK patents related to computer and telecommunications technology [and that] these findings are consistent with existing evidence on the activity of US PAEs, which also tend to enforce high-tech patents at a disproportionately high frequency, especially software patents.

Part of the attraction for PAEs is that patent infringement is increasingly easy to detect in the cloud: detailed documentation, APIs and the code for open source (the software that powers much of the cloud) are readily available, and can be read and analysed by anyone, making the cloud a soft target.

As the economic importance of the cloud rises, cloud customers make increasingly interesting targets for PAEs: customers generally don't have the same level of expertise in cloud tech as cloud service providers (CSPs), have a greater incentive to settle, are less prepared to fight an IP battle, and have little incentive to solve an IP issue for others. Contrast this with the position of the CSP, who will want to avoid an IP threat becoming an issue across its customer base.

A measure of this growing cloud patent claim risk is the evolving approach of the largest global CSPs to this issue in their cloud service agreements.

Microsoft has taken an early lead through its recently announced Azure IP Advantage[9] programme with uncapped indemnification for its Azure cloud services, including open source incorporated in its services, and 10,000 (7,500 currently, 2,500 to come) patents that Microsoft is sharing with its consuming customers.

Google in its Cloud Platform Terms of Service[10] seeks (at section 14.2) to exclude open source software entirely from its IP infringement indemnification, a big carve-out given the importance of open source in the cloud environment.

In Amazon Web Services' (AWS) Customer Agreement,[11] the effect of section 10 is that AWS does not offer in its standard terms any IP protection at all for its services. Section 8.5 is an unusual IP non-assert term that requires the customer not itself or through others to assert any IP claim regarding the AWS services it has used. The clause continues without limit in time after the agreement has ended; and to the extent it could be said to amount to a patent no-challenge clause, could be problematic in Europe under EU competition law, for example.

The fact that all the largest CSPs are starting to address cloud patent risk expressly in their contract terms is perhaps the most compelling evidence that this PAE-fuelled risk is becoming increasingly relevant and material. Cloud customers, and their regulators in regulated sectors, should take note as well.

See original here:

Growing Patent Claim Risks in Cloud Computing - Lexology (registration)

Posted in Cloud Computing | Comments Off on Growing Patent Claim Risks in Cloud Computing – Lexology (registration)

New Cloud Computing and IT Outsourcing Requirements in the Financial Sector – Lexology (registration)

Posted: at 1:50 pm

On 17 May 2017 the Luxembourg Financial Regulator (CSSF) published four new circulars concerning cloud computing and IT outsourcing. The new regulations immediately affect credit institutions, professionals of the financial sector, payment service providers, and electronic money issuers (Entities). The four CSSF circulars, which came into effect on the date of their publication, introduce new rules and replace requirements set out in earlier circulars.

Main novelties and amendments

Circular 17/654

This circular addresses the obligations that Entities must meet when their IT infrastructure relies, or will rely, on a cloud computing infrastructure.

The circular applies to the partial or full transfer of activities and makes little distinction between an external provider and an internal provider within a group of companies.

The CSSF defines a material activity as any activity that, if not properly performed, reduces the ability of an Entity to meet regulatory requirements or continue its operations, as well as any activity necessary for sound and prudent risk management.

Three different IT service models are described:

- Infrastructure as a Service (IaaS)
- Platform as a Service (PaaS)
- Software as a Service (SaaS)

For each of the above service models, the CSSF provides an interpretation of the levels of control over the systems and the software that an Entity must respect when applying such a model.

Within these service models the CSSF differentiates four different cloud types:

- private cloud
- community cloud
- public cloud
- hybrid cloud

An Entity's outsourcing of IT matters will qualify for particular regulatory treatment if it meets specific criteria set out by the CSSF, and will be excluded from the scope of other existing regulations relating to the Entity's central administration, accounting organization, internal governance and risk management (e.g. Circulars 12/552 or 17/656).

The CSSF sets out a number of specific criteria to determine whether this regulatory treatment applies.

If those criteria are fulfilled, an Entity must obtain the CSSF's prior approval (where a material activity is concerned). Where a Luxembourg-based professional of the financial sector is used, an Entity need only file a prior notification with the CSSF.

Once the outsourcing is implemented, any changes to the set-up and the service providers, as well as any in-sourcing, must be notified to the regulator before an Entity enacts them.

Entities under the supervision of the CSSF that would like to offer cloud computing services or related operating services to their clients must submit a program description to the CSSF to obtain its prior approval.

Circular 17/655

This circular amends the requirements applicable to credit institutions, investment firms and professional lenders. The amendments introduce Circular 17/654 and clarify that Circular 05/178 is repealed.

In addition, the amendments clarify that every time specific infrastructures are used or changed, authorized entities must observe data protection and professional secrecy rules.

The circular clarifies the conditions for the use of other group entities that are not authorized by the CSSF. The systems of such group entities may be used under the condition that no confidential information is stored in a readable manner on those systems. If this is the case, the supervised entity must inform its clients and, if required, collect their consent.

Circular 17/656

This circular aligns the IT outsourcing requirements for professionals of the financial sector other than investment firms, payment service providers and electronic money issuers with those applicable to credit institutions and investment firms. It copies the wording of the relevant sections of Circular 12/552 to ensure consistency and ease further alignments.

Finally, the circular introduces Circular 17/654 and clarifies that professionals of the financial sector that offer IT services to their clients may use the infrastructure of a third party or sub-delegate a part of their services only with the prior consent of the clients concerned.

Circular 17/657

This circular amends Circular 06/240 and is applicable to all credit institutions and professionals of the financial sector. One important clarification is that only the production environment should contain confidential data, whereas the test and development environment(s) (which, under the applicable regulation, may be accessed by third parties) should not.

Future developments

As the four circulars came into effect on the date of their publication, Entities' auditors are expected to pay particular attention to the new requirements when carrying out their audits.

Entities supervised by the CSSF will have to study the new circulars carefully and analyze their impact on existing administrative organization and IT infrastructure because, if affected, these must be aligned with the new requirements. Changes may therefore need to be implemented at multiple levels.

As service providers located outside of Luxembourg will be required to accept contractual provisions that they have never been asked to comply with before (for instance, amendments to certifications and controls), the time needed to implement the changes should not be underestimated.

Read this article:

New Cloud Computing and IT Outsourcing Requirements in the Financial Sector - Lexology (registration)

Posted in Cloud Computing | Comments Off on New Cloud Computing and IT Outsourcing Requirements in the Financial Sector – Lexology (registration)

Purdue, Microsoft to Collaborate on Quantum Computer – Photonics.com

Posted: at 1:50 pm

WEST LAFAYETTE, Ind., June 9, 2017 -- Purdue University and Microsoft Corp. have signed a five-year agreement to develop a useable quantum computer.

Purdue is one of four international universities in the collaboration. Michael Manfra, Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, professor of materials engineering and professor of electrical and computer engineering, will lead the effort at Purdue to build the machine by producing a "topological qubit."

"Someday, quantum computing will move from the laboratory to actual daily use, and when it does, it will signal another explosion of computing power like that brought about by the silicon chip," said Mitch Daniels, president of Purdue. "It's thrilling to imagine Purdue at the center of this next leap forward."

With quantum computers, information is encoded in qubits, which are quantum units of information. A qubit's state, however, isn't just 0 or 1, but can also be a linear combination of 0 and 1. Because of the quantum mechanical phenomenon of "superposition," a qubit can be in both states at the same time. This characteristic is essential to quantum computation's potential power, allowing for solutions to problems that are intractable using classical architectures.
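
That "linear combination" has a precise meaning. Here is a minimal sketch in Python with NumPy, not tied to any particular quantum computer, showing an equal superposition and the measurement probabilities it implies:

```python
import numpy as np

# Basis states |0> and |1> as vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# A qubit in equal superposition: a linear combination of |0> and |1>.
psi = (zero + one) / np.sqrt(2)

# Born rule: the probability of each measurement outcome is the
# squared magnitude of the corresponding amplitude.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # P(0) = 0.50, P(1) = 0.50
```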

The team assembled by Microsoft will work on a type of quantum computer that is expected to be especially robust against interference from its surroundings, a situation known in quantum computing as decoherence. The scalable topological quantum computer is theoretically more stable and less error-prone.

Purdue and Microsoft entered into an agreement in April 2016 that extended their collaboration on quantum computing research, effectively establishing "Station Q Purdue," one of the Station Q experimental research sites that work closely with two Station Q theory sites. This new, multi-year agreement extends that collaboration and embeds Microsoft employees in Manfra's research team at Purdue.

Manfra's group at Station Q Purdue will collaborate with Redmond, Wash.-based Microsoft team members, as well as a global experimental group established by Microsoft, including groups at the Niels Bohr Institute at the University of Copenhagen in Denmark, TU Delft in the Netherlands and the University of Sydney in Australia. They are also coupled to the theorists at Microsoft Station Q in Santa Barbara. All groups are working together to solve quantum computing's biggest challenges.

"What's exciting is that we're doing the science and engineering hand in hand, at the same time," Manfra says. We are lucky to be part of this truly amazing global team.

Visit link:

Purdue, Microsoft to Collaborate on Quantum Computer - Photonics.com

Posted in Quantum Computing | Comments Off on Purdue, Microsoft to Collaborate on Quantum Computer – Photonics.com

Scientists May Have Found a Way to Combat Quantum Computer Blockchain Hacking – Futurism

Posted: at 1:50 pm

In Brief: While quantum computers could improve the world by decreasing processing times, they could also be the ideal tool for hackers, which is a true threat to the success of blockchain. Russian scientists, though, may have found the solution.

Russia's Solution to Quantum Hacking

A serious concern in the computing industry is that when true quantum computers are produced, today's public-key encryption schemes will break down in the face of their dizzyingly superior processing power.

Although blockchain is a far more secure method of transaction than our current financial system, even it will become vulnerable to a brute-force attack by a quantum computer. Andersen Cheng, co-founder of U.K. cybersecurity firm Post Quantum, told Newsweek, "Bitcoin will expire the very day the first quantum computer appears."

A team led by Evgeny Kiktenko at the Russian Quantum Center in Moscow, though, may have found a way to protect blockchains by fighting fire with fire, using quantum mechanics. They are designing a quantum-secured blockchain where each block, hypothetically, is signed by a quantum key rather than a digital one.

They propose that transmitting and encrypting information using quantum particles such as photons, which cannot be copied or meddled with without being destroyed, ensures the blockchain's safety. The principle is related to zero-knowledge proofs, which allow you to validate information without sharing it.
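
For readers new to the underlying structure, here is a purely classical sketch in Python (the quantum key distribution step itself cannot be reproduced in ordinary code): each block commits to its predecessor through a hash, so tampering with any block breaks every later link. In the proposed scheme, the per-block authentication would rest on quantum keys rather than on a classical digital signature.

```python
import hashlib

def block_hash(prev_hash: str, payload: str) -> str:
    """Commit a block to its predecessor by hashing both together."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

chain = ["0" * 64]  # genesis placeholder
for tx in ["alice->bob:5", "bob->carol:2"]:
    chain.append(block_hash(chain[-1], tx))

print(chain)  # altering any payload would change every subsequent hash
```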

In recent months Russia has become increasingly interested in blockchain. The central bank is composing new laws focused on cryptocurrencies and is interested in developing one of its own. This research marks a step forward in these efforts because it concerns the protection of such systems.

If the quantum-secured blockchain proves successful, it would be hugely beneficial to the rest of the world as well. Blockchain has the potential to do a lot of good for the world by streamlining the transaction system, making it more secure, and ensuring transparency like never before. Countries such as Senegal have developed currencies that are entirely digital, Japan is accepting bitcoin (which uses blockchain) as legal tender in 260,000 stores this summer, and Ukraine is considering using blockchain to combat corruption.

Since the advent of quantum computing could be an apocalypse for blockchain, it is crucially important that we begin thinking about how to protect these systems before entire countries and currencies are subject to hacks from abusers of quantum computers.

Excerpt from:

Scientists May Have Found a Way to Combat Quantum Computer Blockchain Hacking - Futurism

Posted in Quantum Computing | Comments Off on Scientists May Have Found a Way to Combat Quantum Computer Blockchain Hacking – Futurism