
Category Archives: Cloud Computing

Cloud Computing, Term of Art Complete Preakness Works – BloodHorse.com (press release) (registration) (blog)

Posted: May 14, 2017 at 6:21 pm

Klaravich Stables and William Lawrence's Cloud Computing tuned up for his expected run in the May 20 Preakness Stakes (G1) with a half-mile breeze in :48.85 over the Belmont Park training track May 13.

Under exercise rider Peter Roman, Cloud Computing beat the worst of the rain by coming out just after the training track opened at 5:30 a.m. The son of Maclean's Music posted the second-fastest drill of 32 moves at the distance.

"He breezed very well, galloped out super, and came back good so far," said trainer Chad Brown. "That's his last piece of work and if he comes out of it well he'll be on to Baltimore on Tuesday."

Unraced as a juvenile, Cloud Computing has made three starts this season, with his most recent outing being a third-place finish in the April 8 Wood Memorial presented by NYRA Bets (G2). The dark bay colt previously ran second to J Boys Echo in the March 4 Gotham Stakes (G3) after he broke his maiden at first asking going six furlongs at Aqueduct Racetrack Feb. 11.

On the opposite coast, fellow Preakness hopeful Term of Art also completed his last serious work before shipping to Baltimore. He worked six furlongs in 1:13 4/5 at Santa Anita Park Saturday.

Calumet Farm's Term of Art is slated to leave Santa Anita May 16 for the middle leg of the Triple Crown. The Doug O'Neill-trained son of Tiznow was most recently seventh in the Santa Anita Derby (G1) but captured the Cecil B. DeMille Stakes (G3) at Del Mar last November.


Trump signs cybersecurity executive order, mandating a move to cloud computing – GeekWire

Posted: at 6:21 pm

The White House plan to address cybersecurity is taking shape. (White House / Pho.to / GeekWire Graphic)

President Donald Trump today signed a long-awaited executive order aimed at beefing up cybersecurity at federal government agencies with a shift of computer capabilities to the cloud as a key part of the strategy.

"We've got to move to the cloud and try to protect ourselves instead of fracturing our security posture," Homeland Security Adviser Tom Bossert told reporters during a White House briefing.

The executive order gives the lead role in managing the cloud shift to the director of the White House's newly established American Technology Council, which is due to meet for the first time next month.

Although the council's full roster of members has not yet been announced, the director is said to be Chris Liddell, who formerly served as chief financial officer at Microsoft and General Motors.

Some agencies already have begun shifting data resources to cloud computing services, including Amazon Web Services and Microsoft Azure. Carson Sweet, CTO and co-founder of San Francisco-based CloudPassage, said the emphasis on the cloud makes sense and builds on a trend that began during the Obama administration.

"The question now will be how well the administration does with identifying and eliminating the obstructions agencies are facing as they consider adopting cloud / shared services," Sweet told GeekWire in an email.

The executive order also calls upon all federal agencies to implement the NIST Cybersecurity Framework, a set of best practices developed by the National Institute of Standards and Technology for the information technology industry. And it calls on Cabinet secretaries to develop plans to protect critical infrastructure, ranging from utilities to the health care system to the financial system.

Bossert said the measures build on the efforts made by the Obama administration. "A lot of progress was made in the last administration, but not nearly enough," he said.

As an example of past failures, Bossert pointed to 2015's data breach at the Office of Personnel Management, which exposed millions of sensitive employment records to hackers. He said such records are the "crown jewels" of the government's data assets and require enhanced protection.

Bossert noted that Trump's budget blueprint sets aside $1.5 billion for cybersecurity.

Back in January, Trump vowed to come up with a major report on hacking defense within 90 days, but some observers said the executive order didn't meet the target.

Drew Mitnick, policy counsel at Access Now, said in a statement that the measures "will serve as incremental changes to existing policies," while the Trump administration "has otherwise either ignored or undermined pressing digital security threats internet users face."

"The action does not touch several critical areas, like the insecurity of Internet of Things devices, data breaches, or vulnerability disclosure," Mitnick said.

During the briefing, one reporter asked whether shifting the federal government's data to the cloud might heighten rather than reduce cybersecurity risks. Bossert said it's better to centralize risk, rather than having 190 federal agencies come up with separate measures.

"I don't think that's a wise risk," Bossert said.

Another reporter asked whether concerns over Russia's online meddling with last year's presidential campaign had any effect on the executive order.

"The Russians are not our only adversary," Bossert replied. "The Russians, the Chinese, the Iranians, other nation-states are motivated to use cybersecurity and cyber tools to attack our people and our governments and their data. And that's something we can no longer abide."

He declined to say what type of cyber attack might constitute an act of war, other than to say that "if somebody does something to the United States of America that we can't tolerate, we will act."

Trump was reportedly on the verge of signing an executive order on cybersecurity back in January, but held off. Bossert said there was nothing unusual behind the delay. He noted that between then and now, the White House had the chance to lay out a budget blueprint and announce the formation of the technology council, two developments that set the stage for the executive order.

Bossert also acknowledged that some tech companies expressed concerns that they'd be compelled to take actions to head off distributed denial-of-service attacks, also known as botnet attacks. He emphasized today that the anti-botnet initiative would be voluntary.

The executive order calls on Commerce Secretary Wilbur Ross and Homeland Security Secretary John Kelly to file a preliminary report on the anti-botnet campaign within 240 days.

Bossert declined to confirm a claim that federal computers are hit by tens of thousands of hacking attempts daily, but he acknowledged that attempted data break-ins and successful intrusions are on the rise.

"The trend line is going in the wrong direction," he told reporters.

Correction for 1:50 p.m. PT May 13: An earlier version of this report incorrectly referred to Chris Liddell as the former chief technology officer of Microsoft and GM. He has served as chief financial officer for those and other companies.


Virtustream Adds Enterprise Cloud to Global Dell EMC Partner Program – Cloud Computing Intelligence (registration) (blog)

Posted: May 13, 2017 at 6:23 am

Dell EMC partner Technologent works with Virtustream to modernise IT for a global enterprise client

Virtustream, the enterprise-class cloud company and a Dell Technologies business, today announced that Virtustream Enterprise Cloud (VEC) has been added to the global Dell EMC Partner Program.

Dell EMC partners now have the opportunity to become Virtustream partners, selling Virtustream Enterprise Cloud and Virtustream Storage Cloud, under the auspices of the Dell EMC Partner Program. Virtustream Storage Cloud was the first solution made available to Dell EMC partners last December in order to help meet growing customer demand for enterprise-class cloud storage.

"Simplifying the process for our customers to obtain the perfect blend of Dell EMC and Virtustream solutions for their digital transformation projects is imperative," said Scott Millard, vice president, global channels and alliances, Virtustream. "Including Virtustream Enterprise Cloud in the Dell EMC Partner Program accomplishes this goal and arms the already strong Dell EMC partner ecosystem with cloud solutions that make a substantive impact on their digital transformation journey."

One Dell EMC partner, Technologent, has already tapped Virtustream to provide enterprise-class cloud solutions for its customer, a multibillion-dollar North American financial services firm that operates as a key component within a global conglomerate. Technologent, an Irvine, CA-based provider of technology solutions for Fortune 1000 enterprises, was tasked with helping this client to modernise its IT.

"Our client wanted several important benefits from its modernization project," said Marco Mohajer, executive vice president of sales and marketing, Technologent. "They wanted resources available on demand, a transition from a CAPEX model to an OPEX model to help the bottom line, and a faster time-to-market in responding to their 'born in the cloud' competitors. Virtustream is the perfect partner for both Technologent and our client in developing a hybrid, multi-cloud environment that made all of these goals a reality."

This project helped Technologent's client to:

This past February, Dell EMC announced a unified partner programme that combined the legacy Dell and EMC programs into a single, integrated organisation. Virtustream Storage Cloud and Virtustream Enterprise Cloud are a part of this combined ecosystem.


Microsoft launches Android app to manage its Azure cloud computing platform – Android Police

Posted: May 11, 2017 at 1:25 pm

Yesterday Microsoft launched a new Android app for its cloud computing platform Azure. Administrators using the service will probably find the convenience of being able to check on things with their Android devices helpful. It even provides notifications and alerts in the event of specific problems, making it that much easier for your work to find you at home.

If you don't know what Azure is, then this probably won't have much of an effect on you (wiki-hole, if you are curious), but it's basically a platform similar to Google's Cloud Platform and Amazon's AWS. Which is to say it gives you a bunch of hosting, platform as a service, and software as a service type stuff that is mostly used by businesses.

The new app will eventually give you access to a shell for your instances (it may be a placeholder for now) and it also allows you to check current statuses like resource use, health, and hardware monitoring, as well as check metrics being collected for things like requests. It will also provide notifications and alerts based on what you have set. It seems like a useful tool to let you know about emergencies or to check things quickly when you might be worried. Don't expect it to take over for the full management portal, though.
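
The statuses and metrics the app surfaces are also reachable from a script through Azure's management APIs. Below is a minimal sketch using the Azure SDK for Python (the azure-identity and azure-mgmt-resource packages); the subscription ID is a placeholder, and this illustrates the management plane generally rather than the calls the app itself makes.

```python
# Minimal sketch: enumerate Azure resources from a script, assuming the
# azure-identity and azure-mgmt-resource packages are installed and you are
# logged in (e.g. via the Azure CLI). The subscription ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # picks up CLI, env vars, or managed identity
client = ResourceManagementClient(credential, "<subscription-id>")

# Walk every resource group and print the resources inside it.
for group in client.resource_groups.list():
    print(group.name, group.location)
    for res in client.resources.list_by_resource_group(group.name):
        print("  ", res.type, res.name)
```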

If you're using any of Azure's services, the app might be worth checking out. Download it over at Google Play or via the widget below.


3 Cloud Computing Stocks To Buy Right Now – May 10, 2017 … – Zacks.com

Posted: at 1:25 pm

In a matter of just a few years, the Cloud has evolved from the new feature that your grandmother just can't quite seem to understand to one of the main factors driving growth in the technology sector. Cloud computing is now an essential focus for software-related companies, and cloud stocks have piqued the interest of many tech-focused investors.

New technologies and changing consumer behavior have changed the shape of the technology landscape, and an industry that was once centered on the personal computer has adapted to survive in the world of mobile computing and the Cloud. The markets have been paying attention, and some of the best tech stocks have been those that are either primarily cloud-based companies, or those that have shown growth in their cloud operations.

With this in mind, we've highlighted three stocks that are not only showing strong cloud-related activity, but also strong fundamental metrics. Check out these three cloud stocks to buy right now:

1. Adobe Systems (ADBE - Free Report)

Adobe Systems is a provider of graphic design, publishing, and imaging software for Web and print production. The company's main offering is its Creative Cloud, which is a software-as-a-service (SaaS) product that allows users to access all of Adobe's tools at one monthly price. The stock currently has a Zacks Rank #2 (Buy).

Within the last 60 days, we have seen at least one positive estimate revision for Adobe's current-quarter, next-quarter, full-year, and next-year earnings. Our consensus estimate for the quarter calls for EPS growth of 40% on sales growth of nearly 24%.

2. Five9, Inc. (FIVN - Free Report)

Five9 provides cloud software for contact centers. The company offers software products such as workforce management, speech recognition, predictive dialer, and voice applications, as well as an all-in-one contact center cloud platform. Currently, FIVN holds a Zacks Rank #2 (Buy).

Five9 is still a loss-making company, but it recently surpassed our Zacks Consensus Estimate by 40%, and we've seen four positive revisions for its full-year earnings within the last week. With sales projected to grow by nearly 20% this year and the stock gaining more than 25% in 12 weeks, Five9 has earned A grades for both Growth and Momentum.

3. VMWare, Inc. (VMW - Free Report)

VMWare provides cloud and virtualization software and services. Its solutions enable organizations to aggregate multiple servers, storage infrastructure, and networks together into shared pools of capacity that can be allocated dynamically, securely and reliably to applications as needed, increasing hardware utilization and reducing spending. The stock is currently a Zacks Rank #2 (Buy).

Despite its long history, VMWare is still growing its earnings, and our current consensus estimates call for EPS growth of nearly 15% this quarter. The stock has been on an impressive run, gaining more than 20% year-to-date. Its P/E ratio, ROE, and Net Margin all out-perform the industry average, and it could be on the cusp of breaking into a new range as it nears its 52-week high.

Bottom Line

Cloud-based companies have been some of the best performing stocks in the tech sector this year, and these cloud stocks also boast strong fundamental metrics. If you're looking to add tech stocks to your portfolio right now, this list is probably a good place to start.

Want more stock market analysis from this author? Make sure to follow @Ryan_McQueeney on Twitter!


You really should know what the Andrew File System is – Network World

Posted: at 1:25 pm

By Bob Brown, News Editor, Network World | May 10, 2017 2:20 PM PT


When I saw that the creators of the Andrew File System (AFS) had been named recipients of the $35K ACM Software System Award, I said to myself "That's cool, I remember AFS from the days of companies like Sun Microsystems... just please don't ask me to explain what the heck it is."

Don't ask my colleagues either. A quick walking-around-the-office survey of a half dozen of them turned up mostly blank stares at the mention of the Andrew File System, a technology developed in the early 1980s and named after Andrew Carnegie and Andrew Mellon. But as the Association for Computing Machinery's award would indicate, AFS is indeed worth knowing about as a foundational technology that paved the way for widely used cloud computing techniques and applications.


Mahadev "Satya" Satyanarayanan, a Carnegie Mellon University Computer Science professor who was part of the AFS team, answered a handful of my questions via email about the origins of this scalable and secure distributed file system, the significance of it, and where it stands today. Satyanarayanan was recognized by ACM along with John Howard, Michael Leon Kazar, Robert Nasmyth Sidebotham, David Nichols, Sherri Nichols, Alfred Spector and Michael West, who worked as a team via the Information Technology Center partnership between Carnegie Mellon and IBM (the latter of which incidentally funded this ACM prize).

Is there any way to quantify how widespread AFS use became and which sorts of organizations used it most? Any sense of how much it continues to be used, and for what?

Over a roughly 25-year timeframe, AFS has been used by many U.S. and non-U.S. universities. Many national labs, supercomputing centers and similar institutions have also used AFS. Companies in the financial industry (e.g., Goldman Sachs) and other industries have also used AFS. A useful snapshot of AFS deployment was provided by the paper "An Empirical Study of a Wide-Area Distributed File System" that appeared in ACM Transactions on Computer Systems in 1996. That paper states:

"Originally intended as a solution to the computing needs of the Carnegie Mellon University, AFS has expanded to unite about 1000 servers and 20,000 clients in 10 countries. We estimate that more than 100,000 users use this system worldwide. In geographic span as well as in number of users and machines, AFS is the largest distributed file system that has ever been built and put to serious use."

Figure 1 in that paper shows that AFS spanned 59 educational cells, 22 commercial cells, 11 governmental cells, and 39 cells outside the United States at the time of the snapshot. In addition to this large federated multi-organization deployment of AFS, there were many non-federated deployments of AFS within individual organizations.

What has been AFS's biggest impact on today's cloud and enterprise computing environments?

The model of storing data in the cloud and delivering parts of it via on-demand caching at the edge is something everyone takes for granted today. That model was first conceived and demonstrated by AFS, and is perhaps its biggest impact. It simplifies management complexity for operational staff, while preserving performance and scalability for end users. From the viewpoint of end users, the ability to walk up to any machine and use it as your own provides enormous flexibility and convenience. All the data that is specific to a user is delivered on demand over the network. Keeping in sync all the machines that you use becomes trivial. Users at organizations that deployed AFS found this an addictive capability. Indeed, it was this ability that inspired the founders of DropBox to start their company. They had used AFS at MIT as part of the Athena environment, and wanted to enable at wider scale this effortless ability to keep in sync all the machines used by a person. Finally, many of the architectural principles and implementation techniques of AFS have influenced many other systems over the past decades.
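
The caching-with-callbacks model Satyanarayanan credits for that scalability is easy to see in miniature. The toy Python sketch below uses invented Server and Client classes; it is not AFS code, but it shows why a callback promise lets a client trust its cache without re-validating on every read.

```python
# Toy sketch of AFS-style whole-file caching with callbacks (illustrative only).
class Server:
    def __init__(self):
        self.files = {}        # path -> contents
        self.callbacks = {}    # path -> clients holding a callback promise

    def fetch(self, client, path):
        # Hand out the whole file plus a promise to notify the client on change.
        self.callbacks.setdefault(path, set()).add(client)
        return self.files[path]

    def store(self, path, data):
        self.files[path] = data
        # Break callbacks: every caching client learns its copy is now stale.
        for client in self.callbacks.pop(path, set()):
            client.invalidate(path)

class Client:
    def __init__(self, server):
        self.server = server
        self.cache = {}

    def read(self, path):
        # A cached copy stays valid until the server breaks the callback, so
        # repeat reads generate no validation traffic; this is key to AFS scale.
        if path not in self.cache:
            self.cache[path] = self.server.fetch(self, path)
        return self.cache[path]

    def invalidate(self, path):
        self.cache.pop(path, None)
```

A client's second read of an unchanged file touches only its local cache; a store on the server invalidates every outstanding copy at once.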

How did AFS come to be created in the first place?

In 1982, CMU and IBM signed a collaborative agreement to create a "distributed personal computing environment" on the CMU campus that could later be commercialized by IBM. The actual collaboration began in January 1983. A good reference for information about these early days is the 1986 CACM paper by [James H.] Morris et al. entitled "Andrew: A Distributed Personal Computing Environment". The context of the agreement was as follows. In 1982, IBM had just introduced the IBM PC, which was proving to be very successful. At the same time, IBM was fully aware that enterprise-scale use of personal computing required the technical ability to share information easily, securely, and with appropriate access controls. This was possible in the timesharing systems that were still dominant in the early 1980s. How to achieve this in the dispersed and fragmented world of a PC-based enterprise was not clear in 1982. A big part of the IBM-CMU collaborative agreement was to develop a solution to this problem. More than half of the first year of the Information Technology Center (1983) was spent in brainstorming on how best to achieve this goal. Through this brainstorming process, a distributed file system emerged by about August 1983 as the best mechanism for enterprise-scale information sharing. How to implement such a distributed file system then became the focus of our efforts.

What would the AFS creators have done differently in building AFS if they had to do it over again?

I can think of at least two things: one small and one big.

The small thing is that the design and early evolution of AFS happened prior to the emergence of [network address translation (NAT)]-based firewalls in networking. These are in widespread use today in homes, small enterprises, etc. Their presence makes it difficult for a server to initiate contact with a client in order to establish a callback channel. If we had developed AFS after the widespread use of NAT-based firewalls, we would have carefully rethought how best to implement callbacks in the presence of NAT firewalls.

The bigger thing has to do with the World Wide Web. The Mosaic browser emerged in the early 1990s, and Netscape Navigator a bit later. By then AFS had been in existence for many years, and was in widespread use at many places. Had we realized how valuable the browser would eventually become as a tool, we would have paid much more attention to it. For example, a browser can be used in AFS by using "file://" rather than "http://" in addresses. All of the powerful caching and consistency-maintenance machinery that is built into AFS would then have been accessible through a user-friendly tool that has eventually proved to be enormously valuable. It is possible that the browser and AFS could have had a much more symbiotic evolution, as HTTP and browsers eventually did.

Looks like maybe there are remnants of AFS alive in the open source world?

Indeed. OpenAFS continues to be an active open source project. Many institutions (including CMU) continue to use AFS for production use, and this code is now based on OpenAFS.

Also, my work on the Coda File System forked off from the November 1986 version of AFS. Coda was open-sourced in the mid-1990s. That code base continues to be alive and functional today. Buried in Coda are ideas and actual code from early AFS.

Do any of you have any spectacular plans for what you'll do with the prize money?

Nothing concrete yet. We have discussed possibly donating the funds to a charitable cause.


Oracle launches cloud computing service for India | Business Line – Hindu Business Line

Posted: at 1:25 pm

New Delhi, May 10:

Technology giant Oracle has launched its cloud computing service for India, which aims to support the government's GST rollout in July and plans to open data centres in the country.

Addressing a gathering of 12,000 attendees at Oracle OpenWorld, which included technology partners, analysts and government leaders such as Maharashtra Chief Minister Devendra Fadnavis, CEO Safra Catz said that India is at an "amazing moment" in terms of sociological factors such as a high concentration of youth, as well as the government's efforts to make greater use of technology.

"My last visit with PM Narendra Modi changed the way I looked at India and my views about the country," she said. Oracle has been present in India for more than two decades, providing database technology that forms the backbone for many commercial transactions.

It was during that visit that Modi urged Catz to do more for Indian citizens, which could unleash the power of people's ideas, Catz said.

In an effort to simplify the tax regime in India and to ensure higher compliance with tax laws, Oracle's ERP solution aims to provide support for GST Network integration, statutory reporting, payment processing, among others. In the GST regime, companies will have to upgrade their ERP systems.

'SARAL GST'

Apart from Oracle, a week ago Reliance Corporate IT Park, a subsidiary of Reliance Industries Ltd, signed an MoU with Oracle rival SAP to launch SARAL GST, a solution for taxpayers to be GST compliant and access the government's GST System.

Analysts welcomed this move. "It will open up opportunities for software and hardware but the core theme should be on simplification which would benefit an end user," said Sanchit Vir Gogia, CEO, Greyhound Research.

State governments in India are also adopting Oracle's solutions. The Maharashtra Government has a partnership with Oracle. Additionally, the Jharkhand Government and Oracle have signed an MoU to improve citizen services with an aim to make Jharkhand an attractive destination for start-ups.

The MoU was signed at Oracle OpenWorld and Oracle will offer its support to the state through its portfolio of technology solutions, including Oracle Cloud. These solutions cater to the growing requirements and expectations of citizens, businesses and government departments for smarter, transparent and efficient governance within the state of Jharkhand, company executives said.

Earlier this year, Jharkhand had received investment support from the Union Government for an internal venture capital fund to support start-ups in the state.

Further as part of the MoU, Oracle and the Government of Jharkhand will collaborate to create proof of concepts and help new start-ups using Oracle Cloud-based platforms to operationalise citizen services and start-up centres.

Adopters of technology among small businesses have some inherent advantages. A November 2016 Kantar IMRB survey of 504 Indian SMBs found that those that adopt digital technologies grow profits up to two times faster than offline SMBs.

The report found that 51 per cent of digitally enabled SMBs sell beyond city boundaries compared with 29 per cent of offline small businesses.

(This article was published on May 10, 2017)


Microsoft is on the edge: Windows, Office? Naah. Let’s talk about cloud, AI – The Register

Posted: at 1:25 pm

Build At the Build 2017 developer conference today, Microsoft CEO Satya Nadella marked a Windows milestone, 500 million monthly active users, and proceeded to say very little about Windows or Office.

Instead he, along with Scott Guthrie, EVP of the Microsoft Cloud and Enterprise Group, and Harry Shum, EVP of Microsoft's Artificial Intelligence and Research group, spent most of their time on stage, in Seattle, talking about Azure cloud services, databases, and cross-platform development tools.

Arriving on stage to give his keynote address, Nadella in jest said that he thought it would be an awesome idea on such a sunny day "to bring everyone into a dark room to talk about cloud computing."

Office and Windows can wait.

Microsoft watchers may recall that its cloud-oriented businesses have been doing well enough to deserve the spotlight. In conjunction with the company's fiscal second quarter earnings report in January, the Windows and Office empire revealed that Azure revenue grew 93 per cent year-on-year.

During a pre-briefing for the press on Tuesday, Microsoft communications chief Frank Shaw described "a new worldview" for the company framed by the "Intelligent Edge" and the "Intelligent Cloud."

Nadella described this newborn weltanschauung as "a massive shift that is going to play out in the years to come."

He mused about a software-based personal assistant to illustrate his point. "Your personal digital assistant, by definition, will be available on all your devices," he said, to make the case that the centralized computing model, client and server, has become outmoded. Data and devices are dispersed.

In other words, all the data coming off connected devices requires both local and cloud computing resources. The revolution will not be centralized.

That could easily be taken as reheated Cisco frothing about the explosive growth of the Internet of Things and bringing processing smarts to the edge of the network. But Microsoft actually introduced a new service that fit its avowed vision.

Microsoft's bipolar worldview, the Intelligent Edge and the Intelligent Cloud, manifests itself in a novel "planet scale" database called Azure Cosmos DB. It's a distributed, multi-model database, based on the work of Microsoft researcher Leslie Lamport, that promises to make data available locally, across Microsoft's 34 regions, while also maintaining a specified level of consistency across various instances of the data.
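
For a sense of what "a specified level of consistency" means to a developer, here is a minimal, hypothetical sketch using the azure-cosmos package for Python (which postdates the Build 2017 announcement); the endpoint, key, and names are placeholders. Cosmos DB exposes five tunable levels, from strong through session to eventual consistency.

```python
# Minimal sketch: write to Cosmos DB with an explicit consistency level,
# assuming the azure-cosmos package. Endpoint, key, and names are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    "https://<account>.documents.azure.com:443/",
    credential="<account-key>",
    consistency_level="Session",  # one of the five tunable levels
)

db = client.create_database_if_not_exists("demo")
container = db.create_container_if_not_exists(
    "readings", partition_key=PartitionKey(path="/deviceId")
)
container.upsert_item({"id": "r1", "deviceId": "sensor-42", "value": 21.5})
```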

An Intelligent Meeting demonstration, featuring Cortana, showed how AI has the potential to exchange and coordinate data across multiple services. But "potential" requires developer work: it will take coding to create the Cortana Skills necessary to connect the dots and manage the sort of cross-application communication that knowledge workers accomplish today through application switching, copying, and pasting.

Conveniently, the Cortana Skills Kit is now in public preview, allowing developers to extend the capabilities of Microsoft's assistant software to devices like Harman Kardon's Invoke speaker.

Beyond code, it will take data associated with people and devices in an organization to make those connections. That's something Microsoft, with its Azure Active Directory, its Graph, and LinkedIn, has in abundance.

A demonstration of real-time image recognition to oversee a construction worksite showed how a capability like image recognition might be useful to corporate customers. Cameras spotted unauthorized people and located requested equipment on-site. It looked like something companies might actually find useful.

Artificial intelligence as a general term sounds like naive science fiction. But as employed by Microsoft, it refers to machine learning frameworks, natural language processing, computer vision, image recognition or the like.

"We believe AI is about amplifying human ingenuity," said Shum.

Microsoft's concern is convincing developers and corporate clients to build and adopt AI-driven applications using Microsoft cloud computing resources, rather than taking their business to AWS or Google Cloud Platform.

One way Microsoft hopes to achieve that is by offering cloud computing outside the cloud, on endpoints like IoT devices. The company previewed a service called Azure IoT Edge to run containerized functions locally. It's a way of reducing latency and increasing responsiveness, which matters for customers like Sandvik.

The Swedish industrial automation biz has been testing Azure IoT Edge to anticipate equipment failure in its workplace machines, in order to shut them down before components break, causing damage and delays.
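
Azure IoT Edge packages such logic as containerized modules, and Sandvik's actual code is its own. As a purely illustrative stand-in, the Python sketch below shows the kind of lightweight local anomaly check that makes sense to run at the edge, where a shutdown decision cannot wait on a cloud round-trip; the window size and threshold are arbitrary.

```python
# Toy sketch of a local anomaly check an edge module might run (illustrative
# only; not Sandvik's or Microsoft's code). Flags readings far from a rolling
# baseline so a machine can be stopped without a cloud round-trip.
from collections import deque
from statistics import mean, stdev

WINDOW = 60        # samples kept for the rolling baseline
THRESHOLD = 3.0    # flag readings more than 3 sigma from the baseline

history = deque(maxlen=WINDOW)

def check_reading(vibration_mm_s: float) -> bool:
    """Return True if the machine should be shut down for inspection."""
    if len(history) >= 2:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(vibration_mm_s - mu) > THRESHOLD * sigma:
            return True    # anomalous spike; do not fold it into the baseline
    history.append(vibration_mm_s)
    return False
```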


IBM touts its cloud platform as quickest for AI with benchmark tests – Cloud Tech

Posted: at 1:25 pm

IBM claims it has the fastest cloud for deep learning and artificial intelligence (AI) after publishing benchmark tests which show NVIDIA Tesla P100 GPU accelerators on the IBM Cloud can provide up to 2.8 times more performance than the previous generation in certain cases.

The performance gains, IBM argues, will enable organisations to create advanced AI applications on the cloud more quickly. Deep learning techniques are a key driver behind the increased demand for, and sophistication of, AI applications, the company noted. However, training a deep learning model to do a specific task is a compute-heavy process that can be time- and cost-intensive.

IBM purports to be the first of the large cloud providers to offer NVIDIA Tesla P100 GPUs. Separate tests were carried out, first by IBM engineers and then by cloud simulation platform provider Rescale. For the IBM tests, engineers trained a deep learning model for image classification using two NVIDIA P100 cards on Bluemix bare metal, before comparing the same process to two Tesla K80 GPU cards.
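
IBM has not published its benchmark harness here, but the shape of such a comparison is simple: run an identical training loop on each GPU configuration and time it. The sketch below is a hypothetical stand-in using PyTorch with a synthetic image batch; the model, batch size, and step count are arbitrary choices, not IBM's.

```python
# Minimal sketch: time the same training step on whatever GPU is present,
# assuming PyTorch and CUDA. Not IBM's harness; all sizes are arbitrary.
import time
import torch
import torch.nn as nn

device = torch.device("cuda")
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 10),                           # 10-way image classifier
).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(64, 3, 224, 224, device=device)  # synthetic image batch
y = torch.randint(0, 10, (64,), device=device)

torch.cuda.synchronize()                         # drain queued GPU work first
start = time.time()
for _ in range(100):                             # 100 identical training steps
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
torch.cuda.synchronize()                         # wait for the GPU to finish
print(f"seconds per step: {(time.time() - start) / 100:.4f}")
```

Run the script once per machine (a K80 box versus a P100 box, say) and compare the per-step times.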

The second performance benchmark, from Rescale, also recorded reduced deep learning training times, based on its ScaleX platform, which features capabilities for deep learning software as a service (SaaS).

"Innovation in AI is happening at a breakneck speed thanks to advances in cloud computing," said John Considine, IBM general manager for cloud infrastructure services, in a statement. "As the first major cloud provider to offer the NVIDIA Tesla P100 GPU, IBM Cloud is providing enterprises with accelerated performance so they can quickly and more cost-effectively create sophisticated AI and cognitive experiences for their end users."

Another cloud vendor utilising NVIDIA's Tesla P100 GPUs, although not at the same scale as IBM, is Tencent, which made the announcement back in March. As this publication noted at the time, virtually every major cloud player is an NVIDIA customer of some sort, including Amazon Web Services (AWS), Google, and Microsoft.

You can find out more about the IBM tests here.


Enterprise-owned data centres still ‘essential’ despite cloud growth, research notes – Cloud Tech

Posted: at 1:25 pm

Enterprises may be starting to move workloads to the cloud, but enterprise-owned data centres remain the primary compute venue with workloads staying consistent over the past three years, according to new research from the Uptime Institute.

The study, which polled more than 1,000 data centre and IT professionals globally, argues that enterprises continue to see the data centre as essential to their digital-centric strategies, with the majority of budgets increasing or staying consistent through 2017.

Respondents reported that nearly two thirds of their IT assets were currently deployed in their own data centres. A further 22% were deployed with colocation or multi-tenant data centre providers, and only 13% in the cloud.

Despite this, more than two thirds (68%) of companies polled say they rely on IT-based resiliency, using live application failover across multiple, geographically distributed data centres in the event of an outage. An overwhelming majority (90%) said their company's management was more concerned about outages than at this time last year.

"The survey findings reflect several key trends that are acting together as a powerful catalyst for change within the industry," said Matt Stansberry, senior director of content and publications at Uptime Institute. "Increased performance at the processor level, further expansion of server virtualisation, and the adoption of cloud computing have all created an IT foundation that differs greatly from those seen just five years ago. Through this change, enterprise-owned data centres have remained a central component."

"We urge data centre and IT professionals to focus on the business aspects of running their IT foundation, creating sets of repeatable processes to make it work efficiently and adopting new technologies and solutions when the business demands it," added Stansberry.

You can find out more about the results here.
