How the DOD Plans to Bring Its JEDI Cloud Contract to Life – FedTech Magazine

On Oct. 25, the Defense Department awarded its Joint Enterprise Defense Infrastructure cloud contract to Microsoft, capping a yearslong effort to start deploying commercial cloud capabilities for Infrastructure as a Service and Platform as a Service. That was just the beginning.

Senior Microsoft leaders including CEO Satya Nadella, Toni Townes-Whitley (one of FedTech's 30 Federal IT Influencers Worth a Follow in 2019), Jason Zander, Tom Keane and Mark Russinovich were scheduled to meet with Pentagon officials last week to lay the foundation for working together, Nextgov reports. Nadella and members of the Microsoft Azure and public sector teams were to meet with DOD CIO Dana Deasy and other senior DOD technology leaders from Dec. 11-13 as part of "requisite activities to prepare the cloud environment," the department confirmed to Nextgov.

"The Department of Defense is confident in the JEDI Cloud Contract award and remains focused on getting this critical capability into the hands of our warfighters as quickly and efficiently as possible," DOD spokeswoman Elissa Smith told Nextgov. "The department's Cloud Computing Program Office continues to work with Microsoft to prepare the JEDI Cloud environment."

DOD has chosen 14 entities to act as pathfinders for the cloud's capabilities. Deasy says those Pentagon components, which include the U.S. Special Operations Command, the U.S. Transportation Command and the Joint Artificial Intelligence Center, will be the first to use JEDI on a more tactical level, according to Federal News Network.

"These early adopters have unique missions that are more than just using JEDI for base compute and raw storage capacity, and want to do real, unique platform-as-a-service opportunities on top of that," Deasy said Thursday at the Northern Virginia Chapter of AFCEA's Air Force IT Day in Arlington, Va. "The variety of the early adopters allows us to test various principles on JEDI, from the tactical edge all the way to the top secret needing to use the cross domain."

Deasy indicated the pathfinder components will be able to learn quickly what it takes to go from the strategic vision to standing up and bringing JEDI capabilities to life, Federal News Network reports.

Shortly after the contract was awarded, Peter Ranks, a deputy CIO at the DOD, told reporters after speaking at a Professional Services Council event that awarding JEDI was a prerequisite to faster software development. DOD must modernize the way it builds software as it shifts to commercial cloud infrastructure.

"If we get modern cloud infrastructure but don't modernize the way we build software, we will not achieve the promises of cloud computing. We want software capabilities in the hands of warfighters faster," Ranks said during his Vision Federal Market Forecast keynote, according to MeriTalk. "We want software that can adjust to changing requirements or the changing dynamics of the battlefield more quickly. That is what's driving our cloud strategy."

"We have fooled ourselves into thinking that if we can just hire the cloud provider, it will solve all those problems. Hiring the cloud provider wasn't supposed to be the hard part," Ranks added. "Like any other weapons system, mastery of the weapons system is really where the challenge comes in."

The DOD has struggled to run applications, such as the Global Command & Control System Joint, across all combatant commands because the cloud infrastructures of the Army, Air Force and Navy vary widely. Ranks said JEDI aims to fix such issues.

"What we need is a focused effort to make sure that we have a provider that is filling the gaps in that current multi-cloud solution," he said, according to FedScoop. "For all the cloud providers we have today, they still haven't solved those problems of classification, tactical edge and something that is common across the enterprise."

Court documents indicate that the DOD has agreed not to proceed with performance under JEDI until at least Feb. 11, aside from initial preparatory activities, pending a lawsuit Amazon Web Services filed in the U.S. Court of Federal Claims over the award, according to Nextgov.

Microsoft says it is pushing ahead with its work on the contract. "As the selected contractor to support [the Defense Department] in its mission to modernize its enterprise cloud, we are diligently working with the Cloud Computing Program Office to bring this critical new technology to our men and women in uniform," a Microsoft spokesperson told Nextgov.

Cymatic Named Most Promising Start-Up Finalist in 2019-20 Cloud Awards – Business Wire

RALEIGH, N.C.--(BUSINESS WIRE)--Cymatic today announced that it has been named a finalist for the 2019-2020 Most Promising Start-Up award in the international Cloud Computing Awards program, The Cloud Awards.

Since 2011, The Cloud Awards program has sought to champion excellence and innovation in cloud computing across multiple industry sectors. Cymatic's inclusion as a finalist for the year's most promising start-up is a testament to its next-generation, all-in-one web application defense platform, which moves protection from the network side to the client to defend against today's most sophisticated client- and browser-based attacks.

"To be shortlisted for our revolutionary work in web application defense is not only an honor, but a clear recognition of our early success in leading secure cloud technologies," said Cymatic Founder and Chief Executive Jason Hollander. "We offer the only unified WAF that deploys at the client through a simple line of JavaScript, without agents or proxies, to deliver first-look, first-strike capability that is earliest in the kill chain, a fundamental shift from traditional approaches to security."

The Cymatic platform provides universal in-session visibility and control to reduce risk across web applications, networks and users, while decreasing network traffic loads and eliminating user friction. Instead of just protecting against network-based threats like traditional WAFs, Cymatic uses sophisticated artificial intelligence and machine-learning algorithms to identify page mutations and user anomalies. The platform protects against user-derived and device-based threats such as poor credential hygiene, dark web vulnerabilities and potentially risky devices. It is invisible and frictionless to users, deploys in mere minutes and delivers immediate time-to-value.

Head of Operations for the Cloud Awards, James Williams, said: "Simply, Cymatic has recognized the importance of adopting and pioneering leading cloud technologies in order to deliver outstanding client success, which is why they're a deserving finalist in the Cloud Awards program. The Cloud Awards team already had a near-impossible task sorting the exceptional from the excellent and the bleeding-edge from the cutting-edge. Weighing both proven successes and exceptional promise across several unique categories is a constant challenge."

Cymatic was selected from hundreds of organizations that entered from the Americas, Australia, Europe and the Middle East. Its advanced cloud-based microservices and real-time message-bus architecture offer unparalleled scale, providing the resilience necessary to validate and process the millions of users and transactions that touch web properties every second. The platform is engineered to work across all web applications regardless of operating system, browser or device. It eliminates the OWASP Top 10, bots, CAPTCHAs, dark web threats, forced MFA, shared accounts, IP threats, device vulnerabilities and other cloud-based threats with no erosion of the user experience.

Category winners will be announced on Thursday, January 30, 2020.

About the Cloud Awards

The Cloud Awards is an international program which recognizes and honors industry leaders, innovators and organizational transformation in cloud computing. The awards are open to large, small, established and start-up organizations from across the entire globe, with an aim to find and celebrate the pioneers who will shape the future of the Cloud as we move into 2020 and beyond. The Cloud Awards currently offers two awards programs, the Cloud Computing Awards and the Software-as-a-Service Awards.

Categories for the Cloud Computing Awards include Most Promising Start-Up, Best SaaS, and Best in Mobile Cloud Solution. Finalists were selected by a judging panel of international industry experts. For more information about the Cloud Awards, please visit https://www.cloud-awards.com/.

About Cymatic

Cymatic offers the only all-in-one web application security platform that deploys at the client rather than the network, for improved efficacy and ease and speed of deployment. Cymatic's first-look, first-strike capability is earliest in the kill chain, reducing risk across applications, networks and users while decreasing network traffic loads. The solution is invisible and frictionless to users and deploys in minutes. Organizations that have web applications rely on Cymatic for real-time visibility and control to protect their web properties from OWASP Top 10 threats, including credential-, user- and device-based threats. Cymatic is headquartered in Raleigh, N.C., with offices in California and New York. Learn more at cymatic.io and follow Cymatic on Twitter and LinkedIn.

Private cloud evolves as public cloud steals the industry spotlight – CIO Dive

With the race to the public cloud, private clouds at first glance appear left behind.

Eighty-four percent of enterprises have a multicloud strategy, according to Flexera's Rightscale 2019 State of the Cloud Report. Within that 84%, 9% of respondents use multiple private clouds; 17% use multiple public clouds; and 58% use hybrid cloud.

Only 3% of enterprises use a single private cloud.

"We're seeing the number of workloads move over to the public cloud, so that certainly is creating pressure on the whole world of private clouds," Kim Weins, VP of cloud strategy at Flexera, told CIO Dive in an interview.

But private clouds aren't disappearing. Instead, they're evolving to fit into a hybrid and multicloud world, whether by housing sensitive and secure data while other functions are spun up in the public cloud, or by helping digital-first companies house and analyze massive amounts of data.

Here's how private clouds are changing to stay in the game.

A true private cloud is one that is not only used on site, but one that an enterprise builds and maintains.

That takes a lot of work and resources to keep running, said Weins, which is why it's not a popular choice.

But private clouds are still preferred by enterprises with strict security requirements and/or in highly regulated industries.

Private clouds "tend to be a little more constraining, but there's an exclusivity of it. They get to custom design the management of the architecture, and they tend to have control on how much computing resources they put at it," Keith Renison, senior product marketing manager at SAS, told CIO Dive in an interview. That can also lead to better performance.

Some enterprises that say they use private cloud aren't really running true private clouds either, Scott Sinclair, senior analyst at the Enterprise Strategy Group (ESG), told CIO Dive in an interview.

"Many organizations view themselves as having some sort of private cloud" even if they're leveraging software like VMware, which provides cloud computing services, he said. "It's being equated with private cloud."

Whether enterprises are building a private cloud from the ground up, or using software to maintain the cloud on premises, almost no one exclusively uses private cloud.

When clients want to scale, they are more likely to push non-secure jobs into the public cloud, Renison said. "There tends to be an evolution from on-premises, moving into private cloud where appropriate, and then there's been a massive adoption of public cloud."

Private clouds are also still used by companies that "are a little bit behind the curve and haven't quite embraced public cloud," Weins added.

While public cloud has been the cloud of choice for young companies, Sinclair said they're moving to hybrid setups too.

ESG has seen "a lot of movement from the public cloud providers trying to bring that cloud experience down," he said. That challenges the idea that private cloud is going to go away.

Public cloud providers are getting in on the private cloud market too. "We've also seen public cloud providers trying to offer private cloud options," Weins said, with the strongest interest in Azure Stack and AWS Outposts, though adoption is "still fairly early."

Private cloud has also been a resource for young, digital companies "that believe data is their business," Sinclair said. "They consider themselves cloud first. New workloads come their way and they think about cloud."

With massive amounts of data and quick growth, these companies are expanding in both private and public clouds.

"We're seeing companies that say 'my data is growing more than 50% year-over-year,'" he said. "That fuels the hybrid demand, at least for the near term."

Cloud Computing and Security: The Risks You Need to Know – TechDecisions

Cloud computing has become a valuable and increasingly popular approach to digital technology that includes on-demand self-service, broad network access, software as a service and much more. As businesses continue to explore cloud options for a number of applications, it's critical that they assess and align specific needs with the appropriate cloud vendor and service early in the cloud transition. Misaligning them, or underestimating cloud computing risks, can spell trouble.

Cloud platforms and services, available from technology giants including IBM, Google, Amazon, Salesforce, SAP and Oracle, provide formats of IT service that offer users significant advantages over old-school, on-premises data centers.

For example, the user's capital costs are lower and better matched to actual consumption, and no hardware or software installations are required. Cloud-based IT infrastructure provides customers with rapid access to computing power whenever it's needed.

Beyond that, the most significant misconception is that cloud services are protected 24/7 by armies of security experts, making them virtually bullet-proof. That vision of an impregnable fortress, safeguarding client data against all adversaries, is comforting, but it's misguided.

Using cloud services actually carries its own set of risks: risks that are unique to the cloud provider's own operating environment, as well as other risks associated with traditional data centers.

Clients who don't recognize those risks and accept their own responsibilities for mitigating them are almost as likely to experience data loss and compromise as they were before migrating to cloud operations.

Understanding and managing these shared cloud computing risks is key to successfully utilizing a cloud service. And it is equally important to recognize the cloud is not a monolithic concept; clouds vary both in who can use them and in what they do.

For one thing, just as in meteorological cloud formations, there are also different computing cloud configurations.

They include private clouds, which are hosted internally and used by a single organization; public clouds, which are commercial ventures available to the general public; community clouds, which are only accessible to specific groups of users; and hybrid clouds, which include elements of two or more such arrangements.

Cloud platforms and services are owned and operated by different companies, each with their own policies, prices, and resources.

There are also differences in the types of computing services they offer. Infrastructure as a Service, or IaaS, provides user access to computing resources (servers, storage, networking and so on) that are owned and operated by the provider.

Platform as a Service, or PaaS, provides user access to the operating software and services needed to develop new applications.

The third and most popular cloud operations product is Software as a Service, or SaaS, which gives users direct access to the provider's software applications.

Once they're migrated to the cloud, for example, client organizations lose a good amount of visibility and control over their assets and operations.

The monitoring and analysis of information about the company's applications, services, data and users never loses importance, but it will need to take a different form than it did when the client's own network monitoring and logging procedures were in place.

Before a client's data ever gets to the cloud, it travels across the internet. Unless the user's network and internet channel are secure, powered by strong authentication standards and encrypted data, information in transit is susceptible to exposure.
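
The kind of in-transit protection described above can be sketched with Python's standard library. This is a minimal illustration, not any particular cloud provider's API: the host name and port are placeholders, and the sketch simply shows that certificate verification and encryption should be in place before any data is sent.

```python
import socket
import ssl

# create_default_context() enables certificate verification and hostname
# checking by default, which is the baseline for protecting data in transit.
context = ssl.create_default_context()

def secure_send(host: str, port: int, data: bytes) -> str:
    """Send data only over a verified TLS channel; returns the TLS version used."""
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(data)
            return tls.version()  # e.g. 'TLSv1.2' or 'TLSv1.3'
```

If the certificate does not validate, `wrap_socket` raises an error and nothing is transmitted, which is exactly the failure mode you want.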

Vulnerabilities in shared servers and system software used by public clouds to keep the data of multiple tenants separate can be exploited, enabling an attacker to access one organizations data via a separate organization and/or user.

Permanently removing sensitive data that a client wants securely deleted is difficult to confirm because of the reduced visibility inherent in cloud operations, which frequently includes data distributed over an assortment of storage devices. Any residual data can become available to attackers.

If a cloud service provider goes out of business or fails to meet your business and/or security needs, transferring data from that operator to another can be more costly in terms of time, effort and money than it was to initially become a subscriber. Additionally, each providers non-standard and proprietary tools can complicate data transfer.

Cloud operations are complicated by their technology, policies and implementation methods. This complexity requires the client's IT staff to learn new ways of handling their information, because as complexity grows, so does the potential for a data breach.

Insider abuse has the potential to inflict even greater damage on the client's data than it did before, due to the cloud's ability to provide users with more access to more resources. Depending on your cloud service, the forensic capabilities needed to trace and detect a malicious insider may not be available.

The loss of stored data, whether due to an accidental deletion or a physical catastrophe such as a fire or earthquake, can be permanent. A well-thought-out data recovery strategy needs to be in place, and the client and service provider must work together to establish a secure and effective process.

Managing user identities, which means carefully controlling users' identity attributes and regulating their privileged access, remains as challenging a task in cloud operations as it ever was in on-premises environments. Due to the nature of cloud services, the challenge in some cases can be much greater.

Providing appropriate levels of secure access for different user roles, such as employees, contractors and partners, is critical to protecting your cloud environment, making identity governance a high priority when migrating to the cloud. Cloud computing and security should constantly be thought of as joined concepts, not separate silos.

Cloud operations provide a variety of valuable avenues to exploit. And while the childlike faith that cloud platforms and services are immune to malicious attacks may be touching, it's simply not true. Vigilance is equally important, if not more so, than it was before migrating to the cloud.

Year in Review: PwC – Gigabit Magazine – Technology News, Magazine and Website

By William Smith | Dec 12, 2019, 6:46AM

Back in September's issue, Gigabit spoke with professional services firm PricewaterhouseCoopers (PwC) about the digital transformation expertise it provides in areas such as cloud computing and cybersecurity.

Sub Mahapatra, Management Consultant at PwC's New York network, guided us through his vision of digital transformation as existential for business. The firm can guide companies that are unclear on how best to achieve a digital transformation, but no two of PwC's clients are the same.

"If we have a partnership in place, then we're good," said Mahapatra. "They can bring us into a room and talk about what they want to do, and how we can help them. That's easy. If you don't have that relationship in place, we have our business experience and technology sessions. We work with our clients to find out the problem we are trying to solve. Do they just want to get into the digital world, or do they want to increase their revenue, or upscale their customer base?"

Once the problem is identified, PwC works alongside clients as the transformation progresses. "We have a very broad offering from an advisory perspective, but we don't just strategize and then go away," said Mahapatra. "We are there to realize the value we are trying to envision for them."

Cybersecurity is a large part of clients' concerns, and PwC can accordingly offer protection to what are deemed "crown jewels." "We find what kind of data our clients want to keep to themselves, and the extra security they want to have. We work with them to find out where that data is residing, and if that data is at risk, or if it is in transit, we create a whole data protection layer alongside it."

For the full interview with Mahapatra and many more details on the work PwC is doing, take a look at our September issue or the company's exclusive brochure.

Healthcare Cloud Computing Market – Global Forecast to 2024 – Increasing Adoption of Big Data Analytics, Wearable Devices, and IoT in Healthcare -…

DUBLIN, Dec. 9, 2019 /PRNewswire/ -- The "Healthcare Cloud Computing Market by Product (EMR/EHR, PACS, VNA, PHM, Telehealth, RCM, CRM, Fraud Management), Service (SaaS, IaaS), Deployment (private cloud, hybrid cloud), Pricing (Pay as you go), Component (Software) - Global Forecast to 2024" report has been added to ResearchAndMarkets.com's offering.

The global healthcare cloud computing market size is projected to reach USD 51.9 billion by 2024, from an estimated USD 23.4 billion in 2019 at a CAGR of 17.2% during the forecast period.
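
As a sanity check, the quoted endpoints are consistent with the quoted growth rate: compounding the 2019 estimate at 17.2% for five years lands very close to the 2024 projection.

```python
# Compound the 2019 base at the stated CAGR over the 2019-2024 forecast period.
base_2019 = 23.4      # USD billion, 2019 estimate
cagr = 0.172          # 17.2% compound annual growth rate
years = 5             # 2019 -> 2024

projected_2024 = base_2019 * (1 + cagr) ** years
print(round(projected_2024, 1))  # ~51.7, in line with the projected USD 51.9 billion
```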

The key factors driving the growth of this market include the increasing adoption of big data analytics, wearable devices, and IoT in healthcare and the advantages of cloud usage, such as improved storage, flexibility, and scalability of data. However, concerns over data security and privacy are expected to restrain the growth of this market.

By product, the healthcare provider solutions segment is expected to grow at the highest CAGR during the forecast period (2019-2024)

Based on the product, the healthcare cloud computing industry is segmented into healthcare providers' solutions and healthcare payer solutions. The healthcare provider solutions segment accounted for the largest share of the healthcare cloud computing market in 2018. The large share and high growth of this segment can be attributed to the growing population and rising prevalence of diseases, leading to an increasing volume of patient data generated globally.

By service model, the software-as-a-service segment accounted for the largest market share in 2018

Based on the service model, the healthcare cloud computing market is segmented into software-as-a-service (SaaS), infrastructure-as-a-service (IaaS), and platform-as-a-service (PaaS). The SaaS segment commanded the largest share of the healthcare cloud computing market in 2018. The SaaS model offers several advantages over on-premise solutions, such as security, the lower total cost of ownership, faster deployment time, and limited up-front capital expenses.

However, the IaaS segment is expected to grow at the highest CAGR during the forecast period. This model does not need any upfront charges, bandwidth utilization fees, or minimum term commitments, owing to which the adoption of IaaS is expected to increase in the coming years.

North America to witness the highest growth during the forecast period (2019-2024)

North America is expected to account for the largest share of the global healthcare cloud computing market during the 2019-2024 forecast period. The large share of North America in this market is attributed to the increasing adoption of Electronic Health Records (EHRs) among medical professionals, the incentive-driven approach of government health IT programs, and active participation by private-sector players in the region's industrial development.

For more information about this report visit https://www.researchandmarkets.com/r/wto61o

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T. Office Hours, call +1-917-300-0470
For U.S./CAN Toll Free, call +1-800-526-8630
For GMT Office Hours, call +353-1-416-8900

U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com

Cloud computing vs fog computing, all information you need to know – Optocrypto

We all know what cloud computing is. Today, we are going to learn about cloud computing vs fog computing. From the same makers of the cloud, we get the fog. And no, it's not about a Stephen King book, but about data processing in the limbo between hardware and network.

It may seem to some that I took the term out of my pocket just to invent a title. I was a little familiar with the general meaning of what we will explain next, but it was by reading How-To Geek that I discovered it was called fog computing. After reading many explanations that try more to sell me something than to illustrate the concept, I was convinced that it was something much simpler than it seems.

Consider cloud computing as an initial concept, as many already know what it means: all those computer services offered online, on the internet, involving a large network of connected computers that host the data and software users access without having to install anything. Cloud services concentrate and process everything on a central server and not on the user's computer.

Granted, it's a funny term. Fog computing is a model where data processing and applications are concentrated in devices at the edge of the network rather than entirely in the cloud. This allows data to be processed locally on an intelligent device instead of sending it to the cloud. The model is specifically designed for the Internet of Things: all the new devices, like your home thermostat or fridge, that are now connected to the internet. Such devices can transfer large amounts of data, and the idea of fog computing is to use the device itself as an access point to increase the speed of data processing, relieving the cloud of some of the work for a smoother and more immediate user experience. After all, who wants to wait 30 seconds for the refrigerator to know the condition of their vegetables?
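
The pattern just described, aggregating locally and uploading only what the cloud actually needs, can be sketched in a few lines of Python. Everything here is illustrative: the names `summarize` and `send_to_cloud` are made up for the example, and the upload is simulated with a print rather than a real network call.

```python
def summarize(readings):
    """Reduce raw sensor readings to the statistics the cloud actually needs."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def send_to_cloud(payload):
    # Stand-in for a real upload (e.g. an HTTPS POST to a cloud endpoint).
    print(f"uploading {payload}")

# Raw temperature samples from a hypothetical IoT fridge sensor (degrees C).
raw = [4.1, 4.0, 4.2, 9.8, 4.1, 4.0]

# Fog-style processing: the edge device sends one small summary
# instead of streaming all six raw readings to the cloud.
send_to_cloud(summarize(raw))
```

The point of the sketch is the shape of the traffic: the cloud receives one compact payload per interval instead of every raw data point, which is exactly the latency and bandwidth win fog computing promises.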

On the other hand, some argue that current cloud computing already has all the elements of supposed fog computing, and that it is just a marketing term to attract attention. And indeed, what is the cloud but a metaphor for the internet?

Some experts believe that the expected implementation of 5G mobile communications could create more opportunities for Fog Computing. 5G technology in some cases requires the installation of very dense antennas, and under certain circumstances, the antennas must be within 20 kilometers of each other. In such an application, a fog computing architecture could be created between these stations, including a central controller that manages applications running on this 5G network and manages connections to data centers or background clouds.

A fog computing fabric can have a variety of components and functions. These could include fog nodes, which are gateways that accept data from IoT devices. It could include a variety of wired and wireless granular collection endpoints, including rugged routers and switching devices. Other aspects could include customer premises equipment (CPE) and gateways for accessing edge nodes. Further up the stack, fog computing architectures would also touch core networks and routers, and ultimately services and servers in the global cloud.

The OpenFog Consortium, the group that develops reference architectures, has defined three goals for the development of a fog framework. Fog environments should be horizontally scalable, meaning that they support vertical use cases from different industries; they should work across the cloud-to-things continuum; and they should be a system-level technology that extends from things, across the network boundaries, to the cloud, and across different network protocols.

Cloud computing surges in the UAE in 2019 – ComputerWeekly.com

The past 12 months have seen rapid growth of cloud computing in the United Arab Emirates (UAE) as suppliers race to establish their footprints.

Microsoft and Oracle launched datacentres in the country in 2019, adding to existing UAE centres offered by SAP and Alibaba Cloud. Amazon Web Services (AWS) also unveiled its first Middle Eastern datacentre in nearby Bahrain in July.

SAP is the current front-runner in the Middle East race, with three centres in the region in Dubai, Riyadh and Dammam, while Oracle opened its first datacentre in the Middle East in Abu Dhabi in February to offer cloud storage to customers across the region. In a show of serious intent, Oracle is set to open two more datacentres in the UAE and two in Saudi Arabia within the next year.

Meanwhile, Microsoft brought online its first datacentre regions in the Middle East in June this year, opening one each in Dubai and Abu Dhabi. A comparatively smaller player, Alibaba Cloud, the cloud computing arm of the Chinese ecommerce giant, opened its first regional datacentre in Dubai in 2016.

According to a recent YouGov survey of more than 500 IT decision-makers in the UAE, 88% planned to increase cloud spend in 2019, 83% are running partially or completely in the cloud in 2019, and nearly 90% expect cost savings on the cloud.

The public cloud services market in the Middle East and North Africa (MENA) is projected to grow to $1.9bn (AED7.97bn) by 2020, double what it was in 2016, according to data research firm Statista.

According to Jyoti Lalchandani, group vice-president and regional managing director for the Middle East, Turkey and Africa at IDC, the arrival of several new datacentres in the UAE in 2019 foretells a transformational year for the country.

Lalchandani told Computer Weekly: "The fact that several tech suppliers have entered the UAE shows that there is a changing landscape. This trend shows there is a strong national focus on public cloud services. The datacentres are arriving in the region to fulfil the demand from local customers."

He noted that local companies are now moving their mission-critical services to the cloud, which is evidence of growing trust in remote hosting centres. "Very traditional organisations such as the Commercial Bank of Dubai have moved all their sensitive data to the cloud. I foresee other banks doing the same," he said.

As UAE cloud uptake grows, Lalchandani also predicted a heightened focus on security and regulations. "With all these providers coming in, there might be more regulatory frameworks put in place. I predict the government will become more involved in cloud regulations."

According to Zakaria Haltout, managing director at SAP in the UAE, 2019 has been a landmark year for cloud computing in the UAE.

Haltout said every industry vertical in the UAE is undergoing digital transformation through the cloud, especially sectors such as oil and gas, utilities, government, retail, passenger travel, and financial services. "More and more public and private sector organisations are digitally transforming on the cloud," he said.

The SAP MD said the company's local cloud datacentre in the UAE is the centrepiece of its ongoing five-year, $200m UAE investment plan. SAP was the first multinational business applications company to go live in the country and onboard customers with localised datacentre solutions.

Haltout predicts continued and rapid cloud services growth in the UAE, particularly as local organisations embrace the experience economy and personalise customer experiences with cloud-based solutions, rather than merely selling products and services. "UAE organisations that leverage customer experience solutions on the cloud are set to see the biggest business benefits," said Haltout.

He said the upcoming Expo 2020 Dubai will offer companies opportunities to experiment with cloud projects. The event, which will run on SAP to optimise processes and costs, is expected to deliver personalised experiences for 25 million visitors and 192 participating countries.

Looking ahead, Haltout said the biggest challenge for cloud take-up in the UAE lies in implementation. "The biggest challenge is not in business vision for the cloud, but in its implementation. Channel partners play a key role in supporting UAE organisations to understand the business challenges they face, and which cloud solutions best meet business needs."

According to Haltout, CIOs should work with channel partners to develop strategies for change management and skills development, to ensure that employees can optimise their cloud-based workplaces and business applications.

Jayakumar Mohanachandran, head of IT at Dubai-based packaging firm Precision Group, said many companies in the UAE are now ready to take advantage of cloud benefits, such as quick deployment of IT resources, shared resource usage, and the ability to monitor usage.

Mohanachandran, who is currently managing a large-scale digital transformation project at Precision Group, said cloud migration has enabled his firm to build a solid foundation for future growth plans.

"Precision was running on a legacy system for more than 25 years, and this migration has helped us to be more agile and flexible. Now all our employees can work from any part of the world and stay connected all the time, with all their information available at their fingertips. Employees can manage, monitor or approve all requests through mobiles, which is a huge transformation for us."

Santhosh Rao, senior director analyst, Gartner UAE, predicted that many more local companies will shift their data to UAE cloud centres.

"We expect a steady stream of projects where enterprises engage advisors to come up with cloud migration strategies. The UAE is transforming from an oil economy to a data economy, so there is a need to create new revenue streams such as artificial intelligence. Cloud is a really nice way to kick things off; it's a good way to start the transformation with less risk."

Cloud computing surges in the UAE in 2019 - ComputerWeekly.com

Microsoft is pushing forward with JEDI recruiting despite Amazon’s legal fight – CNBC

Satya Nadella, chief executive officer of Microsoft Corp., departs the "Tech For Good" meeting at Elysee Palace in Paris, France, on Wednesday, May 23, 2018. A group of industry executives met with France's President Emmanuel Macron to discuss how to use technology to improve people's lives.


Microsoft is staffing up in preparation for its work with the Defense Department, even as Amazon is in court protesting the Pentagon's decision, according to people familiar with the matter.

In the more than six weeks since winning the Joint Enterprise Defense Infrastructure, or JEDI, deal, which is worth up to $10 billion, Microsoft has been trying to lure talent from defense contractors and other companies and get employees the necessary authorization to work on the project, said the people, who asked not to be named because they're not authorized to speak on behalf of the company.

Amazon has contested the Pentagon's decision to award the contract to its smaller cloud rival, citing in a lawsuit a bias on the part of President Donald Trump and repeated attacks against Amazon and CEO Jeff Bezos as evidence of an unfair process.

But Microsoft isn't slowing down. Brad Smith, Microsoft's president and chief legal officer, told CNBC on Saturday that "we have if anything been moving even faster since that contract was awarded."

Microsoft has a history with government contracts and has spent years going through the process of clearing employees for defense work, said two of the people. The company has hundreds of cleared engineers, one of the people said. But there are so many people in the pipeline that Microsoft faces an 18-month bottleneck getting current employees through the process, a different person said.

Microsoft's website lists more than 100 openings for people with security clearances, though none mention JEDI by name. Openings are for positions including principal software engineering managers and principal program managers. In January, Microsoft plans to hold two recruiting events in Reston, Virginia, near the headquarters of the Central Intelligence Agency in Langley.

"The Department of Defense is confident in the JEDI cloud contract award and remains focused on getting this critical capability into the hands of our warfighters as quickly and efficiently as possible," Pentagon spokesperson Elissa Smith said in an email. "The department's Cloud Computing Program Office continues to work with Microsoft to prepare the JEDI cloud environment."

A Microsoft spokesperson declined to comment.

As part of the internal recruiting process for Defense Department work, Microsoft regularly offers to pay for employee applications for security clearances, according to a person familiar with the process. The clearance brings with it a 20% to 30% salary increase, depending on the level of authorization and the employee's role, the person said, adding that not every employee will need clearance to contribute to JEDI.

According to a court filing that became public on Monday, Amazon and the Defense Department have agreed that there's currently no reason to ask the court for a restraining order because the Pentagon has said it "will not proceed with performance of the JEDI contract beyond initial preparatory activities" until Feb. 11, 2020, at the earliest.

The battle for JEDI has been hotly contested since 2018, when the Pentagon announced its plan for a $10 billion and 10-year upgrade to its IT operations. Amazon Web Services holds a commanding lead in cloud infrastructure, with 47.8% of the market, according to Gartner, and was widely considered the favorite to win JEDI, in part because of a big contract the company has had with the CIA since 2013. Microsoft Azure is a distant second, with 15.5% of the market.

In Monday's court filing, Amazon asserted there was a "fundamental defect" in the procurement process. Early on, according to Amazon, Pentagon staffers observed the strengths of AWS offerings, but their views became more negative over time as Trump's bias became more evident.

"This abrupt change in course reflects the culmination of President Trump's improper interference and express direction to officials responsible for overseeing the award of the JEDI contract," Amazon wrote.

CNBC's Amanda Macias contributed to this report.



This Is How Amazon Web Services Is Turning Its Cloud Into A Supercomputer – CRN: Technology news for channel partners and solution providers

A Supercomputing Cloud

AWS has a reputation for winning hesitant, often skeptical customers to the cloud. But for years now, the IaaS kingpin has faced a tough sell when trying to convince high performance computing practitioners to ditch their data centers.

The scientists and engineers that run those resource-demanding applications have long resisted Amazon's overtures: They are more comfortable deploying (and more often waiting to deploy) their compute intensive jobs on expensive mainframe supercomputers.

But AWS thinks it's finally found the formula for matching the elasticity of cloud computing with the scale-out capabilities required for high performance computing, Peter DeSantis, Amazon's vice president of global infrastructure and customer support, told attendees of a nighttime keynote that kicked off the AWS re:Invent conference last week in Las Vegas.

Over the last year, the goal to host "some of the most amazing and demanding workloads out there" has become reality, DeSantis said in an annual talk that provides a unique glimpse at the technical wonders of AWS infrastructure.

Advances in custom hardware, primarily networking, now enable HPC applications "to run on AWS the same as any other workload," DeSantis said.

As DeSantis dived deep into the innovations underpinning AWS' progress, Derek Magill, HPC solution architect at Flux7, a pioneering AWS partner that just launched an HPC practice, listened closely.

That DeSantis dedicated his highly anticipated re:Invent keynote to HPC in itself is an important signal to channel partners implementing the technology, Magill later told CRN.

"Not only do we believe it, but Peter DeSantis getting up there and spending an hour-and-a-half talking about HPC on AWS, I think it really validated our move to HPC," Magill said.

And DeSantis did a good job explaining how AWS has overcome the technological barriers that have kept HPC workloads on-premises.

"HPC people talk about how we're different and special and our needs can't be met by cloud," Magill said. But DeSantis keynote went far in knocking down some of their biggest excuses.


Global Cloud Computing Market To Witness a Healthy Growth During 2020 To 2029 – Sound On Sound Fest

New York City, NY, Dec 13, 2019 (published via Wired Release): The research report on the Cloud Computing market is a proficient and detailed analysis of the current situation of the Cloud Computing industry. It highlights the key drivers, constraints, opportunities, and risks for the major players, and offers a granular analysis of market share, segmentation, revenue forecasts (USD) and a region-wise study through 2029.

The Cloud Computing report presents a thorough and detailed study of the Cloud Computing market, comprising company profiles of the market's leading players. With comprehensive segmentation in terms of distinct countries, it bifurcates the market into a few major countries, giving consumption (sales), revenue, market share, and growth rate for each over the forecast period 2020-2029. The report focuses on the sector's major market dynamics; the current market scenario and future estimates are also analysed. Furthermore, fundamental strategic activities in the market, including product development, acquisitions and partnerships, are covered.

The free sample report provides a brief introduction to the research report: an overview, the table of contents, a list of tables and figures, and an overview of the major market players and key regions included.

The top manufacturers/players are: CA Technologies, Cisco Systems, Google, HP, Amazon.com, IBM, Microsoft, SAP AG, Yahoo! Inc., Oracle, Flexiant, Citrix Systems, Inc., ENKI Consulting and Akamai Technologies, Inc.


All facets of the Cloud Computing marketplace are appraised both qualitatively and quantitatively to analyse the worldwide as well as regional markets comparatively. Fundamentals such as the market definition, the prevailing value chain and the government regulations related to the Cloud Computing marketplace are also explained in the document.

The report also classifies products on the basis of product type, end-user industry, and region.

Key Questions for this Market Report:

What are the most crucial factors driving the Cloud Computing Market?

What are the essential trends impacting the development of the Cloud Computing Market?

What are the estimated Cloud Computing industry size and growth rate in 2029?

Who will be the leading manufacturers of this particular Cloud Computing Market?

What are the key driving factors impacting the Cloud Computing market shares in upcoming years?


Reasons for Buying this Report:

It offers forward-looking perspectives on the various factors driving or constraining Cloud Computing market growth.

It gives a forecast based on how the Cloud Computing market is estimated to grow.

It guides to understand the major product segments and their future.

It provides vital information which keeps you ahead of Cloud Computing competitors.

It gives guidance in making a decisive business judgment by having a complete study of Cloud Computing market and by making a detailed analysis of Cloud Computing market segments.

The above Cloud Computing report can be customized according to your needs.

About Us:

We at MarketResearch.Biz offer extensive studies, providing insight reports across various market verticals. Our objective is to offer in-depth analysis of vast global markets, backed by rich information. Decision-makers can trust our well-defined information-gathering methods to obtain accurate market forecasts along with deep study.

Media Contact:

Mr. Benni Johnson

MarketResearch.Biz (Powered By Prudour Pvt. Ltd.)

420 Lexington Avenue, Suite 300

New York City, NY 10170,

United States

Tel: +1 347 826 1876

Website:https://marketresearch.biz


How to become a computing expert in software, web and cloud – Study International News

The cloud computing business is growing bigger every day. Between now and 2023, the global cloud computing market size is poised to grow by US$190.32 billion, progressing at a CAGR of over 16 percent.
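To put that rate in perspective, here is a sketch of what a constant CAGR compounds to over the 2019-2023 window described above. The 16 percent figure is from the article; treating it as exactly 16 percent over four years is an assumption for illustration.

```python
# Growth multiple implied by a constant compound annual growth rate (CAGR).
def growth_multiple(cagr: float, years: int) -> float:
    """Total growth factor after compounding `cagr` for `years` years."""
    return (1 + cagr) ** years

# A 16% CAGR sustained for four years grows the market roughly 1.8x.
print(f"{growth_multiple(0.16, 4):.2f}x")  # prints 1.81x
```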

Attitudes of business owners have certainly changed in the last few years. Where before they would have been sceptical about adopting cloud computing, today they can't wait to add more. While Amazon Web Services was the pioneer in this field, Microsoft is catching up as a strong second contender, with its Microsoft Azure unit, which supplies cloud-based computer processing and storage, growing by 63 percent from a year before in the most recent quarter.

What all this translates to is a wealth of new opportunities emerging for computing professionals today. At Tampere University, a whole department specialises in training computing professionals to take advantage of these opportunities.

The Faculty of Information Technology and Communication Sciences at Finland's second largest university offers a unique and wide range of expertise through research and teaching. From the natural sciences and engineering to event theatre and drama, this is a faculty that boldly pushes the boundaries of multidisciplinary research and teaching across organisational boundaries. It is also the second most popular choice for a university degree among Finnish youth, out of 25 universities of applied sciences and 13 Finnish universities. In a recent survey, Tampere University was rated among the top three universities in five categories and topped the categories for attractiveness of fields of study and city attractiveness.


At the core of its vision is the aim to provide the knowledge and solutions to the complex challenges of our global, digital and multicultural society. Its new Master of Science in Software, Web and Cloud and Master of Science (Technology) in Software, Web and Cloud are two postgraduate offerings from this respected institution that realise this vision and more. The detailed course description will be updated and available on the university curriculum page in March 2020, and the courses will start in August 2020.

With a curriculum that provides a solid foundation in computer science and software engineering, Tampere University's goal is for each student to expand their knowledge and skills in the development of high-quality software. Through in-depth studies and skill development, students gain a good understanding of software engineering, the ability to design and implement large software systems, the ability to manage and improve software development processes as well as the competence to understand, design and implement web- and cloud-based systems.

The official language of the programme is English, meaning all courses, exams and student services are offered in English. Applicants can apply to take a Master of Science degree or a Master of Science (Tech) degree in Software, Web and Cloud, but applicants should note that the eligibility criteria of these two tracks are different. For the Master's programme in Computing Sciences, Software, Web and Cloud (MSc Tech) track, your previous degree must be in one of the following (or related) fields: computing, computer science, software engineering, information technology or another closely related field with proficiency in mathematics, programming, data structures, databases and physics.

An industrial perspective in one of the world's most respected education systems

Located in the Tampere region, these are Master's degrees that greatly benefit from industry perspectives thanks to the high concentration of IT companies located here. This means students will be able to leverage one of the university's biggest strengths: the ability to apply and test learning in real-life situations, whether it's in cities, companies or societal services.

For Software, Web and Cloud MSc students, this doesn't just manifest in one final-semester capstone project. Instead, the curriculum is designed with real-world problems intentionally incorporated, exploring subjects such as requirements management, software modelling and specification, software implementation and testing, software project management, and the web and cloud. Students can mix and match a variety of minor subject studies according to their unique interests from the programme's wide offering of complementary studies, such as Data Structures, Database Programming, Innovative Project and Functional Programming.

There is an option to undertake an internship where students apply their knowledge of computer science to practice, expanding their knowledge according to their workplace requirements. Upon completion, they'll be able to report on their tasks and assess their progress. Add to that the research activities and close collaboration between the university and local industry, and you've got a Master's of high academic quality and real-world relevance to industrial positions.


This bodes well for Tampere graduates entering the global job market today. The US had only 63,744 computer science students join the workforce in 2018, despite more than 500,000 open computing vacancies available nationwide. In Australia, fewer than 5,000 ICT students graduate annually, only one third the number of creative arts graduates, yet demand for the broader group of ICT workers in the country is projected to grow from the current 660,000 to 750,000 by 2023.

Finland is also struggling with a lack of software professionals. By 2020, the Finnish Information Processing Association estimates the country will be short of up to 15,000 IT experts. With its current undersupply of qualified IT professionals, employers are racing to hire IT talent, offering improved workplaces and focusing on job satisfaction. Almost 90 percent of students pursuing software-related majors at Tampere University are employed on the day they graduate.

This demand for computing talent in software and cloud is unlikely to slow down anytime soon. As internet access broadens and increases, every aspect of modern life will increasingly include computing systems. From retail to agriculture to security, industries are expanding their adoption of technologies like the Internet of Things, big data analytics, artificial intelligence, cloud technology and more. There is an urgent need for companies to invest in more skilled talent in order to capitalise on the new opportunities these technologies will bring.

Tampere University is ready to fill these roles in the continuing digital revolution. With its robust training and industrial awareness, students stand to realise their potential as versatile software professionals, finding employment in commercial and administrative fields, or furthering their studies at the doctoral level. Graduates typically start working as programmers and then advance towards expert and/or managerial positions.


Cryptshare Brings Secure File Transfer and Email Encryption to the Cloud with Cryptshare on Microsoft Azure, Now Available on the Azure Marketplace -…


BOSTON (PRWEB) December 12, 2019

Cryptshare, the German developer of data security and privacy solutions for the exchange of business-critical information, today announced the U.S. availability of Cryptshare on Microsoft Azure, their first release in support of Cryptshare as PaaS (cloud computing).

"Companies with a cloud-first approach now have a secure file transport option that matches their IT strategy, explains Matthias Kess, CTO of Cryptshare AG. They can run Cryptshare on a Microsoft Azure server and gain all the advantages associated with cloud computing and PaaS -- from scalability to a wide range of customizations and redundancy."

The decision to launch Cryptshare on Azure comes in direct response to the growing popularity of Microsoft Office 365 and Microsoft Outlook 365 among current Cryptshare users as well as the increased demand for an option to operate in an infrastructure environment fully hosted by Microsoft Azure.

"Already we've given our users the option to install the encryption software solution on their own network, in their own data center or with a hosting party of their choice," explained CEO and President Mark Forrest. "Now with so many organizations deciding that the most secure place to protect their sensitive electronic data exchange is in the cloud, simplifying the installation for Azure by gaining certification for the Microsoft Azure Marketplace became an important part of our product roadmap."

Today's announcement signals the first of several cloud computing moves Cryptshare will be making in the coming months. As more organizations take a cloud-first approach to efficient, cost-effective computing, Cryptshare will expand its support of cloud infrastructures to give its customers a range of cloud computing choices.

Cryptshare has been offering a reliable and user-friendly solution for sharing e-mails and files securely since 2007. Current Cryptshare customers migrating to Microsoft Azure can use a Bring Your Own License approach to deploying their current secure file exchange solution to the cloud. To learn more, U.S.-based customers should look for Cryptshare Server on the Azure Marketplace or contact Cryptshare directly at azure@cryptshare.com.

About Cryptshare: Cryptshare AG develops and supplies software solutions that help companies support, optimize and secure their email processes, important data transmissions and large file handling. Today, more than four million users in 2,000+ companies across 80 countries rely upon its leading security technology, Cryptshare, for the safe exchange of their business-sensitive information. Founded in 2000, its success has been recognized with the Cybersecurity Excellence Award 2017 and the Infosecurity Products Guide Global Excellence Award. The company is headquartered in Freiburg, Germany, and operates sales offices in the UK and the Netherlands. Cryptshare, Inc. operates as a subsidiary of Cryptshare AG from its U.S. office in Boston. To learn more, visit: http://www.cryptshare.com.

Share article on social media or email:


Edge Security: How to Secure the Edge of the Network – eSecurity Planet

Posted December 12, 2019


Edge computing is a term that is becoming increasingly popular not just as a buzzword but as a way of understanding and modeling IT infrastructure in an era of pervasive cloud computing. With the rise of edge computing has come a need for edge security.

Edge security isn't just about securing edge computing though; it's also potentially a new approach to defining user and enterprise security in the cloud-connected world.

While there is no shortage of meaningless buzzwords in IT, edge computing isn't an abstract concept and neither is edge security. For both edge computing and edge security, there is an emerging set of definitions and standards.

In this eSecurity Planet guide, we look at what edge security is all about and some of the top vendors in the space.

In many modern IT deployments, a data center stands at the core of a network architecture. That data center can be an on-premises, corporate-owned facility, or increasingly the data center can be a collection of public cloud resources.

Edge computing is often defined as computing that happens at the edge of a network. The broader corporate campus, branch offices and retail locations can be considered at the edge of the network, since the core of the network is the data center or cloud. With the emergence of 5G cellular, with base station deployments that integrate powerful compute capabilities, 5G is also considered part of edge computing.

Simply stating that compute at the edge of the network is edge computing is not the formal definition, however, at least according to the Open Glossary of Edge Computing, an open source effort led by the Linux Foundation's LF Edge group.

"By shortening the distance between devices and the cloud resources that serve them, and also reducing network hops, edge computing mitigates the latency and bandwidth constraints of today's Internet, ushering in new classes of applications," the glossary explains.

As edge computing is a growing area, so too is edge security. There are several aspects involved in edge security, including:

Perimeter security: Securing access to edge compute resources via encrypted tunnels, firewall and access control

Application security: Beyond the network layer, edge compute devices run applications that must be secured

Threat detection: As edge computing is by definition not centralized, it's critically important for providers to employ proactive threat detection technologies to identify potential issues early

Vulnerability management: There are both known and unknown vulnerabilities that need to be managed

Patching cycles: Automated patching to keep devices up to date is important for reducing the potential attack surface
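As a concrete (and deliberately simplified) illustration of the first of those aspects, the sketch below shows the kind of perimeter check an edge node might apply before serving a request: an allow-list for access control plus an HMAC-signed token. All names, the secret and the address prefixes here are hypothetical; this is not tied to any vendor product covered in this guide.

```python
import hashlib
import hmac

SECRET = b"rotate-me-regularly"       # hypothetical shared secret
ALLOWED_NETS = ("10.0.", "192.168.")  # hypothetical internal prefixes

def sign(payload: bytes) -> str:
    """HMAC-SHA256 signature an authorized client would attach to a request."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def admit(source_ip: str, payload: bytes, token: str) -> bool:
    """Admit a request only if it comes from an allowed network AND
    carries a valid signature over its payload (perimeter + access control)."""
    in_allowed_net = source_ip.startswith(ALLOWED_NETS)
    token_valid = hmac.compare_digest(sign(payload), token)
    return in_allowed_net and token_valid

print(admit("10.0.1.5", b"telemetry", sign(b"telemetry")))     # True
print(admit("203.0.113.9", b"telemetry", sign(b"telemetry")))  # False
```

A production deployment would of course add TLS, credential rotation and proper network ACLs; the point is only that edge perimeter security layers a network-level check with a request-level one.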

In 2019, a new term was coined by Gartner to define a category of hardware and services that help enable edge security; that term is Secure Access Service Edge (SASE).

According to Gartner, SASE is "an emerging offering combining comprehensive WAN capabilities with comprehensive network security functions, such as secure web gateways (SWG), CASB, firewalls as a service (FWaaS) and zero trust network access (ZTNA), to support the dynamic secure access needs of digital enterprises."

Even though the term SASE is new, in August 2019 Gartner forecast that by 2024, at least 40% of enterprises will have explicit strategies to adopt SASE, up from less than 1% at year-end 2018.

Though the term edge security is relatively new, there are multiple vendors in the space that have product offerings. Not all the vendors listed below fall into the SASE category, as some lack the WAN functionality and only provide a subset of edge security needs.

Akamai was once primarily known as a content delivery network (CDN), with its global distributed network. In recent years, Akamai has expanded significantly into security, with multiple capabilities to help organizations defend against both network and application layer attacks.

The Akamai Intelligent Edge Platform provides what the company refers to as a defensive shield that can surround and protect users from the edge all the way up to full data centers.

Cisco has long been a dominant vendor in the networking market, with an expansive product portfolio that includes both hardware and software.

When it comes to edge security, Cisco is positioning a number of its capabilities as part of a security stack that includes traditional perimeter security as well as cloud-based security controls.

Cloudflare is a global cloud platform that has multiple content and application delivery services as well as security capabilities.

One of the controls needed for edge security is protection against attacks, which is where Cloudflare's DDoS and DNS Protection services can be useful.

Fortinet is a network security vendor that has increasingly been active in the SD-WAN space.

The FortiGate Secure SD-WAN offering can potentially fit in the SASE category of edge security, providing organizations with secure access as well as threat protection for edge computing deployments.

Palo Alto Networks got its start as a network security vendor and has come to fully embrace the SASE model.

The Prisma Access platform integrates Firewall as a Service (FWaaS), threat prevention, DNS security and data loss prevention (DLP) capabilities for edge resource protection.

Cato Networks got started in 2016 with a Security-as-a-Service model for network security, headed by Shlomo Kramer, perhaps best known as the founder of Check Point Software.

Cato Networks has strongly embraced the SASE model, and its Cato Cloud provides a global SD-WAN architecture that delivers security services, including FWaaS, SWG, anti-malware, IPS and threat detection, for both edge and data center use cases.

While VMware is well known for its virtualization technologies, it is now also a strong player in the emerging SASE market, thanks to its SD-WAN by VeloCloud platform.

VMware acquired VeloCloud in 2017 and has steadily improved the platform since then, adding security capabilities that can help protect edge computing deployments.

Zscaler has a global cloud network that provides security as a service and can be used to create a SASE-style deployment.

The Zscaler Cloud Security platform can provide full content inspection of both inbound and outbound traffic, and also benefits from the integration of threat intelligence feeds to help correlate and block threats in real time.

See more here:

Edge Security: How to Secure the Edge of the Network - eSecurity Planet

Three Key Takeaways From AWS re:Invent 2019 – Forbes

AWS re:Invent 2019, the industry's largest cloud computing conference, concluded last week. With over 65,000 attendees and 2,500 sessions, the scale of the event only grows each year.


It's not just the scale; the pace of innovation at Amazon refuses to slow down. This year's re:Invent saw more than 100 new features, products, and services added to the AWS portfolio.

Here are three key themes that emerged from AWS re:Invent 2019:

1. Redefining Edge Computing

Traditionally, the cloud has been considered a highly centralized resource, much like an IBM mainframe. But the advent of edge computing has made the cloud truly distributed. Developers no longer need to work around the latency limitations and performance trade-offs involved in deploying applications in remote data centers.

Edge computing is often discussed either in the context of delivering static content through a CDN or of moving processing closer to IoT devices. But AWS has redefined edge computing in more than one way, building a continuum of compute by delivering a new form of edge computing.

First, there are AWS Outposts: hardware appliances that customers can rent from Amazon to run within their own data centers. Customers can launch Amazon EC2 instances on AWS Outposts and manage them with the same set of tools, such as CloudFormation and the AWS Console. Each Outpost deployment is associated with an AWS Region and can run managed services such as Amazon RDS and Amazon S3.
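As a hedged illustration of that tooling story, a minimal CloudFormation fragment for launching an EC2 instance into an Outpost-backed subnet might look like the sketch below; the AMI and subnet IDs are placeholders, not values from the article:

```yaml
Resources:
  OutpostInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: m5.large              # an instance family supported on the Outpost
      ImageId: ami-00000000000000000      # placeholder AMI ID
      SubnetId: subnet-00000000000000000  # placeholder ID of a subnet created on the Outpost
```

Because the subnet lives on the Outpost, the same template syntax that deploys an instance in a Region deploys it on premises.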

If AWS Outposts are confined to the data center, AWS Local Zones make the cloud hyper-local by bringing compute, storage, and network services closer to users within a city or metro area. Each AWS Local Zone is an extension of an AWS Region where customers can deploy latency-sensitive applications using AWS services in geographic proximity to end users. Developers can now deploy applications across availability zones that span traditional Regions and Local Zones. This new distributed architecture of the cloud opens up additional avenues for developers and businesses building modern applications for AR/VR experiences, smart cities, connected cars and more. The Los Angeles AWS Local Zone is generally available by invitation.

Amazon is also expanding the footprint of AWS to telecom providers that offer 5G networks. AWS Wavelength Zones are AWS infrastructure deployments that embed AWS compute and storage services within telecommunications providers' data centers at the edge of the 5G network. Wavelength brings the power of the AWS cloud to the network edge to enable latency-sensitive use cases that require near-real-time responses.

From the data center to the metro to the telco infrastructure, AWS is bringing the cloud to new territories. Amazon is certainly redefining edge computing.

2. Investments in Next-Gen Hardware

Amazon is investing in a new breed of chips and hardware that makes AWS infrastructure efficient and cost-effective.

Since the acquisition of Annapurna Labs in 2015, Amazon has been moving software-based heavy lifting to purpose-built hardware. The Nitro System, a collection of hardware accelerators, offloads hypervisor, storage, and network functions to custom chips, freeing up resources on EC2 instances to deliver the best performance.

AWS Graviton2 processors combine Arm-based cores with the Nitro System. The new breed of EC2 instances based on Graviton2 features the Nitro security chip, with dedicated hardware and software for security functions, and supports encrypted EBS storage volumes by default. Compared with the first generation of Graviton-based EC2 instances, the current generation delivers up to 40% better price performance.
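Price performance here means work delivered per dollar, so a 40% figure can come from a mix of higher throughput and lower cost. A small sketch with made-up numbers (not AWS pricing) shows how the metric is computed:

```python
def price_performance(throughput: float, hourly_cost: float) -> float:
    """Work delivered per dollar spent (higher is better)."""
    return throughput / hourly_cost

def improvement(old_pp: float, new_pp: float) -> float:
    """Fractional price-performance gain of new over old."""
    return (new_pp - old_pp) / old_pp

# Hypothetical numbers for illustration only:
gen1 = price_performance(throughput=100.0, hourly_cost=0.10)  # first-gen Graviton
gen2 = price_performance(throughput=126.0, hourly_cost=0.09)  # Graviton2
print(f"price-performance gain: {improvement(gen1, gen2):.0%}")  # → 40%
```

Note that a modest throughput gain combined with a modest price cut compounds into the headline number.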

Amazon has also announced the general availability of Amazon EC2 Inf1 instances, a family of instances built exclusively for machine learning inference. Combined with 2nd Gen Intel Xeon Scalable processors and 100 Gbps networking, these instances deliver unmatched performance for applications that depend on machine learning inference. Inf1 instances are powered by AWS Inferentia chips, which are purpose-built for accelerating inference.

3. Big Bets on Machine Learning and Artificial Intelligence

Amazon continues to expand the AI and ML portfolio through new services and platform offerings. At re:Invent 2019, AWS announced multiple services that take advantage of ML and AI.

Amazon CodeGuru is a managed service that helps developers proactively improve code quality and application performance through AI-driven recommendations. The service comes with a reviewer and profiler that can detect and identify issues in code. Amazon CodeGuru can review and profile Java code targeting the Java Virtual Machine.

Venturing into enterprise search, AWS has launched Amazon Kendra, a managed service that brings intelligent, contextual search to applications. Amazon Kendra can discover and parse documents stored in a variety of sources, including file systems, websites, Box, Dropbox, Salesforce, SharePoint, relational databases and Amazon S3.

Amazon Fraud Detector is a service that can identify potentially fraudulent online activities such as online payment fraud and the creation of fake accounts. Based on the experience of fraud detection from Amazon.com, AWS has built this service to help customers integrate sophisticated fraud detection techniques in their applications.

Amazon SageMaker, the platform as a service (PaaS) offering from AWS, has gained new capabilities that make developers and data scientists more productive. Built on top of Jupyter Notebooks, Amazon SageMaker now has a full-blown development environment. Branded as Amazon SageMaker Studio, the new tooling experience includes debugging, pipeline management, integrated deployment, and model monitoring. The platform is tightly integrated with Git and Jupyter Notebooks to enable collaboration among developers.

Read the original post:

Three Key Takeaways From AWS re:Invent 2019 - Forbes

Cloud Computing Services Market 2019| In-depth Analysis by Regions, Production and Consumption by Market Size, and Forecast to 2026 | Research…

This new research report, which centers entirely on the Global Cloud Computing Services Market 2019, is an exhaustive analysis of the driving forces, risks, challenges, threats, and business opportunities in the Cloud Computing Services market. It offers conclusive specifics of the market, such as the major leading players, market size over the eight-year forecast period, segmentation analysis, market share, current market trends and movements, and the major geographical regions involved. The report includes a comprehensive market study and competitive landscape, along with a SWOT analysis of the leading vendors.

This report also studies the product and business strategies of some of the key vendors in the market. Given the segmentation in the Cloud Computing Services market, key vendors are pursuing mergers and collaborations and advancing their supply chains.

Get Sample PDF Brochure (including full TOC, Tables, and Figures) of Cloud Computing Services Market @ https://researchindustry.us/report/global-cloud-computing-services-market-ric/676589/request-sample


Some of the Major Cloud Computing Services Market Players Are:

SAP, Amazon Web Services (AWS), Microsoft, Rackspace, Aliyun, Dell, VMware, Oracle, and others

The global Cloud Computing Services market report covers scope and product overview to describe the key terms and offers detailed information about market dynamics to the readers.

This is followed by segmental analysis and a regional outlook. The report also comprises the facts and key values of the global Cloud Computing Services market in terms of sales volume, revenue and growth rate.

One of the significant factors in the global Cloud Computing Services market report is competitive analysis. The report covers all the key factors such as product innovation, market strategies of the key players, market share, latest research and development, revenue generation, and market expert views.

For the best discount on purchasing this report (price: USD 2,000 for a single-user license), visit: https://www.researchindustry.us/checkout?report=676589&type=single

Cloud Computing Services Market: Regional Analysis

The global Cloud Computing Services market spans the globe, covering not only North America but also other regions such as Europe, the Asia Pacific, South America, and the Middle East.

North American countries, especially the U.S. and Canada, represent noteworthy growth in this market. Similarly, Western European regions are also ahead in influencing the global markets.


To customize the report or inquire before purchasing, visit: https://researchindustry.us/report/global-cloud-computing-services-market-ric/676589/request-customization

Get In Touch!

Navale ICON IT Park,

Office No. 407, 4th Floor, Mumbai-Bangalore Highway, Narhe, Pune

Maharashtra 411041

Phone: +91-844-601-6060

Email: sales@researchindustry.us

More here:

Cloud Computing Services Market 2019| In-depth Analysis by Regions, Production and Consumption by Market Size, and Forecast to 2026 | Research...

Verizon and AWS team up to deliver 5G edge cloud computing – Verizon Communications

SEATTLE and BASKING RIDGE, N.J., December 3, 2019. Today at AWS re:Invent, AWS, an Amazon.com company, and Verizon Communications Inc. (Verizon) announced a partnership that will bring the power of the world's leading cloud closer to mobile and connected devices at the edge of Verizon's 5G Ultra Wideband network. Verizon is the first technology company in the world to offer 5G network edge computing and will use AWS's new service, AWS Wavelength, to provide developers the ability to deploy applications that require ultra-low latency to mobile devices over 5G. The companies are currently piloting AWS Wavelength on Verizon's edge compute platform, 5G Edge, in Chicago for a select group of customers, including award-winning, worldwide video game publisher Bethesda Softworks and the National Football League (NFL). Additional deployments are planned in other U.S. locations in 2020.

By utilizing AWS Wavelength and Verizon 5G Edge, developers will be able to deliver a wide range of transformative, latency-sensitive use cases like machine learning inference at the edge, autonomous industrial equipment, smart cars and cities, Internet of Things (IoT), and augmented and virtual reality. To accomplish this, Verizon 5G Edge provides mobile edge computing and an efficient high-volume connection between users, devices, and applications. AWS Wavelength lets customers deploy the parts of an application that require ultra-low latency to the edge of the network and then seamlessly connect back to the full range of cloud services running in AWS.

Verizon 5G Ultra Wideband technology enables a wide range of new capabilities and diverse use cases, with download speeds many times faster than typical 4G networks. 5G will also dramatically increase the number of devices that can be supported within the same geographic area and greatly reduce network latency to mobile devices. Mobile edge compute (MEC) technology further reduces latency. Currently, application data has to travel from the device to the mobile network, to networking devices at the mobile edge, and then across the internet to reach application servers in remote locations, which can result in longer latency. This prevents developers from realizing the full potential of 5G for lower-latency use cases. For example, game streaming requires less than 20 milliseconds of latency for a truly immersive experience.
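That 20-millisecond figure is a budget the whole request path has to fit inside, and for interactive workloads it is the latency tail, not the average, that matters. A small, self-contained sketch (synthetic numbers, not measurements) of checking latency samples against such a budget:

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples, in milliseconds."""
    ordered = sorted(samples)
    rank = round(pct / 100 * len(ordered)) - 1
    return ordered[max(0, min(len(ordered) - 1, rank))]

def meets_budget(samples, budget_ms=20.0, pct=99):
    """True if the pct-th percentile latency stays within the budget."""
    return percentile(samples, pct) <= budget_ms

# Synthetic round-trip times in ms for an edge path vs. a long-haul path:
edge_path = [8, 9, 11, 12, 14, 15, 16, 17, 18, 19]
long_haul = [35, 40, 42, 45, 50, 55, 60, 62, 65, 70]
print(meets_budget(edge_path))  # → True
print(meets_budget(long_haul))  # → False
```

Checking the 99th percentile rather than the mean is what makes the long-haul path fail: a path whose average is fine can still miss the budget on the slow requests a player actually notices.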

In placing AWS compute and storage services at the edge of Verizons 5G Ultra Wideband network with AWS Wavelength, AWS and Verizon bring processing power and storage physically closer to 5G mobile users and wireless devices, and enable developers to build applications that can deliver enhanced user experiences like near real-time analytics for instant decision-making, immersive game streaming, and automated robotic systems in manufacturing facilities.

We are first in the world to launch Mobile Edge Compute -- deeply integrating Verizon's 5G Edge platform with Wavelength to allow developers to build new categories of applications and network cloud experiences built in ways we can't even imagine yet, said Hans Vestberg, CEO and Chairman of Verizon. Bringing together the full capabilities of Verizon's 5G Ultra Wideband and AWS, the world's leading cloud with the broadest and deepest services portfolio, we unlock the full potential of our 5G services for customers to create applications and solutions with the fastest speeds, improved security, and ultra-low latency.

We've worked closely with Verizon to deliver a way for AWS customers to easily take advantage of the ubiquitous connectivity and advanced features of 5G, said Andy Jassy, CEO of AWS. AWS Wavelength provides the same AWS environment -- APIs, management console, and tools -- that they're using today at the edge of the 5G network. Starting with Verizon's 5G network locations in the US, customers will be able to deploy the latency-sensitive portions of an application at the edge to provide single-digit-millisecond latency to mobile and connected devices. While some ultra-low latency use cases like smart cars, streaming games, VR, and autonomous industrial equipment are well understood today, we can't wait to see how builders use 5G edge computing to delight their mobile end users and connected device customers.

Bethesda Softworks is an award-winning, worldwide video game publisher best known for iconic franchises like The Elder Scrolls, Fallout, and DOOM. Bethesda, along with its engineering team at subsidiary id Software, has developed new cloud gaming technology, called Orion, that greatly enhances the experience of streaming video games. This new technology is incorporated into a game's engine in order to optimize performance in the cloud. By substantially reducing latency and bandwidth requirements, Orion provides a much better experience for gamers and significantly lowers costs for publishers, developers, and streaming service providers.

The promise of streaming is great: play your favorite games anywhere, anytime, without downloads or the need to buy expensive consoles or PCs. But gamers don't have unlimited bandwidth, and they demand an ultra-low latency experience. If you can't provide that, it won't be good enough, said James Altman, Bethesda's Director of Publishing. Now, by combining Orion with AWS Wavelength and Verizon's 5G network, we will be able to deliver on the promise of streaming: a frictionless, ultra-low latency experience that will enable millions of gamers to play AAA-quality games at max settings, wherever they want, whenever they want, with no downloads or consoles required.

The NFL continues to look at technology to grow the sport of football for the millions of fans who engage with the League throughout the year, including how Verizon's 5G Ultra Wideband network and the cloud can help unlock new and exciting ways for NFL fans to view, share, and engage with their favorite teams no matter where they are.

Fans love the live game-day football experience in the stadium, and we constantly strive to deepen engagement on and off the field, said Matt Swensson, Vice President, NFL Emerging Products and Technology. The Next Gen Stats platform captures player location data in real time, generates over 200 stats per play, and charts individual movements within inches. The use of AWS Wavelength and Verizon's 5G Ultra Wideband network has the potential to lower data transmission latency for delivery of new and exciting in-stadium enhancements to fans.

More here:

Verizon and AWS team up to deliver 5G edge cloud computing - Verizon Communications

Andy Jassy’s 12 Boldest Remarks On The Future Of Cloud Computing – CRN: The Biggest Tech News For Partners And The IT Channel

It's still the very early days of cloud adoption, and Amazon Web Services CEO Andy Jassy says AWS channel partners should prioritize customers' long-term success over any short-term gains for themselves in order to build sustainable businesses.

There is so much opportunity for all of us if we can make sure that we get deep in the cloud and in the services, that we give the right advice to customers, we make sure that we focus on what matters most to customers, Jassy told CRN. If you do right by customers over a long period of time, the business usually follows as well.

CRN sat down with Jassy in October for an exclusive interview at AWS headquarters in Seattle, where he shared his thoughts on the future of cloud computing, including what's driving public cloud adoption, the cloud cost equation, the AWS customer base, the AWS Partner Network and AWS' new channel chief. He also addressed where partners should be channeling their investments, AWS' market-leading position and competition, the U.S. Department of Defense's Joint Enterprise Defense Infrastructure (JEDI) contract and Oracle, among other topics.

We're still at what I think of as the early stages of the meat of enterprise and public sector adoption in the U.S., Jassy said. Outside of the U.S., they're about 12 to 36 months behind, depending on the industry or the country. We're still at the beginning of this titanic shift to the cloud.

Here's some more of what Jassy had to say.

Originally posted here:

Andy Jassy's 12 Boldest Remarks On The Future Of Cloud Computing - CRN: The Biggest Tech News For Partners And The IT Channel

The future of computing is a better, faster cloud – News@Northeastern

Every time you open a weather app, watch a video on YouTube, or search the internet for the best Thai food near you, you're accessing and processing data with software stored on servers scattered around the world. This is what the tech industry calls the cloud.

Cloud computing hasn't just changed how we use our phones or store our pictures. It has also changed how research is conducted, allowing scientists to, essentially, rent the computers necessary to store and process massive amounts of data. They don't need to buy their own servers and banks of hard drives; they just pay for cloud services when they need them.

But even using hundreds of powerful computers linked on the cloud, the calculations necessary to model the changing climate or discover new potential drugs can take weeks, or even months, to run.

Researchers at Northeastern, Boston University, and the University of Massachusetts Amherst are working together to build their own cloud computing testbed to find ways to make this technology more effective and efficient.

We're trying to push the edge of cloud computing, says Miriam Leeser, a professor of electrical and computer engineering at Northeastern. What should the next generation of cloud services look like?

Most cloud computing research is done by Google, Microsoft, and Amazon, says Peter Desnoyers, an associate professor of computer science, because those companies operate most cloud services.

The insights they get from operating these large systems and observing users' needs are key to many advances, Desnoyers says. Yet they're also cautious, because they don't want to disrupt service, making it hard for them to take large leaps.

This cloud computing testbed will allow researchers to develop new and innovative cloud services, and simultaneously provide those services to the broader research community.

There's an infinite set of arbitrary things we could create; some of them are useful, most of them aren't, Desnoyers says. Offering actual services to real users helps us find the innovations that are actually useful.

The researchers have received a grant from the National Science Foundation for up to $5 million over the next five years, shared among the universities. They will be integrating aspects of previous cloud research projects, including the Massachusetts Green High Performance Computing Center and Mass Open Cloud.

So far, Leeser and Desnoyers have been awarded more than $1 million to support their aspects of the research.

Leeser's focus is on the hardware. To process large amounts of data, most current cloud computing systems rely on graphics processing units, known as GPUs, which were originally developed to render images. These chips can perform multiple calculations simultaneously (researchers call this parallelism), which makes them useful for running scientific simulations as well as video games.

People have gotten really good performance using GPUs, Leeser says. But they are also high power and the kind of parallelism they do is restricted.

For this new testbed, Leeser is installing more flexible chips called field-programmable gate arrays, or FPGAs. These chips are more energy-efficient than GPUs, and their circuits aren't fixed; they can be reconfigured any number of times to suit a researcher's needs.

It's hardware you can program like it's software, Leeser says. You can build a complete architecture that's designed for the problem that you want to solve. It's much more adaptive.

Desnoyers is working on the software, finding ways for researchers to efficiently share resources without interfering with each other. To effectively serve the larger community, the testbed must be able to temporarily isolate computers working on a specific problem or redirect resources from one group to another, while maintaining a secure system.

We're trying to unify it in a way that can handle the radically different ways that researchers use these systems, Desnoyers says. You can't make something that works for everyone, but you can do a lot better than we do today. And the advantages are not just being able to operate your computers more efficiently; there's also a huge gain in scientific collaboration that comes when scientists from different fields work together on these systems.

The project is just getting started, but the plan is to grow it into a testbed that will support researchers around the country.

The academic community has always had a strong role in the evolution of these technologies, Desnoyers says. Our goal is to enable computer scientists across all of this community to work on the research that will build the cloud of the future.

For media inquiries, please contact media@northeastern.edu.

Originally posted here:

The future of computing is a better, faster cloud - News@Northeastern

‘Big 3’ Public Cloud Providers: 4 Reasons Not to Use Them – ITPro Today

When most folks think cloud, three names come straight to mind: AWS, Azure and Google Cloud. (People may even be thinking AWS more so than usual, with AWS re:Invent in full swing.) These public clouds, known collectively as the Big Three, have dominated the public cloud computing market for at least the past five years. But just because there are three major public cloud providers does not mean you have to use one of them.

Indeed, AWS, Azure and Google Cloud are hardly the only public cloud providers out there. There are a variety of other contenders, ranging from the general-purpose clouds associated with major enterprises, like Oracle's and IBM's, to public clouds from smaller vendors that specialize in only certain types of cloud services, like Wasabi and Backblaze.

This raises the question: When might you decide not to use one of the Big Three public cloud providers and instead opt for a lesser-known option?

To answer that question, let's start by considering why you would choose one of the Big Three. The reasons are obvious enough, but they are worth spelling out:

Each of these factors helps to make AWS, Azure or Google Cloud a compelling choice for many workloads.

But just because the Big Three are the most popular public cloud providers doesn't make them the best choice for every workload and deployment. Following are reasons why you might want to consider an alternative public cloud.

Perhaps the most obvious is cost. Depending on what you are deploying on the cloud, a Big Three vendor may or may not offer the most cost-efficient solution.

This tends to be particularly true in situations where you only need to run a certain type of workload on a cloud. In that case, you might find a better price by choosing a vendor that specializes in that service, rather than turning to one of the general-purpose public cloud providers.

For example, if all you need is cloud storage, a vendor that specializes in storage, like Backblaze or Wasabi, may provide better pricing than the storage services available from AWS, Azure and Google Cloud.
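A back-of-the-envelope comparison makes the point concrete. The sketch below uses hypothetical per-GB rates (not actual vendor pricing) to compare a monthly bill for a storage-only workload:

```python
def monthly_storage_cost(gb_stored, gb_egress, price_per_gb, egress_per_gb):
    """Simplified monthly bill: storage plus egress; ignores request fees and tiering."""
    return gb_stored * price_per_gb + gb_egress * egress_per_gb

# Hypothetical rates for illustration only, not real vendor prices:
general_purpose = monthly_storage_cost(10_000, 2_000, price_per_gb=0.023, egress_per_gb=0.09)
storage_specialist = monthly_storage_cost(10_000, 2_000, price_per_gb=0.006, egress_per_gb=0.0)
print(f"general-purpose cloud: ${general_purpose:,.2f}")    # → $410.00
print(f"storage specialist:    ${storage_specialist:,.2f}")  # → $60.00
```

Even with made-up numbers, the structure of the bill shows why egress pricing can dominate: a specialist that waives egress can undercut a lower headline storage rate.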

Likewise, you may find that the Big Three vendors offer less choice or customization for a given type of workload than does another, smaller vendor.

Here again, this is often particularly true in situations where you have a certain type of workload to deploy. For instance, each of the Big Three clouds lets you run Kubernetes-based workloads. However, a variety of other vendors specialize specifically in cloud-based Kubernetes (or container-based apps in general), like OpenShift Online or Platform9.

Although most public cloud providers have data centers spread around the world, these centers are not always spread evenly. In some situations, you may opt not to use one of the Big Three clouds because it lacks data centers (or enough data centers) in a given geographic area that you need to serve.

For example, if most of your users are in Asia, you might prefer Alibaba Cloud over one of the Big Three. Alibaba has more than two dozen Asia-based cloud regions, whereas most other major public clouds have only a few, if any. On the other hand, Alibaba's presence in Europe and North America is more limited.

Choosing a cloud provider that offers many hosting options in a particular region can help improve performance in that region (because it means data centers are closer to your users). Presence in a particular region may also simplify compliance requirements, in the event that regulations require workloads to be hosted in a certain country.

Each of the Big Three clouds offers dozens of services. In general, having this array of options is a good thing.

But for organizations where IT governance is lacking or oversight is lax, too many choices can become a negative. They can lead to what I call cloud sprawl, or the temptation to launch new cloud services just because you can.

You can avoid this temptation by choosing a cloud provider that simply doesn't offer so many services. For example, if your basic cloud computing needs amount to IaaS, you might decide to make it an organizational policy to use Rackspace instead of AWS, Azure or Google. Rackspace offers a fairly extensive list of IaaS-related cloud services, but it doesn't offer a lot of other options that could result in cloud sprawl.

It's worth noting that we are living in the age of multicloud. Many companies are no longer choosing just one cloud or another. However, in many cases, multicloud strategies are oriented around combining two or more of the Big Three clouds, rather than mixing a Big Three cloud with a lesser-known alternative.

As long as you are comfortable with the complexities that come with multicloud, then, by all means, adopt a multicloud architecture. But as you build your multicloud strategy, keep in mind that multicloud doesn't have to involve just AWS and Azure, or just Azure and Google Cloud. You can mix and match other public clouds into your multicloud architecture as well. In fact, you don't need to include any of the Big Three clouds in a multicloud strategy at all; you could build a multicloud architecture out of alternative clouds alone.

There are some good reasons to build a cloud computing strategy based on AWS, Azure and/or Google Cloud. But there are other good reasons for looking beyond the Big Three and considering lesser-known or more specialized public cloud computing vendors.

More:

'Big 3' Public Cloud Providers: 4 Reasons Not to Use Them - ITPro Today