

Healthcare Cloud Computing World Market Analyses; 2014-2019 & 2020-2026 – Yahoo Finance

DUBLIN, Jan. 17, 2020 /PRNewswire/ -- The "Healthcare Cloud Computing Market Size, Share & Trends Analysis Report By Application, By Deployment Model (Private, Public, Hybrid), By Pricing Model, By Service Model, By End Use (Providers, Payers), And Segment Forecasts, 2019 - 2026" report has been added to ResearchAndMarkets.com's offering.


The global healthcare cloud computing market size is expected to reach USD 27.8 billion by 2026, exhibiting a CAGR of 11.8% over the forecast period.

The associated benefits of data analytics and the increasing demand from healthcare professionals for flexible, scalable data storage are expected to drive demand for these services over the forecast period.

Healthcare organizations are digitalizing their IT infrastructure and deploying cloud servers to improve system capabilities. These solutions help organizations reduce infrastructure costs and interoperability issues and aid compliance with regulatory standards. Hence, rising demand from health professionals to curb IT infrastructure costs and limit space usage is anticipated to boost market growth over the forecast period.

An increase in government initiatives to develop and deploy IT systems in this industry is one of the market's key drivers. Moreover, growing partnerships between private and public players and the presence of a large number of vendors offering customized solutions are among the factors anticipated to drive demand in the coming years.

Further key findings from the study suggest:

Key Topics Covered

Chapter 1 Methodology & Scope

Chapter 2 Executive Summary
2.1 Market Outlook
2.2 Segment Outlook
2.3 Competitive Insights

Chapter 3 Healthcare Cloud Computing Market Variables, Trends, and Scope
3.1 Market Lineage Outlook
3.1.1 Parent market outlook
3.1.2 Ancillary market outlook
3.2 Penetration & Growth Prospect Mapping
3.3 User Perspective Analysis
3.3.1 Consumer behavior analysis
3.3.2 Market influencer analysis
3.4 List of Key End Users
3.5 Regulatory Framework
3.6 Market Dynamics
3.6.1 Market driver analysis
3.6.2 Market restraint analysis
3.6.3 Industry challenges
3.7 Healthcare Cloud Computing: Market Analysis Tools
3.7.1 Industry analysis - Porter's
3.7.2 PESTLE analysis
3.7.3 Major deals and strategic alliances
3.7.4 Market entry strategies

Chapter 4 Healthcare Cloud Computing Market: Segment Analysis, by Application, 2014 - 2026 (USD Million)
4.1 Definition and Scope
4.2 Application Market Share Analysis, 2018 & 2026
4.3 Segment Dashboard
4.4 Global Healthcare Cloud Computing Market, by Application, 2015 to 2026
4.5 Market Size & Forecasts and Trend Analyses, 2015 to 2026
4.5.1 Clinical information systems
4.5.2 Non-clinical information systems

Chapter 5 Healthcare Cloud Computing Market: Segment Analysis, by Deployment Methods, 2014 - 2026 (USD Million)
5.1 Definition and Scope
5.2 Deployment Methods Market Share Analysis, 2018 & 2026
5.3 Segment Dashboard
5.4 Global Healthcare Cloud Computing Market, by Deployment Methods, 2015 to 2026
5.5 Market Size & Forecasts and Trend Analyses, 2015 to 2026
5.5.1 Private cloud
5.5.2 Public cloud
5.5.3 Hybrid cloud

Chapter 6 Healthcare Cloud Computing Market: Segment Analysis, by Pricing Model, 2014 - 2026 (USD Million)
6.1 Definition and Scope
6.2 Pricing Model Market Share Analysis, 2018 & 2026
6.3 Segment Dashboard
6.4 Global Healthcare Cloud Computing Market, by Pricing Model, 2015 to 2026
6.5 Market Size & Forecasts and Trend Analyses, 2015 to 2026
6.5.1 Pay-as-you-go
6.5.2 Spot pricing

Chapter 7 Healthcare Cloud Computing Market: Segment Analysis, by Service Model, 2014 - 2026 (USD Million)
7.1 Definition and Scope
7.2 Service Model Market Share Analysis, 2017 & 2026
7.3 Segment Dashboard
7.4 Global Healthcare Cloud Computing Market, by Service Model, 2015 to 2026
7.5 Market Size & Forecasts and Trend Analyses, 2015 to 2026
7.5.1 Software-as-a-Service
7.5.2 Infrastructure-as-a-Service
7.5.3 Platform-as-a-Service

Chapter 8 Healthcare Cloud Computing Market: Segment Analysis, by End Use, 2014 - 2026 (USD Million)
8.1 Definition and Scope
8.2 End-use Market Share Analysis, 2018 & 2026
8.3 Segment Dashboard
8.4 Global Healthcare Cloud Computing Market, by End Use, 2015 to 2026
8.5 Market Size & Forecasts and Trend Analyses, 2015 to 2026
8.5.1 Healthcare providers
8.5.2 Healthcare payers

Chapter 9 Healthcare Cloud Computing Market: Regional Market Analysis, 2014 - 2026 (USD Million)
9.1 Definition & Scope
9.2 Regional Market Share Analysis, 2018 & 2026
9.3 Regional Market Dashboard
9.4 Regional Market Snapshot
9.5 Regional Market Share and Leading Players, 2018
9.6 SWOT Analysis, by Factor (Political & Legal, Economic and Technological)
9.7 Market Size & Forecasts, Volume and Trend Analysis, 2018 to 2025
9.8 North America
9.9 Europe
9.10 Asia-Pacific
9.11 Central & South America
9.12 MEA

Chapter 10 Healthcare Cloud Computing Market - Competitive Analysis
10.1 Recent Developments & Impact Analysis, by Key Market Participants
10.1.1 Ansoff Matrix
10.1.2 Heat Map Analysis
10.2 Company Categorization
10.2.1 Innovators
10.3 Company Profiles
10.3.1 NextGen Healthcare
10.3.2 Carestream Corporation
10.3.3 INFINITT Healthcare
10.3.4 Dell Inc.
10.3.5 NTT DATA Corporation
10.3.6 Sectra AB
10.3.7 Allscripts
10.3.8 Ambra Health
10.3.9 Nuance Communications
10.3.10 Siemens Healthineers

Chapter 11 KOL Commentary
11.1 Key Insights
11.2 KOL Views

For more information about this report visit https://www.researchandmarkets.com/r/3m00oc


A Top Cloud Stock to Buy in 2020 and Hold for the Long Run – The Motley Fool

The cloud computing industry has been growing at a nice clip for the past few years, and it isn't expected to slow down anytime soon. IDC estimates that public cloud spending will hit $500 billion by 2023, up from $229 billion last year. That translates into a compound annual growth rate (CAGR) of 22.3%, driven by increasing demand for Software-as-a-Service (SaaS) among corporations and enterprises.
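A CAGR figure like this can be sanity-checked with the standard formula CAGR = (end/start)^(1/years) - 1. A minimal sketch, assuming the $229B figure refers to 2019 and the forecast runs through 2023 (the exact base year IDC uses may differ, which would shift the result toward the quoted 22.3%):

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# IDC's endpoints: $229B growing to $500B
growth = cagr(229, 500, 4)
print(f"{growth:.1%}")  # ≈ 21.6% over a four-year span
```

The small gap between this and the quoted 22.3% comes down to the span assumed between the two endpoints, not the formula itself.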

Gartner, for instance, forecasts that SaaS revenue could jump over 50% by 2022. Nutanix (NASDAQ:NTNX) is one cloud stock investors can buy to take advantage of the growth in public cloud spending, as it stands to win big from the SaaS vertical. Let's see how.


Nutanix has quickly transformed itself from selling hardware appliances to a software-centric business model in the space of just five quarters. The company was getting a quarter of its revenue from selling hardware for hyper-converged cloud infrastructure before the pivot. But that business had weak margins, and its prospects looked bleak thanks to ballooning memory costs.

In fact, Nutanix's gross profit margin was 61.9% when it announced the software pivot. The metric was moving in the wrong direction as hardware costs pressured the company's margin profile. The story is much different now, as Nutanix gets a big chunk of its revenue from software sales.

NTNX Gross Profit Margin data by YCharts

In the first quarter of fiscal 2020, Nutanix's software and support revenue came in at $305 million, an increase of 9% over the prior-year period. The company's total revenue stood at $314.8 million, which means that hardware sales have shrunk to a tiny portion of the overall business. More specifically, Nutanix got just $9.7 million in revenue from hardware sales last quarter, compared to $32.5 million in the year-ago period.

Subscription-only sales clocked impressive annual growth of around 71% to almost $218 million. However, one-time software sales plunged to $77.5 million from $146.5 million a year ago as Nutanix's transition to a software-centric model continued. This means that Nutanix now gets just over 69% of its total revenue from selling software subscriptions.
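The 69% share follows directly from the figures reported above; a quick check using the article's numbers:

```python
subscription = 218.0  # subscription revenue last quarter, $M (approx.)
total = 314.8         # total revenue last quarter, $M

share = subscription / total
print(f"subscription share of revenue: {share:.1%}")  # ≈ 69.3%
```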

So, the company still has some way to go before it completely eliminates its legacy hardware and software businesses. Nutanix has set itself a target of generating 75% of total billings from the subscription business by the end of the current fiscal year. It is close to attaining that target, as 72.5% of billings in the first quarter came from subscriptions.

It won't be surprising to see Nutanix's subscription business supply a greater proportion of the overall revenue in the future, considering the pace of growth in its deferred revenue. The company's deferred revenue shot up 39% annually to $975 million during the first quarter of fiscal 2020, while its overall revenue was flat year over year at $314.8 million.

Deferred revenue is the amount of money collected by a company in advance for services that will be rendered at a future date. The deferred revenue is recognized on the income statement when services are actually delivered.
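As a hypothetical illustration of ratable recognition (the figures below are invented for the example, not Nutanix's actual schedule), a multi-year subscription billed upfront is recognized evenly over the contract term:

```python
def recognize_ratably(contract_value: float, term_years: float,
                      elapsed_years: float) -> tuple[float, float]:
    """Split an upfront-billed contract into recognized and deferred portions."""
    elapsed = min(elapsed_years, term_years)
    recognized = contract_value * elapsed / term_years
    deferred = contract_value - recognized
    return recognized, deferred

# e.g. a $390k contract over a 3.9-year term, one year in:
rec, deferred = recognize_ratably(390_000, 3.9, 1.0)
print(round(rec), round(deferred))  # 100000 recognized, 290000 still deferred
```

The one-year elapsed point recognizes 1/3.9 of the contract; the rest sits on the balance sheet as deferred revenue until the remaining service is delivered.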

For Nutanix, this means the company's subscription business is succeeding. In fact, the average contract term of Nutanix's subscription customers stood at 3.9 years in the last reported quarter. This points toward long-term growth, as the company is able to lock customers into long-term contracts.

What's more, Nutanix's existing customers are spending more money on its services. That's evident from the company's 132% dollar-based net expansion rate last quarter. This is considered a solid number for a SaaS company: Crunchbase estimates put the industry average at 106%. Additionally, Nutanix was able to retain 97% of its customers last quarter.
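Dollar-based net expansion compares what the same customer cohort spends now versus a year ago. A simplified sketch (the cohort figures are invented for illustration; real SaaS definitions vary in how they treat churned customers):

```python
def net_expansion_rate(cohort_spend_year_ago: float,
                       cohort_spend_now: float) -> float:
    """Spend by last year's customer cohort today vs. a year ago (simplified)."""
    return cohort_spend_now / cohort_spend_year_ago

# a cohort that spent $100k a year ago and $132k today:
print(f"{net_expansion_rate(100_000, 132_000):.0%}")  # 132%
```

Any value above 100% means existing customers are expanding their spend faster than churn erodes it.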

In all, Nutanix looks like a solid cloud computing bet, as it is operating in a fast-growing market and is pulling the right strings to increase both margins and revenue. Mordor Intelligence estimates that the hyper-converged cloud infrastructure market will clock 13% annual growth from 2020 to 2025. The growth in Nutanix's software business suggests it is growing faster than the industry it operates in.

This makes Nutanix an attractive bet for anyone looking to buy a top stock for the long run.


Latin America Cloud Computing Market Study, 2019-2023 – Market to Exhibit a CAGR of 22.4%, Driven by the Increasing Demand for Hybrid Cloud Solutions…

DUBLIN, Jan. 17, 2020 /PRNewswire/ -- The "Cloud Computing in Latin America, 2019: Telco Cloud Offers, Best Practices and Market Opportunities to 2023" report has been added to ResearchAndMarkets.com's offering.

The Latin American cloud computing services market will expand at a 22.4% CAGR between 2019 and 2023, driven by the increasing demand for hybrid cloud solutions in the IaaS and SaaS segments.

Given increasing enterprise cloud adoption in Latin America that has led to more complex environments, telcos in the region are expanding their presence in the cloud space by acting as cloud resellers, providing managed services and supporting companies managing hybrid and multi-cloud environments.

Cloud Computing in Latin America, 2019 provides an executive-level overview of the cloud computing services market opportunity for telecoms companies in Latin America. It delivers quantitative and qualitative insights into the cloud market, analyzing key trends and growth drivers in the region.

It provides in-depth analysis of the following:

Key Highlights of the Market

Reasons to Buy This Report

Key Topics Covered

Section 1: Definitions

Section 2: Cloud computing market opportunity in Latin America

Section 3: Telco cloud positioning and go-to-market strategies

Section 4: Best practices from telco case studies

Section 5: Key findings and recommendations

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/gwtxek

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com


Is Cloud Computing the Answer to Genomics Big Data… – Labiotech.eu

The success of the genomics industry has led to generation of huge amounts of sequence data. If put to good use, this information has the potential to revolutionize medicine, but the expense of the high-powered computers needed to achieve this is making full exploitation of the data difficult. Could cloud computing be the answer?

Over the last decade, genomics has become the backbone of drug discovery. It has allowed scientists to develop more targeted therapies, boosting the chances of successful clinical trials. In 2018 alone, over 40% of FDA-approved drugs could be personalized to patients, largely based on genomics data. As that percentage has doubled over the past four years, this trend is unlikely to slow down anytime soon.

The ever-increasing use of genomics in the realm of drug discovery and personalized treatments can be traced back to two significant developments over the past decade: plunging sequencing costs and, consequently, an explosion of data.

As sequencing technologies are constantly evolving and being optimized, the cost of sequencing a genome has plummeted. The first sequenced genome, part of the Human Genome Project, cost $2.4B and took around 13 years to complete. Fast forward to today, and you can get your genome sequenced in less than a day for under $900.

According to the Global Alliance for Genomics and Health, more than 100 million genomes will have been sequenced in a healthcare setting by 2025. Most of these genomes will be sequenced as part of large-scale genomic projects stemming from both big pharma and national population genomics initiatives. These efforts are already garnering immense quantities of data that are only likely to increase over time. With the right analysis and interpretation, this information could push precision medicine into a new golden age.

Are we ready to deal with enormous quantities of data?

Genomics is now considered a legitimate big data field: just one whole human genome sequence produces approximately 200 gigabytes of raw data. If we manage to sequence 100M genomes by 2025, we will have accumulated over 20B gigabytes of raw data. This massive amount of data can be partially managed through data compression technologies from companies such as Petagene, but that doesn't solve the whole problem.

What's more, sequencing is futile unless each genome is thoroughly analyzed to yield meaningful scientific insights. Genomics data analysis normally generates an additional 100 gigabytes of data per genome for downstream analysis and requires massive computing power supported by large computer clusters, a feat that is economically unfeasible for the majority of companies and institutions.
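The storage arithmetic implied by these figures is straightforward; a back-of-the-envelope sketch using the per-genome numbers quoted above:

```python
RAW_GB_PER_GENOME = 200       # raw sequence data per genome
ANALYSIS_GB_PER_GENOME = 100  # additional downstream-analysis data per genome
GENOMES = 100_000_000         # genomes projected to be sequenced by 2025

total_gb = GENOMES * (RAW_GB_PER_GENOME + ANALYSIS_GB_PER_GENOME)
print(f"{total_gb:,} GB ≈ {total_gb / 1e9:.0f} exabytes")
# 30,000,000,000 GB ≈ 30 exabytes
```

Raw data alone accounts for the "over 20B gigabytes" in the text; adding downstream analysis pushes the total to roughly 30 exabytes.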

Researchers working with large genomics datasets have been searching for other solutions, because relying solely on high-performance computers (HPCs) for data analysis is economically out of the question for many. Large servers require exorbitant amounts of capital upfront and incur significant maintenance overheads. Not to mention, specialized high-end hardware, such as graphics processing units, requires constant upgrades to remain performant.

Furthermore, as most HPCs have different configurations, ranging from technical specs to required software, the reproducibility of genomics analyses across different infrastructures is not a trivial feat.

Cloud computing: a data solution for small companies

Cloud computing has emerged as a viable way to analyze large datasets quickly without having to worry about maintaining and upgrading servers. Simply put, cloud computing is a pay-as-you-go model that lets you rent computational power and storage, and it is pervasive across many different sectors.

According to Univa, the industry leader in workload scheduling in the cloud and HPC, more than 90% of organizations requiring high-performance computing capacity have moved, or are looking into moving, to the cloud. Although this figure is not specific to companies in the life sciences, Gary Tyreman, Univa's CEO, suggests that pharmaceutical companies are ahead of the market in terms of adoption.

The cloud offers flexibility, an alluring characteristic for small life science companies that may not have the capital on hand to commit to large upfront IT infrastructure expenses: HPC costs can make or break any company. As a consequence, many opt to test their product in the cloud first; if the numbers look profitable, they can then invest in an in-house HPC solution.

The inherent elasticity of cloud resources enables companies to scale their computational resources in relation to the amount of genomic data that they need to analyze. Unlike with in-house HPCs, this means that there is no risk money will be wasted on idle computational resources.

Elasticity also extends to storage: data can be downloaded directly to the cloud and removed once the analyses are finished, with many protocols and best practices in place to ensure data protection. Cloud resources are allocated in virtualized slices called instances. Each instance's hardware and software are pre-configured according to the user's demands, ensuring reproducibility.

Will Jones, CTO of Sano Genetics, a startup based in Cambridge, UK, offering consumer genetic tests with support for study recruitment, believes the cloud is the future of drug discovery. The company carries out large data analyses for researchers using its services in the cloud.

In a partnership between Sano Genetics and another Cambridge-based biotech, Jones's team used the cloud to complete the study at a tenth of the cost and in a fraction of the time it would have taken with alternative solutions.

Besides economic efficiency, Jones says that moving operations to the cloud has provided Sano Genetics with an additional security layer, as the leading cloud providers have developed best practices and tools to ensure data protection.

Why isn't cloud computing more mainstream in genomics?

Despite all of the positives of cloud computing, we haven't yet seen global adoption of the cloud in the genomics sector.

Medley Genomics, a US-based startup using genomics to improve the diagnosis and treatment of complex heterogeneous diseases such as cancer, moved all company operations to the cloud in 2019 in a partnership with London-based Lifebit.

Having spent more than 25 years at the interface between genomics and medicine, Patrice Milos, CEO and co-founder of Medley Genomics, recognized that cloud uptake has been slow in the field of drug discovery, as the cloud has several limitations that are preventing its widespread adoption.

For starters, long-term cloud storage is more expensive than its HPC counterpart: cloud solutions charge per gigabyte per month, whereas with an HPC, once you've upgraded your storage disks, you have no additional costs. The same goes for computing costs: while the cloud offers elasticity, Univa's CEO Tyreman says the computation cost of a single analysis can be five times higher than with an HPC solution in many scenarios. However, as cloud technologies continue to progress and the market becomes more competitive among providers, the ongoing cloud war will likely bring prices down.
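The storage trade-off described here can be framed as a simple break-even calculation. The prices below are placeholders for illustration, not quotes from any provider:

```python
def breakeven_months(upfront_disk_cost: float,
                     cloud_price_per_gb_month: float,
                     gb_stored: float) -> float:
    """Months of cloud storage after which buying disks would have been cheaper."""
    return upfront_disk_cost / (cloud_price_per_gb_month * gb_stored)

# e.g. a $2,000 storage upgrade vs. $0.02/GB-month for 10 TB held in the cloud:
months = breakeven_months(2_000, 0.02, 10_000)
print(f"break-even after {months:.0f} months")
```

Below the break-even point the pay-as-you-go model wins; past it, owned storage is cheaper, which is why long-lived datasets tend to favor in-house disks while transient analysis data favors the cloud.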

Furthermore, in the world of drug discovery, privacy and data safety are paramount. While cloud providers have developed protocols to ensure the data is safe, some risks still exist, for example, when moving the data. Therefore, large pharmaceutical companies prefer internal solutions to minimize these risks.

According to Milos, privacy remains the main obstacle preventing pharmaceutical companies from fully embracing the cloud, while the cost of moving operations away from HPCs is no longer a barrier. While risks will always exist to some extent, Milos highlighted that the cloud allows seamless collaboration and reproducibility, both of which are essential for research and drug discovery.

Current players in the cloud genomics space

Cloud computing is a booming business and 86% of cloud customers rely on three main providers: AWS (Amazon), Azure (Microsoft) and Google Cloud. Although the three giants currently control the market, many other providers exist, offering more specialized commercial and academic services.

Emerging companies are now leveraging the technology offered by cloud providers to offer bioinformatics solutions in the cloud, such as London-based Lifebit, whose technology allows users to run any bioinformatics analysis through any cloud provider with a user-friendly interface, effectively democratizing bioinformatics for all researchers regardless of skill set.

Federation is a concept from computing now used in the field of genomics. It allows separate computers in different networks to work together to perform secure analysis without having to expose private data to others, effectively removing any potential security issues.

"The amount of data organizations are now dealing with has become absolutely unmanageable with traditional technologies, and is too big to even think about moving," explained Maria Chatzou Dunford, Lifebit's CEO and co-founder.

"When data is moved, you increase the chances of having it intercepted by third parties, essentially putting it at significant risk. Data federation is the only way around this: unnecessary data storage and duplication costs, and painstakingly slow data transfers, become a thing of the past."

Getting ready for the genomics revolution

It's no secret that genomics is key to enabling personalized medicine and advancing drug discovery. We are now seeing a genomics revolution in which we have an unprecedented amount of data ready to be analyzed.

The challenge now is: are we ready for it? To be analyzed, big data requires massive computational power, which effectively becomes an entry barrier for most small organizations. Cloud computing provides an alternative way to scale analyses while facilitating reproducibility and collaboration.

While the cost and security limitations of cloud computing are preventing companies from fully embracing the cloud, these drawbacks are technical and are expected to be resolved within the next few years.

Many believe that the benefits of the cloud heavily outweigh its limitations. With major tech giants competing to offer the best cloud solutions, in a market expected to be worth $340 billion by 2024, we might expect a drastic reduction in costs. While some privacy concerns may still exist, leading genomics organizations are developing new tools and technologies to protect genomic data.

Taken as a whole, it is likely that the cloud will become increasingly important in accelerating drug discovery and personalized medicine. According to Univa's Tyreman, it will take around 10 to 15 years to see the accelerated transition from HPC to cloud, as large organizations are often conservative in embracing novel approaches.

"Distributed big data is the number one overwhelming challenge for life sciences today, the major obstacle impeding progress for precision medicine," Chatzou Dunford concluded.

"The cloud and associated technologies are already powering intelligent data-driven insights, accelerating research, discovery and novel therapies. I have no doubt we are on the cusp of a genomics revolution."

Filippo Abbondanza is a PhD candidate in Human Genomics at the University of St Andrews in the UK. While doing his PhD, he is doing an internship at Lifebit and is working as marketing assistant at Global Biotech Revolution, a not-for-profit company growing the next generation of biotech leaders. When not working, he posts news on LinkedIn and Twitter.



Exploring the Future of Cloud Computing in 2020 and Beyond – G2

Cloud computing has become a fundamental requirement for most organizations.

With this in mind, cloud computing is massively on the rise today. In fact, 81 percent of companies with 1,000 employees or more have a multi-platform strategy, and that number is expected to rise above 90 percent by 2024. Between 2018 and 2021, worldwide spending on public cloud services is expected to grow 73 percent, from $160B to $277B.

Cloud computing has been around for many years, and this sudden growth might surprise a lot of industry players.

Cloud computing became a phenomenon in the early 2000s. However, due to a lack of awareness about the technology's potential, many brands hesitated to adopt it for their products and processes. Bart McDonough, CEO of Agio, believes the recent rapid adoption of the cloud is mainly due to a better understanding of the technology's ease of use and scalability.

As organizations expand their understanding of the enormous benefits of cloud computing, they are now more willing to conduct workload tests on cloud and even migrate entire applications to the cloud.

Let's take a look at some major moments in cloud computing's recent history.

Since cloud computing simplifies the process of monitoring resource consumption, brands were able to use it for development and service delivery with a much higher level of confidence. Gradually, real-time streaming services started processing data over the cloud. Capitalizing on the opportunity, Microsoft launched Power BI, a business analytics and intelligence tool with comprehensive interactive visualization.

Tech giants are always eager to unleash breakthroughs to dominate the industry. In October 2019, Google announced a quantum breakthrough that could revolutionize cloud computing. Google claims to have achieved results well beyond the limitations of conventional technology.

A quantum machine has significant implications in areas like artificial intelligence. It actually makes today's most powerful supercomputers look like toys. The Google device performs a mathematical calculation in fewer than three and a half minutes; in comparison, a modern supercomputer would require more than 10,000 years to complete it. That's not all.

Google also launched Stadia, a cloud gaming service where users can stream and play AAA games on the go. All processing is done in the cloud, and users can play without specialized gaming hardware. This is a massive breakthrough for gamers, who no longer have to invest in expensive hardware.

Kris is the CSO and co-founder of Egnyte. He is responsible for creating and implementing Egnyte's global information security and compliance management strategies, policies, and controls that protect all customers' content and users.

He notes that the future of cloud computing is on its own very robust track that impacts everyone. The latest news about quantum computing indicates very early theoretical advances that will take many years to become available to the public. Eventually, the same advances in quantum computing may expand the scope of what is available via cloud computing in general.

The growth of cloud computing will only underscore the importance of getting people to focus on business value and customer acquisition first. The time and energy spent on foundational infrastructure requirements will be significantly offset by the core capabilities available immediately via cloud computing resources. As a result, key metrics like time to value will only become accelerated.

As cloud computing becomes mainstream, organizations are now moving toward adopting it in their in-house processes. As such, almost all industries are witnessing an increase in the use of cloud-based platforms and services.

Traditionally, all marketing data and metrics (including campaign stats, strategy plans, engagement numbers, etc.) were collected and reported separately. A marketer had to sit down with an analyst to connect all the dots and come up with a detailed picture that could then be used to set a future course of action.

A cloud-based marketing platform is an end-to-end digital marketing platform that integrates with marketing tools such as email, analytics tools, and social management measurement tools. The integrated data helps marketers build and optimize marketing strategies. A cloud-based marketing solution also simplifies the execution and management of multi-channel campaigns (social media, mobile, email, and web).

The purpose of the marketing cloud platform is to help develop a great marketing strategy, improve customer engagement, and increase return on investment.

Gone are the days when the search for information required a library. These days, digitally enabled classrooms allow students to create and submit presentations online, attend classes remotely through web conferencing, and collaborate and participate in globally distributed projects.

Hospital records before the era of cloud-based health management systems were notorious for their bulk. To figure out a course of treatment, hospital staff had to wade through a mountain of forms and paper files to piece together a patient's history. Fast forward to today, and all that data sits on secure cloud solutions, where information sharing and access among relevant stakeholders (medical professionals and insurers) is a breeze.

Cloud-based solutions ensure excellent and timely treatment without unnecessary delays. Recently, a remote surgery was performed from thousands of miles away. Such innovation could revolutionize the healthcare industry.

RELATED: Read about other high technologies like artificial intelligence making noise in the world of healthcare.

The future is bright for cloud computing. Analysts at IDC estimate that the field will evolve rapidly in the coming years, with almost 75% of data operations carried out outside the normal data center and 40% of organizations deploying cloud technology, with edge computing becoming an integral part of the technological setup. Also, a quarter of endpoint devices will be ready to execute AI algorithms by 2022.

Cloud computing lets businesses focus on achieving their goals, with performance at the core, without any fuss. When the five qualities of cloud computing (flexibility, agility, security, mobility, and scalability) combine with existing processes, business performance rises to a new level. The idea is to give businesses the computational power they need, the way they want it to work for them, and let them move on.

Software development agencies tend to follow the agile framework. With a launch-and-learn mindset working to their advantage, they pursue continuous integration and delivery while publishing much of their software as open source.

Online security firms are now using Google's zero-trust enterprise security model instead of traditional firewall standards. This practice lets users work remotely from any location on Earth with approved devices. With a managed hosting service working behind the scenes, users do not have to bother about network-level access to secure their devices, so device security remains intact.

Data analytics firms now work on continuous data to help businesses make informed decisions. The data is first fed into a machine learning system to get better results out of it. Today, every process has become continuous, meaning we can't label any process or product as completely finished.

Even the transition into this new digital world is a continuous journey. Organizations constantly release updates and iterations, testifying that the process is in continuous flow.

Similarly, security is not a fixed process but a steady flow of events, an ongoing practice that demands continuous upgrades. Data analytics, likewise, is not just faster but iterative. Cloud computing thus calls for working not just faster but more efficiently. This is why it unleashes a completely new set of possibilities to think about and work within the technological environment.

This brings us to the point that the technological transition has created a strong social impact, both within organizations and among the people these businesses work with.

Document sharing is one of the best examples of how collaborative technology works. It supports the continuous process of writing, editing, commenting on, and publishing a document. Similarly, producing videos at the click of a button has strengthened the value of communication, thereby speeding up action.

The availability of rich data streams has transformed the way prototypes are designed, with an emphasis on personalization.

When it comes to product development lifecycles, cloud-connected objects are having a positive impact on the whole process, pushing organizations to make frequent updates to the software they offer users. The subscription-based model, with its steady stream of new versions, is one example. Last year, Tesla improved the braking of its cars with an over-the-air download, an example of how cloud-connected objects can alter the behavior of a car's hardware. Both examples pinpoint the fact that today's customers have become more conscious about the products they buy.

This is why they want businesses to understand their wishes and pain points in advance and anticipate what they want next. This paradigm shift in the thinking of the average customer has raised the bar for competitors within the industry.

Industries like logistics and supply chain have also embraced the technological shift. Such organizations are readily adopting streaming capabilities. Blockchains have triggered a revolution in real-time payments and virtual warehousing. Similarly, Uber-like delivery services have played their part in making payment processes continuous. Shippabo, for example, optimizes route management and compliance to speed up operations, thanks to cloud-based infrastructure.

Just a decade ago, no one could have imagined the specifications of the consoles and mobile devices we use today. Similarly, applications of the cloud, from popular marketing cloud platforms and healthcare cloud platforms to emerging cloud-based educational applications, are poised to change the way we work (and play) in the coming years.

With this rapid growth, all major processing and computational capabilities will soon move to the cloud, and end users will be able to leverage this power anywhere through an on-demand consumption model.

The best part? Users no longer have to invest in equipment, because the cloud takes over the endpoint hardware's processing requirements. For brands, this opens up a whole new avenue: building computational and processing infrastructure that users can access through a simple UI. The main challenge is to ensure sustained, consistent delivery of these services to a globally distributed user base while continuing to provide value through new features and services.

As many futurists envision, cloud computing will give rise to a whole new breed of APIs and microservices that will become the main service and product delivery channel for brands. This will simplify development for end users, who can ship releases faster without getting bogged down in code-level issues.

Let's dive into some cloud computing trends making waves in 2020.

Quantum computing could transform the business world like never before. Companies like Google are leveraging the principles of quantum physics to make breakthroughs and develop next-generation products for end users. Today's quantum machines hint at what the technology could do if put to proper use. Corporations such as IBM, Microsoft, Google, and AWS are racing to gain a competitive advantage by adapting to the new quantum technology.

Quantum computers use the principles of quantum physics to perform complex calculations and process massive datasets quickly. These powerful machines can also be used to secure electronic communication and to augment cybersecurity.

Financial institutions can use quantum computing to speed up transaction processing, saving time and making the process more efficient. In quantum computing, data is stored in qubits, which, unlike classical bits, can hold superpositions of 0 and 1, allowing certain algorithms to evaluate many possibilities at once. Using quantum computing could also reduce the additional costs of acquiring new resources to handle already optimized operations.
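As a rough illustration of how qubits differ from classical bits, a single qubit's state can be simulated as a pair of amplitudes; applying a Hadamard gate puts it into an equal superposition of 0 and 1. This is a toy simulation for intuition only, not how a real quantum processor is programmed.

```python
import math

# Toy single-qubit simulation: a state is a pair of amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1.

def hadamard(state):
    """Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1.0, 0.0)           # the classical-like |0> state
superposed = hadamard(zero)

# Measurement probabilities are the squared amplitude magnitudes: 50/50.
print([round(abs(a) ** 2, 3) for a in superposed])  # -> [0.5, 0.5]
```

A classical bit would have to be either 0 or 1 at this point; the simulated qubit carries both outcomes until measurement, which is the property quantum algorithms exploit.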

Automation helps business organizations improve their productivity without excessive time and effort. The automation tools available today have proved invaluable for catching errors in business processes while streamlining them to generate fruitful results.

For instance, developers can make changes to their websites hosted on the cloud before going live. If anything goes wrong, they can restore an older version of the site without affecting sales or user experience. As soon as the website goes live, it starts receiving traffic, and opting for the cloud means more data consumption is involved. Managing applications and routine tasks can become tedious, so developers use automation to eliminate the manual steps in their daily operations.
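The deploy-and-restore workflow described above can be sketched with a few hypothetical functions standing in for a real hosting provider's API; this illustrates the automation pattern, not any particular vendor's interface.

```python
# Sketch of automated deploy-with-rollback. All names are hypothetical;
# a real setup would call a cloud provider's deployment API.
releases = ["v1"]  # deployment history, newest last; "v1" is live

def health_check(version: str) -> bool:
    # Stub: in practice this would probe the newly deployed site.
    return not version.endswith("-broken")

def deploy(version: str) -> str:
    """Deploy a version; automatically roll back if it fails its check."""
    releases.append(version)
    if not health_check(version):
        releases.pop()       # discard the bad release...
        return releases[-1]  # ...and keep serving the previous one
    return version

print(deploy("v2"))          # -> v2 (healthy, goes live)
print(deploy("v3-broken"))   # -> v2 (failed its check, rolled back)
```

The value of automating this is that the rollback happens in the same breath as the failed deployment, so live traffic never sees the broken version.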

Many organizations take security for granted because they misunderstand the concept of cloud-based security. Executives often assume that the cloud service provider alone is responsible for cloud security. That's where the misunderstanding lies. In fact, security compliance is a shared responsibility among all the stakeholders overseeing the organization's security operations, exercised through proper resource usage.

It is important to note that for a SaaS company, the cloud service provider offers an extra layer of security on top of the built-in features that come with the service package. In the case of shared hosting, users themselves must implement security measures and compliance to strengthen the existing services and policies.

We live in the world of the Internet of Things (IoT). With every device connected to the internet, businesses now encourage the use of IoT in almost every aspect of their operations. IoT devices can leverage cloud computing because it offers high speed, performance, flexibility, and ample storage space to keep data safe, locate resources, and share information among different users within the same space.

There is another phenomenon, the Internet of Everything (IoE), an offshoot of IoT that helps us interact with each other via devices connected to a particular network. This concept is continuously evolving, and the time is approaching when everything we use will be interconnected in an already populated network of devices.

The serverless paradigm is the next revolution in waiting, according to Amazon's CTO. In the serverless model, the cloud executes a developer's code on demand, without the developer having to provision or manage servers.

Using this approach, developers can divide software into small chunks of code and upload them to the cloud to address customers' needs, delivering a valuable experience. This practice ensures faster release cycles for software. Amazon Web Services (AWS) has already put the serverless paradigm to good use.
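A minimal sketch of what such a chunk of code looks like, following the shape of an AWS-Lambda-style Python handler (the event fields here are made up for illustration): the developer writes only the function, and the platform provisions everything else and invokes it per request.

```python
import json

# An AWS-Lambda-style handler: the cloud runtime calls this function
# for each request; the developer provisions and manages no servers.
def handler(event, context=None):
    name = event.get("name", "world")  # hypothetical event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally, we can invoke it directly to test the logic:
print(handler({"name": "cloud"}))
```

Because each function is this small and self-contained, teams can ship and update them independently, which is where the faster release cycles come from.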

As cloud computing continues to make inroads into the enterprise world, all stakeholders are looking forward to the evolution of the model. As things stand today, almost every significant innovation, including blockchain, artificial intelligence, AR/VR, robotics, and IoT, relies on cloud computing technology.

It's not just computational power, networking speed, or storage capacity that makes cloud computing great. Those are just operational metrics that better technology will eventually change and replace over time. The real value of technology is what it does, not what it's made of.

While you're pondering the future of cloud computing and its benefits, find the best cloud file storage software on the marketplace to use today!

Read more:

Exploring the Future of Cloud Computing in 2020 and Beyond - G2

Socionext Side Steps the Pitfalls of Cloud Computing with New Edge Computing Devices – News – All About Circuits

Socionext recently partnered with Foxconn Technology Group and Network Optix to produce a new solution for edge computing, the BOXiedge. But what drives this multicore, edge-computing server, and how does its SoC point to the trend in edge computing?

Cloud computing has its disadvantages: latency, security vulnerabilities, and unreliable network connections. These disadvantages have given rise to edge computing, which allows IoT devices to run AI algorithms that were trained in the cloud.
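The split described here, training in the cloud and inference at the edge, can be sketched in plain Python: only the trained weights travel to the device, which then classifies sensor readings with no network round trip. The weights and inputs below are made-up placeholders.

```python
# Edge-inference sketch: weights were (notionally) trained in the cloud
# and downloaded once; prediction runs entirely on the local device.
WEIGHTS = [0.8, -0.5]  # placeholder values standing in for a real model
BIAS = 0.1

def predict(features):
    """Tiny linear classifier evaluated on-device, with no cloud call."""
    score = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1 if score > 0 else 0

# Classify a (hypothetical) sensor reading locally:
print(predict([1.0, 0.4]))   # -> 1
```

Latency here is just the local arithmetic, and the raw sensor data never leaves the device, which is exactly the latency and security argument for edge deployments.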

Socionext claims that the BOXiedge is one such edge computing solution. The fanless device includes a 24-core mini-server that uses 30 watts of power. The BOXiedge is designed for edge computing applications that require intelligence, including factory automation, retail, agriculture, and digital health.

By using many cores on a local system capable of running AI algorithms, users will not need to rely on a stable, secure internet connection. Sensitive data is also stored locally on the server, improving security.

A deeper dive into the processor that drives the BOXiedge may illustrate how the virtues of edge computing are reflected in the hardware design.

The SynQuacer SC2A11 is a 24-core, 64-bit Arm Cortex-A53 SoC clocked at 1 GHz. The SoC includes three levels of cache with 4 MB of L3, plus DDR4 RAM, PCIe, and LAN.

It also includes various I/Os, including UART, I2C, and GPIO.

Each core has two levels of cache, while the L3 cache sits on the interconnect shared by all the cores. The SC2A11 is scalable, supporting 64 devices in parallel, for a total possible core count of 1,536.

Designed for power efficiency (specifically, 5 W), the SC2A11 is aimed at servers, industrial applications, and edge computing applications. The many cores are said to allow large amounts of data to be processed simultaneously.

The SC2A11 is housed in a metal-lid BGA package measuring 30 mm x 30 mm. This compact design makes it a useful option for applications requiring a large number of cores. Several security features are also integrated into the SC2A11, including OPAL and an Ethernet processor. Socionext also offers an evaluation board for the device.

The evaluation board includes the SC2A11 with multiple DDR4 memory slots and various I/O.

Microcontrollers are often geared toward processing a single task.

The SC2A11, however, demonstrates that microcontrollers may take a new direction: multicore designs. While the SC2A11 is a dramatic example of a multicore SoC, it does show how designs are being influenced by edge computing.

The demand for more secure systems and lower-latency devices (such as those required in self-driving vehicles) emphasizes the utility of parallel designs. It's likely that this focus on security and latency will influence future microcontrollers, including the integration of multiple cores and AI co-processors.

Read more:

Socionext Side Steps the Pitfalls of Cloud Computing with New Edge Computing Devices - News - All About Circuits

Top 7 emerging hybrid cloud computing trends to watch in 2020 – Express Computer

By Nitin Mishra

The cloud computing industry is evolving at lightning speed, with new trends and developments emerging every year. However, one thing has become quite clear: the future of cloud is hybrid. Combining the higher security for mission-critical applications that comes with private cloud and the flexibility and scalability offered by public cloud, hybrid cloud promises organizations the best of both worlds.

Given the advantages, hybrid cloud has emerged as the preferred implementation model that will drive the progression of the cloud industry. This trend is corroborated by the findings of the research firm Gartner, which reveals that 77% of enterprise global infrastructure decision makers that are planning, implementing, or upgrading cloud technology say that they are in a hybrid cloud environment.

As the hybrid cloud space continues to mature, it is undergoing rapid evolution. Let's look at some definitive trends that will define the hybrid cloud future in the coming year:

Consistent hybrid experience: Talk of hybrid cloud being the future has been doing the rounds for quite some time. The year 2020 will see that turn into reality. With new innovations in the hybrid cloud space, organizations will be able to ensure a seamless experience across the environment instead of treating public cloud and on-premise or private cloud as separate pieces. Organizations will be able to move past latency challenges and enjoy a truly consistent hybrid experience.

Hybrid multicloud on the rise: Organizations across industries are picking and mixing technologies and services from multiple cloud providers per their specific business needs, to avoid vendor lock-in and gain best-of-breed capabilities. As multiple cloud providers get added to the public piece of the hybrid cloud puzzle, hybrid multicloud will become the new IT normal. Using a mix of on-premise, private, and public cloud from multiple providers, hybrid multicloud offers organizations the freedom and flexibility to run workloads on-premise or in the cloud, and even to change cloud providers if required. Further, a hybrid multicloud approach enables organizations to adopt common management and software development capabilities across the environment. Through 2020, hybrid multicloud will emerge as a dominant trend shaping the cloud industry. Per McKinsey & Company, hybrid multicloud is set to be a USD 1.2 trillion market opportunity by 2022.

Edge computing gains greater relevance in hybrid cloud strategies: Edge computing, a model where computations are performed as close as possible to the sources and sinks of data and content, will become a critical element of hybrid cloud strategies. The combined capabilities of hybrid cloud and edge computing bring a strong value proposition for organizations: the hybrid cloud ecosystem can aggregate the most relevant data and back-end functions, while the edge supports processing and real-time analytics. In 2020, as the number of IoT devices increases, more and more enterprises will tap into the advantages of edge combined with the hybrid cloud model to discover key business insights.

Containerization continues to rise: Industry pundits widely recognize containers as the core of an effective hybrid cloud model, thanks to their ability to provide consistency regardless of whether a workload is deployed on-premise or on one or more clouds. Containers simplify the deployment, management, and operational concerns associated with a hybrid cloud, helping organizations maximize the business value of their hybrid cloud strategy. In 2020, while containers will not yet become mainstream, we can expect stronger adoption. With every tech giant introducing platforms to simplify the deployment and management of clusters, organizations will consider tapping into the power of containers to make hybrid cloud heterogeneous and workload-agnostic.

Hyperconverged infrastructure for hybrid cloud gains traction: Hyperconverged infrastructure (HCI) is emerging as the best match for the hybrid environment because it addresses that environment's biggest concern: increased complexity. HCI enables organizations to integrate cloud into their environments by eliminating the need to manage compute, storage, and network resources as separate tiers; everything can be managed from a single pane of glass. Further, the pre-integrated, consolidated compute and storage resources of HCI solutions enable cloud implementations to run faster, scale higher, and respond quicker. As hybrid cloud continues to gain momentum, HCI solutions are evolving to align with the needs of the hybrid cloud world. In 2020, expect more and more organizations to run their hybrid cloud on HCI, which, according to industry watchers, will eventually become the go-to infrastructure platform for hybrid cloud.

DR and backup requirements will spur hybrid cloud adoption: In today's always-on business environment, effective disaster recovery (DR) and backup have become more important than ever to ensure business continuity and data safety. Providing scalability, flexibility, and cost efficiency, a hybrid cloud approach to DR and backup can prove extremely beneficial for businesses. DR is a complex, cost- and resource-intensive activity, so cloud-based DR and backup are emerging as viable options. A hybrid cloud model lets organizations keep their secondary off-site backup location in the cloud, far more effective than traditional storage, while retaining the flexibility to host sensitive data on a private network and meet compliance requirements. Increasing DR and backup requirements will continue to drive hybrid cloud adoption through 2020.

Stronger uptake of hybrid cloud managed services: In 2020, a large number of organizations will partner with cloud service providers to define and determine the optimum approach to cloud management. Organizations will choose providers to handle the complexities associated with hybrid cloud and to effectively manage instances across cloud providers and various deployment models (private cloud and on-premise). Cloud providers offering a combination of a robust cloud management platform and managed services will emerge as preferred partners, given their ability to help organizations achieve a single view and a holistic, seamless experience of their hybrid infrastructure.

As we enter a new decade, cloud is certainly braver and stronger than ever before. Hybrid cloud will continue to grow and evolve to offer organizations more flexibility and consistency and help them speed up application deployment cycle.

(The author is the Senior Executive Vice President & Chief Product Officer, NTT-Netmagic)


Continued here:

Top 7 emerging hybrid cloud computing trends to watch in 2020 - Express Computer

How Alibaba plans to improve the Olympics with cloud computing – MarketWatch

LAUSANNE, Switzerland (AP) - Alibaba's promise to the Olympic family is to bring its technological might to help organizers, broadcasters and fans.

The Chinese tech giant's Olympic sponsorship deal is worth hundreds of millions of dollars and runs for 12 years, starting in 2017 and continuing through the upcoming Tokyo Games, the 2022 Winter Games in Beijing, the 2024 Paris Olympics and the 2028 Los Angeles Games.

"We are not just putting the rings alongside our logo," Alibaba BABA, +1.56% chief marketing officer Chris Tung told The Associated Press in an interview at the Winter Youth Olympics in Lausanne, the IOC's home city. "We also like to leverage our technology to help transform and upgrade the games. That is always what we have in mind by being a sponsor."

At this year's Tokyo Olympics, broadcasters will gain the most from Alibaba's cloud computing work. Fans will see the biggest changes at the 2022 Winter Games in Beijing as ticketing and merchandise operations become more digital.

Then it should be the turn of organizers as they reap more rewards from Alibabas expertise at the 2024 and 2028 Games.

"We expect Paris and L.A. will be a total explosion of what has been tested and what has been discussed with the previous games," Tung said.

Now clearly established in a top tier of global partners working with both the International Olympic Committee and FIFA, Alibaba is looking to improve sports events on top of helping finance them. With Olympic sports and anti-doping bodies now regularly targeted by hackers, including from Russia, cybersecurity is cited as "an absolute must" by Tung.

Asked about integrity issues confronting Chinese tech giant Huawei, Tung pointed to Alibaba enjoying the trust of 200,000 brands using its e-commerce platforms Tmall and Aliexpress.

The two-week Winter Youth Olympics will test Alibaba's work with the IOC subsidiary Olympic Broadcasting Services to create a better service for rights-holding national networks in Tokyo. Routing thousands of hours of action and interviews through Alibaba's cloud servers should be faster and more efficient, with fewer staff and less equipment on site, and should create more content for social media.

What works in 2020 should be improved in 2022 when Beijing becomes the first city to host both the Summer and Winter Games.

"Fans and athletes will experience a totally different Olympic Games with a lot of support from cloud technology," Tung said, predicting long lines at souvenir stands will be replaced by immediate delivery to hotels through orders on Alibaba's online platforms.

After a decision by FIFA in October, China and Alibaba will get an extra forum to test hosting ideas seven months before the Winter Olympics. In June 2021, a revamped 24-team Club World Cup will kick off in China, bringing tens of thousands of soccer fans worldwide to games in several cities. Alibaba Cloud is already the title sponsor of the existing annual, but low-key, club tournament.

"We are excited about the opportunity," Tung said, adding that Alibaba awaited details from FIFA about the new Club World Cup plans. "It's very strategic for us to increase the reach and influence of (soccer) as a sponsor."

Many observers see a logical next step in Alibaba joining Olympic sponsors like Coca-Cola and Visa on FIFA's World Cup slate, especially as China is widely expected to host in 2030 or 2034.

"We," Tung said, "are open to the possibility."

More:

How Alibaba plans to improve the Olympics with cloud computing - MarketWatch

Satya Nadella credits Steve Ballmer for pushing Microsoft into the cloud – CNBC

Microsoft's stock price has more than quadrupled since Satya Nadella took over as CEO six years ago, thanks to the company's rapid growth in cloud computing.

But Nadella credits his predecessor, the louder and more outspoken Steve Ballmer, for making the decision to take on Amazon Web Services in cloud infrastructure, where Microsoft is now the clear No. 2 player.

"The guy who gave me permission to do all this was Steve Ballmer," Nadella told CNBC's Jim Cramer on Thursday, in an interview on"Mad Money"from Microsoft's headquarters in Redmond, Washington. "He wanted us to be bold and go at the cloud very aggressively, and that's what we did."

With a market value of $1.27 trillion, Microsoft is the third most valuable public company, behind Saudi Aramco and Apple. The stock rose 58% in 2019, its best year in a decade, and continues to push the S&P 500 higher.

[Chart: Microsoft's stock under Nadella. Source: CNBC]

The story was much different during Ballmer's tenure, as sales growth slowed and the stock flatlined. But even in those years, Ballmer's Microsoft was in the early phases of assembling a cloud that could challenge AWS. Azure was announced in 2008, six years before Nadella became CEO.

Nadella then took steps to make Azure an easier choice for software developers. Two months after he became CEO, he changed Windows Azure to Microsoft Azure, underscoring the service's ability to handle computing tasks from Linux, which had long been a rival to Windows. Nadella also forged partnerships with competing companies like Red Hat and SAP.

While Nadella wants to recognize Ballmer for guiding Microsoft toward the cloud, he's made a number of decisions to undo moves made by the prior CEO, who now owns the NBA's Los Angeles Clippers.

Nadella backed Microsoft away from mobile hardware by selling off the feature phone business it picked up with the 2013 acquisition of the Nokia devices and services business. And while Windows was at the very center of Microsoft under Ballmer and even Bill Gates, in October Nadella told Wired that "the operating system is no longer the most important layer for us."

Steve Ballmer, left, and NBA Commissioner Adam Silver attend an NBA playoff game between the Oklahoma City Thunder and the Los Angeles Clippers at Staples Center on May 11, 2014 in Los Angeles.

Noel Vasquez | GC | Getty Images

Under Nadella, cloud has become the centerpiece of the business, with Azure serving as the backbone for cloud-based Office applications and other services for personal and home use as well as powering other large enterprises and government agencies. On Thursday, Nadella called out cloud customers Marks & Spencer, Walmart and Walgreens.

"I think you have to have conviction on where the world is going, make sure you bet long before anybody gives you credit for it, and then, of course, execute," Nadella said. "That's what we have done in every layer of the stack."

In comparing the Microsoft of today to the company Gates led in the 1970s, Nadella highlighted the current focus on artificial intelligence, distributed computing and "completely different ways to think about even end-user computing with things like HoloLens."

There's one fundamental similarity.

"But guess what?" he said. "It's software."


Read more from the original source:

Satya Nadella credits Steve Ballmer for pushing Microsoft into the cloud - CNBC

Majority of workloads will go to cloud in 2020 – TechHQ

If we needed evidence of cloud computing's success, the fact that AWS revenue has risen from US$300 million in 2009 to become a US$32 billion business today is fairly substantial.

"What began as a movement to provide companies with easier and more cost-effective access to IT infrastructure has become the most important computing development in history," opens a new report by AllCloud, which cements the trajectory of what is, simply, access to computers over the internet.

According to the Cloud Infrastructure Report, featuring responses from more than 150 IT decision-makers at organizations where at least 300 employees were using cloud infrastructure, 85 percent of organizations expect to shift the majority of their workloads cloudwards by the end of the year, while just shy of a quarter (24 percent) plan to be cloud-only.

At present, around seven in 10 already run at least half of their workloads on the cloud.

The firm said the findings supported its own client feedback: that firms are now comfortable and familiar with the technology, and are ready to explore further benefits and new ways to use it.

"What was once a simple implementation or a single workload migration to the cloud has now transformed into a complete infrastructure, platform and application modernization," AllCloud said of the findings.

The continued and aggressive move to the cloud will come hand-in-hand with a further move towards new technologies, such as containers and microservices. More than 56 percent of respondents said at least half of all their cloud workloads are using these technologies.

The adoption of containerized workloads adds efficiency, consistency, and ease of management to cloud strategies. In particular, AllCloud has seen an increase in the number of clients that are planning to implement Kubernetes.

Given AllCloud's focus on supporting AWS, the report centered on the use of this particular service and its extensive network of managed service providers (MSPs).

These are providing 43 percent of organizations with outside support for more than half of their workloads. Database MSPs, in particular, have seen the biggest leap, simplifying the process of migration. There is also a growing understanding that organizing and managing data is a necessary foundational service and strategy, allowing businesses to set the stage for analysis and broader cloud goals.

More than a fifth of respondents (21 percent) said they'd be looking to enlist database MSPs in the coming year, while the limitless business use cases of IoT are proving attractive to 17 percent.

However, nearly two-thirds (65 percent) of respondents in the study said their company used a hybrid cloud approach. "This reinforces the flexibility of the AWS platform, which easily works with other infrastructure and technology solutions," the report noted.

These hybrid collaborations are sometimes short-term, while others come down to necessity.

Owing to AWS's partnership with VMware, almost three-quarters of enterprise private cloud workloads are using its virtualization services, "which underlines the need for continued integration between the two platforms," AllCloud said.

The existing partnership is likely to grow stronger and broader, with more accessibility released between the technologies. This will allow a faster rate of enterprise adoption for organizations that want to leverage the benefits of the cloud.

Interestingly, despite the expense of cloud computing services being a commonly cited concern, cost (14 percent) ranked only fourth among the deciding factors when choosing a cloud platform.

Security (28 percent) was ranked first priority, while reliability (26 percent) and flexibility (22 percent) were also top considerations.

Visit link:

Majority of workloads will go to cloud in 2020 - TechHQ

Global IT compensation is at an all-time high, thanks to skills trainings – TechRepublic

IT professionals earned an average of $5,000 more in 2019, all because of job performance, Global Knowledge found.

The average annual salary for IT professionals worldwide is $89,732, the highest yet found by Global Knowledge's yearly report. The report is the largest worldwide study of professionals in the tech community and has been conducted every year for more than a decade.

IT professionals earned an average of $5,000 more in 2019 than in 2018, with the main reason being improved job performance, the report found.

This increase in both salaries and performance quality indicates that more people are taking steps to progress their professional development, resulting in better performance and more compensation, according to the report.


Global Knowledge's 2019 IT Skills and Salary Survey used the Qualtrics Insight platform to tabulate submissions. The survey was sent out via email to recipients in the Global Knowledge database and yielded 12,271 responses, with more than half (54%) from the US and Canada, and the rest from other countries around the world.

Globally, the highest salaries in various functional areas included cloud computing ($115,889), IT architecture and design ($98,580), project management ($98,344), and cybersecurity ($97,322). The report also broke down salaries by region.


North America

North American IT professionals earned $109,985 on average, which is 23% more than the worldwide average mentioned previously. These professionals also saw a 6% raise in salary in 2019 and 50% received bonuses, according to the report.

US IT professionals, in particular, earned higher average salaries than those in any other region at $113,639, which is largely due to geography. While salaries along the coasts were the highest, those locations are also where the cost of living is the most expensive, the report found.

In Canada, the average annual salary is $74,048, with the highest paying IT professionals living in Quebec and making around $77,897.

The highest paying job function areas in North America included executives ($148,034), cloud computing ($138,320), and IT architecture and design ($126,095), according to the report.

Latin America

In Latin America, IT professionals earned an average of $41,465. IT decision makers in this region make significantly more than their staff, a 44% difference. IT decision makers also saw the highest raise percentage of all regions at 9%, while their IT staff saw only a 5% increase, the report found.

The functional areas with the highest salaries were executives ($68,253), cloud computing ($50,480), and project and program management ($48,478), according to the report.

Europe, the Middle East and Africa (EMEA)

The average salary for IT professionals in EMEA was $70,445, with IT staff seeing a 6% raise and IT decision makers seeing a 5% raise, which was in line with the worldwide average raise percentages, the report found.

Within Europe, Switzerland dominated the salaries with an average of $136,301. Norway had the second highest at $97,525, followed by Germany at $95,456, according to the report.

The highest paying function areas in EMEA were executives ($101,523), cloud computing ($99,290), and IT architecture and design ($83,606), the report found.

Asia-Pacific

IT professionals in the Asia-Pacific region made on average $65,738 per year. Similar to Latin America, the gap between the salaries of IT decision makers and IT staff is significant, at a 38% difference.

However, the raise percentages for IT decision makers and IT staff are the same, at 5%, the report found.

The highest paying function areas for the Asia-Pacific region were executives ($108,794), cloud computing ($84,764), and program and project management ($74,608), according to the report.

A commonality across all regions was that salaries rose in each. As previously stated, salaries were higher in this edition of the report than in any other, and the survey aimed to discover the reason behind the shift.

Respondents said that the top factors that increased salary included their current job performance (42%), a standard company increase (39%), a promotion within the company (15%), and a cost of living increase (15%).

A particularly interesting reason behind a pay raise was the development of new skills (9%), the report found. Those same individuals who said they developed new skills of added value reportedly earned nearly $12,000 more than last year, indicating reskilling and upskilling training sessions pay off, according to the report.

Learning and development is often overlooked, or not prioritized, in companies, which is harmful to both employees and employers. The majority of employees (94%) say they would stay longer at an organization if they were offered opportunities to learn and grow, LinkedIn's 2019 Workforce Learning report found.

Upskilling and reskilling employees is particularly crucial in the era of digital transformation; as technology is constantly changing and evolving, employees need to do the same.

Rather than firing and rehiring workers with every technological shift, companies should instead allocate resources toward intellectual growth and skills. And Global Knowledge's report is evidence that the skills training does its job.

For more, check out Impressive professional development benefits from Amazon, Google, Microsoft, and more on TechRepublic.


Read the rest here:

Global IT compensation is at an all-time high, thanks to skills trainings - TechRepublic

Job of the Week: Head of Research Computing at the Norwich Biosciences Institute – insideHPC

The Norwich Biosciences Institute Partnership is seeking a new Head of Research Computing in our Job of the Week.

We have an exciting opportunity for a strategic High-Performance Computing (HPC) leader to take accountability for the future development of research computing across one of the UK's foremost research organisations. The Head of Research Computing will set the future technology strategy and service model for mission-critical research IT services at NBI.

The Norwich Biosciences Institutes (NBI) are a cluster of internationally renowned research organisations. The institutes work together to tackle the major challenges facing us all in the 21st Century: the sustainability of our environment, our food supplies and healthy ageing. Across NBI, there are over 1,000 scientists working to find realistic and practical solutions to these challenges. We then provide the infrastructure and support to translate their discoveries into commercially successful businesses.

This vital, and impactful, new role will have end-to-end ownership for the provision of our research computing shared-service. The Head of Research Computing will lead the development and delivery of research computing provision across NBI, including the strategic evolution of the technology roadmap, incorporating contemporary approaches such as flexible cloud computing based HPC provision.

The Role:

Looking for a new gig? Our Jobs Board helps companies of all sizes hire the best talent and offers the best opportunity for job seekers to get hired.

Are you paying too much for your job ads? Priced at just $99.99 for 90 days, ads on our insideHPC Jobs Board are a great way to reach the top supercomputing professionals.

Post a Job

Here is the original post:

Job of the Week: Head of Research Computing at the Norwich Biosciences Institute - insideHPC

Middle East Cloud Applications Market Worth $4.5 Billion by 2024 – Exclusive Report by MarketsandMarkets – PR Newswire UK

CHICAGO, Jan. 17, 2020 /PRNewswire/ -- According to a new research report, "Middle East Cloud Applications Market by Application (ERP, CRM, HCM, SCM, and Business Intelligence and Analytics), Organization Size, Vertical (BFSI, Manufacturing, and Telecommunications), and Country - Forecast to 2024", published by MarketsandMarkets, the Middle East Cloud Applications Market is expected to grow from USD 2.0 billion in 2019 to USD 4.5 billion by 2024, at a Compound Annual Growth Rate (CAGR) of 17.5% during the forecast period.
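As a quick plausibility check, the reported figures can be reproduced with the standard compound-growth projection; a minimal sketch in Python:

```python
# Sanity check: the release says USD 2.0B in 2019 grows to USD 4.5B by
# 2024 at a 17.5% CAGR. Projecting forward shows the figures agree.
def cagr_project(start_value, rate, years):
    """Project a value forward at a compound annual growth rate."""
    return start_value * (1 + rate) ** years

projected = cagr_project(2.0, 0.175, 5)
print(round(projected, 2))  # 4.48, in line with the reported USD 4.5B
```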

The growing demand for cloud-based services and advanced technologies, increasing need to engage with customers, and deliver an enriched experience continuously are some of the major factors driving the growth of the Middle East Cloud Applications Market.

Browse in-depth TOC on "Middle East Cloud Applications Market"

38 Tables

28 Figures

110 Pages

Download PDF Brochure:

https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=262938413

Among applications, cloud-based CRM applications to grow at a higher rate during the forecast period

Cloud Customer Relationship Management (CRM) enables enterprises to store and utilize customer data at scale to offer better services and manage relationships with customers. Cloud-based CRM is gaining popularity among enterprises due to the various benefits it offers, such as easy accessibility, affordability (especially for Small and Medium-sized Enterprises [SMEs]), rapid implementation, easy upgrades, scalability, and integration capability with other data sources. Cloud-based CRM applications centralize the customer database and provide a comprehensive view of all interactions with customers, offer instant access to real-time insights into sales opportunities, and automate task management processes. With ease of use and affordability, cloud CRM increases customer retention rates, making businesses more successful. Salesforce, Zoho, Oracle, and Microsoft are some of the leading vendors offering cloud CRM.

The retail and consumer goods vertical is one of the fastest-growing verticals in the region

Factors driving the adoption of cloud applications are the rising purchasing power of customers and the need to satisfy customer expectations, which leads to existing-customer retention and new-customer acquisition. Online retailing and cloud technologies have significantly disrupted the retail and consumer goods vertical, leading to the adoption of cloud computing mainly for storage, backup, and security services. Cloud computing services enable retailers to access customer data with a single click from any store located anywhere, leading to better customer service delivery. For instance, Hallmark Cards, a retailer with stores worldwide known for selling greeting cards and other products, leverages a private cloud to store its data and manage operations such as real-time inventory tracking, freeing employees to focus on delivering an enhanced customer experience and on business-critical activities.

Speak to Our Expert Analyst:

https://www.marketsandmarkets.com/speaktoanalystNew.asp?id=262938413

Saudi Arabia to have the largest market size during the forecast period

Saudi Arabia has been accelerating the adoption of Information Technology (IT) services in recent years. The country's regulatory body, the Communications and Information Technology Commission (CITC), has regulated cloud computing in Saudi Arabia by publishing the Cloud Computing Regulatory Framework (CCRF) on its website. The CCRF aims to provide clarity and certainty on the rights and obligations of Cloud Service Providers (CSPs) and users of cloud services, which shows the government's interest in accelerating the adoption of cloud-based services in the country. The key factors driving adoption of cloud technology in the country include reduced costs, improved infrastructure efficiency, and enhanced scalability.

Market Players:

Major vendors in the Middle East Cloud Applications Market include SAP (Germany), Oracle (US), Microsoft (US), Infor (US), Salesforce (US), Sage Group (UK), IBM (US), Epicor (US), 3I Infotech (India), Ramco Systems (India), Prolitus Technologies (India), IFS (Sweden), and QAD (US).

Browse Adjacent Markets: Cloud Computing Market Research Reports & Consulting

About MarketsandMarkets

MarketsandMarkets provides quantified B2B research on 30,000 high-growth niche opportunities/threats that will impact 70% to 80% of worldwide companies' revenues. The firm currently serves 7,500 customers worldwide, including 80% of global Fortune 1000 companies. Almost 75,000 top officers across eight industries worldwide approach MarketsandMarkets for their pain points around revenue decisions.

Our 850 full-time analysts and SMEs at MarketsandMarkets track global high-growth markets following the "Growth Engagement Model (GEM)". The GEM aims at proactive collaboration with clients to identify new opportunities and the most important customers, write "attack, avoid and defend" strategies, and identify sources of incremental revenue for both the company and its competitors. MarketsandMarkets now publishes 1,500 MicroQuadrants (positioning top players as leaders, emerging companies, innovators, and strategic players) annually in high-growth emerging segments. MarketsandMarkets is determined to benefit more than 10,000 companies this year in their revenue planning, and to help them take their innovations/disruptions to market early by providing research ahead of the curve.

MarketsandMarkets' flagship competitive intelligence and market research platform, "Knowledge Store", connects over 200,000 markets and entire value chains for a deeper understanding of unmet insights, along with market sizing and forecasts of niche markets.

Contact: Mr. Shelly Singh, MarketsandMarkets Inc., 630 Dundee Road, Suite 430, Northbrook, IL 60062, USA. Phone: +1-888-600-6441. Email: sales@marketsandmarkets.com. Website: https://www.marketsandmarkets.com. Research Insights: https://www.marketsandmarkets.com/ResearchInsight/middle-east-cloud-application-market.asp. Content Source: https://www.marketsandmarkets.com/PressReleases/middle-east-cloud-application.asp

Logo: https://mma.prnewswire.com/media/660509/MarketsandMarkets_Logo.jpg

SOURCE MarketsandMarkets

Link:

Middle East Cloud Applications Market Worth $4.5 Billion by 2024 - Exclusive Report by MarketsandMarkets - PR Newswire UK

Alibaba’s 10 Tech Trends to Watch in… – Alizila

The Alibaba DAMO Academy, Alibaba Group's global program for tackling ambitious, high-impact technology research, has made some predictions about the trends that will shape the industry in the year ahead. From more-advanced artificial intelligence to large-scale blockchain applications, here's what you can expect in 2020.

1. Artificial Intelligence Gets More Human

2020 is set to be a breakthrough year for AI, according to DAMO. Researchers will be taking inspiration from a host of new areas to upgrade the technology, namely cognitive psychology and neuroscience combined with insights into human behavior and history. They'll also adopt new machine-learning techniques, such as continual learning, which allows machines to remember what they've learned in order to more quickly learn new things, something humans take for granted. With these advances in cognitive intelligence, machines will be able to better understand and make use of knowledge rather than merely perceive and express information.

2. The Next Generation of Computation

Computers these days send information back and forth between the processor and the memory in order to complete tasks. The problem? Computing demands have grown to such an extent in the digital age that our computers can't keep up. Enter processing-in-memory (PIM) architecture, which integrates the processor and memory into a single chip for faster processing speed. PIM innovations will play a critical role in spurring next-generation AI, DAMO said.

3. Hyper-Connected Manufacturing

The rapid deployment of 5G, Internet of Things and cloud- and edge-computing applications will help manufacturers go digital, covering everything from automating equipment, logistics and production scheduling to integrating their factory, IT and communications systems. In turn, DAMO predicts they'll be faster to react to changes in demand and able to coordinate with suppliers in real time to improve productivity and profitability.

WATCH: An Inside Look at Cainiao's Hyperconnected Warehouse

4. Machines Talking to Machines at Scale

More-advanced IoT and 5G will enable larger-scale deployments of connected devices, which will bring a range of benefits for governments, companies and consumers. For example, traffic-signal systems could be optimized in real time to keep drivers moving (and happy), while driverless cars could access roadside sensors to better navigate their surroundings. These technologies would also allow warehouse robots to maneuver around obstacles and sort parcels, and fleets of drones to efficiently and securely make last-mile deliveries.

5. Chip Design Gets Easier

Have you heard? Moore's Law is dying: it is now becoming too expensive to build faster and smaller semiconductors. In its place, chipmakers are now piecing together smaller chiplets into single wafers to handle more-demanding tasks. Think Legos. Another advantage of chiplets is that they often use already-inspected silicon, speeding up time to market. Barriers to entry in chipmaking are dropping, too, as open-source communities provide alternatives to traditional, proprietary design. And as more companies design their own custom chips, they are increasingly contributing to a growing ecosystem of development tools, product information and related software that will enable still easier and faster chip design in the future.

6. Blockchain Moves Toward Mainstream

The nascent blockchain industry is about to see some changes of its own. For one, expect the rise of the blockchain-as-a-service model to make these applications more accessible to businesses. There will also be a rise in specialized hardware chips for cloud and edge computing, powered by core algorithms used in blockchain technologies. Scientists at DAMO forecast that the number of new blockchain applications will grow significantly this year as well, while blockchain-related collaborations across industries will become more common. Lastly, the academy expects large-scale blockchain applications to see wide adoption.

7. A Turning Point for Quantum Computing

Recent advancements in this field have stirred up hopes of making large-scale quantum computers a reality, which will prompt more investment in quantum R&D, according to DAMO. That will result in increased competition and ecosystem growth around quantum technologies, as well as more attempts to commercialize the technology. DAMO predicts that after a difficult but critical period of intensive research in the coming years, quantum information science will deliver breakthroughs such as computers that can correct computation errors in real time.

8. More Revolution in Semiconductors

Demand is surging for computing power and storage, but major chipmakers still haven't developed a better solution than 3-nanometer-node silicon-based transistors. Experiments in design have led to the discovery of other materials that might boost performance. Topological insulators and two-dimensional superconducting materials, for example, may become connective materials, as their properties allow electrical currents to flow without resistance. New magnetic and resistive switching materials might also be used to create next-generation magnetic memory technology, which can run on less power than its predecessors.

9. Data Protection Powered by AI

As businesses face a growing number of data-protection regulations, and the rising compliance costs to meet them, interest is growing in new solutions that support data security. AI algorithms can do that: they help organizations manage and filter through information, protect user information shared across multiple parties and make regulatory compliance easier, or even automatic. These technologies can help companies promote trust in the reuse and sharing of analytics, as well as overcome problems such as data silos, where certain information is not accessible to an entire organization and causes inefficiencies as a result.

10. Innovation Starts on the Cloud

Cloud computing has evolved far beyond its original purpose as technological infrastructure to take on a defining role in IT innovation. Today, the cloud's computing power is the backbone of the digital economy as it transforms the newest, most-advanced innovations into accessible services. From semiconductor chips, databases and blockchain to IoT and quantum computing, nearly all technologies are now tied to cloud computing. It has also given rise to new technologies, such as serverless computing architecture and cloud-powered robotic automation.

Visit link:

Alibaba's 10 Tech Trends to Watch in... - Alizila

API Economy: Is It The Next Big Thing? – Forbes


APIs (application programming interfaces) have been around for decades. "They allow different systems to talk to each other in a seamless, fast fashion," said Gary Hoberman, the CEO of Unqork.

Yet it's been during the past decade that this technology has become a major force. For example, in 2018 Salesforce.com shelled out $6.5 billion for MuleSoft, a management system for APIs. Then there was another notable deal announced this week: Visa agreed to pay $5.3 billion for Plaid (here's my post about the acquisition on Forbes.com).

So why all the interest in APIs? Well, one of the most important reasons is the secular growth in cloud computing, which has led to the need for integration. Just look at the Uber app: a large part of the underlying technology is not even created by the company but is instead called from APIs.

"APIs enable companies to more easily build products and services that would otherwise take too long to build," said Augusto Aghi Marietti, the CEO of Kong. "Developers can use these APIs to more easily access business-critical information and focus on other priorities instead."

This is not to imply that APIs are a cure-all. No technology is. "The world is on course to having a trillion programmable endpoints," said Tyler Jewell, the managing director at Dell Technologies Capital. "The momentum behind containers, serverless, multi-cloud and APIs is increasing into this year, so the world will probably double the number of endpoints that are generated. This is going to create all sorts of new problems that need to be solved: autonomous management of APIs, ML-assisted development tools to improve coder productivity, new programming languages and cloud abstractions to remove complexity."

Despite all this, APIs still have many benefits and will likely be critical for digital transformation.

"The prospects for the API economy are exponential," said Bernard Harguindeguy, who is the CTO at Ping Identity. "APIs are creating powerful ways for businesses to streamline how they engage with partners and together deliver new generations of applications that empower consumers with more services and options."

One of the pioneers of the API economy is Jeff Lawson, the co-founder and CEO of Twilio. In 2008, he pitched Fred Wilson of Union Square Ventures, saying: "We have taken the entire messy and complex world of telephony and reduced it to five API calls."

Wilson was naturally skeptical but listened. Lawson asked for his phone number, wrote some code and sent a message to him. Wilson was blown away and said: "You can stop there. That's amazing." He would go on to say it was the best seed pitch he'd ever heard.

Twilio has certainly come a long way since then. In the latest quarter, revenues soared 75% to $295.1 million, and the market cap is about $16.4 billion.

Before creating Twilio, Lawson worked as a technical product manager at Amazon, where he saw how APIs were critical for AWS. He also worked at several other startups where there was lots of heavy lifting in creating new applications.

"With web services, you can go from idea to prototype quickly," said Lawson. "Because of this, developers can focus on what customers really want. It's why we are seeing an explosion of startups."
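To make the "five API calls" idea concrete, here is a hypothetical, non-networked sketch of a messaging client in that style. The class, method and field names are illustrative only, not Twilio's actual API; a real client would POST over HTTPS rather than record messages locally.

```python
# Hypothetical sketch of an API-first messaging client, in the spirit of
# "the entire messy world of telephony reduced to a few calls".
# Names and fields are illustrative, not Twilio's actual API.
class MessagingClient:
    def __init__(self, account_sid, auth_token):
        self.account_sid = account_sid
        self.auth_token = auth_token
        self._sent = []

    def send_message(self, to, from_, body):
        # A real client would POST to an HTTPS endpoint; this stub
        # just records the message and returns a response-like dict.
        message = {"to": to, "from": from_, "body": body, "status": "queued"}
        self._sent.append(message)
        return message

client = MessagingClient("ACxxxx", "secret")
result = client.send_message("+15550100", "+15550199", "Hello from an API")
print(result["status"])  # queued
```

The point of the design is that the caller never touches carrier infrastructure; everything behind the method call is the provider's problem.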

But the megatrend of the API economy is not just about tech companies. "Every company is becoming a software company," said Lawson.

Steven Wang grew up in China and came to the US to get his PhD in computer science. He worked at several companies and then started his own, called Measure Square. The idea was a practical one: his wife, who worked at a flooring company, said her boss wanted to find a way to automate some of the manual processes.

So Wang built a program that worked quite well. And so he quit his job to start his own software company.

But it was far from easy. "We made lots of mistakes along the way," said Wang. One was developing software on Java. Another was relying on outsourcing.

Oh, and he did not have any external funding. Measure Square was a pure bootstrap.

But he was persistent and the company has since become a leader in the flooring industry, with thousands of customers, including Sherwin Williams, Redi Carpet, Daltile and RiteRug.

"The technology world is very dynamic," said Wang. "You can't sit still. So a few years ago we completely revamped our technology infrastructure to allow for an API platform. It was a major undertaking and a big risk. But we are also very customer-centric and knew that this was the right strategy."

As of today, the Measure Square cloud logs 2.5 million API calls per month that streamline accurate estimates and integrate with Salesforce.com, Microsoft Dynamics, NetSuite, Rollmaster, and QPros. There are also deployments for Lowe's installers (covering more than 750 stores). There is even an API that leverages AI (artificial intelligence) for automatic plan takeoffs.

"We are still in the early stages," said Wang. The APIs have also made it easier to roll out new products, such as PropertyLink, which is for hosting property measurement plans, and JobTrakr, an app that lets large commercial users track their job productions on construction sites with exact 3D models. "No doubt, APIs have been a game changer for us."

Tom (@ttaulli) is the author of the book Artificial Intelligence Basics: A Non-Technical Introduction.

Continue reading here:

API Economy: Is It The Next Big Thing? - Forbes

MongoDB: Riding The Data Wave – Seeking Alpha

MongoDB (MDB) is a database software company which is benefiting from the growth in unstructured data and leading the growth in non-relational databases. Despite MongoDB's recent rise in share price, its current valuation is modest given its strong position in a large and attractive market.

There has been an explosion in the growth of data in recent years with this growth being dominated by unstructured data. Unstructured data is currently growing at a rate of 26.8% annually compared to structured data which is growing at rate of 19.6% annually.

Figure 1: Growth in Data

(source: m-files)

Unstructured data refers to any data which, despite possibly having internal structure, is not structured via pre-defined data models or schemas. Unstructured data includes formats like audio, video and social media postings and is often stored in non-relational (NoSQL) databases. Structured data is suitable for storage in a traditional database (rows and columns) and is normally stored in relational databases.

Mature analytics tools exist for structured data, but analytics tools for mining unstructured data are nascent. Improved data analytics tools for unstructured data will help to increase the value of this data and encourage companies to ensure they are collecting and storing as much of it as possible. Unstructured data analytics tools are designed to analyze information that doesn't have a pre-defined model and include tools like natural language processing.

Table 1: Structured Data Versus Unstructured Data

(source: Adapted by author from igneous)

Unstructured data is typically stored in NoSQL databases which can take a variety of forms, including:

Unstructured data can also be stored in multimodel databases which incorporate multiple database structures in the one package.

Figure 2: Multimodel Database

(source: Created by author)

Some of the potential advantages of NoSQL databases include:

Common use-cases for NoSQL databases include web-scale, IoT, mobile applications, DevOps, social networking, shopping carts and recommendation engines.

Relational databases have historically dominated the database market, but they were not built to handle the volume, variety and velocity of data being generated today nor were they built to take advantage of the commodity storage and processing power available today. Common applications of relational databases include ERP, CRM and ecommerce. Relational databases are tabular, highly dependent on pre-defined data definitions and usually scale vertically (a single server has to host the entire database to ensure acceptable performance). As a result, relational databases can be expensive, difficult to scale and have a relatively small number of failure points. The solution to support rapidly growing applications is to scale horizontally, by adding servers instead of concentrating more capacity in a single server. Organizations are now turning to scale-out architectures using open software technologies, commodity servers and cloud computing instead of large monolithic servers and storage infrastructure.

Figure 3: Data Structure and Database Type

(source: Created by author)
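The horizontal-scaling idea described above can be sketched with simple hash-based routing of keys to servers. This is an illustrative assumption, not any particular database's algorithm; real systems layer replication, rebalancing and failover on top of the same basic idea.

```python
import zlib

# Minimal sketch of horizontal scaling via hash-based sharding: each
# record key is routed deterministically to one of N commodity servers,
# so capacity grows by adding shards rather than buying a bigger box.
def shard_for(key, num_shards):
    """Map a record key to a shard index, stable across runs."""
    return zlib.crc32(key.encode()) % num_shards

# Routing 10,000 keys across 4 shards spreads the load roughly evenly.
counts = [0] * 4
for i in range(10_000):
    counts[shard_for(f"user:{i}", 4)] += 1
print(counts)  # roughly 2,500 keys per shard
```

Because routing depends only on the key, any application server can locate a record without a central coordinator, which is what removes the single-server bottleneck.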

According to IDC, the worldwide database software market, which it refers to as structured data management software, was $44.6 billion in 2016 and is expected to grow to $61.3 billion in 2020, representing an 8% compound annual growth rate. Despite the rapid growth in unstructured data and the increasing importance of non-relational databases, IDC forecasts that relational databases will still account for 80% of the total operational database market in 2022.

Database management systems (DBMS) cloud services were 23.3% of the DBMS market in 2018, excluding DBMS licenses hosted in the cloud. In 2017 cloud DBMS accounted for 68% of the DBMS market growth with Amazon Web Services (AMZN) and Microsoft (MSFT) accounting for 75% of the growth.

MongoDB provides document databases using open source software and is one of the leading providers of NoSQL databases to address the requirements of unstructured data. MongoDB's software was downloaded 30 million times between 2009 and 2017 with 10 million downloads in 2017 and is frequently used for mobile apps, content management, real-time analytics and applications involving the Internet of Things, but can be a good choice for any application where there is no clear schema definition.

Figure 4: MongoDB downloads

(source: MongoDB)

MongoDB has a number of offerings, including:

Figure 5: MongoDB Platform

(source: MongoDB)

Functionality of the software includes:

MongoDB's platform offers high performance, horizontal scalability, flexible data schema and reliability through advanced security features and fault-tolerance. These features are helping to attract users of relational databases with approximately 30% of MongoDB's new business in 2017 resulting from the migration of applications from relational databases.
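The flexible-schema point is easiest to see with documents themselves. Below is an illustrative sketch in plain Python (not MongoDB's actual driver API): two records of different shapes coexist in one collection, and a simple field-match query retrieves them; the data and the `find` helper are made up for illustration.

```python
# Illustrative sketch of the document model: two records with different
# shapes live in the same collection, with no pre-defined schema.
collection = [
    {"_id": 1, "name": "Ada", "email": "ada@example.com"},
    {"_id": 2, "name": "Grace", "phones": ["+1-555-0100"], "tags": ["vip"]},
]

def find(coll, **criteria):
    """Field-match query, analogous in spirit to find({"name": ...})."""
    return [doc for doc in coll
            if all(doc.get(k) == v for k, v in criteria.items())]

print(find(collection, name="Grace")[0]["tags"])  # ['vip']
```

A relational table would force both records into one fixed set of columns; the document model lets each record carry only the fields it needs.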

MongoDB generates revenue through term licenses and hosted as-a-service solutions. Most contracts are one year in length, invoiced upfront, with revenue recognized ratably over the term of the contract, although a growing number of customers are entering multiyear subscriptions. Revenue from hosted as-a-service solutions is primarily generated on a usage basis and is billed either in arrears or paid upfront. Services revenue is comprised of consulting and training services, which generally result in losses and are primarily used to drive customer retention and expansion.

MongoDB's open source business model has allowed the company to scale rapidly and they now have over 16,800 customers, including half of the Global Fortune 100 in 2017. Their open source business model uses the community version as a pipeline for potential future subscribers and relies on customers converting to a paid model once they require premium support and tools.

Figure 6: Prominent MongoDB Customers

(source: Created by author using data from MongoDB)

MongoDB's growth is driven largely by its ability to expand revenue from existing customers. This is shown by the expansion of Annual Recurring Revenue (ARR) over time, where ARR is defined as the subscription revenue contractually expected from customers over the following 12 months assuming no increases or reductions in their subscriptions. ARR excludes MongoDB Atlas, professional services and other self-service products. The fiscal year 2013 cohort increased their initial ARR from $5.3 million to $22.1 million in fiscal year 2017, representing a multiple of 4.1x.

Figure 7: MongoDB Cohort ARR

(source: MongoDB)
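The cohort multiple above follows directly from the two ARR figures; a one-line check:

```python
# Fiscal 2013 cohort: initial ARR of $5.3M grew to $22.1M by fiscal 2017.
initial_arr_musd = 5.3
final_arr_musd = 22.1
multiple = final_arr_musd / initial_arr_musd
print(f"{multiple:.2f}x")  # 4.17x, which the report quotes as 4.1x
```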

Although MongoDB continues to incur significant operating losses, the contribution margin of new customers quickly becomes positive, indicating that as MongoDB's growth rate slows the company will become profitable. Contribution margin is defined as the ARR of subscription commitments from the customer cohort at the end of a period less the associated cost of subscription revenue and estimated allocated sales and marketing expense.

Figure 8: MongoDB 2015 Cohort Contribution Margin

(source: MongoDB)
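The contribution-margin definition amounts to a simple subtraction; a sketch with made-up dollar figures, chosen only to illustrate the mechanics:

```python
# Contribution margin as defined in the report: cohort ARR less the
# associated cost of subscription revenue and the allocated sales and
# marketing expense. The figures below are hypothetical.
def contribution_margin(cohort_arr, cost_of_subscription, allocated_s_and_m):
    return cohort_arr - cost_of_subscription - allocated_s_and_m

# A hypothetical $10.0M cohort with $2.5M subscription costs and $4.0M
# allocated S&M turns contribution-positive at $3.5M.
print(contribution_margin(10.0, 2.5, 4.0))  # 3.5
```

Once acquisition spend stops being allocated to a mature cohort, this quantity only has to cover subscription costs, which is why cohorts turn positive quickly.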

MongoDB continues to achieve rapid revenue growth driven by an increasing number of customers and increased revenue per customer. Revenue growth has shown little sign of decline which is not surprising given the size of MongoDB's market opportunity. Revenue per customer is modest and MongoDB still has significant potential to expand the number of Global Fortune 100 customers.

Figure 9: MongoDB Revenue

(source: Created by author using data from MongoDB)

Figure 10: MongoDB Customer Numbers

(source: Created by author using data from MongoDB)

MongoDB's revenue growth has been higher than that of other listed database vendors since 2017, as a result of its expanding customer base and growing revenue per customer. The rise of cloud computing and non-relational databases has had a large impact on relational database vendors, with DBMS growth now dominated by cloud computing vendors and non-relational database vendors.

Figure 11: Database Vendor Revenue

(source: Created by author using data from company reports)

MongoDB's revenue growth is relatively high for its size when compared to other database vendors, but is likely to begin to decline in coming years.

Figure 12: Database Vendor Revenue Growth

(source: Created by author using data from company reports)

MongoDB's revenue is dominated by subscription revenue and this percentage has been increasing over time. This relatively stable source of income holds MongoDB in good stead for the future, particularly if customers can be converted to longer-term contracts.

Figure 13: MongoDB Subscription Revenue

(source: Created by author using data from MongoDB)

MongoDB generates reasonable gross profit margins for an enterprise software company from its subscription services, although these have begun to decline in recent periods, likely as a result of the introduction of the entry-level Atlas offering in 2016 and possibly also of increasing competition.

Figure 14: MongoDB Gross Profit Margin

(source: Created by author using data from MongoDB)

MongoDB has exhibited a large amount of operating leverage in the past and is now approaching positive operating profitability. This is largely the result of declining sales and marketing and research and development costs relative to revenue. This trend is likely to continue as MongoDB expands, particularly as growth begins to decline and the burden of attracting new customers eases.

Figure 15: MongoDB Operating Profit Margin

(source: Created by author using data from MongoDB)

Figure 16: MongoDB Operating Expenses

(source: Created by author using data from MongoDB)

Although MongoDB's operating profitability is still negative it is in line with other database vendors and should become positive within the next few years. This is supported by the positive contribution margin of MongoDB's customers after their first year.

Figure 17: Database Vendor Operating Profit Margins

(source: Created by author using data from company reports)

MongoDB is yet to achieve consistently positive free cash flow, although it appears to be on track as the business scales. This should be expected given the high-margin nature of the business and its low capital requirements. Current negative free cash flow is largely a result of expenditure in support of future growth, in the form of sales and marketing and research and development.

Figure 18: MongoDB Free Cash Flow

(source: Created by author using data from MongoDB)

Competitors in the database vendor market can be broken into incumbents, cloud platforms and challengers. Incumbents are the current dominant players in the market, like Oracle (ORCL), which offer relational databases. Cloud platforms are cloud computing vendors, like Amazon and Microsoft, that also offer database software and services. Challengers are pure-play database vendors that offer a range of non-relational database software and services.

Table 2: Database Vendors

(source: Created by author)

Incumbents

Incumbents offer proven technology with a large set of features, which may be important for mission-critical transactional applications. This gives incumbents a strong position, particularly as relational databases are expected to retain the lion's share of the database market in coming years. Incumbent players that lack a strong infrastructure-as-a-service platform, though, are poorly positioned to capture new applications and likely to be losers in the long run. This trend is evidenced by Teradata's (TDC) struggles since the advent of cloud computing and non-relational databases.

Cloud Platforms

Cloud service providers are able to offer a suite of SaaS solutions in addition to cloud computing, creating a compelling value proposition for customers. In exchange for reducing the number of vendors required and gaining access to applications designed to run together, database customers run the risk of being locked into a cloud vendor and paying significantly more for services which could potentially be inferior.

Challengers

Dedicated database vendors can offer best-of-breed technology, low costs and multi-cloud portability, which helps prevent cloud vendor lock-in.

The DBMS market is typically broken into operational and analytical segments. The operational DBMS market refers to databases that are tied to a live application, whereas the analytical market refers to the processing and analysis of data imported from various sources.

Figure 19: Database Market Competitive Landscape

(source: Created by author)

Gartner assesses MongoDB as a challenger in the operational database systems market, due primarily to a lack of completeness of vision. The leaders are generally large companies that offer a broader range of database types in addition to cloud computing services. MongoDB's ability to succeed against these companies will depend on it being able to offer best-in-class and/or lower-cost services.

Original post:

MongoDB: Riding The Data Wave - Seeking Alpha

Technology Trends to Keep an Eye on in 2020 – Built In Austin

Consumers have a lot of tech news to keep up with in 2020, with anticipated advances in autonomous vehicles, folding touchscreen phones and new video game consoles. But what are tech professionals gearing up for this year?

The answer depends on who you ask. For example, Michael Senftleber, executive VP of product at Arrive Logistics, is watching how business processes, particularly in the world of freight, will be affected by the increasing popularity of artificial intelligence and machine learning. Meanwhile, Vikram Phatak, founder of cybersecurity firm NSS Labs, is monitoring how 5G, IoT devices and other infrastructure will affect the future of digital protection strategies.

These are just a few of the upcoming tech evolutions the following Austin professionals are paying attention to. While each tech leader is harnessing different technological developments for different reasons, all paths lead to improved customer satisfaction and a continual evolution of their businesses.

Arrive Logistics EVP of Product Michael Senftleber said his team is planning to use AI and machine learning to enhance the overall capabilities of their freight prediction tech, while also improving the usability of their platform.

What are the top tech trends you're watching in 2020?

I often hear people talk about data science, AI and machine learning like they're magic silver bullets for every problem; they are not. AI and ML are best applied when there's significant data, a computationally complex problem and repeated samples or transactions. In 2020, the AI and ML hype will continue to grow. But so will tangible business applications that leverage those technologies and data science to solve real challenges and provide unique insights.

How are you applying these trends in your work in the year ahead?

The opportunities for data science, AI and ML in the logistics industry are enormous. We are leveraging these technologies with market and proprietary data to predict the future cost to move a load of freight, match the right load to the right truck, alert on opportunities or deviations and make business decisions in real time.

However, powerful and often complex technologies can end up in complex, hard-to-use platforms. It's critical that we continue to build technology to harness opportunities and enable business workflows, while building an interface that enables a simple, seamless experience for our users.

At the 2019 Consumer Electronics Show, IBM unveiled the world's first quantum computer designed for commercial and scientific use. NSS Labs founder Vikram Phatak said further developments in quantum computing will play a significant role in moving cybersecurity forward, and his company is gearing up for that change.

What are the top tech trends you're watching in 2020?

The adoption of cloud computing, ubiquitous high-speed internet like 5G wireless, internet of things, artificial intelligence and quantum computing will drive major shifts in the way the world works, including how we protect people. IoT devices will operate more efficiently and autonomously as AI evolves. And quantum computing is a transformative leap forward that makes possible new technologies that we haven't even imagined yet.

Modern encryption that takes thousands of years to break with current computing technology can be broken in seconds using quantum computing. Our research indicates that as the virtual and physical worlds merge, cybersecurity will naturally evolve to focus on protecting people regardless of whether they are using mobile devices, computers or IoT devices connected to the cloud. This new paradigm will spur a scalable, zero-trust alternative to current cybersecurity architectures.

How are you applying these trends in your work in the year ahead?

We select test topics based on enterprise customer demand. The world is rapidly changing and our plans for 2020 reflect that. This year we have a lot planned, including testing cloud security offerings like secure access service edge (SASE), and security for cloud computing, like cloud network firewall. We are also initiating research on how 5G will change cloud computing, practical uses of AI and getting prepared for a post-quantum world.

Kuldeep Chowhan has his head in the clouds... of cloud computing. The engineer at vacation property rental site Vrbo said his team is watching public cloud computing.

What are the top tech trends you're watching in 2020?

Innovation and scale at the major public cloud computing providers continues to accelerate. Many platforms that were difficult to operationalize are becoming managed services that are easy and cost-effective to consume. Many of these services lower the bar for entry for data science and machine learning, which are blossoming in sophistication and applicability.

How are you applying these trends in your work in the year ahead?

Vrbo is accelerating its migration to the cloud so we can leverage the power of the Expedia Group travel platform. We want to provide travelers with a rich product offering that is personalized and relevant to them. That strategy includes a hybrid cloud data platform for all of Expedia Group that will power new AI capabilities to help our travelers find their perfect vacations.

Janani Mohanakrishnan said the water sector will experience a significant number of changes in the coming year. As a result, the VP of product innovation and delivery said her team at water conservation technology provider Banyan Water is adjusting its approach to automation and user data.

What are the top tech trends you're watching in 2020?

In the water sector, I'm keeping an eye on more utilities and commercial customers leveraging digital solutions to improve operations: keeping privacy in mind, detecting issues faster and reducing the costs associated with troubleshooting and resolution. We're also watching the identification of improved uses of water, more businesses considering migrating to a circular economy, reduced truck rolls, and improved customer communication and engagement.

How are you applying these trends in your work in the year ahead?

Our products help commercial customers save money and water through the optimization of irrigation and indoor water management. We will be updating our take on user research, data analytics and automation to maximize value for customers.

We will be smart about how we make research an inherent part of the process, knowing we are resource-constrained. We've trained select team members to collect feedback from users whenever they can. We accept that some research is better than no research, and that it's OK for us to be brave with predictions.

We'll also improve the intelligence of our existing models, leveraging tailored interventions as applicable to help buildings dramatically reduce indoor water consumption when there are inefficiencies.


Computing at the edge – ComputerWeekly.com

Edge computing has grown over the past several years to become one of the most important current trends in IT. It is increasingly viewed as a part of digital transformation, and linked with other trends such as the internet of things (IoT), analytics and cloud computing. But, as with those trends, there is no precise definition and often much hype about what edge computing is.

A simple definition of edge computing is that it involves some processing and decision-making taking place at the edge of the network, rather than everything being centralised in the datacentre or the cloud. This goes against the widely held notion that all IT functions will eventually be cloud-hosted, and some have even suggested that edge computing will replace the cloud.

Instead, both cloud and edge computing will co-exist, since they address different requirements. According to analyst firm IDC, the two approaches are complementary and interact in a smart and intelligent way. In a report called The technology impacts of edge computing in Europe, the firm forecast that in 2020, more than 50% of cloud deployments by European organisations will include edge computing and that 20% of endpoint devices and systems will execute artificial intelligence (AI) algorithms.

One of the factors driving edge computing is the IoT, which is bringing a whole host of new connected devices to networks. These devices need to be managed, but more importantly many of them are designed to generate streams of data to be analysed for operational reasons or for insights that could lead to more efficient ways of working.

Some use cases of edge computing include industrial automation, autonomous vehicles, smart homes, automated systems aboard oil and gas rigs, and 5G network infrastructure.

The latter is a good example, not only because 5G is expected to play a key role in many IoT deployments thanks to its ability to support a much greater number of connected devices per cell base station, but because the sheer processing power required for operating 5G networks means that cell base stations are becoming more like miniature datacentres.

While centralising all processing in the cloud might seem like the more efficient thing to do, that idea runs into problems when it comes to latency: the delay in transmitting data across the network and in getting a response back.

This means data often needs to be processed and acted on at the point where it is being generated. In a smart factory, for example, sensors monitoring machinery might detect a serious fault condition that requires an instant remedial response.
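The smart-factory example above can be sketched in a few lines: a reading is checked locally so a fault triggers an immediate response, rather than waiting on a round trip to the cloud. The threshold and names here are invented for illustration:

```python
# Minimal sketch of edge-side decision-making, per the example above.
FAULT_THRESHOLD_C = 90.0  # hypothetical bearing-temperature limit

def handle_reading(temperature_c: float) -> str:
    """Act locally on a fault; otherwise queue the reading for the cloud."""
    if temperature_c >= FAULT_THRESHOLD_C:
        return "shutdown"        # instant remedial response at the edge
    return "forward-to-cloud"    # non-urgent telemetry goes upstream

print(handle_reading(95.2))  # edge acts immediately
print(handle_reading(61.0))  # routine reading is forwarded for analytics
```

The point is that the decision path involves no network hop at all; only the non-urgent branch ever touches the cloud.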

The volumes of data generated by some applications are also growing rapidly. For example, some test autonomous vehicles have been found to generate as much as 8-10TB (terabytes) of data per day. In many cases, transmitting everything to the cloud may not be a viable option, according to Seagate's executive vice-president and head of operations, Jeff Nygaard.

"It's not free to move data through a pipe from endpoint to edge to cloud; it costs money to send data through that pipeline. The idea that you should really only be moving data if you need to move the data, based on how you've architected and how you get value out of the data, is something you should be thinking about," said Nygaard, speaking in a panel discussion on edge computing.

For reasons such as these, it makes sense in many situations to analyse data as it is generated at the edge, and this has led to a requirement for more powerful hardware capable of running analytics on all that data. This means edge systems have expanded from relatively simple edge gateway devices managing a bunch of sensors, to include full-blown servers and even micro-datacentres.

This fits with analyst firm Ovum's view, outlined in its report Defining the market for edge and edge cloud. Ovum sees a near edge based on traditional servers, storage or hyper-converged infrastructure (HCI) devices, with an outer edge made up of gateway devices, the latter either fully managed or immutable, so that they are simply replaced if and when upgrades are required.

Micro-datacentres are enclosures containing one or more datacentre racks, which can be filled up with servers, storage and network kit, plus power and cooling systems. In other words, they can house the kind of IT equipment that would normally be found in a rack in a normal datacentre, but can be installed in a factory or on an oil rig, or anywhere where a decent amount of compute power is required.

These are available from suppliers such as Schneider Electric and Rittal, but also from major IT suppliers such as HPE and Dell EMC, which are naturally keen to sell such enclosures ready configured with their own servers, storage and networking.

But it is also important to recognise that whether data is processed at the edge or in the cloud will depend on the application, and the two are not mutually exclusive. Edge computing allows for data to be filtered and processed before being sent to the cloud, for example, while the cloud may also serve as a central site for collating data for further analysis from multiple edge sites.
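The filter-then-forward pattern described above can be sketched simply: raw readings are aggregated at the edge and only a compact summary is sent upstream, cutting transmission volume. The field names and batch are illustrative assumptions:

```python
# Hedged sketch of edge-side filtering before data is sent to the cloud.
def summarise_for_cloud(readings: list[float]) -> dict:
    """Reduce a batch of raw sensor readings to a small summary payload."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [20.1, 20.3, 19.8, 20.0, 35.6]   # one batch collected at the edge
payload = summarise_for_cloud(raw)
print(payload["count"], payload["max"])  # 5 raw readings become 4 numbers
```

In practice the cloud side would collate such summaries from many edge sites for further analysis, as the paragraph above notes; the outlier preserved in "max" is what would prompt a request for the full batch.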

In addition to analytics, edge systems are increasingly going to be called on to carry out demanding tasks such as visual recognition, inspecting items on a factory production line for defects, for example. These tasks often rely on AI techniques such as machine learning or deep learning models to deliver results speedily, and this means hardware accelerators such as graphics processing units (GPUs) or field programmable gate arrays (FPGAs) may be required.

In fact, GPU maker Nvidia unveiled an edge platform last year that combines its GPU products with a software stack incorporating Kubernetes, a container runtime and containerised AI frameworks, designed to run on standard server hardware.

The EGX platform is described by Nvidia as bringing the power of accelerated AI computing to the edge with an easy-to-deploy cloud-native software stack. EGX partners include HPE, Dell EMC, Fujitsu, Cisco and Supermicro.

New applications and services are also driving the development of edge computing. Demand for high-bandwidth streamed video, for example, is leading service providers to cache content locally at datacentres located closer to customers.

Amazon Web Services (AWS) announced in December 2019 that it is planning to build a series of hyper-local datacentre hubs close to major cities for exactly this reason. These hubs, dubbed Local Zones by AWS, are intended to attract businesses with latency-sensitive workloads and will be housed in small-scale datacentres rather than the company's large regional facilities.

Of course, there are potential issues with edge computing. Having numerous sites collecting and analysing data means more sites that need to be configured and monitored, all of which adds complexity. And the distributed nature of edge computing means that technicians are not always likely to be available onsite if and when a failure occurs.

Edge computing also has implications for networks. As more computing happens at the edge, network bandwidth will have to adapt to this shift in emphasis. According to IDC, edge computing directly increases the importance of networks, especially delocalised networks. Edge computing will also require innovation in how networks are analysed, managed and orchestrated.

Security is an obvious issue for all IT infrastructure, but with edge systems potentially running unattended in remote sites, physical security of the hardware is as much a concern as the potential for a cyber attack. It also means that access control and protection of data both at rest and in transit become even more important.

Management issues may include the need to deliver secure application updates to the edge hardware, plus the ability to remotely diagnose and fix any issues that may develop.

According to Ovum, operational management is more likely to be an extension of the existing operational management market, rather than specific products for edge computing. Likewise, it expects orchestration at the edge to effectively form part of an expansion of the multicloud management market, which, according to Ovum's forecasts, will be worth $11bn by 2022.

Key takeaways on edge computing are that it is not going to replace cloud, but in some instances can be thought of as bringing cloud computing closer to where data is being generated. It is intended to support new and emerging workloads that may be latency sensitive, require a lot of compute power, or involve such large data volumes that sending everything back to the cloud is impractical. All of this brings new challenges, but also potential rewards for organisations that can get it right.


Cloud ‘land grab’ to descend on SA this year – ITWeb

Cloud computing will accelerate the continent's digital transformation.

Rivals Microsoft, Oracle and Huawei are readying for cloud battles in 2020.

ITWeb interviewed the companies about their plans to drive cloud computing solutions in SA this year.

Last year saw US-based software giant Microsoft open two data centre regions in SA, becoming the first global provider to deliver cloud services from data centres on the African continent.

In March last year, Chinese telecommunications giant Huawei also started offering its cloud services in SA. The company is leasing a data centre in Johannesburg from a partner from where it is deploying localised public cloud services based on local industry policies, customer requirements and partner conditions.

US-based enterprise software company Oracle in September last year also announced plans to launch data centres in SA.

Similarly, fellow US-based company Amazon Web Services is expected to open data centres in SA this year.

Johannes Kanis, cloud and enterprise business group lead at Microsoft SA, says that over the last 10 months, the extent of the appetite for cloud in SA has meant the company has one of the fastest-growing Microsoft data centre regions globally.

He points out that significant organisations, like Nedbank, Standard Bank and Altron, have already migrated components of their IT infrastructure into the cloud.

In October 2019, the State IT Agency and Microsoft SA announced details of a memorandum of understanding that paves the way for government adoption of public cloud services, which drives government's ability to accelerate its digital transformation journey, Kanis notes.

"We anticipate that the need for and interest in cloud-based services will drive more innovation from customers and partners, and will continue to spur the continent's digital transformation."

According to Kanis, Gartner research from last year indicates SA is expected to be the fourth fastest-growing major IT market in the world, and this strong performance is largely driven by companies embracing cloud.

On competition in the local cloud market, Kanis comments: "We compete with hyperscale providers all over the world and are proud to be the first global provider to launch two cloud data centres on the continent. Our goal was to bring the technology closer to African businesses and organisations, and to offer enterprise-grade reliability and performance combined with data residency in South Africa."

For Niral Patel, MD of Oracle SA, the opportunities cloud creates are real and present today, providing the building blocks for companies to pioneer ground-breaking innovations and disrupt entire industries.

"We're seeing everything from financial services using AI [artificial intelligence] for automatic forecasting without human intervention, to smart manufacturing utilising real-time IoT [internet of things] data for equipment optimisation."

The opportunities are endless, not only for our business, but for our partners and customers. The pending data centre that will reach our shores this year will enable our customers to address cost-efficiencies, providing the ability to innovate quickly, creating large-scale agility while benefiting from the utmost levels of security.

Unlike other cloud providers, Oracle is committed to offer a second region for disaster recovery in every country where we launch Oracle Cloud Infrastructure services, a strategy thats aligned with our customers needs. With these dual regions, customers can deploy both production and disaster recovery capacity within the country, to meet business continuity and compliance requirements, says Patel.

He adds that while competitors were building their data centres, Oracle was building its cloud applications from the ground up.

"We partner with customers to tackle their most complex business problems, run their operations, and achieve the best possible outcomes. Customers are fast embracing and upgrading to Autonomous Database, self-driving database software that uses machine learning to enable unprecedented availability, high performance and security at a much lower cost. By using machine learning, the self-driving, self-healing, self-learning database is being used by customers in the cloud to find new ways of getting more value from more secure data more quickly."

Rui Houwei, president of Huawei Cloud Africa, notes that over the past 10 months, Huawei Cloud has experienced rapid growth in the African market, notably in SA, and already has more than 300 customers.

Likewise, he says, government agencies, telecom carriers and enterprises from a wide range of industries have made use of Huawei Cloud services to enjoy previously unforeseen levels of success in the market.

Last year, Huawei also launched its Partner Programme 2.0, an initiative that offers crucial support for partners, with regard to online and onsite training, market expansion, marketing activities and technical understanding.

In SA alone, Huawei Cloud has established partnerships with over 50 partners, spanning diverse industries, including telecoms, finance, manufacturing, education, retail and logistics, as well as the public sector. Such wide-ranging collaboration is likely to result in unique and broadly-shared benefits across the ecosystem, says Houwei.

Jon Tullett, senior research manager for cloud/IT services at IDC, believes there will be a cloud land grab just as with fibre providers in SA.

However, he notes adoption of cloud computing may be slower than expected, as moving workloads to the cloud can take a long time.

The other challenge SA will face regarding cloud adoption this year will be the skills shortage, says Tullett. Cloud computing skills are in short supply and they are very expensive.

Nonetheless, he notes there is increased interest and confidence in the cloud among South African organisations, including government.

As an example, he said the South African Reserve Bank issued a directive and guidance notes in 2018 on cloud computing and the offshoring of data by banks. The directive applies to all banks, controlling companies, branches of foreign institutions and auditors of banks or controlling companies, and took effect on 1 October 2018.

See original here:

Cloud 'land grab' to descend on SA this year - ITWeb

