Top 5 Reasons To Migrate Databases to the Cloud – Spiceworks News and Insights

Database migration involves moving data from one or multiple source platforms to a different database. Organizations often create a database migration strategy that helps align the migration process with specific business needs. For example, an organization may decide to migrate its on-premises data to a cloud-based database to reduce costs. Another business may opt to migrate to a database that offers extended features more suitable to current needs.

A cloud database is a managed database service built and accessed via a cloud platform. Users simply request a database instance from a cloud provider, and it is automatically deployed on cloud infrastructure. A cloud database has many functions of a traditional database, with the additional flexibility offered by cloud computing.
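To illustrate how little is involved, here is a minimal sketch of requesting a database instance programmatically, using Amazon RDS and the AWS SDK for Python (boto3) as one example; the identifier, credentials and sizing values are placeholders, not recommendations.

```python
# Minimal sketch: requesting a managed database instance from a cloud
# provider (here Amazon RDS via boto3). All values are illustrative placeholders.
import boto3

rds = boto3.client("rds", region_name="us-east-1")

response = rds.create_db_instance(
    DBInstanceIdentifier="example-db",      # placeholder name
    Engine="postgres",                      # managed PostgreSQL
    DBInstanceClass="db.t3.micro",          # small instance size
    AllocatedStorage=20,                    # storage in GiB
    MasterUsername="admin_user",
    MasterUserPassword="change-me-please",  # use a secrets manager in practice
)
print(response["DBInstance"]["DBInstanceStatus"])  # e.g. "creating"
```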


Here are the top reasons you should consider moving your on-premises database to the cloud.

Migrating your database to the cloud reduces the need for in-house IT staff and data center facilities. It also removes the need for the specialized tools and resources required to manage complex IT environments. Over time, database cloud migration results in lower capital costs and decreased HVAC and electrical operating expenditures.

The cloud computing vendor offers storage, servers, and other infrastructure in a cloud database environment, and is responsible for maintaining that infrastructure and ensuring high availability. The organization that owns and operates the database is responsible for configuring it, as well as loading, managing and protecting the data.

You can scale a database up and down more easily when you migrate to the cloud. Cloud computing also provides increased elasticity and flexibility. Cloud database migration enables dynamic scaling, so additional database instances can be created to meet changing application loads.
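As one concrete example of that dynamic scaling (Amazon RDS via boto3; other providers expose similar APIs), a read replica can be added with a single call when application load grows. This is a hedged sketch with placeholder identifiers.

```python
# Sketch: adding a read replica to absorb increased read load,
# using Amazon RDS via boto3. Identifiers are placeholders.
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance_read_replica(
    DBInstanceIdentifier="example-db-replica-1",   # new replica name
    SourceDBInstanceIdentifier="example-db",       # existing primary instance
)

# When load drops again, the replica can simply be deleted:
# rds.delete_db_instance(DBInstanceIdentifier="example-db-replica-1",
#                        SkipFinalSnapshot=True)
```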

Organizations are using the cloud to help them enforce a strong disaster recovery plan: with cloud computing, they can copy or back up entire virtual servers to an off-premises data center.

You can spin up the virtual server on a virtual host in a few minutes. The benefit of this is that you can safely and accurately restore a database in a remote data center without reinstalling the server. Consequently, you can cut down on disaster recovery times.
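As a concrete sketch of that recovery path, again using Amazon RDS and boto3 purely as an example (all identifiers and regions are placeholders), a snapshot copied to a remote region can be restored to a fresh instance without reinstalling the server:

```python
# Sketch: cross-region disaster recovery with database snapshots
# (Amazon RDS via boto3; all identifiers are placeholders).
import boto3

# Copy a snapshot of the primary database into the DR region.
dr_rds = boto3.client("rds", region_name="us-west-2")
dr_rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier="arn:aws:rds:us-east-1:123456789012:snapshot:example-snap",
    TargetDBSnapshotIdentifier="example-snap-dr",
    SourceRegion="us-east-1",
)

# Restore a new instance from the copied snapshot in minutes,
# with no server reinstall.
dr_rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="example-db-recovered",
    DBSnapshotIdentifier="example-snap-dr",
)
```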

One of the key organizational goals when migrating databases to the cloud is to improve analytics capabilities, including data lakes and data warehousing. You can even prepare for advanced analytics, for example, machine learning and artificial intelligence.

Data modernization is the initial step to initiating scalable analytics capabilities. Organizations can use cloud computing to achieve real-time data availability.

Via the cloud, organizations can visualize their data and provide access to more employees for improved decision-making. Most importantly, cloud providers offer a range of data analytics and machine learning services that can help organizations gain deeper insights into data without a major investment in infrastructure.

See More: Applying Gartner's 6 Rs to Data Migration

Here are cloud database offerings from the world's leading cloud providers, Amazon and Microsoft.

Amazon Web Services offers cloud database services such as Amazon RDS, its managed relational database service.

Azure offers cloud database services such as Azure SQL Database, a managed relational database, and Azure Cosmos DB, a globally distributed multi-model database.

See More: A Cloud Networking Primer: Building Your Network in the Azure Cloud

Cloud database services are becoming a pillar of IT operations in the cloud. Traditionally, managing database infrastructure and concerns like scaling and high availability were high on the priority list of IT teams, because databases are typically mission-critical applications. Today, many organizations are moving to the cloud and outsourcing these concerns to a third-party provider.

Migrating a database to the cloud is convenient, but also involves a certain loss of control. In the on-premise data center, IT staff could choose their database version, implement customizations, fine-tune performance, and easily integrate their database with legacy systems. All these become much more difficult, if not impossible, in a managed service.

That being said, there are many strong drivers for migrating to the cloud, such as cost savings, elimination of physical infrastructure, easier scalability, disaster recovery, and improved analytics.

In addition, I covered popular cloud database offerings from the leading cloud providers, which you can leverage to make the move to the cloud, notably Amazon RDS, Azure SQL Database, and Azure Cosmos DB. All the leading providers offer free trials and free pricing tiers, so you can test drive any of these services and see if a cloud database is right for your project.

Have you moved your on-premises database to the cloud? What benefits have you seen? Let us know on Facebook, Twitter, and LinkedIn.


Cloud Migration: Strategies, Process, Benefits and Challenges – Security Boulevard

The pandemic accelerated technological adoption among small and midsized businesses (SMBs) by five years, forcing them to upgrade their IT infrastructure by adopting cloud-based and digital tools to stay competitive and service clients remotely. According to the 2022 IT Operations Survey Results Report, the percentage of respondents using cloud cost management tools increased from 7% to 24% as cloud adoption climbs.

Cloud services are a boon for the fast-expanding mid-market segment, which can pay as it goes and expand and scale its business without incurring high IT infrastructure costs. The report also highlighted that nearly two-thirds (64%) of respondents spend as much as 25% of their resources on cloud infrastructure.

With cloud technologies becoming increasingly popular, this blog will aim to answer common questions such as what cloud migration means, why it's important, top migration strategies and other key queries. Dive in.

Cloud migration refers to moving company data, applications and other IT resources from on-premises data centers and servers to the cloud. Companies can either transfer their data to public cloud service providers like Microsoft Azure, Google Cloud, or Amazon Web Services (AWS), set up their private cloud computing environment or create a hybrid environment.

With cloud services gaining popularity, there is also a rise in cloud-to-cloud migrations, in which companies move their resources from one cloud service provider to another. Another concept under the umbrella of cloud migration is cloud repatriation, or reverse cloud migration, in which users move their data and resources from a cloud environment to a local server.

Cloud migration is important because it gives SMBs the capability to support a diversified and hybrid employee and client base efficiently. Cloud computing is the future of IT, and not migrating to it will result in falling behind.

Cloud migration strategies are in-depth plans companies make to migrate their data and resources from on-premises infrastructure to the cloud or from one cloud provider to another. No two businesses will have the same cloud migration strategy. It will vary depending on their expectations from cloud adoption, its impacts on their business operations, the money they expect to save and other business factors.

Migrating to the cloud is not a simple one-size-fits-all process. For migration to be successful, each application, dataset and workload must be mapped out in detail. According to Gartner, there are five categories of cloud migration strategies, dubbed the five Rs: rehost, refactor, revise, rebuild and replace.

The cloud deployment model indicates how you've configured the cloud infrastructure: it determines where cloud databases and servers sit, who controls them, and how much access and control you have over them. There are four primary deployment models:

The public cloud setup gives users access to comprehensive IT resources like virtual machines, computing power, application storage and data backup over the internet without requiring them to maintain the hardware themselves. The public cloud service providers share the computing resources with multiple tenants and charge them on a pay-per-usage or subscription basis.

A private cloud setup is for use by a single customer. Companies create the cloud environment for personal use and do not share it with others. This option combines the benefits of public cloud with the security and control of an on-premise IT ecosystem. Although this setup is costly due to upfront investment in technology, many organizations find the security benefits outweigh the costs.

A hybrid cloud setup is when a company uses a mix of on-premise, public and private cloud environments. Companies use data management processes to connect systems running on traditional architecture that they may not want to expose to the cloud. Often, companies keep confidential resources and data on-premises and use the cloud for services like analytics. The hybrid model is where most businesses end up.

Multicloud setups involve connecting multiple public clouds in one architecture to create a single user experience.

A successful cloud adoption strategy will vary based on unique business needs and requirements. However, all cloud migrations pass through the same broad stages, from assessment and planning through the migration itself to validation and ongoing optimization.

Cloud computing technology and cloud-based services have matured. Their capabilities and reliability have advanced to a degree that for most organizations it's no longer a matter of if they will launch a cloud deployment or expansion, but when. Here are some of the top benefits of cloud migration:

With no upfront commitments or long-term contracts, you pay only for the resources (storage, compute power, etc.) you use. This reduces your IT operational costs and helps boost profits. You can spend the money you save on introducing innovation in the workplace and improving your own services.

Cloud services offer high scalability and availability to their users. It's easy to scale your usage up or down, depending on the changing needs of your business. You can also modify the computing power required with just a few clicks.

Cloud-hosted websites or applications run better for end users since the cloud provider will naturally have significantly more data centers than any single business could operate. As these data centers are located around the globe, you can host your data in the market you want to serve and remove location-based latency. As a result, you will be able to provide better service to your users.

Cloud adoption provides businesses with flexibility and scalability by not restricting them to the physical limitations of on-premises servers. In addition, you can also take advantage of the reliability of multiple cloud data centers as well as responsive and customizable load balancing that evolves with your changing demands. This way you never have to worry about high fixed costs since everything is variable.

Even though migrating to the cloud has many benefits and is the inevitable next step in information technology, several challenges remain.

Successful migration to the cloud requires proper planning, and most companies don't pay enough attention to this step. Whether it's due to lack of time, inattention or management's inability to get on the same page, errors during migration are preventable if all the wrinkles are ironed out during the planning phase.

Technology adoption comes at a cost. Many technicians see migration as a net new cost rather than considering its long-term cost-saving benefits. When companies are on a tight budget, migrating to the cloud can be challenging. However, the cloud can be a great way for companies to save money and unlock efficiencies in their business.

The cloud operates on a shared responsibility model: service providers supply robust security controls, but the responsibility to configure them correctly sits with the users. There is also a risk associated with mass data transfers, as information can be intercepted during the transfer. When using cloud services, users must exercise all cybersecurity precautions.
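As a small illustration of the "configure it correctly" half of that model, the sketch below checks that an object-storage bucket actually blocks public access. It uses AWS S3 via boto3 as one example; the bucket name is a placeholder.

```python
# Sketch: verifying a security control the *user* is responsible for
# configuring under the cloud shared responsibility model.
# Example uses AWS S3 via boto3; the bucket name is a placeholder.
import boto3

s3 = boto3.client("s3")

# Raises a ClientError if no public-access-block configuration exists at all.
config = s3.get_public_access_block(Bucket="example-bucket")
settings = config["PublicAccessBlockConfiguration"]

# All four flags should be True for a bucket that must stay private.
if not all(settings.values()):
    print("WARNING: bucket is not fully locked down:", settings)
else:
    print("Public access is fully blocked.")
```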

Compliance is another challenge. When deploying cloud technologies, you must ensure compliance with the various rules and regulations that will vary based on your industry and location. Ensure you're compliant with the standards appropriate for your organization.

The migration to cloud computing has rapidly accelerated in the past year as organizations have digitally transformed their business. Infrastructure as a Service (IaaS) extends your IT environment from on-premises to the cloud. The global market size of IaaS is expected to reach nearly $82.2 billion this year. Needless to say, IT teams need an endpoint management solution that enables them to manage cloud-based environments on services such as Azure and AWS, as well as hybrid on-premise and cloud environments.

With Kaseya VSA, you can automate the discovery process of all endpoints and network devices, including virtual hosts, virtual machines and cloud infrastructure for services such as Microsoft Azure. You can deploy our automation agents to Azure Active Directory (AD), with more cloud providers on the horizon. VSA gives you the visibility and functionality you need to manage all of IT in a single UI. If your RMM can't manage your hybrid IT ecosystem, it's time to upgrade. Request your demo today!

The post Cloud Migration: Strategies, Process, Benefits and Challenges appeared first on Kaseya.

*** This is a Security Bloggers Network syndicated blog from Blog - Kaseya authored by Kaseya. Read the original post at: https://www.kaseya.com/blog/2022/09/14/cloud-migration-strategies-process-benefits-challenges/


G-Cloud 13 and public sector technology procurement – Open Access Government

Cloud computing is now so widely used that it has become the new normal, with on-premises infrastructure largely consigned to history, at least for many public sector digital services. However, this is really quite a recent shift, and one that was facilitated in no small part by the transformation in Cloud service procurement via The Government Digital Marketplace (originally created as CloudStore).

This transformation began in earnest in 2012 with the launch of G-Cloud, enabling public sector organisations to more easily adopt a 'Cloud first' approach to IT. Now, with the launch of G-Cloud 13, Zoocha looks back over ten years of success in digital transformation.

One of the greatest impacts of G-Cloud has been to provide a mechanism for small and medium-sized enterprises (SMEs) to offer their services to public sector organisations. This is not only transformational for those businesses, but also creates vastly greater choice for buyers seeking the best value for money, without increasing procurement bureaucracy.

By May 2013, a little over a year after launch, there were over 700 suppliers on the framework, the vast majority of which were SMEs. Today, SMEs account for over 90% of all suppliers listed on the G-Cloud framework.

Another key to the success of G-Cloud is the clear structure of the services offered. Since G-Cloud 9, services have been classified into three lots:

Lot 1: Cloud Hosting (IaaS and PaaS): Cloud platform or infrastructure services that can help buyers deploy, manage and run software, as well as provision and use processing, storage or networking resources.

Lot 2: Cloud Software (SaaS): Applications that are typically accessed over a public or private network, for example the internet, and hosted in the Cloud.

Lot 3: Cloud Support: Services that support buyers to implement and maintain their Cloud systems.

This is designed to provide buyers with an easier way to identify and procure solutions and service providers, without the need for an intensive and expensive tendering process. The process for buyers is simple:

1. Define your requirements.
2. Search the Digital Marketplace using keywords relating to your requirements.
3. Save and export your search results to ease the audit trail.
4. Evaluate the services in the search results and shortlist the ones that meet your requirements and budget.
5. Ask further clarification questions of the suppliers in your shortlist.
6. Decide on the preferred supplier and award the contract.

Similarly, for suppliers, both the process of listing services on G-Cloud and the direct-award procurement process described above have dramatically simplified the task of selling to public sector organisations and enabled them to compete against larger businesses on a level playing field.

In addition to Lots 1, 2 and 3, G-Cloud 13 includes a new fourth Lot. Whilst Lots 1-3 services will continue to be offered via the Digital Marketplace for buyers to search, shortlist and direct award contracts, Lot 4 will be separate, with procurement following a bidding process rather than service listing and direct award.

Lot 4 is designed to enable buyers to source providers who can implement larger-scale transition projects, from early business analysis through to migration, implementation and legacy systems integration.

In the last decade, almost £12 billion worth of Cloud services have been procured through the framework, with 39% of that spend awarded to SMEs. In terms of breakdown, Cloud Support has overwhelmingly been the most successful Lot on the framework, with £6.7 billion spent since 2013, accounting for 65% of total framework spend, while Cloud Software contributed 23% of spend and Cloud Hosting 12%.

Perhaps more tellingly, the breakdown of spend by sector (i.e. type of public sector organisation) highlights that whilst G-Cloud has been successful in attracting central government buyers, accounting for 78% of framework spend, adoption by local government buyers has yet to gather pace, accounting for only 6% of spend. This means there are a huge number of organisations who are yet to access the benefits of the framework and represents one of the key future growth opportunities for G-Cloud and the suppliers listed on it.

G-Cloud has come a long way since its inception, driving transformational change in government technology procurement and in the ability for SMEs to access government contracts. We expect this trend to continue as the wider landscape of public sector organisations, including local government and health service providers, realise the benefits of using the framework for Cloud service procurement.



Taking Law Firms to the Next Level With Cloud-Based SaaS – Spiceworks News and Insights

Despite the legal industry's reluctance to fully embrace tech, the legaltech market has grown swiftly. According to Zion Market Research, it was valued at approximately $3,245 million in 2018 and is expected to grow tenfold by 2026. And once more companies in the field realize the value of tech-enabled benefits, others will jump on the bandwagon, too.

But how do you make the leap from cabinets full of folders and clunky Excel sheets? The industry's resistance to innovation has been hampering growth, leaving teams disengaged and without proper training in handling modern software.

The best way to navigate doubts about going digital and cloud-based is by dispelling myths, showcasing advantages, and sharing the steps needed for a seamless and effective transition. This way, legal firms can confidently launch into the practice's future, ready to take advantage of cloud computing software.

Just like in any industry that handles sensitive information, data security is a primary concern for any legal company. In reality, the hesitation is similar to keeping cash under a mattress rather than in the bank over safety concerns: cyber risks do exist, but there is always an effective way to mitigate them. All software is eventually targeted by cyberthreats. Companies and software developers are in charge of implementing their cybersecurity architecture and deciding how many safety barriers they will put in place according to their product's needs.

Legal software companies usually get certified by implementing policies that assure law firms that their on-cloud activity is protected. Some of these certifications are ISO/IEC 27001 and ISO/IEC 27017. Antivirus, anti-spyware, and hardware firewalls are additional steps a firm can take to safeguard its operations. However, rest assured that with security-certified software, the SaaS provider's IT team will vouch for the data's safety.

The security put in place by SaaS operating on the cloud also entails end-to-end encryption, encryption of data at rest (when stored on servers) and encryption in transit (while traveling from the client to the provider's server). Additionally, some SaaS operate on the cloud through web services, such as Amazon Web Services, which provide them with a platform to run on. These cloud computing services have security systems of their own; therefore, a law firm's data is secured in several different layers.
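To make the at-rest idea concrete, here is a minimal, generic sketch of symmetric encryption before storage, using Python's cryptography package. It illustrates the concept only; it is not how any particular SaaS provider implements encryption, and real providers manage keys in dedicated key-management services.

```python
# Minimal sketch of encryption at rest: data is encrypted before it is
# written to storage and decrypted only when read back. Illustrative only.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in a key-management service
cipher = Fernet(key)

record = b"Client: Doe vs. Roe, privileged notes"
stored_blob = cipher.encrypt(record)        # what actually lands on the server
assert cipher.decrypt(stored_blob) == record
```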

One of the biggest reasons lawyers stay on the fence about switching to legaltech is hesitation over cost and ROI. Often, practices prefer to stick to their old ways to avoid the extra costs of cloud-based software; however, the trick is knowing how the software will help cut and control expenses. Though the market slowed down after the pandemic, companies that use this service still reported yearly revenue growth of 32% in 2021.

The most straightforward answer to the cost vs. benefit dilemma is that practice management automation tools used in legal software leverage artificial intelligence (AI). AI enables firms to work more efficiently, avoid missing billed hours, and reduce time spent on repetitive tasks. Using automation allows companies to negotiate accurate prices, showing the time each task takes. Thus, the software expenses will return as better time-tracking, better billing, and more time spent on billable activities.

AI is not just about keeping track of hours and calculating invoices. Another task it supports is document assembly, where tailored documents are generated by filling in basic information, making it an efficient process with little space for human error. And there are also benefits for clients. Automation boosts the client experience as some legal software offers customer relationship management (CRM) with self-service capabilities. This way, clients only need to answer a few questions to complete an entire document.
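At its simplest, document assembly is template filling: client answers are merged into a standard document. A toy sketch follows; the template wording and field names are invented for illustration.

```python
# Toy sketch of document assembly: a tailored document generated by
# filling basic client answers into a template. Field names are invented.
from string import Template

engagement_letter = Template(
    "Dear $client_name,\n\n"
    "This letter confirms that $firm_name will represent you in the matter "
    "of $matter, at an hourly rate of $rate.\n"
)

answers = {  # collected from a client-facing intake form
    "client_name": "Jane Doe",
    "firm_name": "Example & Partners LLP",
    "matter": "Doe vs. Roe",
    "rate": "$350",
}
print(engagement_letter.substitute(answers))
```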

Moreover, AI takes an extra step to save time and costs with technology-assisted review (TAR). This subset leverages machine learning (ML) to run complex tasks, and developing these processes requires someone to input and regulate the information fed to the system. For example, e-Discovery uses ML to find keywords in several documents, rank them by relevancy to the case, and delete duplicates, saving hours and even days of work. It also handles tasks like extracting data from text and identifying mistakes, missing definitions, and legal traps.
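A rough sketch of that kind of pipeline is below, using scikit-learn's TF-IDF vectorizer to rank documents against case keywords and drop exact duplicates. This is a toy illustration under simplified assumptions, not a production e-Discovery system.

```python
# Toy e-Discovery sketch: rank documents by relevance to case keywords
# and drop exact duplicates. Illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Email re: merger timeline and escrow terms",
    "Lunch menu for the office party",
    "Email re: merger timeline and escrow terms",   # exact duplicate
    "Draft escrow agreement, revised terms",
]
query = "merger escrow terms"

unique_docs = list(dict.fromkeys(documents))        # dedupe, keep order

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(unique_docs)
query_vec = vectorizer.transform([query])

# Rank the remaining documents by similarity to the case keywords.
scores = cosine_similarity(query_vec, doc_matrix)[0]
for score, doc in sorted(zip(scores, unique_docs), reverse=True):
    print(f"{score:.2f}  {doc}")
```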

TAR relies heavily on predictive coding in ML and AI. Despite being in its early stages, predictive coding also helps filter documents according to tone, context, and concept in just minutes, sparing lawyers the time spent scrutinizing endless files. However, this feature needs extensive tuning to work effectively, making it financially viable for only a few in Big Law. As technology advances, it will become more affordable for smaller law firms to leverage some of these options.

As a preliminary overview, there are several SaaS solutions to choose from, with different features and pricing levels. Firms can leverage pricing according to their needs, thanks to legal software providers' scalability and price ranges, making the technology accessible to practices of all sizes. Likewise, providers must also be clear about the charges and payment options based on the requested services. After weighing these factors, legal firms should make the right call depending on suitable providers and the size of the firms they assist. The switch should be relatively easy when selecting the most convenient option.

Most providers offer a free trial, so the client can test the product to ensure it meets their requirements even before making a decision. Starting with a SaaS solution should take a couple of hours to a few days but no more than that.

A key point of cloud-based SaaS is that it does not require additional server hardware or an in-house IT team, as the SaaS provider supplies these tools on the cloud, saving companies further expense. The provider's IT teams are not just there to deliver a suitable product but also to assist companies with the transition. The process can be rocky, so legal firms should lean on the providers for help every step of the way. Otherwise, the product cannot function properly, and departments that rely heavily on it will be left unsatisfied.

The advantages of SaaS for legal companies are plentiful, and it is up to law firms to examine all of their options and go for the right fit. Legaltech continues to expand and evolve, and with it, new tools will take legal practice to the next level, making the job easier for lawyers in all fields.

Why do you think cloud-based SaaS is the future of legaltech? Let us know your thoughts on LinkedIn, Twitter, or Facebook. We would love to hear from you!


Global Microserver Market is Predicted to Grow at a CAGR of ~14% during 2022-2031; Growing Need for Data Centers, Cloud Computing Increased Prevalence…

New York, Sept. 13, 2022 (GLOBE NEWSWIRE) -- Kenneth Research has published a detailed market report on the Global Microserver Market for the forecast period, i.e., 2022–2031.

Global Microserver Market Size:

With the growth of the digital world, the need for data centers is also rising. Data centers are an integral part of the digital economy: everything that takes place online is stored in them. They hold videos, software and servers, and serve as points of exchange for data between different networks. The global microserver market is poised to grow at about 14% CAGR throughout the forecast years, after reaching USD 47 billion in 2021. Companies are migrating towards digitization and relying on digital networks; to ease their work and to efficiently organize, analyze and process data, organizations' demand for data centers is escalating. In 2021, the total number of data centers worldwide reached nearly 8 million.

Get a Sample PDF of This Report @ https://www.kennethresearch.com/sample-request-10154391


Higher Usage of Cloud Computing, Increased Digitization and Digital Data to Augment Market Growth

Enterprises' need to access their scattered networks from a single point and to protect their data with the best possible security is elevating the use of cloud computing. As of 2022, approximately 95% of organizations use cloud computing in their work, and nearly 85% of companies deployed workloads on the cloud in 2021. Furthermore, with the rising usage of the internet, mobile phones and laptops, digitization is flourishing rapidly. By 2023, the revenue produced by digitally transformed companies is expected to equal almost half of global GDP. Government, public and private sectors are migrating towards digitization, and most services, products and information are digitally available. With the rise in digitization, the data generated on online platforms is also expanding massively. According to the United States International Trade Commission, digital data amounted to around 59 zettabytes in 2020 and is expected to reach 175 zettabytes by 2025.

Global Microserver Market: Regional Overview

The global microserver market is segmented into five major regions including North America, Europe, Asia Pacific, Latin America, and the Middle East and Africa region.

Browse to access In-depth research report on Global Microserver Market with detailed charts and figures: https://www.kennethresearch.com/report-details/microserver-market/10154391

Asia Pacific Regional Market to Be Elevated by Accelerated Digitization and Higher Deployment of Cloud Services

The rising need for cost-effective data storage solutions is leading Asia Pacific to deploy cloud services, a shift accelerated by the impact of COVID-19. Nearly 60% of companies have sped up their migration to cloud services due to COVID-19. Furthermore, the increased prevalence of digital transformation is prompting government, public and private sectors to digitize their work processes and the services they provide. Around 60% of businesses have increased their speed of digital transformation, which is nearly 45% higher than the rest of the world. A rise in the number of data centers is expected to further augment the growth of the microserver market. Currently there are approximately 100 data centers in Asia Pacific with server room space of nearly 185,000 m², and around 30,000 m² is under development.

Higher Use of Cloud Services and Increased Number of Data Centers to Elevate Market Growth in North America

The onset of COVID-19 fueled the utilization of cloud services across every industry in the United States. It is estimated that about 95% of U.S. companies have deployed at least one cloud service for their workloads. Furthermore, as of January 2020, the United States had nearly 2,700 data centers; this high number of data centers is likely to propel market growth.

The study further incorporates Y-O-Y growth, demand and supply analysis, and forecasts of future opportunities.


Global Microserver Market, Segmentation by Processor

Intel is expected to contribute notably to the growth of the microserver market. Intel is the world's largest semiconductor chip maker and has sustained that position for a very long time by constantly improving its processors and producing new innovations. Its processors are well known for radiating less heat and are well suited to mini laptops. Furthermore, Intel supports a wide variety of motherboards and is comparatively less expensive than other processors. In the second quarter of 2022, Intel's share of recorded x86 computer processor (CPU) tests was around 65%, and nearly 75% for laptops.

Global Microserver Market, Segmentation by Application

Companies' rising need to secure all their network endpoints and to easily manage employees all over the world is prompting them to deploy their work on cloud computing. Around 60% of companies have started using cloud computing to avoid the risk of data theft and to secure critical data. Furthermore, the smart feature of hybrid cloud, the flexibility to move between public and private clouds, is increasing its popularity. Enterprises are counting on hybrid cloud to gain the benefits of different types of cloud through a single deployment. As of March 2022, nearly 79% of enterprises had installed hybrid cloud for carrying out tasks.

The global microserver market is further segmented based on solution, service type, and end user.



A few of the well-known market leaders in the global microserver market profiled by Kenneth Research are FUJITSU Limited, Dell Technologies, Inc., IBM Corporation, Acer Inc., Hitachi, Ltd., Intel Corporation, Hewlett Packard Enterprise Development LP, Advanced Micro Devices, Inc., NVIDIA Corporation, and Ambedded Technology Co., LTD., among others.



Browse More Related Reports:

Robotics Process Automation (RPA) Market Segmentation by Operation (Rule-Based and Knowledge-Based); by Deployment Mode (On-Premises, and Cloud); by Application (Customer Support, Administration & Reporting, Data Migration & Capture Extraction, Analysis, and Others); by Industry (BFSI, Healthcare, IT & Telecom, Manufacturing, Retail, Hospitality, and Others)-Global Demand Analysis and Opportunity Outlook 2031

Enterprise Data Management Market Segmentation by Deployment Type (Cloud, and On-Premise); by Component (Services, and Solutions); by Verticals (IT & Telecommunication, BFSI, Retail, Healthcare, Government, and Others); and By Organization Size (Large Size Organizations, and Small & Medium Size Organizations)-Global Demand Analysis & Opportunity Outlook 2031

Operational Intelligence Market Segmentation by Deployment (On-Premise, and Cloud); by Verticals (Manufacturing, Retail, Healthcare, IT & Telecommunication, BFSI, and Others); by Application (Commercial, Residential, and Others); and by Organization (Large Organizations, and Small & Medium Organizations)-Global Demand Analysis & Opportunity Outlook 2031

Data Center Power Market Segmentation by Data Center Sizes (Large Data Center, and Small & Medium Data Center); by Component (Service, and Solution); and by End-Users (Manufacturing, BFSI, IT & Telecommunication, Government, and Others)-Global Demand Analysis & Opportunity Outlook 2031

Passport Reader Market Analysis by Technology (Barcode, RFID, and OCR); by Application (Airport Security, and Border Control); and by Sector (Public, and Private Sector)-Global Supply & Demand Analysis & Opportunity Outlook 2022-2031

About Kenneth Research

Kenneth Research is a leading service provider for strategic market research and consulting. We aim to provide unbiased, unparalleled market insights and industry analysis to help industries, conglomerates and executives make wise decisions about their future marketing strategy, expansion, investment and more. We believe every business can expand to new horizons, provided the right guidance is available at the right time through strategic minds. Our out-of-the-box thinking helps our clients make wise decisions and avoid future uncertainties.

Contact for more Info:

AJ Daniel

Email: info@kennethresearch.com

U.S. Phone: +1 313 462 0609

Web: https://www.kennethresearch.com/


On cloud platforms and SME antitrust complaints – TechCrunch

End-to-end encrypted email provider Tutanota finally got a fix last month from Microsoft for a registration issue that had affected users who were trying to sign up to the tech giant's cloud-based collaboration platform, Teams, using a Tutanota email address, but only after complaining about the problem publicly.

TechCrunch picked up its complaint last month.

In a blog post confirming the resolution yesterday, Tutanota writes that Microsoft got in touch with it within a week after media outlets such as this one raised the issue with Microsoft. It had been complaining about the issue through Microsoft's official support channels since January 2021 without any resolution. But after the oxygen of publicity arrived, the problem was swiftly fixed last month. Fancy that!

While it's (finally) a happy ending for Tutanota, its co-founder Matthias Pfau makes the salient point that this situation remains an entirely unsatisfactory one for SMEs faced with the market muscle of powerful platforms, which have at best a competitive disinterest in swiftly attending to access issues and other problems affecting smaller businesses that need fair interfacing with their platforms to ensure they can properly serve their own customers.

"While the issue has been resolved pretty quickly by Microsoft after the right people contacted us following the media attention, we still believe that this example shows why we need better antitrust regulations. It is not fair that a Big Tech company can ignore a small company's request to fix an issue that affects its users for months, and is only interested in fixing the issue after it received bad publicity because of this," he writes.

"After all, not every small company has the option to go public, possibly because the media will decide their issue is not worth talking about or because they simply do not have established media contacts and find it hard to get through to the right people.

"While we are very happy that this particular issue has now been fixed for all Tutanota users, we still believe that there must be a better way for companies to contact Big Tech and request fixes from them, one where they cannot simply answer the request with 'Sorry, fixing the issue you are having is not feasible for us.'"

Platform fairness is one issue that the European Commission has been attending to in recent years but apparently not with enough of a flex to ensure all SMEs are being treated attentively by cloud giants.

Tutanota is not alone in experiencing issues with Microsoft's support response to its complaint. Another SME, the browser maker Vivaldi, got in touch following our report on Tutanota's issue, saying users of a webmail service it offers had reported a similar issue on Azure, another Microsoft cloud computing platform. It told us that users of its Vivaldi.net email service had been given information on, and possibly access to, other vivaldi.net users' Azure accounts. Which sounds, well, suboptimal.

"The reason is that vivaldi.net is handled as a corporate domain, not an email provider domain. Microsoft has refused to fix the problem, claiming it is by design," a spokesperson for the company explained last month, adding: "We have also had similar reports about other services."

"It's frustrating in 2022 to find Microsoft blatantly continues to engage in anti-competitive practices," they added.

After TechCrunch raised Vivaldi's complaint with Microsoft, the SME got back in touch with us to say that (surprise!) it had suddenly had fresh attention from the cloud giant to its complaint. "We are having a meeting with them this week. So they have woken up after two years. Let's see what comes out of this," its spokesperson told us a few weeks ago.

We followed up this month to see if Vivaldi has also had a resolution, but at the time of writing we were still waiting on a response.

Update: Vivaldi told us the meeting with Microsoft moved in a positive direction and the issues are partially fixed. "Vivaldi.net is being treated as a personal email address and that's what we expect. Although it's been fixed for new signups, for existing accounts that were created before they made their changes, we don't think they've done anything yet," they also said. "Hopefully, all such issues should soon be resolved."

We also asked for an update from Microsoft but haven't heard back yet. But the tech giant previously said: "We're in touch with Vivaldi.net to look into their concerns around data and will take action as needed to ensure that customer data is handled properly and any issues are addressed appropriately."

One thing is clear: These two complaints are just the tip of the iceberg. (Just the social media chatter around our Tutanota reporting includes a similar complaint about IBM Cloud, and another that Microsoft also blocks self-hosted emails from its virtual private servers "without any sort of explanation, so you can conveniently get an email address from them as well," with the complainant accusing Microsoft's business of having always been forced dominance, for example.)

What's a whole lot less clear is whether or not current (and incoming) EU regulations are up to the task of protecting SMEs from cloud giants' power to be totally disinterested in resolving platform problems that affect smaller competitors.

Back in 2019, the European Union agreed a regulation, one the bloc's lawmakers claimed was pioneering in this regard, aimed at tackling unfair platform business practices, with the Commission saying they wanted to "outlaw some of the most unfair practices and create a benchmark for transparency." The regulation, which came into force just over two years ago, included a requirement that platforms set up new avenues for dispute resolution by mandating they have an internal complaint-handling system to assist business users.

However the EU's platform-to-business (P2B) trading regulation, which was targeted at so-called online intermediation services that provide services to business users to enable them to reach consumers, had a heavy focus on ecommerce platforms, search engines, app stores and rental websites etc (and barely any mention of cloud computing). So it's not clear whether services like Microsoft Teams and Azure are intended to fall in scope, despite online intermediation itself being a broad concept.

If the regulation is supposed to apply to cloud services, the poor experiences of SMEs like Tutanota, having core issues affecting their users essentially ignored via official support channels, indicate something isn't working. So, at the very least, there's a failure of enforcement going on here. The lack of clarity around whether the P2B regulation even applies in such cases also obviously doesn't help. So there does seem to be a communication gap, if not an outright loophole.

The EU has further digital regulations incoming that are squarely targeted at ruling how platforms do business with others, with the goal of ensuring open and contestable markets via proactive enforcement of fair terms and conditions. Most notably the Digital Markets Act (DMA), which will apply to the most powerful gatekeeper platforms.

However this regulation is not yet in force (application will start next year), and it will require individual gatekeepers and core platform services to be designated before requirements apply, which will take many months in each case. So, well, it's not going to be a quick fix.

Additionally, there have also been some concerns about whether the new regime will robustly apply to cloud giants' productivity and enterprise services to other businesses. So some legal fuzziness around cloud services may persist.

Asked if it's confident the DMA will be an antitrust game-changer, a spokeswoman for Tutanota was doubtful it will prove a silver bullet to resolve the baked-in power imbalance between platforms and SMEs. "A better way to resolve such issues is needed," she told us. "Possibly the DMA will address this but consequences in cases of negligence on the gatekeepers' side must be in place; otherwise it will be easy for them to continue to ignore small competitors."

"As long as Big Tech companies do not have to fear any kind of consequence, be it bad publicity or drastic fines, they will not be interested in investing in fixing issues of competitors' users, which from their business perspective is understandable. This is exactly why we need better legislation in this regard."

"We expect the DMA to be a good first step in this direction, though it will probably not address all issues," she added.

The Commission was contacted with questions on these issues but at the time of writing it had not responded. We'll update this report if we hear back.


The Future of FinTech and the Cloud – DevOps.com

The integration of FinTech continues to revolutionize the way businesses interact with their consumers. What started as a viable solution to eliminating the need to carry physical currency has now become a multi-billion-dollar industry.

Basically, FinTech is a catch-all term used to describe software, mobile applications and other integrated technologies that improve and automate traditional forms of finance for businesses and their consumers. Financial infrastructure-as-a-service (IaaS) focuses on enabling FinTech companies to quickly facilitate highly secure transactions across an internal network while improving user experience. The complex (yet efficient) solution helps streamline fiscal transactions while simultaneously eliminating unnecessary steps throughout the transaction process, ultimately making financial transactions more affordable and accessible.

The integration of FinTech continues to bolster unprecedented growth within the business sector. The industry's rapid growth can be attributed to a shift in consumers' lifestyles towards more accessible and affordable solutions. As a result, FinTech has been able to solidify a prominent role within business operations; however, it needs to remain vigilant in adapting infrastructures to satisfy ever-changing consumer and industry demands.

With this goal in mind, FinTech companies have begun to innovate new solutions, including embracing the cloud within their infrastructures. With this new integration, companies will have to determine whether FinTech-cloud computing will ultimately be an asset in achieving longevity and lucrative business growth.

FinTech companies are rapidly turning to the cloud to provide businesses and consumers with effective solutions. According to a research study, 87% of enterprises plan to accelerate their cloud migration by 2025. So, why is cloud computing becoming imperative for FinTech companies to integrate?

1. Agility: For businesses to compete, they must stay ahead of the competitive curve by reacting quickly to market changes. Cloud services satisfy this need by enabling companies to accelerate their scaling process: businesses can scale capacity up or down quickly to meet customer demand in real-time, providing a more cost-efficient solution (see the sketch after this list).

2. Competitive Advantage: Within the financial industry's competitive landscape, it can be quite difficult for smaller companies to gain significant market share. Likewise, larger corporations struggle to retain customers as internal legacy IT practices hinder reaction times to changing customer demands. The cloud provides both with a better way to compete by offering a quicker alternative for meeting customers' demands.

3. Enhanced Customer Experience: Millennials and Gen Z have transformed the banking industry by starting to view finances through the lens of technology. Cloud computing offers FinTech companies the opportunity to promote an enhanced customer experience with consumer-centric web applications and improved efficiency metrics, ultimately providing the next generation of consumers with an accessible, agile and much more manageable solution that suits their technology-savvy lifestyle.

4. Cost-Effective: Businesses can incur significant infrastructure costs while developing and deploying FinTech products for a large customer base. Cloud adoption can help cut down costs by limiting usage, promoting elasticity of its products and adhering to an all-inclusive price.

5. Amplified Security Measures: With the ecosystem of hackers and cybercriminals continuing to evolve, companies must remain proactive in mitigating the potential risk of data breaches and cyberattacks. FinTech applications with weak security measures can compromise customer data and other private information, leaving the door open to exploitation by malicious operators. Hybrid cloud architecture provides a more secure framework by allowing companies to build their own solutions in cloud infrastructure they own, rather than in a shared environment within a shared network. As a result, they can be proactive and help eliminate potential data security risks.
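To ground the elasticity point in item 1 above, here is a hedged sketch of declaring an auto-scaling range for a cloud resource. It uses AWS Application Auto Scaling for a DynamoDB table via boto3 purely as one example; the table name and capacity limits are placeholder assumptions, not recommendations.

```python
# Sketch: declaring an elastic capacity range so the platform scales a
# resource up and down with demand. Example uses AWS Application Auto
# Scaling for a DynamoDB table via boto3; names and limits are placeholders.
import boto3

autoscaling = boto3.client("application-autoscaling", region_name="us-east-1")

autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/example-transactions",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    MinCapacity=5,      # floor during quiet periods
    MaxCapacity=500,    # ceiling during demand spikes
)
```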

Cloud computing integration continues to provide FinTech companies the opportunity to accelerate business growth. While there are a number of FinTech companies that have managed to achieve great success while using these infrastructures, other companies have failed.

Any company that invests in FinTech (or any other public cloud service) is susceptible to potential disadvantages. It's important that companies begin to identify these potential challenges and actively try to avoid them.

Here are potential limitations to consider:

1. Refusing to Use Existing Tools: Change is difficult to manage. Nevertheless, it's important that companies continue to embrace new technologies as they emerge. If FinTech companies refuse to build platforms using current technology, they run the risk of an influx of user discrepancies. On the contrary, if they decide to scale up, they run the risk of hindering the agility of the entire company. By using existing and open source technology, companies can proactively fix their problems and promote continuity in their infrastructures by eliminating outdated data, obscure patches and massive inefficiencies.

2. Ignoring Scope and Capacity: Using FinTech engineering requires an abundance of capacity planning to ensure an effective solution is created. Without planning, companies run the risk of creating infrastructures that cannot accommodate the volume of metrics used. This can kill dashboard performance and make troubleshooting impossible. It's important to plan at every stage, regardless of which providers are used.

3. The Desire to Implement More than Required: Amid a fast-paced and competitive industry, FinTech engineers are constantly implementing new innovations to solidify their competitive advantage. It's important that companies continue to innovate and improve their products; however, operations should not stray too far from their original scope of work. Businesses should work to eliminate unnecessary additions within their projects that could waste a considerable amount of time and resources. Keep simple tasks simple.

4. Plan and Evaluate: Businesses need flexibility, independence and a workable cost structure to thrive. To achieve these marks, it's necessary for companies to take the time to plan and evaluate each offering to determine which solution accommodates their business needs. Although switching providers might seem doable, doing so can result in a more complicated and expensive solution. Setting long-term objectives before deciding on a provider could make that eventual move simpler for your team.

The integration of FinTech and cloud computing services continues to impact the technology, finance and business sectors. The question of whether a FinTech-cloud computing structure is the more viable solution for businesses remains open. By planning and evaluating each financial solution, businesses can determine which FinTech service is the most feasible for optimizing business objectives.

FinTech companies will always continue to provide safe and seamless solutions for their customers; however, it's the companies that can capitalize on efficiency that will rise to the top. As FinTech and cloud integration continues to revolutionize customer-brand solutions, we will come closer to unveiling the future of FinTech and the lasting impact it has on society.


Vision to Value: How the Energy Industry Can Maximize Cloud Investment – Spiceworks News and Insights

The energy industry is in transition, experiencing fundamental change, from decarbonization to decentralization, at pace. These changes are putting pressure on traditional ways of doing business, requiring companies to transform their operating models, processes and even culture. But why is there so much struggle across the industry when it comes to making this move? Craig Davis, head of cloud success services at SAP North America, explores how energy leaders can turn vision into value through cloud transformation, realizing the exceptional benefits this technology can bring to the sector.

The energy industry has witnessed unprecedented disruption over the last decade due to falling costs of renewables and energy storage, the world's accelerating pivot to non-fossil fuels, and tougher environmental policies and regulatory reform. Against this backdrop, the industry is at an inflection point where business leaders are looking to cloud technology to redefine resilience, boost competitiveness and prepare for a sustainable energy future.

And for good reason. According to recent data from Gartner, 95 percent of new digital workloads will be deployed on cloud-native platforms (up from 30 percent in 2021), and global cloud revenue is expected to surpass $474 billion in 2022 (up from $408 billion in 2021). As one Gartner VP, Milind Govekar, put it, "There is no business strategy without a cloud strategy."

But if the consensus across the energy industry is that cloud technology is essential, why is there so much struggle and even outright failure when it comes to making a move? Why are most forays limited to migrating functional processes instead of truly scaled adoption? I would argue it is because far too many are focused on a solution, one without a vision for what is truly possible. One focused on cost but not value.

Like it or not, disruption is the norm for energy business leaders. Whether it is industry-specific disruptors like aging infrastructure and adoption of renewable and distributed energy sources, or life-altering disruptors like COVID-19, industry players should operate under the assumption that the tried-and-true practices of today may be irrelevant when they wake up tomorrow.

But siloed operating models create a process complexity that hinders the ability of energy leaders to react quickly and flex to these disruptions and shifting priorities. That is why the sector needs a new kind of vision, one dedicated to agility and adaptability for anything that comes along.

We can't predict the future, but the flexibility and visibility inherent in the cloud can enable us to view and analyze challenges and opportunities through the lens of insightful next-generation tools and technologies. Whether it is AI, machine learning or advanced data and analytics, today's energy leaders can embrace the cloud to help streamline existing processes, enhance product offerings and serve customers better.

For example, real-time data and analytics can help energy leaders harmonize field service processes and quickly react to external events, such as weather disasters. Imagine how much easier it would be to control customer experience during outages with cloud-connected IoT technology monitoring key infrastructure. Not only could companies respond to the outage faster, but they could trigger notifications to inform the customer that they are working to repair it before they are even aware.
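A minimal sketch of that monitoring-and-notification pattern follows; every function name, identifier and threshold in it is hypothetical, invented purely for illustration.

```python
# Hypothetical sketch: detect an outage from missing IoT meter heartbeats
# and proactively notify affected customers. All names and thresholds
# are invented for illustration.
import time

HEARTBEAT_TIMEOUT_S = 300  # consider a meter offline after 5 silent minutes

def check_outages(last_heartbeat: dict[str, float],
                  customers_by_meter: dict[str, str]) -> list[str]:
    """Return customers whose meters have gone silent."""
    now = time.time()
    return [
        customers_by_meter[meter_id]
        for meter_id, seen in last_heartbeat.items()
        if now - seen > HEARTBEAT_TIMEOUT_S
    ]

def notify(customers: list[str]) -> None:
    for customer in customers:
        # In a real system this would call an SMS/email/push service.
        print(f"Notify {customer}: outage detected, crew dispatched.")

# Example: meter 'm-42' last reported 10 minutes ago.
notify(check_outages({"m-42": time.time() - 600}, {"m-42": "account-1001"}))
```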

Disruptive ideas don't have to be complicated ones. It is not always about out-thinking competitors. Sometimes, it is just about finding the simplest, fastest way to deliver a solution to the customer. The cloud offers the vision necessary to do just that.

See More: Reducing Your Cloud Computing Climate Impact

Especially in the current climate of significant disruption and change, agile and innovative solutions in the cloud that help companies strengthen their core business processes and innovate for the future are key. However, energy leaders can't just have vision alone. They must be able to draw a direct line from vision to value.

This is partially because what the energy sector is currently experiencing is something even greater than disruption: it is a wholesale change to the way companies typically function and operate.

Let's look at the example of Chevron, a company aiming to deliver reliable, affordable and ever-cleaner energy across its operations in order to keep pace with the massive changes affecting the oil and gas industry. The company wanted a standard cloud product that would increase its overall speed and agility, decrease the total cost of ownership of its business solutions, and drive greater digital innovation overall. That's why it embraced a digital foundation that enhances production efficiency and drives faster, more informed decisions.

By leveraging cloud technology, energy companies can better navigate complex implementation scenarios and achieve new efficiencies across the organization, completely rethinking their digital transformation and allowing them to develop resilience in an inherently volatile and cyclical industry. This technology unites the company under a single source of truth and leaves it poised for future growth.

With industry priorities shifting, the cloud value proposition for energy companies has never been greater. Truly embracing the potential has become a business imperative. Cloud providers are eager to help by building and investing in custom solutions for the energy industry and laying the groundwork for mass adoption. But turning vision into value is not solely the responsibility of this technology.

While the cloud can help drive energy sector innovation and boost top-line, bottom-line and green-line growth, it is only part of the journey toward continuous improvement. Businesses must take action with their cloud technology, using its data to garner maximum business value across their company-wide processes, whether via analytics, application development or database management.

They must find ways to work with the CFO and other leaders to upgrade technology strategically, whether it is about showing immediate value in the short term or planning for deeper value in the long term. Simply put, embracing a grander vision from the outset will ensure stronger value in the end.

Energy leaders should ask: Where should our cloud investment begin? What is mission-critical for success? Would it be effective to go all-in, all at once? What's the plan for actually implementing it? Are we being intentional about the planned adoption and consumption of cloud software?

These are just some of the questions to consider as the energy industry pursues cloud transformation. Because while you may know you need to invest in the cloud, you still need to understand why. That's what leads to success.


Read more from the original source:

Vision to Value: How the Energy Industry Can Maximize Cloud Investment - Spiceworks News and Insights

Cloud Computing in Higher Education Market Share, Size, Demand, Growth Opportunities, Industry Revenue, Future and Business Analysis by Forecast…

According to the Astute Analytica study on the global Cloud Computing in Higher Education Market, the size of the market will increase from US$ 2,693.5 Million in 2021 to US$ 15,180.1 Million by 2030, registering a remarkable compound annual growth rate (CAGR) of 22% from 2022 to 2030.
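
As a quick sanity check, the implied growth rate can be recomputed from the report's own endpoints. Below is a minimal Python sketch using only the 2021 and 2030 figures quoted above; note that the report benchmarks its 22% CAGR to a 2022 base year not given here, so a small gap from the computed value is expected.

    # Recompute the implied CAGR from the stated 2021 and 2030 market sizes.
    start, end = 2693.5, 15180.1   # US$ million, from the release
    years = 2030 - 2021            # nine compounding periods

    cagr = (end / start) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")   # ~21.2%, consistent with the ~22% cited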

The segmentation section of the report examines every segment, highlighting those with a strong impact on the global Cloud Computing in Higher Education Market. The segmentation served as the foundation for identifying companies and examining their financial standings, product portfolios, and future growth potential. The second step entailed evaluating the core competencies and market shares of top firms in order to gauge the degree of competition. A bottom-up method was used to assess the market's overall size.


On the basis of institute type, technical schools are estimated to hold the highest market share in 2021 and are also expected to register the highest CAGR over the forecast period, owing to increasing demand for cloud computing in technical schools. Based on ownership, the private institutes segment is anticipated to hold the largest market share, owing to increasing funding for the adoption of cloud computing services in private institutes, while the public institutes segment is expected to grow at the highest CAGR over the forecast period. In terms of application, administration held a major share of cloud computing in higher education in 2021, whereas unified communication is expected to register the highest CAGR over the forecast period due to the rising trend of e-learning. By deployment, the hybrid cloud segment held the largest market share in 2021.

Market Dynamics and Trends

Drivers

The increasing adoption of SaaS-based cloud platforms in higher education, growing adoption of e-learning, rising IT spending on cloud infrastructure in education and increasing application of quantum computing in the education sector will boost the global cloud computing in higher education market during the forecast period. Software-as-a-Service (SaaS) is a cloud computing delivery model. In the higher education sector, SaaS applications include hosting management systems for educational institutes and managing other activities. Moreover, the higher education industry is witnessing increased adoption of e-learning due to its easy accessibility and high effectiveness. Users such as drop-outs, transfer learners and full-time employees increasingly rely on e-learning courses to upgrade their skills. Furthermore, higher education institutes are rapidly moving toward cloud-based services to reduce intensive IT infrastructure costs and boost operational efficiency.

Restraints

Cybersecurity and data protection risks, lack of compliance with SLAs, and legal and jurisdictional issues are restraining factors that inhibit the growth of the market during the forecast period. Data privacy issues pose a threat to the migration of higher education institutions to the cloud. Higher education institutes are subject to federal regulations, along with state and local laws, for managing information security in the education environment. Moreover, the level of complexity in the cloud is high, as deployments usually involve several service providers, making it hard for users to make changes or intervene. The cloud computing industry also faces legal and jurisdictional issues that can drag on for years due to regional laws.

Cloud Computing in Higher Education Market Country Wise Insights

North America Cloud Computing in Higher Education Market-

The US holds the major share of revenue in the North America cloud computing in higher education market in 2021 and is also projected to grow at the highest CAGR during the forecast period. In terms of institute type, technical schools held the largest market share in 2021.

Europe Cloud Computing in Higher Education Market-

Western Europe is expected to register the highest CAGR in the Europe cloud computing in higher education market during the forecast period. Within the region, Germany held the major share of the Europe market in 2021, owing to a strong focus on innovation from research & development and technology adoption.

Asia Pacific Cloud Computing in Higher Education Market-

India held the largest share of the Asia Pacific cloud computing in higher education market in 2021 and is expected to register the highest CAGR during the forecast period. The growth opportunity stems from end users such as schools and universities turning toward cloud services to offer high-quality services that help users collaborate, share and track multiple versions of a document.

South America Cloud Computing in Higher Education Market-

Brazil is projected to grow at the highest CAGR in the South America cloud computing in higher education market over the forecast period. Based on ownership, the private institutes segment held the major share of the South America market in 2021, owing to increasing funding for the adoption of cloud computing services in private institutes.

Middle East Cloud Computing in Higher Education Market-

Egypt held the largest share in 2021, and the UAE is projected to grow at the highest CAGR during the forecast period. In terms of application, administration held a major share of cloud computing in higher education in 2021, whereas unified communication is expected to register the highest CAGR over the forecast period due to the rising trend of e-learning.

Africa Cloud Computing in Higher Education Market-

South Africa held the largest share of the Africa cloud computing in higher education market in 2021. By deployment, the private cloud segment is expected to witness the highest CAGR during the forecast period, due to the security benefits of private cloud deployment.

Competitive Insights

The global Cloud Computing in Higher Education Market is highly competitive, with players working to increase their presence in the marketplace. Some of the key players operating in the market include Dell EMC, Oracle Corporation, Adobe, Inc., Cisco Systems, Inc., NEC Corporation, Microsoft Corporation, IBM Corporation, Salesforce.com, NetApp, Ellucian Company L.P., VMware, Inc. and Alibaba Group, among others.

Segmentation Overview

The Global Cloud Computing in Higher Education Market is segmented based on institute type, ownership, application, deployment and region. Industry trends in the global cloud computing in higher education market are sub-divided into different categories in order to give a holistic view of the global marketplace.

Following are the different segments of the Global Cloud Computing in Higher Education Market:


The report sub-segments the market by institute type, ownership, application and deployment. By region, the Global Cloud Computing in Higher Education Market is sub-segmented into:

North America

Europe

Western Europe

Eastern Europe

Asia Pacific

South America

Middle East

Africa


About Astute Analytica

Astute Analytica is a global analytics and advisory company that has built a solid reputation in a short period, thanks to the tangible outcomes we have delivered to our clients. We pride ourselves on generating unparalleled, in-depth and uncannily accurate estimates and projections for our very demanding clients spread across different verticals. We have a long list of satisfied and repeat clients from a wide spectrum, including technology, healthcare, chemicals, semiconductors, FMCG, and many more. These happy customers come to us from all across the globe. They are able to make well-calibrated decisions and leverage highly lucrative opportunities while surmounting fierce challenges, because we analyze for them the complex business environment, segment-wise existing and emerging possibilities, technology formations, growth estimates, and even the strategic choices available. In short, a complete package. All this is possible because we have a highly qualified, competent, and experienced team of professionals comprising business analysts, economists, consultants, and technology experts. In our list of priorities, you, our patron, come at the top. You can be sure of the best cost-effective, value-added package from us, should you decide to engage with us.

Contact us: Aamir Beg, BSI Business Park, H-15, Sector-63, Noida-201301, India. Phone: +1-888-429-6757 (US Toll Free); +91-0120-4483891 (Rest of the World). Email: sales@astuteanalytica.com. Website: http://www.astuteanalytica.com

See the original post here:

Cloud Computing in Higher Education Market Share, Size, Demand, Growth Opportunities, Industry Revenue, Future and Business Analysis by Forecast...

How using AI & ML will help businesses stay ahead of the curve – ETCIO

By Bikram Singh Bedi

AI is turning into a multi-faceted, pervasive technology for businesses and users across the world. Technology providers are building platforms that are helping users harness the power of AI by meeting them wherever they are.

Data management and aggregation are essential components of every cloud infrastructure. Organizations can create sophisticated analytics and artificial intelligence (AI) solutions to address distinct challenges, provided they have the relevant data at hand. Business intelligence is usually the first step in the process, since it helps organizations understand the underlying data better before using sophisticated analytics tools.

Organizations are investing in technology that helps AI and ML achieve the greatest benefit for their operations as they realize the significance of these technologies. AI and ML are greatly affecting how we conceptualize data, necessitating the development of best practices for these technologies. By 2025, Gartner expects generative AI to account for 10% of all data produced, up from less than 1% today.

It's critical to offer cutting-edge services for users of all types and up-to-date tools for sophisticated AI practitioners. To fit the demands of the job and the user's technical proficiency, some of this requires automating or abstracting portions of the ML workflow. Whatever the perspective, AI is becoming a multifaceted, omnipresent technology for organizations and consumers everywhere, so we believe technology providers should reflect this by creating platforms that enable users to harness the potential of AI by connecting with them wherever they are.

AI readiness

Businesses that are currently evaluating or piloting AI & ML implementations can take several steps to achieve scale quickly. The first phase is to identify and prioritize projects based on complexity, business impact and risk, using a minimum viable product strategy.

Most significantly, businesses must involve their executives by integrating them into AI initiatives, after giving them appropriate AI & ML training if required. AI initiatives have to be aligned with the company's bigger strategic objective rather than being implemented in isolation.

Building Cloud Native Platforms (CNP)

Cloud-native platforms are technologies that enable businesses to architect new applications that make use of the cloud's advantages. Since cloud-native technology is all about power and speed, cloud-native platforms provide solutions that help build a more effective and solid IT foundation through safe data integration and processing.

Enterprises must adopt CNPs to deploy operational capabilities everywhere. CNPs employ the fundamental features of cloud computing to offer scalable and elastic IT capabilities "as a service" to internet-based technology developers, resulting in a shorter time to value and lower costs.

Gartner forecasts that by 2025, more than 95% of new digital efforts will be built on cloud-native platforms, up from less than 40% in 2021.

Conclusion

AI is no longer uncharted territory. As firms look to deploy their models, the combination of technology and human-in-the-loop expertise offers a true end-to-end AI data solution. The rate of technological advancement that can automate and simplify building and maintaining AI systems has increased along with the demand for AI. Thus, by combining the required expertise, judgment and technology, the highest quality data can be achieved.

The author is Managing Director, Google Cloud India

Disclaimer: The views expressed are solely those of the author, and ETCIO.com does not necessarily subscribe to them. ETCIO.com shall not be responsible for any damage caused to any person/organization directly or indirectly.

See original here:

How using AI & ML will help businesses stay ahead of the curve - ETCIO

Got $5,000? These Are 2 of the Best Growth Stocks to Buy Right Now – The Motley Fool

A long-standing debate exists among the investing community: growth versus value. This year, value has gotten the upper hand. So far in 2022, the iShares S&P 500 Value ETF is down 8.5%, while the iShares S&P 500 Growth ETF is down 22.8%.

But, over the last 10 years, growth has outperformed value. A $10,000 investment in the aforementioned growth ETF 10 years ago would now be worth $33,210; the same investment in the value ETF would only be worth $21,680.
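
A few lines of Python confirm the annualized returns those dollar figures imply (a minimal sketch using only the numbers in this article):

    # Annualized return implied by each 10-year outcome above.
    def annualized(start, end, years):
        return (end / start) ** (1 / years) - 1

    print(f"Growth ETF: {annualized(10_000, 33_210, 10):.1%}")  # ~12.8% per year
    print(f"Value ETF:  {annualized(10_000, 21_680, 10):.1%}")  # ~8.0% per year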

Cycles come and go. And when the current trend toward value stocks gives way to one that favors growth stocks, long-term investors who have built significant positions in the best growth stocks will be handsomely rewarded. So what growth stocks should you own right now? Let's have a look at two names poised to lead the way higher.


Cutting-edge technology often sparks sizzling growth, and that's certainly true today. Advancements in artificial intelligence (AI), machine learning, and cloud computing drive innovation in sectors ranging from automobiles to healthcare. Yet, at the most basic level, these creations rely on a 70-year-old technology: the semiconductor.

Without semiconductors, there is no AI or cloud computing. Constant production and innovation are required to deliver the next generation of chips that make self-driving cars possible or gene editing a reality. And when it comes to semiconductor innovation and production, Advanced Micro Devices (AMD) is one of the world's best. It operates across four segments: Data Center, Client, Gaming, and Embedded.

AMD's legacy business includes its Client and Gaming segments, which make chips for PCs and gaming consoles. And while those segments made up $3.9 billion, or 59%, of AMD's second-quarter revenue, they are growing much more slowly than the remaining two segments, Data Center and Embedded.

AMD's Data Center segment generated $1.5 billion in revenue in the second quarter, up 83% year over year. Sales were driven by strong demand for its EPYC chips, which fuel many of the world's cloud servers.

Similarly, the company's Embedded segment grew like a weed, generating $1.3 billion in revenue in the second quarter. After completing its acquisition of Xilinx, AMD incorporated much of that company's operations into its Embedded segment, which largely produces custom-designed chips for the aerospace and defense industries.

The blistering growth rates in AMD's Data Center and Embedded segments help explain the company's staggering 70% year-over-year revenue growth rate in its most recent quarter. And while that sort of growth is truly impressive, the company's five-year average growth rate is even more so, at 38.7%. What's more, analysts expect the company to continue churning out fantastic growth rates. Wall Street forecasts full-year 2022 revenue of $26.2 billion (59.5% growth) and 2023 revenue of $29.6 billion (12.9%).

Whereas AMD produces the hardware (i.e., semiconductors), Snowflake (SNOW) is a software company specializing in the red-hot cloud data analytics sector.

Simply put, the cloud computing revolution is well underway, changing how individuals and organizations work. Cloud computing is the process of running applications, storing data, and consolidating security on a centralized cloud server, rather than operating individual workstations. This process results in enormous efficiencies across organizations and reduces costs.

However, cloud computing creates new challenges, too. And Snowflake's signature product, Data Cloud, aims to address one of the biggest. Data Cloud allows customers to collect, classify, and decipher data across multiple cloud platforms. In a world where many organizations rely on multiple cloud vendors, Data Cloud helps customers "see" all their data -- a process that isn't always as straightforward as it might seem.

Snowflake's surging customer base shows just how essential this data aggregation software has become. The company's customer base jumped to over 6,800 as of its most recent quarter (which ended on July 31). That's up from around 5,000 one year ago, an increase of 36% year over year.

What's more, Snowflake doesn't simply garner attention from potential customers; it's caught the eye of arguably the world's most famous investor, Warren Buffett. Berkshire Hathaway owns over 6 million shares of Snowflake, now valued near $1.2 billion.

The analyst community agrees. Wall Street expects Snowflake to record 70% revenue growth for the current fiscal year (which ends Jan. 31, 2023) and 51% the following year.

Investors looking to add a hypergrowth stock need look no further. Snowflake is a name to own now -- and for years to come.

Jake Lerch has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Berkshire Hathaway (B shares), and Snowflake Inc. The Motley Fool recommends the following options: long January 2023 $200 calls on Berkshire Hathaway (B shares), short January 2023 $200 puts on Berkshire Hathaway (B shares), and short January 2023 $265 calls on Berkshire Hathaway (B shares). The Motley Fool has a disclosure policy.

Read this article:

Got $5,000? These Are 2 of the Best Growth Stocks to Buy Right Now - The Motley Fool

Decoding the Future Trajectory of Healthcare with AI – ReadWrite

Artificial Intelligence (AI) is becoming increasingly sophisticated in its applications, delivering greater efficiency and speed at a lower cost. Every sector has been reaping the benefits of AI in recent times, and the healthcare industry is no exception. Here is a look at the future trajectory of healthcare with AI.

The impact of artificial intelligence on the healthcare industry, through machine learning (ML) and natural language processing (NLP), is transforming care delivery. Additionally, patients are expected to gain far greater access to their health-related information than before, through applications such as smart wearable devices and mobile electronic medical records (EMR).

Personalized healthcare will empower patients to take the wheel of their own well-being, facilitate high-end healthcare, and extend better patient-provider communication to underprivileged areas.

For instance, IBM Watson for Health is helping healthcare organizations apply cognitive technology to unlock vast amounts of health data and power diagnosis.

In addition, Google's DeepMind Health is collaborating with researchers, clinicians, and patients to solve real-world healthcare problems. The company has combined systems neuroscience with machine learning to develop strong general-purpose learning algorithms within neural networks that mimic the human brain.

Companies are working to develop AI technology that solves existing challenges, especially within the healthcare space. A strong focus on funding and starting AI healthcare programs played a significant role in Microsoft Corporation's decision to launch AI for Health, a five-year, US$40 million program, in January 2019.

The Microsoft program will use artificial intelligence tools to tackle some of the greatest healthcare challenges, including global health crises, treatment, and disease diagnosis. Microsoft has also ensured that academia, non-profits, and research organizations have access to this technology, technical experts, and resources to leverage AI for care delivery and research.

In January 2020, these factors influenced Takeda Pharmaceutical Company and MIT's School of Engineering to join hands for three years to drive innovation and the application of AI in the healthcare industry and drug development.

AI applications center on three main investment areas: diagnostics, engagement, and digitization. With rapid advancements in technology, there are exciting breakthroughs in incorporating AI into medical services.

The most interesting aspect of AI is robots. Robots are not only replacing trained medical staff in some roles but also making them more efficient in several areas. Robots help control costs while potentially providing better care and performing accurate surgery in tight spaces.

China and the U.S. have started investing in the development of robots to support doctors. In November 2017, a robot in China passed a medical licensing exam using only an AI brain. China was also home to the first semi-automated operating robot, used to suture blood vessels as fine as 0.03 mm.

In order to prevent the coronavirus from spreading, American doctors are relying on a robot that can measure a patient's actions and vitals. Robots are also being used for recovery and consulting assistance, and as transport units. These robots are showing significant potential to revolutionize medical procedures in the future.

Precision medicine is an emerging approach to disease prevention and treatment. The precision medicine approach allows researchers and doctors to predict more accurate treatment and prevention strategies.

The advent of precision medicine technology has allowed healthcare to actively track patients' physiology in real time, capture multi-dimensional data, and create predictive algorithms that use collective learnings to calculate individual outcomes.

In recent years, there has been an immense focus on enabling direct-to-consumer genomics. Now, companies are aiming to create patient-centric products spanning the digitization processes and genomics involved in ordering complex testing in clinics.

In January 2020, ixLayer, a start-up based in San Francisco, launched a first-of-its-kind precision health testing platform to enhance the delivery of diagnostic testing and simplify the complex relationship among physicians, precision health tests, and patients.

Personal health monitoring is a promising example of AI in healthcare. With the emergence of advanced AI and the Internet of Medical Things (IoMT), demand for consumer-oriented products such as smart wearables for monitoring well-being is growing significantly.

Owing to the rapid proliferation of smart wearables and mobile apps, enterprises are introducing varied options to monitor personal health.

In October 2019, Gali Health, a health technology company, introduced its Gali AI-powered personal health assistant for people suffering from inflammatory bowel diseases (IBD). It offers health tracking and analytical tools, medically-vetted educational resources, and emotional support to the IBD community.

Similarly, start-ups are also coming forward with innovative devices integrated with state-of-the-art AI technology to contribute to the growing demand for personal health monitoring.

In recent years, AI has been used in numerous ways to support medical imaging of all kinds. At present, the biggest use for AI is to assist in the analysis of images and perform single, narrow recognition tasks.

In the United States, AI is considered highly valuable in enhancing business operations and patient care. It has the greatest impact on patient care by improving the accuracy of clinical outcomes and medical diagnosis.

The strong presence of leading market players in the country is bolstering demand for medical imaging in hospitals and research centers.

In January 2020, Hitachi Healthcare Americas announced a new dedicated R&D center in North America that will leverage advancements in machine learning and artificial intelligence to bring about the next generation of medical imaging technology.

With a plethora of issues driven by the growing rate of chronic disease and an aging population, the need for innovative new solutions in the healthcare industry is on the upswing.

Unleashing AI's complete potential in the healthcare industry is not an easy task. Healthcare providers and AI developers together will have to tackle all the obstacles on the path toward integrating new technologies.

Clearing all the hurdles will require a combination of technological refinement and shifting mindsets. As the AI trend becomes more deep-rooted, it is giving rise to ubiquitous discussions: Will AI replace doctors and medical professionals, especially radiologists and physicians? The answer is that it will increase the efficiency of medical professionals.

Initiatives by IBM Watson and Google's DeepMind will soon unlock critical answers. However, while AI aims to mimic the human brain in healthcare, human judgment and intuition cannot be substituted.

Even though AI is augmenting the existing capabilities of the industry, it is unlikely to fully replace human intervention. An AI-skilled workforce will displace only those who don't want to embrace the technology.

Healthcare is a dynamic industry with significant opportunities. However, uncertainty, cost concerns, and complexity are making it an unnerving one.

The best opportunity for healthcare in the near future is hybrid models, in which clinicians and physicians are supported in treatment planning, diagnosis, and identifying risk factors. Also, with the growing geriatric population and the rise of health-related concerns across the globe, the overall burden of disease management has increased.

Patients are also expecting better treatment and care. With growing innovation in the healthcare industry around improved diagnosis and treatment, AI has gained acceptance among patients and doctors.

In order to develop better medical technology, entrepreneurs, healthcare service providers, investors, policy developers, and patients are coming together.

These factors point to a brighter future for AI in the healthcare industry. It is extremely likely that there will be widespread use of, and massive advancements in, AI-integrated technology in the next few years. Moreover, healthcare providers are expected to invest in adequate IT infrastructure solutions and data centers to support new technological development.

Healthcare companies should continually integrate new technologies to build strong value and keep patients' attention.

-

The insights presented in the article are based on a recent research study on the Global Artificial Intelligence in Healthcare Market by Future Market Insights.

Abhishek Budholiya is a tech blogger, digital marketing pro, and has contributed to numerous tech magazines. Currently, as a technology and digital branding consultant, he offers his analysis on the tech market research landscape. His forte is analysing the commercial viability of a new breakthrough, a trait you can see in his writing. When he is not ruminating about the tech world, he can be found playing table tennis or hanging out with his friends.

See more here:

Decoding the Future Trajectory of Healthcare with AI - ReadWrite

Google’s AI subsidiary turns to blockchain technology to track UK health data – The Verge

Forays by Google subsidiary DeepMind Health into the UK's medical institutions have been characterized by two major themes: first, amazing results powered by cutting-edge AI; and second, a lack of transparency over the handling of the UK's publicly funded data. With the science going swimmingly, DeepMind Health is focusing more than ever on reassuring UK citizens that their medical records are in safe hands. Its latest plan is a public ledger that shows what data it's using, when, and for which purposes.

The initiative is called the Verifiable Data Audit, and it was announced this week in a blog post written by DeepMind co-founder Mustafa Suleyman and the company's head of security and transparency, Ben Laurie. The audit technology is not yet in place, but it would keep a publicly accessible record of every time DeepMind accesses hospital data, using technology related to the blockchain.

"Each time there's any interaction with data, we'll begin to add an entry to a special digital ledger," write Suleyman and Laurie. "That entry will record the fact that a particular piece of data has been used, and also the reason why: for example, that blood test data was checked against the NHS national algorithm to detect possible acute kidney injury."

Like blockchain technologies, this information will be append-only: it can't be edited after the fact or deleted. It will also make use of cryptographic proofs that will allow outside experts to verify the integrity of the data. Unlike most blockchain systems, though, the ledger won't be distributed among the public, but stored by a number of entities, including data processors like DeepMind Health and healthcare providers. The company says this won't impede the verification process and that the choice was made to make the ledger more efficient. Blockchain systems (including Bitcoin) that are distributed among multiple players take a lot of power to compile and check, as much as a small country uses, according to some estimates.
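
DeepMind has not published the audit code, so the following Python sketch only illustrates the general idea of an append-only, hash-chained ledger; the class and entry fields are invented for the example and are not the actual Verifiable Data Audit implementation.

    import hashlib, json

    class AppendOnlyLedger:
        """Toy hash chain: each entry commits to the previous one, so any
        after-the-fact edit or deletion breaks verification."""
        def __init__(self):
            self.entries = []

        def add(self, record):
            prev = self.entries[-1]["hash"] if self.entries else "0" * 64
            body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
            digest = hashlib.sha256(body.encode()).hexdigest()
            self.entries.append({"record": record, "prev": prev, "hash": digest})

        def verify(self):
            prev = "0" * 64
            for e in self.entries:
                body = json.dumps({"record": e["record"], "prev": prev}, sort_keys=True)
                if e["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
                    return False
                prev = e["hash"]
            return True

    ledger = AppendOnlyLedger()
    ledger.add({"data": "blood test", "reason": "checked against AKI algorithm"})
    print(ledger.verify())   # True; tampering with any stored entry makes this False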

Speaking to The Guardian, Nicola Perrin of the Wellcome Trust said the technology should create a robust audit trail for public health data managed by DeepMind. "One of the main criticisms about DeepMind's collaboration with the Royal Free [Hospital Trust] was the difficulty of distinguishing between uses of data for care and for research," said Perrin. "This type of approach could help address that challenge, and suggests they are trying to respond to the concerns." DeepMind Health says it wants to implement the first pieces of the audit later this year.

Originally posted here:

Google's AI subsidiary turns to blockchain technology to track UK health data - The Verge

AI startup investment is on pace for a record year – TechCrunch

The startup investing market is crowded, expensive and rapid-fire today as venture capitalists work to preempt one another, hoping to deploy funds into hot companies before their competitors. The AI startup market may be even hotter than the average technology niche.

This should not surprise.

In the wake of the Microsoft-Nuance deal, The Exchange reported that it would be reasonable to anticipate an even more active and competitive market for AI-powered startups. Our thesis was that after Redmond dropped nearly $20 billion for the AI company, investors would have a fresh incentive to invest in upstarts with an AI focus or strong AI component; exits, especially large transactions, have a way of spurring investor interest in related companies.

That expectation is coming true. Investors The Exchange reached out to in recent days reported a fierce market for AI startups.


But don't presume that investors are simply falling over one another to fund companies betting on a future that may or may not arrive. Per a Signal AI survey of 1,000 C-level executives, nearly 92% thought that companies should lean on AI to improve their decision-making processes. And 79% of respondents said that companies are already doing so.

The gap between the two numbers implies that there is space in the market for more corporations to learn to lean on AI-powered software solutions, while the first metric belies a huge total addressable market for startups constructing software built on a foundation of artificial intelligence.

Now deep in the second quarter, we're diving back into the AI startup market this morning, leaning on notes from Blumberg Capital's David Blumberg, Glasswing Ventures' Rudina Seseri, Atomico's Ben Blume and Jocelyn Goldfein of Zetta Venture Partners. We'll start by looking at recent venture capital data regarding AI startups and dig into what VCs are seeing in both the U.S. and European markets, before chatting about applied AI versus core AI and in which context VCs might still care about the latter.

The exit market for AI startups is more than just the big Microsoft-Nuance deal. CB Insights reports that four of the largest five American tech companies have bought a dozen or more AI-focused startups to date, with Apple leading the pack with 29 such transactions.

See the original post here:

AI startup investment is on pace for a record year - TechCrunch

AI is learning when it should and shouldn't defer to a human – MIT Technology Review

The context: Studies show that when people and AI systems work together, they can outperform either one acting alone. Medical diagnostic systems are often checked over by human doctors, and content moderation systems filter what they can before requiring human assistance. But algorithms are rarely designed to optimize for this AI-to-human handover. If they were, the AI system would only defer to its human counterpart if the person could actually make a better decision.

The research: Researchers at MIT's Computer Science and AI Laboratory (CSAIL) have now developed an AI system to do this kind of optimization based on the strengths and weaknesses of the human collaborator. It uses two separate machine-learning models: one makes the actual decision, whether that's diagnosing a patient or removing a social media post, and one predicts whether the AI or the human is the better decision maker.

The latter model, which the researchers call the "rejector," iteratively improves its predictions based on each decision maker's track record over time. It can also take into account factors beyond performance, including a person's time constraints or a doctor's access to sensitive patient information not available to the AI system.
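
The paper's training procedure is more involved, but the two-model structure described above can be sketched roughly as follows; the toy data, features and "human is better" labels here are invented purely for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))      # toy case features
    y = (X[:, 0] > 0).astype(int)      # toy labels the AI must predict

    # Track record: 1 where the human historically outperformed the AI
    # (simulated here; in the real system this comes from logged decisions).
    human_better = (X[:, 1] > 1).astype(int)

    classifier = LogisticRegression().fit(X, y)            # makes the decision
    rejector = LogisticRegression().fit(X, human_better)   # picks the decision maker

    def route(x):
        """Defer only when the rejector predicts the human will do better."""
        if rejector.predict(x.reshape(1, -1))[0] == 1:
            return "defer_to_human"
        return int(classifier.predict(x.reshape(1, -1))[0])

    print(route(X[0]))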

See the original post here:

AI is learning when it should and shouldn't defer to a human - MIT Technology Review

Google DeepMind Shows That AI Can Have Killer Instincts – Futurism

Red and Blue

Concerns over artificial intelligence (AI) have been around for some time now, and thanks to a new study by Google's DeepMind research lab, it seems that this Terminator-esque future of intelligent machines may not be so farfetched.

Using games, a platform that Google's DeepMind AI is terribly familiar with, researchers have been testing whether neural networks are more likely to cooperate or compete, and whether these AIs are capable of understanding the motivations behind that choice.

For the research, they used two games with similar scenarios for two AI agents, red and blue.

In the first game, the agents were tasked with gathering the most apples (green) in a basic 2D graphical environment. The agents were given the option to tag one another with a laser blast that temporarily removed them from the game. After running the scenario a thousand times, the researchers found that the agents were willing to cooperate when apples were abundant, but turned on each other when the stakes were higher.

The researchers also noticed that in a smaller network, the agents were more likely to cooperate, whereas in a larger, more complex network, the AI agents were quicker to sabotage one another.

In the second scenario, a game called Wolfpack, the agents played as wolves tasked with capturing prey. When the wolves were close in proximity during a successful capture, the rewards offered were greater. Instead of going lone wolf, the agents were thus incentivized to work together.

In a larger network, the agents were quicker to understand that cooperation was the way to go.
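
The incentive structure behind that result is simple to state: the capture reward grows with the number of wolves near the prey. A toy Python version of such a payoff rule follows; the radius and reward constants are invented, since the article does not give DeepMind's actual values.

    def wolfpack_reward(wolf_positions, prey_position, capture_radius=2.0,
                        base_reward=1.0, proximity_bonus=0.5):
        """Reward a capture more when more wolves are within the capture
        radius of the prey, so cooperation pays."""
        def dist(a, b):
            return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
        nearby = sum(1 for w in wolf_positions
                     if dist(w, prey_position) <= capture_radius)
        if nearby == 0:
            return 0.0               # no wolf close enough: no capture, no reward
        return base_reward + proximity_bonus * (nearby - 1)

    # Two wolves near the prey earn more than a lone wolf would.
    print(wolfpack_reward([(0, 0), (1, 1)], (0.5, 0.5)))   # 1.5
    print(wolfpack_reward([(0, 0), (9, 9)], (0.5, 0.5)))   # 1.0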

The Google researchers hope that the study can lead to AI being better at working with other AI in situations with imperfect information. "As such, the most practical application of this research, in the short term, is to be able to better understand and control complex multi-agent systems such as the economy, traffic systems, or the ecological health of our planet, all of which depend on our continued cooperation," the study says.

At the very least, the study shows that AI are capable of working together and that AI can make selfish decisions.

Joel Leibo, the lead author of the paper, outlined the next steps in an interview with Bloomberg: "Going forward it would be interesting to equip agents with the ability to reason about other agents' beliefs and goals."

Original post:

Google DeepMind Shows That AI Can Have Killer Instincts - Futurism

As AI startups focus on time-to-market, ethical considerations should be the priority – SmartCompany.com.au

A girl making friends with a robot at Kuromon Market in Osaka. Source: Andy Kelly/Unsplash.

Artificial intelligence (AI) has clearly emerged as one of the most transformational technologies of our age, with AI already prevalent in our everyday lives. Among many fascinating uses, AI has helped explore the universe, tackle complex and chronic diseases, formulate new medicines, and alleviate poverty.

As AI becomes more widespread over the next decade, like many, I believe we will see more innovative and creative uses.

Indeed, 93% of respondents in ISACA's Next Decade of Tech: Envisioning the 2020s study believe the augmented workforce (or people, robots and AI working closely together) will reshape how some or most jobs are performed in the 2020s.

Social robots that assist patients with physical disabilities, manage elderly care and even educate our children are just some of the many uses being explored.

As AI continues to redefine humanity in various ways, ethical consideration is of paramount importance, and as Australians, we should be addressing it in government and business. ISACA's research highlights the double-edged nature of this budding technology.

Only 39% of respondents in Australia believe that enterprises will give ethical considerations around AI and machine learning sufficient attention in the next decade to prevent potentially serious unintended consequences in their deployments. Respondents specifically pinpointed malicious AI attacks involving critical infrastructure, social engineering and autonomous weapons as their primary fears.

These concerns are quite disturbing, although not surprising, given the early warnings about these risks that have long been sounded.

For instance, in February 2018, prominent researchers and academics published a report about the increasing possibilities that rogue states, criminals, terrorists and other malefactors could soon exploit AI capabilities to cause widespread harm.

And in 2017, the late physicist Stephen Hawking cautioned that the emergence of AI could be "the worst event in the history of our civilization" unless society finds a way to control its development.

To date, no industry standards exist to guide the secure development and maintenance of AI systems.

Further exacerbating this lack of standards is the fact that startup firms still dominate the AI market. An MIT report revealed that, other than a few large players such as IBM and Palantir Technologies, AI remains a market of 2,600 startups. The majority of these startups are primarily focused on rapid time to market, product functionality and high return on investments. Embedding cyber resilience into their products is not a priority.

Malicious AI programs have surfaced much quicker than many pundits anticipated. A case in point is the proliferation of deep fakes: ostensibly realistic audio or video files generated by deep learning algorithms or neural networks to perpetrate a range of malevolent acts, such as fake celebrity pornographic videos, revenge porn, fake news, financial fraud, and a wide range of other disinformation tactics.

Several factors underpinned the rise of deep fakes, but a few stand out.

First is the exponential increase of computing power combined with the availability of large image databases. Second, and probably the most vexing, is the absence of coherent efforts to institute global laws to curtail the development of malicious AI programs. Third, social media platforms, which are being exploited to disseminate deep fakes at scale, are struggling to keep up with the rapidly maturing and evasive threat.

Unsurprisingly, the number of deep fake videos published online has doubled in the past nine months to almost 15,000 cases, according to DeepTrace, a Netherlands-based cybersecurity group.

It's clear that addressing this growing threat will prove complex and expensive, but the task is pressing.

The ACCC Digital Platforms Inquiry report highlighted the risk of consumers being exposed to serious incidents of disinformation. Emphasising the gravity of the risk is certainly a step in the right direction, but more remains to be done.

Currently, there is no consensus globally on whether the development of AI requires its own dedicated regulator or specific statutory regime.

Ironically, the role of the auditor and IT auditor is a function that AI is touted as being able to eliminate. This premise would make for a good Hollywood script: the very thing requiring ethical consideration and regulation becomes the regulator.

Government, enterprises and startups need to be mindful of the key risks that are inherent in AI adoption, conduct appropriate oversight, and develop principles and regulation that articulate the roles that can be partially or fully automated today to secure the future of humanity and business.

Until then, AI companies need to embed protocols and cyber security into their inventions to prevent malicious use.

NOW READ: Expert warns artificial intelligence will have a huge impact on small businesses but won't take your job just yet

NOW READ: Why artificial intelligence in Australia needs to get ethical

Read the rest here:

As AI startups focus on time-to-market, ethical considerations should be the priority - SmartCompany.com.au

AI file extension – Open, view and convert .ai files

The .ai file extension is associated with Adobe Illustrator, the well-known vector graphics editor for the Macintosh and Windows platforms.

The AI file format is widely used for the exchange of 2D objects. Basic files in this format are simple to write, but files created by applications implementing the full AI specification can be quite large and complex, and may be too slow to render.

Simple *.ai files are easy to construct, and a program can create files that can be read by any AI reader or printed by any PostScript printer software. Reading AI files is another matter entirely: certain operations may be very difficult for a rendering application to implement or simulate. In light of this, developers often choose not to render the image from the PostScript-subset line data in the file. However, almost all of the image can usually be reconstructed using simple operations, without a full implementation of the PostScript language.

*.ai files consist of a series of ASCII lines, which may be comments, data, commands, or combinations of commands and data. This data is based on the PDF language specification; older versions of Adobe Illustrator used a format that is a variant of the Adobe Encapsulated PostScript (EPS) format.

If EPS is a slightly limited subset of full PostScript, then the Adobe Illustrator AI format is a strictly limited, highly simplified subset of EPS. While EPS can contain virtually any PostScript command that's not on the verboten list, and can include elaborate program flow logic that determines what gets printed when, an AI file is limited to a much smaller number of drawing commands and contains no programming logic at all. For all practical purposes, each unit of "code" in an AI file represents a drawing object. The program importing the AI file reads each object in sequence, start to finish, no detours, no logical side-trips.

MIME: application/postscript
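
Because these files are line-oriented ASCII with PostScript-style "%" comments, inspecting one is straightforward. Here is a minimal Python sketch that prints the header comments of an older, EPS-variant .ai file; the file name is hypothetical, and newer PDF-based .ai files would need a PDF parser instead.

    def read_ai_header(path, max_lines=50):
        """Print the leading '%'-prefixed comment lines (e.g. %!PS-Adobe,
        %%Creator, %%BoundingBox) of an EPS-variant .ai file."""
        with open(path, "r", encoding="latin-1", errors="replace") as f:
            for i, line in enumerate(f):
                if i >= max_lines:
                    break
                if line.startswith("%"):
                    print(line.rstrip())
                elif line.strip():
                    break            # first non-comment content ends the header

    read_ai_header("drawing.ai")     # hypothetical file name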

Read the rest here:

AI file extension - Open, view and convert .ai files

Mendel.ai nabs $2 million to match cancer patients with the latest … – TechCrunch

Dr. Karim Galil was tired. He was tired of losing patients to cancer. He was tired of messy medical records. And he was tired of trying to stay on top of the avalanche of clinical trials touting one solution or another. Losing both patience and too many patients, Galil decided to create an organized and artificially intelligent system to match those under his care with the best diagnostic and treatment methods available.

He called his new system Mendel.ai after Gregor Mendel, the father of modern genetics science, and has just raised $2 million in seed funding from DCM Ventures, Bootstrap Labs and Launch Capital to get the project off the ground.

Mendel.ai is similar in many ways to the U.K.-based BenevolentBio, which is focused on skimming through scientific papers to find the latest in cutting-edge medical research. But rather than relying on keyword data, Mendel.ai uses an algorithm that understands the unstructured, natural-language content within medical documents pulled from clinicaltrials.gov, and then compares it to a patient's medical record. The search process returns a fully personalized match and evaluates the patient's eligibility for each suggested treatment within minutes, according to Galil.
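
Mendel.ai's actual algorithm is proprietary, but the matching step it describes can be pictured as scoring each trial's eligibility criteria against facts extracted from the patient's record. The deliberately simplified Python sketch below uses invented fields and thresholds; a real system would parse free-text criteria with NLP rather than compare pre-structured values.

    # Toy eligibility matcher with pre-structured criteria (illustration only).
    patient = {"diagnosis": "lung cancer", "age": 62, "ecog": 1}

    trials = [
        {"id": "NCT-A", "diagnosis": "lung cancer", "min_age": 18, "max_age": 75, "max_ecog": 2},
        {"id": "NCT-B", "diagnosis": "lung cancer", "min_age": 18, "max_age": 60, "max_ecog": 1},
        {"id": "NCT-C", "diagnosis": "breast cancer", "min_age": 18, "max_age": 80, "max_ecog": 2},
    ]

    def eligible(patient, trial):
        return (patient["diagnosis"] == trial["diagnosis"]
                and trial["min_age"] <= patient["age"] <= trial["max_age"]
                and patient["ecog"] <= trial["max_ecog"])

    matches = [t["id"] for t in trials if eligible(patient, t)]
    print(matches)   # ['NCT-A']: age rules out NCT-B, diagnosis rules out NCT-C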

The startup could prove useful for doctors who increasingly find it difficult to keep up with the exhaustive amount of clinical data.

Patients are also overwhelmed at the prospect of combing through mountains of clinical trial research. A lung cancer patient, for example, might find 500 potential trials on clinicaltrials.gov, each of which has a unique, exhaustive list of eligibility criteria that must be read and assessed, says Galil. As this pool of trials changes each week, it is humanly impossible to keep track of all good matches.

Mendel.ai seeks to reduce the time it takes and thus save more lives. The company is now integrating with the Comprehensive Blood & Cancer Center (CBCC) in Bakersfield, Calif., which will allow the center's doctors to match their patients with available clinical trials in a matter of minutes, according to Galil.

The plan going forward is to work with hospitals and cancer genomics companies like the CBCC to improve Mendel.ai and introduce the system. A more immediate goal, Galil says, is to challenge IBM's Watson against his system to see which one can match patients better.

"This is the difference between someone dying and someone living. It's not a joke," Galil told TechCrunch.

See the original post:

Mendel.ai nabs $2 million to match cancer patients with the latest ... - TechCrunch

What you need to know about data fluency and federated AI – Healthcare IT News

Sharecare is a digital health company that offers an artificial intelligence-powered mobile app for consumers. But it has a strong viewpoint on AI and how it is used.

Sharecare believes that while other companies use augmented analytics and AI to understand data with business intelligence tools, they are missing out on the benefits of data fluency and federated AI. By using federated AI and data fluency, Sharecare says it digs deeper to find hidden similarities in the data that business intelligence tools would not be able to detect in health settings.

To gain a deeper understanding of data fluency and federated AI, Healthcare IT News sat down with Akshay Sharma, executive vice president of artificial intelligence at Sharecare, for an in-depth interview.

Q: What exactly is federated AI, and how is it different from any other form of AI?

A: Federated AI, or federated learning, guarantees that the user's data stays on the device. For example, the applications that run specific programs on the edge of the network can still learn how to process the data and build better, more efficient models by sharing a mathematical representation of key clinical features, not the data.

Traditional machine learning requires centralizing data to train and build a model. However, with edge AI and federated learning combined with other privacy-preserving techniques and zero trust infrastructure, it's possible to build models in a distributed data setup while lowering the risk of any single point of attack.

The application of federated learning also applies in cloud settings where the data doesn't have to leave the systems on which it exists but can allow for learning. We call this federated cloud learning, which organizations can use to collaborate, keeping the data private.
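
In code, "share the model, not the data" reduces to exchanging parameters instead of records. The minimal federated-averaging sketch below uses invented sites, data and a least-squares model for illustration; it does not describe Sharecare's actual stack.

    import numpy as np

    def local_update(weights, X, y, lr=0.1, epochs=5):
        """Each site trains on its own data; only weights leave the device."""
        w = weights.copy()
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
            w -= lr * grad
        return w

    def federated_average(weight_list):
        """The server averages site models without seeing any raw records."""
        return np.mean(weight_list, axis=0)

    rng = np.random.default_rng(1)
    global_w = np.zeros(3)
    sites = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]

    for _ in range(10):                          # a few federated rounds
        local_ws = [local_update(global_w, X, y) for X, y in sites]
        global_w = federated_average(local_ws)

    print(global_w)   # model learned across four silos with no data pooling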

Q: What is data fluency, and why is it important to AI?

A: Data fluency is a framework and set of tools to rapidly unlock the value of clinical data by having every key stakeholder participate simultaneously in a collaborative environment. A machine learning environment with a data fluency framework engages clinicians, actuaries, data engineers, data scientists, managers, infrastructure engineers and all other business stakeholders to explore the data, ask questions, quickly build analytics and even model the data.

This novel approach to enterprise data analytics is purpose-built for healthcare to improve workflows, collaboration and rapid prototyping of ideas before spending time and money on building models.

Q: How do data fluency platforms enable analysts, engineers, data scientists and clinicians to collaborate more easily and efficiently?

A: Traditional healthcare systems are very siloed, and many organizations struggle to discover the value within their data and unlock actionable trends and clinical insights. Not only are data creation systems and teams isolated from data transformation systems and teams, but engineers and data scientists use coding languages while clinicians and finance teams use Word or Excel.

The disconnect creates a situation where the data knowledge is translated outside of the programming environment. The transformations between system boundaries are lossy and without feedback loops to improve an algorithm or the code. Yet, all stakeholders need early and iterative access to the data to build health algorithms effectively and with greater transparency.

The modern healthcare stack facilitates the collaboration of cross-functional teams from a single, data-driven point of view in Python notebooks, with a UI for non-engineering partners. AI models can be time-consuming and expensive to build, and it is essential to hedge your bets by getting early prototype input across domains of expertise.

Data fluency provides an environment for critical stakeholders to discover the value on top of the data or insights and in a real-time, agile and iterative way. The feedback from non-engineering teams is immediate and can help improve the underlying model or code in the notebook instantaneously.

Each domain expert can have multiple data views that facilitate deep collaboration and data insight discovery, enabling the continuous learning environment from care to research and from research to care. Data fluency works with cloud-native architectures, and many of the techniques can also automatically extend to computing on edge, where the patient and their data reside.

Q: Why do you say the future of analytics in healthcare is federated AI and data fluency?

A: Traditional analytics in healthcare is rooted in understanding a given set of data by using business intelligence-focused tools. The employees using these tools are not typically engineers but analysts, statisticians and business users.

The problem with traditional enterprise data analytics is that you don't learn from data; you only understand what's in it. To learn from data, you have to bring machine learning into the equation and effective feedback loops from all relevant stakeholders.

Machine learning helps surface hidden patterns in the data, especially if there are non-linear relationships that aren't easily identifiable to humans. Proactive collaboration at the data layer provides transparency into how the models or analytics metrics are built and makes it easier to unravel bias or assumptions and correct them in real time.

Federated AI and data fluency also address the barriers to data acquisition, which are often not technological, but instead include privacy, trust, regulatory compliance and intellectual property. This is especially the case in healthcare, where patients and consumers expect privacy with respect to personal information and where organizations want to protect the value of their data and are also required to follow regulatory laws such as HIPAA in the United States and the GDPR [General Data Protection Regulation] in the Eurozone.

Access to healthcare data is extremely difficult and guarded behind compliance walls. Usually, at best, access is provided to de-identified data with several security measures. Federated AI and the principles of data fluency can share a model without sharing the data used to train it and address these concerns. It will play a critical role in understanding the insights within distributed data silos while navigating with compliance barriers.

The privacy-preserving approach to unlocking the value of health data is crucial to the future of healthcare. The point is to improve healthcare machine learning adoption and understandability to drive actionable insights and better health outcomes. Federated AI goes beyond traditional enterprise data analytics to create a machine learning environment for data fluency and explainability that enables the training of models in parallel from automated multi-omics pipelines.

Twitter: @SiwickiHealthIT. Email the writer: bsiwicki@himss.org. Healthcare IT News is a HIMSS Media publication.

Read more from the original source:

What you need to know about data fluency and federated AI - Healthcare IT News