Category Archives: Cloud Computing
Posted: October 27, 2019 at 2:43 pm
Before the internet, there was ARPANet, an experimental government-funded prototype for a connected communications network of computers.
But before ARPANet, there was another technology called time-sharing. It was a glimpse of our own connected future well before the dawn of the personal computer, the smartphone, or even the web.
Time-sharing was a way of computing where a user would type into a typewriter-like terminal connected to a phone line. That phone line would connect to a larger computer somewhere else in the US. The code typed into the terminal would be transmitted to the larger computer, which would run the code and then send the output back through the phone lines to the user.
The computers that we use today are far more powerful than typewriters, but it's a similar idea when we're typing into a program like Google Docs. We type, the data get sent to Google's servers, and we get a response in the form of words on a digital page. Software has become more complex over the years, requiring more processing power and storage. And while computers have also gotten faster, there remain tasks that are just too difficult for a single computer to run. As a result, technology companies offer cloud computing, where data are crunched or stored by massive computers, which ends up being far more cost-effective for businesses than buying, running, and updating the databases and software on their own.
This change has made cloud computing a hundred-billion-dollar business, and it's only getting bigger. Amazon, Microsoft, IBM, and Google are all vying for the largest slice of providing the infrastructure for the internet of the future. But the idea of cloud computing, which is really just multiple people using the same computer hardware at the same time, has been around since some of the first computer systems.
Through the mid-20th century, scientists worked at transforming the computer from a mechanical machine to an electronic one, shrinking the hardware from the size of a room to something that fit on a desk. But even these early, clunky electronic computers were still only capable of running one person's program at a time, and generally were only found at universities and government research facilities. Everyone else at the research center would have to wait until the current programmer was done, and then reconfigure the computer for their use afterward. This hassle led to the development of a process called time-sharing, where computers could automatically handle a queue of programs to execute one after another.
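The queueing idea can be sketched in a few lines. This is an illustrative simulation, not MIT's actual scheduler; the function and user names are hypothetical:

```python
from collections import deque

def run_time_shared(jobs):
    """Run queued jobs one after another, as a time-sharing system would.

    Each job is a (user, program) pair, where `program` is a callable
    standing in for the code a user submitted from a terminal.
    """
    queue = deque(jobs)
    results = []
    while queue:
        user, program = queue.popleft()
        results.append((user, program()))  # only one program runs at a time
    return results

# Two users share the machine; their jobs execute in submission order.
outputs = run_time_shared([
    ("alice", lambda: 2 + 2),
    ("bob", lambda: sum(range(10))),
])
print(outputs)  # [('alice', 4), ('bob', 45)]
```

The point of the real systems was the same as this toy: the machine, not the humans, manages whose program runs next.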
One of the first projects to tackle time-sharing was MIT's Project MAC, which stood for Multiple Access Computer, according to 1965 MIT graduate and then Project MAC contributor Tom Van Vleck.
"Time-sharing a single computer among multiple users was proposed as a way to provide interactive computing to more than one person at a time, in order to support more people and to reduce the amount of time programmers had to wait for results," he wrote in 2014.
This is essentially the same idea that big tech companies are using today, but the speed and scale has been exponentially increased. Instead of simple mathematical equations among a handful of researchers, billions of lines of code are being run from millions of different users on tens of thousands of servers. These servers are just high-powered computers, custom-built to work together, and still take up entire warehouses, but can accomplish many orders of magnitude more computing than their earlier room-sized ancestors.
The idea of time-sharing and linking computers together in the 1960s would be formative for decades to come. In 1962, J. C. R. Licklider, a director at the US Department of Defense Advanced Research Projects Agency (ARPA), funded Project MAC as a research project to build networked computers, and later directed the project. In 1968, Licklider wrote a paper titled "The Computer as a Communication Device" that would sketch out the basis of the internet and the idea of connecting computers to one another, which influenced the creation of ARPANet itself.
"It appears that the best and quickest way to move forward the development of interactive communities of geographically separated people is to set up an experimental network of multiaccess computers," he wrote, referring to what would become ARPANet, which supplanted the need for time-sharing for computer researchers who had access to the network, and eventually the internet after that.
Time-sharing was one of the only ways to get access to computers before the personal computer was available. In the 1960s, computers were marketed to mathematicians and scientists because of their enormous cost. But with time-sharing, which could be done from long distances over a phone line, the cost of the hardware was distributed across many customers, meaning access could be cheaper. Rather than just running scientific equations or simple programs, time-sharing companies sprang up and started to offer tools that we still use today, says David C. Brock, director of the Software History Center at the Computer History Museum in Mountain View, California.
"It turned out that there were other people interested in office automation, and doing things like payroll and mailings and forms and simple databases," he said, adding that the institutions that at first were only using computers for mathematics found that they could use computers for other tasks as well.
Apple co-founder Steve Wozniak tinkered with time-sharing terminals when he was in high school, and the still-high cost of a terminal to connect to a time-sharing service factored into the design of Apple's first consumer computer, according to an interview with his co-founder, Steve Jobs.
In the years after Licklider's paper, about 150 businesses formed in the US to provide time-sharing services, according to the Computer History Museum. Small, portable typewriter-like terminals with simple computer chips would be rented on a monthly basis, and when plugged into a phone line, these terminals would connect to a large computer elsewhere in the country. Just like today's cloud computing, customers were charged for how much computing power they used.
By 1978, the most well-known time-sharing startup, Tymshare, had a network handling 450,000 sessions per month. At the time, that was larger than ARPANet, the first iteration of what we know as the internet today, which had dozens of connected computers, Brock said.
The 1980s saw the introduction of smaller, more affordable microchips, leading to the era of the personal computer, led by the likes of Apple and IBM. Time-sharing started to feel unnecessary, as many people now had machines at their offices or homes, and didn't need to call into a computer somewhere else to get their work done.
But it wasn't long before the idea of cloud computing sprang up. In 1997, entrepreneur Sean O'Sullivan filed a trademark on the phrase "cloud computing," according to MIT Technology Review. (The trademark is now dead.) O'Sullivan's company was hammering out a contract with PC manufacturer Compaq, where O'Sullivan would provide the software for Compaq's server hardware. The two would in turn sell that technology to burgeoning internet service providers like AOL, which could offer new computing services to their customers.
The first mention of the technology was scribbled in a daily planner, where O'Sullivan wrote, "Cloud Computing: The Cloud has no Borders." Just two weeks later, Compaq predicted that enterprise software, which needed to be directly installed on users' computers, would be usurped by cloud services distributed over the internet, what we now call Software as a Service, or SaaS.
That's the world we live in today. Amazon, Google, Microsoft, and IBM all offer Software as a Service products, like cloud-based accounting tools, file storage, and speech recognition, as well as Infrastructure as a Service, where customers can build their entire software business on the desired tech company's servers. Much of the web is housed within Amazon's servers, which is made painfully clear by huge internet outages when the company experiences malfunctions. The transaction is typically seen as a win-win: it reduces the cost of buying and maintaining servers for the customer, and the cloud provider can make buckets of money by efficiently running thousands of customers' code in parallel in a server farm.
But it all started with time-sharing, and the simple idea that you could book time on someone else's computer.
Posted: at 2:43 pm
In an era of massive data storage requirements, driven by data analytics and even routine operations, cloud storage is now an unquestioned foundation for many businesses.
But while cloud storage offers a big helping hand, it's not without its own complexities. Among them: the data is stored remotely, but we need it right now. Also: do we store all our data with one cloud company? How do we manage a multicloud strategy?
To discuss these issues, I spoke with Ellen Rubin, CEO and Co-Founder of ClearSky Data, a cloud storage-as-a-service company. A veteran tech entrepreneur, Rubin also co-founded CloudSwitch, a cloud software firm that was later acquired by Verizon.
Below are transcribed highlights of my wide-ranging discussion with Ellen Rubin.
So what we're seeing today is that a typical enterprise is not born-in-the-cloud, never-had-a-data-center, but instead has a mix of more traditional legacy infrastructure as well as maybe multiple cloud providers. [They are] dealing with this very extended and complex environment where they have things like VMware and databases, Oracle, SQL, all sitting in more traditional data centers. Which of course they're desperately trying to consolidate and get out of.
But they're still buying more gear, making copies of the data, backing it up, having a disaster recovery site, all the things they used to do, even if they don't have as many [elements] as they used to.
So what you have is almost like an a la carte menu for the people on the business side: where things could be run, and what needs to happen to support the business use cases. And you've got IT in the middle trying to handle and manage it.
And what we're finding is that really, nobody has thought about the fact that what you want is not to just deal with accessing [the data] and pulling things back from the cloud, which is really expensive. What you actually want is: you just want the data to be accessible wherever the compute is running.
One of the hidden truths is how dramatically things are changing in the networking world because of this distributed environment. The need to connect across physical distances while still dealing with latency was something that nobody seemed to be addressing.
So what you see customers doing is they are running lines and they're starting to take advantage of much more modern types of approaches, not just AT&T and Verizon, but some of the more modern on-demand, more API-enabled types of services. But they're still kind of stuck with the fact that latency is a problem that is not going to be solved by them. And they're also dealing with the fact that they're not experts in that.
You can't solve this problem without addressing last mile issues and the fact that the data that's sitting physically in a cloud somewhere far away may not be that great, if your compute needs to be elsewhere or it needs to be distributed.
One of the pleasures of being an entrepreneur, especially because I focus on IT-related infrastructure, is getting to meet with CIOs all the time, across many companies.
And I have to say that this is as challenging and trying a time as I have seen for a lot of the heads of IT and CIOs. Because certainly, the CIOs have shifted into much more of a business type of a role, where they're really meant to focus not just on "how do I reduce all my costs to as little as possible," but "I'm supposed to help promote the digital transformation that the executives are trying to accomplish, and the business units are trying to be agile." And be able to roll things out much more in real time, in this user-based, self-service, everything-on-demand environment.
So that puts incredible pressure [on them]. But it also, at the same time, requires cost reduction, and those are very hard things to do together.
But in a way, the assumption is that this is all supposed to be cheaper, and so it's just a tremendous amount of pressure in terms of the decision makers. And I think that their feeling is that some of the things, they're really, really great at. But other things would be best turned over to an outsourced provider, someone who can act as a trusted advisor or provide services to them. So we're a small part of that ecosystem.
It's not like "keep it all, analyze it all," because rarely is that so beneficial. But it's also really hard to do, right? If you're moving petabytes of data across lines, that's gonna be pretty challenging.
So I think that's what's different. But there is also this explosion of edge data centers. So we spend a lot of time with our partner Equinix as a major data center provider. And they're in all the major metros, but there are also players trying to get even further out to the edge, in cell towers and micro data centers and sites that are really almost just regular buildings, where a lot of the data and analytic catalogs could be captured and processed.
And so what it means is, even though people are consolidating traditional data centers, there's an explosion of more locations where some sort of infrastructure has to exist but it has to be small.
This whole pendulum swings back and forth between centralized and decentralized, and we're in a big decentralization moment, where more and more things can be processed locally and decision making can happen there.
Certainly, IoT is a very extreme example of it and is very impactful for certain industries, but your plain old traditional enterprise decision maker is a little less in the IoT world, maybe. They're more in the world of trying to get more and more access to real-time data that's happening in people's phones, at points of interaction, in local stores, that kind of thing.
And I think it's a very exciting time, but what it does is, yet again, puts a lot of pressure on how much of that are you going to own and build and manage yourself, versus how much can you take advantage of what's out there in the ecosystem from technology providers? The cloud is certainly a huge part of it, but it's not the only part.
We seem to be in a very long cycle right now. We were talking about how it's been 10 years of cloud. And now we've got all this edge stuff going on, and we're just in the emerging few years of that taking off. So I can't say that in the next five years it will have become simpler, because that probably is a little optimistic.
But what I can say is, I am sure that we will see another swing of companies getting much more standardized on a few different key types of products and vendors, for the different types of things they're doing. There'll start to be more consensus.
And I see it by industry vertical: financial services versus healthcare versus the law firms that we service. They have certain things that they do for the different types of work that have to get done. And they all talk to each other, and what starts to emerge is: that's the right way to do it.
So I think we'll have more of that, even, I don't know what to call it, the playbooks, best practices, pick your favorite word. But what it will mean is that there'll be a lot less complexity in terms of the number of things that are all being tried at the same time. So at least that, hopefully, will be a little bit easier.
Posted: at 2:43 pm
Cloud computing has introduced amazing levels of efficiency to businesses. By making company resources available from virtually anywhere, it lets your team stay productive.
Even if your entire team is in a single location, with cloud computing the resources you need are a click away. But one of the areas where cloud computing truly excels is recovering from disasters.
Whether you had a data breach or a natural disaster, if you have the right cloud infrastructure in place, full recovery can be minutes away.
Is your small business using the cloud?
There are so many ways that cloud computing technology can benefit a small business. It truly can be a game-changer for your small business.
Are you storing files in the cloud, or using it for your CRM? What about hosting your website? The cloud now offers a great alternative there, too. There are many options available to you.
So, in this week's poll question, we want to know what role cloud computing plays in your business.
Posted: at 2:43 pm
"Amazon and Google don't have as many existing relationships with large companies as Microsoft," Mr. Keirstead said. Amazon has been furiously hiring sales and marketing staff for AWS, investments that Amazon told investors were a drag on profit last quarter.
"They are in the process of building relationships with the Fortune 500, and Microsoft is one of the largest suppliers of tech to the Fortune 500 already and has been for 20 years," Mr. Keirstead said. "That is powerful."
In the Morgan Stanley survey, Microsoft was seen as the top vendor for an approach called hybrid cloud, which is popular with large organizations. Hybrid cloud computing lets companies use a single set of tools to manage their information across both remote data centers and their own servers, be it close to a store or in internet-connected machinery.
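The "single set of tools" idea behind hybrid cloud can be sketched as a storage interface that treats on-premises and remote backends uniformly. The class and function names here are hypothetical illustrations, not any vendor's actual API:

```python
class LocalStore:
    """On-premises storage: a server the company runs itself."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class CloudStore:
    """Stand-in for a remote data center; a real client would make network calls."""
    def __init__(self):
        self._objects = {}
    def put(self, key, value):
        self._objects[key] = value
    def get(self, key):
        return self._objects[key]

def replicate(key, value, backends):
    """A single call path writes to every backend, on-premises and cloud alike."""
    for backend in backends:
        backend.put(key, value)

local, cloud = LocalStore(), CloudStore()
replicate("invoice-42", b"...", [local, cloud])
print(local.get("invoice-42") == cloud.get("invoice-42"))  # True
```

The appeal for large organizations is exactly this uniformity: one management surface, whether the bytes sit close to a store or in a distant data center.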
A measure of Microsoft's sales for traditional servers and cloud services was up 33 percent excluding currency fluctuations, indicating the approach continued to gain traction.
Microsoft has continued to move customers from its traditional Office suite of products like Outlook and Excel to its cloud-based Office 365 service. In the quarter, Office 365's commercial sales rose 28 percent excluding currency fluctuations. The company said it had more than 200 million monthly active business users on Office 365.
Microsoft says that productivity tools are the hub of where office workers spend their time, and that moving them to the cloud can give companies more up-to-date tools and data to analyze, as well as improved security.
The power of that suite is clear in the fast adoption of Teams, a chat and collaboration tool that competes with Slack and Google Hangouts. While Slack has more functionality, analysts say, Teams has become good enough for companies to adopt. Also, because Microsoft bundles it with other Office 365 products, it is available at a very low cost.
"That is the new user interface," Mr. Weiss said. "That is why Slack is so threatening to Microsoft, and why they are competing so aggressively against it. Microsoft doesn't want you to take that nexus of where you spend your time away from them."
Posted: at 2:43 pm
Cloud computing has transformed the way businesses work and continues to disrupt traditional business models. IDC predicts that by 2023 public cloud spending will more than double, growing from $229 billion this year to nearly $500 billion.
It's no secret that migrating to the cloud can deliver significant cost and efficiency gains. You can spin up cloud instances in minutes and can scale resources up or down as needed. At the same time, you pay only for what you use, while avoiding high upfront hardware costs and maintenance.
Let's not forget: you're storing corporate data on someone else's computer, one that you control but that is still owned by a third party. Even though your cloud service provider's environment is highly secure, what's inside your cloud (applications and data) is your own responsibility.
Cloud computing security is on boardroom agendas, as its impact can have serious consequences for corporate reputation and shareholder value. Data moving to the cloud, beyond the traditional perimeter, has expanded the attack surface. As more and more sensitive information is stored in the cloud, cloud resources will be increasingly targeted by cyber criminals.
As organizations move to the cloud, they will have to assume new responsibilities and develop and adapt processes to combat a multitude of unknown threats.
The secret to better cloud security is assuming that there is no security at all while taking stock of your entire security posture.
There are several elements to public cloud security, and it can be difficult to figure out where to start. If you're already on the cloud or are planning to move to one, here are five best practices you can follow to safeguard your public cloud adoption.
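The five practices themselves aren't enumerated in this excerpt, but one widely recommended habit, verifying the integrity of data you get back from the cloud (since what's inside your cloud is your responsibility), can be sketched with a checksum. The function names and data here are illustrative:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest recorded before the data is uploaded to the cloud."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected_digest: str) -> bool:
    """Check that data retrieved from the cloud matches what was stored."""
    return fingerprint(data) == expected_digest

original = b"quarterly-report-contents"
digest = fingerprint(original)          # kept in your own records
print(verify(original, digest))         # True: retrieval matches
print(verify(b"tampered", digest))      # False: modification is detected
```

The digest lives outside the cloud provider's control, so a silent change to the stored object is caught on retrieval.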
Posted: at 2:43 pm
Today's businesses are increasingly relying on the cloud. Be it to just host their burgeoning data or to drive their end-to-end business operations, the cloud is indispensable. In this post, we look at four major cloud computing models in use by businesses today.
IaaS (Infrastructure as a Service) is the most basic or fundamental cloud computing model.
Businesses relying on IaaS are outsourcing their data storage infrastructure requirements to an external provider. This provider could be a public cloud host, such as Amazon (AWS), Microsoft (Azure), or Google (GCP). It could also be a smaller vendor with a private data center solution.
In any scenario, the IaaS vendor delivers the bare metal, servers, and virtual machines, as well as the day-to-day maintenance and security of a data center, to the customer.
In other words, the customer doesn't have to worry about any capital expenditure (CAPEX). Instead, the client pays a recurring fee for the capacity it needs; if the capacity decreases, so does the fee.
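That capacity-tracking billing model can be sketched in a few lines. The rate and minimum fee below are illustrative, not any real provider's pricing:

```python
def monthly_charge(gb_used: float, rate_per_gb: float, minimum_fee: float = 0.0) -> float:
    """Usage-based IaaS billing: cost tracks capacity, so less usage means a smaller bill.

    A provider may still impose a minimum fee, modeled here as a floor.
    """
    return max(gb_used * rate_per_gb, minimum_fee)

print(monthly_charge(500, 0.02))  # 10.0 for 500 GB at $0.02/GB
print(monthly_charge(200, 0.02))  # 4.0: the fee falls with capacity
```

Contrast this with CAPEX, where the hardware cost is paid up front regardless of how much capacity ends up being used.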
With IaaS, you're just talking about the cloud hosting infrastructure. What the customer hosts on that infrastructure isn't a concern for the provider. So some IaaS users limit their use to just backing up data, while others host their own platforms and software.
Like IaaS, PaaS (Platform as a Service) includes a cloud infrastructure offering for hosting the client's data. However, it builds on that by also offering a development platform/framework/stack on top of the bare metal.
With PaaS, the client can develop, test, and deploy applications in the provider's environment, using the managed cloud service provider's tools and technologies.
The benefit of PaaS is that you can develop applications in an environment that just works, i.e., you know the technologies and cloud platform available to you are compatible.
You wouldn't need to worry about making your software compatible with the hosting hardware, as is the case with IaaS. However, as with IaaS, the vendor manages the maintenance and security of the cloud hosting infrastructure.
In addition, the vendor will also continue improving its technology stack. You don't have to worry about falling behind in terms of technology; your vendor will handle it, equipping you to use the advances to your benefit without worrying about development costs.
Finally, if you're working with a leading PaaS provider, there are usually other businesses and developers on that platform too. In other words, you'll have a community at your disposal in case your developers need support using the provider's technology stack.
With SaaS (Software as a Service), you're essentially relying on an external vendor to provide the cloud infrastructure, the underlying technology stack, and the user-facing application.
In other words, you're using an entirely off-the-shelf and externally hosted application.
Common SaaS solutions include email, customer relationship management (CRM) suites, VoIP and/or messaging (e.g., Zoom, Slack, GoToMeeting, etc.), and many others.
The benefit of SaaS is that there's zero lead time in acquiring a solution. If your business needs a solution, you can get it right away without worrying about hosting or developing it.
If there's a drawback to SaaS, it's generally the fact that the feature selection and release cycle are in the hands of the vendor. For SaaS to work, you must ensure there's alignment between the service's features/capabilities and your needs. If not, you're going to have trouble.
Finally, you can also rely on the cloud for disaster recovery, known as DRaaS (Disaster Recovery as a Service).
This is the task of backing up and, if the need arises (e.g., a cyber security breach), restoring your data. DRaaS should also involve a strong failover element: if a data center in one area or region goes down, the vendor should have your data backed up in another data center.
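The failover idea can be sketched as a restore routine that falls back to the first reachable site holding the backup. The site names and data layout here are hypothetical:

```python
def restore(key, replicas):
    """Return the first available replica's copy of `key`.

    `replicas` is an ordered list of (name, is_up, data) tuples: the primary
    data center first, then secondary sites holding the same backups.
    """
    for name, is_up, data in replicas:
        if is_up and key in data:
            return name, data[key]
    raise RuntimeError(f"no reachable replica holds {key!r}")

backups = {"db-snapshot": b"..."}
site, payload = restore("db-snapshot", [
    ("us-east", False, backups),   # primary region is down
    ("eu-west", True, backups),    # failover target has the same backup
])
print(site)  # eu-west
```

The value a DRaaS vendor adds is keeping those secondary copies current and making this fallback automatic.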
You can also work with third-party firms, such as PCM Canada, to configure and manage your DRaaS system, even if it's stored in a public cloud host like Azure. For example, managed IT services can include a complete DRaaS and business continuity service.
In conclusion, cloud computing is not only essential for today's business operations; businesses also have multiple options for using the cloud. Where IaaS can favour a company with significant developer resources (to create custom frameworks and applications), SaaS enables smaller players to acquire advanced IT capabilities without having to develop, host, or manage them.
Posted: at 2:43 pm
New Report Recognizes Webscale for Record-Breaking Growth, Market-Driven Innovation, and Industry-Leading Customer Satisfaction
SUNNYVALE, Calif., Oct. 24, 2019 (GLOBE NEWSWIRE) -- Webscale, the Digital Cloud Company, announced today that it has been named a Hot Cloud Computing Startup to Watch in Startup50's Big50-2019 Report for the third consecutive year.
Recognized for strong revenue growth and industry-leading customer satisfaction metrics, the company's momentum is mostly attributed to the continuing evolution of its Digital Cloud platform, which provides hyperscale cloud automation, management, and hosting for more than 1,200 storefronts. Offering 100% uptime, 360-degree security, blazing fast performance, advanced DevSecOps automation tools, and access to an award-winning team of multi-cloud-certified experts, Webscale simplifies the management of web applications across multi-cloud environments while significantly enhancing user experience, allowing merchants to focus on running their business, not their infrastructure.
"The startups in this lineup are in a good position to follow in the footsteps of recent Big50 alumni, several of which have already gone on to raise massive rounds of funding, followed by successful exits, whether through IPOs or high-dollar acquisitions," said Jeff Vance, Founder and Editor-in-Chief of Startup50.
"Over the last year, Webscale has doubled its revenue and expanded its customer base to serve businesses across seven countries, including seven companies in the Fortune 1000 and seven in the Internet Retailer Top 500," said Sonal Puri, CEO of Webscale. "Webscale is the only choice for digital commerce merchants looking to migrate their online storefronts to a multi-cloud environment, while significantly enhancing site performance, availability, and security."
Webscale, the Digital Cloud Company, is the leader in converged software for hyperscale cloud automation. Delivered as-a-Service, the Webscale platform allows businesses of all sizes to benefit from infinite scalability, load balancing, high performance, outage prevention, improved security, and simple management in multi-cloud environments, including Amazon Web Services (AWS Advanced Partner), Google Cloud Platform (Google Cloud Platform Partner), and Microsoft Azure (Microsoft Partner Network). Webscale enables digital transformation for B2C, B2B, and B2E e-commerce and enterprise customers in seven countries and for seven of the Fortune 1000 businesses and seven of the Internet Retailer Top 500. The company is headquartered in Sunnyvale, CA, with offices in Boulder, CO, and Bangalore, India.
For more information, visit http://www.webscale.com. Follow us on LinkedIn, Twitter, and Facebook.
Andrew Humber, Webscale, pr@webscale.com, +1 (408) 416 7943
Posted: at 2:43 pm
A great magic act leaves an audience clutching at straws, desperately trying to work out how the supposedly unimaginable just happened.
But there are some tricks that don't leave us guessing. If a magician were to pull a bunny out of a top hat, we'd hardly be surprised. The expected is much less impressive. In fact, Christopher Nolan explains this in his film The Prestige, where he breaks a trick down into three parts: the pledge, the turn, and the prestige.
The key here is that the prestige is a moment whereby an audience's expectations are transformed: the unimaginable happens. This is a concept that stretches far beyond the realms of magic and into our everyday lives. For example, in the world of cloud computing, the prestige helps to explain to companies the limitless potential of the cloud. This means moving beyond the obvious (reduced costs, less overhead) and focusing on the benefits the cloud can have for a business when leveraged correctly.
Consider a cloud solution that uses machine learning to identify anomalies and optimise backup, storage, and cost. Based on file extension, age of data, and time since last used, the solution makes recommendations about files that could be deleted, highlighting potential reductions in the cloud bill.
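As a rough, rule-based stand-in for the machine-learning system described (the thresholds, file names, and ages here are hypothetical), the recommendation logic might look like:

```python
def recommend_deletions(files, stale_days=365, junk_extensions=(".tmp", ".log")):
    """Flag files that look safe to delete, using the signals named in the text:
    file extension, age of data, and time since last use.

    Thresholds are illustrative; a real system would learn them from usage.
    """
    recommendations = []
    for name, age_days, days_since_used in files:
        if name.endswith(junk_extensions) or days_since_used > stale_days:
            recommendations.append(name)
    return recommendations

inventory = [
    ("build.log", 30, 2),      # junk extension
    ("report.pdf", 800, 700),  # untouched for roughly two years
    ("model.bin", 400, 10),    # old but actively used: keep
]
print(recommend_deletions(inventory))  # ['build.log', 'report.pdf']
```

A learned model would replace these hand-set rules, but the inputs and the output (a cheaper cloud bill) are the same.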
Furthermore, recovery efficiency is another benefit of cloud solutions. Gone are the days of manually sifting through all backups to find the file in question. When it comes to security, finding a clean version of the file, pre-infection, too often requires multiple cycles of restore-and-test, taking hours of work. An intelligent solution uses machine learning to pinpoint the file in its pristine state, streamlining the process and allowing any business to recover lost data with speed and confidence.
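A minimal sketch of that pinpointing step, assuming backups keyed by date and some check that flags infection (both hypothetical), scans from newest to oldest instead of restoring and testing each copy by hand:

```python
def find_clean_version(backups, is_clean):
    """Scan backups from newest to oldest and return the first clean copy.

    `backups` maps a timestamp string to the file contents at that time;
    `is_clean` stands in for whatever classifier or signature check flags
    an infected version.
    """
    for timestamp in sorted(backups, reverse=True):
        if is_clean(backups[timestamp]):
            return timestamp, backups[timestamp]
    return None

history = {
    "2019-10-01": b"pristine contents",
    "2019-10-15": b"pristine contents v2",
    "2019-10-20": b"contents + MALWARE",
}
when, data = find_clean_version(history, lambda blob: b"MALWARE" not in blob)
print(when)  # 2019-10-15
```

The machine-learning piece in a real product is the `is_clean` judgment; the time saved comes from automating the search over the backup history.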
There are many examples where the cloud, when combined with the right data and enabling technology, can change the experience from commonplace to extraordinary.
There's no question that in the world of technology, even more so than magic, expectations increase rapidly. When the cloud was first introduced, its main differentiating value was enabling companies to access seemingly limitless resources on demand, through APIs, from a remote service. Services such as Amazon S3 object storage and EC2 compute engines brought new levels of availability, reliability, and cost optimisation that the industry had never experienced, let alone thought possible. Quickly though, new applications, features, and advancements were introduced, turning those original innovations into mere table stakes for new vendors looking to join the market.
Enabled by the cloud, things like machine learning and AI environments are becoming a reality and challenging previously set boundaries. The exciting piece about these emerging technologies is that they are not simply add-ons but enablers that will open doors to previously uncharted possibilities.
But ultimately the focus has to be on what the cloud can do for the future of a business. "Go to cloud," everyone says. But getting to the cloud is not a goal; it's a means to an end, for something that delivers business benefit. For most businesses, the normal expectation is that they will move to the cloud and tap into the acknowledged benefits of lower costs, less overhead, autoscaling and other well-advertised cloud capabilities.
To capture the true value of the cloud, companies need to move beyond migrating their current workloads. They must also leverage the unique features of the cloud, from adding in AI and machine learning, to protecting data and then using it in new ways to inform the business, adding benefits that weren't possible prior to the cloud.
Nearly unlimited amounts of data can be stored cost-effectively, and in a simpler fashion, in the cloud compared to on-premises solutions. This is essentially the pledge aspect of cloud data storage. The turn is when data can be ingested straight into the cloud, eliminating the silos that exist in data centres, whilst also removing the requirement to purchase and maintain hardware. Today, the prestige becomes things like automation, AI and machine learning, which help companies use this data in innovative ways.
When it comes to the cloud, technology such as AI and analytics can be used to pull untapped value from pre-existing data. But that isn't the only benefit. With its ability to reduce manual labour by automating previously time-consuming tasks, the cloud enables efficiencies that were never accessible before.
Much as for any aspiring magician, competition in business is rife. To stand out from the crowd and remain competitive, organisations need to utilise the data stored in the cloud. With its power and capabilities, enterprises can make informed decisions, identify new value and insights, and drive their business forward.
W. Curtis Preston is Chief Technologist at Druva. Druva delivers data protection and management for the cloud era. Druva Cloud Platform is built on AWS and offered as-a-Service; customers drive down costs by over 50 percent by freeing themselves from the burden of unnecessary hardware, capacity planning, and software management.
Posted: at 2:43 pm
Software company SAP said today it had agreed a three-year deal with Microsoft to help its large business customers move business processes into the cloud.
The partnership, called Embrace, aims to help clients run operations hosted at remote servers supported by SAP's S/4HANA database.
"We bundled SAP's cloud platform services to support customers around the extension, integration and orchestration of SAP systems," co-chief executive Jennifer Morgan told reporters.
Morgan was appointed as co-chief executive of the company alongside Christian Klein after long-time chief executive Bill McDermott stepped down earlier this month.
Morgan is the first woman to become a chief executive of a company in Germany's blue-chip Dax index.
"Bringing together the power of SAP and Microsoft provides customers with the assurance of working with two industry leaders so they can confidently and efficiently transition into intelligent enterprises," Morgan said.
The company also announced its third-quarter results today, posting revenue of just under $6.8bn (5.3bn), and operating profit of nearly $1.7bn.
The company reiterated its forecast for the year and through to 2023.
Posted: at 2:43 pm
ORLANDO: Corporate network infrastructure is only going to get more complex over the next two to three years as automation, network challenges and hybrid cloud become more integral to the enterprise.
Those were some of the main infrastructure trend themes espoused by Gartner vice president and distinguished analyst David Cappuccio at the research firm's IT Symposium/XPO here this week.
Cappuccio noted that Gartner's look at the top infrastructure and operational trends reflects offshoots of technologies such as cloud computing, automation and networking advances that the company's analysts have talked about many times before.
Gartner's Top Ten Trends Impacting Infrastructure and Operations list is:
Automation has been going on at some level for years, Cappuccio said, but the level of complexity as it is developed and deployed further is what's becoming confusing. The amounts and types of automation need to be managed and require a shift to a team development approach led by an automation architect that can be standardized across business units, Cappuccio said. What would help? Gartner says that by 2025, more than 90 percent of enterprises will have an automation architect, up from less than 20 percent today.
Hybrid IT, which includes a mix of data center, SaaS, PaaS, branch offices, edge computing and security services, makes it hard to promise enterprise resources will be available or backed up, Cappuccio said. Overly simplistic IT disaster recovery plans may only deliver partial success. By 2021, the root cause of 90 percent of cloud-based availability issues will be the failure to fully use cloud service providers' native redundancy capabilities, he said. Enterprises need to leverage their automation investments and other IT tools to refocus how systems are recovered.
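The "failure to use native redundancy" point can be made concrete with a small audit sketch: given each storage bucket's configuration, flag the ones missing provider-native redundancy features such as versioning or cross-region replication. The field names and the `audit_redundancy` function are illustrative assumptions, not a real cloud provider API:

```python
# Hypothetical: the provider-native redundancy features we expect enabled.
REQUIRED = ("versioning", "cross_region_replication")

def audit_redundancy(buckets):
    """buckets: {name: {"versioning": bool, "cross_region_replication": bool}}
    Returns {name: [missing features]} for every bucket that fails the check;
    an empty dict means all buckets pass."""
    findings = {}
    for name, cfg in buckets.items():
        missing = [feat for feat in REQUIRED if not cfg.get(feat, False)]
        if missing:
            findings[name] = missing
    return findings
```

In practice the configuration would be pulled from the provider's management API rather than passed in by hand, but an automated check of this shape is one way to act on Gartner's warning before an availability incident, not after.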
IT's role in many companies has almost become that of a product manager for all its different DevOps teams. IT needs to build consistency across the enterprise because it doesn't want islands of DevOps teams across the company. By 2023, 90 percent of enterprises will fail to scale DevOps initiatives if shared self-service platform approaches are not adopted, Gartner stated.
By 2022, more than 50 percent of enterprise-generated data will be created and processed outside the data center or cloud, up from less than 10 percent in 2019. Infrastructure is everywhere, Cappuccio said, and every time data is moved it creates challenges. How does IT manage data-everywhere scenarios? Cappuccio advocated mandating data-driven infrastructure impact assessment at early stages of design, investing in infrastructure tools to manage data wherever it resides, and modernizing existing backup architectures to protect it there too.
The issue here is that most IoT implementations are not driven by IT, and they typically involve different protocols and vendors that don't usually deal with an IT organization. In the end, who controls and manages IoT becomes an issue, and it creates security and operational risks. Cappuccio said companies need to engage with business leaders to shape IoT strategies and establish a center of excellence for IoT.
The methods of putting cloud services or cloud-like services on-premises but letting a vendor manage that cloud are increasing. Google has Anthos and AWS will soon roll out Outposts, for example, so this environment is going to change a lot in the next two years, Cappuccio said. This is a nascent market, so customers should beware. Enterprises should also be prepared to set boundaries and determine who is responsible for software upgrades, patching and performance.
Humans used to learn about and adapt to technology. Today, technology learns and adapts to humans, Cappuccio said. We have created a world where customers have a serious expectation of perfection. We have designed applications where perfection is the norm. Such systems are great for mindshare, market share and corporate reputation, but as soon as there's one glitch, that's all out the window.
Application development is no longer the realm of specialists. There has been a rollout of simpler development tools, like low-code or no-code packages, and a focus on bringing new applications to market quickly. That may bring a quicker time to market for the business, but could be riskier for IT, Cappuccio said. IT leaders perhaps can't control such rapid development, but they need to understand what's happening.
There are tons of emerging trends around networking, such as mesh, secure access service edge, network automation, network-on-demand services, and firewalls as a service. After decades of focusing on network performance and availability, future network innovation will target operational simplicity, automation, reliability and flexible business models, Cappuccio said. Enterprises need to automate everywhere and balance which technologies are safe versus which are agile, he said.
The general idea here is that CIOs face the challenge of selecting the right mixture of cloud and traditional IT for the organization. The mix of many different elements such as edge, hybrid cloud, workflow and management creates complex infrastructures. Gartner recommends focusing on workflow visualization using an integrated toolset and developing a center of excellence to work on these issues, Cappuccio said.