
Category Archives: Cloud Computing

AI and cloud tech can enhance the "Netflix like experience" for … – TechGraph

Posted: May 8, 2023 at 5:14 pm

Speaking to TechGraph, JPS Kohli, Founder & Chairman of SkillUp Online, said, "In line with the Netflix experience, I believe that AI and cloud technology can provide invaluable insights into learner behavior and preferences."

Read the interview:


TechGraph: Could you help give a sense of how far SkillUp Online has come in its five years of existence?

JPS Kohli: When I started SkillUp in 2016, my 25 years of experience in cloud-based learning, which included being Director of Online Training at Microsoft, revealed a problem in the industry. Though education was shifting from face-to-face learning, which limits numbers, to online self-paced learning, which is available to anyone, anytime, anywhere, completion rates for online courses were very low. I believed the human touch was missing and created SkillUp to solve this problem.

The business has grown based on a learner-centric training methodology that balances a technology-driven ethos with a human-centered approach. We focus on future skills training that keeps professionals relevant in this changing world.

Our course catalog includes fields such as artificial intelligence, data science, machine learning, cloud computing, cybersecurity, etc.

However, we also offer human skills training that enables professionals to develop the leadership, team building, time management, collaboration, and problem-solving skills that businesses desperately need.

To further the SkillUp vision, we then launched our dedicated training platform, SkillUp Online, in 2019. SkillUp Online is a hybrid learning platform that focuses on learning outcomes, namely skills, practical experience, and certifications, and on learner outcomes, where improved skills and practical experience facilitate access to better career opportunities, higher salaries, and future-proof jobs. SkillUp now operates in the US, Europe (based in Portugal), and India. The company has partnered with top organizations, including Microsoft, IBM, and Google Cloud for technical learning content, and NASSCOM and Pacific Lutheran University (USA) for extended reach.

TechGraph: How is SkillUp Online facilitating the entire learning process digitally?

JPS Kohli: The SkillUp Online platform offers a mix of online self-paced learning, peer-to-peer interaction, and online instructor-led sessions. We utilize a flipped learning methodology that blends the best that instructor-led training and self-paced learning have to offer.

This includes high-quality technical training content developed with industry partners such as Microsoft, IBM, and Google, substantial hands-on learning, practical projects, soft skills coaching, and real-world capstone projects. The outcome is that learners develop job-ready tech and human skills aligned with the specific roles employers are desperately seeking to fill. This, in turn, gets learners better jobs, faster, than if they'd trained elsewhere.

Additionally, we also provide a valuable human touch to ensure our learners fly. For example, we offer 1-to-1 online mentoring with subject experts who are there to provide support, spot when learners need a nudge, answer questions, and personalize their learning.

Recent projects with NASSCOM and IBM SkillsBuild are outstanding examples of the success of our approach. Through these initiatives alone, the business has trained over 125,000 learners and achieved 67%+ completion rates. That alone is impressive, but what we're proud of is the fact that we've achieved this by focusing specifically on learner outcomes: higher employability, better salaries, and improved promotion opportunities.


TechGraph: How is the response so far to your online courses?

JPS Kohli: Our focus on career-aligned skills has prompted a very good response. Our learners are building skills that their traditional degree hasn't covered, and they're achieving great results quickly.

But the stats speak for themselves. With 67%+ completion rates, in comparison to MOOC completion rates of 3%-5%, it's clear that the business is not only enabling deep learning at scale but also helping learners to build successful careers quickly.

To build on this success, we've launched a unique collection of cutting-edge programs that enable learners with minimal tech experience to get job-ready and hit the ground running. For example, our popular TechMaster Certificate Programmes cover data science, artificial intelligence, and cloud computing. The strapline for these new programs, "Designed to get you hired!", perfectly encapsulates our focus on learner outcomes. And learners are welcoming the opportunity to invest their time and money to get a better job and a higher salary quickly.

TechGraph: What are the new trends in AI & Machine Learning and the Big Data space?

JPS Kohli: The opportunities appearing through AI, machine learning, and big data are very exciting for edtech. The strides being taken in natural language processing (NLP) now highlight the genuine possibility of AI instructors. That said, at SkillUp the human touch will always remain an important aspect of our offering.

Generative AI is also exploding, with ChatGPT and Bard taking the world by storm. This may soon allow new and up-to-date content to be accessed and presented automatically in courses through machine learning. It will most likely start with text-based updates, but I see no reason why images, video, and even music won't follow shortly. Content will always need to be moderated by a human being, of course, to check its accuracy. However, the process of updating content will be sped up considerably.

Edge computing also deserves a mention here, with AI processes managing applications locally, as opposed to centrally, to reduce latency. And I'm also excited about explainable AI (XAI), where, for example, powerful algorithms will soon provide descriptions and explanations of why decisions were made in a given scenario.

TechGraph: SkillUp Online has been collaborating with different industry experts and technology partners to enhance students' learning experience. Going forward, do you see more such engagements?

JPS Kohli: Absolutely. Learning from industry experts is critical to a learner's outcome once they've completed a course. Ensuring that the right expertise and support are in place is core to the philosophy of SkillUp. Our approach to learning blends traditional expert input with self-paced learning; achieving the right balance is very important to us. We've found that learners need experts to show them how something is done, so they can then go and practice it themselves.

However, in addition to using skilled instructors and industry experts in our programs, we also ensure we provide high-quality technical training content developed with industry partners such as Microsoft, IBM, and Google. Our technology partners enable us to include a lot of practical, hands-on material in our courses.

As learners work through a course, they get their hands dirty working on labs created by our industry partners. Plus, they are supported by proactive 1-to-1 mentoring from SkillUp Online technical experts. And with this approach, we make it easier and quicker for learners to gain the employable skills backed by real practical experience they need to get a better job.

TechGraph: What is the state of online skill development and learning platforms in the Indian market, especially for Gen Z?

JPS Kohli: Gen Z learners in general, including in India, are the digital natives of our world. They intuitively know how to work an app without training. They are happy to consume bite-size chunks of content. And they've got the confidence to keep up with the fast pace of change in a digital environment. Their challenge, though, is that a give-it-a-go attitude isn't enough; they need employable skills.

This is where online courses come to the fore. The young professionals of today are very comfortable learning online via videos, sound bites, and well-defined learning paths. Online courses facilitate quick and easy access to such modular learning. And because, yes, learners still need nudging, skills-building courses that provide good mentoring support are achieving results much faster than traditional training.

Gen Z's predisposition to consume digital content, coupled with the clear need for upskilling in hot fields such as artificial intelligence, data science, and machine learning, is creating a match made in heaven. And this means that learning platforms that don't just shell out certificates, but instead ensure learners are armed with the employable skills and practical experience to get a job quickly, will make the difference.

TechGraph: How do technologies, namely AI, machine learning, and cloud, have relevance in online courses? What will the future look like?

JPS Kohli: People have gotten used to technology making recommendations based on their user profile; Netflix and Amazon are excellent examples. Edtech learners are no different, and they are now expecting this tailored experience in the learning arena too.

In line with the Netflix experience, I believe that AI, machine learning, and cloud technology can provide invaluable insights into learner behavior, preferences, and needs. And by making good use of the power of machine learning, SkillUp is planning to use technology to tailor custom learning plans for individuals and offer them choices in how they learn and how they are supported to learn.

This personalized approach also has another aspect, though. To speed up deep skilling at scale, future-focused learning platforms like SkillUp Online will need to harness AI and machine learning to power continual learning. And I anticipate this will include adaptive learning, where the progression of the course content varies depending on the pace of the learner's uptake.

Generative learning, powered by content created by algorithms akin to ChatGPT, will also soon be working hand in glove with explainable AI (XAI). This will take personalized, continual learning to new heights, with online courses always providing the relevant skills of the moment.

TechGraph: What has the response so far been to code learning courses on your platform?

JPS Kohli: With the explosion of AI and data science across the industry, it's no surprise that our coding courses are popular. A great example is Python. Python is used across all industries, and Python skills learned with SkillUp Online can be applied in many ways. This ensures our learners have transferable skills that will enhance their career opportunities considerably.

Because of this, we've made a point of integrating core Python skills into our flagship TechMaster Certificate portfolio. So not only are learners developing specific Python skills, but depending on the field they are training for, be it AI, data science, cloud computing, etc., they are also learning how to use their Python skills alongside other coding competencies. And this approach ensures they have the skills they need to get hired.



Hyperautomation Market to Grow at CAGR of 16.5% through 2032 … – GlobeNewswire

Posted: at 5:14 pm

Newark, May 08, 2023 (GLOBE NEWSWIRE) -- The Brainy Insights estimates that the hyperautomation market, valued at USD 36.46 billion, will reach USD 168 billion by 2032. Rising demand for hyperautomation solutions that reduce business operating costs is expected to propel the market's growth globally: hyperautomation saves not only time but also energy, labour, and money. For instance, in May 2019 Thermax Limited adopted hyperautomation to replace a manual chemical-mixing process, cutting the task from 40 man-days to 15 and thereby reducing operating costs.
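As a quick sanity check on the headline figures, compounding USD 36.46 billion at 16.5% per year lands close to the forecast USD 168 billion; the ten-year horizon (2022 to 2032) is an assumption here:

```python
# Sanity-check the release's CAGR claim: USD 36.46B growing at 16.5%/yr.
base = 36.46  # starting market size, USD billion (from the release)
cagr = 0.165  # compound annual growth rate (from the release)
years = 10    # assumed horizon: 2022 -> 2032

projected = base * (1 + cagr) ** years
print(round(projected, 1))  # close to the forecast USD 168 billion
```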

Request to Download Sample Research Report - https://www.thebrainyinsights.com/enquiry/sample-request/13435

Report Coverage Details

North America accounts for the largest market share during the forecast period, at 47% of the total market, owing to increasing adoption of these technologies and the entry of new market players in the region. Asia Pacific is expected to be the fastest-growing region over the forecast period, owing to rising investments in IT infrastructure in countries such as India, China, and Japan; increasing demand for cloud computing in these countries has also contributed to the region's growth.

Machine Learning (ML) dominated the market with the largest segment revenue, USD 40 billion in 2022.

Machine learning has dominated the market. Machine learning is a branch of artificial intelligence that primarily uses algorithms and models to uncover critical insights. Rising global adoption of machine learning will therefore boost the overall growth of the hyperautomation market.

IT & Telecom accounted for the largest share of the market, with revenue of USD 45.23 billion in 2022.

The IT & Telecom segment has dominated the hyperautomation market and is also expected to be the fastest-growing segment globally, owing to the increased adoption of Robotic Process Automation (RPA), which simplifies operational tasks and provides long-term revenue-generation opportunities over the forecast period.

Procure Complete Research Report - https://www.thebrainyinsights.com/report/hyperautomation-market-13435

Latest Development:

In April 2022, Juniper Networks entered into a partnership agreement with PP Telecommunication Sdn Bhd (PPTEL). The main objective of the partnership was to provide solutions that strengthen and build PPTEL's growth plan, enabling it to meet the demands of its end users and provide high-quality network communication facilities over the forecast period.

In February 2022, IBM and SAP entered into a partnership agreement to provide consulting services in the area of hyperautomation technology. The agreement also covers hybrid cloud solutions and extends SAP offerings to various regulated and unregulated industries.

Market Dynamics

Drivers: Digitization of traditional manufacturing plants

Rising digitization and automation of traditional manufacturing plants is one of the major factors boosting the growth of the hyperautomation market. To solve complex data problems and minimize manual labour, various organizations have adopted hyperautomation, reducing their operating expenditure (OPEX) and enhancing their productivity and efficiency.

Restraint: Scarcity of skilled workers

With constantly evolving technology, there is a higher demand for skilled and trained professionals who can manage and streamline workflows effectively and efficiently. A lack of skilled workers may therefore hamper the growth of the hyperautomation market over the forecast period.

Opportunity: Increased demand for hyperautomation to lower overall business operating costs

Rising demand for hyperautomation solutions that reduce business operating costs is expected to propel the market's growth globally. As the Thermax example above illustrates, hyperautomation saves not only time but also energy, labour, and money.

Challenge: Higher installation and maintenance costs

High installation and maintenance costs are among the major challenges that organizations face in the current market. With ongoing advancements in hyperautomation and increasing demand for it, companies are now aware of the complex procedural requirements involved. As a result, only large corporate firms are able to bear the high installation and maintenance costs, which largely prevents MSMEs from taking advantage of hyperautomation.

Interested to Procure the Research Report? Inquire Before Buying - https://www.thebrainyinsights.com/enquiry/buying-inquiry/13435

Some of the major players operating in the Hyperautomation market are:

UiPath, Wipro Ltd., Tata Consultancy Services Ltd., Mitsubishi Electric Corporation, OneGlobe LLC, SolveXia, Appian, Automation Anywhere Inc., Allerin Tech Pvt. Ltd., PagerDuty, Inc., and Honeywell International Inc.

Key Segments cover in the market:

By Type:

Biometrics, Machine Learning, Context-Aware Computing, Natural Language Generation, Chatbots, and Robotic Process Automation

By End-User:

BFSI, Retail, IT & Telecom, Education, Automotive, Manufacturing, and Healthcare & Life Science

Have Any Query? Ask Our Experts: https://www.thebrainyinsights.com/enquiry/speak-to-analyst/13435

About the report:

The global hyperautomation market is analysed based on value (USD billion). All the segments have been analysed on a worldwide, regional, and country basis, and the study covers more than 30 countries for each segment. The report offers an in-depth analysis of driving factors, opportunities, restraints, and challenges for gaining critical insight into the market. The study includes Porter's Five Forces analysis, attractiveness analysis, raw material analysis, supply and demand analysis, competitor position grid analysis, and distribution and marketing channels analysis.

About The Brainy Insights:

The Brainy Insights is a market research company, aimed at providing actionable insights through data analytics to companies to improve their business acumen. We have a robust forecasting and estimation model to meet the clients' objectives of high-quality output within a short span of time. We provide both customized (clients' specific) and syndicate reports. Our repository of syndicate reports is diverse across all the categories and sub-categories across domains. Our customized solutions are tailored to meet the clients' requirements whether they are looking to expand or planning to launch a new product in the global market.

Contact Us

Avinash D, Head of Business Development. Phone: +1-315-215-1633. Email: sales@thebrainyinsights.com. Web: http://www.thebrainyinsights.com


UP Board modernises computer learning in schools, introduces basics of AI, drone technology – Organiser

Posted: at 5:14 pm

Students in government schools in Uttar Pradesh will now study e-governance, artificial intelligence, cryptocurrency, drone technology, and information technology (IT) advancements.

According to board secretary Dibyakant Shukla, the Prayagraj-headquartered Uttar Pradesh Madhyamik Shiksha Board has updated the syllabus for classes 9 to 12 in accordance with the National Education Policy 2020 and uploaded it on its official website for the convenience of students. The changes apply specifically to the computer-learning curriculum, which is taught in 28,000 UP Board schools.

The syllabus was revised with the guidance and approval of subject experts. It's a significant change, as it doesn't follow the current course prescribed by the National Council of Educational Research and Training (NCERT). The experts have replaced traditional languages such as C++ and HTML with Python and Java for class 11 and 12 students. This decision was made because HTML and C++ are rarely used in practice these days; instead, Core Java, robotics, and drone technology are introduced in the class 12 syllabus. Class 11 students will study the Internet of Things (IoT), artificial intelligence, blockchain technology, augmented and virtual reality, 3-D printing, and cloud computing.

Apart from HTML and C++, the board has also removed chapters on computer generations, history, and types of computers because of their irrelevance. Class 10 students will study ways to avoid hacking, phishing, and cyber fraud, and will also be taught about artificial intelligence, drone technology, and cyber security. Students will study e-governance as part of their curriculum as well.

Class 9 students will now be taught programming techniques, computer communication, and networking, which class 10 students studied earlier.

Talking about the recent changes in the syllabus, Biswanath Mishra said, "UP Board has made important changes in the syllabus of computer as a subject for students of classes 9 to 12. Students will now be taught modern topics like cryptocurrency, drone technology, artificial intelligence, hacking, phishing and cloud computing. This will prepare them as per the requirements of modern times." He teaches computers at Shiv Charan Das Kanhaiya Lal Inter College, Attarsuiya, Prayagraj.


Banking on Thousands of Microservices – InfoQ.com

Posted: at 5:14 pm


In this article, I aim to share some of the practical lessons we have learned while constructing our architecture at Monzo. We will delve into both our successful endeavors and our unfortunate mishaps.

We will discuss the intricacies involved in scaling our systems and developing appropriate tools, enabling engineers to concentrate on delivering the features that our customers crave.

Our objective at Monzo is to democratize access to financial services. With a customer base of 7 million, we understand the importance of streamlining our processes and we have several payment integrations to maintain.

Some of these integrations still rely on FTP file transfers, many with distinct standards, rules, and criteria.

We continuously iterate on these systems to ensure that we can roll out new features to our customers without exposing the underlying complexities and restricting our product offerings.

In September 2022, we became direct participants in the Bacs scheme, which facilitates direct debits and credits in the UK.

Monzo had been integrated with Bacs since 2017, but through a partner who handled the integration on our behalf.

Last year we built the integration directly over the SWIFT network, and we successfully rolled it out to our customers with no disruption.

This example of seamless integration will be relevant throughout this article.

A pivotal decision was to build all our infrastructure and services on top of AWS, which was unprecedented in the financial services industry at the time. While the Financial Conduct Authority was still issuing initial guidance on cloud computing and outsourcing, we were among the first companies to deploy on the cloud. We have a few data centers for payment scheme integration, but our core platform runs on the services we build on top of AWS with minimal computing for message interfacing.

With AWS, we had the necessary infrastructure to run a bank, but we also needed modern software. While pre-built solutions exist, most rely on processing everything on-premise. Monzo aimed to be a modern bank, unburdened by legacy technology, designed to run in the cloud.

The decision to use microservices was made early on. To build a reliable banking technology, the company needed a dependable system to store money. Initially, services were created to handle the banking ledger, signups, accounts, authentication, and authorization. These services are context-bound and manage their own data. The company used static code generation to marshal data between services, which makes it easier to establish a solid API and semantic contract between entities and how they behave.

Separating entities between different database instances is also easier with this approach. For example, the transaction model holds a unique account identifier, but all other account information lives within the account service, which is called via a Remote Procedure Call (RPC) to retrieve the full account details.
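This reference-plus-RPC pattern can be sketched as follows. The types, store, and function names here are hypothetical illustrations; Monzo's actual services are microservices behind a generated RPC layer, not plain Python functions:

```python
from dataclasses import dataclass

# Sketch of the pattern above: the transaction stores only an account ID;
# full account data is owned by the account service and fetched over RPC
# (modelled here as a plain function call across the service boundary).

@dataclass
class Transaction:
    transaction_id: str
    account_id: str   # reference only -- no account details duplicated here
    amount_pence: int

@dataclass
class Account:
    account_id: str
    owner: str
    currency: str

# Stand-in for the account service's own keyspace.
_ACCOUNT_STORE = {"acc_1": Account("acc_1", "Jane Doe", "GBP")}

def get_account(account_id: str) -> Account:
    """RPC boundary: only the account service reads its own data."""
    return _ACCOUNT_STORE[account_id]

tx = Transaction("tx_42", "acc_1", 350)
account = get_account(tx.account_id)  # resolve the reference via RPC
print(account.owner)  # Jane Doe
```

The semantic contract lives in the generated types, so callers never reach into another service's database directly.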

During the early days of Monzo, before the advent of service meshes, RPC was used over RabbitMQ, which was responsible for load balancing and deliverability of messages, with a request queue and a reply queue.


Figure 1: RabbitMQ in Monzo's early days

Today, Monzo uses HTTP requests: when a customer makes a payment with their card, multiple services get involved in real-time to decide whether the payment should be accepted or declined. These services come from different teams, such as the payments team, the financial crime domain team, and the ledger team.


Figure 2: A customer paying for a product with a card

Monzo doesn't want to build separate account and ledger abstractions for each payment scheme, so many of the services and abstractions need to be agnostic and able to scale independently to handle different payment integrations.

We made the decision early on to use Cassandra as our main database for services, with each service operating under its own keyspace. This strict isolation between keyspaces meant that a service could not directly read data from another service.


Figure 3: Cassandra at Monzo

Cassandra is an open-source NoSQL database that distributes data across multiple nodes based on partitioning and replication, allowing the cluster to grow and shrink dynamically. It is an eventually consistent system with last-write-wins semantics; timestamps and quorum-based reads can be used to achieve stronger consistency.

Monzo set a replication factor of 3 for the account keyspace and queried it at local-quorum consistency: a query reaches out to the three nodes owning the data and returns once a majority of them agree on the result. This approach allowed for a more powerful and scalable database, with fewer issues and better consistency.
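The read path just described can be sketched as follows. This is an illustration of quorum reads with last-write-wins reconciliation, not Cassandra's actual implementation; the replica contents and key names are invented:

```python
# Illustrative quorum read over three replicas with last-write-wins
# semantics. Each replica holds (write timestamp, value); the read
# succeeds once a majority (2 of 3) respond, and the value with the
# highest timestamp wins.

REPLICATION_FACTOR = 3
QUORUM = REPLICATION_FACTOR // 2 + 1  # majority: 2 of 3

replicas = [
    {"balance": (100, "£50")},   # (write timestamp, value)
    {"balance": (105, "£75")},   # most recent write
    {"balance": (100, "£50")},
]

def quorum_read(key, responding_replicas):
    if len(responding_replicas) < QUORUM:
        raise RuntimeError("not enough replicas responded for a quorum read")
    # Last-write-wins: reconcile divergent replicas by timestamp.
    return max(r[key] for r in responding_replicas)[1]

print(quorum_read("balance", replicas[:2]))  # newest value: £75
```

Even when one replica still holds the stale value, the quorum read returns the newest write, which is the "majority agreement" behaviour described above.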

In order to distribute data evenly across nodes and prevent hot partitions, it's important to choose a good partitioning key for your data. However, finding the right partitioning key can be challenging as you need to balance fast access with avoiding duplication of data across different tables. Cassandra is well-suited for this task, as it allows for efficient and inexpensive data writing.

Iterating over the entire dataset in Cassandra can be expensive, and it lacks transactions. To work around these limitations, engineers must be trained to model data differently and adopt patterns like canonical and index tables: data is written to the index tables first and to the canonical table last, so a canonical row signals that the write is fully complete.

For example, when adding a point of interest to a hotel, the data would first be written to the pois_by_hotel table, then to the hotels_by_poi table, and finally to the hotels table as the canonical table.
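A minimal sketch of this write ordering, with the tables from the hotel example modelled as in-memory dicts (the row contents are invented):

```python
# Index-then-canonical write pattern from the hotel example. Writes go
# to the index tables first and the canonical table last, so a row in
# `hotels` implies its index entries already exist.

pois_by_hotel = {}   # index: hotel_id -> set of POI ids
hotels_by_poi = {}   # index: poi_id -> set of hotel ids
hotels = {}          # canonical table

def add_poi_to_hotel(hotel_id, poi_id, hotel_row):
    # 1. index tables first
    pois_by_hotel.setdefault(hotel_id, set()).add(poi_id)
    hotels_by_poi.setdefault(poi_id, set()).add(hotel_id)
    # 2. canonical table last -- this completes the write
    hotels[hotel_id] = hotel_row

add_poi_to_hotel("h1", "eiffel_tower", {"name": "Hotel Rivoli"})
print(hotels_by_poi["eiffel_tower"])  # {'h1'}
```

The ordering matters precisely because there are no multi-table transactions: if the process dies mid-write, the worst case is a dangling index entry, never a canonical row whose indexes are missing.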


Figure 4: Hotel example, with the hard-to-read points-of-interest table

Although scalability is beneficial, it also brings complexity and requires learning how to write data reliably. To mitigate this, we provide abstractions and autogenerated code for our engineers. To ensure highly available services and data storage, we have used Kubernetes since 2016. Although it was still in its early releases, we saw its potential as an open-source orchestrator for application development and operations. We had to become proficient in operating Kubernetes, as managed offerings and comprehensive documentation were unavailable at the time, but our expertise has since paid off immensely.

In mid-2016, the decision was made to switch to HTTP and use Linkerd for service discovery and routing. This improved load balancing and resiliency properties, especially in the event of a slow or unreliable service instance.

However, there were some problems, such as the outage experienced in 2017 when an interaction between Kubernetes and etcd caused service discovery to fail, leaving no healthy endpoints. This is an example of teething problems that arise with emerging and maturing technology. There are many stories of similar issues on k8s.af, a valuable resource for teams running Kubernetes at scale. Rather than seeing these outages as reasons to avoid Kubernetes, they should be viewed as learning opportunities.

We initially made tech choices for a small team, but later scaled to 300 engineers, 2500 microservices, and hundreds of daily deployments. To manage that, we have separate services and data boundaries and our platform team provides infrastructure and best practices embedded in core abstractions, letting engineers focus on business logic.


Figure 5: Shared Core Library Layer

We use uniform templates and shared libraries for data marshaling, HTTP servers, and metrics, with logging and tracing provided by default.
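The shared-library idea can be sketched as a wrapper that every handler passes through; the names and the metrics "registry" here are hypothetical stand-ins for the machinery described above:

```python
# Sketch of default telemetry from a shared library: every handler is
# wrapped once, so request counts and logs exist automatically and
# engineers only write business logic.

import time
from collections import Counter

request_count = Counter()  # stand-in for a real metrics registry

def with_telemetry(name, handler):
    def wrapped(*args, **kwargs):
        start = time.monotonic()
        try:
            return handler(*args, **kwargs)
        finally:
            request_count[name] += 1  # metric emitted for free
            print(f"{name} took {time.monotonic() - start:.6f}s")  # log line
    return wrapped

# Business logic stays telemetry-free:
def get_balance(account_id):
    return 350

get_balance = with_telemetry("account.get_balance", get_balance)
get_balance("acc_1")
print(request_count["account.get_balance"])  # 1
```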

Monzo uses various open-source tools for its observability stack, such as Prometheus, Grafana, OpenTelemetry, and Elasticsearch. We invest heavily in collecting telemetry data from our services and infrastructure, with over 25 million metric samples and hundreds of thousands of spans being scraped at any one point. Every new service that comes online immediately generates thousands of metrics, which engineers can view on templated dashboards. These dashboards also feed automated alerts, which are routed to the appropriate team.

For example, the company used telemetry data to optimize the performance of the new customer feature Get Paid Early. When the new option caused a spike in load, we had issues with service dependencies becoming part of the hot path and not being provisioned to handle the load. We couldn't statically encode this information because it continuously shifted, and autoscaling wasn't reliable. Instead, we used Prometheus and tracing data to dynamically analyze the services involved in the hot path and scale them appropriately. Thanks to the use of telemetry data, we reduced the human error rate and made the feature self-sufficient.
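The approach can be sketched roughly as follows. All numbers and service names are invented, and this greatly simplifies what is described above (Monzo used Prometheus and tracing data); it only illustrates deriving a scaling plan from observed call counts rather than from a static list:

```python
import math

# Hypothetical sketch: instead of statically listing the services in a
# hot path, derive them from trace data and compute how many replicas
# each needs for an expected load spike.

# Trace spans as (service, calls observed during the sampled window).
spans = [
    ("payments", 1200),
    ("ledger", 1200),
    ("accounts", 2400),   # called twice per payment
]

CAPACITY_PER_REPLICA = 100  # assumed requests per replica per window
SPIKE_MULTIPLIER = 5        # assumed spike, e.g. everyone paid early at once

def replicas_needed(calls):
    return math.ceil(calls * SPIKE_MULTIPLIER / CAPACITY_PER_REPLICA)

plan = {service: replicas_needed(calls) for service, calls in spans}
print(plan)  # {'payments': 60, 'ledger': 60, 'accounts': 120}
```

Because the plan is recomputed from live telemetry, a service that newly joins the hot path is picked up automatically, which is the point made above about the hot path continuously shifting.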

Our company aims to simplify the interaction of engineers with platform infrastructure by abstracting it away from them. We have two reasons for this: engineers should not need to have a deep understanding of Kubernetes and we want to offer a set of opinionated features that we actively support and have a strong grasp on.

Since Kubernetes has a vast range of functionalities, it can be implemented in various ways. Our goal is to provide a higher level of abstraction that can ease the workload for application engineering teams, and minimize our personnel cost in running the platform. Engineers are not required to work with Kubernetes YAML.

If an engineer needs to implement a change, we provide tools that will check the accuracy of their modifications, construct all relevant Docker images in a clean environment, generate all Kubernetes manifests, and deploy everything.

Figure 6: How an engineer deploys a change

We are currently undertaking a major project to move our Kubernetes infrastructure from our self-hosted platform to Amazon EKS, and this transition has also been made seamless by our deployment pipeline.

If you're interested in learning more about our deployment approach, code generation, and our service catalog, I gave a talk at QCon London 2022 where I discussed the tools we have developed, as well as our philosophy towards the developer experience.

The team recognizes that distributed systems are prone to failure and that it is important to acknowledge and accept it. In the case of a write operation, issues may occur and there may be uncertainty as to whether the data has been successfully written.

Figure 7: Handling failures on Cassandra

This can result in inconsistencies when reading the data from different nodes, which is problematic for a banking service that requires consistency. To address this issue, the team has been using a separate service that runs continuously in the background and is responsible for detecting and resolving inconsistent data states. This service can either flag the issue for further investigation or even automate the correction process. Alternatively, validation checks can be run when there is a user-facing request, but we noticed that this adds user-facing latency.
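The background checker's core idea can be illustrated as reading the same key from every replica, flagging a mismatch, and optionally repairing to the majority value. This is a toy model of the concept, not Monzo's actual service, and the data is invented:

```python
# Hedged sketch of a background consistency checker: compare replica values
# for a key, report mismatches, and optionally write back the majority value.

from collections import Counter

def check_key(replicas, key, repair=False):
    """replicas: list of dicts simulating per-node state for one key."""
    values = [r.get(key) for r in replicas]
    majority, votes = Counter(values).most_common(1)[0]
    consistent = votes == len(values)
    if not consistent and repair:
        for r in replicas:
            r[key] = majority  # automated correction to the majority value
    return consistent, majority

nodes = [{"bal": 100}, {"bal": 100}, {"bal": 90}]  # one stale replica
ok, value = check_key(nodes, "bal", repair=True)
print(ok, value, nodes)
# False 100 [{'bal': 100}, {'bal': 100}, {'bal': 100}]
```

A production version would of course use timestamps or write context rather than a naive majority vote, and would emit findings for investigation instead of always repairing in place.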

Figure 8: Kafka and the coherence service

Coherence services are also useful for verifying the communication between infrastructure and services: Monzo uses Kafka clusters and Sarama-based libraries to interact with Kafka. To ensure confidence in updates to these libraries and to Sarama, coherence services run continuously in both staging and production environments. These services use the libraries like any other microservice and can identify problems caused by accidental changes to a library or to the Kafka configuration before they affect production systems.
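The heart of such a coherence service is a round-trip probe: publish a uniquely tagged message and confirm it comes back through the consumer within a deadline. A real version would use the production Kafka client libraries; in this sketch an in-memory queue stands in, and all names are illustrative:

```python
# Sketch of a coherence probe: produce a tagged message and verify it is
# consumed before a deadline. queue.Queue simulates the message broker.

import queue
import time
import uuid

def round_trip_ok(produce, consume, timeout=2.0):
    """Return True if a freshly produced probe message is consumed in time."""
    probe = f"probe-{uuid.uuid4()}"
    produce(probe)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        msg = consume()
        if msg == probe:
            return True
        if msg is None:
            time.sleep(0.01)  # nothing yet; back off briefly
    return False

topic = queue.Queue()
ok = round_trip_ok(topic.put,
                   lambda: topic.get_nowait() if not topic.empty() else None)
print(ok)  # True
```

Run continuously against staging and production, a probe like this catches a broken library upgrade or misconfigured cluster before real traffic does.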

Investment in systems and tooling is necessary for engineers to develop and run systems efficiently: the concepts of uniformity and "paved road" ensure consistency and familiarity, preventing the development of unmaintainable services with different designs.

From day one, Monzo focuses on getting new engineers onto the "paved road" by providing a documented process for writing and deploying code and a support structure for asking questions. The onboarding process is defined to establish long-lasting behaviors, ideas, and concepts, as it is difficult to change bad habits later on. Monzo continuously invests in onboarding, even having a "legacy patterns" section to highlight patterns to avoid in newer services.

While automated code modification tools are used for smaller changes, larger changes may require significant human refactoring to conform to new patterns, which takes time to implement across services. To prevent unwanted patterns or behaviors, Monzo uses static analysis checks to identify issues before they are shipped. Before making these checks mandatory, we ensure that the existing codebase is cleaned up to avoid engineers being tripped up by failing checks that are not related to their modifications. This approach ensures a high-quality signal, rather than engineers ignoring the checks. The high friction to bypass these checks is intentional to ensure that the correct behavior is the path of least resistance.
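A static analysis check of the kind described can be as simple as walking a file's AST and flagging calls to a banned legacy helper before the code ships. The banned function name below is a made-up example, not a real Monzo identifier:

```python
# Minimal sketch of a static-analysis check: flag calls to banned helpers.

import ast

BANNED = {"legacy_rpc_call"}  # hypothetical deprecated helper

def find_banned_calls(source):
    """Return (line_number, name) for each call to a banned function."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in BANNED):
            findings.append((node.lineno, node.func.id))
    return findings

code = "x = legacy_rpc_call('ledger')\ny = safe_rpc_call('ledger')\n"
print(find_banned_calls(code))  # [(1, 'legacy_rpc_call')]
```

Cleaning up existing violations before making such a check mandatory is what keeps its signal high: every failure an engineer sees is then caused by their own change.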

In April 2018, TSB, a high-street bank in the UK, underwent a problematic migration project to move customers to a new banking platform. This left customers unable to access their money for an extended period, and led to TSB receiving a large fine, paying nearly £33 million in compensation to customers, and suffering reputational damage. The FCA report on the incident examines both the technological and organizational aspects of the problem, including overly ambitious planning schedules, inadequate testing, and the challenge of balancing development speed with quality. While it may be tempting to blame technology alone for such issues, the report emphasizes the importance of examining the organizational factors that contributed to the outage.

Reflecting on past incidents and projects is highly beneficial in improving operations: Monzo experienced an incident in July 2019, when a configuration error in Cassandra during a scale-up operation forced a stop to all writes and reads to the cluster. This event set off a chain reaction of improvements spanning multiple years to enhance the operational capacity of the database systems. Since then, Monzo has invested in observability, deepening the understanding of Cassandra and other production systems, and we are more confident in all operational matters through runbooks and production practices.

Earlier I mentioned the early technological decisions made by Monzo and the understanding that it wouldn't be an easy ride: over the last seven years, we have had to experiment, build, and troubleshoot through many challenges, and this process continues. If an organization is not willing or able to provide the necessary investment and support for complex systems, this must be taken into consideration when making architectural and technological choices: choosing the latest technology or buzzword without adequate investment is likely to lead to failure. Instead, it is better to choose simpler, more established technology that has a higher chance of success. While some may consider this approach to be boring, it is ultimately a safer and more reliable option.

Teams are always improving tools and raising the level of abstraction. By standardizing on a small set of technological choices and continuously improving these tools and abstractions, engineers can focus on the business problem rather than the underlying infrastructure. It is important to be conscious when systems deviate from the standardized road.

While there's a lot of focus on infrastructure in organizations, such as infrastructure as code, observability, automation, and Terraform, one theme often overlooked is the bridge between infrastructure and software engineers. Engineers don't need to be experts in everything and core patterns can be abstracted away behind a well-defined, tested, documented, and bespoke interface. This approach saves time, promotes uniformity, and embraces best practices for the organization.

Showing different examples of incidents, we highlighted the importance of introspection: while many may have a technical root cause, it's essential to dig deeper and identify any organizational issues that may have contributed. Unfortunately, most post-mortems tend to focus heavily on technical details, neglecting the organizational component.

It's essential to consider the impact of organizational behaviors and incentives on the success or failure of technical architecture. Systems don't exist in isolation and monitoring, and rewarding the operational stability, speed, security, and reliability of the software you build and operate is critical to success.

See the rest here:

Banking on Thousands of Microservices - InfoQ.com

Posted in Cloud Computing | Comments Off on Banking on Thousands of Microservices – InfoQ.com

Cyber Security vs. Data Science Which Is the Right Career Path? – Analytics Insight

Posted: at 5:14 pm

Here is a comparison between two of the most in-demand fields: cyber security and data science.

Today's IT-intensive environment has taught us two important lessons: we need solutions to transform tidal surges of data into something that organizations can use to make educated decisions, and we must safeguard that data and the networks on which it is stored.

As a result, we have the fields of data science and cyber security. So, which is the better job path? You won't get far if you approach the debate between cyber security vs. data science in terms of which field is more in demand. Both fields are in desperate need of a workforce.

Cyber security is the discipline of securing data, devices, and networks against unauthorized use or access while assuring and maintaining information availability, confidentiality, and integrity. A career in cybersecurity entails entering a thriving industry with more available positions than qualified applicants.

Data science combines domain knowledge, programming abilities, and mathematical and statistical knowledge to generate usable, relevant insights from massive amounts of unstructured data, often known as Big Data.

A career in data science involves carrying out data-processing responsibilities: data scientists often use algorithms, processes, tools, scientific methods, techniques, and systems, and then apply the derived insights across multiple domains.

Data science and cyber security are inextricably linked, since the former demands the defences and protection that the latter supplies. To draw their conclusions and ensure the security of the resulting processed information, data scientists require clean, uncompromised data. As a result, the field of data science looks to cyber security to help protect information in any form.

For someone interested in a career in one of the more intriguing and dynamic IT disciplines, cyber security and data science both present fantastic opportunities. The career trajectories in the two fields are comparable.

Experts in cyber security often begin their careers with a bachelor's degree in computer science, information technology, cyber security, or a related field. Aspirants in the field of cyber security should also be proficient in fundamental subjects like programming, cloud computing, and network and system administration.

The prospective cyber security specialist joins a corporation as an entry-level employee after graduating. After a few years of work experience, it's time to apply for a senior position, which normally calls for a master's degree and certification in a variety of cybersecurity-related fields.

Cyber security experts choose career paths like security analyst, ethical hacker, chief information security officer, penetration tester, security architect, and IT security consultant.

Data science typically demands more formal education than cyber security. A master's or even a bachelor's degree isn't strictly required for cybersecurity professionals, though having one helps. A bachelor's degree in data science, computer science, or a similar branch of study is required for most data science positions. After a few years in an entry-level role, the ambitious data scientist should pursue a master's degree in data science, reinforced by a few relevant certifications, and apply for a position as a senior data analyst.

Data science experts choose career paths like data engineer, marketing manager, data leader, product manager, and machine learning leader.

According to Glassdoor, the average yearly salary for cyber security specialists in the United States is US$94,794, whereas the corresponding figure in India is ₹110,597.

In the field of data science, Indeed reports that US-based data scientists make an average salary of US$124,074 annually, while their Indian counterparts earn an average annual salary of ₹830,319.

Depending on demand, the hiring of certain individuals, and the location, these numbers frequently change.

Read the original post:

Cyber Security vs. Data Science Which Is the Right Career Path? - Analytics Insight

Posted in Cloud Computing | Comments Off on Cyber Security vs. Data Science Which Is the Right Career Path? – Analytics Insight

DIGITAL PROMISE: Amazon pledges further R30bn SA investment … – Daily Maverick

Posted: at 5:14 pm

Amazon's cloud service, Amazon Web Services (AWS), has announced plans to invest a further R30.4-billion in its cloud infrastructure in South Africa by 2029. It has already invested R15.6-billion in the country.

In a new economic impact study outlining Amazon's investment in its AWS Africa (Cape Town) region since 2018, the group estimates its total investment of R46-billion between 2018 and 2029 will add at least R80-billion in gross domestic product to the South African economy. It will also help to support about 5,700 full-time equivalent (FTE) jobs at local vendors each year.

The FTE jobs are supported across the data centre supply chain, such as telecommunications, non-residential construction, electricity generation, facilities maintenance and data centre operations.

AWS provides cloud computing, the on-demand delivery of IT resources over the internet, which allows customers to access computing power, data storage and other services with pay-as-you-go pricing, as opposed to the traditional contract-based IT model.

Many of South Africa's public sector institutions make use of AWS.

GovChat, SA's largest citizen-government engagement platform, provides a conversational interface that integrates voice and text into applications and provides a unified platform that citizens can use to connect with the government.

Wits University, SA's largest research university, has adopted a cloud-first approach to its IT strategy, using technology to enhance all its core processes.

Other AWS clients include Absa, Investec, Medscheme, MiX Telematics, Old Mutual Limited, Pick n Pay, Standard Bank, Pineapple and Travelstart.

Amazon is also steaming ahead with its retail marketplace in South Africa, with an expected launch towards the end of the year.

On 28 April 2023, Bloomberg reported that Amazon had warned that growth in its cloud computing business was continuing to cool.

AWS revenue rose 16% to $21.4-billion in the first quarter, as Amazon reported stronger-than-expected profits and sales in the period.

Last week, Amazon executives jolted investors by admitting that sales growth in the cloud computing unit had slowed. Some analysts have speculated that as companies seek to trim technology costs, AWS growth could sink to single digits, according to the report.

Amazon's chief financial officer, Brian Olsavsky, told reporters that AWS was less profitable now than it was a year ago, partly owing to discounts offered in exchange for longer-term contracts. BM/DM

Original post:

DIGITAL PROMISE: Amazon pledges further R30bn SA investment ... - Daily Maverick

Posted in Cloud Computing | Comments Off on DIGITAL PROMISE: Amazon pledges further R30bn SA investment … – Daily Maverick

LigaData Acquires Veloce Cloud Computing to Expand Their Cloud AI Product and Services Offerings – EIN News

Posted: March 4, 2023 at 12:24 am

More here:

LigaData Acquires Veloce Cloud Computing to Expand Their Cloud AI Product and Services Offerings - EIN News

Posted in Cloud Computing | Comments Off on LigaData Acquires Veloce Cloud Computing to Expand Their Cloud AI Product and Services Offerings – EIN News

Big Tech’s Cloud Computing Businesses Are Still Getting Bigger, but Not as Quickly as They … – Latest – LatestLY

Posted: February 10, 2023 at 11:51 am

Continue reading here:

Big Tech's Cloud Computing Businesses Are Still Getting Bigger, but Not as Quickly as They ... - Latest - LatestLY

Posted in Cloud Computing | Comments Off on Big Tech’s Cloud Computing Businesses Are Still Getting Bigger, but Not as Quickly as They … – Latest – LatestLY

Cloud Computing – GeeksforGeeks

Posted: January 27, 2023 at 8:04 pm

In the simplest terms, cloud computing means storing and accessing data and programs on remote servers hosted on the internet, instead of on the computer's hard drive or a local server. Cloud computing is also referred to as internet-based computing.

Cloud computing architecture refers to the components and sub-components required for cloud computing. These components typically include a front-end platform, back-end platforms (servers and storage), a cloud-based delivery model, and a network.

Hosting a cloud: There are three layers in cloud computing. Companies use these layers based on the service they provide.

Three layers of Cloud Computing

At the bottom is the foundation, the infrastructure layer, where people start to build; this is where cloud hosting lives. To see why hosting matters, suppose you have a company and a website, and the website carries a lot of communication between members. You start with a few members talking with each other, and the number of members gradually grows. As membership increases, there is more traffic on the network and your server slows down, which causes a problem.

A few years ago, websites were put on a server somewhere, which meant you had to buy and set up a number of servers yourself. That costs a lot of money and takes a lot of time, and you pay for those servers whether or not you are using them. This is traditional hosting. Cloud hosting overcomes this problem: with cloud computing, you have access to computing power when you need it. Your website is put on a cloud server just as it would be on a dedicated server, and if a surge of visitors suddenly demands more computing power, you scale up according to need.
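The scaling idea above can be illustrated with a toy calculation: capacity follows demand instead of being fixed, so you pay only for the servers each period actually needs. The per-server capacity and traffic numbers are invented for the example:

```python
# Toy illustration of elastic capacity: servers needed tracks request load.

def servers_needed(requests_per_sec, capacity_per_server=500):
    """Ceiling division: smallest server count that covers the load."""
    return max(1, -(-requests_per_sec // capacity_per_server))

traffic = [200, 900, 2600, 400]  # hypothetical load over four periods
print([servers_needed(t) for t in traffic])  # [1, 2, 6, 1]
```

With traditional hosting you would have had to buy six servers up front and keep paying for all of them during the quiet periods; with cloud hosting the fleet grows and shrinks with the traffic.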

Benefits of cloud hosting include on-demand scalability, faster setup, and paying only for the resources you actually use.

For a clearer picture of how cloud computing has changed the commercial deployment of systems, consider the three layers described above.

This article is contributed by Brahmani Sai. If you like GeeksforGeeks and would like to contribute, you can also write an article using write.geeksforgeeks.org or mail your article to review-team@geeksforgeeks.org. See your article appearing on the GeeksforGeeks main page and help other Geeks.

Please write comments if you find anything incorrect, or if you want to share more information about the topic discussed above.

See the original post here:

Cloud Computing - GeeksforGeeks

Posted in Cloud Computing | Comments Off on Cloud Computing – GeeksforGeeks

What is Cloud Computing? | SNHU – Southern New Hampshire University

Posted: at 8:04 pm

When you stream your favorite album online, shop at an e-commerce store or answer your work email from your home computer, you're reaping the benefits of cloud computing. But what is cloud computing, really?

Cloud computing is a form of computing in which networks, data storage, applications, security and development tools are all enabled via the Internet, as opposed to a local computer or an on-premise server in your organization.

"Companies are moving to the cloud because it makes it easy to deliver high-quality customer experiences without the complexity of maintaining data centers full of expensive computing equipment," said Jonathan Kamyck, associate dean of cyber security programs at Southern New Hampshire University (SNHU).

The field of cloud computing has been growing rapidly for years, as more companies seek to work remotely, boost efficiency through automation and save money on IT infrastructure. According to a 2021 report from Gartner, global end-user spending on public cloud services is projected to grow 23.1% in 2021 to $332.3 billion, up from $270 billion in 2020.

With this growth comes evolving career opportunities. If you want to get started in this dynamic field, it's important to understand the different types of cloud computing and what you can do with them.

From global brands to tech start-ups, organizations are finding new ways all the time to use cloud computing to offer services, protect data and run their businesses.

Currently, there are three primary types of cloud computing models:

"IaaS provides users access to hosted computing resources, such as networking, processing power and data storage," said Adam Goldstein, an adjunct instructor in STEM programs at SNHU.

IaaS provides the basic building blocks for cloud-based IT, offering infrastructure like firewalls and virtual local area networks. Amazon Web Services (AWS) and Microsoft Azure are two common examples of IaaS.

"PaaS provides access to a platform within which users can develop and build custom software and applications," said Goldstein.

With PaaS, developers can focus on the creative side of app development, without having to manage software updates and other infrastructure. Magento Commerce Cloud is an example of PaaS commonly used by e-commerce companies to build and manage custom online stores.

"SaaS allows users to subscribe to a fully functioning software service that is run and managed by the service provider," said Goldstein.

With SaaS, the end-user only has to focus on how they will use that particular piece of software within their business. They don't have to think about how the service is maintained or how infrastructure is managed. An example of SaaS is Microsoft Office 365, in which all Microsoft Office applications are available in a browser without installing them on a local computer.

Among the different types of cloud computing services, there are many different uses of cloud computing across virtually every industry. As of 2021, 80% of businesses moved their work to a hybrid approach by combining both public and private clouds, according to a 2022 report from Flexera. And this trend is likely to continue in the years ahead.

"The cloud makes it easier for companies of all sizes to increase their competitiveness," Kamyck said. "Resources that were once dedicated to purchasing, installing, configuring, and maintaining traditional computing networks can be redirected to focus on solving core business problems and exploring new opportunities."

So what are examples of cloud computing uses? According to Kamyck and Goldstein, cloud computing drives many of the popular personal and enterprise services consumers use every day. This includes collaboration suites like Google Apps and Microsoft Office 365 as well as learning management systems used by schools, streaming services and Internet-hosted video games.

"With media streaming services, for example, content is delivered over the Internet and consumed immediately instead of through files downloaded and saved to a user's computer," Kamyck said.

Another example of cloud computing in action is Amazon's AWS, said Goldstein. AWS provides cloud services to run Amazon.com, one of the largest e-commerce sites in the world.

The use of cloud computing doesn't end with shopping and music streaming, however. Most people are likely engaging with cloud-based services in some way throughout their daily lives.

"E-commerce, software services and applications, large and small database hosting, gaming, data warehousing and the internet of things are just a few of the things that people are doing in the cloud," said Goldstein.

Because there are so many applications for cloud computing across a range of industries, there is also a wide variety of jobs that use cloud computing on a daily basis.

"Almost all IT jobs will have some interaction with the cloud," said Goldstein. System administrators, network engineers, software developers, IT architects, database administrators and cybersecurity engineers all may use cloud services on a regular basis.

Opportunities in these fields are growing, according to data from the U.S. Bureau of Labor Statistics (BLS). Jobs for database administrators, for example, are projected to grow 9% by 2031. Software developer jobs are expected to grow 25% and jobs for computer network architects are projected to grow 4% over the same time period.

According to Kamyck, many technology-related jobs now require some level of familiarity with cloud computing technologies.

"It doesn't really matter what your niche is in the technology space: cloud computing affects everyone," Kamyck said. "Technology managers, analysts, software developers, cybersecurity experts, networking engineers, and system administrators are all being challenged to understand the cloud and learn how to harness it to benefit their organizations."

An entry-level employee may start as a cloud administrator or cloud developer and with additional experience and certifications, they could eventually work as a chief cloud architect, providing technical direction to the platform and application development teams.

"Experienced cloud administrators could also take on more specialized roles, such as cloud security analysts or API developers," said Goldstein.

Workers with cloud computing expertise will be well-positioned to advance in a variety of career paths over the coming years, said Kamyck.

If you want to get started on any of these fast-growing career paths, getting the right education and training will be key.

"Based on the rapid growth of cloud computing, there is definitely a demand for trained individuals to work in the field," said Goldstein.

The first step toward landing a job in cloud computing is to focus on professional training and education.

Because cloud computing is becoming a core part of most technology fields, a bachelor's degree in computer science, information technology, information systems or cybersecurity is an important step toward a cloud computing career.

Goldstein said that many of the technical skills needed for success in cloud computing jobs can be gained through IT and computer science degree programs.

"A degree or higher education certificate program focused on those applied technical skills with hands-on learning is really beneficial to acquire a variety of skills," said Goldstein.

For students who know they want to specialize in cloud computing, online training programs focused on those specific technical skills can be a valuable addition to a degree program. SNHU, for instance, offers the Amazon Web Services (AWS) Cloud Foundations course, which helps prepare students for the AWS Certified Cloud Practitioner exam.

Because cloud computing is constantly evolving, getting hands-on industry experience is another important step toward a career. A cloud computing internship is a great way to start working in the field and gain key technical and soft skills needed in the industry.

Students studying computer science can also work on their own cloud-based projects building websites, games or other applications to add to a portfolio of work and gain experience with specific cloud technologies.

"The quickest path to a career in cloud computing is to choose a well-known platform like Amazon AWS or Microsoft Azure, sign up for a low-cost account, and start tinkering with their technologies," said Kamyck.

Earning professional certifications in cloud computing is another important step toward working in the field.

"After someone has selected a cloud platform and is ready for formal training, professional certification programs that emphasize hands-on learning are a great next step," Kamyck said.

AWS, for example, offers an entry-level Cloud Practitioner certificate and the more advanced AWS Certified Solutions Architect (CSA) - Associate and AWS CSA - Professional certificates.

Additional certifications from AWS, Microsoft and Google focus on other more advanced skills, including cloud architecture, cloud development, systems administration, cloud security and machine learning.

No matter what path you take to land a job in cloud computing, youll gain key skills that can help you start and grow a successful career in technology and prepare you for industry changes ahead.

"Cloud computing is rapidly evolving and will become more and more important to companies over the next decade. Technology professionals that build experience and skills with cloud technologies now will reap the benefits of their efforts for years to come," said Kamyck.

A degree can change your life. Find the SNHU technology program that can best help you meet your goals.

Danielle Gagnon is a freelance writer focused on higher education. Connect with her on LinkedIn.

View post:

What is Cloud Computing? | SNHU - Southern New Hampshire University

Posted in Cloud Computing | Comments Off on What is Cloud Computing? | SNHU – Southern New Hampshire University
