Daily Archives: April 9, 2020

Foods to boost your health while living through the pandemic – Belfast Telegraph

Posted: April 9, 2020 at 6:29 pm

In the coming months, lots of us will be looking at all areas of our lives to help protect our bodies. The overall function of the immune system is to prevent or limit infection, and it is good at differentiating between normal healthy cells and unhealthy ones. Infectious microbes release signals that the immune system recognises, triggering a response to deal with the problem. Infection arises when the bugs get ahead of the immune system. When it comes to nutrition, there are some areas you could focus on in a bid to better support your immune system.

First and foremost, it's important to eat enough. Calories are needed to fuel every system in the body, including the immune system. On average, women need about 2,000 calories a day, while men require about 2,500. Eating enough may be challenging if you are sick; if you are off your food, a 'little and often' approach is advisable, as is sticking to foods you feel you can manage. Trust your gut! You may find dry, plain and cold foods easier, and consuming calories through fluids may help. If you are not sick, however, it's a good idea to avoid calorie-restricted diets, especially those with large deficits. A healthy diet does support the immune system, and some nutrients are worth focusing on.

Vitamin A is an antioxidant vitamin with many roles, including supporting normal immune function. It also helps to keep our skin healthy, as well as the mucous membranes in the mouth, stomach, intestines and respiratory system. As the skin and mucous membranes act as one of the first barriers to infection, it's important that we nourish them. A supplement is often not required. There are two types of vitamin A: vitamin A from animal produce such as meat, dairy and eggs, and vitamin A from plants such as oils, leafy greens and yellow or orange vegetables. Aim to eat these foods often.

Vitamin C is an antioxidant vitamin that plays an important role in immune function. Since the 1930s, vitamin C has been proposed as a treatment for respiratory infections, and it became particularly popular in the 1970s when it was suggested it could prevent and treat the common cold. Many trials have been completed since. A 2013 review that included 29 trials and over 11,000 people found that vitamin C supplementation had no effect on the incidence of the common cold in the general public, although a review of five trials with nearly 600 participants showed that a vitamin C supplement halved the risk of the common cold in athletes. A regular supplement did have a modest and consistent effect in reducing the duration of common cold symptoms, with the effect being larger in children; this is based on results from 31 studies covering nearly 10,000 episodes of common cold.

The trials within this review didn't support taking vitamin C once you already have a cold, as it didn't affect duration or severity. Like all supplements, and despite being a water-soluble vitamin, vitamin C is not healthy in large doses. As fruit and vegetables are such great sources, it may be better to focus on eating your seven-a-day instead; red peppers, for example, make the requirement easy to meet.

Vitamin E is another antioxidant vitamin with a role in supporting immune function. Naturally occurring vitamin E comes in eight different chemical forms with varying levels of biological activity; alpha-tocopherol is the form humans need to focus on. Generally, a supplement isn't needed. Nuts, seeds and vegetable oils are among the best sources of alpha-tocopherol, along with green leafy vegetables. Sunflower seeds, almonds, hazelnuts and peanut butter, for example, are rich sources.

Zinc has an important role in supporting immune function. Although zinc is an essential mineral that is naturally present in our foods, it is also found in many cold lozenges and some over-the-counter drugs.

This is because researchers suggest that zinc could reduce the severity and duration of the common cold by inhibiting rhinovirus in the nasal passage and by suppressing inflammation. Studies have provided conflicting results.

Nevertheless, zinc does appear to help in certain circumstances. Lozenges and zinc-containing syrups may be the preferred choice as they spend more time in the mouth. Oysters are an incredibly rich source of zinc, but beef, pork, baked beans and pumpkin seeds are everyday foods that will help you meet your zinc requirements.

Vitamin D is a fat soluble vitamin that is naturally present in few foods. Researchers have suggested that taking Vitamin D supplements may enhance resistance to respiratory infections such as Covid-19, or limit the severity of the illness for those that do become infected.

The opinion of Dr Daniel McCartney and Dr Declan Byrne was published in the Irish Medical Journal last week. They recommended that adults living in this part of the world take 20-50 micrograms of vitamin D per day for the next three to six months as a short-term measure to specifically address the risk of COVID-19. Vitamin D deficiency is common here. Those at greater risk include older people, nursing home residents and hospital in-patients.

This level of supplementation should only be considered as a short-term measure that may help those who are deficient, or potentially deficient, in vitamin D.

It is advisable to take this amount of vitamin D only under the guidance and supervision of a doctor. The current recommended dose is 200-400 IU (5-10 micrograms). Oily fish and eggs are dietary sources.

Echinacea is widely used to reduce the risk and symptoms of a common cold, despite limited supporting evidence.

A big problem with the studies that have been conducted is that one echinacea product can differ greatly from the next. As with other supplements, other herbs are sometimes added.

A review of the literature has been carried out. However, due to the differences between the supplements tested, it was difficult to draw any strong conclusions.

Nonetheless, some echinacea products do seem to be effective at treating colds, although the overall evidence is weak.

There have been a number of studies investigating the impact of particular strains of probiotics on upper respiratory tract infections and the immune system in athletic cohorts ranging from healthy active people to elite athletes.

Unfortunately, although the evidence for their use is building, it's hard to draw guidelines from these studies because different types of probiotics are used in each. Differences in effectiveness also relate to the type of sport, the training history of the person and the training load they're undertaking. For the general population, the key point is that the gut appears to play an important role in immune function.

One of the most important areas to focus on when trying to improve your gut microbiome is to aim for a varied diet with lots of different types of plants. However, it's likely with time that more specific guidance will emerge.

Eating a healthy varied diet is the best way to support your immune system with food. Supplements can sometimes be referred to as the 'sprinkles on the icing of the cake'.

Covid-19 is new, so no research has yet been completed on diet, nutrients and the likelihood, severity or duration of infection. Nutritional practices for the common cold are therefore simply that: nutritional practices for the common cold. The most important steps with regards to Covid-19 remain hand hygiene and social distancing.

However, it will certainly do no harm to focus on eating lots of the foods mentioned above, taking your vitamin D supplement to prevent deficiency and eating a variety of plants to support a healthy gut microbiome.

BROWN BREAD

Ingredients: 350g wholemeal flour; 125g plain flour; 20g wheatgerm; 20g wheat bran; 1 tsp bread soda; 1 tsp salt; 500ml buttermilk.

METHOD:

1. Preheat your oven to 200C.

2. Mix all the dry ingredients in a bowl.

3. Add the buttermilk and mix well.

4. Grease a bread tin. Spoon in the contents. Bake for 10 minutes, then turn down the heat to 180C and bake for another 40 minutes.

BAKED EGGS IN RED PEPPER CUPS

Ingredients: 1 red pepper, oil spray, 2 eggs

METHOD:

1. Preheat the oven to 175C.

2. Slice the pepper in half.

3. Place on a baking tray.

4. Crack an egg into each half.

5. Bake until your eggs are cooked to your preference (approx. 20-30 minutes).

POSH BAKED BEANS

Ingredients: 2 tbsp pumpkin seeds; 2 slices of brown bread; 200g baked beans; olive oil; half a garlic clove.

METHOD:

1. Heat a frying pan. Dry fry pumpkin seeds.

2. Toast brown bread.

3. Meanwhile heat your beans in a pan.

4. Brush your toast with olive oil and rub with garlic.

5. Place beans on top.

6. Sprinkle with pumpkin seeds.

Belfast Telegraph



Probiotic Based Dietary Supplements Market Size Analysis, Top Manufacturers, Shares, Growth Opportunities and Forecast to 2026 – Science In Me

Posted: at 6:29 pm

New Jersey, United States: Market Research Intellect has added a new research report, "Probiotic Based Dietary Supplements Market Professional Survey Report 2020", to its vast collection of research reports. The Probiotic Based Dietary Supplements market is expected to grow positively over the 2020-2026 forecast period.

The Probiotic Based Dietary Supplements market report studies the past factors that helped the market grow as well as those hampering its potential. The report presents historical data from 2011 to 2019 and forecasts until 2026, which makes it a valuable source of information for individuals and industries around the world. It presents relevant market information in readily accessible documents with clearly presented graphs and statistics, and includes the views of various industry executives, analysts, consultants, and marketing, sales and product managers.

Key Players Mentioned in the Probiotic Based Dietary Supplements Market Research Report:

Market Segment as follows:

The global Probiotic Based Dietary Supplements Market report focuses closely on key industry players to identify potential growth opportunities; increased marketing activity is also projected to accelerate market growth throughout the forecast period. The market is expected to grow strongly over the forecast period owing to several primary factors fuelling its growth. Finally, the report provides detailed profiles and data analysis of the leading Probiotic Based Dietary Supplements companies.

Probiotic Based Dietary Supplements Market by Regional Segments:

The chapter on regional segmentation describes the regional aspects of the Probiotic Based Dietary Supplements market. It explains the regulatory framework that is expected to affect the entire market, illuminates the political scenario and anticipates its impact on the market for Probiotic Based Dietary Supplements.

The Probiotic Based Dietary Supplements Market research combines primary and secondary research. The report gives insights on the key factors that generate and limit market growth, and also studies competitive developments such as mergers and acquisitions, new partnerships, new contracts and new product developments in the global Probiotic Based Dietary Supplements market. The past trends and future prospects included in this report make the analysis of the market highly comprehensible. The latest trends, product portfolio, demographics, geographical segmentation and regulatory framework of the Probiotic Based Dietary Supplements market have also been included in the study.

Ask For Discount (Special Offer: Get 25% discount on this report) @ https://www.marketresearchintellect.com/ask-for-discount/?rid=201109&utm_source=SI&utm_medium=888

Table of Contents

1 Introduction of Probiotic Based Dietary Supplements Market
1.1 Overview of the Market
1.2 Scope of Report
1.3 Assumptions

2 Executive Summary

3 Research Methodology
3.1 Data Mining
3.2 Validation
3.3 Primary Interviews
3.4 List of Data Sources

4 Probiotic Based Dietary Supplements Market Outlook
4.1 Overview
4.2 Market Dynamics
4.2.1 Drivers
4.2.2 Restraints
4.2.3 Opportunities
4.3 Porter's Five Force Model
4.4 Value Chain Analysis

5 Probiotic Based Dietary Supplements Market, By Deployment Model
5.1 Overview

6 Probiotic Based Dietary Supplements Market, By Solution
6.1 Overview

7 Probiotic Based Dietary Supplements Market, By Vertical
7.1 Overview

8 Probiotic Based Dietary Supplements Market, By Geography
8.1 Overview
8.2 North America
8.2.1 U.S.
8.2.2 Canada
8.2.3 Mexico
8.3 Europe
8.3.1 Germany
8.3.2 U.K.
8.3.3 France
8.3.4 Rest of Europe
8.4 Asia Pacific
8.4.1 China
8.4.2 Japan
8.4.3 India
8.4.4 Rest of Asia Pacific
8.5 Rest of the World
8.5.1 Latin America
8.5.2 Middle East

9 Probiotic Based Dietary Supplements Market Competitive Landscape
9.1 Overview
9.2 Company Market Ranking
9.3 Key Development Strategies

10 Company Profiles
10.1.1 Overview
10.1.2 Financial Performance
10.1.3 Product Outlook
10.1.4 Key Developments

11 Appendix
11.1 Related Research

Complete Report is Available @ https://www.marketresearchintellect.com/product/probiotic-based-dietary-supplements-market-size-and-forecast/?utm_source=SI&utm_medium=888

We also offer customization on reports based on specific client requirement:

1. Free country-level analysis for any 5 countries of your choice.

2. Free competitive analysis of any market players.

3. Free 40 analyst hours to cover any other data points.

About Us:

Market Research Intellect provides syndicated and customized research reports to clients from various industries and organizations with the aim of delivering functional expertise. We provide reports for all industries including Energy, Technology, Manufacturing and Construction, Chemicals and Materials, Food and Beverage and more. These reports deliver an in-depth study of the market with industry analysis, market value for regions and countries and trends that are pertinent to the industry.

Contact Us:

Mr. Steven Fernandes
Market Research Intellect
New Jersey (USA)
Tel: +1-650-781-4080

Email: [emailprotected]

Get Our Trending Report

https://www.marketresearchblogs.com/

https://www.marktforschungsblogs.com/

Tags: Probiotic Based Dietary Supplements Market Size, Probiotic Based Dietary Supplements Market Growth, Probiotic Based Dietary Supplements Market Forecast, Probiotic Based Dietary Supplements Market Analysis, Probiotic Based Dietary Supplements Market Trends, Probiotic Based Dietary Supplements Market



Will product designers survive the AI revolution? – The Next Web

Posted: at 6:28 pm


"Our intelligence is what makes us human, and AI is an extension of that quality." – Yann LeCun

The human species has performed incredible feats of ingenuity. We have created beautiful sculptures from a single block of marble, written enchanting sonnets that have stood for centuries and landed a craft on the face of a distant rock orbiting our planet. It is sobering, then, to think that what separates us from our close, albeit far less sophisticated cousin, the chimpanzee, is a 4-5% difference in our genomes.

I propose to you, however, that nature's insatiable thirst for balance has ultimately led us to create a potential rival to our dominance as a species on this planet: artificial intelligence. The pertinent question then becomes: what aspects of our infamous ingenuity will AI augment, and perhaps ultimately surpass?

What is AI & Machine Learning?

Essentially, what some really smart people out there are trying to achieve is a computer system that emulates human intelligence: the ability to make decisions that maximize the chance of the system achieving its goals. Even more important is the ability of the system to learn and evolve.

To achieve this, every system needs a starting point: massive amounts of data. For example, in order to train a computer system to tell the difference between a cat and a dog, you would have to feed it thousands of images of cats and dogs.
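For illustration, here is a minimal, hypothetical sketch of what "feeding it thousands of images" can look like in practice, written in Python with PyTorch; the pets/cat and pets/dog folder layout is an assumption for the example, not something from the article:

```python
# Hypothetical sketch (not from the article): train a tiny image classifier on
# labeled photos, assuming an example folder layout pets/cat/*.jpg and pets/dog/*.jpg.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([transforms.Resize((64, 64)), transforms.ToTensor()])
data = datasets.ImageFolder("pets", transform=transform)   # labels come from folder names
loader = DataLoader(data, batch_size=32, shuffle=True)

model = nn.Sequential(                                      # deliberately small CNN
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 16 * 16, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):                                      # more (well-labeled) data helps most
    for images, labels in loader:
        optimizer.zero_grad()
        loss_fn(model(images), labels).backward()
        optimizer.step()
```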


What is creativity?

"Creativity is seeing what everyone else saw, and thinking what no one else thought." – Albert Einstein

I've heard many people say a computer system could never be creative, and that to create art, music, or an ad campaign, one needs to feel, have a soul, and a lifetime of experiences to draw from.

Having spent over a decade in the advertising industry, I can confidently say that the best creatives I have seen were usually the ones with the most exposure. The more you have seen, traveled or experienced, the more creative you tend to be.

Creativity is about challenging the norm, thinking differently, being the square peg in the round hole, and evoking specific emotions in your audience. So how difficult can that be for AI to achieve? It certainly seems that in today's world, creativity is actually very arbitrary. Why? Because two wildly different pieces [the original article compares two images here] are both considered valuable works of art.

The current state of AI vs Creatives



How Microsoft Teams will use AI to filter out typing, barking, and other noise from video calls – VentureBeat

Posted: at 6:28 pm

Last month, Microsoft announced that Teams, its competitor to Slack, Facebook's Workplace, and Google's Hangouts Chat, had passed 44 million daily active users. The milestone overshadowed its unveiling of a few new features coming later this year. Most were straightforward: a hand-raising feature to indicate you have something to say, offline and low-bandwidth support to read chat messages and write responses even if you have poor or no internet connection, and an option to pop chats out into a separate window. But one feature, real-time noise suppression, stood out: Microsoft demoed how the AI minimized distracting background noise during a call.

We've all been there. How many times have you asked someone to mute themselves or to relocate from a noisy area? Real-time noise suppression will filter out someone typing on their keyboard while in a meeting, the rustling of a bag of chips, and a vacuum cleaner running in the background. AI will remove the background noise in real time so you can hear only speech on the call. But how exactly does it work? We talked to Robert Aichner, Microsoft Teams group program manager, to find out.

The use of collaboration and video conferencing tools is exploding as the coronavirus crisis forces millions to learn and work from home. Microsoft is pushing Teams as the solution for businesses and consumers as part of its Microsoft 365 subscription suite. The company is leaning on its machine learning expertise to ensure AI features are one of its big differentiators. When it finally arrives, real-time background noise suppression will be a boon for businesses and households full of distracting noises. Additionally, how Microsoft built the feature is also instructive to other companies tapping machine learning.

Of course, noise suppression has existed in the Microsoft Teams, Skype, and Skype for Business apps for years. Other communication tools and video conferencing apps have some form of noise suppression as well. But that noise suppression covers stationary noise, such as a computer fan or air conditioner running in the background. The traditional noise suppression method is to look for speech pauses, estimate the baseline of noise, assume that the continuous background noise doesn't change over time, and filter it out.
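For contrast, that traditional stationary approach can be sketched in a few lines of Python. This is generic, textbook-style spectral subtraction for illustration, not the actual algorithm shipped in Teams or Skype:

```python
# Generic spectral subtraction: estimate the noise floor from a speech pause,
# then subtract it from every frame of the noisy signal. Illustrative only.
import numpy as np

def stft(x, frame=512, hop=256):
    n = 1 + (len(x) - frame) // hop
    idx = np.arange(frame)[None, :] + hop * np.arange(n)[:, None]
    return np.fft.rfft(x[idx] * np.hanning(frame), axis=1)

def spectral_subtract(noisy, noise_only, frame=512, hop=256):
    noise_floor = np.abs(stft(noise_only, frame, hop)).mean(axis=0)    # stationary estimate
    spec = stft(noisy, frame, hop)
    mag = np.maximum(np.abs(spec) - noise_floor, 0.05 * np.abs(spec))  # subtract, keep a floor
    cleaned = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), axis=1)
    out = np.zeros(len(noisy))
    for i, start in enumerate(range(0, len(noisy) - frame + 1, hop)):
        out[start:start + frame] += cleaned[i] * np.hanning(frame)     # overlap-add resynthesis
    return out
```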

Going forward, Microsoft Teams will suppress non-stationary noises like a dog barking or somebody shutting a door. "That is not stationary," Aichner explained. "You cannot estimate that in speech pauses. What machine learning now allows you to do is to create this big training set, with a lot of representative noises."

In fact, Microsoft open-sourced its training set earlier this year on GitHub to advance the research community in that field. While the first version is publicly available, Microsoft is actively working on extending the data sets. A company spokesperson confirmed that as part of the real-time noise suppression feature, certain categories of noises in the data sets will not be filtered out on calls, including musical instruments, laughter, and singing.

Microsoft can't simply isolate the sound of human voices because other noises also happen at the same frequencies. On a spectrogram of a speech signal, unwanted noise appears in the gaps between speech and overlapping with the speech. It's thus next to impossible to filter out the noise: if your speech and noise overlap, you can't distinguish the two. Instead, you need to train a neural network beforehand on what noise looks like and what speech looks like.

To get his points across, Aichner compared machine learning models for noise suppression to machine learning models for speech recognition. For speech recognition, you need to record a large corpus of users talking into the microphone and then have humans label that speech data by writing down what was said. Instead of mapping microphone input to written words, in noise suppression you're trying to get from noisy speech to clean speech.

"We train a model to understand the difference between noise and speech, and then the model is trying to just keep the speech," Aichner said. "We have training data sets. We took thousands of diverse speakers and more than 100 noise types. And then what we do is we mix the clean speech without noise with the noise. So we simulate a microphone signal. And then you also give the model the clean speech as the ground truth. So you're asking the model, 'From this noisy data, please extract this clean signal, and this is how it should look like.' That's how you train neural networks [in] supervised learning, where you basically have some ground truth."
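As a rough illustration of that supervised setup (a toy sketch, not Microsoft's model or data), the snippet below simulates a noisy microphone signal, builds noisy and clean spectrogram frames, and trains a tiny network to predict a per-frequency mask, with the clean speech as ground truth:

```python
# Toy supervised denoising sketch: mix clean "speech" with noise, then train a small
# network to predict how much of each frequency bin to keep. Not Microsoft's model.
import numpy as np
import torch
import torch.nn as nn

def mag_frames(signal, frame=320, hop=160):
    n = 1 + (len(signal) - frame) // hop
    idx = np.arange(frame)[None, :] + hop * np.arange(n)[:, None]
    return np.abs(np.fft.rfft(signal[idx] * np.hanning(frame), axis=1))

t = np.arange(16000 * 10) / 16000.0
clean = np.sin(2 * np.pi * 220 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t))  # stand-in speech
noisy = clean + np.random.default_rng(0).normal(scale=0.3, size=clean.shape)   # simulated mic signal

X = torch.tensor(np.log1p(mag_frames(noisy)), dtype=torch.float32)             # model input
C = torch.tensor(mag_frames(clean), dtype=torch.float32)                       # ground truth
N = torch.tensor(mag_frames(noisy), dtype=torch.float32)
target_mask = (C / (N + 1e-8)).clamp(0.0, 1.0)              # fraction of each bin to keep

model = nn.Sequential(nn.Linear(X.shape[1], 256), nn.ReLU(),
                      nn.Linear(256, X.shape[1]), nn.Sigmoid())
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):                                     # real training uses hundreds of hours
    optimizer.zero_grad()
    nn.functional.mse_loss(model(X), target_mask).backward()
    optimizer.step()
```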

For speech recognition, the ground truth is what was said into the microphone. For real-time noise suppression, the ground truth is the speech without noise. By feeding a large enough data set (in this case, hundreds of hours of data), Microsoft can effectively train its model. "It's able to generalize and reduce the noise with my voice even though my voice wasn't part of the training data," Aichner said. "In real time, when I speak, there is noise that the model would be able to extract the clean speech [from] and just send that to the remote person."

Comparing the functionality to speech recognition makes noise suppression sound much more achievable, even though it's happening in real time. So why has it not been done before? Can Microsoft's competitors quickly recreate it? Aichner listed challenges for building real-time noise suppression, including finding representative data sets, building and shrinking the model, and leveraging machine learning expertise.

We already touched on the first challenge: representative data sets. The team spent a lot of time figuring out how to produce sound files that exemplify what happens on a typical call.

They used audiobooks to represent male and female voices, since speech characteristics differ between the two. They used YouTube data sets with labeled data specifying that a recording includes, say, typing and music. Aichner's team then combined the speech data and noise data using a synthesizer script at different signal-to-noise ratios. By amplifying the noise, they could imitate different realistic situations that can happen on a call.
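A minimal sketch of that kind of mixing step might look like the following; the function is an assumption for illustration, not Microsoft's synthesizer script:

```python
# Scale a noise recording so it sits at a chosen signal-to-noise ratio, then mix it
# with the clean speech to produce a (noisy, clean) training pair. Illustrative only.
import numpy as np

def mix_at_snr(clean, noise, snr_db):
    noise = np.resize(noise, clean.shape)                   # loop or trim noise to match length
    p_clean = np.mean(clean ** 2)
    p_noise = np.mean(noise ** 2) + 1e-12
    scale = np.sqrt(p_clean / (p_noise * 10 ** (snr_db / 10.0)))
    return clean + scale * noise

rng = np.random.default_rng(1)
clean = rng.normal(size=16000)                              # stand-in for 1 s of clean speech
noise = rng.normal(size=8000)                               # stand-in for a noise recording
training_pairs = [(mix_at_snr(clean, noise, snr), clean) for snr in (0, 5, 10, 20)]
```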

But audiobooks are drastically different from conference calls. Would that not affect the model, and thus the noise suppression?

"That is a good point," Aichner conceded. "Our team did make some recordings as well to make sure that we are not just training on synthetic data we generate ourselves, but that it also works on actual data. But it's definitely harder to get those real recordings."

Aichner's team is not allowed to look at any customer data. Additionally, Microsoft has strict privacy guidelines internally. "I can't just simply say, 'Now I record every meeting.'"

So the team couldn't use Microsoft Teams calls. Even if they could (say, if some Microsoft employees opted in to have their meetings recorded), someone would still have to mark down exactly when distracting noises occurred.

"And so that's why we right now have some smaller-scale effort of making sure that we collect some of these real recordings with a variety of devices and speakers and so on," said Aichner. "What we then do is we make that part of the test set. So we have a test set which we believe is even more representative of real meetings. And then, we see if we use a certain training set, how well does that do on the test set? So ideally yes, I would love to have a training set which is all Teams recordings and has all types of noises people are listening to. It's just that I can't easily get the same number of the same volume of data that I can by grabbing some other open source data set."

I pushed the point once more: How would an opt-in program to record Microsoft employees using Teams impact the feature?

"You could argue that it gets better," Aichner said. "If you have more representative data, it could get even better. So I think that's a good idea to potentially in the future see if we can improve even further. But I think what we are seeing so far is even with just taking public data, it works really well."

The next challenge is to figure out how to build the neural network, what the model architecture should be, and iterate. The machine learning model went through a lot of tuning. That required a lot of compute. Aichner's team was of course relying on Azure, using many GPUs. Even with all that compute, however, training a large model with a large data set could take multiple days.

"A lot of the machine learning happens in the cloud," Aichner said. "So, for speech recognition for example, you speak into the microphone, that's sent to the cloud. The cloud has huge compute, and then you run these large models to recognize your speech. For us, since it's real-time communication, I need to process every frame. Let's say it's 10 or 20 millisecond frames. I need to now process that within that time, so that I can send that immediately to you. I can't send it to the cloud, wait for some noise suppression, and send it back."
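To give a feel for that constraint, the illustrative loop below chops audio into 20 ms frames and checks that each frame is processed within its real-time budget; the denoising function is a placeholder, not the actual model:

```python
# Each 20 ms frame must be processed on the user's device in well under 20 ms,
# or the call falls behind real time. Illustrative only.
import time
import numpy as np

SAMPLE_RATE, FRAME_MS = 16000, 20
FRAME_LEN = SAMPLE_RATE * FRAME_MS // 1000                  # 320 samples per frame

def denoise_frame(frame):
    return frame                                            # placeholder for on-device inference

audio = np.zeros(SAMPLE_RATE * 2)                           # two seconds of audio as a stand-in
for start in range(0, len(audio) - FRAME_LEN + 1, FRAME_LEN):
    t0 = time.perf_counter()
    _ = denoise_frame(audio[start:start + FRAME_LEN])
    elapsed_ms = (time.perf_counter() - t0) * 1000
    assert elapsed_ms < FRAME_MS, "processing fell behind real time"
```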

For speech recognition, leveraging the cloud may make sense. For real-time noise suppression, it's a nonstarter. Once you have the machine learning model, you then have to shrink it to fit on the client. You need to be able to run it on a typical phone or computer. A machine learning model only for people with high-end machines is useless.

There's another reason why the machine learning model should live on the edge rather than the cloud. Microsoft wants to limit server use. Sometimes, there isn't even a server in the equation to begin with. For one-to-one calls in Microsoft Teams, the call setup goes through a server, but the actual audio and video signal packets are sent directly between the two participants. For group calls or scheduled meetings, there is a server in the picture, but Microsoft minimizes the load on that server. Doing a lot of server processing for each call increases costs, and every additional network hop adds latency. It's more efficient from a cost and latency perspective to do the processing on the edge.

"You want to make sure that you push as much of the compute to the endpoint of the user because there isn't really any cost involved in that. You already have your laptop or your PC or your mobile phone, so now let's do some additional processing. As long as you're not overloading the CPU, that should be fine," Aichner said.

I pointed out there is a cost, especially on devices that aren't plugged in: battery life. "Yeah, battery life, we are obviously paying attention to that too," he said. "We don't want you now to have much lower battery life just because we added some noise suppression. That's definitely another requirement we have when we are shipping. We need to make sure that we are not regressing there."

It's not just regression that the team has to consider, but progression in the future as well. Because we're talking about a machine learning model, the work never ends.

"We are trying to build something which is flexible in the future because we are not going to stop investing in noise suppression after we release the first feature," Aichner said. "We want to make it better and better. Maybe for some noise tests we are not doing as good as we should. We definitely want to have the ability to improve that. The Teams client will be able to download new models and improve the quality over time whenever we think we have something better."

The model itself will clock in at a few megabytes, but it won't affect the size of the client itself. He said, "That's also another requirement we have. When users download the app on the phone or on the desktop or laptop, you want to minimize the download size. You want to help the people get going as fast as possible."

"Adding megabytes to that download just for some model isn't going to fly," Aichner said. "After you install Microsoft Teams, later in the background it will download that model. That's what also allows us to be flexible in the future that we could do even more, have different models."

All the above requires one final component: talent.

"You also need to have the machine learning expertise to know what you want to do with that data," Aichner said. "That's why we created this machine learning team in this intelligent communications group. You need experts to know what they should do with that data. What are the right models? Deep learning has a very broad meaning. There are many different types of models you can create. We have several centers around the world in Microsoft Research, and we have a lot of audio experts there too. We are working very closely with them because they have a lot of expertise in this deep learning space."

The data is open source and can be improved upon. A lot of compute is required, but any company can simply leverage a public cloud, including the leaders: Amazon Web Services, Microsoft Azure, and Google Cloud. So if another company with a video chat tool had the right machine learners, could they pull this off?

"The answer is probably yes, similar to how several companies are getting speech recognition," Aichner said. "They have a speech recognizer where there's also lots of data involved. There's also lots of expertise needed to build a model. So the large companies are doing that."

Aichner believes Microsoft still has a heavy advantage because of its scale. "I think that the value is the data," he said. "What we want to do in the future is, like what you said, have a program where Microsoft employees can give us more than enough real Teams calls so that we have an even better analysis of what our customers are really doing, what problems they are facing, and customize it more towards that."



AI In The Enterprise: Reality Or Myth? – Forbes

Posted: at 6:28 pm

Artificial intelligence (AI) is one of the most talked-about new technologies in the business world today.

It's estimated that enterprise AI usage has increased 270% since 2015. This has coincided with a massive spike in investment, with the enterprise AI industry expected to grow to $6.1 billion by 2022.

Along with the technology's very real ability to transform the job market, exaggerated claims about it have also become common. The hype surrounding this branch of technology has given rise to a number of myths:

Myth No. 1: More Data Is The Key To AI's Success

While it's true that AI needs data in order to learn and operate efficiently, the idea that more data equals better outcomes is misleading. Not all data is created equal.

If the information fed to an AI program is labeled incorrectly or isn't relevant, it poisons the data pool. The more information AI has access to, the more precise its models and predictions will be. If the data itself is of poor quality, the outcome will be precise but not necessarily based on business reality. This can result in poor decision-making.

The truth is that the data fed to an AI solution needs to be curated and analyzed beforehand. Prioritize quality over quantity.
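A toy experiment (not from the article) makes the point concrete: with the amount of training data held constant, flipping a growing share of labels steadily drags down accuracy on a clean test set:

```python
# Demonstrates that label quality, not just volume, drives model quality.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

rng = np.random.default_rng(0)
for noise_rate in (0.0, 0.2, 0.4):
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise_rate            # mislabel a fraction of the pool
    y_noisy[flip] = 1 - y_noisy[flip]
    acc = LogisticRegression(max_iter=1000).fit(X_tr, y_noisy).score(X_te, y_te)
    print(f"label noise {noise_rate:.0%}: test accuracy {acc:.3f}")
```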

Myth No. 2: Companies See Immediate Value From AI Investments

The integration of AI into standard operating procedures doesn't happen overnight. As seen in Myth No. 1, the data the AI uses needs to be curated and checked for relevance beforehand. This may significantly reduce the amount of information the AI has access to.

To obtain truly valuable returns, it's essential to continuously provide relevant data. Like humans, AI solutions need to be given time to learn. There may be a significant lag between when an AI-based initiative begins and when you see a return on investment.

Myth No. 3: AI Will Render Humans Obsolete

The purpose of AI is not to replace all human workers. AI is a tool businesses can use to achieve their goals. It can automate mundane processes and pull interesting insights from large data sets. When used correctly, it augments and aids human decision-making. AI provides recommendations based on trends gleaned from mountains of information. It may even pose new questions that have never been considered. A human still needs to weigh the information provided and make a final decision based on risk analysis.

Pointing out these myths in no way indicates that AI won't deliver on its transformational promise. It's easy to forget that enterprise AI adoption is still in its infancy. Even still, a 2018 Deloitte survey reported that 82% of executives said their AI projects had already led to a positive ROI. Those now implementing AI projects will be the case studies of the near future.

While there are sure to be growing pains, being on the cutting edge of this exciting technology should be beneficial. There's little doubt about how important it will be for the businesses of tomorrow. Getting a head start now, ironing out the wrinkles and locking down efficient processes will pay dividends.



AI streamlines acoustic ID of beluga whales – GCN.com

Posted: at 6:27 pm


Scientists at the National Oceanic and Atmospheric Administration who study endangered beluga whales in Alaska's Cook Inlet used artificial intelligence to reduce the time they spend on analysis by 93%.

Researchers have acoustically monitored beluga whales in the waterway since 2008, but acoustic data analysis is labor-intensive because automated detection tools are "relatively archaic in our field," Manuel Castellote, a NOAA affiliate scientist, told GCN. "By improving the analysis process, we would provide results sooner, and our research would become more efficient."

The analysis typically gets hung up in the process of validating the data because detectors pick up any acoustic signal that is similar to that of a beluga whale's call or whistle. As a result, researchers get many false detections, "including noise from vessel propellers, ice friction and even birds at the surface in shallow areas," Castellote said.

A machine learning model that could distinguish between actual whale calls and other sounds would provide "highly accurate validation output and replace the effort of a human analyst going through thousands of detections to validate the ones corresponding to beluga," he said.

The researchers used Microsoft AI products to develop a model with a deep neural network, a convolutional neural network, a deep residual network, and a densely connected convolutional neural network. The resulting detector, an ensemble of these four AI models, is more accurate than each of the independent models, Castellote said.
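The general shape of such an ensemble is easy to sketch. The snippet below is a generic illustration, not NOAA's published code: four trained classifiers each score a spectrogram clip, the probabilities are averaged, and only confident detections are kept. The predict_proba interface on the model objects is an assumption for the example:

```python
# Generic ensemble validation sketch: average the call/no-call probabilities of
# several trained classifiers and keep only confident detections. Illustrative only.
import numpy as np

def ensemble_validate(clips, models, threshold=0.5):
    """Return a boolean mask of clips the averaged ensemble accepts as beluga calls."""
    probs = np.stack([m.predict_proba(clips) for m in models])   # shape: (n_models, n_clips)
    return probs.mean(axis=0) >= threshold

# Usage, assuming `models` holds the four trained networks and `clips` holds the
# unvalidated detections produced by the semi-automated detector:
# keep = ensemble_validate(clips, models)
```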

Here's how it works: Twice a year, researchers recover acoustic recorders from the seafloor. A semi-automated detector has been extracting the data and processing it, looking for tones in the recordings. It yields thousands, sometimes hundreds of thousands, of detections per dataset.

The team used the collection of recordings with annotated detections -- both actual beluga calls and false positives -- that it has amassed in the past 12 years to train the AI and ML tools.

Now, instead of having a data analyst sit in front of a computer for seven to 14 days to validate all these detections one by one, the unvalidated detection log is used by the ensemble model to check the recordings and validate all the detections in the log in four to five hours, Castellote said. The validated log is then used to generate plots of beluga seasonal presence in each monitored location. These results are useful to inform management decisions.

With the significant time they're saving, researchers can increase the number of recorders they send to the seafloor each season and focus on other aspects of data analysis, such as understanding where belugas feed based on the sounds they make when hunting prey, Castellote said. They can also study human-made noise to identify activity in the area that might harm the whales.

The team is now moving into the second phase of its collaboration with Microsoft, which involves cutting the semi-automated detector out of the process and instead applying ML directly to the sound recordings. The streamlined process will search for signals from raw data, rather than using a detection log to validate pre-detected signals.

"This allows widening the detection process from beluga only to all cetaceans inhabiting Cook Inlet," Castellote said. "Furthermore, it allows incorporating other target signals to be detected and classified, [such as] human-made noise. Once the detection and classification processes are implemented, this approach will allow covering multiple objectives at once in our data analysis."

Castellote's colleague, Erin Moreland, will use AI this spring to monitor other mammals, too, including ice seals and polar bears. A NOAA turboprop airplane outfitted with AI-enabled cameras will fly over the Beaufort Sea scanning and classifying the imagery to produce a population count that will be ready in hours instead of months, according to a Microsoft blog post.

The work is in line with a larger NOAA push for more AI in research. On Feb. 18, the agency finalized the NOAA Artificial Intelligence Strategy. It lists five goals for using AI, including establishing organizational structures and processes to advance AI agencywide, using AI research in support of NOAA's mission and accelerating the transition of AI research to applications.

Castellote said the ensemble deep learning model he's using could easily be applied to other acoustic signal research.

"A code module was built to allow retraining the ensemble," he said. "Thus, any other project focused on different species (and soon human-made noise) can adapt the machine learning model to detect and classify signals of interest in their data."

Specifics about the model are available on GitHub.

About the Author

Stephanie Kanowitz is a freelance writer based in northern Virginia.



How AI will earn your trust – JAXenter

Posted: at 6:27 pm

In the world of applying AI to IT Operations, one of the major enterprise concerns is a lack of trust in the technology. This tends to be an emotional rather than intellectual response. When I evaluate the sources of distrust in relation to IT Ops, I can narrow it down to four specific causes.

The algorithms used in AIOps are fairly complex, even if you are addressing an audience which has a background in computer science. The way in which these algorithms are constructed and deployed is not covered in academia. Modern AI is mathematically intensive and many IT practitioners haven't even seen this kind of mathematics before. The algorithms are outside the knowledge base of today's professional developers and IT operators.


When you analyse the specific types of mathematics used in popular AI-based algorithms, deployed in an IT operations context, the maths is basically intractable. What is going on inside the algorithms cannot be teased out or reverse engineered. The mathematics generates patterns whose sources cannot be determined due to the very nature of the algorithm itself.

For example, an algorithm might tell you a number of CPUs have passed a usage threshold of 90%, which will result in end user response time degrading. Consequently, the implicit instruction is to offload the usage of some servers. When you have this situation, executive decision makers will want to know why the algorithm indicates there is an issue. If you were using an expert system, it could go back and show you all the invoked rules until you reverted back to the original premise. It's almost like doing a logical inference in reverse. The fact that you can trace it backwards lends credibility and validates the conclusion.

What happens in the case of AI is that things get mixed up and switched around, which means links are broken from the conclusion back to the original premise. Even if you have enormous computing power it doesn't help, as the algorithm loses track of its previous steps. You're left with a general description of the algorithm, the start and end data, but no way to link all these things together. You can't run it in reverse. It's intractable. This generates further distrust, which lives on a deeper level. It's not just about not being familiar with the mathematical logic.

Let's look at the way AI has been marketed since its inception in the late 1950s. The general marketing theme has been that AI is trying to create a human mind; when this is translated into a professional context, people view it as a threat to their jobs. This notion has been resented for a long time. Scepticism is rife, but it is often a tactic used to preserve livelihoods.

The way AI has been marketed, as an intellectual goal and a meaningful business endeavour, lends credibility to that concern. This is when scepticism starts to shade into genuine distrust: not only is this a technology that may not work, it is also my personal enemy.

IT Operations, of all the various enterprise disciplines, is always being threatened with cost cutting and role reduction. Therefore, this isn't just paranoia; there's a lot of justification behind the fear.

IT Operations has had a number of bouts with commercialized AI, which first emerged in the final days of the Cold War when a lot of code was repackaged and sold to IT Ops as a plausible use case. Many of the people who are now in senior enterprise positions were among the first wave of people who were excited about AI and what it could achieve. Unfortunately, AI didn't initially deliver on expectations. So for these people, AI is not something new; it's a false promise. Therefore, in many IT Operations circles there is a bad memory of previous hype: a historical reason for scepticism that is unique to the IT Ops world.

These are my four reasons why enterprises don't trust AIOps and AI in general. Despite these four concerns, the use of AI-based algorithms in an IT Operations context is inevitable.

Take your mind back to a very influential Gartner definition of big data in 2001. Gartner came up with the idea of the 3Vs. The 3Vs (volume, variety and velocity) are three defining properties or dimensions of big data. Volume refers to the amount of data, variety refers to the number of types of data and velocity refers to the speed of data processing. At the time the definition was very valuable and made a lot of sense.

The one thing Gartner missed is the issue of dimensionality, i.e. how many attributes a dataset has. Traditional data has maybe four or five attributes. If you have millions of these records, each with a few attributes, you can store them in a database and it is fairly straightforward to search on key values and conduct analytics to obtain answers from the data.

However, when you're dealing with high dimensions and a data item that has a thousand or a million attributes, suddenly your traditional statistical techniques don't work. Your traditional search methods become ungainly. It becomes impossible to formulate a query.

As our systems become more volatile and dynamic, we are unintentionally multiplying data items and attributes, which leads me on to AI. Almost all of the AI techniques developed to date are attempts to handle high dimensional data structures and collapse them into a smaller number of manageable attributes.
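As a small, generic illustration of that collapsing step (not tied to any particular AIOps product), principal component analysis reduces synthetic records with 1,000 attributes down to the handful of underlying factors that actually drive them:

```python
# PCA as an example of collapsing high-dimensional records into a few attributes.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 5))                          # 5 true underlying factors
X = latent @ rng.normal(size=(5, 1000)) + 0.1 * rng.normal(size=(500, 1000))

pca = PCA(n_components=5)
X_small = pca.fit_transform(X)                              # same 500 items, now 5 attributes each
print(X_small.shape, round(pca.explained_variance_ratio_.sum(), 3))
```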

When you go to the leading universities, you're seeing fewer courses on Machine Learning as such, and more Machine Learning topics embedded into courses on high dimensional probability and statistics. What's happening is that Machine Learning per se is starting to resemble practically oriented bootcamps, while the study of AI is now more focused on understanding probability, geometry and statistics in relation to high dimensions.

How did we end up here? The brain uses algorithms to process high dimensional data and reduce it to low dimensional attributes; it then processes these and ends up with a conclusion. This is the path AI has taken. Codify what the brain is doing and you end up realizing that what you're actually doing is high dimensional probability and statistics.


I can see discussions about AI being repositioned around high dimensional data, which will provide a much clearer vision of what we are trying to achieve. In terms of IT operations, there will soon be an acknowledgement that modern IT systems contain not only high volume, high velocity and high variety data, but also high dimensional datasets. In order to cope with this we're going to need high dimensional probability and statistics, and to model the data in high dimensional geometry. This is why AIOps is inevitable.



Surge in Remote Working Leads iManage to Launch Virtual AI University for Companies that Want to Harness the Power of the RAVN AI Engine -…

Posted: at 6:27 pm

CHICAGO, April 09, 2020 (GLOBE NEWSWIRE) -- iManage, the company dedicated to transforming how professionals work, today announced that it has rolled out a virtual Artificial Intelligence University (AIU), as an adjunct to its customer on-site model. With the virtual offering, legal and financial services professionals can actively participate in project-driven, best-practice remote AI workshops that use their own, real-world data to address specific business issues even amidst the disruption caused by the COVID-19 outbreak.

AIU helps clients to quickly and efficiently learn to apply machine learning and rules-based modeling to classify, find, extract and analyze data within contracts and other legal documents for further action, often automating time-consuming manual processes. In addition to delivering increases in speed and accuracy of data search results, AI frees practitioners to focus on other high-value work. Driven both by the need of organizations to reduce operational costs and to adapt to fundamental shifts toward remote work practices, virtual AIU is playing an important role in helping iManage clients continue to work and collaborate productively. The curriculum empowers end users with all the skills they need to quickly ramp up the efficiency and breadth of their AI projects using the iManage RAVN AI engine.

"Participating in AIU was a huge win for us. We immediately saw the impact AI would have in surfacing information we need and allowing us to action it to save time, money and frustration," said Nikki Shaver, Managing Director, Innovation and Knowledge, Paul Hastings. "The workshop gave us deep insight into how to train the algorithm effectively for the best possible effect. And, very quickly, more opportunities came to light as to how AI could augment our business in the longer term," continued Shaver.

"AI is a transformation technology that's continuing to gain momentum in the legal, financial and professional services sectors. But many firms don't yet have the internal knowledge or training to deliver on its promise. iManage is committed to helping firms establish AI Centers of Excellence, not just sell them a kit and walk away," said Nick Thomson, General Manager, iManage RAVN. "We've found the best way to ensure client success is to educate and build up experience inside the firm about how AI works and how to apply it to a broad spectrum of business problems."

Deep Training Delivers Powerful Results

iManage AIU's targeted, hands-on training starts with the fundamentals but delves much deeper, enabling organizations to put the flexibility and speed of the technology to work across myriad scenarios. RAVN easily helps facilitate actions like due diligence, compliance reviews or contract repapering, as well as more sophisticated modeling that taps customized rule development to address more unique use cases.

The advanced combination of machine learning and rules-based extraction capabilities in RAVN make it the most trainable platform on the market. Users can teach the software what to look for, where to find it and then how to analyze it using the RAVN AI engine.

Armed with the tools and training to put AI to work across their data stores and documents, AIU graduates can help their organizations unlock critical knowledge and insights in a repeatable way across the enterprise.

Interactive Curriculum Builds Strong Skillsets

The personalized, interactive course is delivered over three half-day sessions, via video conferencing, to a small team of customer stakeholders. Such teams may include data scientists, knowledge managers, lawyers, partners, contract specialists, and trained legal staff. AIU is also available to firms that are considering integrating the RAVN engine and would like to see AI in action as they assess the potential impact of the solution on their businesses.

Expert iManage AI instructors, with deep technology and legal expertise, work with clients in advance to help identify use cases for the virtual AIU. The iManage team fully explores client use cases prior to the training to facilitate the most effective approach to extraction techniques for client projects.

The daily curriculum includes demonstrations with user data and individual and group exercises to evaluate and deepen user skills. Virtual breakout rooms for project drill down and feedback mechanisms, such as polls and surveys, help solidify learning and make the sessions more interactive. Recordings and transcripts allow customers to revisit AIU sessions at any time.

For more information on iManage virtual AIU or on-site training read our AI blog post or contact us at AIU@imanage.com.

Follow iManage via: Twitter: https://twitter.com/imanageinc LinkedIn: https://www.linkedin.com/company/imanage

About iManage

iManage transforms how professionals in legal, accounting and financial services get work done by combining artificial intelligence, security and risk mitigation with market leading document and email management. iManage automates routine cognitive tasks, provides powerful insights and streamlines how professionals work, while maintaining the highest level of security and governance over critical client and corporate data. Over one million professionals at over 3,500 organizations in over 65 countries, including more than 2,500 law firms and 1,200 corporate legal departments and professional services firms, rely on iManage to deliver great client work securely.

Press Contact:
Anastasia Bullinger
iManage
+1.312.868.8411
press@imanage.com



How AI is helping scientists in the fight against COVID-19, from robots to predicting the future – GeekWire

Posted: at 6:27 pm

Artificial intelligence is helping researchers through different stages of the COVID-19 pandemic. (NIST Illustration / N. Hanacek)

Artificial intelligence is playing a part in each stage of the COVID-19 pandemic, from predicting the spread of the novel coronavirus to powering robots that can replace humans in hospital wards.

That's according to Oren Etzioni, CEO of Seattle's Allen Institute for Artificial Intelligence (AI2) and a University of Washington computer science professor. Etzioni and AI2 senior assistant Nicole DeCario have boiled down AI's role in the current crisis to three immediate applications: processing large amounts of data to find treatments, reducing spread, and treating ill patients.

"AI is playing numerous roles, all of which are important based on where we are in the pandemic cycle," the two told GeekWire in an email. But what if the virus could have been contained?

Canadian health surveillance startup BlueDot was among the first in the world to accurately identify the spread of COVID-19 and its risk, according to CNBC. In late December, the startup's AI software discovered a cluster of unusual pneumonia cases in Wuhan, China, and predicted where the virus might go next.

"Imagine the number of lives that would have been saved if the virus spread was mitigated and the global response was triggered sooner," Etzioni and DeCario said.

Can AI bring researchers closer to a cure?

"One of the best things artificial intelligence can do now is help researchers scour through the data to find potential treatments," the two added.

The COVID-19 Open Research Dataset (CORD-19), an initiative building on Seattle's Allen Institute for Artificial Intelligence (AI2) Semantic Scholar project, uses natural language processing to analyze tens of thousands of scientific research papers at an unprecedented pace.

"Semantic Scholar, the team behind the CORD-19 dataset at AI2, was created on the hypothesis that cures for many ills live buried in scientific literature," Etzioni and DeCario said. "Literature-based discovery has tremendous potential to inform vaccine and treatment development, which is a critical next step in the COVID-19 pandemic."
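As a small sketch of what literature-based search over a corpus like CORD-19 can look like (a generic TF-IDF approach for illustration, not how Semantic Scholar works internally; it assumes the dataset's metadata CSV with title and abstract columns is available locally):

```python
# Rank paper abstracts by similarity to a query using TF-IDF. Illustrative only.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

papers = pd.read_csv("metadata.csv", usecols=["title", "abstract"]).dropna()  # assumed local copy
vectorizer = TfidfVectorizer(stop_words="english", max_features=50000)
matrix = vectorizer.fit_transform(papers["abstract"])

query = vectorizer.transform(["SARS-CoV-2 spike protein neutralizing antibodies"])
scores = cosine_similarity(query, matrix).ravel()
print(papers["title"].iloc[scores.argsort()[::-1][:5]])     # five most similar papers
```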

The White House announced the initiative along with a coalition that includes the Chan Zuckerberg Initiative, Georgetown University's Center for Security and Emerging Technology, Microsoft Research, the National Library of Medicine, and Kaggle, the machine learning and data science community owned by Google.

Within four days of the dataset's release on March 16, it received more than 594,000 views and 183 analyses.

Computer models map out infected cells

Coronaviruses invade cells through spike proteins, but those proteins take on different shapes in different coronaviruses. Understanding the shape of the spike protein in SARS-CoV-2, the virus that causes COVID-19, is crucial to figuring out how to target the virus and develop therapies.

Dozens of research papers related to spike proteins are collected in the CORD-19 Explorer to help people understand existing research efforts.

The University of Washington's Institute for Protein Design mapped out 3D atomic-scale models of the SARS-CoV-2 spike protein that mirror those first discovered in a University of Texas at Austin lab.

The team is now working to create new proteins to neutralize the coronavirus, according to David Baker, director of the Institute for Protein Design. These proteins would have to bind to the spike protein to prevent healthy cells from being infected.

Baker suggests there's a pretty small chance that artificial intelligence approaches will be used for vaccines.

However, he said, "As far as drugs, I think there's more of a chance there."

It has been a few months since COVID-19 first appeared in a seafood-and-live-animal market in Wuhan, China. Now the virus has crossed borders, infecting more than one million people worldwide, and scientists are scrambling to find a vaccine.

"This is one of those times where I wish I had a crystal ball to see the future," Etzioni said of the likelihood of AI bringing researchers closer to a vaccine. "I imagine the vaccine developers are using all tools available to move as quickly as possible. This is, indeed, a race to save lives."

More than 40 organizations are developing a COVID-19 vaccine, including three that have made it to human testing.

Apart from vaccines, several scientists and pharmaceutical companies are partnering to develop therapies to combat the virus. Candidate treatments include the antiviral remdesivir, developed by Gilead Sciences, and the anti-malaria drug hydroxychloroquine.

AI's quest to limit human interaction

Limiting human interaction, in tandem with Washington Gov. Jay Inslee's mandatory stay-at-home order, is one way AI can help fight the pandemic, according to Etzioni and DeCario.

People can order groceries through Alexa without setting foot inside a store. Robots are replacing clinicians in hospitals, helping to disinfect rooms, provide telehealth services, and process and analyze COVID-19 test samples.

Doctors even used a robot to treat the first person diagnosed with COVID-19 in Everett, Wash., according to the Guardian. Dr. George Diaz, the section chief of infectious diseases at Providence Regional Medical Center, told the Guardian he operated the robot while sitting outside the patient's room.

The robot was equipped with a stethoscope to take the patient's vitals, plus a camera and a large video screen through which doctors could communicate with the patient.

Robots are one of many ways hospitals around the world continue to reduce the risk of the virus spreading. AI systems are also helping doctors identify COVID-19 cases from CT scans and X-rays quickly and with high accuracy.
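
That kind of imaging triage is typically built on image classifiers trained with transfer learning. The sketch below is a hypothetical illustration of the general technique, not any hospital's production system; it assumes chest X-rays sorted into per-class folders ("covid" and "normal") under a made-up xray_data/train directory.

```python
# Illustrative sketch: fine-tuning a pretrained CNN to separate
# COVID-19-positive from negative chest X-rays. Assumes images are
# arranged as xray_data/train/covid/*.png and xray_data/train/normal/*.png.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),  # X-rays are single-channel
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("xray_data/train", transform=preprocess)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(pretrained=True)        # start from ImageNet weights
model.fc = nn.Linear(model.fc.in_features, 2)   # two classes: covid / normal

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```

Real deployments add held-out validation, calibration and radiologist review; this is only the skeleton of the approach.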

Bright.md is one of many startups in the Pacific Northwest using AI-powered virtual healthcare software to help physicians treat patients more quickly and efficiently without having them set foot inside an office.

Two Seattle startups, MDmetrix and TransformativeMed, are using their technologies to help hospitals across the nation, including University of Washington Medicine and Harborview Medical Center in Seattle. The companies software helps clinicians better understand how patients ages 20 to 45 respond to certain treatments versus older adults. It also gauges the average time period between person-to-person vs. community spread of the disease.

The Centers for Disease Control and Prevention uses Microsoft's Healthcare Bot service as a self-screening tool for people wondering whether they need treatment for COVID-19.
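
Self-screening bots of this kind essentially walk users through a short questionnaire and map the answers to a recommendation. The toy sketch below illustrates that question-and-triage pattern; it is not Microsoft's bot framework or the CDC's clinical logic, and the symptom lists and advice strings are purely hypothetical.

```python
# Hypothetical rule-based triage sketch, for illustration only.
# The symptom sets and advice strings are made up and are not medical guidance.
EMERGENCY_SIGNS = {"trouble breathing", "persistent chest pain", "bluish lips"}
COMMON_SYMPTOMS = {"fever", "cough", "fatigue", "loss of taste or smell"}

def triage(symptoms, close_contact):
    """Return a coarse recommendation from reported symptoms and exposure."""
    symptoms = {s.lower() for s in symptoms}
    if symptoms & EMERGENCY_SIGNS:
        return "Seek emergency care immediately."
    if symptoms & COMMON_SYMPTOMS or close_contact:
        return "Stay home, monitor symptoms, and contact a healthcare provider."
    return "No action needed now; keep following public health guidance."

print(triage({"fever", "cough"}, close_contact=False))
print(triage({"trouble breathing"}, close_contact=True))
```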

AI raises privacy and ethics concerns amid pandemic

Despite AI's positive role in fighting the pandemic, the privacy and ethical questions it raises cannot be overlooked, according to Etzioni and DeCario.

Bellevue, Wash., residents are asked to report violations of Inslee's stay-at-home order to help clear 911 lines for emergencies, GeekWire reported last month. Bellevue police then track suspected violations on the MyBellevue app, which shows hot spots of activity.

Bellevue is not the first. The U.S. government is using location data from smartphones to help track the spread of COVID-19. However, privacy advocates, like Jennifer Lee of Washington's ACLU, are concerned about the long-term implications of Bellevue's new tool.

Etzioni and DeCario also want people to consider the implications AI has for hospitals. Even though deploying robots to take over hospital wards helps reduce spread, it also displaces staff. Job loss because of automation is already at the forefront of many discussions.

Hear more from Oren Etzioni on this recent episode of the GeekWire Health Tech podcast.

Visit link:

How AI is helping scientists in the fight against COVID-19, from robots to predicting the future - GeekWire

Posted in Ai | Comments Off on How AI is helping scientists in the fight against COVID-19, from robots to predicting the future – GeekWire

Google expands AI calling service Duplex to Australia, Canada, and the UK – The Verge

Posted: at 6:27 pm

Google's automated, artificial intelligence-powered calling service Duplex is now available in more countries, according to a support page updated today. In addition to the US and New Zealand, Duplex is now available in Australia, Canada, and the UK, reports VentureBeat, which discovered newly added phone numbers on the support page that Google says it will use when calling via Duplex from each country.

It isn't a full rollout of the service, however; Google clarified to The Verge that it's using Duplex mainly to reach businesses in those new countries to update business hours for Google Maps and Search.

Indeed, CEO Sundar Pichai outlined this use of Duplex last month, writing in a blog post: "In the coming days, we'll make it possible for businesses to easily mark themselves as temporarily closed using Google My Business. We're also using our artificial intelligence (AI) technology Duplex where possible to contact businesses to confirm their updated business hours, so we can reflect them accurately when people are looking on Search and Maps." It's not clear whether a consumer version of the service will be made available in those countries at a later date.

Duplex launched as an early beta in the US via the Google Assistant back in late 2018, after a splashy yet controversial debut at that year's Google I/O developer conference. There were concerns about the use of Duplex without a restaurant's or other small business's express consent, and without proper disclosure that the automated call was being handled by a digital voice assistant and not a human being.

Google has since tried to address those concerns, with limited success, by adding disclosures at the beginning of calls and giving businesses the option to opt out of being recorded and speak with a human instead. Duplex now has human listeners who annotate the phone calls to improve Duplex's underlying machine learning algorithms and who take over if a call goes awry or the person on the other end chooses not to talk with the AI.

Google has also expanded the service in waves, starting on just Pixel phones before reaching iOS devices and then more Android devices. The service's first international expansion was to New Zealand in October 2019.

Update April 9th, 2:15PM ET: Clarified that the Duplex rollout is to help Google update business hours for Google Maps and Search.

See the article here:

Google expands AI calling service Duplex to Australia, Canada, and the UK - The Verge

Posted in Ai | Comments Off on Google expands AI calling service Duplex to Australia, Canada, and the UK – The Verge