Monthly Archives: March 2017

Cinema Expanding: MSPIFF 2017 To Hold Virtual Reality Showcase – CBS Minnesota / WCCO

Posted: March 7, 2017 at 10:20 pm

March 7, 2017 5:03 PM

MINNEAPOLIS (WCCO) -- Virtual reality is expanding the creative horizons of filmmaking, and that's being reflected in this year's Minneapolis-St. Paul International Film Festival.

The dates for the largest film event in the Midwest were announced Tuesday. Officials with the Film Society of Minneapolis-St. Paul, which puts on the festival, say it will run from April 13-29, bringing 250 films from more than 70 countries to Minnesota audiences.

Part of the program will be a showcase of 11 virtual reality projects, which festival officials describe as the best, most engaging and mind-expanding work in the medium from the last few years.

The Virtual Reality and the Future of Storytelling Exhibition will run from April 21-25 at the St. Anthony Main Theatre, where most of the festival screenings take place. Admission will be free and open to the public.

A sampling of the experiences in the VR showcase includes: the terror of walking a plank atop a skyscraper (Richie's Plank Experience); being accosted by relatives after coming out (Out of Exile: Daniel's Story); and exploring the changing landscape of the Amazon jungle (Under the Canopy). You can read the full list of VR projects here.

Also part of the VR showcase is a panel discussion on the future of the medium.

The panel will feature virtual reality filmmakers discussing how the technology is proving to be an immensely disruptive force in cinema, forcing filmmakers to rethink the possibilities of visual storytelling. The panel discussion is slated for April 22 at the Mill Artists Lofts Performance Hall in Minneapolis.

As for the hundreds of old-fashioned films playing at the festival, the full lineup will be revealed on March 23.

Read the original here:

Cinema Expanding: MSPIFF 2017 To Hold Virtual Reality Showcase - CBS Minnesota / WCCO

Posted in Virtual Reality

Virtual reality has a motion sickness problem – Science News

Posted: at 10:20 pm

Tech evangelists predicted that 2016 would be the year of virtual reality. And in some ways they were right. Several virtual reality headsets finally hit the commercial market, and millions of people bought one. But as people begin immersing themselves in new realities, a growing number of worrisome reports have surfaced: VR systems can make some users sick.

Scientists are just beginning to confirm that these new headsets do indeed cause a form of motion sickness dubbed VR sickness. Headset makers and software developers have worked hard to combat it, but people are still getting sick. Many in the industry fear this will be a major obstacle to mass adoption of virtual reality.

A lot of VR, people today cannot tolerate, says Kay Stanney, a human factors engineer with a focus on VR at Design Interactive in Orlando, Fla. Search for VR sickness on Twitter, she says, and you'll see that people are getting sick every day.

Scientists estimate that 25 to 40 percent of people suffer from motion sickness, depending on the mode of transport, and women are more susceptible than men.

Count me among those women. I'm highly prone to motion sickness. Cars, planes and boats can all make me feel woozy. It can take me a day or more to fully shake the nausea, headache and drowsiness. Certain that virtual reality would also make me sick, I've purposefully avoided strapping on a headset. (Until this assignment came along.)

[Chart: share of women vs. men who got sick playing a VR horror game]

So far, avoiding VR hasn't been much of a loss for me. A lot of the VR industry is focused on video games, vying for a chunk of an estimated $100 billion market. And most of the early adopters who are willing to pay for one of the new premium headsets ($400 for Sony's PlayStation VR, $800 for an Oculus Rift or HTC Vive) are probably serious gamers or technophiles. I don't fit either category.

But avoidance promises to become harder as VR moves beyond games. The technology has already begun creeping into other fields. Car companies, including Audi, General Motors Co. and used-car seller Vroom, are building VR showrooms where you can check out cars as if you were actually on the lot. Architects are using VR to walk clients through buildings that don't yet exist. Schools and learning labs are taking students on virtual field trips to both contemporary and historical sites.

Facebook CEO Mark Zuckerberg sees virtual reality as the next big social platform. In 2014, Facebook bought Oculus VR, maker of the Rift headset, for around $2 billion. This is really a new communication platform, Zuckerberg wrote in the Oculus announcement. Imagine sharing not just moments with your friends online, but entire experiences and adventures. New VR sites where people can socialize or play games together in virtual spaces, like AltspaceVR and Rec Room, are springing up. And some tech luminaries see a future in which VR is integrated into many more aspects of our daily lives, from movies and entertainment to work and health care.

Nobody knows if the broader public will embrace virtual reality. Sales of the expensive high-end headsets have been underwhelming: the three premium systems combined sold an estimated 1.5 million headsets in 2016. But sales of cheaper mobile headsets were more impressive. For less than $100, Samsung Gear VR, Google Daydream View, Google Cardboard and others are powered by your mobile phone. But with smaller screens and less computing power, they are far less capable than the Rift or the Vive. Still, they are selling. In January, Samsung reported that it had sold 5 million of its $99 Gear VR headsets since the device's release in November 2015.

But VR may never really catch on if it makes people sick. And while VR companies and developers are confident that they'll find solutions, many motion sickness experts are pessimistic. My hunch is that [the solutions] are extremely limited, says Steven Rauch, director of the Vestibular Division at Massachusetts Eye and Ear in Boston.

In some ways, the very premise of virtual reality makes it an ideal vehicle for motion sickness.

Motion sickness has probably been with us as long as we've had boats. References to seasickness date back to Greek mythology; the word nausea is derived from the Greek naus, meaning ship. J.A. Irwin introduced the term motion sickness in the scientific literature in 1881. Since then, an extensive body of research has accumulated.

The most widely accepted theory to emerge is that motion sickness is brought on by a mismatch between two or more of the senses that help you keep your balance. For example, when you're below deck on a ship at sea, your eyes see a stationary room. But your vestibular system (the fluid-filled canals and specialized membranes in your inner ear) senses the motion of the ship as it rolls over waves. You're getting conflicting information on different sensory channels into the balance system, Rauch says. That is believed to be the primary cause of motion sickness.

In virtual reality, the mismatch is there as well, says visual neuroscientist Bas Rokers of the University of Wisconsin-Madison. But the sensory cues are reversed: Your eyes see that you are moving through the virtual world (in a virtual car or a virtual spaceship, or strolling down a virtual path), but your vestibular system knows you're not actually moving. That gives you a cue conflict, he says.

While most motion sickness experts think sensory mismatch is to blame, some disagree. Kinesiologist Thomas Stoffregen of the University of Minnesota in Minneapolis, who's been studying motion sickness for 25 years, thinks instability is the culprit. On a ship, the rolling motion puts you off balance, and that makes you sick, he says. Motion sickness situations are ones in which the control of your body is challenged somehow. If you don't rise to that challenge, then the contents of your stomach may rise.

This idea, known as the postural instability theory, can be applied to VR as well, Stoffregen says. If your eyes convince your brain that you're in the virtual world, your body will respond to it instead of the real world you are physically in, which can throw your balance off. Imagine sitting in a chair in the real world while riding in a car in the virtual world. As the car approaches a turn, you'll want to lean into it, which could land you on the floor. The more convincing the virtual world is, the more likely you are to link the control of your body to what you're seeing, Stoffregen says. And in a virtual car, that is a mistake.

While the postural instability theory is outside the scientific mainstream, it offers an explanation for another mystery of motion sickness: why more women suffer than men.

Stoffregen and colleagues have shown repeatedly that it's possible to predict who is likely to get motion sick in various circumstances by measuring postural sway (the small, subconscious movements people make to stay balanced while standing still). By analyzing several aspects of sway, including the distance, direction and timing of the movements, the researchers have found that people who are susceptible to motion sickness sway differently than those who aren't. And postural sway differs measurably between men and women. The difference, Stoffregen says, can be attributed to physical differences between the sexes, such as height and center of balance.
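For readers wondering what "measuring postural sway" amounts to in practice, here is a minimal Python sketch that computes common sway summaries (path length, mean velocity and variability) from center-of-pressure samples. The sampling rate, units and synthetic data are illustrative assumptions, not details taken from Stoffregen's studies.

```python
import numpy as np

def sway_metrics(cop_xy, sample_rate_hz=100):
    """Summarize postural sway from center-of-pressure samples.

    cop_xy: array of shape (n_samples, 2) holding anterior-posterior and
    medial-lateral positions in centimeters.
    """
    cop_xy = np.asarray(cop_xy, dtype=float)
    steps = np.diff(cop_xy, axis=0)                    # movement between samples
    path_length = np.linalg.norm(steps, axis=1).sum()  # total distance traveled
    variability = cop_xy.std(axis=0)                   # spread in each direction
    duration_s = len(cop_xy) / sample_rate_hz
    return {
        "path_length_cm": path_length,
        "mean_velocity_cm_s": path_length / duration_s,
        "std_ap_cm": variability[0],
        "std_ml_cm": variability[1],
    }

# Example: 30 seconds of simulated quiet standing (random-walk stand-in data)
rng = np.random.default_rng(0)
fake_cop = np.cumsum(rng.normal(0, 0.02, size=(3000, 2)), axis=0)
print(sway_metrics(fake_cop))
```

Researchers then compare these summaries between people who later report sickness and those who don't.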

Stoffregen's research suggests women are also more prone to VR sickness than men. In a study published in December in Experimental Brain Research, Stoffregen and colleagues measured the postural sway of 72 college students before they were asked to play one of two VR games for 15 minutes using an Oculus Rift DK2. The first game made two of 18 men and six of 18 women feel motion sick, not enough for a statistically significant difference.

But more than half of the students who played the horror game Affected, using a handheld controller to explore a dark, spooky building, reported feeling sick. Of the 18 women playing that game, 14 felt sick. That's nearly 78 percent, compared with just over 33 percent of the men. When the scientists compared those results against the postural sway data, just as in their previous motion sickness studies, they found a measurable difference in sway between those who got sick and those who didn't (SN: 1/21/17, p. 7).

Rokers has another explanation for the gender difference that fits with the sensory mismatch theory. In a study published in January 2016 in Entertainment Computing, Rokers and colleagues looked at how visual acuity might affect susceptibility to VR sickness. Seventy-three people with either natural or corrected 20/20 vision completed a battery of visual tests and then spent up to 20 minutes in an Oculus Rift DK1 headset watching videos. The videos showed motion from different points of view, such as a drone flying around a bridge or a passenger in a car driving through mild traffic. Of the female participants, 75 percent felt sick enough to stop watching before the 20 minutes had passed, compared with 41 percent of the men.

People who were better at perceiving 3-D motion in the visual tests were more likely to feel sick. And on average, the women in the study performed better on the 3-D motion perception tests than the men.

It's not clear why women would have better visual acuity for 3-D motion, but the results suggest that the more sensitive you are to sensory cues, the more likely you are to detect a mismatch, Rokers says. If you can tell that your senses are providing you different information, then you are more likely to get motion sick.

Just being a woman doesn't necessarily mean you'll be highly susceptible to motion sickness like I am. Lots of other factors are likely at play. Some research suggests Asians are more likely to suffer. People who get migraines are also unusually prone to motion sickness. Scientists at genetic-testing company 23andMe reported in Human Molecular Genetics in 2015 that they had found 35 genetic variants associated with car sickness. Age is also a factor: Infants are generally immune, susceptibility increases from age 2 to 15, and although it hasn't been my experience, the problem subsides for many people in adulthood.

Everybody's brain has a different capacity for processing motion, Rauch says. Just like some people are good with languages and some people are good with math, some people are good with motion processing, with this complex sensory-integration task. The people who are good at it become figure skaters and divers and gymnasts, he says. But there are other people who throw up if they ride backwards on the metro. That would be me.

Under the right circumstances, though, anyone with a functioning vestibular system can experience motion sickness: nearly everyone stranded on a lifeboat in choppy seas will get sick.

Very little motion sickness research has been done on the latest VR headsets available to consumers. But Rauch says the very nature of VR, which is to trick your eyes into telling your brain you're in another world, is inviting a sensory conflict. There's always going to be some sensory conflict, and so the VR is going to be more successful in people who can tolerate that, Rauch says. For me, he was clear: It's always going to be torture.


Some games, like theBlu: Encounter (screenshot shown on first slide) and Job Simulator (middle slide), are unlikely to cause sickness because they require little movement around the virtual world. The dinosaur-hunting game Island 359 (last slide) has a teleport option for more susceptible players.

The U.S. military was the first to report, in 1957, that virtual environments could be problematic: Flight simulators were making some pilots motion sick. Since then, many studies have confirmed that simulator sickness is a real problem.

One of the biggest tech hurdles for VR has been the inherent delay between when you move your head and when the display updates to reflect that movement. If the lag is too great, you can end up with a potentially vomit-inducing sensory mismatch. Today's high-end systems have capitalized on advances in displays, video rendering, motion tracking and computing to cut the lag down to the neighborhood of 20 milliseconds, low enough to avoid triggering motion sickness. They've beaten most of the pure hardware problems, says Steven LaValle, a computer scientist at the University of Illinois at Urbana-Champaign and a former head scientist at Oculus.
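To make the 20-millisecond figure concrete, the back-of-the-envelope budget below adds up the main sources of delay at a 90 Hz refresh rate. The individual stage timings are illustrative assumptions, not measurements of any particular headset.

```python
# Rough motion-to-photon latency budget for a 90 Hz headset.
# The stage timings below are illustrative guesses, not measurements of any
# particular device; the point is that tracking, rendering and display delays
# have to sum to roughly 20 ms or less.
REFRESH_HZ = 90
frame_time_ms = 1000 / REFRESH_HZ        # ~11.1 ms per displayed frame

budget_ms = {
    "tracking_and_sensor_fusion": 2.0,   # reading and filtering IMU/camera data
    "render": 8.0,                       # GPU renders the new viewpoint
    "scanout_and_display": 8.0,          # panel refresh and pixel switching
}

total_ms = sum(budget_ms.values())
print(f"Frame time: {frame_time_ms:.1f} ms")
print(f"Estimated motion-to-photon latency: {total_ms:.1f} ms")
print("Within ~20 ms target:", total_ms <= 20)
```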

But even with the best virtual reality system, what you do in the virtual world matters. If you're sitting or standing in one place in both the real world and the virtual world, you're very unlikely to feel sick. And as long as a step in the real world results in an equivalent step in the virtual world, moving around is fine too. All three of the premium headsets use external lasers to track the motion of the headset within a limited space (up to 3.5 meters by 3.5 meters with the HTC Vive). But to explore further, you'll need to use handheld controllers with buttons, triggers and directional touch pads to move your virtual self around, just as in a regular 2-D video game. That's where things can go wrong.

I like to joke that the controller is like a sickness generator, says LaValle, who worked on reducing motion sickness while at Oculus. Every time you grab onto a controller, you're creating motions that are not corresponding perfectly to the physical world. And when that's being fed into your eyes and ears, then you have trouble.

The people creating the content for VR systems are taking the problem seriously, says Steve Bowler, cofounder of VR game company CloudGate Studio, based outside of Chicago. Developers are really, really focused on zero tolerance for user motion sickness.

On its face, it makes no sense that exposure to motion should bring on disabling nausea and vomiting. But we share this seemingly odd connection between our sense of balance and the gastrointestinal tract with many nonhuman animals, including dogs, monkeys, sheep, birds and even fish. The most often cited explanation is an evolutionary theory put forward by cognitive psychologist Michel Treisman in Science in 1977. Ingesting a poison can also mess with your balance system. So the body interprets the motion reaction as a symptom of poisoning and responds as it would with poison, by vomiting to try to get rid of the harmful substance, he suggested. Although it's just an idea and has never been tested, it has some intuitive appeal.

One of the most successful strategies developers have hit on is using teleportation to take short skips around the virtual world. Basically you aim the controller where you want to go and the screen fades to black for a split second, sort of like the blink of an eye. When it fades back in, you're at the new location. This, Bowler says, eliminates motion sickness even for the most susceptible people he knows. But that comfort comes at a cost: The whole point of VR is to convince you that you're physically in this other world; if you're magically teleporting here and there, it's not going to feel as real, he says.

Bowler favors a technique known as sprint or dash that aims to reduce the effects of acceleration. Instead of gradually ramping your speed up and back down, a sprint bumps you up to speed almost instantaneously, maintains that speed until you reach your target and then drops you quickly back down to a standstill.

While sprinting doesn't approximate natural movement very well, it does let you see the motion, unlike teleportation. And Bowler says he's had about a thousand people at various events try sprinting in a dinosaur-hunting game his group built, called Island 359, with almost no reports of motion sickness. Anyone who feels uncomfortable can switch to chasing dinosaurs using a teleportation option instead.
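The difference between the two locomotion schemes comes down to the velocity profile over time. The sketch below contrasts a smooth ramp-up/ramp-down with the near-instant "sprint" Bowler describes; the speeds, ramp times and travel times are made-up numbers for illustration only.

```python
def smooth_profile(t, top_speed=5.0, ramp=1.0, travel_time=4.0):
    """Gradually ramp speed up, hold it, then ramp back down (m/s)."""
    if t < ramp:
        return top_speed * (t / ramp)
    if t > travel_time - ramp:
        return max(0.0, top_speed * ((travel_time - t) / ramp))
    return top_speed

def sprint_profile(t, top_speed=5.0, travel_time=4.0):
    """Jump almost instantly to top speed, hold it, then stop dead."""
    return top_speed if 0.0 <= t <= travel_time else 0.0

# Sample both profiles: the sprint spends almost no time accelerating,
# and acceleration is the phase most associated with the visual-vestibular mismatch.
for t in [0.0, 0.5, 1.0, 2.0, 3.5, 4.0]:
    print(f"t={t:>3}s  smooth={smooth_profile(t):4.1f} m/s  sprint={sprint_profile(t):4.1f} m/s")
```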

Oculus seems to have accepted that VR sickness can't be eliminated from all VR experiences at the moment, so most Oculus-approved games come with comfort ratings to let users know if a game or experience is more or less likely to make them sick. Those assessments might help people like me avoid the most nauseating games.

Bowler considers himself an ambassador for virtual reality. After almost an hour of very patiently and enthusiastically explaining how VR works, he somehow convinced me to try it. A few days later I was at UploadVR in San Francisco strapping on the HTC Vive with Bowler looking on via Skype from his office in the Chicago suburbs.

The headset was heavy and awkward, but I otherwise felt fine while creating a virtual 3-D painting or walking around on the deck of a shipwreck as an enormous blue whale swam by ogling me. I even shot at drones while dodging virtual bullets, with no hint of motion sickness. I decided I was ready to hunt dinosaurs.

First I tried teleportation mode in Bowler's game, and as he promised, no nausea. Though the splatters of blood and guts when I slashed some attacking mini dinosaurs were almost enough to make me gag, the strangeness of teleportation made me feel more like I was inside a 2-D video game than on a dinosaur-infested island. I decided to see if I could handle sprint mode. I wanted to know if it would feel more real.

That was a mistake. I could only manage about a half dozen sprints before I felt the first hints of nausea. I had to quit. Once the headset was off I felt better. But soon, a lingering nausea and drowsiness hit, like I sometimes experience after a turbulent flight. I didn't entirely recover until the following evening. I'm glad Bowler convinced me to give it a try, and the parts I could handle were pretty fun. But I won't be going back for more anytime soon.

Virtual reality still has lots of room for improvement, but whether it will ever reach the point of being comfortable for everyone is an open question. The VR industry is moving at a pace science can't match, forging ahead with its own grand experiment as millions of users test its products. Much of what we learn about how VR affects people will show up first in living rooms and on Twitter rather than in scientific labs and journals. And though the results of those experiments are still coming in, tech luminaries haven't hesitated to declare 2017 as the real year of virtual reality.

A slew of possible solutions for VR sickness (most with very little research behind them) have been suggested by scientists, developers, companies, entrepreneurs and users.


This article appears in the March 18, 2017, issue of Science News with the headline, "Real sick: The immersive experience of the virtual world is not for everyone."

Excerpt from:

Virtual reality has a motion sickness problem - Science News

Posted in Virtual Reality

MWC 2017: How virtual reality could be the next big thing for healthcare – ComputerWeekly.com

Posted: at 10:20 pm

This time last year, visitors to Mobile World Congress 2016 in Barcelona remarked on the sudden prevalence of virtual reality (VR) tech on many of the stands.


Then, memorably, Samsung brought the technology to global attention when it enlisted Facebook's Mark Zuckerberg for a viral demonstration at the launch of its Galaxy S7 handsets.

Consumer virtual reality is all well and good, but in the 12 months since Samsung's PR stunt, most of the VR headsets that were given away free with new smartphones have gone largely unused, treated as a curiosity for a few weeks before ending up in a cupboard.

More attention is being paid to the idea of augmented reality (AR), which, like its more immersive VR cousin, had a viral moment in the summer of 2016 when millions took to the streets to hunt and collect cute little animals in the hit AR game Pokémon Go.

It would probably be fair to say that VR is walking a long path to widespread acceptance and use, but even if consumers aren't yet doing much with it beyond playing video games, the technology continues to advance at pace, and is finding new use cases in many fields.

Some of the most interesting applications, and perhaps the most relevant to society, are to be found in the field of healthcare.

Once upon a time, Wendy Powell of Portsmouth University worked as a private chiropractor, but she returned to academia to take a degree in computing and IT, which she followed up with a doctorate in creative technologies, for which she studied walking behaviour in VR.

Now senior lecturer in applications of VR at Portsmouth University's School of Creative Technologies, Powell conducts extensive research into the use of VR and interactive technologies for health and well-being, and regularly represents the Institute of Electrical and Electronics Engineers (IEEE) on VR topics.

My key interest is physical rehabilitation and how we can leverage VR tech for physical rehabilitation. There are a wide variety of different applications there, she tells Computer Weekly.

As previously explored during the early stages of her research, a great deal of Powell's work to date has centred on the use of VR for stroke patients, using certain properties of VR, such as the ability to change where people see their hands moving, to help them regain control of their movements.

Stroke patients can also benefit from programmes that help them simulate basic tasks that may have to be relearned after an attack. This could include boiling a kettle safely, with no risk of scalding oneself, says Powell, or relearning how to cross a road in an environment where there is no danger of being struck by a vehicle.

VR is proving to be of similar use in fields such as physiotherapy, where it is being used to make mundane exercises a little more interesting for patients.


If you have to get somebody to do a specific exercise 100 times, it's incredibly boring, and as soon as the patient starts to feel a bit better, they stop doing it, says Powell.

If you can gamify the exercise in VR, where your movements are being tracked and there's feedback from those movements, the patients are more actively engaged.

That's only part of it, she continues. If you go away and do 100 repetitions, you might have done them really badly, but if you're doing it inside VR and using full-body tracking at the same time, you can look at your performance over time.

One of the most interesting areas of research for VR practitioners in the healthcare sector is to help amputees manage their conditions. Statistics show that over 90% of amputees continue to feel their absent limb as if it was still there, a condition known as phantom limb.

People experience these sensations in a number of different ways, such as tingling, itching or twitching, or even trying to make a gesture. However, for many amputees the experience of having a phantom limb is overwhelmingly painful. It is very common for patients to be on very strong doses of medication to manage that.

Using visualisation to reduce the pain is one technique that has gained some traction, but this is quite difficult to do and depends a lot on the ability of the patient to internalise and believe that, for example, a reflected image of a complete limb in a mirror box is their own.

However, researchers are now beginning to understand that there is actually something about VR that reduces pain.

Trials with amputees have shown that by using electromyography (EMG), a diagnostic technique that detects the electrical potential of muscle cells when they are activated, muscle movements made in the amputee's upper arm (for example, in an attempt to control and move the absent forearm) can be rendered in a VR environment.

So if I use the upper arm muscles that clench my fist, the EMG reads the intent to clench the fist, even if there is no fist. We can use that to clench an animated fist, so that when they have the headset on they can see the animation, says Powell.

This is what we call emotive visual feedback. The patient connects the loop back and tells the brain that the hand is okay and they can move it. That seems to be a very powerful tool not just to reduce pain, but to allow the patient to mentally let go of some of the problems of having a missing limb.
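As a rough illustration of the feedback loop Powell describes, the sketch below rectifies and smooths a raw EMG trace and maps the resulting activation onto a 0-1 "hand closure" value that could drive an animated fist each frame. The signal shapes, calibration levels and thresholds are invented for the example and are not drawn from Powell's trials.

```python
import numpy as np

def emg_envelope(raw_emg, window=50):
    """Rectify and smooth a raw EMG trace to get an activation envelope."""
    rectified = np.abs(np.asarray(raw_emg, dtype=float))
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def hand_closure(envelope, rest_level, max_level):
    """Map muscle activation onto 0.0 (open hand) .. 1.0 (clenched fist)."""
    scaled = (envelope - rest_level) / (max_level - rest_level)
    return np.clip(scaled, 0.0, 1.0)

# Simulated signal: resting noise for the first half, then extra activation
# standing in for an attempted clench of the missing fist.
rng = np.random.default_rng(1)
raw = rng.normal(0, 0.1, 2000) + np.where(np.arange(2000) > 1000, 0.8, 0.0) * rng.random(2000)

env = emg_envelope(raw)
closure = hand_closure(env, rest_level=env[:500].mean(), max_level=env.max())

# Each frame, the closure value would drive the animated hand in the headset,
# closing the visual feedback loop the article describes.
print("closure at rest:", round(closure[:500].mean(), 2),
      " closure while clenching:", round(closure[1500:].mean(), 2))
```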

Obviously, amputees cannot spend the rest of their lives in a VR environment, but Powell envisages that in the future, once prescription protocols for VR are properly developed, people may use it a couple of times a day to help them manage their pain without needing to fall back on powerful drugs. This would, however, require extensive clinical trials.

Powell is at pains to point out that there are still many other unknowns when it comes to VR. She compares its development to something like a drug trial being conducted in reverse. We've found a drug that works, which is called VR, she says, but what we don't know yet are the ingredients, or precisely how it's working.

The need to find out exactly how VR reduces pain is an urgent one. Functional magnetic resonance imaging (fMRI) scans of the human brain do indeed show that the brain's pain centres dull their activity when the patient is in VR, Powell explains, but the jury is still out on why this should be the case, or at what level of the brain it is being driven.

In physiotherapy work, the very fact that VR is being used to change how people are behaving means there can be other negative effects. Powell compares it to the early days of the Nintendo Wii games system, when there was a brief fad for exercise games, such as Wii Tennis.

However, players very quickly discovered that they could trick the system to win more easily by swinging from the wrist instead of from the shoulder as a tennis player would. This caused a lot of cases of repetitive strain injury (RSI).

Yes, you can use VR and experiment with it, but you have to be very aware that it changes how people behave and lets them cheat, because patients cheat and VR doesn't stop them from doing that, says Powell.

The other, more publicised problem with VR is that it can make users feel slightly nauseous, which is not ideal when they may already be ill. In the early days of VR, this was largely a hardware issue, with graphics taking too long to render if the user moved their head too quickly. This problem has largely been developed out now, but others have taken its place.

One is what we call accommodation convergence conflict, which sounds terribly complex but really it's that VR is tricking my eyes into thinking that you're sitting over there but the screen I'm looking at is here, so my eyes are focusing on a screen here but trying to interpret you as being further away, says Powell.

That conflict between where my eyes are actually focusing and where they're virtually focusing can cause eye strain as well, so that's one issue we haven't solved yet. Some of that is about good VR design and where you get people to look.
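A small worked example helps show why this conflict arises: vergence demand depends on how far away the virtual object appears, while accommodation is pinned to the headset's fixed focal distance. The focal distance, object distance and interpupillary distance below are illustrative values, not specifications of any particular headset.

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle the two eyes converge by to fixate a point at distance_m."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# Illustrative numbers: many headsets make the eyes focus at roughly 1-2 m,
# regardless of how near or far the virtual object appears to be.
focal_distance_m = 1.5     # where the optics make the eyes accommodate
virtual_object_m = 0.4     # where the object appears, so where the eyes converge

accommodation_demand = vergence_angle_deg(focal_distance_m)
vergence_demand = vergence_angle_deg(virtual_object_m)
print(f"Eyes focus as if the image were {focal_distance_m} m away ({accommodation_demand:.1f} deg)")
print(f"Eyes converge on an object {virtual_object_m} m away ({vergence_demand:.1f} deg)")
print(f"Mismatch: {vergence_demand - accommodation_demand:.1f} deg")
```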

Most of the cyber sickness problems now are either core design or the conflict between what you expect to feel and what you actually feel. So if I'm sitting in a motionless chair and in VR I'm hurtling down a rollercoaster, I'm going to feel sick because I'm not moving and yet everything in my vision is.

As an example, one demo tried at Portsmouth saw Powell being floated around a VR Roman villa environment, an experience she found very nauseating because it gave her the sensation of moving without actually physically moving.

One way to solve this could be to build moving elements into the external environment, such as a vibrating chair like you might find in a fairground flight simulator. Meanwhile, inside the VR villa the users avatar could be being carried around in a litter chair, and both the external and internal stimuli would match up. Of course this would be very expensive, so in practice it is more likely that the problem will be solved through closer attention to VR scenario design.

If designers can match the narrative to their input technique, it's much better, says Powell. If the user is sitting, have the VR narrative have them sitting, or teleport them instantaneously. Don't make them walk as an avatar if they're in a chair.

There's a lot of research being done on this. The IEEE VR conference is doing a great deal of technical underpinning research to look at things like stable horizons and frame of reference; things we can use to reduce nausea.

Nevertheless, Powell has found that, whether it is being used to help elderly people learn exercises to keep active and remain in their own homes; to make sure people with broken bones keep on top of their physio; or to help amputees relieve their pain, VR is being well accepted across the board.

When I was first using VR, there was an inherent bias where I expected elderly people in particular to be very resistant, but they actually often engaged with it very well as long as it was not complex or cumbersome, she says.

That's why mobile VR excites me, because we're using that with people in their 80s and above who put a headset on and there are no wires, there's nothing to worry about catching their hands, it's not uncomfortable, and they just engage with it.

Amputees, particularly those with injuries sustained during the wars in Afghanistan and Iraq, tend to be even more enthusiastic. Young and otherwise fit people don't want a lifetime of taking morphine or other similar drugs to manage their pain; they prefer to be as drug free as possible.

A lot of them will say I will try anything, literally anything that can solve my pain. One of them told me if I could stick it in a fire to get rid of it I would, if I could have it amputated again I would, says Powell.

So when you find something that's not just helpful but is actually quite fun too, people get pretty excited. The pain can be intense. But where you can repurpose something like VR to help, why wouldn't you try it?

Clinicians, too, are increasingly open to the power of VR in healthcare, says Powell, much more than they were in the past, because the technology has advanced to the point where all you need to use it is a smartphone and a headset.

It's another tool for doctors to use, and they want everything they can get to help their patients manage their pain, says Powell.

If they have another tool in their armoury, particularly one they can send patients home with, like mobile VR, that helps with pain management and improves quality of life for their patients, then they will try it.

See more here:

MWC 2017: How virtual reality could be the next big thing for healthcare - ComputerWeekly.com

Posted in Virtual Reality

Amazon’s Latest AI Push: Cybersecurity – Motley Fool

Posted: at 10:20 pm

Amazon.com, Inc. (NASDAQ:AMZN) has been making quite a push into the field of artificial intelligence (AI). You are no doubt familiar with the most public example of this effort -- Alexa, its voice-activated digital assistant that powers the Echo smart speaker and Echo Dot, which were top sellers on Amazon's website over the holidays. Those familiar with Amazon Web Services (AWS), an industry leader in cloud computing, may also be aware of the AI-based tools the company has recently made available to AWS customers: Rekognition for building image recognition apps; Polly for converting text to speech; and Lex for building conversational bots.


What you may not know is that Amazon is adding cybersecurity to its AI resume. TechCrunch is reporting that Amazon has acquired AI-based cybersecurity company Harvest.ai. According to its website, Harvest.ai uses AI-based algorithms to identify the most important documents and intellectual property of a business, then combines user behavior analytics with data loss prevention techniques to protect them from cyber attacks. Harvest.ai already had ties to Amazon as a customer featured in an AWS Startup Spotlight article, which focuses on innovative and disruptive young companies. Harvest.ai boasts former members of the National Security Agency (NSA), Federal Bureau of Investigation (FBI) and Department of Defense (DoD), as well as former employees of Websense and FireEye, Inc.

Harvest.ai's flagship product, MACIE, monitors a company's network in near real-time to identify when a suspicious user accesses unauthorized documents. Its target market was "Fortune 1000 organizations that were migrating to cloud-based platforms." Amazon has a Who's Who of big-name companies as customers, so it seems like a natural fit for the company. If it decides to deploy MACIE to its cloud, it adds to the suite of hosting products available for its customers. Amazon already offers its Amazon Inspector, which it defines as an "automated security assessment service to help improve the security and compliance of applications deployed on AWS." Harvest.ai would take that to the next level.
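The article doesn't detail how MACIE works internally, but user behavior analytics of this kind is often framed as comparing each access event against a learned per-user baseline. The toy sketch below is purely illustrative: the hard-coded profiles, field names and thresholds stand in for what a real system would learn from historical logs, and none of it reflects Harvest.ai's actual product.

```python
# Toy user-behavior baseline: which document classes each user normally touches,
# and how many sensitive documents they access per day. A real system would
# learn these profiles from historical access logs rather than hard-coding them.
baseline = {
    "alice": {"allowed": {"engineering"}, "daily_sensitive_avg": 2},
    "bob":   {"allowed": {"finance"},     "daily_sensitive_avg": 5},
}

def score_event(user, doc_class, sensitive_count_today):
    """Flag access events that deviate from a user's normal behavior."""
    profile = baseline.get(user)
    if profile is None:
        return ["unknown user"]
    alerts = []
    if doc_class not in profile["allowed"]:
        alerts.append(f"accessed unusual document class '{doc_class}'")
    if sensitive_count_today > 3 * profile["daily_sensitive_avg"]:
        alerts.append("sensitive-document volume far above baseline")
    return alerts or ["ok"]

print(score_event("alice", "finance", sensitive_count_today=12))  # two alerts
print(score_event("bob", "finance", sensitive_count_today=4))     # ok
```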


The use of AI in cybersecurity isn't new. MIT has been experimenting with a novel approach to its application: by pairing a system with a human counterpart and applying supervised learning, the system was able to detect 85% of threats. Over time, that success rate is sure to improve. Last year, IBM (NYSE:IBM) announced an initiative to train its AI-based Watson in security protocols, in what was to be a year-long research project. By the end of the year, the company expanded the beta program to include 40 clients across a variety of industries. Earlier this month, IBM announced that Watson for Cyber Security would be available to customers.

The task of cybersecurity seems ideally suited to AI applications. The ability to digest vast amounts of data in a short time and match real-time situations against a set of specified criteria seems tailor-made for the platform. Add to this AI's ability to learn over time, and it seems inevitable that there would be a merging of these technologies.

These acquisitions, combined with Amazon's own research, make it one of several companies on the cutting edge of AI. Amazon has been applying the knowledge it gains across a wide swath of its business, from consumer-facing products to its business-centric applications. Amazon's investors are sure to benefit from this multi-pronged approach.

Danny Vena owns shares of Amazon. The Motley Fool owns shares of and recommends Amazon and FireEye. The Motley Fool has a disclosure policy.

Here is the original post:

Amazon's Latest AI Push: Cybersecurity - Motley Fool

Posted in Ai

AI-Powered Customer Service Needs The Human Touch – Huffington Post

Posted: at 10:20 pm

Artificial intelligence, with the human touch, is building a new customer experience

Artificial Intelligence is the definitive technology of the 21st century. All businesses of every size, in every industry, will be impacted by AI. In the age of the connected customer, when 1 in 5 U.S. adults are almost never offline, customer experience is the battleground for true differentiation. Today, every successful consumer application is powered by AI. Tomorrow, every successful business will be powered by AI. The line of business most likely to embrace AI first is customer service, typically the most process-oriented and technology-savvy organization within most companies. But before we dig into AI's tremendous potential in transforming customer service, let's scope AI's market size and growth projections.

The Artificial Intelligence (AI) Market Size and Future Projections

Today, 38% of enterprises are already using artificial intelligence (AI), growing to 62% by 2018. Forrester is predicting a 300% increase in AI investments in 2017 compared to 2016, and IDC believes AI will be a $47 billion market by 2020. Forrester lists the top 10 AI technologies here.


Gartner named AI and Advanced Machine Learning, Intelligent Apps, and Intelligent Things as 3 of its top 10 strategic technology trends for 2017.

The disruptive power of AI will impact every business, in every industry. According to Gartner, by 2020, 20% of companies will dedicate workers to monitor and guide neural networks. Gartner advises CIOs to look at areas of the company that have large data sets but lack analytics. AI can provide augmented intelligence with respect to discovery, predictions, recommendations and automation at scale.

PWC named Artificial Intelligence (AI) as one of the eight essential technologies in business. Today, there are 1,652 artificial intelligence (AI) startups and private companies that have captured over $12.24 billion of funding.


The power of artificial intelligence is mass personalization and contextual intelligence at scale. According to Accenture's 2017 Technology Vision Report, AI could double annual economic growth rates by 2035. Accenture also notes that AI is the new UI. AI is becoming the new user interface (UI), underpinning the way we transact and interact with systems. Seventy-nine percent of business leaders agree that AI will revolutionize the way they gain information from and interact with customers. As AI takes over more of the user experience, it grows beyond just an intelligent interface. With each customer interaction becoming more personalized, powerful, and natural, AI moves into an even more prominent position: your digital spokesperson, according to Accenture Technology Vision 2017.

AI Implementation Realities in the Enterprise

Although Amazon's Alexa has evolved from 1,000 voice command interpretations (or skills) to more than 10,000 skills in a one-year period, there is still a lot of AI progress to be made before machines can truly understand and guide next best actions.

Robots and AI will replace 7% of US jobs by 2025, according to Forrester.

The future-of-work projections and AI's impact on jobs may appear aggressive and somewhat unrealistic. In order to better understand the realities of AI in business, it is important to define the phases and prerequisites of AI deployments in large businesses.

The three key phases of enterprise AI rollout are data, algorithms, and workflows. The usefulness of AI is a function of the quality and quantity of data. Algorithms will help deliver insights: discovery, prediction, recommendation and automation of existing manual processes all require strong, self-learning and adaptable algorithms. The final phase, and the most challenging, is the workflows. The constant iteration of analyzing data, researching and developing algorithms, and creating timely actions based on gleaned insights through robust workflows is the job of data scientists and line-of-business experts. Workflows that guide customer engagements must not be automated to a point where businesses lose sight of the importance of the human touch, empathy and relationship building aimed at earning the right to be a trusted advisor and strategic business partner.

For most companies, the algorithm and workflow complexities will require the use of augmented intelligence. This is especially true for complex customer relationship management workflows in B2B customer service functions. That said, there is exponential growth in AI innovation and advancement, and companies cannot afford to tag AI as hype, only to find themselves significantly behind their competitors in 1-2 years. AI knowledge, planning and adoption must happen now.

The Role of AI in Customer Service

Today machines have the ability to interact with humans at a level that used to seem possible only in sci-fi movies. Amazon serves up personalized product recommendations, Facebook automatically tags photos and Google Maps proactively reroutes you around traffic. AI is powering nearly every experience we have, making it smarter, seamless and personalized, and as a result our expectations as consumers are at an all-time high. The most indispensable consumer apps are powered by AI technologies, delivering real personalized value in real time. This seamless, personalized, immediate and intelligent user experience will make its way to every business, across all industries. AI in business will create motion and flow-based solutions and services. In order for customer service leaders to stay relevant, they must think differently and educate their stakeholders about AI. AI allows companies to deliver the smarter, more personalized and predictive experiences that customers have come to expect, but the human touch is still table stakes for customer success. The most suitable line of business to start with AI? Customer service.

According to Salesforce research, 92% of senior executives believe that customer experience is a key competitive differentiator, and they view customer service as the primary vehicle for improving the customer experience. In order for customer service organizations to lead customer experience transformation, they must fully embrace, deploy and utilize AI technologies.

What does excellent customer service look like? According to research, excellent customer service is personalized, always on and real-time, consistent and omni-channel. To achieve customer service excellence, service organizations must leverage AI to bolster their discovery, prediction, recommendation and automation engines.


In the age of the customer, contact channels are expanding rapidly, and the amount of data created, both structured and unstructured, means that service organizations are drowning in data but starving for actionable insights.


Forrester identifies extended and enhanced self-service, powered by AI technologies as one of its top trends for customer service in 2017. Customer service will continue to invest in structured knowledge management and leverage communities to extend the reach of curated content. Service will become more ubiquitous, via speech interfaces, devices with embedded knowledge, and wearables for service technicians, said Kate Leggett. The second top trend is sustained customer conversations using natural language processing technology. Companies will continue to explore the power of intelligent agents to add conversational interfaces to static self-service content. They will anticipate needs by context, preferences, and prior queries and will deliver proactive alerts, relevant offers, or content, said Leggett.

Top service teams are 3.9X more likely than underperforming service organizations to say predictive intelligence will have a transformational impact on their customer service by 2020. The common theme I hear as I collaborate with business leaders is that AI's biggest potential is to augment our ability to connect with customers, giving way to a smarter customer experience.


With projections of 6 billion smartphone users and over 50 billion connected devices by 2020, the next-generation customer experience will be powered by artificial intelligence. A CRM platform powered by AI will analyze customer engagements, automatically predict sentiment and adjust customer journeys to ensure an optimal user experience. The same logic applied to predictive marketing lead scores and sales opportunity conversions will be applied to customer service cases, optimizing time-to-resolution cycles and improving customer satisfaction and net promoter scores (NPS).

Service organizations can significantly improve the customer experience and bolster service delivery capacity using AI technologies. Service managers using AI can gain real-time insights across all customer contact channels with AI-powered analytics to increase team productivity and CSAT. By using smart data discovery, service managers can optimize agent availability, wait times and opportunities for proactive service delivery. Using machine learning, cases are automatically escalated and classified using sensitivity and domain-expertise predictive analytics. AI-powered chatbots can deliver knowledge using automated workflows. Field service professionals can use mobile apps powered by AI, delivering precision service based on access to CRM data that can deliver personalized services anywhere. AI-powered field apps use algorithms to optimize scheduling and routing using complete CRM data.
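As a concrete (and deliberately simplified) example of the kind of case classification described above, the sketch below trains a tiny text classifier to estimate whether a new support case should be escalated. It uses generic scikit-learn components and invented training data; it is not Salesforce Einstein or any vendor's actual model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: case descriptions labeled by whether they
# were eventually escalated. A production system would train on thousands of
# historical cases pulled from the CRM, not a handful of strings.
cases = [
    "cannot log in to account after password reset",
    "billing charged twice this month, need refund urgently",
    "how do I export my contact list",
    "service completely down for all users since this morning",
    "question about upgrading to the premium plan",
    "data loss after last sync, customer threatening to cancel",
]
escalated = [0, 1, 0, 1, 0, 1]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(cases, escalated)

# Score a new incoming case; a workflow rule could route anything above a
# chosen probability threshold straight to a senior agent.
new_case = "entire site is unreachable and customers are complaining"
print("escalation probability:", round(model.predict_proba([new_case])[0, 1], 2))
```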


The top trends for CRM in 2017 include intelligence powering prescriptive advice, according to Forrester.

The advantage of a CRM platform powered by artificial intelligence goes far beyond just the service organization. At Salesforce, the artificial intelligence technology, called Einstein, is infused across all the Salesforce clouds, giving the more than 150,000 companies that use Salesforce seamless access to AI capabilities across their sales, service, marketing, IT and community organizations. Marketers using AI have seen an average 25% lift in click-throughs and opens. Sales professionals using AI predictive lead scoring have seen a 300% increase in lead-to-opportunity conversions. Commerce teams using AI have seen a 7-15% increase in revenue per site visitor.

Artificial Intelligence is the definitive technology of the 21st century, and companies that use AI as augmented intelligence to make better-informed and faster decisions will win in the age of the customer, where personalization, immediacy and intelligence are the new currencies of growing businesses. But to sustain growth and earn customers' trust, businesses have to use common sense, care more and be cautious of over-automating. Businesses must practice empathy, inside and outside of the company, and deliver on their promises.

More:

AI-Powered Customer Service Needs The Human Touch - Huffington Post

Posted in Ai

Airbnb pledges not to replace human community with AI – TechCrunch

Posted: at 10:20 pm


Airbnb wants to mold its hosts into a powerful organizing force, akin to a union, to advocate on its behalf with local governments around the world and to serve as an ideological rebuke to the advances of AI at other tech firms. As part of that effort ...

Read more:

Airbnb pledges not to replace human community with AI - TechCrunch

Posted in Ai

IBM, Salesforce Strike Global Partnership on Cloud, AI – Fortune

Posted: at 10:20 pm

Are two clouds better than one?

How about two nerdily-named artificial intelligence platforms?

According to IBM and Salesforce, the answer to both of those questions is yes.

The two Fortune 500 companies on Monday afternoon revealed a sweeping global strategic partnership that aligns one iconic company's multiyear turnaround effort with another's staggering growth ambitions. According to the terms of the deal, IBM and Salesforce will integrate artificial intelligence platforms (Watson and Einstein, respectively) and some of their software and services (e.g. a Salesforce component to ingest The Weather Company's meteorological data). IBM will also deploy Salesforce Service Cloud internally in a sign of goodwill.

Why not go it alone? Fortune spoke on the phone with IBM CEO Ginni Rometty and Salesforce CEO Marc Benioff to get a better understanding of the motives behind the deal. What follows is a transcript of that conversation, edited and condensed.

Fortune: Hi, guys. So what's this all about?

Benioff: It's great to connect with you again. Artificial intelligence is really accelerating our customers' success and they're finding tremendous value in this new technology. The spring release of Salesforce Einstein has opened our eyes to what's possible. We now have thousands of customers who have deployed this next-generation artificial intelligence capability. I'll tell you, specifically with our Sales Cloud customers, it creates this incredible level of productivity. Sales executives are way more productive than ever before; the ability to do everything from lead scoring to opportunity insights really opened my eyes that this is possible. So the more value in artificial intelligence we can provide our customers, the more successful they'll be, which is why we're doing this relationship with IBM.

We're able to give our customers the incredible capabilities of not only Einstein but Watson. When you look at the industries we cater to (retail, financial services, healthcare), the data and insights that Watson can provide our customers is really incredible. And we're also thrilled that IBM has agreed to use Salesforce products internally as well. This is really taking our relationship to a whole new level.

Rometty: Andrew, thank you for taking the time. This announcement is both strategic and significant. I do think it's really going to take AI further into the enterprise. I think about 2017 as the year when we're going to see AI hit the world at scale. It's the beginning of an era that's going to run a decade or decades in front of us. Marc's got thousands of clients; by the end of this year we'll have a billion people touched by Watson. We both share that vision. An important part of it is the idea that every professional can be aided by this kind of technology. It takes all the noise and turns it into something on which they can take action. It isn't just a sales process; we're going to link other processes across a company. We're talking about being able to augment what everyone does: augment human intelligence. Together, this will give us the most complete understanding of a customer anywhere.

For our joint customers, to me, this is a really big deal. Take an insurance company; Marc's got plenty of them as clients. You link to insights around weather, hook that into a particular region, tell people to move their cars inside because of hail. You might even change a policy. These two things together do really allow clients to be differentiated.

This is the beginning of a journey together.

I thought this was the brainiest deal I've ever heard of, with Watson and Einstein together.

Rometty: It's good comedy.

Like any two large tech companies, you compete in areas and collaborate in others: frenemies. Why did you engage in this partnership? Any executive asks themselves: build, buy, or partner. Why partner this time?

Benioff: I'll give you my honest answer here, which is that I've always been a huge fan of IBM; Ginni knows that. When I look at pioneering values in business (companies that have done it right and really stuck to their principles over generations), I really look to IBM as a company that has deeply inspired me personally as I built Salesforce over the last 18 years. We're going to be 18 years old on March 8th. When I look at what we've gone through in the last two decades, I really think that it's our values that have guided us and how those values have been inspired by many of the things at IBM.

Number two is, Ginni made a strategic decision to acquire Bluewolf, which is a company that we had worked very hard to nurture and incubate over a very long period of time. It really demonstrated to me that the opportunity to form a strategic relationship with IBM was possible. We both have this incredible vision for artificial intelligence but we're coming at it from very different areas. [Salesforce is] coming at it from a declarative standpoint, expressed through our platform, for our customer relationship management system. IBM's approach, which is pioneering, especially when it comes to key verticals like retail or finance or healthcare: these are complements. These are the best of both worlds for artificial intelligence. These are the two best players coming together. We have almost no overlap in our businesses. We have really a desire to make our customers more successful.

Rometty: Beautifully said. And I'll only add a couple of points. Not only sharing values as companies but in terms of how we look at our customers. We share over 5,000 joint clients. But more importantly, think about this era of AI. There are different approaches you can take. What Marc's done with Einstein: think of it as CRM as a process. What we've done with Watson: think of it as an industry in depth. We do have very little overlap. Why we talk about Watson as a platform is to be integrated with things like what Marc's doing.

Let me ask you about AI. It's been in development for decades, but the current wave is nascent. How do you each see AI as part of the success of your companies? It's a capability; no one goes to the store to buy AI. Hopefully it solves their problems. But AI can be anything.

Rometty: I view AI as fundamental to IBM. Watson on the IBM cloud: that's a fundamental and enduring platform. We've built platforms for ourselves before: mainframe, middleware, managed services. This is now the era of AI. It will be a silver thread through all of what IBM does.

Is it fair to say that you guys aren't trying to compete on AI? I don't mean between you; I mean within the greater industry.

Rometty: We're absolutely complementary. Clients will make some architectural decisions here. Everyone's gonna pick some platforms to use. They will pick them around AI. By the way, there are stages: the most basic is machine learning, then AI, then cognitive [computing]. What we're doing with Marc goes all the way into cognitive. Just to be clear.

Benioff: I could not agree more. We brought our customers into the cloud, then into the social world, then into the mobile world. Now we're bringing them into the AI world.

This is really beyond my wildest dreams in terms of what's possible today. And by the way, that we're able to replace Microsoft's products [at IBM] is a bonus for us. (laughs)

Read more here:

IBM, Salesforce Strike Global Partnership on Cloud, AI - Fortune

Posted in Ai

RISE OF THE MACHINES: AI computers learn to code THEMSELVES in major development – Express.co.uk

Posted: at 10:20 pm

Microsoft and Cambridge University have teamed up to create AI software which has the ability to write code for itself.

The sophisticated system, known as DeepCoder, is able to solve programming problems by piecing together code borrowed from existing programs.

The research paper from the two institutions says that the development is a huge step towards powerful AI and will also make it much easier for people to develop programs.

The paper reads: "We have found several problems in real online programming challenges that can be solved with a program in our language."

"A dream of artificial intelligence is to build systems that can write computer programs.

Coding has been described as one of the most important skills of the future, and a recent survey from job markets firm Burning Glass found that as many as seven million job openings in 2015 required some form of coding skills.

But with AI now having the ability to code itself, it could put many budding coders out of work.

A recent report from the United Nations (UN) revealed AI is set to displace millions of workers across the globe as scientists storm towards making machines with human-level intelligence.

While many firms will welcome the news of free labour that will be more efficient than humans, it will leave many people worried about their economic future.

The report from the UN warns that people in the developing world are most at risk of losing their jobs to disruptive technologies, and the study states that the process is already in full swing.

See original here:

RISE OF THE MACHINES: AI computers learn to code THEMSELVES in major development - Express.co.uk

Posted in Ai | Comments Off on RISE OF THE MACHINES: AI computers learn to code THEMSELVES in major development – Express.co.uk

What Would an AI Doomsday Actually Look Like? – Futurism

Posted: at 10:20 pm

Imagining AI's Doomsday

Artificial intelligence (AI) is going to transform the world, but whether it will be a force for good or evil is still subject to debate. To that end, a team of experts gathered for Arizona State University's (ASU) "Envisioning and Addressing Adverse AI Outcomes" workshop to talk about the worst-case scenarios we could face if AI veers towards becoming a serious threat to humanity.

"There is huge potential for AI to transform so many aspects of our society in so many ways. At the same time, there are rough edges and potential downsides, like any technology," says AI scientist Eric Horvitz.

As an optimistic supporter of everything AI has to offer, Horvitz has a very positive outlook on the future of AI. But he's also pragmatic enough to recognize that for the technology to keep advancing, it has to earn the trust of the public. For that to happen, all possible concerns surrounding the technology have to be discussed.

That conversation, specifically, was what the workshop hoped to tackle. Forty scientists, cyber-security experts, and policy-makers were divided into two teams to hash out the numerous ways AI could cause trouble for the world. The red team was tasked with imagining all the cataclysmic scenarios AI could incite, and the blue team was asked to devise solutions to defend against such attacks.

These situations had to be realistic rather than purely hypothetical, anchored in what's possible given our current technology and what we expect to come from AI over the next few decades.

Among the scenarios described were automated cyber attacks (wherein a cyber weapon is intelligent enough to hide itself after an attack and prevent all efforts to destroy it), stock markets being manipulated by machines, self-driving technology failing to recognize critical road signs, and AI being used to rig or sway elections.

Not all scenarios were given sufficient solutions either, illustrating just how unprepared we are at present to face the worst possible situations that AI could bring. For example, in the case of intelligent, automated cyber attacks, it would apparently be quite easy for attackers to use unsuspecting internet gamers to cover their tracks, using something like an online game to obscure the attacks themselves.

As entertaining as it may be to think up all of these wild doomsday scenarios, it's actually a deliberate first step towards real conversations and awareness about the threat that AI could pose. John Launchbury, from the US Defense Advanced Research Projects Agency (DARPA), hopes it will lead to concrete agreements on rules of engagement for cyber war, automated weapons, and robot troops.

The purpose of the workshop, after all, isn't to incite fear, but to realistically anticipate the various ways the technology could be misused and, hopefully, to get a head start on defending ourselves against it.

Here is the original post:

What Would an AI Doomsday Actually Look Like? - Futurism

Posted in Ai | Comments Off on What Would an AI Doomsday Actually Look Like? – Futurism

Why AI will determine the future of fintech – TNW

Posted: at 10:20 pm

More investors are setting their sights on the financial technology (Fintech) arena. According to consulting firm Accenture, investment in Fintech firms rose by 10 percent worldwide to the tune of $23.2 billion in 2016.

China is leading the charge after securing $10 billion in investments across 55 deals, which account for 90 percent of investments in Asia-Pacific. The US came second, taking in $6.2 billion in funding. Europe also saw an 11 percent increase in deals, despite Britain seeing a decrease in funding due to the uncertainty from the Brexit vote.

The excitement stems from the disruption of traditional financial institutions (FIs) such as banks, insurers, and credit companies by technology. The next unicorn might be among the hundreds of tech startups that are giving Fintech a go.

What exactly is going to be the next big thing has yet to be determined, but other developments in computing like artificial intelligence (AI) may play a huge part.

The growing reality is that, while opportunities abound, competition is also heating up.

Take, for example, the number of Fintech startups that aim to digitize routine financial tasks like payments. In the US, the digital wallet and payments segment is fiercely competitive. Pioneers like PayPal see themselves being taken on by other tech giants like Google and Apple, by niche-oriented ventures like Venmo, and even by traditional FIs.

Some ventures are seeing bluer oceans by focusing on local and regional markets where conditions are somewhat favorable.

The growth of China's Fintech was largely made possible by the relative age of its current banking system. It was easier for people to use mobile and web-based financial services such as Alibaba's Ant Financial and Tencent since phones were more pervasive and more convenient to access than traditional financial instruments.

In Europe, the new Payment Services Directive (PSD2), set to take effect in 2018, has busted the game wide open. Banks are obligated to open up their application program interfaces (APIs), enabling Fintech apps and services to tap into users' bank accounts. The line between banks and fintech companies is set to blur, so just about everyone in finance is set to compete with old and new players alike.
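
As a rough illustration of what that opening-up means in practice, the sketch below shows a third-party app reading account data through a PSD2-style open banking API, assuming the user has already granted consent. The endpoint, token, and response fields are invented placeholders; each bank publishes its own real API and consent flow.

```python
# Hypothetical sketch of a third-party app reading account data through a
# PSD2-style open banking API. The endpoint, token, and JSON fields below are
# invented placeholders; every bank exposes its own API and consent flow.

import requests

BASE_URL = "https://api.examplebank.test/open-banking/v1"   # placeholder host
ACCESS_TOKEN = "user-consented-oauth-token"                 # obtained via the bank's OAuth consent flow

def list_accounts():
    """Fetch the accounts the user has consented to share with this app."""
    resp = requests.get(
        f"{BASE_URL}/accounts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("accounts", [])

if __name__ == "__main__":
    for account in list_accounts():
        print(account.get("iban"), account.get("balance"))
```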

Convenience has become such a fundamental selling point that a number of Fintech ventures have zeroed in on delivering better user experiences for an assortment of financial tasks such as payments, budgeting, banking, and even loan applications.

There is a mad scramble among companies to leverage cutting-edge technologies for competitive advantage. Even established tech companies like e-commerce giant Amazon had to give due attention to mobile as users shift their computing habits towards phones and tablets. Enterprises are also working on transitioning to cloud computing for infrastructure.

But where do more advanced technologies such as AI come in?

The drive to eliminate human fallibility has also pushed artificial intelligence (AI) to the forefront of research and development. Its applications range from sorting what gets shown on your social media newsfeed to self-driving cars. It's also expected to have a major impact in Fintech because of the game-changing insights that can be derived from the sheer volume of data humanity is generating. Enterprising ventures are banking on it to expose gaps in a market where openings have become increasingly small due to competition.

AI and finance are no strangers to each other. Traditional banking and finance have relied heavily on algorithms for automation and analysis. However, these were available only to large, established institutions. Fintech aims at empowering smaller organizations and consumers, and AI is expected to make its benefits accessible to a wider audience.

AI has a wide variety of consumer-level applications for smarter and less error-prone user experiences. Personal finance applications are now using AI to balance people's budgets based specifically on a user's behavior. AI now also serves as a robo-advisor to casual traders, guiding them in managing their stock portfolios.
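
As an illustration of the kind of model such a budgeting feature might rest on, the sketch below trains a tiny text classifier that assigns transaction descriptions to budget categories. The training data is a toy assumption; real products use far larger datasets, richer features, and per-user behaviour signals.

```python
# Illustrative sketch of how a budgeting app might categorize transactions
# from their descriptions. The labeled examples below are toy assumptions.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

descriptions = [
    "STARBUCKS COFFEE 1234", "WHOLE FOODS MARKET", "SHELL GAS STATION",
    "NETFLIX.COM", "TRADER JOES", "UBER TRIP", "SPOTIFY PREMIUM",
]
categories = [
    "dining", "groceries", "transport",
    "entertainment", "groceries", "transport", "entertainment",
]

# Bag-of-words features plus a naive Bayes classifier, fitted in one pipeline.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(descriptions, categories)

# Categorize new transactions so the app can track spending per budget line.
print(model.predict(["SHELL OIL 5678", "STARBUCKS RESERVE"]))
```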

For enterprises, AI is expected to continue serving functions such as business intelligence and predictive analytics. Merchant services such as payments and fraud detection are also relying on AI to seek out patterns in customer behavior in order to weed out bad transactions.
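
One common way to implement that kind of fraud screening is anomaly detection over transaction features, sketched below with an isolation forest. The features (amount and hour of day) and the contamination rate are illustrative assumptions rather than any particular provider's setup.

```python
# Sketch of anomaly detection for flagging unusual transactions, one common
# approach to fraud screening. Features and thresholds are illustrative only.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated history of normal card activity: [amount_usd, hour_of_day].
normal = np.column_stack([
    rng.gamma(shape=2.0, scale=20.0, size=500),   # typical small purchases
    rng.integers(8, 22, size=500),                # daytime hours
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# Score new transactions: -1 means "looks anomalous", 1 means "looks normal".
new_transactions = np.array([[35.0, 13], [4200.0, 3]])
print(detector.predict(new_transactions))
```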

People may soon have very little excuse for not having a handle on their money because of these services.

Despite the exciting potential AI brings, there are still caveats. A big challenge for Fintech is to develop AI to be as smart as it can be. There will be no shortage of people who will try to game and outwit such systems.

While AI seeks to eliminate human error, the flipside, losing the human touch, is a common criticism of AI. Smart money decisions are best made through numbers and logic. However, people do have an emotional connection with their money, so it will be a challenge for Fintech apps to create experiences that do not alienate their users. Take the sad stories of insurance claims being denied due to strict algorithms that disregard the nuances of the human condition. AI still has a way to go in factoring what is just and moral into its decision-making.

As for finance as a field and industry, there is also the issue of financial analysts, advisors, bankers, and traders being threatened with obsolescence by AI. A running joke about AI alludes to the Terminator movie franchise, where AI seeks to eliminate humanity from existence. Unemployment, however, is rarely a laughing matter.

With the stiff competition in Fintech, ventures have to deliver truly valuable products and services in order to stand out. The venture that provides the best user experience often wins, but finding this X factor has become increasingly challenging.

The developments in AI may provide that something extra, especially if they can promise to take the guesswork and human error out of finance. It's for these reasons that AI might just hold the key to further Fintech innovations.

The rest is here:

Why AI will determine the future of fintech - TNW

Posted in Ai | Comments Off on Why AI will determine the future of fintech – TNW