{"id":231580,"date":"2017-08-01T06:54:33","date_gmt":"2017-08-01T10:54:33","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/siggraph-neurable-lets-you-control-a-virtual-world-with-your-uploadvr.php"},"modified":"2017-08-01T06:54:33","modified_gmt":"2017-08-01T10:54:33","slug":"siggraph-neurable-lets-you-control-a-virtual-world-with-your-uploadvr","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/mind-upload\/siggraph-neurable-lets-you-control-a-virtual-world-with-your-uploadvr.php","title":{"rendered":"SIGGRAPH: Neurable Lets You Control A Virtual World With Your &#8230; &#8211; UploadVR"},"content":{"rendered":"<p><p>I've used my eyes to interact with a virtual world before, but startup Neurable just enhanced that experience by reading my thoughts too.<\/p>\n<p>At SIGGRAPH this week the Boston-based startup is showing its modified HTC Vive, which includes EEG (electroencephalography) sensors along the interior of the head strap. This is combined with eye-tracking technology from German firm SMI, which may have just been acquired by Apple. The EEG sensors' comb-like structure dug through my hair to subtly make contact against my scalp, where they detected brain activity. It is definitely alarming to hear someone outside VR say my brain is \"looking good.\"<\/p>\n<p>What followed was a brief training session in which a group of objects floated in front of me: a train, a ball and a block among them. Each time one of them rotated, I was told to focus on that object and think \"grab\" in my mind. I did so a number of times for several of the objects, all successfully.<\/p>\n<p>Afterward there was a test. I was told to just think of the object I wanted. I tried not to stare directly at the object I wanted, but five out of five times the correct object was picked as I thought about it. 
A sixth time the wrong object was selected, but it occurred as someone was talking to me and I was distracted. As I refocused, almost immediately the correct object moved toward me.<\/p>\n<p>In the video above you can see each of the objects flash. Neurable CEO and President Ramses Alcaide says they are able to detect these flashes in my brain even though they register subconsciously. He said the eye tracking inside the headset wasn't active during the training and test portions of the demonstration. It became active during the next portion of the demo, meant to show the potential of the system in a game environment. Here's how Neurable describes it:<\/p>\n<p>\"Neurable is debuting Awakening, a VR game preview made in partnership with eStudiofuture, at SIGGRAPH 2017 in Los Angeles. Awakening is a futuristic story reminiscent of Stranger Things: you are a child held prisoner in a government science laboratory. You discover that experiments have endowed you with telekinetic powers. You must use those powers to escape your cell, defeat the robotic prison guards, and free yourself from the lab. The game allows you to manipulate objects and battle foes with your mind, and is played entirely without handheld controllers.\"<\/p>\n<p>According to Neurable, this works by using machine learning to interpret your brain activity in real time to afford virtual powers of telekinesis. The company offers an SDK so Unity developers can integrate the system into a game.<\/p>\n<p>I was able to select a group of objects on the ground of my holding cell just by thinking about them and then use them to try to escape. I was offered some hints from outside VR to escape the room, but the selection with my mind worked to grab the objects I wanted. As I moved into a lab, I looked around at the countertops and thought about the objects to toss at a robot approaching me. 
One of them was a keyboard. As I thought the word \"grab,\" it floated toward me. Object after object, I tossed them at the incoming robots until I progressed through to the end of the level.<\/p>\n<p>\"We have two modes: pure EEG mode, which just determines the object you want and brings it to you directly, and a hybrid BCI [brain-computer interface] mode, in which we can use the eyes as a type of mouse where you can move your eyes near the object you want to select,\" said Alcaide. \"From there your brain tells us which one you clicked on.\"<\/p>\n<p>In the video above you can see me sort of covering my face in a kind of surprised reaction each time the system correctly identified which object I wanted. I was frankly in shock; I really didn't expect it to work as well as it did. Both this brain-computer interface and the earlier eye-tracking demo I tried felt like true superpowers.<\/p>\n<p>According to Alcaide, Neurable raised around $2 million and has 13 employees.<\/p>\n<p>\"I think the future of mixed reality interactions is an ecosystem of solutions that incorporates voice, gesture control, eye tracking and the missing link to that entire puzzle, which is brain-computer interfaces. We need some sort of system that prevents the action from happening until the user wants it to happen, and that's where brain-computer interfaces come in,\" said Alcaide. \"In my opinion mixed reality cannot become a ubiquitous computing platform like the iPhone, or like the computer, until we have brain-computer interfaces as part of the solution.\"<\/p>\n<p>Tagged with: Neurable<\/p>\n<p>See the rest here: <\/p>\n<p><a target=\"_blank\" href=\"https:\/\/uploadvr.com\/siggraph-neurable-lets-control-virtual-world-thought\/\" title=\"SIGGRAPH: Neurable Lets You Control A Virtual World With Your ... 
- UploadVR\">SIGGRAPH: Neurable Lets You Control A Virtual World With Your ... - UploadVR<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> I've used my eyes to interact with a virtual world before, but startup Neurable just enhanced that experience by reading my thoughts too.  <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/mind-upload\/siggraph-neurable-lets-you-control-a-virtual-world-with-your-uploadvr.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[16],"tags":[],"class_list":["post-231580","post","type-post","status-publish","format-standard","hentry","category-mind-upload"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/231580"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=231580"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/231580\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=231580"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=231580"},{"taxonomy":"po
st_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=231580"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}