Facebook Reality Labs researcher designs software that allows novice designers to build expressive robots – Dailyuw

Posted: February 28, 2020 at 9:46 am

Many people know Geppetto as the fictional woodcarver who brought a wooden puppet to life. That puppet, whom we know as Pinocchio, can sing, dance, and feel sadness and love.

At Carnegie Mellon University (CMU), Geppetto is a data-driven system that lets designers create robots that express emotion. With changes only to speed, body angles, and gait, designers can create walking spiders that show happiness, anger, or excitement.
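Conceptually, an "emotion" in such a system is just a bundle of motion parameters. The sketch below is purely illustrative — the names, values, and structure are assumptions, not Geppetto's actual implementation:

```python
# Hypothetical sketch: an emotion preset as a bundle of gait parameters.
# All names and numbers are illustrative, not taken from Geppetto.
from dataclasses import dataclass

@dataclass
class GaitStyle:
    speed: float        # strides per second
    body_angle: float   # lean of the body, in degrees
    step_height: float  # how high each leg lifts, in cm

# Presets a designer might then fine-tune with sliders
EMOTION_PRESETS = {
    "happy":   GaitStyle(speed=1.6, body_angle=10.0, step_height=4.0),
    "angry":   GaitStyle(speed=2.2, body_angle=-5.0, step_height=6.0),
    "excited": GaitStyle(speed=2.0, body_angle=15.0, step_height=5.0),
}

def describe(emotion: str) -> str:
    """Summarize the gait parameters behind an emotion preset."""
    style = EMOTION_PRESETS[emotion]
    return f"{emotion}: {style.speed} strides/s at {style.body_angle} deg lean"
```

Tuning just these few numbers, rather than hand-animating every joint, is what makes the approach accessible to novices.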

Other animals, such as puppies, dinosaurs, and centaurs, have also been brought to robotic life.

The Geppetto project was founded by Ruta Desai, a research scientist at Facebook Reality Labs (Facebook's research division), and her team at CMU and Autodesk. Desai spoke at the UW Design Use Build lecture series about how this technology can democratize the robot design process for non-technical designers.

"[These tools] can empower novices, these are the people without any robotics or engineering backgrounds, to build and design robots for their own needs," Desai said. "We can empower everyone to play a part in shaping our future."

If robotic technology is normalized for common people, robots will no longer be the exclusive property of large companies and research institutions. Desai envisions a future where people can obtain robot-building tools, the software and hardware used to assemble robots, online or at a local store.

"Robots will be integrated into daily human life just as smartphones are today, making our lives better," Desai said. "Maybe [we will] have a 'Robot Now' service just as we have [Amazon] Prime Now today, and that's really the vision of this project."

However, designing a functional robot is skill-intensive and requires advanced engineering knowledge. The process also offers no live demonstration while the robot is being built, which can be frustrating whenever the system fails.

"Imagine you want to build a robot now," Desai said. "To do this, you have to first design the mechanical structure based on the task, [you] also [have to] think about the necessary electrical subsystems held together, and finally program this behavior to do the task that you want the robot to do."

As Desai explained, robot construction rests on three main axes: electrical subsystems, mechanical structure, and behavior. Since electrical subsystems are becoming more accessible, Desai's research focuses on innovating the structural and behavioral design of robots.

Desai's first research project was a drag-and-drop robot design tool that allows users to easily assemble wires, legs, and wheels within minutes. Its graphical interface lets the designer see whether there is enough space for all the components to fit together.

In her usability testing, the tool not only saved significant time but also increased the success rate of each iteration.

While her early work applied mostly to articulated and non-articulated robots, Desai's latest research, Geppetto, investigates semantic design to create expressive robotic behaviors.

"Geppetto, in a way, brings life to a robot, and that is why we named it after the famous Disney wood crafter," Desai said.

Current robot-building tools fall into two categories: animation tools and robotics tools. The former have great editing controls; the latter have powerful, but hard-to-use, simulation capabilities. To bridge the gap, Geppetto allows users to create and edit behaviors using physical simulation.

Desai showed the interface to the audience. In the center is an area where users design the robot; on the left is a slider panel where they can adjust the expression parameters; and along the top is a gallery of sample motions to show users possible outcomes.

Desai explained that the gallery feature addresses the blank slate issue: novice users don't know what constitutes an expressive robot. For example, when users see an angry spider spiking its legs and knuckles, they can better visualize their own design.

Geppetto has a data-driven framework to help designers create different kinds of expressive robots. However, how well it works depends on how similar the robots are; an existing dataset for a four-legged walking robot might not easily translate to a six-legged one.
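One way to picture a data-driven framework like this (a rough sketch under assumed names, not the paper's actual algorithm) is as a lookup over a dataset of sample motions, returning the stored motion closest to a designer's request:

```python
# Illustrative nearest-neighbour lookup over a motion dataset.
# The dataset entries and field names are hypothetical, not Geppetto's.
import math

# Each sample: (label, speed in strides/s, body angle in degrees)
MOTION_DATASET = [
    ("calm walk",   0.8,  0.0),
    ("happy trot",  1.6, 10.0),
    ("angry stomp", 2.2, -5.0),
]

def closest_motion(speed: float, body_angle: float) -> str:
    """Return the label of the dataset motion nearest to the request."""
    def dist(sample):
        _, s, a = sample
        return math.hypot(s - speed, a - body_angle)
    return min(MOTION_DATASET, key=dist)[0]
```

This framing also shows why transfer between dissimilar robots is hard: a dataset recorded for one body plan has no samples near a request for a robot with a different number of legs.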

Desai talked about her future work on redefining human-computer interaction as human-computer understanding. In robotic engineering, she pointed out, users still don't know the functions of a system, while systems can't predict what users want.

Desai addresses this disconnect by creating multimodal features, such as images, sketching, and natural language processing, that can capture users' intent and preferences.

Achieving this technology can open opportunities in generative design, in which the user specifies conditions and the system creates design options that satisfy those constraints. It's the technology Tony Stark uses to design the time machine in Avengers: Endgame.
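As a toy illustration of that workflow (everything here is assumed for the example; it is not any real tool's API), a generative system can enumerate candidate designs and keep only those that satisfy the user's stated constraints:

```python
# Toy generative-design loop: enumerate candidate desk designs and keep
# the ones that satisfy user-specified constraints. Entirely hypothetical.
from itertools import product

def generate_desks(min_load_kg: float, max_width_cm: float):
    """Yield (width_cm, thickness_mm, load_kg) designs meeting constraints."""
    for width, thickness in product(range(60, 181, 20), (12, 18, 25)):
        load = thickness * 0.8  # crude stand-in for a structural check
        if load >= min_load_kg and width <= max_width_cm:
            yield (width, thickness, load)
```

The user states intent ("support at least this load, fit in this space") and the system, not the user, does the engineering search.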

"Wouldn't it be great if the user could just say, 'I want to create a desk that can support a monitor model X'?" Desai said.

At Facebook Reality Labs, Desai is working on building computational methods that enable contextual and adaptive interaction. In simpler terms, this means a device that can detect and react to users' intent.

Robotic technology has long been the showpiece of big tech companies and science fiction, but with innovative tools that allow non-technical designers to build their own expressive robots, people will be seeing more robotic animals gleefully dancing to the sound of music.

Reach reporter Anh Nguyen at science@dailyuw.com. Twitter: @thedailyanh

