Engaging the public in robotics: 11 tips from 5,000 robotics events across Europe – Robohub

Europe is focussed on making robots that work for the benefit of society. This requires empowering future roboticists and users of all ages and backgrounds. In its 9th edition, the European Robotics Week (#ERW2019) is expected to host more than 1,000 events across Europe. Over the years, and over 5,000 events, the organisers have learned a thing or two about reaching the public, and ultimately about making the robots people want.

Demystify robotics

For many people, robots are something seen only in the media or science fiction. The robotics community promises ubiquitous robots, yet most people don't encounter robots in their work or daily lives. This matters: the Eurobarometer 2017 survey of attitudes towards the impact of digitisation found that the more familiar people are with robots, the more positive they are about the technology. A recent workshop for ERW organisers highlighted the importance of being able to touch, feel, see and enjoy the presence of robots in order to remove the fear factor and improve the image of robots. People need to interact with real robots to understand their potential, and their limitations.

Bring robots to public places

Most robotics events happen where roboticists and their robots already are: in universities and industry. This works well for those who are already interested in the field and have the means to attend. To reach a broader audience, robots need to be brought to public places, such as city centres or shopping malls. As ERW organisers put it, "don't expect ordinary people to come to universities." In Ghent, Belgium, for example, space was found in the city library to give visitors an opportunity to interact with robots. More recently, the Smart Cities Robotics (SciRoc) challenge held an international robot competition in a shopping mall in the UK.

Tackle global challenges

Robots have a role to play in tackling today's most pressing challenges, whether it's the environment, healthcare, assisted living, or education. Robots can also improve efficiencies in industry and take over 4D (dangerous, dirty, difficult, drudgerous) jobs. This is not often explicitly highlighted; robots are too often presented as fun gadgets for their own sake, instead of as useful tools. By positioning robots as the helpers of tomorrow, we empower users to imagine their applications, and roboticists receive meaningful feedback on their use. Such applications may also be more exciting to a broader diversity of people.

The Blue-Eyed Dragon Robot by Biljana Vicković (with the University of Belgrade and the Mihajlo Pupin Institute Robotics Laboratory, Belgrade, Serbia), for example, introduced an innovative and socially useful robotic artwork with a tin-recycling function into a public space. It integrates robotics into an artwork with a demonstrable ecological, social and cultural impact. "The essence of this innovative work of art is that it enables the public to interact with it. As such, people are direct participants and not merely an audience. In this way contemplation is replaced by action," says its creator.

Tell stories about people who work with robots

Useful robots will ultimately be embedded in society, our work, and our lives. Their role is often presented from the developer's or industry's perspective. This leaves the public with the sense that robotics is being done to them, rather than with them. By bringing users into the discussion, we hear stories of how they use the technology and what their hopes and concerns are, and we can ultimately design better robots and inspire future users to make use of robots themselves.

Bring a diversity of people together

Making robots requires a large range of backgrounds, from social sciences, law, and business to hardware and software engineering. Domain expertise will also be key: assistive robots will require input from nurses and carers, for example. Engaging with a diverse population of makers and users will help ensure the technology is developed for everyone. The ERW2019 central event in Poznan features a panel dedicated to women in digital and robotics. Carmela Sánchez from Hisparob in Spain says: "This year, our motto for ERW is Robotic Thinking and Inclusion. We focus on how robotics and computational thinking can help inclusion: inclusion of different abilities, social and economic backgrounds, and genders."

Avoid hype and exaggerations

Inflated expectations about robotics may lead to disappointment when robots are deployed, or to unfounded fears about their use. A recent ERW organiser commented: "Robots are not prevalent or visible in society at large, and so prevailing perceptions about robots are largely shaped by media presentations, which too often resort to negative stereotypes." It's worth noting that robots are typically made for a single task, and many do not look like humanoid robots. Through this lens, robots no longer seem too difficult to engineer, and are far from their science fiction depictions. This could be empowering for those who would like to become roboticists, and could help users imagine robots that would be helpful to them. The Smart Cities Robotics challenge, for example, showed the crowds how robots could help them take a lift or deliver emergency medicine in a mall.

Teach teachers

By teaching teachers to teach robotics, we can reach many more students than is possible through all the European Robotics Weeks combined. Lia Garcia, founder of Logix5 and a national coordinator of ERW in Spain, underscored the need to engage the education sector: "We have to work with teachers. We need to get robotics onto the school curriculum, onto the teaching college curriculum, and to get to teachers who teach teachers." Workshops that teach educators and help spread the word among local teachers are essential. As an added encouragement, teachers could receive CPD (continuing professional development) credits for taking part in robotics workshops. The ERW2019 central event in Poznan features a workshop dedicated to robotics education in Europe on 15 November.

Run competitions

Competitions are an important way of bringing students into robotics. It's fun and exciting, and it shows students they can build something that works in the real world. Europe now hosts several large robotics competitions, including the European Robotics League (Emergency, Consumer, Professional, and Smart Cities). While these competitions are tailored to university students, others are run for kids. The ERW event page already lists over 100 robot competitions and challenges for this year. Fiorella Operto from Scuola di Robotica has coordinated more than 100 teams from all over Italy committed to using a humanoid robot to promote Italian cultural heritage. The 2020 edition of the NAO Challenge is devoted to Arts & Cultures, asking teams to use robots to promote, and deepen knowledge of, Italy's art.

Keep it fun

More than ever, we have a broad range of tools to engage the public. It could be as simple as drawing pictures of robots, or as involved as developing robot-themed escape rooms or engaging on social media, including YouTube, Twitter, Instagram and TikTok. Robots are fun, which is why they are such good tools in education. Be creative with demos and activities: make robots dance, let people decorate them, play games. The University of Bristol, for example, will be running a swarm-themed escape room called "Swarm Escape!"

Engage with stakeholders

Events with the public are a good opportunity to engage with stakeholders, including government, industry, and users. This is important, as stakeholders will ultimately be the ones making robots a reality. Having them participate in such events helps them understand the potential, invest in technology and skills, and shape policy. It could also provide funding for some of the more ambitious events. "For the first time since 2012, Robotics Place, the robotics cluster of Occitanie, is organising a one-day meeting with its members on November 20 in Toulouse. Robotics Place members will meet with press, politicians, students, partners and professional customers," says Philippe Roussel, a local coordinator for France.

Act regionally, connect across Europe

Events take place across Europe, organised regionally for local communities. Connecting these events at a European scale increases impact, raises awareness, builds momentum, and allows lessons to be shared across the continent. euRobotics and Digital Innovation Hubs provide valuable resources for these purposes.

Yet there is a divide in access, with cities better catered to than rural communities or poorer areas. The challenge is to provide everyone with access and exposure to robotics and its opportunities. Extra effort should be made to reach out to underserved communities, for example using a robot roadshow. Organisers of ERW said a further benefit of this cross-border approach would be to enhance the European dimension. As an example, from May 2020, a 105-metre-long floating science centre called the MS Experimenta will be touring southern Germany, bringing science from port to port.

Get involved

Feeling inspired and ready to make a difference? Organise your own European Robotics Week event, big or small, and register it alongside the over 900 events already announced.


Robotic Takeaways and Trends from FABTECH 2019 – Robotics Business Review

A KUKA welding robot on display at FABTECH 2019. Image courtesy of Jonathan Alonso, CNC Machines.

More than 40,000 attendees gathered in Chicago last week to look at the latest robotics and automation solutions in the metal forming, fabrication, welding and finishing space. The annual FABTECH show is held in Chicago every other year, rotating to other cities in the intervening years.

Here are some of the newer robotic and automation trends we discovered while at the show:

3D printing continues to advance in the types of products it can produce and the materials it can use. The technology is preferable to traditional machining for shorter-run applications, thanks to lower costs and more varied production capability. However, traditional machining can produce high-volume products and components more quickly.

Formlabs showed off industrial 3D printed examples at FABTECH 2019. Image: Jonathan Alonso, CNC Machines.

As 3D printing starts to use more materials, it becomes a more compelling option, attendees said. At the show, BigRep announced four new materials.

Xometry displayed parts produced with its newest 3D printing process, Carbon DLS, a technology that uses digital light projection, oxygen-permeable optics and programmable liquid resins to produce products with end-use durability, resolution and surface finish.

The company also ran live demos of the Xometry Instant Quoting Engine, which enables the user to select the type of process (3D printing or machining), material and other parameters to obtain pricing for custom parts.

Several companies were showing off new arc welding capabilities. While robots have been used for this application for some time, some providers are making changes in some of their designs.

Kawasaki Robotics, for example, separated the centers of the upper arm's length and rotation axes on the BA006N and BA006L so that wire feeders can be placed in the space behind the upper arm. The change is intended to provide more operational flexibility, as well as higher speed to increase productivity.

By eliminating cable positioning variables, the change also makes offline programming more efficient and maintenance faster, Kawasaki said.

The AR3120 robot provides long reach for welding. Image: Yaskawa Motoman

Yaskawa Motoman displayed its AR700 and AR900 robots, which the company says are ideal for welding small parts with complicated angles in tight places with an overarm torch. A slim profile design enables close proximity placement of robots for high-density workcells, and a smooth, easy-to-clean surface accommodates use in harsh environments. As an option, the manipulator cable can be connected on the bottom of the robot (as opposed to the side) to avoid wall interference.

Another Yaskawa Motoman robot, the AR1730, has a contoured arm design to allow easy access to parts in tight spots and avoids potential interference with fixtures, enabling close proximity placement of robots for high-density work cells.

Using robots for such tasks isn't new, but the precision and dexterity of robots and grippers continue to evolve, so robots can handle ever more of these tasks.

Arc Specialties displayed the KUKA LBR iiwa, a seven-axis force-controlled collaborative robot that incorporates force sensors into each axis with a resolution of less than one pound of force. Combining 3M abrasives with Burr King belt sanders and polishing spindles, along with an experienced robot integrator that puts it all together and writes the code, the result is a cell that demonstrates force-controlled polishing on the belt sander's slack side and wheel, with final finishing on a disk.

Italy-based Lesta, which announced its entrance into the U.S. in August with its LestaUSA subsidiary, displayed its finishing robots for liquid and powder coat applications.

LestaUSA was showing its finishing robots for liquid and powder coat applications. Image: Jonathan Alonso, CNC Machines.

"Our robotic technology itself isn't new," said Derek DeGeest, president of LestaUSA. "What's unique here is that LestaUSA's proven technology is so simple that, upon completion of the installation, a manufacturer of any size can literally be making their own programs and painting robotically on Day One."

According to the company, other robotic painting technology requires engineers and robotic programmers, but LestaUSA's robots are self-learning and need only a painter to teach them. Lesta robots go into a weightless learning mode while in the hands of a company's best painter, who performs the painting cycle on a desired part as the software creates its own robotic code for every movement and paint spray. The painter's exact technique, including the application of the paint, is saved and then mirrored by the robot on future jobs.

Robot manufacturers continue to make their products easier to use. Yaskawa's teach pendant, for example, orients itself automatically with the way the operator is facing. "So if the operator turns 90 degrees, for example, forward, right, left, etc., will now be 90 degrees different than it was before," said Michael Castor, product manager for material handling. "Someone can learn how to program basic movements in just 30 seconds; it used to be someone had to spend a whole day learning this."
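
The behaviour Castor describes amounts to rotating jog commands from the operator's frame into the robot's base frame. A minimal, hypothetical sketch of that transform (an illustration of the idea, not Yaskawa's actual implementation):

```python
import math

def jog_to_base(dx, dy, operator_heading_deg):
    """Rotate a planar jog command (dx, dy), given in the operator's
    frame, into the robot's base frame. operator_heading_deg is the
    angle of the operator's 'forward' axis relative to the base frame."""
    theta = math.radians(operator_heading_deg)
    bx = dx * math.cos(theta) - dy * math.sin(theta)
    by = dx * math.sin(theta) + dy * math.cos(theta)
    return bx, by

# Operator faces the robot's +X axis: a 'forward' press jogs along +X.
print(jog_to_base(1.0, 0.0, 0.0))   # (1.0, 0.0)

# After the operator turns 90 degrees, the same 'forward' press
# jogs the robot along +Y instead.
print(jog_to_base(1.0, 0.0, 90.0))
```

The pendant would re-estimate the operator's heading continuously, so the jog axes always follow the person holding it.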

Additionally, Yaskawa's Universal Weldcom Interface (UWI) offers full utilization of the advanced capabilities of select Miller and Lincoln Electric digital welding power supplies, providing simple control of any weld process or parameter, including voltage, amperage and wire feed speed, through a common user interface for either brand.

The UWI can filter weld modes based on process type, wire size, wire type and gas type, and offers up to 16 unique processes from the power source library for easy access within the interface. Up to 1,000 custom arc files with specific processes and parameters are available for use in motion programming.
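
The mode filtering the UWI performs can be pictured as a query over a small process library. A toy sketch (the records and field names here are illustrative assumptions, not Yaskawa's actual data model):

```python
# Hypothetical weld-process records; the real UWI draws up to 16
# such entries from the power source's library.
PROCESS_LIBRARY = [
    {"name": "pulse-steel-045", "process": "pulsed MIG", "wire_size": 0.045,
     "wire_type": "steel", "gas": "90/10 Ar-CO2"},
    {"name": "spray-steel-045", "process": "spray", "wire_size": 0.045,
     "wire_type": "steel", "gas": "98/2 Ar-O2"},
    {"name": "pulse-alum-047", "process": "pulsed MIG", "wire_size": 0.047,
     "wire_type": "aluminum", "gas": "100% Ar"},
]

def filter_modes(library, **criteria):
    """Return library entries matching every given criterion,
    mirroring the UWI's filter-by-process/wire/gas behaviour."""
    return [rec for rec in library
            if all(rec.get(k) == v for k, v in criteria.items())]

matches = filter_modes(PROCESS_LIBRARY, wire_type="steel", wire_size=0.045)
print([m["name"] for m in matches])  # ['pulse-steel-045', 'spray-steel-045']
```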

Trade shows such as FABTECH tend to be held in the same month every year (or every other year, for shows on a biennial schedule), but FABTECH will move to September in 2021 (Sept. 13-16) and for future Chicago events.

"FABTECH is one of the largest events held at McCormick Place, bringing $73 million in delegation spending to Chicago," said John Catalano, SME senior director, FABTECH. "As a result of FABTECH's growth over the past years and our favorable relationship with the city and the convention center, we were proud to make this shift to support the needs of the industry."

(Editor's note: Images and video provided courtesy of CNC Machines.)


QEII Health Centre touts effectiveness of robotic surgery – TheChronicleHerald.ca

At first glance, Alisa Morris could have been playing a high-tech video game. Her face was pressed into a binocular viewing pad, her hands were manipulating an elaborate claw-like device, and a monitor screen beside her showed what looked like mechanical arms moving around a yellow circle. Morris was actually trying out a robotic surgery demonstration machine called the da Vinci Xi with the help of specialist Johnny Farah of Minogue Medical.

"Yeah, there you go. Now you have better reach; that's going to help you reach those really, really tight areas that surgeons experience in a complex case," Farah encouraged Morris, who works in the perioperative care department at the Halifax Infirmary.

"That was so fun," Morris said after yielding the controls to a friend amid the hubbub of a packed infirmary lobby. "The movement is very seamless, it's just like moving your own hand."

Adding robotic extensions to a surgeon's hands sounds futuristic, but it's quickly becoming the norm in operating rooms across the world.

"The impact of surgical robotics on the lives of patients and their families cannot be understated," Dr. Katharina Kieser, chief of gynecology at the QEII Health Sciences Centre, told the gathering organized by the QEII Health Foundation on Friday morning.

Access to this cutting-edge technology means surgeries can be performed with a few small incisions and the utmost precision, leading to faster recovery times, less dependence on painkillers and other benefits, she said.

The robotic surgery program at the QEII Health Sciences Centre was launched on a trial basis in February. Two gynecological surgeons and two urological surgeons have performed about 88 procedures.

"We felt it was very important to make sure we had a small team initially doing these procedures, to make sure we were cohesive, all well trained, and that we were doing enough numbers and volumes for everybody to feel very comfortable," said Joanne Dunnington, director of perioperative services for the Nova Scotia Health Authority, in an interview at the event.

"This is the future of health care. ... While it's tremendously important for our patients today, it's only the tip of the wedge for more development in advanced technology for the next 10 to 30 years. It's a huge deal."

- Bill Bean, CEO of QEII Health Foundation

Many of the procedures have been for prostate and uterine cancers, as well as partial nephrectomies, which involve the removal of part of the kidney. The four surgeons are supported by a nursing team of about six people. Many physicians get robotic surgery training in their residencies, and Minogue Medical, the Canadian vendor for the da Vinci robot, also provides education and training on the machine for NSHA staff, Dunnington said.

"This is the future of health care," said Bill Bean, CEO of the QEII Health Foundation, in an interview. "And so while it's tremendously important for our patients today, it's only the tip of the wedge for more development in advanced technology for the next 10 to 30 years. It's a huge deal."

The foundation is so far footing the entire bill for the robotic surgery program, which will cost $8.1 million over the next five years. It was announced at the event that the Sobey family has given $3 million toward the effort, making for a total of $5.3 million in donations. The foundation will be reaching out to the community as the campaign goes on to make up the $2.8-million shortfall, Bean said.

The robotic surgery program at the QEII, the first in Atlantic Canada, has also been a boon for the recruitment of top surgeons, urology chief Greg Bailly told the gathering. "In the past couple of years, we've been able to recruit at least four surgeons in four different specialties who have robotic experience from their fellowship training, including urology, thoracic surgery, gynecology, and ear, nose and throat," said Bailly, who also heads the QEII robotics council.

"Without a doubt, surgical robotics have played a key role in recruiting these individuals."

For more information on the QEII Health Foundation's robotic surgery campaign, go to https://qe2foundation.ca/current-priorities/surgical-robotics.


USC Viterbi Researchers Honored with Best Paper Nomination at the 2019 International Conference on Intelligent Robots and Systems – USC Viterbi School…

Bee+, a 95 mg four-winged robotic insect prototype designed by the Autonomous Microrobotic Systems Laboratory. PHOTO/Néstor Pérez-Arancibia.

Next time you pass a flying insect, take a moment to appreciate its structure. With a body that's fairly heavy compared to its thin, delicate wings, this insect flies with grace, fluidity and control. While this is an intrinsic aspect of an insect's natural anatomical design, it's very difficult to achieve this perfect balance of elements when replicating it in a robotic version.

In work that resulted in a best paper nomination at the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), USC Viterbi researchers achieved this balance. The research team, members of the Autonomous Microrobotic Systems Laboratory (AMSL), designed the first four-winged robotic insect weighing less than 100 mg. At 95 mg, this prototype, Bee+, features a new type of actuator that weighs half as much as the prior state-of-the-art piezoelectric actuator (a device that transforms electrical energy into mechanical displacement or stress).

The research team is led by Néstor O. Pérez-Arancibia, assistant professor in the USC Viterbi Department of Aerospace and Mechanical Engineering (AME), and includes AME PhD candidates Xiufeng Yang and Ariel A. Calderón, as well as recent AMSL PhD graduates Ying Chen and Longlong Chang.

"This new actuator and a new robotic configuration enabled us to integrate four wings inside the same envelope as that of the prior two-winged prototype," said Pérez-Arancibia. "The individual areas of expertise of the participant PhD students reflect the multidisciplinary nature and complexity of the research effort that resulted in the reported innovation: Xiufeng is an expert on robotic design; Ying is an expert on control and dynamics; Longlong is an expert on aerodynamics; and Ariel is an expert on fabrication. Together, we were able to make an incredible breakthrough."

It also enabled the researchers to break the world record for lift-to-weight ratio among flying robots at this scale. Bee+ demonstrates improved controllability and a potentially longer life span than predecessors such as the 75-mg RoboBee, created seven years ago by a team of Harvard researchers that included Pérez-Arancibia.

Members of the Autonomous Microrobotic Systems Laboratory, led by Néstor O. Pérez-Arancibia. PHOTO/Néstor O. Pérez-Arancibia.

Next up, the lab is working on creating the first fully autonomous (in terms of control and power) sub-gram flying robot.

Held in Macau, China from November 4-8, 2019, IROS is one of the leading robotics conferences in the world. Over 2,500 papers from 53 countries were submitted. The paper on Bee+ was one of four finalists for the IROS 2019 Best Paper Award and was simultaneously accepted for publication in the journal IEEE Robotics and Automation Letters.


Most plastic is not getting recycled, and AI robots could be a solution – Business Insider

Humans have enlisted nearly 100 AI-powered robots in North America to come to the rescue for something humans are terrible at: recycling.

Even when we try to do it right, we're often making things worse: about one out of every four things people throw into the recycling bin isn't recyclable at all.

All those misplaced greasy pizza boxes (not recyclable) and clamshell containers tossed in with the plastics have imperiled an industry that was never really that effective in the first place.

Only a small fraction of the over 2.1 billion tons of garbage the world produces each year gets recycled: about 16%.

And even that small sliver has gotten smaller over the past year.

For decades, the US sold more than half of its recyclables to China, mostly plastics to be melted into pellets, the raw material for making more plastic.

But in March of 2018, China said, "No More."

"They started shipping more and more stuff to China, often contaminated dirty plastics or too many mixed goods," said Kate O'Neill, a UC Berkeley professor and author of "Waste."

Around a quarter of the shipments China received had to be hand-processed, buried in landfills, or incinerated.

So the Chinese government declared that bales could contain no more than half a percent of contaminants, such as food wrappers or a dirty jar of peanut butter. US consumers and recycling centers couldn't keep up.

"I think people in the wealthy countries had gotten complacent, never bothering to build more recycling facilities domestically," O'Neill added.

Today, a handful of start-ups are testing out new technology to make recycling sustainable.

AMP Robotics is an artificial intelligence and robotics company that aims to change the way we recycle.

Matanya Horowitz, founder of AMP Robotics, said "the situation with the Chinese export markets has actually been good for [the company]."

Robots use artificial intelligence to sort through recyclables. BHS

AMP Robotics is rolling out its latest model: a "Cortex Robot" that uses optical sensors to take in what rolls by, and a "brain" to figure out what its "hands" should do with each item, even if it looks different from anything it has seen before.
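
The sensors, "brain", and "hands" described above map onto a classic perceive-decide-act loop. A highly simplified sketch (the stand-in classifier, labels, bins and confidence threshold are illustrative assumptions, not AMP's actual software):

```python
def classify(item):
    """Pretend 'brain': return (label, confidence) for an item on
    the belt. A real system would run a trained vision model here."""
    return item.get("label", "unknown"), item.get("conf", 0.0)

def decide_bin(label, confidence, threshold=0.8):
    """Route confidently recognised recyclables to their bin;
    everything else stays on the belt as residue or contamination."""
    bins = {"PET": "plastics", "HDPE": "plastics",
            "aluminum": "metals", "cardboard": "fiber"}
    if confidence >= threshold and label in bins:
        return bins[label]
    return "leave on belt"

# Simulated items rolling past the optical sensor.
belt = [{"label": "PET", "conf": 0.95},
        {"label": "greasy pizza box", "conf": 0.9},
        {"label": "aluminum", "conf": 0.6}]
for item in belt:
    label, conf = classify(item)
    print(label, "->", decide_bin(label, conf))
```

Note how both unknown objects and low-confidence detections are left alone: on a fast-moving belt, a wrong pick is costlier than a missed one.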

"A lot of these recycling facilities are structured with the primary task of basically dealing with contamination that's not supposed to be there," said Horowitz. "What we see is a lot of recycling facilities are investing in automation to help improve their operations."

At least four companies are rolling out similar models, in the hopes of turning a profit from the US' growing piles of hard-to-sort recyclables.

And investors are taking notice. In November 2019, AMP Robotics announced a $16 million Series A investment from Sequoia Capital.

But what about helping humans get better at choosing what to put in their recycling bins in the first place?

New policies in Shanghai are one of the first steps in China's push to solve its waste problems.

Since this past summer, citizens have faced fines and what are called "social penalties" if they don't sort things properly.

Shanghai started the test run on June 24. "It was very hard for us at the beginning. Everyone was busy; people didn't know how to sort," said one trash-sorting volunteer, who asked not to be identified.

"At first we had some hard times," said Shanghai citizen Zhaoju Zhang. "The most difficult part was how to differentiate between dry and wet trash. It was so complicated that we all got confused."

Almost immediately, hundreds of AI-enabled apps sprouted up to assist with everyday sorting.

"If it's something that is confusing whether it's dry or wet trash, we can just scan the item and get the answer," Zhang said.

Shanghai citizens are now required to sort recyclables properly from their trash. Yuan Ye

But not everyone has access to AI to help parse the new rules, and many complain that complying is tough, and punishments are too harsh.

Kate O'Neill said the new laws are having a "massive cultural impact" and that there are "some concerns about how draconian it is, but it's too early to really tell the results. But it certainly seems to be a massive culture shift."

This kind of cultural shift in how we throw things away would be challenging in the US, where the average person produces twice as much trash as a Chinese citizen.

But experts warn that rethinking the way we deal with garbage is essential, and AI technology offers a promising way forward.

It's even possible for AI to identify who created a piece of trash in the first place.

Horowitz explained that robots are able to learn the features of materials: they can discern whether a material is cloudy or opaque, and may even be able to identify the symbols of specific brands. All of these abilities help robots like Max narrow down the source of contamination and decide what to do with it.

Last year, over 250 companies signed an Ellen MacArthur Foundation agreement pledging that 100% of their plastic packaging will be easily and safely reused, recycled, or composted by 2025.

SC Johnson CEO Fisk Johnson said in an interview, "We're a family company, and we have a very long-term view, and business has to be part of the solution."

Whether or not they make good on this pledge, AI will be quietly watching, and gathering data on the packaging these brands continue to use.


Ten outrageous things robots can do right now, from cooking to building IKEA furniture – National Post

Robots are still a far cry from the ones that turned on humanity in I, Robot and the Terminator movies, but they are rapidly becoming more advanced. For example, Boston Dynamics has a dog-like robot that can open doors, and a humanoid one that does parkour and gymnastics. Here are 10 more outrageous skills robots have picked up in recent years.

1. Play soccer and Simon Says simultaneously

Researchers at MIT designed their quadrupedal mini cheetah to be virtually indestructible: it has the dexterity of a yoga teacher, can nail a 360-degree backflip and, when kicked to the ground, recovers in one kung-fu-like swoop. So far, the army of tiny bots has mastered soccer and Simon Says.

2. Cook a gourmet dinner

In a few years, it's entirely possible that robot chefs will be as common a household item as toasters or coffee makers. U.K.-based tech company Moley has created an entirely robotic kitchen, which features a set of dexterous arms that can hold utensils, crack eggs, measure ingredients and even do the dishes. The master robo-chef is able to prepare hundreds of recipes from around the world, all of which can be downloaded from an electronic library. The consumer version is expected to launch by the end of the year.

3. Make rock music

A team of German engineers decided to put a literal spin on heavy metal with Compressorhead, a fully animatronic rock band. Each of its four members plays a real instrument: the aptly named Fingers (a guitarist with 78 fingers), Stickboy (a four-armed drummer), Bones (the bass player), and the latest addition, Junior (the lead singer, which the band's creators crowdfunded more than $400,000 to build). Since the band formed in 2013, it has released a full album and performed at music festivals in the U.K., Russia, France, Australia and even Canada.

4. Give hugs

Researchers in Stuttgart, Germany have created a real-life version of the robot from Disney's Big Hero 6 (a character deemed so lovable that it won 2015's Most Huggable Character award). HuggieBot, which stands over five feet tall and weighs 450 pounds, asks humans for hugs (and even says please) before embracing them in its massive metal arms. Its creators hope that one day HuggieBot can be used for emotional support in college dorms and senior facilities.

5. Win at Rock Paper Scissors

Even past victors of the World Rock Paper Scissors Championship (yes, that's a real thing) stand no chance against the University of Tokyo's Janken robot, which has never lost a game. Instead of relying on prediction, the bot uses high-speed recognition to determine what shape the human hand is going to make, then reacts with the winning move. The entire process takes only a thousandth of a second.
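
Once the hand shape is recognised, choosing the winning throw is a simple lookup; the engineering feat is the sub-millisecond recognition. A sketch of the decision step only (the recognition itself is assumed to happen elsewhere):

```python
# The move that beats each shape the camera might recognise.
COUNTER = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def winning_move(detected_shape):
    """Given the shape the high-speed camera detects the human hand
    forming, return the move that beats it."""
    return COUNTER[detected_shape]

print(winning_move("rock"))  # paper
```

Because the robot reacts to the observed hand rather than predicting it, the lookup always wins, as long as recognition finishes before the human's throw is complete.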

6. Ski like an Olympian

During last year's Winter Olympics in Pyeongchang, humans weren't the only ones competing in Alpine skiing. At the nearby Welli Hilli ski resort in Hoengseong, eight robot athletes zipped down the slopes to compete for a prize of $10,000. Each bot had to be over 50 cm tall, maneuver around the flag poles, and have joints that allowed it to bend its knees and elbows. The snow pants were optional.

7. Run a hotel

For just under $200, you could spend a night in a hotel run by robots. At Japan's Henn na Hotel, almost everything, from the receptionists (a female android and an English-speaking dinosaur) to the luggage porter to the room service, is automated. In an unfortunate turn of events, half of the 243-robot staff were laid off earlier this year, but the hotel is still fully operational.

8. Build IKEA furniture

If there's a single thing everyone can agree on, it's that trying to assemble IKEA furniture is a hellish process. Researchers in Singapore want to spare amateur furniture builders from ever having to crack another instruction manual, so they created a set of robotic arms that can assemble an IKEA chair in just 20 minutes. While most manufacturing bots function assembly-line style, this robot uses 3D cameras to correctly identify which parts it needs, then pieces them all together.

9. Perform brain surgery

Earlier this month, doctors in Toronto performed the world's first-ever brain surgery using robotics. The patient had suffered a major aneurysm, and surgeons used a remote-controlled robotic arm to guide a catheter from an incision made near her groin all the way up to her brain.

10. Lead a funeral

In Japan, hiring a Buddhist monk for a funeral can cost upwards of $3,000. Pepper, a robe-wearing robot priest, is able to chant sutras while simultaneously tapping a drum for a fifth of the cost. Bonus: the automated monk can live-stream the ceremony to those who are unable to attend.

Read the original:

Ten outrageous things robots can do right now, from cooking to building IKEA furniture - National Post

Robots Need to Know They Can Die at Any Minute, Just Like the Rest of Us – Popular Mechanics

How do you get machines to perform better? Tell them they could croak at any minute. In a new paper from the University of Southern California, scientists say that in a dynamic and unpredictable world, an intelligent agent should hold its own meta-goal of self-preservation.

Lead researcher Antonio Damasio is a luminary in the field of intelligence and the brain. His profile at the Edge Foundation says Damasio "has made seminal contributions to the understanding of brain processes underlying emotions, feelings, decision-making and consciousness." At USC, he's co-director of the Brain and Creativity Institute (BCI) with his equally luminous wife, Hanna Damasio.

Damasio's paper, coauthored with BCI researcher Kingson Man, is a model based on philosophy and science of mind, paired with accumulating research into robotics technology. They published the paper in Nature Machine Intelligence, a title usually read as Nature's [Journal of] Machine Intelligence but in this case strangely prescient.

Damasio and Man suggest the way to make resilient robots isn't to make them impenetrably strong, but rather to make them vulnerable, in order to introduce ideas like restraint and self-preserving strategy. "If an AI can use inputs like touch and pressure, then it can also identify danger and risk-to-self," ScienceAlert summarized.

This idea invokes the design concept of a survival game, where a finite number of resources is given to a set number of players and they must find an equilibrium or eliminate their competitors. The gorgeous 2018 card game Shipwreck Arcana is a great example of a cooperative survival game: To win, at least one person must survive being shipwrecked. You can share resources to preserve more people, or you can sacrifice resources from some players to increase the likelihood that one person will survive.

A robot with a sense of its own health isn't the most novel thing: when a car tells you the oil is low or the engine is overheating, that's a direct self-preservation behavior. There's just no in-between layer of circuitry to model thinking or prioritizing. Instead, the car has sensors only, and those sensors flag errors for the vehicle's operator to address. Imagine a car that considered your planned commute and the health of its engine and pulled itself over every 10 minutes to cool off.

"Under certain conditions, machines capable of implementing a process resembling homeostasis might also acquire a source of motivation and a new means to evaluate behaviour, akin to that of feelings in living organisms," Damasio and Man say in their abstract. The car example fits this rubric. Instead of a sensor alerting an outsider every time, the hypothetical car has a brain to run analyses of the different factors that can go wrong and how likely each scenario is.
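
The difference between a bare sensor and a homeostatic layer can be sketched in a few lines. This is a toy illustration of the idea only, not anything from the Nature Machine Intelligence paper; the function name and all thresholds are invented for the example:

```python
# Toy homeostatic layer: instead of merely flagging a fault for the
# operator, the agent weighs its internal state against its goal and
# chooses a self-preserving action. All thresholds here are invented.
def homeostatic_decision(engine_temp_c: float, minutes_to_destination: float) -> str:
    """Choose an action that balances the trip goal against self-preservation."""
    if engine_temp_c < 100:
        return "drive"                  # comfortably within the safe band
    if engine_temp_c < 115 and minutes_to_destination < 10:
        return "drive"                  # tolerate mild stress to finish soon
    return "pull_over_and_cool"         # protect the 'body' first

print(homeostatic_decision(90, 30))    # -> drive
print(homeostatic_decision(120, 5))    # -> pull_over_and_cool
```

The point of the sketch is the middle branch: the decision depends jointly on internal state and the external goal, which a dumb warning light cannot express.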

Human brains do this without, well, a second thought. Our many senses (not just five) feed constant input into our brains, and our body systems adjust in response. People dislike the idea that their brains and bodies are basically machines, although our complexity will probably never be fully understood by human scientists. What the scientists' paper represents is the way more and more complex materials and computing are bringing machines steps toward our level, not the other way around.


Construction robotics is changing the industry in these 5 ways – Robot Report

The SAM100 bricklaying robot at the Brighton Health Center South site of the University of Michigan Hospitals and Health Centers. Source: Construction Robotics

Until recently, construction was one of the least digitized and automated industries in the world. Many projects could be completed more efficiently with the help of the right construction robotics, mainly because the related tasks are incredibly repetitive.

While manual labor will likely always be a huge component of modern construction, technology has been steadily improving since the first pulleys and power tools. Robots, drones, autonomous vehicles, 3D printing, and exoskeletons are beginning to help get the work done. With low U.S. unemployment and shortages of skilled labor, automation is key to meeting demand and continued economic growth.

Construction robots may be involved in specific tasks, such as bricklaying, painting, loading, and bulldozing. "We expect hundreds of AMRs in the next two years, mainly doing haulage," said Rian Whitton, an analyst at ABI Research. These robots help to protect workers from hazardous working environments, reduce workplace injuries, and address labor shortages.

Many potential solutions rely on artificial intelligence and machine learning to deliver unprecedented levels of data-driven support. For instance, a driverless crane could transport materials around a worksite, or an aerial drone could gather information on a worksite to be compared against the plan.

Here are just a few examples of how robotics is transforming construction.

An example of how construction robotics is revolutionizing the industry can be seen in the Hadrian X bricklaying machine from Australia-based FBR Ltd. (also known as Fastbrick Robotics). It employs an intelligent control system aided by CAD to calculate the necessary materials and movements for bricklaying.

Hadrian also measures environmental changes, such as movement caused by wind or vibrations, in real time. This data is then used to improve precision during the building process.

While Hadrian does require the use of proprietary blocks and adhesive, FBR noted that the related materials are 12 times bigger than standard house bricks and are lighter, stronger, and more environmentally sustainable.

Robots like Hadrian and SAM100 from Victor, N.Y.-based Construction Robotics promise to reduce operating costs and waste, as well as provide safer work environments and improve productivity. Hadrian can build the walls of a house in a single day, which is much faster than conventional methods.

While the major automakers and technology companies are working on self-driving cars, autonomous vehicles are already part of construction robotics.

Such equipment can transport supplies and materials. For instance, Volvo has been working on its HX2, an autonomous and electric load carrier that can move heavy loads without additional input. It has no driver cab and instead uses a digital logistics-driven control technology backed by what Volvo calls a vision system to detect humans and obstacles while on the move.

Another company, Built Robotics, which last month raised $33 million, offers autonomous bulldozers and excavators. AI guidance systems direct the equipment to their destinations and ensure that the necessary work is completed safely and accurately.

Autonomous vehicles and construction robotics are not intended to replace manual labor entirely, but to augment it and enhance efficiency. Safety improves as well, since automation reduces the potential for human error.

Construction robotics and drones using sensors such as lidar along with Global Positioning System technologies can provide vital information about a worksite. Combined with AI, they can help predict what tasks are required.

Doxel Inc. makes a small tread-based robot that does exactly that. It scans and assesses the progress of a construction project by traversing the site. The information it collects is used to detect potential errors and problems early.

Doxel's data is stored in the cloud, where it's filtered through a deep-learning algorithm to recognize and assess more minute details. For example, the system might point out that a ventilation duct is installed incorrectly, and the early detection can allow for the proper correction well before costly revisions are needed.

Humans are still in the loop for much of construction robotics, combining the strengths of human supervision with multiple technologies. The Internet of Things, additive manufacturing, and digitization are contributing to the industry's growth, noted Caterpillar.

Painting drones are an excellent example, since they can be controlled from a tablet or smartphone app, and they can report the data they gather, which is analyzed in the cloud.

Remote-control technology can also be applied to semi-autonomous vehicles. Project managers can use it to deliver instructions and orders to their workforce instantly.

Barcelona-based Scaled Robotics offers construction robotics that can be remotely controlled by mobile devices. The company's Husky unmanned ground vehicle can roam a construction site and capture critical information via multiple sensors. The data is transferred to the cloud, where it's used for building information modeling (BIM) of the project.

Before, during, and after a construction project, many assessments require the review of a worksite and surrounding area. Limited surveillance is also necessary for supervising workers and securing the site. In addition, project managers and supervisors must walk the site to conduct final inspections. Construction robotics and drones can help all of these processes.

Aerial drones and ground-based robots can survey a worksite and gather multiple types of data, depending on the sensors used. Augmented reality and virtual reality can enable operators to get a realistic and real-time feel for what the drones are seeing.

While donning a VR headset, for instance, viewers can see a live feed of captured video from the drone. More importantly, that immersive experience is provided remotely, so project managers don't even have to be on the job site to get an accurate assessment. The video feed is also recorded for playback at a later time, providing yet another resource.

Companies are already using drone technology to this end. In 2018, Chinese drone maker DJI announced a global partnership with Skycatch for a fleet of 1,000 high-precision custom drones to create 3D site maps and models of project sites.

The global market for construction robotics also represents a huge opportunity for developers and suppliers. It could grow from $22.7 million in 2018 to $226 million by 2025, predicts Tractica. Research and Markets estimates that the market will grow to $126.4 million by 2025.

According to the International Federation of Robotics and the Robotic Industries Association, the construction robotics market will experience a compound annual growth rate (CAGR) of 8.7% between 2018 and 2022. Research firm IDC is more bullish, predicting a CAGR of 20.2%.
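
These forecasts imply very different compound annual growth rates, where CAGR = (end/start)^(1/years) − 1. A quick check of Tractica's figures, for instance, shows why the projections diverge so widely (the computation below is ours, not from any of the cited firms):

```python
# Compound annual growth rate: CAGR = (end / start) ** (1 / years) - 1
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

# Tractica's forecast: $22.7M (2018) -> $226M (2025), i.e. 7 years.
tractica = cagr(22.7, 226.0, 7)
print(f"{tractica:.1%}")  # roughly 39% per year, far above IFR's 8.7%
```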

Automation and digitization are driving a revolution in the construction industry, which has historically been slow to adopt new technologies. From design through final inspection and maintenance, the full benefits of construction robotics have yet to be realized.


Waxahachie FIRST Robotics Team takes 2nd place at tournament – Waxahachie Daily Light

Daily Light report

Saturday, Oct. 19, 2019 at 11:33 AM

ROCKWALL – Students from the Career & Technical Student Organization, Waxahachie FIRST Robotics Team, represented the Waxahachie Independent School District at the NTX Tournament of Robots on Oct. 12-13 in Rockwall, Texas.

FIRST Robotics (For Inspiration and Recognition of Science and Technology) serves students enrolled in CTE courses aligned with the engineering and manufacturing STEM careers cluster. Participants enjoy the experience of applying classroom and laboratory lessons in hands-on activities and competitive events.

Waxahachie Robotics is a WISD district team, open to all high school students within the Waxahachie school district.

Students participating this season so far are all from Waxahachie Global High School: Camile Condron, Jacob Mendoza, Steven Cloud, Eddie Almaguer, Cole Shelby, Evan Ford, Brendon Blankenship, Talon Wilderman, Ashauntee Fairley, Conner Teague, Carl Bicada, Katherine Keys and Miles Charpentier.

The NTX Tournament of Robots consisted of 29 teams from three states, and Waxahachie Robotics took second place overall in this years contest.

Students will be traveling again in February and March to Dallas and Greenville to test their skills against competitors from across the nation and around the world. WISD proudly supports these students, teachers and organizations.

For more information about Waxahachie Robotics, contact Waxahachie Global High School at 972.923.4761 or email swarren@wisd.org or dmathiesen@wisd.org.


[Hearing from an AI Expert 5] At the Intersection of Robotics and Innovation – Samsung Newsroom

There is much anticipation these days around the field of robotics with its immense potential and promising future applications. However, a large gap exists between public expectations and what is actually deemed technically feasible by scientists and engineers today. Fortunately, Samsung's New York AI Center is buoyed by the presence of a team of highly skilled researchers, led by robotics and AI expert Dr. Daniel D. Lee, who are working to close this gap. Samsung Newsroom spoke with Dr. Lee about the work being done at the center, as well as the facility's ability to foster collaboration in a range of areas and attract top talent.

Asked about his center's mandate, Lee explains that the New York AI Center focuses on fundamental research at the intersection of AI, robotics and neuroscience. The center's objective is to solve challenging problems at this intersection, and one good example is the problem of robotic manipulation1.

Put simply, robots need to become far more skillful before they are ready to help humans with physical tasks in their daily lives. The first step involves endowing robots with the intelligence to perceive and understand their surroundings. Next, they must be able to make swift decisions in unpredictable situations. Finally, robots should be dexterous and nimble enough to perform the appropriate actions. However, it is impossible for robot designers to anticipate every contingency robots will encounter in real world environments. Thus, robots need to be able to learn from experience just as humans do.

At this time, most common machine learning methods are not suitable for teaching robots since enormous amounts of training data are required. Lee explained that there are several challenges that need to be addressed regarding machine learning for robotics.

"Dealing with the physical world is much more difficult for AI than playing video games or Go," he explains. "We are currently developing AI learning methods that can deal with the uncertainty and diversity of the physical world so that robots become more prevalent in homes and workplaces. I would compare the state of robots today to computers in the 1980s, during the transformation from mainframes to personal computers."

The New York AI Center is addressing such challenges to provide a richer AI and robotics experience. For instance, the center has recently developed novel AI methods that are able to efficiently teach robots using limited data. One such method trains a neural network to generate motion trajectories for a robot arm directly from camera images.

In order to handle things for people, robots need to learn how to touch, grasp, and move a variety of everyday objects. Lee explains how the problem of dexterous robotic manipulation is an area of focus for the New York AI Center.

Lee comments that the ability of humans and some animals to manipulate household objects is currently unmatched by machines. "That's why we are investigating how AI-based solutions can be applied to make breakthroughs in this area." Extrapolating further, Lee explains that dexterous robotic manipulation requires the ability to precisely and robustly handle objects exhibiting uncertain material properties.

"Manipulation is relatively easy if the objects and environments are carefully controlled, such as on a factory floor," Lee reports. "But it becomes much more difficult in unknown, cluttered environments when faced with a diverse array of objects."

By way of an example, Lee lays out the capabilities that would be required for a robot to serve a chilled glass of wine in a restaurant. How heavy is the glass, and how slippery is it due to condensation? He adds, "It's impossible to completely model all the possible physical characteristics of the glass of wine, so machine learning is critical in training robots to handle the difficult situations."

As the AI sector has grown more sophisticated, it has become increasingly clear that collaborative solutions are critical for researchers to overcome the challenges they face. "In an area as complex and multi-faceted as robotic manipulation, contributions from and collaborations with the world's best and brightest will be instrumental," comments Lee. He highlights the value of working with both other Samsung AI Centers and academic institutions, saying that "solving fundamental problems in AI to positively impact society requires drawing upon the ability and skills of numerous experts globally."

He added, "The Samsung AI Centers invite collaborations with researchers who can help address these difficult challenges. We currently have a number of faculty from leading academic institutions who are collaborating with us in New York."

Lee highlights just how beneficial being located in New York has been for his team: "Certainly, New York City is one of the greatest and most diverse cities in the world. It is a magnet for world-class research and engineering talent."

Attracting the very best talent is extremely important to remain on the bleeding edge of future AI advancements, and Lee reports that the center has been fortunate in this area, saying, "We have benefited from being able to attract and recruit some outstanding researchers since we started the Center."

"Our team is composed of expert scientists and engineers who are creating innovative theories and algorithms and state-of-the-art technological developments," Lee adds. "It's been great working with them to publish in leading academic conferences and journals as well."

Speaking about how he envisions robots will fit into society in the future, Lee points out that, in their infancy, some robots drew attention because they were cute and fun, but that people tended to use them less as the novelty wore off. In order for people to see robots as valuable and relevant, new systems need to have enough intelligence that they become indispensable in our daily lives.

"Intelligent robotic systems have the potential to completely revolutionize how people go about their activities in the future," Lee extrapolates. "In the near term, we will see modest improvements on simple tasks in constrained environments. But more complete systems that can handle a variety of chores and complex tasks will require further research breakthroughs. The Samsung AI Centers are helping to generate those new advances."

Asked to outline what he sees as the ultimate vision for AI and robotic intelligence, Lee says, "I grew up reading and watching science fiction stories that envisaged amazing robots helping humans. It would be incredible to see some of those positive visions actually come to life."

1 The ability for robots to interact with and move physical objects in a range of environments.


News Desk Wrapup: Quick Hits on Robot News for the Week – Robotics Business Review

Kamuthi Solar Project

It's amazing how quickly the week goes by when you're monitoring the world of robotics news. It's almost like we need a robot or AI around here to start producing more news copy (no, don't think about that just yet).

With most people thinking about their mid-October weekend plans, take one quick moment to see what else has been going on that I found interesting this week:

SenseHawk, which develops AI-powered software for the solar industry, said this week it benchmarked the condition of 2.5 million solar modules in record time at one of the world's largest solar sites, the Kamuthi Solar Power Project in India. Using drones with thermography imaging technology, as well as its cloud-based SenseHawk Therm software, the company was able to assess the solar site in less than three weeks, a job that would otherwise have taken several months. The site spans an area of 2,500 acres, the equivalent of 950 football fields, or about four square miles.

The software is able to detect hot spots, evaluate energy loss, schedule maintenance and track defects over time. The company said it could detect more than 99.9% of all hotspots on solar modules.
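
Thermographic hotspot detection of this kind is conceptually simple: flag module pixels whose temperature stands out from the rest of the frame. The sketch below is our own illustration of the general idea, not SenseHawk's algorithm; the 10-degree margin is an invented threshold:

```python
import numpy as np

# Toy thermographic hotspot detector: flag pixels that are much hotter
# than the frame's typical temperature. Illustrative only; the 10 C
# margin is an invented threshold, not SenseHawk's method.
def find_hotspots(temps_c: np.ndarray, margin_c: float = 10.0) -> np.ndarray:
    """Return a boolean mask of pixels hotter than median + margin."""
    return temps_c > (np.median(temps_c) + margin_c)

frame = np.full((4, 4), 35.0)   # a healthy module surface at ~35 C
frame[1, 2] = 60.0              # one overheating cell
print(np.argwhere(find_hotspots(frame)))  # -> [[1 2]]
```

In practice the hard work is upstream: registering drone imagery to individual modules and tracking each defect over time, which is what the cloud software handles.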

If youre in the New England area and want to hear some smart people talk about robots next week, head to the Robotics Engineering Research Symposium at Worcester Polytechnic Institute (WPI) on Tuesday, Oct. 22. Titled Launching to a Robotic Future, the event will include robotics researchers from around the world, along with a reception highlighting the history of WPIs robotics program.

Speakers include Dr. Robert Howe (Harvard), Dr. Wolfram Burgard (Toyota Research Institute), Dr. Shigeki Sugano (Waseda University) and Dr. Al Rizzi (Boston Dynamics).

Amazon announced this week it would return to Las Vegas in 2020 for the second edition of its re:MARS conference, which covers machine learning, automation, robotics and space topics. The event will be held June 16-19, 2020, with more details on the agenda and speakers announced in early 2020. Click here to read about my experiences at this years re:MARS event, and why the robotics industry needs Amazon.

Yaskawa Motoman Americas employees at the Dayton, Ohio, headquarters.

The Motoman Robotics Division of Yaskawa America recently celebrated its 30th anniversary. Previously known as Motoman, the company began as a 50/50 joint venture between Hobart Brothers Company and Yaskawa Electric America, and then officially began operations on Aug. 1, 1989. In 1994, Motoman Inc. became a wholly-owned subsidiary of Yaskawa Electric Corp.

The company started with just 59 employees, and now has nearly 700 employees in 11 facilities throughout the U.S. (Dayton, Ohio; Detroit; Los Angeles; Austin, Texas; Birmingham, Ala.), Canada (Mississauga, Ontario; and Pointe-Claire, Quebec), Mexico (Aguascalientes; Apodaca, N.L.; and Queretaro), and Brazil (Diadema, Sao Paulo).

Vectis Automation has teamed up with Universal Robots to create the Vectis Cobot Welding Tool, aimed at helping manufacturers boost productivity by reducing the learning curve, deployment time, risk, and cost of robotic welding. The tool is powered by a UR10e cobot, and is also available as a low-risk, rent-to-own option. The two companies will show off the tool at the upcoming FABTECH show in Chicago, Nov. 11-14.

SkyOp, which develops drone training courseware for educational institutions, announced recently it was awarded a cooperative purchasing contract to make its SkyOp Drone Training Curriculum available to local school districts in New York through the Boards of Cooperative Educational Services (BOCES) program. Under the agreement, SkyOp will deliver its workforce-development STEM curriculum directly to local districts while the BOCES will provide support and training for teachers and district staff.

The curriculum, which includes more than 300 hours of instruction and coursework, covers topics such as an introduction to drones, Part 107 test preparation, hands-on drone flight training, drone photo and video production, and an intro to Pix4D, among others.


An Army of Tiny Robots Could Assemble Huge Structures in Space – Universe Today

We live in a world where multiple technological revolutions are taking place at the same time. While the leaps that are taking place in the fields of computing, robotics, and biotechnology are gaining a great deal of attention, less attention is being given to a field that is just as promising. This would be the field of manufacturing, where technologies like 3D printing and autonomous robots are proving to be a huge game-changer.

For example, there is the work being pursued by MIT's Center for Bits and Atoms (CBA). It is here that graduate student Benjamin Jenett and Professor Neil Gershenfeld (as part of Jenett's doctoral thesis work) are working on tiny robots that are capable of assembling entire structures. This work could have implications for everything from aircraft and buildings to settlements in space.

Their work is described in a study that recently appeared in the October issue of the IEEE Robotics and Automation Letters. The study was authored by Jenett and Gershenfeld, who were joined by fellow graduate student Amira Abdel-Rahman and Kenneth Cheung, a graduate of MIT and the CBA who now works at NASA's Ames Research Center.

As Gershenfeld explained in a recent MIT News release, there have historically been two broad categories of robotics. On the one hand, you've got expensive robotics made out of custom components that are optimized for particular applications. On the other hand, there are those that are made from inexpensive mass-produced modules with lower performance.

The robots that the CBA team is working on which Jenett has dubbed the Bipedal Isotropic Lattice Locomoting Explorer (BILL-E, like WALL-E) represent an entirely new branch of robotics. On the one hand, they are much simpler than the expensive, custom and optimized variety of robots. On the other, they are far more capable than mass-produced robots and can build a wider variety of structures.

At the heart of the concept is the idea that larger structures can be assembled by integrating smaller 3D pieces which the CBA team calls voxels. These components are made up of simple struts and nodes and can be easily fastened together using simple latching systems. Since they are mostly empty space, they are lightweight but can still be arranged to distribute loads efficiently.

The robots, meanwhile, resemble a small arm with two long segments that are hinged in the middle with a clamping device at each end that they use to grip onto the voxel structures. These appendages allow the robots to move around like inchworms, opening and closing their bodies in order to move from one spot to the next.

However, the main difference between these assemblers and traditional robots is the relationship between the robotic worker and the materials it is working with. According to Gershenfeld, it is impossible to distinguish this new type of robot from the structures they build, since they work together as a system. This is especially apparent when it comes to the robots' navigation system.

Today, most mobile robots require a highly precise navigational system to keep track of their position, such as GPS. The new assembler robots, however, need only know where they are in relation to the voxels, the small subunits they are currently working on. When an assembler moves onto the next voxel, it readjusts its sense of position, using whatever it is working on to orient itself.

Each of the BILL-E robots is capable of counting its steps, which in addition to navigation allows it to correct any errors it makes along the way. Along with control software developed by Abdel-Rahman, this simplified process will enable swarms of BILL-Es to coordinate their efforts and work together, which will speed up the assembly process. As Jenett said:

"We're not putting the precision in the robot; the precision comes from the structure [as it gradually takes shape]. That's different from all other robots. It just needs to know where its next step is."
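
This relative, step-counting navigation scheme can be sketched in a few lines: the robot tracks only its lattice coordinate, updated one voxel per inchworm step, rather than any absolute position. A toy illustration (ours, not the actual BILL-E control software; the class and step names are invented):

```python
# Toy model of BILL-E-style relative navigation on a 2D voxel lattice.
# The robot stores only its position relative to its starting voxel,
# plus a step count it can use to cross-check for errors.
STEPS = {"+x": (1, 0), "-x": (-1, 0), "+y": (0, 1), "-y": (0, -1)}

class LatticeWalker:
    def __init__(self):
        self.pos = (0, 0)   # position in voxel units, relative to start
        self.steps = 0      # step counter, used for error checking

    def step(self, direction: str) -> None:
        """Advance one voxel; the structure itself provides the precision."""
        dx, dy = STEPS[direction]
        self.pos = (self.pos[0] + dx, self.pos[1] + dy)
        self.steps += 1

walker = LatticeWalker()
for d in ["+x", "+x", "+y", "-x"]:
    walker.step(d)
print(walker.pos, walker.steps)  # -> (1, 1) 4
```

Because position is quantized to voxels, a swarm of such walkers only needs a shared lattice frame, not GPS-grade localization, to coordinate.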

Jenett and his associates have built several proof-of-concept versions of the assemblers, along with corresponding voxel designs. Their work has now progressed to the point where prototype versions are able to demonstrate the assembly of the voxel blocks into linear, two-dimensional, and three-dimensional structures.

This kind of assembly process has already attracted the interest of NASA (which is collaborating with MIT on this research) and Netherlands-based aerospace company Airbus SE, which also sponsored the study. In NASA's case, this technology would be a boon for its Automated Reconfigurable Mission Adaptive Digital Assembly Systems (ARMADAS) project, which co-author Cheung leads.

The aim of this project is to develop the automation and robotic assembly technologies necessary for deep-space infrastructure, including a lunar base and space habitats. In these environments, robotic assemblers offer the advantage of being able to assemble structures quickly and more cost-effectively. Similarly, they will be able to conduct repairs, maintenance, and modification with ease.

"For a space station or a lunar habitat, these robots would live on the structure, continuously maintaining and repairing it," says Jenett. Having these robots around will eliminate the need to launch large preassembled structures from Earth. When paired with additive manufacturing (3D printing), they would also be able to use local resources as building materials (a process known as In-Situ Resource Utilization, or ISRU).

Sandor Fekete is the director of the Institute of Operating Systems and Computer Networks at the Technical University of Braunschweig, Germany. In the future, he hopes to join the team in order to further develop the control systems. While developing these robots to the point that they will be able to build structures in space is a significant challenge, the applications they could have are enormous. As Fekete said:

"Robots don't get tired or bored, and using many miniature robots seems like the only way to get this critical job done. This extremely original and clever work by Ben Jenett and collaborators makes a giant leap towards the construction of dynamically adjustable airplane wings, enormous solar sails or even reconfigurable space habitats."

There is little doubt that if humanity wants to live sustainably on Earth or venture out into space, it is going to need to rely on some pretty advanced technology. Right now, the most promising of these are the ones that offer cost-effective ways of seeing to our needs and extending our presence across the Solar System.

In this respect, robot assemblers like BILL-E would not only be useful in orbit, on the Moon, or beyond, but also here on Earth. When similarly paired with 3D printing technology, large groups of robotic assemblers programmed to work together could provide cheap, modular housing that could help bring an end to the housing crisis.

As always, technological innovations that help advance space exploration can be tapped to make life on Earth easier as well!

Further Reading: MIT, IEEE



If a Robotic Hand Solves a Rubik's Cube, Does It Prove Something? – The New York Times

This article is part of our continuing Fast Forward series, which examines technological, economic, social and cultural shifts that happen as businesses evolve.

SAN FRANCISCO – Last week, on the third floor of a small building in San Francisco's Mission District, a woman scrambled the tiles of a Rubik's Cube and placed it in the palm of a robotic hand.

The hand began to move, gingerly spinning the tiles with its thumb and four long fingers. Each movement was small, slow and unsteady. But soon, the colors started to align. Four minutes later, with one more twist, it unscrambled the last few tiles, and a cheer went up from a long line of researchers watching nearby.

The researchers worked for a prominent artificial intelligence lab, OpenAI, and they had spent several months training their robotic hand for this task.

Though it could be dismissed as an attention-grabbing stunt, the feat was another step forward for robotics research. Many researchers believe it was an indication that they could train machines to perform far more complex tasks. That could lead to robots that can reliably sort through packages in a warehouse or to cars that can make decisions on their own.

"Solving a Rubik's Cube is not very useful, but it shows how far we can push these techniques," said Peter Welinder, one of the researchers who worked on the project. "We see this as a path to robots that can handle a wide variety of tasks."

The project was also a way for OpenAI to promote itself as it seeks to attract the money and the talent needed to push this sort of research forward. The techniques under development at labs like OpenAI are enormously expensive, both in equipment and personnel, and for that reason eye-catching demonstrations have become a staple of serious A.I. research.

The trick is separating the flash of the demo from the technological progress and understanding the limitations of that technology. Though OpenAI's hand can solve the puzzle in as little as four minutes, it drops the cube eight times out of 10, the researchers said.

"This is an interesting and positive step forward, but it is really important not to exaggerate it," said Ken Goldberg, a professor at the University of California, Berkeley, who explores similar techniques.

A robot that can solve a Rubik's Cube is not new. Researchers previously designed machines specifically for the task, devices that look nothing like a hand, and they can solve the puzzle in less than a second. But building devices that work like a human hand is a painstaking process in which engineers spend months laying down rules that define each tiny movement.

The OpenAI project was an achievement of sorts because its researchers did not program each movement into their robotic hand. That might take decades, if not centuries, considering the complexity of a mechanical device with a thumb and four fingers. The lab's researchers built a computer system that learned to solve the Rubik's Cube largely on its own.

"What is exciting about this work is that the system learns," said Jeff Clune, a robotics professor at the University of Wyoming. "It doesn't memorize one way to solve the problem. It learns."

Development began with a simulation of both the hand and the cube, a digital recreation of the hardware on the third floor of OpenAI's San Francisco headquarters. Inside the simulation, the hand learned to solve the puzzle through extreme trial and error. It spent the equivalent of 10,000 years spinning the tiles up, down, left and right, completing the task over and over again.

The researchers randomly changed the simulation in small but distinct ways. They changed the size of the hand, the color of the tiles and the amount of friction between the tiles. After the training, the hand learned to deal with the unexpected.

When the researchers transferred this computer learning to the physical hand, it could solve the puzzle on its own. Thanks to the randomness introduced in simulation, it could even solve the puzzle when wearing a rubber glove or with two fingers tied together.
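The technique described above, randomly perturbing the simulator each episode so the learned policy survives contact with real hardware, is commonly called domain randomization. A minimal sketch, with illustrative parameter names and ranges rather than OpenAI's actual values:

```python
import random

def randomized_sim_params(rng):
    """Sample a perturbed simulation configuration for one training episode.

    Varying physical parameters per episode forces the policy to cope with
    conditions it has never seen exactly, which is what lets it transfer
    to the real, imperfect hand (even one wearing a rubber glove).
    """
    return {
        "hand_scale": rng.uniform(0.95, 1.05),     # size of the hand
        "tile_hue_shift": rng.uniform(-0.1, 0.1),  # color of the tiles
        "friction": rng.uniform(0.5, 1.5),         # friction between tiles
    }

rng = random.Random(0)
episodes = [randomized_sim_params(rng) for _ in range(3)]
# Every episode trains against a slightly different "world".
```

The design choice is deliberate: rather than modeling the real hand precisely, the simulator is made deliberately imprecise in many directions, so reality becomes just one more variation the policy has already handled.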

At OpenAI and similar labs at Google, the University of Washington and Berkeley, many researchers believe this kind of machine learning will help robots master tasks they cannot master today and deal with the randomness of the physical world. Right now, robots cannot reliably sort through a bin of random items moving through a warehouse.

The hope is that will soon be possible. But getting there is expensive.

That is why OpenAI, led by the Silicon Valley start-up guru Sam Altman, recently signed a billion-dollar deal with Microsoft. And it's why the lab wanted the world to see a demo of its robotic hand solving a Rubik's Cube. On Tuesday, the lab released a 50-page research paper describing the science of the project. It also distributed a news release to news outlets across the globe.

"In order to keep their operation going, this is what they have to do," said Zachary Lipton, a professor in the machine learning group at Carnegie Mellon University in Pittsburgh. "It is their lifeblood."

When The New York Times was shown an early version of the news release, we asked to see the hand in action. On the first attempt, the hand dropped the cube after a few minutes of twisting and turning. A researcher placed the cube back into its palm. On the next attempt, it completed the puzzle without a hitch.

Many academics, including Dr. Lipton, bemoaned the way that artificial intelligence is hyped through news releases and showy demonstrations. But that is not something that will change anytime soon.

"These are serious technologies that people need to think about," Dr. Lipton said. "But it is difficult for the public to understand what is happening and what they should be concerned about and what will actually affect them."


CMR Surgical installs its first surgical robotics system – DOTmed HealthCare Business News

The first of CMR Surgical's Versius platforms has found its home at a center specializing in laparoscopy.

The surgical robotics system will now be used at Galaxy Care Hospital in Pune, India, in a wide range of surgical procedures that include transthoracic operations, hysterectomies and myomectomies.

"One of the key features that makes Versius a good fit for Galaxy Care Hospital is its modularity," Martin Frost, chief executive officer and co-founder of CMR Surgical, told HCB News. "Because the system is modular, the Versius system can be moved between operating theatres quickly and easily, increasing the opportunity for use and the cost-effectiveness of Versius. With our partnership with Galaxy Care, we're pleased to be able to make minimal access surgery available to more people globally."


The installation also marks the release of the first clinical registry for a surgical robotic system, according to CMR Surgical, which manages it in partnership with providers to record and monitor patient outcomes of all procedures involving Versius, to ensure patient safety. Outcomes measured include operative time, length of stay, 30-day readmissions, and returns to the operating room within 24 hours. The creation of the registry stems from CMR Surgical's aim to provide post-market surveillance as part of the IDEAL (Idea, Development, Exploration, Assessment, Long-term study) framework, which calls for manufacturers to describe the stages of innovation in surgery and other interventional procedures.
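The outcome measures listed above could be captured in a per-procedure record along these lines. This is a hypothetical schema of ours for illustration; CMR Surgical has not published the registry's actual field names:

```python
from dataclasses import dataclass

@dataclass
class ProcedureOutcome:
    """One illustrative registry entry for a Versius-assisted procedure,
    covering the four outcome measures named in the article."""
    operative_minutes: int            # operative time
    length_of_stay_days: int          # hospital length of stay
    readmitted_within_30_days: bool   # 30-day readmission
    returned_to_or_within_24h: bool   # return to operating room within 24h

record = ProcedureOutcome(
    operative_minutes=142,
    length_of_stay_days=3,
    readmitted_within_30_days=False,
    returned_to_or_within_24h=False,
)
```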

"Our mission is to address the high unmet need for flexible surgical care," said Frost. "Versius is indicated for upper GI, general, gynaecological and colorectal procedures, and since its introduction at Galaxy Care Hospital, Versius has been used to complete transthoracic operations, hysterectomies and myomectomies, under the leadership of Dr. Shailesh Puntambekar."

CMR Surgical expects the system to gain traction in European and Asia-Pacific markets and, in the near future, to become a competitor to the leading robotics system, Intuitive Surgical's da Vinci.

It plans to have additional Versius Surgical Robotic Systems in use at hospitals across India and Europe by the end of 2019, including in ones that are a part of the National Health Service in the U.K.


Open-Source Arm Puts Robotics Within Reach – Hackaday

In November 2017, we showed you [Chris Annin]'s open-source 6-DOF robot arm. Since then he's been improving the arm and making it more accessible for anyone who doesn't get to play with industrial robots all day at work. The biggest improvement is that where AR2 had an open-loop control system, AR3 is closed-loop: if something bumps the arm or it crashes, the bot will automatically recover its previous position. It also auto-calibrates itself using limit switches.

AR3 is designed to be milled from aluminium or entirely 3D printed. The motors and encoders are controlled with a Teensy 3.5, while an Arduino Mega handles I/O, the grippers, and the servos. In the demo video after the break, [Chris] shows off AR3's impressive control with a brief robotic ballet in which two AR3s move in hypnotizing unison.

[Chris] set up a site with the code, his control software, and all the STL files. He also has tutorial videos for programming and calibrating, and wrote an extremely detailed assembly manual. Between the site and the community already in place from AR2, anyone with enough time, money and determination could probably build one. Check out [Chris]'s playlist of AR2 builds; people are using them for photography, welding, and serving ice cream. Did you build an AR2? The good news is that AR3 is completely backward-compatible.

The AR3's grippers work well, as you'll see in the video. If you need a softer touch, try emulating an octopus tentacle.

Thanks for the tip, [Andrew]!


Rethink Robotics launches Sawyer Black Edition – Robotics and Automation News

Rethink Robotics, which is now part of the Hahn Group, has launched a new version of its collaborative robot, Sawyer.

The Sawyer Black Edition is now available for pre-order and will be presented for the first time at K 2019 in Düsseldorf.

The new hardware update comes one year after the takeover of Rethink Robotics' assets by the Hahn Group.

The Sawyer Black Edition is the result of the combination of German engineering and longstanding application experience.

Rethink says Sawyer is now quieter and has higher-quality, more reliable components.

The Sawyer Black Edition can be pre-ordered now; first deliveries will take place in 2019.

Rethink says the Sawyer Black Edition contributes to a quieter working environment and is therefore even more popular among employees.

The company adds that the improved component quality of the Sawyer Black Edition significantly raises the collaborative robots reliability.

At K 2019, the Sawyer Black Edition will be demonstrated at the Hahn Group at booth E61 in hall 10 as part of a palletizing application for the packaging of boxes and plastics parts.

With the Black Edition, Rethink Robotics continues to stand for easy application, flexibility in use and high acceptance among employees.

Possible applications include, among others, tasks that are dangerous for humans: CNC machine assembly, circuit board assembly, metal processing, injection molding, packaging, loading and unloading, as well as tests and inspections.

Rethink says the Sawyer collaborative robot solution is ready for use immediately after delivery and equipped with Intera software and two camera systems.


Robotics company offers $190,000 for the rights to your face – NEWS.com.au

Here's your chance to be the literal face of a robotics company.

A tech firm is looking for the right person to lend their likeness to a new line of robot assistants for the elderly. And while it might sound like the plot of a bad sci-fi flick, the company will pay the chosen candidate £100,000 (about $A190,000) for the privilege.

The privately funded firm has opted to remain anonymous due to the project's secretive nature, but it has hired robotics recruiter Geomiq to find the right face for the job, reports the Mirror.


Ideal applicants will possess a "kind and friendly" face for the prototype, per the head, er, face hunter's recruitment ad. "It's a once-in-a-lifetime opportunity for the right person; let's hope we can find them," said a Geomiq spokesperson.

The lucky winner of the face-off will have their likeness reproduced on thousands of virtual friends, a la Will Smith's disturbing 2004 movie I, Robot, as well as rake in the aforementioned big bucks. The project has been five years in the making.


Designers haven't disclosed much beyond that, only that the robotic doppelgangers will hit the assembly line next year and will be readily available to the public upon completion.

On the application page, Geomiq acknowledges that licensing one's visage to an unnamed robotics company for eternity is potentially "an extremely big decision".

The face-cloning campaign has drawn flak from social media sceptics, with many of them analogising it to bad dystopian movie tropes. "Janelle Monáe warned us about this," cautioned one.

Others wondered why a supposedly tech-savvy robotics company needed a human face at all and couldn't just save money by using an online random-face generator. "Have these people ever heard of GANs?" asked one Twitter techie. "There are datasets with 100k realistic (but not real) faces available already."

This article originally appeared on the New York Post and was reproduced with permission



Geek+ launches smart factory with ‘robots making robots’ – Robotics and Automation News

Geek+, a supplier of robotics and AI technologies for warehouses, has launched what it claims is the world's first smart factory to use robot arms to make mobile robots.

Based in Nanjing, the factory uses Geek+ robots, AI algorithms and other automated solutions to manufacture new Geek+ robots. All of the company's robots are produced at this factory.

With increasing demand for customization and limited-release products, product cycles are getting shorter and shorter, making flexible production an essential aspect of the manufacturing industry.

Autonomous mobile robots in factories are the best way to achieve flexible production and can also help companies realize a smart and agile supply chain.

With this objective, the company has launched its Geek+ Smart Factory Solution, using its Nanjing facility as a blueprint for flexible production and intelligent manufacturing.

Geek+ says it can adapt and implement the smart factory template to manufacturing facilities worldwide, and customize the solution to meet various production and industrial scenarios.

With over 200 projects across the world, Geek+ has gained considerable experience and data knowledge in developing smart logistics solutions for warehousing and manufacturing environments.

It has developed new AI algorithms for scenarios spanning numerous industries, from retail and apparel, to manufacturing and pharmaceutical companies.

Through this, the company has built an ecosystem with international technology partners to develop a total solution for smart warehouses and smart factories, including AI vision, robot arms, the internet of things, production management systems, logistics management systems, big data analysis and advanced robotics.

With its Smart Factory Solution, Geek+ says it is helping customers upgrade their operations to an intelligent and agile supply chain.

Robots making robots

The Nanjing factory's output has almost doubled traditional manual production capacity, and single-shift annual production is designed to exceed 10,000 robots. Under a production logistics management system, the robots operate together, powered by the company's AI technologies.

Once assembled, the new robots direct themselves to the calibration area to receive basic parameter settings.

They automatically complete final testing and finished-product inspection, after which they proceed directly to the finished-product area to be packaged and made ready to ship.
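The flow just described, from assembly through calibration, testing, inspection and packaging, can be sketched as a simple state machine that each new robot advances through on its own. Stage names here are illustrative, not Geek+'s:

```python
# Hypothetical sketch of the self-directed production flow: each freshly
# assembled robot advances itself through the remaining stages in order.
STAGES = ["assembled", "calibration", "final_testing", "inspection", "packaging"]

def next_stage(current):
    """Return the stage a robot should move to next, or None when done."""
    i = STAGES.index(current)
    return STAGES[i + 1] if i + 1 < len(STAGES) else None

def run_through(start="assembled"):
    """Walk a robot from its starting stage to completion, logging each stop."""
    stage, visited = start, []
    while stage is not None:
        visited.append(stage)
        stage = next_stage(stage)
    return visited

# A new robot visits every stage in order and ends ready to ship.
```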

Geek+ smart factory management system, powered by AI

To operate smart factories, Geek+ has developed a new integrated system, the Geek+ Production Logistics Management System. It powers all aspects of the facility, from inventory to the production line, integrating logistics and production into a flexible and efficient system. It connects the stock area with the production area and unifies the management of all the different robotic solutions.

This system replaces the traditional conveyor belt system with a new island production mode of autonomous mobile robots.

These production islands can be easily duplicated, and the solution is completely flexible and scalable.

This new intelligent and flexible production model offers a real alternative to costly and rigid conveyor belts.

A game changer for intelligent manufacturing

Production capacity has almost doubled, compared to traditional manual production, with annual output expected to exceed 10,000 robots.

The new solution also guarantees more precise process control and higher accuracy, with a straight-through rate for final assembly exceeding 98 percent, and higher traceability of the whole process, which reduces overall management cost.

Yong Zheng, founder and CEO of Geek+, says: "Smart factories will be a turning point for the entire industry as they provide a truly proven alternative to traditional, fixed production and achieve flexible production."

"What better way to show the world the value of our solutions than to apply it to our own production? Our Nanjing factory is a window into the future of intelligent logistics and manufacturing."

The Geek+ smart factory solution is applicable to a wide range of industries, including automobile manufacturers, auto part factories and 3C electronics factories.

It is particularly well suited for industries that require more flexible manufacturing processes, to keep up with the demand for new product lines and allow for capacity expansion.

With smart factories, trial production of new products and product line transformation can be easily implemented, says Geek+.

Zheng says: "In the past four years, we have already developed and implemented game-changing technologies for warehousing operations. With smart factories, we continue to pave the way for a truly intelligent supply chain."


Robotics startup wants to pay £100k to use a real human face on its robots – DIGIT.FYI

A robotics startup is offering £100,000 to use a real person's face on its robots. The unnamed company has contacted manufacturer Geomiq for help finding the ideal "kind and friendly" human face for its robots, described as "virtual friends" for elderly people.

The robotics startup said the need for anonymity is due to the secretive nature of the project. But production of the robots is expected to begin in 2020, and they will be readily available to the public, it added.

Geomiq said that the company is privately-funded and that the project has been in development for five years. It has since, apparently, taken on investment from a number of independent VCs, as well as a top fund based in Shanghai.

A spokesperson for Geomiq said: "At this point, we're not allowed to share any more details about the project, but we're hoping that someone with the right face will get in touch as a result of this public appeal."

"We know that this is an extremely unique request, and signing over the licenses to your face is potentially an extremely big decision. But it's a once-in-a-lifetime opportunity for the right person; let's hope we can find them."

Dr Kate Devlin, an author on the topics of AI and robotics, said: "I'm cool with the whole friendly robot thing. But I can't work out why a) it needs a realistically human face and b) why that face needs to be of a real individual."

If you are interested in selling your face, you can apply here. Candidates who make it through the next phase will be given full details on the project, while unsuccessful candidates will not be contacted.


Conference on Collaborative Robots, Advanced Vision and Artificial Intelligence Comes to San Jose November 12-13 – Business Wire

ANN ARBOR, Mich.--(BUSINESS WIRE)--Automation experts, and those who want to explore how to grow their business with the latest trends and innovations, will descend on San Jose November 12-13 for the Collaborative Robots, Advanced Vision & AI (CRAV.ai) Conference. Sponsored by the Association for Advancing Automation (A3), this conference is ideal for engineers and manufacturers seeking effective ways to reduce cost, improve quality and advance productivity, while increasing flexibility. CRAV.ai also holds appeal for experienced users seeking new applications or prospective users trying to determine if robotics, vision and artificial intelligence make sense for their companies. Registration for the conference is open at https://crav.ai.

"The automation industry continues to change and disrupt, with new innovations and new examples of automation solutions helping businesses around the world," said Jeff Burnstein, president, A3. "This conference in particular brings in some of the most influential minds in the space to share the technologies, trends and actionable insights that will help companies become more competitive. Come learn how to not get left behind in this increasingly automated world."

In addition to three in-depth tracks featuring dozens of sessions highlighting practical solutions and emerging technologies, the conference will feature the following keynotes:

Last year, CRAV.ai drew more than 500 attendees, including engineers and decision makers from companies like Google, Apple, Intel, Lockheed Martin, Toyota and many more.

The full agenda can be found here: https://crav.ai/agenda. Register at https://crav.ai.

About Association for Advancing Automation (A3)

The Association for Advancing Automation (A3) is the global advocate for the benefits of automating. A3 promotes automation technologies and ideas that transform the way business is done. A3 is the umbrella group for Robotic Industries Association (RIA), AIA - Advancing Vision + Imaging, Motion Control & Motor Association (MCMA) and A3 Mexico. RIA, AIA, MCMA and A3 Mexico combined represent over 1,250 automation manufacturers, component suppliers, system integrators, end users, research groups and consulting firms from throughout the world that drive automation forward. For more information, visit: A3, RIA, AIA, MCMA, A3 Mexico.

Upcoming A3 Events:

Collaborative Robots, Advanced Vision & AI Conference (CRAV.ai) – Nov. 12-13, 2019, San Jose, California.

A3 Business Forum – Jan. 13-15, 2020, Orlando, Florida.

Robotic Grinding & Finishing Conference – April 27-28, 2020, St. Paul, Minnesota.

The Vision Show – June 9-11, 2020, Boston, Massachusetts.

Automate – May 17-20, 2021, Detroit, Michigan.
