Morgan Stanley: Macau Q1 EBITDA likely to come in below GGR growth as operators deleverage – Inside Asian Gaming

Macau's 1Q24 EBITDA growth could come in below the quarter's 6% quarter-on-quarter GGR growth, suggesting operators are still heavily focused on deleveraging, according to investment bank Morgan Stanley.

In a note previewing the upcoming earnings season, to be kicked off by Sands China parent Las Vegas Sands later this month, Morgan Stanley analysts Praveen Choudhary, Gareth Leung and Stephen Grambling said they expect industry corporate EBITDA to be around 5% higher quarter-on-quarter at US$1.9 billion, putting it at 81% of 1Q19 levels.

However, while some operators have confirmed that their short-term focus post-pandemic was deleveraging, the analysts said they expect to see smaller operating leverage benefits in Q1 due to wage increases and some companies increasing promotions to attract customers.

As such, key trends to keep an eye on during results season will be gambler spending power; updates on opex, reinvestment costs and committed investment guidance; and any changed views on dividend resumptions given that Sands, Melco and SJM are yet to declare any dividends since the pandemic.

"We think relative EBITDA performance and mass share gains will be key, and continue to prefer MGM China and Wynn Macau," the analysts said. "We also think SJM shares will benefit from better EBITDA growth, driven by Grand Lisboa Palace."

"We don't prefer Sands or Galaxy just for a 1Q24 earnings trade: we expect both to have lost market share in 1Q24 [as] Sands does not have the turnover rent benefit in 1Q24, while Galaxy may be hurt by more cost increases."

Morgan Stanley added that Melco Resorts may have also improved its market share in February and March but that higher costs could have impacted margins.

Melco stated in its 4Q23 earnings call that reducing its debt remained its key focus in 2024.

Read the original:

Morgan Stanley: Macau Q1 EBITDA likely to come in below GGR growth as operators deleverage - Inside Asian Gaming

Air Macau welcomes first of two wide-body aircrafts under plan to expand international network – Inside Asian Gaming

Air Macau has announced that the first wide-body aircraft to be introduced to its fleet this year has arrived in Macau.

The A330-300 landed in Macau from Hangzhou on Sunday morning and flew to Beijing that afternoon, having been leased from Air China.

Air Macau said it plans to introduce a second wide-body aircraft in the second half of this year to operate medium- and long-haul routes, and to add Middle East destinations as part of efforts to expand its European network.

The company first revealed in January that it would introduce two wide-body aircraft this year to complement the development of Macau's aviation industry.

Air Macau introduced its first wide-body aircraft, an A300-600R, in 2006, after which it launched the Taipei-Macau-Shanghai route. However, in 2007 the aircraft was converted into a freighter.

There are currently 26 airlines operating out of Macau International Airport, with routes covering 23 destinations in mainland China, three destinations in Taiwan and 17 international destinations. The passenger market distribution is 46% from mainland China, 16% from Taiwan and 38% from international markets.

Macau International Airport Corporation Limited (CAM) said recently that it would invest more resources into creating incentives for airlines to operate international and medium- and long-haul direct routes.

Originally posted here:

Air Macau welcomes first of two wide-body aircrafts under plan to expand international network - Inside Asian Gaming

Macau Casino Win Tops Expectations, Revenue Climbs to $2.42B – Casino.Org News

Posted on: April 1, 2024, 09:43h.

Last updated on: April 1, 2024, 09:50h.

The Macau Gaming Inspection and Coordination Bureau reported Monday that the enclave's casinos won MOP19.5 billion (US$2.42 billion) from gamblers in March.

March 2024 marked a more than 53% year-over-year boost and was 5.5% better than February, when the city hosted many Chinese New Year travelers. March outpaced the consensus forecast among analysts focused on the region (the only place under China's control where slot machines and table games are allowed), which predicted a 49% year-over-year improvement.

March was Macau's second-richest monthly casino win since the city reopened its borders in January 2023.

Through three months of 2024, GGR in what was the world's richest gaming market before the COVID-19 pandemic (a title since reclaimed by Nevada) has rebounded 65.5% from the same period in 2023. The region's six casino licensees (Sands, Galaxy, MGM, Wynn, SJM, and Melco) won $7.11 billion in the first quarter.

Macau has a vastly different operating climate for the six casino concession holders than it did pre-pandemic. China used the global health crisis to strengthen its national security, and a pillar of the undertaking was preventing large amounts of money from fleeing the Communist Party's control.

During the health crisis, Beijing instructed Macau to more closely scrutinize casino junket groups that for years had brought the mainland's wealthiest VIP gamblers to the Special Administrative Region (SAR) to gamble in private high-roller rooms. Chinese President Xi Jinping accused junkets of facilitating the transfer of large amounts of cash through the tax haven, which the Chinese leader said posed national security risks.

Macau, a tax haven that operates under China's "One Country, Two Systems" policy, which gives the region a high degree of governing autonomy, agreed to crack down on the VIP travel industry to limit the illicit flow of money from the mainland to Macau for the specific purpose of gambling. As a result, junkets are largely gone from the city.

That has forced the six casino companies, which have invested many billions of dollars each into their resorts around town, to switch their focus to the general and premium mass public.

Some Macau analysts believe the casinos have already successfully pivoted to the general and coveted premium mass-public demographics.

Those market observers are optimistic about 2024 and the years ahead. Analysts at JPMorgan said last week there are no signs of an impending slowdown. Other brokerages aren't so convinced.

March GGR was a significant improvement from March 2023, but the $2.42 billion represents just 75% of the pre-pandemic March 2019 revenue. Since China and Macau began reopening their borders to international traffic by ending zero-COVID in late 2022, the best month relative to 2019 was December 2023, when GGR returned to 81% of the pre-pandemic level.

"Macau's post-COVID recovery path is slowing as China's economic growth loses momentum," Shirley Zhao and Katia Dmitrieva, economics correspondents for Bloomberg, said Monday. "Slowing growth came despite rising numbers of tourists, suggesting per person spending weakened amid deteriorating consumer sentiment."

Paired with the public possibly scaling back its spending are rising costs for the casinos. Along with inflation, the operators must continue to meet the nongaming investment obligations dictated by their 2022 relicensing terms.

Read more:

Macau Casino Win Tops Expectations, Revenue Climbs to $2.42B - Casino.Org News

Rookie Robotics Team from Small UWS High School Joining the Giants in Robotics Competition – westsiderag.com

Sonia Benowitz is second from left. Credit: Annabelle Malschlin.

By Lisa Kava

Students from the newly formed robotics team at West End Secondary School (WESS), on West 61st Street, are competing in the New York City regionals of the FIRST Robotics Competition (FRC) from April 5-7. The event will take place at the Armory Track and Field Center in Washington Heights.

Founded in 2015, WESS has 500 students in its public high school. How did its novice robotics team secure a spot at FRC, alongside larger, well-established schools known for their STEM (Science, Technology, Engineering, and Math) programs, such as The Bronx High School of Science and Stuyvesant High School?

The story starts in September 2023 when Upper West Sider Sonia Benowitz, 14, entered 9th grade at WESS. She had loved building LEGO robots in WESS's middle school robotics club, "the community of the club and working with friends towards a common goal," she told West Side Rag in a phone interview. But a club did not exist for high school students. So she created one.

First, she approached her school principal, who was supportive, she said. Benowitz then asked her middle school robotics coach, Noah Tom-Wong, to help run the club. Together with math teacher Evan Wheeler, who signed on as faculty leader, they began to spread the word. Soon the club had 25 members from 9th through 12th grade.

With Tom-Wong's guidance, the club members gathered wood, metal, and other supplies, ordering from vendors and robotics companies. They began to build a fully functional robot that could perform various tasks through remote wireless control. "For example, one task is that the robot will use its arms that we built to pick up disks shaped like frisbees," Benowitz said, "then throw the disks into a goal area."

Tom-Wong suggested the club enter the FIRST Robotics Competition, in which he had competed as a student at Stuyvesant High School. He volunteers frequently at FRC competitions. "Robotics provides students [with] an incredibly unique environment where they can exert energy safely and with great impact," he told the Rag. "The nature of the competition not only makes students good at STEM, but also [at] STEM communication."

But the $6,000 registration fee for the competition was not in the school budget. That's when Samantha Alvarez Benowitz, Sonia's mom, got involved. Through her research, she learned about a rookie grant from NASA through its Robotics Alliance Project. The WESS team applied and got it. According to Alvarez Benowitz, WESS was the only school in New York City selected to receive the NASA grant, and one of five schools in New York state.

"On the application we had to describe who was on our team, so I did a demographic survey and found that close to 70% of our team members are from historically underrepresented groups in STEM, including women, people of color, LGBTQ+, and students with disabilities," Sonia Benowitz said. "They also wanted to know how we would get and pay for the supplies we needed to build the robot." The team has been fundraising through bake sales and other school functions. They also applied for grants, receiving $2,500 from the Gene Haas Foundation, the philanthropic arm of machine tool maker Haas Automation, which sponsors STEM education.

At the competition the WESS team will be paired with two other teams to form a three-team alliance. Each team has its own robot, which will be programmed to perform different tasks. The robots are judged and awarded points. "We have to prepare our robot to complete as many tasks as possible, but also to complete tasks as well as possible," Benowitz explained. The WESS robot has been programmed to drive up a ramp onto a platform, "like a car on a road," Alvarez Benowitz added. The ramp and platform are part of an existing set that all the teams use.

Working collaboratively is crucial, according to Tom-Wong. "The work that comes out of these robotics teams can be very complex," he said. "It's not unusual at competitions to see students from multiple teams working together to fix one team's problem." The top five teams will compete in the championships in Houston at the end of April.

Benowitz is excited about the competition. "Our team has been working towards this moment for months, and we have all put in a lot of time and effort to get here." She is also a little nervous. "I hope that our robot won't have any problems or break in the middle of a match."

Tom-Wong credits the rookie team for its perseverance. "The group had to work with less stock and fewer tools [than most teams]. We also do not have the experience that the veteran teams have," he told the Rag. He is hopeful that WESS students will remain active in robotics in future years. "Ultimately this group is unique in that they are pioneering the robotics program at WESS. They are laying the groundwork for a place where students can push themselves to learn and develop."


See more here:

Rookie Robotics Team from Small UWS High School Joining the Giants in Robotics Competition - westsiderag.com

Notus robotics team is headed to 2024 FIRST Championship – KTVB.com

Notus Jr/Sr High School's robotics team of five students is headed to the 2024 FIRST Championship in Houston, Texas.

BOISE, Idaho - A small robotics team from Notus Jr/Sr High School is living the classic underdog story after qualifying to compete at a world championship.

The team of five students will be heading to Houston, Texas to participate in the 2024 FIRST Championship. On Friday, KTVB spoke to the team advisor, Nick Forbes, who said this is the first year the program has been offered at Notus. But that hasn't stopped them.

In March of 2024, team 9726 received the Rookie of the Year All-Star Award after competing in Boise. A few days later, they were invited to compete on the world stage.

According to the FIRST website, the game changes with every new season, and students need to build a robot to achieve its goal. This year's game is called 'CRESCENDO.'

While FIRST's rules recommend a team should consist of 10 students, team 9726 won with half that. But, a student told KTVB, it hasn't been without some challenges.

"It was entirely made from duct tape, zip ties, and just things that we had to find around," Ezekiel said. "There were sometimes things that we had to improvise through 3-D printings and other things. We're very proud of the work we've done."

He said their robot mainly plays defense, utilizing a wall, which helped them secure a spot at worlds.

The world championship in Houston kicks off on April 16.

View original post here:

Notus robotics team is headed to 2024 FIRST Championship - KTVB.com

The evolution of robotics: research and application progress of dental implant robotic systems | International Journal of … – Nature.com

Implantology is widely considered the preferred treatment for patients with partial or complete edentulous arches.34,35 The success of the surgery in achieving good esthetic and functional outcomes is directly related to correct and prosthetically-driven implant placement.36 Accurate implant placement is crucial to avoid potential complications such as excessive lateral forces, prosthetic misalignment, food impaction, secondary bone resorption, and peri-implantitis.37 Any deviation during the implant placement can result in damage to the surrounding blood vessels, nerves, and adjacent tooth roots and even cause sinus perforation.38 Therefore, preoperative planning must be implemented intraoperatively with utmost precision to ensure quality and minimize intraoperative and postoperative side effects.39

Currently, implant treatment approaches are as follows: free-handed implant placement, static computer-aided implant placement, and dynamic computer-aided implant placement. The widely used free-handed implant placement provides less predictable accuracy and depends on the surgeon's experience and expertise.40 Deviation in implant placement is relatively large among surgeons with different levels of experience. When novice surgeons face complex cases, achieving satisfactory results can be challenging. A systematic review41 based on six clinical studies indicated that the ranges of deviation of the platform, apex, and angle from the planned position with free-handed implant placement were (1.25±0.62) mm to (2.77±1.54) mm, (2.10±1.00) mm to (2.91±1.52) mm, and 6.90°±4.40° to 9.92°±6.01°, respectively. Static guides could only provide accurate guidance for the initial implantation position. However, it is difficult to precisely control the depth and angle of osteotomies.42 The lack of real-time feedback on drill positioning during surgery can limit the clinician's ability to obtain necessary information.42,43,44 Besides, surgical guides may also inhibit the cooling of the drills used for implant bed preparation, which may result in necrosis of the overheated bone. Moreover, the use of static guides is limited in patients with limited accessibility, especially for those with implants placed in the posterior area. Additionally, the use of guides cannot flexibly adjust the implant plan intraoperatively. With dynamic computer-aided implant placement, the positions of the patient and drills can be tracked in real time and displayed on a computer screen along with the surgical plan, thus allowing the surgeon to adjust the drilling path if necessary. However, the surgeon may deviate from the plan or prepare beyond it without physical constraints. During surgery, the surgeon may focus more on the screen for visual information rather than the surgical site, which can lead to reduced tactile feedback.45 The results of a meta-analysis showed that the platform deviation, apex deviation, and angular deviation were 0.91 mm (95% CI 0.79–1.03 mm), 1.26 mm (95% CI 1.14–1.38 mm), and 3.25° (95% CI 2.84°–3.66°), respectively, with static computer-aided implant placement, and 1.28 mm (95% CI 0.87–1.69 mm), 1.68 mm (95% CI 1.45–1.90 mm), and 3.79° (95% CI 1.87°–5.70°), respectively, with dynamic computer-aided implant placement. The analysis results showed that both methods improved accuracy compared to free-handed implant placement, but they still did not achieve ideal accuracy.46 Gwangho et al.47 believe that the key steps of a surgical operation are still completed manually by surgeons, regardless of static guidance or dynamic navigation, and that human factors (such as hand tremor, fatigue, and unskilled operation techniques) also affect the accuracy of implant placement.
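The review reports these deviations without defining them; for reference, they are conventionally computed as the 3D distance between the planned and placed implant positions at the platform and apex, and as the angle between the planned and placed implant axes (a standard definition, not a formula quoted from the cited studies):

```latex
\Delta_{\text{platform}} = \lVert \mathbf{p}_{\text{placed}} - \mathbf{p}_{\text{planned}} \rVert_2 ,\qquad
\Delta_{\text{apex}} = \lVert \mathbf{a}_{\text{placed}} - \mathbf{a}_{\text{planned}} \rVert_2 ,\qquad
\theta = \arccos\!\left( \frac{\mathbf{v}_{\text{placed}} \cdot \mathbf{v}_{\text{planned}}}{\lVert \mathbf{v}_{\text{placed}} \rVert \, \lVert \mathbf{v}_{\text{planned}} \rVert} \right)
```

where p and a are the platform and apex points and v the implant axis vectors.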

Robotic-assisted implant surgery could provide accurate implant placement and help the surgeon control handpieces to avoid dangerous tool excursions during surgery.48 Furthermore, compared to manual calibration, registration, and surgery execution, automatic calibration, registration, and drilling using the dental implant robotic system reduces human error factors. This, in turn, helps avoid deviations caused by surgeons factors, thereby enhancing surgical accuracy, safety, success rates, and efficiency while also reducing patient trauma.7 With the continuous improvement of technology and reduction of costs, implant robotics are gradually becoming available for commercial use. Yomi (Neocis Inc., USA) has been approved by the Food and Drug Administration, while Yakebot (Yakebot Technology Co., Ltd., Beijing, China), Remebot (Baihui Weikang Technology Co., Ltd, Beijing, China), Cobot (Langyue dental surgery robot, Shecheng Co. Ltd., Shanghai, China), Theta (Hangzhou Jianjia robot Co., Ltd., Hangzhou, China), and Dcarer (Dcarer Medical Technology Co., Ltd, Suzhou, China) have been approved by the NMPA. Dencore (Lancet Robotics Co., Ltd., Hangzhou, China) is in the clinical trial stage in China.

Compared to other surgeries performed under general anesthesia, dental implant surgery can be completed under local anesthesia, with patients awake but unable to remain completely still throughout the entire procedure. Therefore, research related to dental implant robotic systems, as one of the cutting-edge technologies, mainly focuses on acquiring intraoperative feedback information (including tactile and visual information), different surgical methods (automatic drilling and manual drilling), patient position following, and the simulation of the surgeon's tactile sensation.

The architecture of dental implant robotics primarily comprises the hardware utilized for surgical data acquisition and surgical execution (Fig. 4). Data acquisition involves perceiving, identifying, and understanding the surroundings and the information required for task execution through encoders, tactile sensors, force sensors, and vision systems. Real-time information obtained also includes the robot's surrounding environment, object positions, shapes, sizes, surface features, and other relevant information. The perception system assists the robot in comprehending its working environment and facilitates corresponding decision-making as well as actions.

The architecture of dental implant robotics

During the initial stage of research on implant robotics, owing to the lack of sensory systems, fiducial markers and corresponding algorithms were used to calculate the transformation relationship between the robot's and the model's coordinate systems. The robot was able to determine the actual position through coordinate conversions. Dutreuil et al.49 proposed a new method for creating static guides on casts using robots based on the determined implant position. Subsequently, Boesecke et al.50 developed a surgical planning method using linear interpolation between start and end points, as well as intermediate points. The surgeon performed the osteotomies by holding the handpieces, with the robot guidance based on the preoperatively determined implant position. Sun et al.51 and McKenzie et al.52 registered cone-beam computed tomography (CBCT) images, the robot's coordinate system, and the patient's position using a coordinate measuring machine, which facilitated the transformation of preoperative implant planning into intraoperative actions.
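The coordinate conversion these early systems relied on is, at its core, a rigid-body fit between matched fiducial points expressed in the model (CBCT) frame and in the robot frame. The sketch below shows the standard least-squares (Kabsch/SVD) solution in Python; it illustrates the general technique, not the specific algorithm used in the cited studies, and the sample fiducial values are placeholders.

```python
import numpy as np

def rigid_transform(model_pts: np.ndarray, robot_pts: np.ndarray):
    """Least-squares rigid transform (R, t) mapping model_pts onto robot_pts.

    Both arrays hold N matched fiducial coordinates (N x 3). This is the
    classic Kabsch/SVD solution used for point-based registration.
    """
    cm, cr = model_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (model_pts - cm).T @ (robot_pts - cr)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cr - R @ cm
    return R, t

# A planned implant position in CBCT/model coordinates can then be mapped
# into robot coordinates (fiducial values below are placeholders):
model_fiducials = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.], [0., 0., 10.]])
robot_fiducials = model_fiducials + np.array([100., 50., 20.])   # pure translation
R, t = rigid_transform(model_fiducials, robot_fiducials)
planned_robot = R @ np.array([5.0, 2.5, -1.0]) + t
```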

Neocis has developed a dental implant robot system called Yomi (Neocis Inc.)53 based on haptic perception and connects a mechanical joint measurement arm to the patient's teeth to track their position. The joint encoder provides information on the drill position, while the haptic feedback of handpieces maneuvered by the surgeon constrains the direction and depth of implant placement.

Optical positioning is a commonly used localization method that offers high precision, a wide field of view, and resistance to interference.54 This makes it capable of providing accurate surgical guidance for robotics. Yu et al.55 combined image-guided technology with robotic systems. They used a binocular camera to capture two images of the same target, extract pixel positions, and employ triangulation to obtain three-dimensional coordinates. This enabled perception of the relative positional relationship between the end-effector and the surrounding environment. Yeotikar et al.56 suggested mounting a camera on the end-effector of the robotic arm, positioned as close to the drill as possible. By aligning the camera's center with the drill's line of sight at a specific height on the lower jaw surface, the camera's center accurately aligns with the drill's position in a two-dimensional space at a fixed height from the lower jaw. This alignment guides the robotic arm in drilling through specific anatomical landmarks in the oral cavity. Yan et al.57 proposed that the use of eye-in-hand optical navigation systems during surgery may introduce errors when changing the handpiece at the end of the robotic arm. Additionally, owing to the narrow oral environment, customized markers may fall outside the camera's field of view when the robotic arm moves to certain positions.42 To tackle this problem, a dental implant robot system based on optical marker spatial registration and probe positioning strategies was designed. Zhao et al. constructed a modular implant robotic system based on binocular visual navigation devices operating on the principles of visible light in an eye-to-hand mode, allowing complete observation of markers and handpieces within the camera's field of view, thereby ensuring greater flexibility and stability.38,58
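Yu et al.'s binocular approach boils down to triangulating a pixel correspondence from two calibrated views. A minimal sketch using OpenCV is shown below; the camera matrices and pixel coordinates are invented placeholders (roughly a 6 cm-baseline rig looking at a point 2 m away), not values from the cited work.

```python
import numpy as np
import cv2

# Projection matrices P = K [R | t] would come from stereo calibration;
# these numbers are placeholders for an ~6 cm baseline rig.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])])

# Pixel coordinates of the same target extracted from each image (2 x N arrays).
pt_left = np.array([[320.0], [240.0]])
pt_right = np.array([[296.0], [240.0]])

# cv2.triangulatePoints returns homogeneous 4 x N coordinates; divide by w.
X_h = cv2.triangulatePoints(P_left, P_right, pt_left, pt_right)
X = (X_h[:3] / X_h[3]).ravel()
print("Target in the left-camera frame (m):", X)   # approximately [0, 0, 2]
```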

The dental implant robotics execution system comprises hardware such as motors, force sensors, actuators, controllers, and software components to perform tasks and actions during implant surgery. The system receives commands, controls the robot's movements and behaviors, and executes the necessary tasks and actions. Presently, research on dental implant robotic systems primarily focuses on the mechanical arm structure and drilling methods.

The majority of dental implant robotic systems directly adopt serial-linked industrial robotic arms based on the successful application of industrial robots with the same robotic arm connection.59,60,61,62 These studies not only establish implant robot platforms to validate implant accuracy and assess the influence of implant angles, depths, and diameters on initial stability but also simulate chewing processes and prepare natural root-shaped osteotomies based on volume decomposition. Presently, most dental implant robots in research employ a single robotic arm for surgery. Lai et al.62 indicated that the stability of the handpieces during surgery and real-time feedback of patient movement are crucial factors affecting the accuracy of robot-assisted implant surgery. The former requires physical feedback, while the latter necessitates visual feedback. Hence, they employed a dual-arm robotic system where the main robotic arm was equipped with multi-axis force and torque sensors for performing osteotomies and implant placement. The auxiliary arm consisted of an infrared monocular probe used for visual system positioning to address visual occlusion issues arising from changes in arm angles during surgery.

The robots mentioned above use handpieces to execute osteotomies and implant placement. However, owing to limitations in patient mouth opening, performing osteotomies and placing implants in the posterior region can be challenging. To overcome the spatial constraints during osteotomies in implant surgery, Yuan et al.63 proposed a robot system based on their earlier research on laser-assisted tooth preparation. This system uses a non-contact ultra-short pulse laser for preparing osteotomies. The preliminary findings confirmed the feasibility of robotically controlling ultra-short pulse lasers for osteotomies, introducing a novel method for a non-contact dental implant robotic system.

It can be challenging for patients under local anesthesia to remain completely still during robot-assisted dental implant surgery.52,64,65,66,67 Any significant micromovement in the patient's position can severely affect clinical surgical outcomes, such as surgical efficiency, implant placement accuracy compared to the planned position, and patient safety. Intraoperative movement may necessitate re-registration for certain dental implant robotic systems. In order to guarantee safety and accuracy during surgery, the robot must detect any movement in the patient's position and promptly adjust the position of the robotic arm in real time. Yakebot uses binocular vision to monitor visual markers placed outside the patient's mouth and at the end of the robotic arm. This captures motion information and calculates relative position errors. The robot control system utilizes preoperatively planned positions, visual and force feedback, and robot kinematic models to calculate optimal control commands for guiding the robotic arm's micromovements and tracking the patient's micromovements during drilling. As the osteotomies are performed to the planned depth, the robotic arm compensates for the patient's displacement through the position-following function. The Yakebot's visual system continuously monitors the patient's head movement in real time and issues control commands every 0.008 s. The robotic arm is capable of following the patient's movements with a motion servo in just 0.2 s, ensuring precise and timely positioning.
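The position-following behavior described above can be pictured as a fast compensation loop: measure the patient's displacement from the pose recorded at registration and shift the commanded drill target by the same amount each control cycle. The sketch below is a deliberately simplified illustration (no filtering, safety limits, or force feedback), not Yakebot's actual controller; the callbacks are hypothetical.

```python
import time

CONTROL_PERIOD_S = 0.008   # the article states commands are issued every 0.008 s

def follow_patient(get_patient_pose, planned_target, send_arm_target, should_stop):
    """Toy position-following loop.

    get_patient_pose() returns the tracked head/jaw position as a 3-vector
    (e.g., a NumPy array), planned_target is the drill target in the same frame,
    and send_arm_target() commands the arm. Each cycle the planned target is
    offset by the patient's displacement from the registration pose, so the
    drill tracks micromovements of the head.
    """
    reference = get_patient_pose()                      # pose at registration
    while not should_stop():
        displacement = get_patient_pose() - reference   # patient micromovement
        send_arm_target(planned_target + displacement)  # compensated command
        time.sleep(CONTROL_PERIOD_S)
```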

Robot-assisted dental implant surgery requires the expertise and tactile sense of a surgeon to ensure accurate implantation. Experienced surgeons can perceive bone density through the resistance they feel in their hands and adjust the force magnitude or direction accordingly. This ensures proper drilling along the planned path. However, robotic systems lack perception and control, which may result in a preference for the bone side with lower density. This can lead to inaccurate positioning compared to the planned implant position.61,62 Addressing this challenge, Li et al.68 established force-deformation compensation curves in the X, Y, and Z directions for the robot's end-effector based on the visual and force servo systems of the autonomous dental robotic system, Yakebot. Subsequently, a corresponding force-deformation compensation strategy was formulated for this robot, thus proving the effectiveness and accuracy of force and visual servo control through in vitro experiments. The implementation of this mixed control mode, which integrates visual and force servo systems, has improved the robot's accuracy in implantation and ability to handle complex bone structures. Based on force and visual servo control systems, Chen et al.69 have also explored the relationship between force sensing and the primary stability of implants placed using the Yakebot autonomous dental robotic system through an in vitro study. A significant correlation was found between Yakebot's force sensing and the insertion torque of the implants. This correlation conforms to an interpretable mathematical model, which facilitates the predictable initial stability of the implants after placement.
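The force-deformation compensation idea can be illustrated with a simple per-axis model: calibrate how much the end-effector deflects per newton of lateral load, then bias the commanded position against the deflection predicted from the live force reading. The compliance values and linear form below are hypothetical stand-ins; Li et al. fit their own compensation curves from measured data.

```python
import numpy as np

# Hypothetical per-axis compliance (mm per N), standing in for calibrated curves.
COMPLIANCE_MM_PER_N = np.array([0.004, 0.004, 0.002])   # x, y, z

def compensated_target(planned_target_mm: np.ndarray,
                       measured_force_n: np.ndarray) -> np.ndarray:
    """Bias the commanded position against the predicted structural deflection.

    measured_force_n is the force the bone exerts on the tool; the structure
    deflects by roughly COMPLIANCE * force in that direction, so the command is
    shifted the opposite way so the tool tip lands on the planned point.
    """
    predicted_deflection = COMPLIANCE_MM_PER_N * measured_force_n
    return planned_target_mm - predicted_deflection

# Example: a lateral reaction force from denser bone would push the drill off
# the planned axis; the command is biased back to cancel the deflection.
print(compensated_target(np.array([12.0, -3.5, 40.0]), np.array([8.0, -2.0, 25.0])))
```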

During osteotomies with heat production (which is considered one of the leading causes of bone tissue injury), experienced surgeons can sense possible thermal exposure through the feel of their hands. However, with free-handed implant placement surgery, it is challenging to perceive temperature changes during the surgical process and establish an effective temperature prediction model that relies solely on a surgeon's tactile sense. Zhao et al.70, using the Yakebot robotic system, investigated the correlation between drilling-related mechanical data and heat production and established a clinically relevant surrogate for intraosseous temperature measurement using force/torque sensor-captured signals. They also established a real-time temperature prediction model based on real-time force sensor monitoring values. This model aims to effectively prevent the adverse effects of high temperatures on osseointegration, laying the foundation for the dental implant robotic system to autonomously control heat production and prevent bone damage during autonomous robotic implant surgery.
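As a rough illustration of how such a surrogate model can work, the sketch below fits a regression from windowed force/torque features to temperature and then evaluates new drilling windows in real time. The features, data, and model form are invented for illustration; Zhao et al.'s published model is derived from their own experimental measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def window_features(force_z: np.ndarray, torque_z: np.ndarray) -> np.ndarray:
    """Summarize one drilling window (e.g., 1 s of samples) into a few features."""
    return np.array([force_z.mean(), force_z.max(), torque_z.mean(), torque_z.max()])

rng = np.random.default_rng(0)

# Training data would come from bench drilling experiments with a thermocouple;
# here it is synthetic, purely to make the sketch runnable.
X_train = np.vstack([window_features(rng.random(100) * 30, rng.random(100) * 0.5)
                     for _ in range(50)])
y_train = 36.5 + 0.2 * X_train[:, 1] + rng.normal(0, 0.3, size=50)

model = LinearRegression().fit(X_train, y_train)

# During drilling, each new window of force/torque samples yields a temperature
# estimate that could trigger a pause or extra irrigation above a threshold.
new_window = window_features(rng.random(100) * 30, rng.random(100) * 0.5)
estimate = model.predict(new_window.reshape(1, -1))[0]
print(f"Estimated intraosseous temperature: {estimate:.1f} degrees C")
```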

The innovative technologies mentioned above allow dental implant robotic systems to simulate the tactile sensation of a surgeon and even surpass the limitations of human experience. This advancement promises to address issues that free-handed implant placement techniques struggle to resolve. Moreover, this development indicates substantial progress and great potential for implantation.

Robot-assisted dental implant surgery consists of three steps: preoperative planning, the intraoperative phase, and the postoperative phase (Fig. 5). For preoperative planning, it is necessary to obtain digital intraoral casts and CBCT data from the patient, which are then imported into preoperative planning software for 3D reconstruction and planning implant placement. For single or multiple tooth gaps using implant robotic systems (except Yakebot),61,62,71,72 a universal registration device (such as the U-shaped tube) must be worn on the patient's missing tooth site using a silicone impression material preoperatively to acquire CBCT data for registration. The software performs virtual placement of implant positions based on prosthetic and biological principles of implant surgery, taking into account the bone quality of the edentulous implant site to determine the drilling sequence, insertion depth of each drill, speed, and feed rate. For single or multiple tooth implants performed using Yakebot, there is no need for preoperative CBCT imaging with markers. However, it is necessary to design surgical accessories with registration holes, brackets for attaching visual markers, and devices for assisting mouth opening and suction within the software (Yakebot Technology Co., Ltd., Beijing, China). These accessories are manufactured using 3D printing technology.

Clinical workflow of robotic-assisted dental implant placement

For the intraoperative phase, the first step is preoperative registration and calibration. For Yakebot, the end-effector marker is mounted to the robotic arm, and the spatial positions are recorded under the optical tracker. The calibration plate with the positioning points is then assembled into the implant handpiece for drill tip calibration. Then, the registration probe is inserted in the registration holes of the jaw positioning plate in turn for spatial registration of the jaw marker and the jaw. Robot-assisted dental implant surgery usually does not require flapped surgery,73,74 yet bone grafting due to insufficient bone volume in a single edentulous space or cases of complete edentulism requiring alveolar ridge preparation may require elevation of flaps. For full-arch robot-assisted implant surgery, a personalized template with a positioning marker is required and should be fixed with metallic pins for an intraoperative CBCT examination, thus facilitating the registration of the robot and the jaw in the visual space and allowing the surgical robot to track the patient's motion. The safe deployment of a robot from the surgical site is an essential principle for robot-assisted implant surgery. In the case of most robots, such as Yomi, the surgeon needs to hold the handpieces to control and supervise the robot's movement in real time and stop the robotic arm's movement in case of any accidents. With Yakebot, the entire surgery is performed under the surgeon's supervision, and immediate instructions are sent in response to possible emergencies via a foot pedal. Additionally, the recording of the entrance to and exit from the patient's mouth ensures that the instruments do not damage the patient's surrounding tissues. The postoperative phase aims at postoperative CBCT acquisition and accuracy measurement.
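For the drill-tip calibration step, one generic approach (shown here only as an illustration; the workflow above uses a calibration plate, and vendors' procedures differ) is pivot calibration: the tracked handpiece is pivoted about a fixed point, and the constant tip offset is recovered by least squares.

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Estimate the drill-tip offset in the tracked-marker frame.

    rotations: list of 3x3 marker orientations; translations: list of 3-vectors,
    both recorded while the tip rests on one fixed point. For every pose i,
    R_i @ tip_offset + t_i = pivot_point, which stacks into a linear system.
    """
    A, b = [], []
    for R, t in zip(rotations, translations):
        A.append(np.hstack([R, -np.eye(3)]))      # unknowns: [tip_offset, pivot_point]
        b.append(-np.asarray(t, dtype=float))
    x, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
    return x[:3], x[3:]                            # tip offset, pivot point
```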

In clinical surgical practice, robots with varying levels of autonomy perform implant surgeries differently. According to the autonomy levels classified by Yang et al.6,8,33 for medical robots, commercial dental implant robotic systems (Table 2) currently operate at the level of robot assistance or task autonomy.

Robot-assistance-level dental implant robotic systems provide haptic,75 visual, or combined visual and tactile guidance during dental implant surgery.46,76,77 Throughout the procedure, surgeons must maneuver handpieces attached to the robotic guidance arm and apply light force to prepare osteotomies.62 The robotic arm constrains the drill to the 3D space defined by the virtual plan, enabling surgeons to move the end of the mechanical arm horizontally or adjust its movement speed. However, during immediate implant placement or full-arch implant surgery, both surgeons and robots may struggle to accurately perceive poor bone quality, which should prompt adjustments at the time of implant placement. This can lead to incorrect final implant positions compared to the planned locations.

The task-autonomous dental implant robotic systems can autonomously perform partial surgical procedures, such as adjusting the position of the handpiece to the planned position and preparing the implant bed at a predetermined speed according to the pre-operative implant plan, and surgeons should send instructions, monitor the robot's operation, and perform partial interventions as needed. For example, the Remebot77,78 requires surgeons to drag the robotic arm into and out of the mouth during surgery, and the robot automatically performs osteotomies or places implants according to planned positions under the surgeon's surveillance. The autonomous dental implant robot system, Yakebot,73,79,80 can accurately reach the implant site and complete operations such as implant bed preparation and placement during surgery. It can be controlled by the surgeon using foot pedals and automatically stops drilling after reaching the termination position before returning to the initial position. Throughout the entire process, surgeons only need to send commands to the robot using foot pedals.

Figure 6 shows the accuracy results of in vitro, in vivo, and clinical studies on robot-assisted implant surgery.20,46,48,55,62,64,67,68,69,70,71,72,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89 The results suggest that platform and apex deviation values are consistent across different studies. However, there are significant variations in angular deviations among different studies, which may be attributed to differences in the perception and responsiveness to bone quality variances among different robotic systems. Therefore, future development should focus on enhancing the autonomy of implant robots and improving their ability to recognize and respond to complex bone structures.

Accuracy reported in studies on robotic-assisted implant placement

Xu et al.77 conducted a phantom experimental study comparing the implant placement accuracy of three levels of dental implant robotics, namely a passive robot (Dcarer, level 1), a semi-active robot (Remebot, level 2), and an active robot (Yakebot, level 2) (Fig. 7). The study found that the active robot had the lowest deviations at the platform and apex of the planned and actual implant positions, while the semi-active robot had the lowest angular deviations. Chen et al.46 and Jia et al.79 conducted clinical trials of robotic implant surgery in partially edentulous patients using a semi-active dental implant robotic system (level 1) and an autonomous dental implant robot (level 2). The deviations of the implant platform, apex, and angle were (0.53±0.23) mm/(0.43±0.18) mm, (0.53±0.24) mm/(0.56±0.18) mm, and 2.81°±1.13°/1.48°±0.59°, respectively. These results consistently confirmed that robotic systems can achieve higher implant accuracy than static guidance and that there is no significant correlation between accuracy and implant site (such as anterior or posterior site). The platform and angle deviations of the autonomous dental implant robot were smaller than those of the semi-active dental implant robotic system. Li et al.73 reported the use of the autonomous dental implant robot (level 2) to complete the placement of two adjacent implants with immediate postoperative restoration. The interim prosthesis fabricated prior to implant placement was seated without any adjustment, and no adverse reactions occurred during the operation.

Comparison of accuracy of dental implant robotics with different levels of autonomy (phantom experiments) (*P<0.05, **P<0.01, ***P<0.001)

Bolding et al.,53 Li et al.,20 Jia et al.,79 and Xie et al.90 used dental implant robots to conduct clinical trials in full-arch implant surgery with five or six implants placed in each jaw. The deviations of implant platform, apex, and angle are shown in Fig. 8. The haptic dental implant robot (level 1) used by Bolding et al.53 achieved larger deviations compared to other studies that used semi-active (level 1) or active robots (level 2). As its handpiece must be maneuvered by the surgeon, human errors such as surgeon fatigue may not be avoided. Owing to the parallel common implant placement paths between the various implant abutments, prefabricated temporary dentures could be seated smoothly, and some patients wore temporary complete dentures immediately after surgery. These results indicate that robotic systems can accurately locate and perform implant placement during surgery.

Comparison of accuracy in robotic-assisted full-arch implant placement

As there are relatively few studies of implant robots in clinical applications, Takács et al.91 conducted a meta-analysis under in vitro conditions comparing free-handed, static-guided, dynamic navigated, and robotic-assisted implant placements, as shown in Fig. 9. It was found that, compared to free-handed, static-guided, and dynamic navigated implant placements, robotic-assisted implant placement has advantages in terms of accuracy. However, in vitro studies cannot fully simulate the patient's oral condition and bone quality. Recent clinical studies89,92,93 have shown a lower deviation in robotic-assisted implant placements compared to static-guided and dynamic-navigated implant placements. Common reasons for deviations in static-guided and dynamic-navigated implant placements include the following: deflection caused by hand tremors due to dense bone during surgery, the surgeon's experience, and other human factors. Larger clinical studies will be needed in the future to evaluate the differences between robotic and conventional surgical approaches and to provide guidance for the further development and refinement of robotic techniques.

Comparison of accuracy of free-handed, static, dynamic, and robotic-assisted implant placement. (FHIP free-hand implant placement, SCAIP static computer-aided implant placement, DCAIP dynamic computer-aided implant placement, RAIP robot-assisted implant placement)

Regarding the long-term performance of robotic systems used in dental implant procedures, none of the comparative studies had a follow-up longer than a year. One 1-year prospective clinical study by Xie et al.90 showed that the peri-implant tissues remained stable at the 1-year visit after robot-assisted full-arch surgery. There is little evidence on clinical outcomes, especially patient-reported outcomes. A more detailed clinical assessment should be included in further research.

Although robotic-assisted dental implant surgery can improve accuracy and treatment quality,94 it involves complex registration, calibration, and verification procedures that prolong the duration of surgery. These tedious processes may introduce new errors61 and lower work efficiency, especially in single-tooth implant placement,62 which could extend visit times and affect patient satisfaction. Besides, surgeons are required to undergo additional training to familiarize themselves with the robotic system.87

During implantation, the drill tips at the end of the robotic arms cannot be tilted, and this can increase the difficulty of using robots in posterior sections with limited occlusal space.61,62 In addition, currently available marker systems require patients to wear additional devices to hold the marker in place. If these markers are contaminated or obstructed by blood, the visual system may not be able to detect them, limiting surgical maneuverability to some extent. During immediate implant placement or in cases of poor bone quality in the implant site, the drill tips may deviate towards the tooth sockets or areas of lower bone density, seriously affecting surgical precision.

Currently, only one study has developed a corresponding force-deformation compensation strategy for robots,68 but clinical validation is still lacking. Additionally, the dental implant robotic system, along with other dental implant robots developed for prosthetics, endodontics, and orthodontics, is currently single-functional. Multi-functional robots are required for performing various dental treatments.

Despite the enormous potential of robotic systems in the medical field, similar to the development of computer-aided design/computer-aided manufacturing technology, introducing and applying this technology faces multiple challenges in the initial stages. The high cost of robotic equipment may limit its promotion and application in certain regions or medical institutions. Surgeons require specialized technical training before operating robotic systems, which translates to additional training costs and time investment.95

Go here to see the original:

The evolution of robotics: research and application progress of dental implant robotic systems | International Journal of ... - Nature.com

Nvidia Announces Robotics-Oriented AI Foundational Model – InfoQ.com

At its recent GTC 2024 event, Nvidia announced a new foundational model to build intelligent humanoid robots. Dubbed GR00T, short for Generalist Robot 00 Technology, the model will understand natural language and be able to observe human actions and emulate human movements.

According to Nvidia CEO Jensen Huang, creating intelligent humanoid robots is the most exciting AI problem today. GR00T robots will learn coordination and other skills by observing humans to be able to navigate, adapt and interact with the real world. At the conference keynote, Huang showed several demos of what GR00T is capable of at the moment, including some robots performing a number of tasks.

The GR00T model takes multimodal instructions and past interactions as input and produces the actions for the robot to execute.
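Nvidia has not published a programming interface for GR00T; purely to make the stated input/output relationship concrete, a hypothetical policy interface might look like the following sketch (all names here are invented for illustration, not an actual Nvidia API).

```python
from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class Observation:
    instruction: str                  # natural-language command
    images: Sequence                  # camera frames (multimodal input)
    proprioception: Sequence[float]   # joint positions, velocities, etc.

@dataclass
class Action:
    joint_targets: List[float]        # low-level command for the robot to execute

class HumanoidFoundationPolicy:
    """Hypothetical wrapper: multimodal instructions plus interaction history in,
    next action out."""

    def act(self, history: List[Observation], current: Observation) -> Action:
        raise NotImplementedError
```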

To power GR00T, Nvidia has created a new family of systems-on-modules, called Jetson Thor, using the latest Blackwell graphics architecture from the company and able to provide 800 teraflops (TFLOPS) of eight-bit floating-point compute.

At the foundation of GR00T lies Nvidia Isaac Sim, an extensible, Omniverse-based platform for robotics simulation aimed to improve the way AI-based robots are designed and tested, according to the company.

To train GR00T at scale, Nvidia has also built a new compute orchestration platform, Nvidia Osmo, aimed at coordinating training and inference across several Nvidia systems, including DGX systems for training, OVX systems for simulation, and IGX and AGX systems for hardware-in-the-loop validation.

Embodied AI models require massive amounts of real and synthetic data. The new Isaac Lab is a GPU-accelerated, lightweight, performance-optimized application built on Isaac Sim specifically for running thousands of parallel simulations for robot learning.

While GR00T is still very much a work in progress, Nvidia has announced two of the building blocks that will compose it, as part of the Isaac platform: a foundational model for robotic-arm manipulators, called Isaac Manipulator, and a collection of hardware-accelerated packages for visual AI and perception, the Isaac Perceptor.

According to Nvidia, Isaac Manipulator

provides up to an 80x speedup in path planning and zero-shot perception increases efficiency and throughput, enabling developers to automate a greater number of new robotic tasks.

On the other hand, Isaac Perceptor aims to improve efficiency and safety in environments where autonomous mobile robots are used, such as in manufacturing and fulfillment operations.

Both the Manipulator and the Perceptor should become available in the next quarter, says Huang.

On a related note, Nvidia has joined the Open Source Robotics Alliance, which aims to provide financial and industry support to the Robot Operating System (ROS). The company has not detailed if they plan to use ROS for GR00T robots, though.

Link:

Nvidia Announces Robotics-Oriented AI Foundational Model - InfoQ.com

Century High School Robotics team becomes champion of 2024 regional – KIMT 3

ROCHESTER, MN -- Century High School's robotics team is the number one seed and champion of the 2024 10,000 Lakes Regional.

Rochester has been represented at the tournament for nearly two decades. The team, Inconceivable 2530, will now advance to the FIRST Robotics Championship in Houston, TX, from April 17 to April 20.

The rest is here:

Century High School Robotics team becomes champion of 2024 regional - KIMT 3

Rainbow Robotics unveils RB-Y1 wheeled, two armed robot – Robot Report


RB-Y1 mounts a humanoid-type double-arm robot on a wheeled, high-speed mobile base. | Credit: Rainbow Robotics

Rainbow Robotics announced the release of detailed specifications for the new RB-Y1 mobile robot. The company recently signed a memorandum of understanding with Schaeffler Group and the Korea Electronics Technology Institute, or KETI, to co-develop the RB-Y1 and other mobile manipulators in Korea.

The past year has seen an explosion in the growth of humanoids, where most of the robots are bipedal and walk on two legs. Likewise, there have been many recent releases of mobile manipulators, or autonomous mobile robots (AMRs) with a single arm manipulator on board the vehicle.

The RB-Y1 is a form of wheeled robot base with a humanoid double-arm robot on top. Rainbow Robotics' robot uses that base to maneuver through its environment and position the arms for manipulation tasks. The company called this configuration a "bimanual manipulator."

To perform various and complex tasks, both arms on the RB-Y1 are equipped with seven degrees of freedom, and they are mounted on a single torso with six axes that can move the body. With this kinematic configuration, it is possible to move more than 50 cm (19.7 in.) vertically, making it possible to perform tasks at various heights.


The maximum driving speed for the RB-Y1 is 2,500 mm/s (5.6 mph), and the company is claiming that the robot can accelerate quickly and turn at higher speeds by leaning the body into the turn. To avoid toppling while in motion, the center of gravity can be safely controlled by dynamically changing the height of the body.
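The article does not give the underlying control law, but the back-of-the-envelope physics of leaning into a turn is straightforward. For a turn of radius r at speed v, the base must lean (or keep its center of gravity low enough) so that the resultant of gravity and centripetal acceleration stays inside the wheelbase; with w the effective half track width and h the center-of-gravity height (both hypothetical here):

```latex
\tan\theta \;\approx\; \frac{v^{2}}{g\,r},
\qquad\text{no-tip condition:}\qquad
\frac{v^{2}}{g\,r} \;\le\; \frac{w}{h}
```

At the quoted top speed of 2.5 m/s and, say, a 2 m turn radius, v²/(g·r) ≈ 0.32, so without leaning the center of gravity must sit no higher than roughly three times the effective half track width; lowering the torso raises that margin, which matches the behavior the company describes.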

The dimensions of the robots are 600 x 690 x 1,400 mm (23.6 x 27.2 x 55.1 in.), and the unit weighs 131 kg (288.8 lb.). The manipulators can each lift 3 kg (6.61 lb.).

At press time, there are not a lot of details about the robot's ability to function using artificial intelligence, and one early video showed it working via teleoperation. It's likely that the demonstrations in the video below are with remote operators.

However, Rainbow Robotics clearly has the goal of making its robot fully autonomous in the future, as more research, development, training, and simulation are completed.

"These days, when generative AI such as ChatGPT and Figure is a hot topic in the robot industry, we have developed a bimanual mobile manipulator in line with the AI era," stated a company spokesperson. "We hope that the platform will overcome the limitations of existing industrial robots and be used in many industrial sites."

Original post:

Rainbow Robotics unveils RB-Y1 wheeled, two armed robot - Robot Report

Comau and Leonardo Want to Elevate Aeronautical Structure Inspection with Cognitive Robotics – DirectIndustry e-Magazine

Robotic company Comau and aerospace company Leonardo are currently testing a self-adaptive robotic solution to enable autonomous inspection of helicopter blades. This could enhance quality inspections and offer greater flexibility without sacrificing precision or repeatability. At a time when the aerospace industry demands faster processes, better control, and higher quality, it requires a new generation of advanced automation. We contacted Simone Panicucci, Head of Cognitive Robotics at Comau, to learn more about this solution and how it could benefit the aerospace industry.

The increasing demand for faster processes in the aerospace industry requires automating complex processes that, until recently, could only be performed manually. When it comes to testing essential structures such as helicopter blades, the potential benefits of automation increase exponentially. Robotic inspection ensures precision and efficiency. It also ensures standardization and full compliance with the testing process by objectively executing each assigned task.

To meet the industry's needs, Comau and Leonardo have been testing an intelligent inspection solution based on Comau's cognitive robotics, on-site in Anagni, Italy, to inspect helicopter blades measuring up to 7 meters.

The solution relies on a combination of self-adaptive robotics, advanced vision systems, and artificial intelligence. Comau's intelligent robot can autonomously perform hammer tests and multispectral surface inspections on the entire nonlinear blade to measure and verify structural integrity, with a granularity exceeding thousands of points.

The robot perceives and comprehends its environment, makes calculated decisions, and intuitively optimizes the entire inspection process.

They will then test the system on another site to enhance MRO (maintenance, repair, and overhaul) service capabilities.

We contacted Simone Panicucci, Head of Cognitive Robotics at Comau, who gave us more details about this collaboration.

Simone Panicucci: The collaboration grew out of Leonardo's need to ensure advanced autonomous inspection of highly critical aviation infrastructure using cognitive robotics. The two companies are collaborating to develop and test a powerful, self-adaptive robotic solution to autonomously inspect helicopter blades up to 7 meters in length. Aerospace is not a sector that is used to automation yet. The high variability and the low volumes act as constraints toward a deep automation adoption. Cognitive robotics solutions are thus a key enabler to provide the automation benefits (such as process engineering, repeatability, and traceability) even with heterogeneous products and unstructured environments, and Comau is leading the creation of AI-based, custom robotic solutions.

Simone Panicucci: The solution developed is a self-adaptive and efficient machine for inspecting really large helicopter blades. It includes a visual inspection as well as a tapping test. The latter consists of physically stimulating the blade surface with a small, purpose-built hammer and recognizing from the resulting sound whether there is any issue in the blade's internal structure. Together, both inspections require testing tens of thousands of points across the blade.

The robot can sense the environment and locate the blade in space with an accuracy below 10 mm. It can also detect potential objects in the scene that the robot may collide with. And it can calculate, at run time, an optimal, collision-free path to complete the task.
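The interview does not disclose how the tap recordings from the test described above are classified; as a generic illustration of the kind of signal processing involved, the sketch below reduces each recorded strike to a few spectral features and flags taps that deviate strongly from a healthy-blade baseline. The feature choices and threshold are assumptions, not Comau's algorithm.

```python
import numpy as np

def tap_features(signal: np.ndarray, sample_rate: float) -> np.ndarray:
    """Reduce one recorded hammer strike to a few illustrative features."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    centroid = float((freqs * spectrum).sum() / spectrum.sum())   # spectral centroid
    energy = float((signal ** 2).sum())                           # strike energy
    ring_down = float(np.abs(signal[-len(signal) // 4:]).mean())  # late decay level
    return np.array([centroid, energy, ring_down])

def is_anomalous(features: np.ndarray, healthy_mean: np.ndarray,
                 healthy_std: np.ndarray, z_threshold: float = 3.0) -> bool:
    """Flag a tap whose features deviate strongly from the healthy baseline."""
    z_scores = np.abs((features - healthy_mean) / healthy_std)
    return bool(np.any(z_scores > z_threshold))
```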

Simone Panicucci: The solution is equipped with a 3D camera whose input is processed by a vision system to merge multiple acquisitions, post-process the acquired scene, and then localize both the helicopter blade and potential obstacles.

Simone Panicucci: All the movements performed by the robot are calculated once the scene has been sensed, which means that no robot movement has been calculated offline. Additional sensors have been added to the robot flange as an external and independent system to avoid damaging the blade.

Simone Panicucci: Today, helicopter blade inspection is done manually. The provided solution offers greater accuracy and efficiency, ensuring standardization and full compliance with the testing process by objectively completing each assigned task. Operators now program the machine, codifying their experience through a simplified user interface. The machine can work for hours without intervention, providing an accurate report summarizing critical points at the end.

Simone Panicucci: The flexibility is given by the fact that the solution is able to deal with different helicopter blade models and potentially even different helicopter components. In addition, accuracy and repeatability are typical automation takeaways, now even improved thanks to vision system adoption. Increased quality is due to the fact that the operator can now focus on the activity where he/she brings most of the value, the defect detection and confirmation, instead of mechanically performing the inspection.

Simone Panicucci: Operator knowledge is always at the center. Leonardo personnel keep the final word regarding the helicopter blade status certification as well as any point inspected. The automation solution aims to alleviate operators from the repetitive task of manually inspecting tens of thousands of points on the helicopter surface. After hours of signal recording, the solution generates a comprehensive report summarizing the results of AI-based anomaly detection. The industrialized solution ensures repeatability, reliability, and traceability, covering and accurately performing the task.

Simone Panicucci: The solution is CE-certified and incorporates both physical and virtual safety measures. Physical barriers and safety lasers create a secure perimeter, halting operations instantly in the event of unexpected human intrusion. Furthermore, the solution ensures safe loading and unloading of helicopter blades and verifies proper positioning by requiring operators to activate safety keys from a distance of approximately 10 meters.

Simone Panicucci: This solution demonstrates that product heterogeneity and low volumes, typical of the aerospace sector, no longer constrain automation adoption. Comau's cognitive robotics approach enables the delivery of effectiveness, quality, and repeatability even in unstructured environments and with low volumes. It easily adapts to different helicopter models and blades. Executing a process like the tapping test necessitated defining requirements and process engineering. This involved defining the material of the tapping tool, as well as the angle and force to apply. Additionally, all labeled data, whether automatic or manual, are now tracked and recorded, facilitating the creation of an extensive knowledge base to train deep learning models.

Simone Panicucci: Leonardo has been conducting tests on this solution as part of a technology demonstration. This technology holds potential benefits for both Leonardo and its customers. It could standardize inspection processes globally and may be offered or deployed to customers with numerous helicopters requiring inspection.

Simone Panicucci: The specific solution could obviously be extended to other inspections in the helicopter sector as well as in avionics. It is also worth noting that, from a technology point of view, the software pipeline, together with the localization and optimal path planning, could readily be applied to other inspection activities as well as to manufacturing or even continuous processes, such as welding.

Simone Panicucci: The next steps involve thorough testing of the automation solution at another Leonardo Helicopters plant. This process will contribute to ongoing improvements in the knowledge base and, consequently, the deep learning algorithm for anomaly recognition.

Continued here:

Comau and Leonardo Want to Elevate Aeronautical Structure Inspection with Cognitive Robotics - DirectIndustry e-Magazine

Google giving $500K to expand robotics and AI education programs in Washington state – GeekWire

U.S. Congresswoman Suzan DelBene joins Google's Paco Galanes, Kirkland site lead and engineering director, right, with students working on robotics projects at Finn Hill Middle School in Kirkland, Wash., on Friday. (Google Photo)

Google's philanthropic arm is giving a $500,000 grant to expand access to robotics and artificial intelligence education programs across Washington state middle schools, the company announced Friday.

In partnership with the non-profits Robotics Education & Competition Foundation (RECF) and For InSpiration and Recognition of Science and Technology (FIRST), Google.org said the grant would support 1,234 new or existing robotics clubs in Washington and reach more than 8,900 students over the course of three years.

The announcement came during an event Friday morning at Finn Hill Middle School in Kirkland, Wash., where students put together robots and were introduced to hands-on STEM tools by Google employee volunteers. The Alphabet-owned tech giant has a sizable workforce in Kirkland and the greater Seattle area.

U.S. Congresswoman Suzan DelBene (D-WA) attended the event and said the investment was key to educating future leaders in robotics and AI.

"Programs like these give young people the opportunity to innovate, build new skills, and open bright new pathways for their future," DelBene said.

The funding is part of a $10 million initiative launched by Google.org to fund FIRST and RECF in communities where the company has a presence.

Continue reading here:

Google giving $500K to expand robotics and AI education programs in Washington state - GeekWire

NEURA and Omron Robotics partner to offer cognitive factory automation – Robot Report

NEURA has developed cognitive robots in a variety of form factors. Source: NEURA Robotics

Talk about combining robotics and artificial intelligence is all the rage, but some convergence is already maturing. NEURA Robotics GmbH and Omron Robotics and Safety Technologies Inc. today announced a strategic partnership to introduce cognitive robotics into manufacturing.

"By pooling our sensor and AI technologies and expertise into an ultimate platform approach, we will significantly shape the future of the manufacturing industry and set new standards," stated David Reger, founder and CEO of NEURA Robotics.

Reger founded the company in 2019 with the intention of combining sensors and AI with robotics components for a platform for app development similar to that of smartphones. The NEURAverse offers flexibility and cost efficiency in automation, according to the company.

"Unlike traditional industrial robots, cognitive robots have the ability to learn from their environment, make decisions autonomously, and adapt to dynamic production scenarios," said Metzingen, Germany-based NEURA. "This opens new application possibilities, including intricate assembly tasks, detailed quality inspections, and adaptive material handling processes."

"We see NEURA's cognitive technologies as a compelling growth opportunity for industrial robotics," added Olivier Welker, president and CEO of Omron Robotics and Safety Technologies. "By combining NEURA's innovative solutions with Omron's global reach and automation portfolio, we will provide customers new ways to increase safety, productivity, and flexibility in their operations."

Pleasanton, Calif.-based Omron Robotics is a subsidiary of OMRON Corp. focusing on automation and safety sensing. It designs and manufactures industrial, collaborative, and mobile robots for various industries.

"We've known Omron for quite some time, and even before I started NEURA, we had talked about collaborating," Reger told The Robot Report. "They've tested our products, and we've worked together on how to benefit both sides."

"We have the cognitive platform, and they're one of the biggest providers of sensors, controllers, and safety systems," he added. "This collaboration will integrate our cognitive abilities and NEURAverse with their sensors for a plug-and-play solution, which everyone is working toward."

Omron Robotics' Olivier Welker and NEURA's David Reger celebrate their partnership. Source: NEURA

When asked whether the NEURA and Omron Robotics partnership is mainly focused on market access, Reger replied, "It's not just the sales channel; there are no really big limits. From both sides, there will be add-ons."

Rather than see each other as competitors, NEURA and Omron Robotics are working to make robots easier to use, he explained.

"As a billion-dollar company, it could have told our startup what it wanted, but Omron is different," said Reger. "I felt we got a lot of respect from Olivier and everyone in that organization. It won't be a one-sided thing; it will be just 'Let's help each other do something great.' That's what we're feeling every day since we've been working together. Now we can start talking about it."

NEURA has also been looking at mobile manipulation and humanoid robots, but adding capabilities to industrial automation is "the low-hanging fruit, where small changes can have a huge effect," said Reger. "A lot of things for humanoids have not yet been solved."

"I would love to just work on household robots, but the best way to get there is to use the synergy between industrial robotics and the household market," he noted. "Our MAiRA, for example, is a cognitive robot able to scan an environment and, from an idle state, pick any known or unknown objects."

MAiRA cognitive robot on MAV mobile base. Source: NEURA Robotics

NEURA and Omron Robotics promise to make robots easier to use, helping overall adoption, Reger said.

"A big warehouse company out of the U.S. is claiming that it's already using more than 1 million robots, but at the same time, I'm sure they'd love to use many more robots," he said. "It's also in the transformation from a niche market into a mass market. We see that's currently only possible if you somehow control the environment."

"It's not just putting all the sensors inside the robot, which we were first to do, and saying, 'OK, now we're able to interact with a human and also pick objects,'" said Reger. "Imagine there are external sensors, but how do you calibrate them? To make everything plug and play, you need new interfaces, which means collaboration with big players like Omron that provide a lot of sensors for the automation market."

NEURA has developed its own sensors and explored the balance of putting processing in the cloud versus the edge. To make its platform as popular with developers as that of Apple, however, the company needs the support of partners like Omron, he said.

Reger also mentioned NEURA's partnership with Kawasaki, announced last year, in which Kawasaki offers the LARA CL series cobot as part of its portfolio. Both collaborations are "incredibly important for NEURA and will soon make sense to everyone," he said.

Reger will be presenting a session on "Developing Cognitive Robotics Systems" at 2:45 p.m. EDT on Wednesday, May 1, Day 1 of the Robotics Summit & Expo. The event will be at the Boston Convention and Exhibition Center, and registration is now open.

"I'll be talking about making robots cognitive to enable AI to be useful to humanity instead of competing with us," he said. "AI is making great steps, but if you look at what it's doing, like drawing pictures or writing stories, these are things that I'd love to do but don't have the time for. But if I ask, let's say, AI to take out the garbage or show it a picture of garbage, it can tell me how to do it, but it's simply not able to do something about it yet."

NEURA is watching humanoid development but is focusing on integrating cognitive robotics with sensing and wearables as it expands in the U.S., said Reger. The company is planning for facilities in Detroit, Boston, and elsewhere, and it is looking for leadership team members as well as application developers and engineers.

"We don't just want a sales office, but also production in the U.S.," he said. "We have 220 people in Germany; I just welcomed 15 new people who joined NEURA and are starting to build our U.S. team. In the past several months, we've gone with only European and American investors, and we're looking at the Japanese market. The U.S. is now open to innovation, and it's an exciting time for us to come."

Read this article:

NEURA and Omron Robotics partner to offer cognitive factory automation - Robot Report

Mercer University hosts GeorgiaFirst Robotics competition – 13WMAZ.com

GeorgiaFirst Robotics is a STEM-based organization for youth across the state.

MACON, Ga. On Saturday, high school students put their wits to the test during the GeorgiaFirst Robotics District Qualifier.

GeorgiaFirst Robotics is a STEM-based organization for youth across the state. They partnered with FIRST in South Carolina and Mercer University to host the competition at Hawkins Arena.

The Peachtree District Championship is the state championship for high school-level robotics, challenging teams of students to build robots that compete in head-to-head matches. The top 50 teams from the Peachtree District will receive an invitation to compete against the best in the district.

At the competition on Saturday, there were 50 teams present and around 200 students. Another thousand people attended as spectators.

Event organizers describe the competition as "combining the excitement of sport with the rigors of science and technology." They say it's "the ultimate sport for the mind," and that high-school student participants call it "the hardest fun you'll ever have."

"Students that are here are not only learning the technical skills in engineering, but they also have skills that they're learning through collaboration through teamwork through communication. Many of them are running social media, they learn how to run a business," Assistant Vice President for Enrollment Dr. Kelly Holloway said.

Programs like GeorgiaFirst Robotics aim to teach technical skills in engineering, teamwork and communication.

During the competition, three teams went head-to-head with another three teams.

The robots needed to complete obstacle courses, such as driving over big donut-like tubes and throwing them into goals. During the first 20 seconds of the match, the robots run autonomously on pre-programmed code; afterward, they are remote-controlled by the students.

Organizers say people might be surprised by what it takes to get these robots up and running and the career paths it opens up for students.

View original post here:

Mercer University hosts GeorgiaFirst Robotics competition - 13WMAZ.com

Pioneering Emotional Intelligence in Robotics: The Rise of Emo – yTech

In a breakthrough for robotics and artificial intelligence (AI), a robot named Emo stands as a testament to technological ingenuity, possessing the capability to learn and replicate human emotional expressions. This development marks a significant stride in narrowing the emotional divide between humans and machines, potentially reshaping the way we interact with robots in a multitude of sectors.

Core Innovation Behind Emo's Emotional Acuity

Emo's core innovation lies in its dual neural network architecture, which empowers the robot with unprecedented emotional intelligence. By utilizing advanced cameras and motor systems, Emo can observe and assimilate human expressions. Over time, its capacity to respond in contextually relevant ways improves, making human-robot interactions increasingly natural and seamless.
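The article does not spell out the architecture in detail, but the dual-network idea can be pictured roughly as follows: one model anticipates the person's expression from recent face observations, and a second maps the target expression onto the robot's facial motors. The sketch below is purely illustrative; the stand-in "networks" are random linear layers, and the dimensions are invented.

import numpy as np
rng = np.random.default_rng(0)
W_predict = rng.normal(size=(8, 32))   # face-landmark window -> expression code
W_actuate = rng.normal(size=(12, 8))   # expression code -> 12 face-motor positions
def predict_expression(landmark_window):
    # Anticipate the human's upcoming expression from recent observations
    return np.tanh(W_predict @ landmark_window)
def motor_commands(expression_code):
    # Map the target expression onto normalized motor positions
    return np.clip(W_actuate @ expression_code, -1.0, 1.0)
observed = rng.normal(size=32)          # placeholder for a flattened landmark window
print(motor_commands(predict_expression(observed)))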

Professor Hod Lipson and his team are the visionaries behind Emo's conceptualization and realization. Their work paves the way for a future where robots can forge emotional bonds with humans, setting a new benchmark in social robotics.

Potential for Transformative Impact Across Industries

The ripple effect of Emo's introduction is vast, with implications for customer service, therapy, elder care, and education. It foretells significant growth within the social robotics market, with affordable manufacturing techniques on the horizon and analysts predicting robust market development bolstered by the integration of empathetic robots in everyday life.

Navigating the Ethical Considerations of Advanced Robotics

Notwithstanding the advancements and promises of Emo's technology, ethical questions loom. Issues surrounding emotional authenticity, privacy, and employment disruptions accentuate the need for conscientious deployment of such robots. This underscores the importance of engaging with ethics-focused organizations like IEEE and ACM, which strive to establish standards that balance technological progress with societal well-being.

In summary, Emo represents a fusion of AI and emotional perception, potentially revolutionizing human-robot interaction and industry practices. Its advent warrants thoughtful consideration of the ethical landscape as we embrace the age of emotionally intelligent machines. The robotic companion's evolution and the industry's path forward will be characterized by ethical vigilance, research brilliance, and insightful analysis, jointly shaping the role of robotics in our future.

Expanding the Market Forecast for Emotionally Intelligent Robots

The global market for social and emotional robotics is expected to experience substantial growth over the coming years. According to a report by MarketsandMarkets, the social robot market in particular is expected to rise from USD 918 million today to over USD 3,900 million within the next decade, expanding at a CAGR of 14.5% during the forecast period. This growth is fueled by increasing adoption in sectors such as personal assistance, education, and healthcare, where such robots can perform tasks ranging from companionship to assisting with cognitive therapy and rehabilitation.
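For readers who want to sanity-check such forecasts, the compounding arithmetic is straightforward (the report's exact base year and horizon are not given here, so this is only an approximation):

def project(value_musd, cagr, years):
    # Compound annual growth: value * (1 + CAGR)^years
    return value_musd * (1 + cagr) ** years
print(round(project(918, 0.145, 10)))  # about 3,555, in the ballpark of the quoted USD 3,900 million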

The emergence of robots like Emo will spur further research and development, reducing costs and enhancing functionalities. This will likely attract investment and increase the accessibility of these robots, thus making them more commonplace in both consumer and commercial environments.

Challenges and Controversies Within the Robotics Industry

Despite these promising market forecasts, the robotics industry faces challenges and controversies that could impact the emotional intelligence sector. One of the primary concerns is job displacement, as robots become capable of performing tasks typically reserved for human workers. This could lead to significant shifts in the labor market and necessitate retraining for those whose jobs are affected.

Another key consideration is data privacy and security, especially with robots that can collect and analyze personal emotional data. Ensuring that this information is used responsibly and securely is paramount to maintaining public trust.

For research, development, and the establishment of standards in robotics, resources can be found through organizations such as IEEE and ACM.

Summary and Industry Outlook

In conclusion, Emo exemplifies the potential for emotion recognition in robotics to drive innovation across various sectors. The social and emotional robot industry is anticipated to flourish, bringing about advancements in how these machines are integrated into our daily lives. As the industry progresses, it will be essential to monitor market dynamics, foster ethical practices, and encourage responsible innovation, thereby ensuring that the evolution of robots like Emo contributes positively to society.

The success of products like Emo and the industry's trajectory will heavily rely on striking a balance between innovation and the humane and ethical application of technology. Thought leaders, developers, and policymakers will need to collaborate to navigate these challenges successfully. The trends in the robotics industry point towards a future where emotionally intelligent machines become an integral part of the fabric of society, enhancing human life while addressing the ethical implications of such profound technological integration.

Continued here:

Pioneering Emotional Intelligence in Robotics: The Rise of Emo - yTech

2 Clovis Unified robotics teams headed to the world championships, now they need your support – KFSN-TV

Read the rest here:

2 Clovis Unified robotics teams headed to the world championships, now they need your support - KFSN-TV

Middle school robotics team going to world championship – The Sparta Independent

Three teams in the Sparta Middle School robotics program made it to the finals of the state competition and one is going on to the 2024 VEX Robotics World Championship.

That team, with three returning members from last year, placed second in states and won the Design Award, which qualified them for the world competition.

This is the program's second year competing.

Last year, five student groups competed for the first time in the VEX IQ Robotics season. Two teams made it to states and one team advanced to the 2023 VEX Robotics World Championship in Dallas.

This year's team is asking for donations to help fund their trip to the world championships.

For information and to donate, go online to http://www.gofundme.com/f/sparta-middle-school-robotics-world-championship?utm_campaign=p_cp%20fundraiser-sidebar&utm_medium=copy_link_all&utm_source=customer&fbclid=IwAR0rfjdfaY3ODlFH3AAFRvivnav_RLe8lNk32Qgz87y1s7Lmgl2uoYk3buM_aem_AUQe3KULgv3YlwbYc9ypbkYVIpE5ShxJepsydptvUulCgaOpXF2_dOgBuFI3zNr2KXJ08_-coghBL0CQuV9cgFcm

See the original post:

Middle school robotics team going to world championship - The Sparta Independent

Ranching of tomorrow: Smooth Ag bringing robotics to ranchers with autonomous Ranch Rover – Graham Leader

By automating the cattle feeding process, the Graham-based company Smooth Ag is looking to bring the innovation of robotics to ranchers through its autonomous Ranch Rover vehicle.

The Ranch Rover was the creation of fourth-generation rancher River McTasney, who had the agricultural lifestyle ingrained in his bones at a young age, growing up on a 3,000-acre ranch and tending 120 head of cattle.

"I went to school at Paint Creek High School, an agriculture community. Most of us kids there grew up working on our own stuff. We have a mechanical skill set from that lifestyle that really equips us with the problem-solving skills that I think a lot of people from outside of the rural community may not quite get," McTasney said. "...So that problem-solving skill set really helped with this later on down the road."

Following high school, McTasney attended Texas A&M University and graduated in 2018 with a degree in construction management. He worked for a year in College Station in sales for an HVAC company before deciding he wanted a break and moved back to his family ranch.

"I was feeding cows and I was like, 'There's got to be a better way to do this.' ...Being one of the only able-bodied people on the ranch to do other stuff, there was other stuff I needed to get done instead of spending three hours a day in the feed pickup," he said. "I started tinkering with different ideas and finally decided that a mobile platform, just like a feed pickup without the driver, was the best way to do it."

McTasney learned to code with the intention of making the dream of the Ranch Rover a reality. Over the next two years he built a conceptual machine on an old pickup truck frame and eventually moved up to the current prototype.

"It has a 4,000-pound payload. It's GPS waypoint navigation fused with machine vision, so it's completely autonomous. They have the ability to set routes and then with those routes set individual feed missions... and those are on a timer," he said. "You can schedule them however you like, you can pick your feed locations (and) pick how much you're going to feed at each of those feed locations."
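As a purely illustrative sketch of the scheduling model McTasney describes (this is not Smooth Ag's software; the waypoints, amounts, and field names are made up), a feed mission could be represented as a timed route of stops, each with its own feed amount, checked against the rover's quoted 4,000-pound payload:

from dataclasses import dataclass
from datetime import time
@dataclass
class FeedStop:
    name: str
    lat: float
    lon: float
    pounds: float                 # feed to drop at this stop
@dataclass
class FeedMission:
    start_at: time                # daily trigger time for the mission
    stops: list
morning = FeedMission(start_at=time(6, 30), stops=[
    FeedStop("north trough", 33.102, -98.589, 150.0),
    FeedStop("creek pasture", 33.095, -98.601, 200.0),
])
def total_payload(mission):
    # Sum the scheduled drops so the mission stays under the 4,000 lb payload
    return sum(stop.pounds for stop in mission.stops)
print(total_payload(morning), "lb scheduled for", morning.start_at)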

The rover has data-driven decision making, which McTasney said can provide owners with information for planning.

"There's a lot of data collection involved as well that's going to be extremely valuable. With computer vision it's one of those things that is hard to see, but the way technological advancements are working out right now, computer vision is getting amazing," he said. "The type of data that we're going to be able to directly feed back to the customer based off of that is actually going to be really insane. It's going to be very valuable. So that's just one of the perks of solving a problem directly: we get to put up those various sensors and cameras on this thing and kind of knock out two birds with one stone."

Around a year and a half ago, McTasney connected with representatives from Graham to see if they wanted to be involved in making the city a home base for the project. The site also appealed to McTasney because he has land close by.

"We have land in Caddo as well... just east of Breckenridge. I wanted to stay around home because we do have obligations to the ranch. ...Graham is just a great community, too," he said. "...Whenever you're doing something like this, you're really grabbing everything you can to stay motivated and keep doing it, and so you really want to be surrounded and supported by a community that believes in success, believes in new things. I think Graham did a really good job of displaying that and really got me roped in."

The company has a 4,000-square-foot shop located on Rocky Mound Road in Graham and has expanded to a three-man team internally.

The company has $400,000 in the sales pipeline for orders and will be delivering its first vehicle to Oklahoma State University next week. The team has been busy showcasing the rover, most recently at the Texas and Southwestern Association Convention at the end of March.

"The response has been incredible. We picked up three more customers there in one day. That's without having any inventory, which is a really neat thing," he said. "These guys know... it's going to be a while there. They got about a six-month lead time. So that in itself, getting people to sign a letter of intent saying that they're going to buy one as we produce, that's... a very validated customer and a very convicted customer. So they believe in us, they really like what we're doing. This is something they feel can be very useful and beneficial in their operation."

McTasney said the rover is tailored to the actual needs of cattle ranches, which helps address the labor shortage. While the company is focused on the Ranch Rover, its pasture land model for open-range cow/calf operations, it plans to address another need with a feedlot machine within the next 18 months.

"(There's) a huge demand in feedlots. That's a much bigger machine mechanically... so we'll focus on Ranch Rover, this pasture land model, to grow those sales numbers to continue to prove validation for investors," he said. "We'll move sometime in the next one-to-two years to building out a much larger machine built specifically for feedlots, which is going to be a real enterprise as this is new technology for them as well. And that's a huge labor burden, compared to the pasture land."

Read the rest here:

Ranching of tomorrow: Smooth Ag bringing robotics to ranchers with autonomous Ranch Rover - Graham Leader

Libyan students to travel to US to compete in World Robotics Championship – Libya Update

A group of students from the Libyan National Robotics Team will travel to the United States to compete in the World Robotics Championship to be held in Houston, Texas.

Chargé d'Affaires at the U.S. Embassy in Libya, Jeremy Berndt, said: "It was a real pleasure to meet this impressive group of students from the Libyan National Robotics Team who will be traveling to Houston, Texas to compete in the World Robotics Championship with support from the Libyan Academy for Telecom and Informatics (LPTIC)."

"We are rooting for them. I am certain these gifted young Libyans will play a vital role in shaping Libya's future," Berndt commented.

"The United States is proud of our partnership on science, technology, engineering, and math (STEM) education with youth in Libya and around the world," he added.

Original post:

Libyan students to travel to US to compete in World Robotics Championship - Libya Update

Southeastern New England teams do well in high school robotics competition – Turn to 10

See the rest here:

Southeastern New England teams do well in high school robotics competition - Turn to 10