Operation Wolf Returns: First Mission VR announced for PS VR2, Quest 2, and PICO 4 – Gematsu

Publisher Microids and developer Virtuallyz Gaming have announced rail shooter comeback Operation Wolf Returns: First Mission VR for PlayStation VR2, Quest 2, and PICO 4. It will launch on June 22.

As previously announced, the revival of the 1987 arcade game from TAITO is also coming to non-virtual reality platforms as Operation Wolf Returns: First Mission this fall. It will be available both physically and digitally for PlayStation 5, PlayStation 4, and Switch, and digitally for Xbox Series, Xbox One, and PC via Steam.

Here is an overview of the virtual reality version, via Microids:

About

Developed and published by TAITO in 1987, the arcade game Operation Wolf left its mark on a whole generation of gamers as one of the first games to offer side-scrolling rail shooting action. Operation Wolf Returns: First Mission VR will remain faithful to the spirit of the original game while premiering a new artistic direction.

In its solo virtual reality campaign, play as a special agent fighting a new criminal organization. In addition to arms and drug trafficking, the organization, led by the mysterious General Viper, has developed a powerful new weapon. After discovering several of its bases, you will be sent to dismantle the organization, destroy the weapon and free the hostages held captive in surrounding camps.



Winston-Salem State University adds virtual reality training to its … – WUNC

Winston-Salem State University is partnering with the University of North Carolina School of the Arts to include virtual reality training in its curriculum for nursing students.

The nearly $800,000 project is funded by the North Carolina Collaboratory, which was created by the General Assembly to conduct research across the UNC System to inform policymaking on the state and local levels.

Under the program, nursing students will start using virtual reality during their junior year to train in clinical environments like hospitals, simulating real-world experiences. Leslee Battle, the dean of WSSU's Health and Sciences School, said that through virtual reality the program's faculty can control what happens to make sure that the students get the most experience and more practice.

"We all know the more that you are repeating certain activities, the better it is retained," Battle said.

Battle added that the training begins with knowing the fundamentals, like hand washing, and progresses as the student goes through the nursing program.

"We're able to both use it as a learning tool for the students to go through as well as an assessment," Battle said. For example, faculty can add a reminder in a session for the student to wash their hands when they enter a patient's room, then add an assessment to see if the student remembers to wash their hands during another session.

According to Battle, Winston-Salem State is the first historically Black college or university in North Carolina to use virtual reality training in its nursing program.


Upward Bound Students Explore CMU’s New Virtual Reality … – CMUnow

On a recent tour of the cutting-edge St. Mary's Medical Education Center, a group of high school students from Grand Junction High School and Central High School were able to get hands-on experience in the virtual reality (VR) anatomical learning lab. These students are part of the inaugural TRIO Upward Bound (UB) program, which works to prepare local first-generation, income-eligible students for success in college.

Anatomy Lab Coordinator Michael Williams, PhD, led the tour and explained, "We house three graduate programs at the St. Mary's Medical Education Center. We offer an occupational therapy program, physician assistant studies and, in the fall, we will be starting a physical therapy program." After touring the new classrooms and study spaces, Williams brought the students to the main attraction of their visit, the VR anatomical learning lab.

The lab consists of eight VR headsets, and Williams explained to students that VR "gives us a very different perspective that you can't get on a model." Students were given instructions on how to load an anatomy model into the VR environment, toggle the muscle, bone and viscera layers off and on, and click on individual components to study them in more detail or to remove them from the model. "One thing that's neat about the VR is that you can walk around the model, and you can move inside the model to get a different viewpoint. It's hard sometimes to make sense of what something looks like just from a piece of paper in a textbook. This gives students a hands-on, three-dimensional experience," explained Williams.

One thing that's neat about the VR learning lab is that you can walk around the model, and you can move inside the model to get a different viewpoint. It's hard sometimes to make sense of what something looks like just from a piece of paper in a textbook. This gives students a hands-on, three-dimensional experience. - Anatomy Lab Coordinator Michael Williams, PhD

When it was time for the UB students to put on the VR headsets, they expressed wonder and amazement as they explored the virtual environment. The UB program is funded through a federal grant along with contributions from Colorado Mesa University and is targeted at helping ninth- and tenth-grade students with academic tutoring, program and course selection advising, college exam and application assistance, and financial literacy training. UB participants also get to attend college exploration trips to find a school that fits their unique needs. One of the biggest attractions of the program is the Summer Academy, when students spend six weeks on the CMU campus and experience on-campus living, go on a week-long college campus bus tour, engage in various mental health, wellness and social activities, and receive rigorous academic instruction to make sure they are ready to succeed at the college level.

When students are not attending UB activities and events, they have access to UB advisors who are assigned to their high school for additional support. Advisors also engage families to connect them with additional resources to support their students and to navigate the financial aid resources available to them.

Upward Bound Advisor and CMU alumnus Aaron Reed, mass communication, '22, currently works with students at Central High School and explained, "I wanted to be part of the Upward Bound program because it is so hands-on, and you can really connect with students in their schools. You get to talk to them day-to-day, help them get to the next level and get them excited about college. I didn't have a program like this when I was in high school, and I wanted to make a change and be the difference for first-generation students in our community."

I didn't have a program like this when I was in high school, and I wanted to make a change and be the difference for first-generation students in our community. - Upward Bound Advisor Aaron Reed

Reed continued, "So many first-generation students don't see students that look like them, come from where they come from or have their background go on to college and succeed. I think the misconception is that if I don't look like that, or if my parents didn't go to college, I can't do it. The biggest thing I want to instill in our D51 students is that just because you may be a first-generation student, you can still go to college and do great things."

To be considered for the 2023-2024 program, students must be going into either ninth or tenth grade next year, study at Central High School or Grand Junction High School, identify as a first-generation student (meaning neither parent/guardian has a four-year degree) and demonstrate financial need.

If you know a student who could benefit from this program, they can learn more and apply for the 2023-2024 program on the Upward Bound website.

Anatomy Lab Coordinator Michael Williams got things set up for everyone.

The students were all smiles when it was their turn.

And the students' curiosity led to some great discussions.


Virtual reality used to address victim court trauma – BBC

30 April 2023


A Belfast tech firm is helping victims prepare for court with the help of virtual reality software

Virtual reality is being used to help victims of crime prepare for giving evidence in court.

Belfast-based tech company Immersonal designed the software that is being rolled out across 52 Scottish courts in the next year.

It is also being piloted at the International Criminal Court in The Hague.

The technology allows victims to interact with key members of the judicial process in a virtual world.

Immersonal CEO Tom Houston said the primary aim was to make the court experience less traumatic by giving victims more information about what to expect.

He said: "Going to court can be intimidating but this technology allows you to walk through a three-dimensional world which recreates the actual court building where the case is to be held.

"You can interact in a virtual environment that includes the people and objects you will encounter during your case."


Through a headset, users are able to navigate their way through the virtual court and click on individuals such as the court clerk who will explain their role.

The team of virtual reality experts set up the company Immersonal in 2021.

Their main aim was developing affordable ways for organisations and individuals who are not tech-savvy to create their own virtual-reality experiences and training simulations.

The company has since secured a £500,000 contract with the Scottish Government to deliver the virtual-reality service in courts over the course of the next 12 months.

'Victims need support'

The virtual-reality court scheme is not currently operating in Northern Ireland.

A 360-degree view of courtrooms across Northern Ireland is available online through the Victim Support NI website.

Former Victim Support CEO Geraldine Hanna was appointed as Northern Ireland's first victims of crime commissioner last year.

Ms Hanna said that while 360-degree views of courts had offered a valuable insight into court buildings, she believes there is huge potential for virtual reality to help enhance and improve the experience of victims.


She added: "It's really important that victims get as much information and support as possible in the run up to their trial.

"The developments in using virtual reality in the criminal justice system feels like the next step in the journey to help improve a victim's experience."

A Northern Ireland Courts and Tribunals Service (NICTS) spokesperson said: "NICTS are committed to working closely with our partners to provide appropriate support to victims and witnesses when they are required to attend court at what is a challenging time in their lives."

In some circumstances the Department of Justice also offers initiatives which allow vulnerable victims and witnesses to provide their evidence away from the court environment, for example at remote evidence centres.

'A traumatic experience'

Charles Little was the first person on the scene when his parents-in-law were murdered in their home by a mental-health patient.

Michael and Marjorie Cawdery, both 83, died in a frenzied knife attack in Portadown, County Armagh, in May 2017.

Mr Little said: "When I arrived the perpetrator nearly ran me down as he escaped in the family car, so it was clearly a very traumatic experience.

"But while the trauma starts with the incident, dealing with the criminal justice system - the trauma can continue."


In June 2018 Thomas Scott McEntee, who suffers from paranoid schizophrenia, was sentenced to a minimum of 10 years in prison.

Mr Little said: "During that court case it was the first time I'd ever been in a courtroom and it was very overwhelming.

"It felt like everyone knew what they were doing except for my family and I."

Mr Little believes the potential of virtual reality to improve the experience of victims in the justice system should be explored further.

He said: "Certainly from what I've heard about this technology, where you can interact and have people explain their roles, I think this would be hugely helpful.

"It's about ensuring the victim's ability to act as a witness at the trial and we should embrace anything that can help with that."


2 Stocks to Invest in Virtual Reality – The Motley Fool

Virtual reality (VR) has been one of those technologies that always seems close to going mainstream but has never quite reached the tipping point. With Apple (AAPL 4.69%) likely on the verge of releasing its VR headset in the next few months, however, this once-niche segment may finally have its day in the sun.

Investors looking for a couple of stocks to bet on in this future VR market, estimated at $250 billion, should take a closer look at what Apple and key chipmaker Nvidia (NVDA 4.06%) are doing right now.


For years, rumors have swirled about Apple working on a virtual reality headset, but to date the device hasn't materialized. This time, it really looks like Apple is on the verge of releasing an AR/VR mixed-reality headset.

Bloomberg has published multiple reports about the device over the past several months and has said that Apple has even shown it off to its board of directors. More recently, the publication said that the device will debut at Apple's Worldwide Developers Conference (WWDC) in June and that it will be able to run hundreds of thousands of iPad apps.

The device, which is dubbed the Reality Pro or Reality One, will reportedly cost $3,000. Apple may focus its capabilities on gaming and fitness, along with use as an e-reader or for watching live sports.

Apple entering an entirely new device category would be a huge step for the company and, of course, there's no guarantee of success. Noted Apple analyst Ming-Chi Kuo thinks the initial rollout of the devices could be slow, with an estimated maximum of 300,000 units sold in the first year.

But the tech titan also has a long track record of slowly moving into new markets, then dominating them. And if it repeats that success in VR, then Apple could soon become the top VR company that rivals are trying to emulate.

Nvidia's VR opportunity comes from its high-powered graphics processing units (GPUs) that are often the go-to choice for many gamers. The company's gaming segment is one of its largest, accounting for 30% of sales in the most recent quarter.

Nvidia's gaming sales have admittedly slowed lately, but it understands that VR could be a catalyst for new growth from its gaming business, and has made moves to tap into it. That's why the company has developed tools for developers to create and launch virtual worlds through Nvidia's Omniverse Cloud. The company has also worked directly with large companies, including Volvo, to show how VR tools can be used to develop products.

Just as companies look to Nvidia to help them power their servers for cloud computing and artificial intelligence services, the growing need for high-end graphics for VR could boost demand for Nvidia's chips. Some estimates put the market size for AR/VR chips at $9.5 billion by 2027, and with Nvidia's early moves, the company is likely to be a key beneficiary in this space.

It's still unclear how quickly consumers will adopt new VR technologies. I think Apple launching a mixed-reality device could help drive more widespread adoption, but it'll take some time before we see whether that theory pans out. Investors looking to snatch up shares of these two companies based on their virtual reality prospects will probably have to remain patient as this market starts to grow.


Virtual reality improving seniors’ reality in Swampscott – Itemlive – Daily Item

Katrina Regan, reflections and engagement specialist for LCB Senior Living, administers a REACT Neuro virtual reality cognition test to The Residence at Vinnin Square resident Mickey Mizner. (Spenser Hasak)

SWAMPSCOTT - Staff at the Residence at Vinnin Square, an assisted-living facility for seniors on Salem Street, are using virtual reality headsets to detect early signs of memory loss and other declines in cognitive function.

In 2020, the Residence's parent company, LCB Senior Living, became the first assisted-living company in the country to partner with REACT Neuro, a virtual reality technology company. REACT Neuro was created by a team led by Dr. Rudy Tanzi, the team neuroscientist for the New England Patriots.

Tanzi, who is credited with discovering the Alzheimer's gene and currently serves as the director of the genetics and aging research unit at Massachusetts General Hospital, developed REACT Neuro to create a reliable way to track and measure cognitive health.

Roughly once every two weeks, residents at the facility use REACT Neuro virtual reality headsets and remote controls to take puzzle-like tests, which assess aspects of their cognitive health like short- and long-term memory, reflexes, hand-eye coordination, and motor function.

Katrina Regan works as the reflections and engagement specialist for the Residence's memory care department. She said that while doctors can regularly track patients' blood pressure to test for early signs of heart disease, conditions such as Alzheimer's or dementia cannot always be detected as early.

"This is really a checkup from the neck up," Regan said. "You don't get the opportunity to check on your brain unless there's a problem, so this gives our residents a chance to look at their brain trends over time."

Using results from the REACT Neuro tests, staff can spot dips in their residents' cognitive ability and work with them to assess potential causes of brain health fluctuation. Regan said staff at The Residence use Tanzi's SHIELD program to ensure residents sleep seven to eight hours a night, manage their stress in a healthy way, interact with others, exercise, and continue to learn new things.

"If our resident takes a test today and they score great, and then they take that same test in three or four weeks and they don't score so great, we look at those results and say, 'Well, what are the factors that might come into play here? Are you not sleeping well lately? Have you been extra stressed lately? Are you feeling depressed? Are you not as socially active?'" Regan said. "It's really lifestyle changes that will help our residents improve their scores."

Although the Residence at Vinnin Square has only used the program for a couple of months, Regan said many of her residents use the tests themselves as mental workouts and actively try to improve their scores.

Resident Engagement Director Alicia Malley organizes regular social and educational programs at the facility to engage seniors. She said that since she began administering the REACT Neuro tests, she's noticed more residents attending her Tai Chi and meditation classes.

"I do see them coming to more stuff each time," Malley said. "My Tai Chi is growing because I do meditation at the end and it helps with the stress. They love it. It's bringing more engagement."

Anne Khatchadurian moved into the facility two months ago after she fell and fractured her kneecap. On Wednesday, she put on the VR glasses to take a long-term memory test.

"I think it's wonderful," Khatchadurian said. "I'm having difficulty with my memory and other things, and it's helping me."

Mickey Mizner, who has lived at the Residence for more than four years, said he enjoys the tests because he takes comfort in knowing he can improve his cognitive abilities.

"They have some great activities. Those activities are a real plus. The more you do, the more you can do," Mizner said. "Anything you can do to improve your mind is a plus."


Researchers at UW analyze cybersickness while playing virtual … – CTV News Kitchener

Published May 5, 2023 6:17 p.m. ET

Updated May 5, 2023 7:20 p.m. ET


Just like the motion sickness someone might feel while in a plane, car or boat, cybersickness can occur while people are playing virtual reality (VR) games.

When someone is plugged into a VR game, it can be hard for certain gamers to process being in two places at once, both physically and virtually.

"Conflicts between information from the real world and the virtual world can sometimes make people feel sick," said Michael Barnett-Cowan, an associate professor in the department of kinesiology and health sciences at the University of Waterloo (UW).

Researchers at UW completed a new study to explore why some people feel cybersickness more than others. The researchers believe it's all about something called subjective visual vertical: a measure of how individuals perceive the orientation of vertical lines.

"And the degree to which that line moves closer to your body than to gravity gives a metric about how you're using multi-sensory cues for upright perception," said Barnett-Cowan.

The researchers collected data from 31 participants and assessed their perceptions of the vertical before and after playing two VR games: one high-intensity, the other low-intensity.

They found that there are often two types of people.

"Some people would rely more on their body as a frame of reference, and others would rely on gravity as a frame of reference," Barnett-Cowan said.

Those who were able to offset their perception of the vertical more were less likely to get sick, especially in a high-intensity VR setting.

"Their brains adapted, and that adaptation carried over into their subjective visual vertical settings. Those that were kind of stubborn and set the line just like they did initially were the same individuals who reported higher levels of sickness," he said.

Staff at Ctrl V, a VR arcade in Waterloo, said they are aware of cybersickness and what can often trigger it.

"A piece of content that requires you to move forward, and you aren't actually moving. Maybe you're just sort of floating through space. That's an instance that might induce sickness," said Robert Bruski, CEO and co-founder of Ctrl V.

Staff said they work with customers to customize their experience if they're prone to experiencing motion sickness.

"If you don't feel well, change the content. If you're too close to something, move away from it. You're the master of your domain, and you're going to know what's feeling right," Bruski said.

Experts suggest that keeping VR users educated and aware of their surroundings, whether real or virtual, is the most important step toward a positive user experience.



Virtual Reality (VR) Motion Capture System Market Next Big Thing | Major Giants Google, Sony, Apple – openPR


Access Sample Report + All Related Graphs & Charts @: https://www.advancemarketanalytics.com/sample-report/196902-global-virtual-reality-vr-motion-capture-system-market#utm_source=OpenPR/Pranita

Major & Emerging Players in Virtual Reality (VR) Motion Capture System Market:-Google (United States), Sony (Japan), Meta (United States), Apple (United States), Oculus (United States), NVidia (United States), HP (United States), HTC (Taiwan), Microsoft (United States) and Motion Reality (United States)

The Virtual Reality (VR) Motion Capture System Market Study by AMA Research gives an essential tool and source to Industry stakeholders to figure out the market and other fundamental technicalities, covering growth, opportunities, competitive scenarios, and key trends in the Virtual Reality (VR) Motion Capture System market.

A Virtual Reality (VR) Motion Capture System gives a simulated experience that can be similar to or completely different from the real world. Virtual reality is a term used for computer-generated 3D environments that allow people to enter and interact with alternate realities. A VR motion capture system allows a user to interact with a computer-simulated environment, whether that environment is a simulation of the real world or an imaginary world. It is a medium for experiencing, feeling and touching the past, present and future, and for creating a customized reality. VR is the popular name for an absorbing, interactive, computer-mediated experience in which a person perceives a synthetic (simulated) environment by means of special human-computer interface equipment and interacts with simulated objects in that environment as if they were real. Several people can see one another and interact in a shared synthetic environment, such as a battlefield.

The titled segments and sub-sections of the market are illuminated below: by Type (Non-Immersive, Semi-Immersive, Fully Immersive), Application (Video Gaming, Business, Education, Cinema, Architecture, Other), Distribution Channel (Online, Offline), Tools (Unity 3D, Unreal Engine (UE4), 3DS Max & Maya, Blender, A-Frame, Other), Connectivity (PC, Smartphone), Players and Region - Global Market Outlook to 2027

Market Trends: Rapid Use of Artificial Intelligence in Virtual Reality, Innovation of Fifth Generation (5G) Technology and Innovation of New IoT-Based Devices

Opportunities: Rapid Growth in Information and Technology, Rapid Adoption of Artificial Intelligence Across the World, Continuous Growth of Digitalization and Technological Advancement, and a Growing Business and Corporate Sector Operating Virtually

Market Drivers: Rising Demand in Online Streaming Video Platforms, Rising Demand in Video Gaming Platforms, Surging Demand in Space and Military Training, and Rising Demand in Healthcare Practices and Training

Challenges: High Maintenance and Installation Costs Associated with Virtual Reality (VR) Motion Capture Systems, Low Penetration in Emerging Regions, and Lack of Knowledge About Virtual Reality (VR) Motion Capture Systems Among People

Enquire for customization in Report @: https://www.advancemarketanalytics.com/enquiry-before-buy/196902-global-virtual-reality-vr-motion-capture-system-market#utm_source=OpenPR/Pranita

Some Points from the Table of Contents:
Chapter One: Report Overview
Chapter Two: Global Market Growth Trends
Chapter Three: Value Chain of Virtual Reality (VR) Motion Capture System Market
Chapter Four: Players Profiles
Chapter Five: Global Virtual Reality (VR) Motion Capture System Market Analysis by Regions
Chapter Six: North America Virtual Reality (VR) Motion Capture System Market Analysis by Countries
Chapter Seven: Europe Virtual Reality (VR) Motion Capture System Market Analysis by Countries
Chapter Eight: Asia-Pacific Virtual Reality (VR) Motion Capture System Market Analysis by Countries
Chapter Nine: Middle East and Africa Virtual Reality (VR) Motion Capture System Market Analysis by Countries
Chapter Ten: South America Virtual Reality (VR) Motion Capture System Market Analysis by Countries
Chapter Eleven: Global Virtual Reality (VR) Motion Capture System Market Segment by Types
Chapter Twelve: Global Virtual Reality (VR) Motion Capture System Market Segment by Applications

What are the market factors that are explained in the Virtual Reality (VR) Motion Capture System Market report?
- Key Strategic Developments: Strategic developments of the market, comprising R&D, new product launches, M&A, agreements, collaborations, partnerships, joint ventures, and regional growth of the leading competitors.
- Key Market Features: Including revenue, price, capacity, capacity utilization rate, gross, production, production rate, consumption, import/export, supply/demand, cost, market share, CAGR, and gross margin.
- Analytical Tools: Analytical tools such as Porter's five forces analysis, SWOT analysis, feasibility study, and investment return analysis have been used to analyze the growth of the key players operating in the market.

Buy This Exclusive Research Here: https://www.advancemarketanalytics.com/buy-now?format=1&report=196902#utm_source=OpenPR/Pranita

This report gives a clear perspective on every facet of the market without the need to refer to any other research report or data source, covering the past, present, and future of the market concerned.

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Europe, or Asia.

Contact Us:
Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road, Edison, NJ 08837, USA
Phone: +1 (201) 793-7323, +1 (201) 793-7193
sales@advancemarketanalytics.com

Advance Market Analytics is a global leader in the market research industry, providing quantified B2B research to Fortune 500 companies on high-growth emerging opportunities that will impact more than 80% of worldwide companies' revenues. Our analysts track high-growth studies with detailed statistical and in-depth analysis of market trends and dynamics that provide a complete overview of the industry. We follow an extensive research methodology coupled with critical insights into related industry factors and market forces to generate the best value for our clients. We provide reliable primary and secondary data sources, and our analysts and consultants derive informative and usable data suited to our clients' business needs. The research studies enable clients to meet varied market objectives, from global footprint expansion to supply chain optimization and from competitor profiling to M&As.

This release was published on openPR.


Virtual reality shows how fracking affects Argentinian river basin – Stockholm Environment Institute

A recent research conference at Argentina's National University of Comahue (UNCo) featured a new virtual reality (VR) experience that allows users to explore how local unconventional hydrocarbon production, known as fracking, affects the water supply in the Comahue river basin.

SEI scientists Laura Forni, Romina Díaz Gómez and Marina Mautner are working together with UNCo on the research, which was featured in a news article about the virtual reality project in Río Negro.

Argentina's Vaca Muerta region, located in the Patagonian south, is home to the world's second-largest shale gas reserves and fourth-largest shale oil deposits, and gas production there is outpacing the growth of infrastructure to accommodate it.

SEI and UNCo are investigating how this gas production might pose risks to the water supply, agricultural production, and the population that depends on them. The researchers use remote sensing, or scanning performed by satellites and high-flying aircraft, to map the fracking wells' proximity to rivers, farms, neighborhoods and cities. That data informed the interactive VR exhibit, accessible by VR goggles, at UNCo's conference.

"For the first time we are going to have all the information centralized and we are going to be able to visualize all the components at the level of the Comahue Basin," Díaz Gómez told Río Negro.

As climate change progresses, the region's water supply is expected to decline, while fracking increases the wastewater generated. The team's research on this topic indicates that shale operations increase pressure on the water supply and pose a risk to water quality.

The researchers hope the VR project will help educate the public about fracking's impact on local populations and ecosystems, as well as promote the use of remote sensing to produce such data.

Go here to see the original:

Virtual reality shows how fracking affects Argentinian river basin - Stockholm Environment Institute

Digital Twin and Metaverse: The Future of Virtual Reality – NASSCOM Community

The concept of virtual reality (VR) has come a long way since its inception. With the advancements in technology, it has become possible to create virtual environments that are almost indistinguishable from the real world. Two recent concepts that are gaining traction in the world of VR are digital twin and metaverse. In this article, we will explore what these terms mean, how they are related to each other, and what the future of VR might look like with their implementation.

Table of Contents

Introduction

What is Virtual Reality?

Digital Twin: The Concept and Its Applications

The Rise of Metaverse

The Relation between Digital Twin and Metaverse

How Digital Twin and Metaverse will Change the Future of VR

Advantages of Digital Twin and Metaverse

Challenges and Risks Associated with Digital Twin and Metaverse

The Future of Virtual Reality

Conclusion

Introduction

Virtual reality has come a long way since the first crude head-mounted display was developed in the 1960s. With the advancements in technology, it is now possible to create fully immersive virtual environments that are almost indistinguishable from the real world. Two recent concepts that are gaining popularity in the world of VR are digital twin and metaverse. In this article, we will explore what these concepts are, how they are related to each other, and how they will shape the future of VR.

What is Virtual Reality?

Virtual reality is a computer-generated simulation of a three-dimensional environment that can be interacted with in a seemingly real way by a person using special electronic equipment, such as a head-mounted display, gloves, or a bodysuit. VR can be used for various purposes, such as entertainment, education, and training.

Digital Twin: The Concept and Its Applications

A digital twin is a virtual representation of a physical object, process, or system. It is created by collecting data from sensors, cameras, and other sources in real-time and using it to create a 3D model of the object, process, or system. Digital twins can be used in various industries, such as manufacturing, healthcare, and transportation. For example, in manufacturing, a digital twin can be used to simulate the production process and optimise it for efficiency.
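As a toy illustration of that idea (not any vendor's actual product), a digital twin can be modeled as an object that mirrors live sensor readings and answers "what if" questions without touching the physical system. The class, sensor names, and numbers below are all invented:

```python
class MachineTwin:
    """Minimal digital twin: mirrors sensor data from a physical machine."""

    def __init__(self, name):
        self.name = name
        self.state = {}  # latest mirrored sensor readings

    def ingest(self, sensor, value):
        """Update the twin with one real-time sensor reading."""
        self.state[sensor] = value

    def simulate_throughput(self, speed_factor):
        """Estimate throughput if line speed were scaled, without
        risking the real production line."""
        return self.state.get("units_per_hour", 0) * speed_factor

# Readings would normally stream in from sensors and cameras.
press = MachineTwin("stamping-press-07")
press.ingest("temperature_c", 74.2)
press.ingest("units_per_hour", 120)
print(press.simulate_throughput(2))  # prints 240
```

Production digital twins add physics models, historical data, and 3D geometry on top of this mirrored state, but the loop is the same: ingest real-world data, then query or simulate against the virtual copy.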

The Rise of Metaverse

"Metaverse" is a term first coined by science fiction author Neal Stephenson in his 1992 novel Snow Crash. It refers to a virtual universe created by the convergence of multiple virtual worlds. The concept of the metaverse has gained popularity in recent years, especially after the success of virtual worlds such as Second Life and Minecraft. Companies such as Facebook and Epic Games are also working on creating their own versions of the metaverse.

The Relation between Digital Twin and Metaverse

Digital twin and metaverse are related concepts in the sense that both involve the creation of virtual environments. However, while digital twin is focused on creating a virtual representation of a physical object or system, metaverse is focused on creating a virtual universe that is inhabited by virtual beings.

How Digital Twin and Metaverse will Change the Future of VR

The implementation of digital twin and metaverse will bring about significant changes in the future of VR. Digital twins will enable us to create virtual replicas of physical objects and systems, which can be used for various purposes, such as training and maintenance. Metaverse, on the other hand, will create a virtual universe that is inhabited by virtual beings, opening up new possibilities for entertainment, education, and social interaction.

Advantages of Digital Twin and Metaverse

The implementation of digital twin and metaverse will bring several advantages to the field of VR. Digital twin will allow us to test and optimise physical systems in a virtual environment before implementing them in the real world, reducing the cost and risk of errors. It can also help in remote monitoring and maintenance of physical systems, leading to increased efficiency and reduced downtime.

Metaverse will create new opportunities for entertainment, social interaction, and education. It will allow people from different parts of the world to come together in a virtual environment and experience things that are not possible in the real world. It can also be used for educational purposes, such as simulating historical events or scientific experiments.

The Future of Virtual Reality

The implementation of digital twin and metaverse is just the beginning of the future of VR. With the advancements in technology, it is possible that we may one day be able to fully immerse ourselves in a virtual environment that is indistinguishable from the real world. This could revolutionise the way we work, play, and interact with each other.

Conclusion

Digital twin and metaverse are two concepts that are gaining popularity in the world of virtual reality. Digital twin involves creating a virtual representation of a physical object or system, while metaverse involves creating a virtual universe that is inhabited by virtual beings. The implementation of these concepts will bring several advantages to the field of VR, such as increased efficiency, new opportunities for entertainment and education, and the ability to test and optimise physical systems in a virtual environment.

Here is the original post:

Digital Twin and Metaverse: The Future of Virtual Reality - NASSCOM Community

Trends and Opportunities in Augmented Reality – Manufacturing.net

Economic uncertainties are pushing manufacturers to embrace emerging technologies as a means of staying competitive in a constantly evolving market. One example is augmented reality (AR). Deloitte's 2023 manufacturing industry outlook reports that 12 percent of surveyed manufacturers plan to focus on AR in the coming year to improve operational efficiencies.

AR technologies enhance the physical world by superimposing digital information onto real-world objects and environments. Often AR displays are paired with gesture recognition to enable interaction with the AR environment. This creates an immersive experience that enhances the perception of and engagement with the real world.

AR can equip facility managers with real-time feeds indicating operating status and delivering production statistics. AR can assist industrial engineers by providing real-time notifications and alerts of potential maintenance issues, allowing for prompt and proactive resolution before they escalate into larger problems. Maintenance personnel can use AR to receive real-time guidance and step-by-step workflows during upkeep and repair operations, improving accuracy and minimizing downtime. Likewise, operators can use AR for training and ongoing guidance while operating equipment and machinery, reducing the risk of error and enhancing efficiency.

Over the last decade, patent applications that mention AR in either their title or abstract have grown from a few hundred applications per year to over a thousand. By examining patent filings, we can gain insight into how industrial manufacturers are utilizing AR technologies in their operations and achieving potential competitive advantages.

Obtaining a patent, however, can involve a significant investment and requires full disclosure of the invention. There is also no guarantee of securing meaningful patent protection, as the final outcome may be either no or limited protection. It is crucial to weigh the costs and benefits of revealing the details of an invention against the potential commercial value of any patent that might result, and the feasibility of enforcing patent rights against an infringer.

AR technologies are being used to enhance operations across entire manufacturing facilities. For instance, U.S. Patent No. 11,265,513, titled "Virtual Reality and Augmented Reality for Industrial Automation," provides examples of AR systems that generate and deliver AR presentations to a user via a wearable device. These presentations include 3D holographic views of a plant facility or a location within a plant facility, which can be rendered based on the user's current location or orientation on a facility floor.

The AR system can provide automation system data, notification, and proactive guidance by modifying the user's view of the immediate surroundings. Additionally, the AR system can superimpose indicators near their corresponding machines, which can relate to critical operating parameters such as temperatures, pressures, speeds, and voltage levels, as well as statistics or key performance indicators such as overall equipment effectiveness, performance or production efficiency, percentage of machine availability over time, product quality statistics, cycle times, overall downtime or runtime durations, and more.

AR technologies are also being used to enhance maintenance operations. U.S. Patent No. 11,270,473, titled "Mechanical Fastening Unit Management Method Using Augmented Reality," uses AR to aid operators in the proper tightening of fasteners. The AR-based system superimposes a virtual space on a real space to create an augmented reality space. Within this AR space, virtual counterparts of a real-world fastening device, such as a torque wrench, and the real-world fasteners are presented. The AR system uses various visual cues such as colors, flashes, text, and graphics to indicate a proper tightening sequence, whether each fastener has received an appropriate amount of torque, and which fasteners still require tightening.

AR technologies have emerged as powerful tools for streamlining the handling of end products generated by this equipment. U.S. Patent No. 11,358,180, titled "Workpiece Collecting Point Units and Methods for Supporting the Processing of Workpieces," describes using AR to improve the time-consuming and error-prone process of collecting, sorting, and arranging workpieces. With this system, an AR-equipped operator receives information indicating the appropriate sorting bins for the workpieces during a sorting process. During a collection process, the AR system can augment the display of a collection table to indicate the location of workpiece stacks of the same type and alert the operator when a stack of workpieces is complete.

U.S. Patent Application Publication No. 2022/0221845, titled "System and Method for Augmented Reality (AR) Assisted Manufacture of Composite Structures and Bonded Assemblies," describes an AR-assisted system specifically designed for assembling the various layers of such structures and assemblies. These structures, which can include aircraft and vehicle panels, often possess complex geometries that necessitate precise placement during assembly. The AR system provides virtual representations of each layer of the structure and provides visual indicators that ensure accurate placement, position, and orientation of new layers relative to previously assembled layers.

Patents are not the only way to protect competitive advantages. Given the unique nature of manufacturing, innovators should carefully consider whether patents or alternative forms of intellectual property, such as trade secrets, are the most appropriate means of protecting their innovations. AR technologies often are employed in private within the confines of a manufacturing floor. The lack of visibility into whether a competitor is violating a patented invention can limit the effectiveness of the patent. The ability to detect infringement, therefore, is a key factor to consider when deciding whether to seek patent protection.

Potential commercial value is another important factor to consider. The examples above demonstrate that AR technologies may be tailored to suit the specific needs of a given facility, process, or end product. Understanding whether other manufacturers have similar needs can provide valuable insight into the potential market for the AR invention and help to assess the likelihood of competitors copying or licensing the invention. The expected commercial value of a patent, however, should be balanced against the potential costs to enforce it.

Defending ones own innovations, however, is not the only motivation for applying for a patent. Other motivations include potential revenue streams via licensing, access to other technologies through cross-licensing opportunities or joint ventures, obtaining leverage in litigation, attracting investment through increased valuation, and providing access to funding by using patent assets as collateral. Ultimately, a combination of factors and motivations typically guide the decision to seek patent protection.

Manufacturers contemplating patenting their AR innovations should consult with an intellectual property attorney who can provide guidance on available options to safeguard those innovations and assist in crafting a strategy that aligns with specific business objectives.

Brian Emfinger is an IP attorney in Banner Witcoff's Chicago office. His email is bemfinger@bannerwitcoff.com.

Read the original here:

Trends and Opportunities in Augmented Reality - Manufacturing.net

How augmented reality is being used to train the next generation of … – TheFabricator.com

In this augmented reality gas metal arc welding setup, a student lays down a virtual bead.

"It's a black art." I used to hear that a lot on fab shop tours: code for something that took years to learn and only the talented few truly mastered. Why, exactly? Sometimes it had to do with the nature of the skill and the worker's tactile and visual experience welding a workpiece. If something went awry, they'd try again. And again. And again. For years.

Thing is, considering the acute worker shortage, the industry just doesn't have that kind of time. It needs some way to shorten that training cycle while not skimping on process fundamentals, so that students know what works in what situation and why. They don't just show up to the shop, learn a narrow set of skills (push this button, weld this joint), and get to work. They follow procedures, but they also know why those procedures work so well.

Here, augmented reality (AR) might fill a need, especially for one of the most hands-on processes on the fab shop floor: welding.

"We've been in the augmented reality space for the past eight years, and it continues to get better and better. We're trying to get as close to reality as possible, including visual, audio, and tactile elements. Our ability to create an accurate weld puddle in software has come a long way in just the past few years."

That was Steve Hidden, national account manager, welding education and workforce development, at Miller Electric Mfg. LLC in Appleton, Wis. The company offers its AugmentedArc augmented reality welding system, a technology that merges the visual, aural, and tactile welding experience (the gun, the workpiece, the buzzing, the visual cues) with software that simulates how a bead flows given how the weld is performed.

Students can wield a gas metal arc welding (GMAW) gun, a stinger for shielded metal arc welding (SMAW), or a gas tungsten arc welding (GTAW) torch. The experience isn't a video game. Using a combination of sensors that read the position of strategically placed QR codes on the weld gun or torch and workpiece, the AR approach to weld training tracks students' movements throughout the process.

Imagine the first time a student grasps the welding gun and lays a bead on plate. He strikes the arc, hesitates, then moves too fast. He tries again and burns through. Spatter goes everywhere, and coupon after coupon after coupon goes into the recycling bin. The student continues to practice, using up shielding gas and welding wire, and putting the gun nozzle and other consumable components through all sorts of abuse. It's not a pretty sight, and, as anyone working at a technical school or fab shop with in-house training knows, it can get quite expensive.

Now imagine that same student donning a welding helmet, only this time he's manipulating an odd-looking GTAW torch and filler rod, each with QR codes. Instead, sensors in the student's helmet read those codes to determine how exactly the student manipulates the tungsten tip along a lap joint, creating a virtual fillet on a workpiece that, again, is covered with strategically placed codes, all of which become invisible when the student dons the welding helmet. With the helmet on, he sees a metallic workpiece ready to be joined. He depresses the foot pedal throughout the process, but there are no arcs, no spatter, no metal at all.

A student next to him wields another odd-looking device, this time a stinger for the SMAW process with a code-covered cube near the end (again, all invisible through the weld helmet). She manipulates the stinger carefully down a vertical groove joint. Next to her sits another student, this one manipulating the nozzle of a GMAW gun around a pipe to create a flange weld.

All three are welding in AR. They see the arc, filler metal, and weld being laid down, and they even hear the weld buzz, just as a welder would in a real-world application. When the student practicing GTAW dips too much of the filler rod at once, the weld pool reacts. When the student practicing GMAW travels too quickly, again, the weld pool reacts. After completing the practice joint, all three students keep their welding helmets on to view their completed, virtual weld, defects and all.

Through the welding helmet, the student can see a virtual representation of the welding arc, plus certain markers showing ideal work position, travel, and standoff for a particular lesson.

The software points out exactly where and how those errors occurred. The arc length was too long here. The filler rod angle was off here. Your work and travel angles were off. Not only this, the software shows what those angles should have been and exactly how to correct it.

On the next try, the students' teacher turns on visual aids that they see through the welding helmet, showing what elements should be where. "I call them training wheels," Hidden said. "Are they too close? Are they too far away?"

The visual aids are dynamic, changing as needed to direct the student. For instance, a visual cue might show the student where the welding gun should be at a certain point during the weld program; as the student catches up, the cue changes.

"We also give the instructor the ability to let students make their own mistakes," Hidden said, adding that the software needn't tell them that, say, their gas setting is wrong, that the voltage is set too high, or anything else. The software might not simulate exactly what happens when something goes awry, like an actual burn-through of the material. But it can graphically show something's amiss and leave students to figure out what the problem is on their own. Alternatively, teachers can set the system up to notify exactly what's wrong.

"Teachers can create their own assignments," Hidden said, adding that they can establish what training wheel symbols to display and when. The key is knowing when to stop giving assistance and let students fly solo. It all depends on students' needs and where they are on their training journey.

"[AR] serves as an interim phase between theory and hands-on applications," said Patricia Carr, national manager of education and workforce development at Miller, adding that the technology has helped students uncomfortable with the arcs and sparks, including those with disabilities, gain confidence before practicing the real thing, effectively broadening the recruiting net.

The AR system has been designed around the needs of educators. Over the years, experienced welders and welding engineers within and outside Miller have collaborated with software engineers to make the experience ever more realistic. Students now can see all the puddle dynamics, with molten metal wetting against the joint sidewalls. They see weld pool disturbances that could indicate undercut or porosity, incomplete penetration, as well as over-welding practices that could lead to excessive and costly grinding.

All this raises the question: could AR be used to certify welders? Hidden chuckled a bit. "Not today," he said. "This tool is all about preparation." AR could allow welders to refine their technique and gain muscle memory before practicing in the lab and taking the certification test.

Gaining that muscle memory can be an extraordinary challenge, especially when improper technique can lead to a messy situation, like stick welding overhead, holding an exceedingly long arc, and being caught in a rainstorm of hot sparks.

Using AR, a student can place the workpiece wherever needed to start practicing those challenging weld positions. "We have seven coupons for all positions," Hidden said, "including flat and overhead. And the beauty with AR, I can take my part [coupon], put double sticky tape on it, and put it underneath the table. Students can crawl underneath the table and weld it."

Looking through the welding helmet, a student can inspect his weld and receive specific feedback.

Repetition builds muscle memory and confidence, preparing students for the real world of overhead welding. Once they strike an arc for real, they're more likely to maintain the right welding technique and produce a clean bead without a plethora of sparks raining down.

Students and professionals using the AR system can practice any technique they wish, but as Hidden explained, software does need to be built around a specific technique to score it; that is, building the ability to track the position of the weld and consumables, compare it to an ideal, score it based on that comparison, and pinpoint areas for improvement. For instance, students practicing GTAW might want to walk the cup over a certain joint geometry. They can run through the motions to gain the muscle memory, but the system won't be able to give comprehensive feedback, at least not yet.
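The scoring step Hidden describes (track parameters over the weld, compare each sample to an ideal, and pinpoint where it drifted) can be sketched as follows. The parameter names and target windows are hypothetical, not Miller's actual scoring model:

```python
# Hypothetical target windows for each tracked weld parameter.
IDEAL = {
    "work_angle_deg": (40.0, 50.0),
    "travel_speed_ipm": (10.0, 14.0),
    "arc_length_in": (0.10, 0.15),
}

def score_weld(samples):
    """samples: list of {parameter: value} dicts, one per tracked instant.
    Returns the percentage of in-range samples for each parameter."""
    scores = {}
    for param, (lo, hi) in IDEAL.items():
        values = [s[param] for s in samples]
        in_range = sum(1 for v in values if lo <= v <= hi)
        scores[param] = round(100 * in_range / len(values))
    return scores

samples = [
    {"work_angle_deg": 45.0, "travel_speed_ipm": 12.0, "arc_length_in": 0.12},
    {"work_angle_deg": 46.0, "travel_speed_ipm": 18.0, "arc_length_in": 0.12},
]
print(score_weld(samples))
# Work angle and arc length stay in range; travel speed is right only half the time.
```

A real system would score against time-varying targets along the joint and weight defects differently, but the core comparison of tracked motion against an ideal is the same.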

Though, of course, new software is being written and improved upon all the time. As Carr explained, Miller has been following the voice-of-the-customer methodology, developing software requested by the majority of current and potential users of the technology.

Even in its current state, AR helps demystify a misunderstood, opaque process by identifying exactly what makes a good weld and what doesn't. Skilled people take many paths to achieve a quality weld, so not following exactly what the AR system prescribes doesn't guarantee failure. Still, in the future, those learning and perfecting their skills (and perhaps even experienced welding professionals) might look more and more to AR as a kind of compass, something to reference to make sure their fundamentals are there and that they're headed in the right direction. And there's an added bonus: they needn't waste consumables and test coupons in the process.

See the article here:

How augmented reality is being used to train the next generation of ... - TheFabricator.com

CoPilot to Harness AI Speed, Scale, Talespin CEO Says – XR Today

Virtual reality (VR) training continues to lead as one of the top verticals for the extended reality (XR) umbrella of technologies. Along with augmented and mixed reality (AR/MR), VR's fully immersive training capabilities remain a vital tool in boosting learner engagement. Talespin, a major supplier of XR training solutions, recently debuted a web-based version of its CoPilot Designer platform to increase adoption rates for enterprises.

Numerous studies and end users have documented a significant increase in information and employee retention rates due to XRs appealing, interactive instructional designs. Additional emerging technologies like artificial intelligence (AI) are also expediting the democratisation of XR for enterprises tapping the tools needed to empower workforces.

XR Today interviewed Kyle Jackson, CEO and co-founder of Talespin, to discuss the latest updates to the company's CoPilot Designer solution. He discussed how his company's latest update provides employers with impressive platforms to upskill workers on the fly and with promising results.

Kyle Jackson: We are excited to announce the launch of a web-based version of our no-code, AI-enabled XR content creation tool, CoPilot Designer, which is now available for our customers and partners.

This update creates a version of our design tool that is easier to use and adopt than ever. In the near term, this will help our current customers save time and money, allow us to welcome more customers to our platform, and ultimately lead to more immersive learning content production across our ecosystem, empowering companies to scale content across teams, offices, and geographies faster and more efficiently than ever.

However, we think it's important to note the broader context. If we take a step back, there's an even bigger picture that CoPilot Designer plays its part in as our industry adapts to generative AI content creation tools and prepares for a new wave of XR headsets. We know this combination will usher in a new paradigm of immersive content creation and distribution.

We see our platform, and specifically CoPilot Designer, as a critical layer in this equation. CoPilot Designer harnesses AI's speed and scale and applies both to creating and publishing the next generation of highly engaging XR learning content.

Kyle Jackson: When we first released CoPilot Designer to the market in 2021, it was a key piece of our vision to help people become better humans. We've always believed that AI-powered virtual humans could help real humans get better at skills such as critical thinking, empathy, and navigating difficult workplace conversations.

Ironically, we realized that as AI emerged, our human skills (soft skills) would be more in demand than ever. Since then, we've spent more than four years using our platform to help dozens of enterprises train their workforce in human skills in VR.

VR is a perfect channel for workers to practice difficult conversations in a safe, non-judgmental environment. It allows learners to practice delicate scenarios requiring mindfulness and tact and reinforces key interactions no college or business school trains for.

Going forward, we believe that as AI continues to automate many tasks, a spotlight will be further shined on workers' human intelligence, soft skills, and people skills, which will become what truly differentiates them in the future workplace.

Kyle Jackson: Currently, customers can use CoPilot Designer for content creation workflows with complementary generative AI text and image tools, such as ChatGPT and Midjourney. XR content created with CoPilot Designer also uses text-to-speech and natural language processing to deliver realistic conversational simulations for learners.

We are also thoughtfully exploring more AI integrations on our roadmap. For example, the ability to create immersive learning simulations where the speech from virtual human characters can be driven by a large language model (LLM) that is guided by the constraints provided by the business.

This can also be mixed with more prescriptive immersive branched narratives for enterprise use cases ranging from onboarding to customer experience training simulations. With integrations like this, CoPilot Designer can be used to author open-ended AI-powered immersive learning experiences and learning modules with a very specific scenario or script for use cases that require that.

Kyle Jackson: Absolutely! Our customers have seen impressive results across different industries. For example, a PwC study proved the efficacy of immersive learning with results like a fourfold increase in learning speed and learners saying they were 275% more confident after immersive learning training. Learners ranging from Fortune 500 employees to high school-age students benefit from engaging learning experiences.

The industry applied these results to corporate training use cases ranging from practising customer conversations in the insurance industry to helping managers simulate giving performance feedback.

We're on a mission to help people develop the human skills that set us apart as AI and other technologies continue to permeate further into our work lives. We see great opportunities for these very tools to advance that mission.

See the original post here:

CoPilot to Harness AI Speed, Scale, Talespin CEO Says - XR Today

Teletrix licenses methods for ionizing radiation training using augmented reality – Newswise

Newswise: A method using augmented reality to create accurate visual representations of ionizing radiation, developed at the Department of Energy's Oak Ridge National Laboratory, has been licensed by Teletrix, a firm that creates advanced simulation tools to train the nation's radiation control workforce.

Ionizing radiation, which is linked to cancer and other health problems, has enough energy to knock electrons off of atoms or molecules, creating ions. Occupational exposure is a common occurrence for many radiological workers, so any method of decreasing exposure helps to limit overall negative effects and increase worker safety.

"In the 1940s, ORNL made pioneering contributions across numerous scientific fields, including radiation protection," said Susan Hubbard, ORNL deputy for science and technology. "In our 80th year as an institution, we continue to provide leadership in this area. This technology will allow radiological workers to better understand the environments they work in, enabling a safer and more informed workforce."

At ORNL, the licensed methods were originally used to create the virtual interaction with physics-enhanced reality, or VIPER, application. Using simulated radiation data implemented in a gaming platform, the technology divides a physical space into cubes, each representing a volumetric value of ionizing radiation by dose. A 3D interpolation of these values is then used to create an image of gradient contours that are overlaid on a real-world view through an augmented reality, or AR, headset. As a trainee moves through the space, navigating around the contours, the device calculates real-time, yet simulated, exposure based on the user's behavior.
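A stripped-down sketch of that pipeline, assuming a uniform cube grid of simulated dose rates: interpolate the dose rate between the eight surrounding cubes at the trainee's position, then accumulate simulated exposure over each time step. The grid values, units, and step size below are invented; ORNL's implementation runs inside a game engine with physics-based source data.

```python
import math

def dose_rate(grid, x, y, z):
    """Trilinear interpolation of a simulated dose rate at a fractional
    position inside a cube grid of per-voxel values."""
    x0, y0, z0 = int(math.floor(x)), int(math.floor(y)), int(math.floor(z))
    dx, dy, dz = x - x0, y - y0, z - z0
    rate = 0.0
    for i in (0, 1):          # weight each of the 8 surrounding cubes
        for j in (0, 1):
            for k in (0, 1):
                w = ((dx if i else 1 - dx) *
                     (dy if j else 1 - dy) *
                     (dz if k else 1 - dz))
                rate += w * grid[x0 + i][y0 + j][z0 + k]
    return rate

def accumulated_dose(grid, path, dt_hours):
    """Sum simulated exposure along a trainee's sampled path."""
    return sum(dose_rate(grid, *p) * dt_hours for p in path)

# A 2x2x2 grid with a uniform simulated dose rate of 5.0 (arbitrary units).
grid = [[[5.0, 5.0], [5.0, 5.0]], [[5.0, 5.0], [5.0, 5.0]]]
path = [(0.5, 0.5, 0.5), (0.2, 0.7, 0.3), (0.9, 0.1, 0.4)]
print(accumulated_dose(grid, path, dt_hours=1.0))  # ~15.0 in a uniform field
```

The same interpolated field that drives the exposure tally can also be contoured for display, which is how the headset turns an invisible quantity into visible gradient surfaces.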

"We combined physics-based data with a gaming interface that provides a visual platform to make something invisible look and feel real; we took science and cinematography and brought them together," said ORNL's Michael Smith.

In addition to Smith, the development team includes ORNL's Noel Nelson and Douglas Peplow, all of the Nuclear Energy and Fuel Cycle Division, and former ORNL researchers M. Scott Greenwood and Nicholas Thompson. Significant support came from the Nuclear and Radiological Protection Division. The technology began as an exploratory, one-year seed project funded under ORNL's Lab Directed Research and Development program.

"When it comes to training with ionizing radiation, augmented reality is a superior and safer solution," Smith said. "Our team was at the right place at the right time to develop this technology. There was a synergy of hardware and software maturity coupled with an idea that's been around a long time: the need to see ionizing radiation."

Teletrix's simulators for radiological and gas detection training are widely used by utilities, emergency response organizations and government agencies. ORNL has been a longtime customer of the Pittsburgh, Pennsylvania-based small business, which also manufactures its own products.

"Our company is solely dedicated to improving radiation training (our tagline is 'Prepare Through Simulation') and to making that training more realistic," said Jason O'Connell, sales and business development manager for Teletrix. "We're always looking to innovate training, so we make a lot of new products."

One of Teletrix's products is VIZRAD, a virtual reality software system that simulates contamination on individuals and workspaces. VIZRAD trains a user to properly scan someone with a detector and provides objective feedback on technique.

"When I put the AR glasses on, it was obvious that ORNL's technology and Teletrix's tools were a great fit," O'Connell said. "Through the headset and the AR technology, we have the ability to track a person's exact location within a room and inject source information into the room. It really raises the bar on the precision of the training we can deliver."

"Having much more realistic readings on your instruments leads to better-prepared employees, better-prepared trainees and fewer incidents. This technology will help make people in this industry safer."

Lowering exposure to ionizing radiation also provides cost benefits to companies, he said.

Smith said the development team envisioned three applications for the ORNL technology, including:

"Just by having a general impression of the spatial relationship of your body in a given radiation environment, you can decrease your overall dose based on really fundamental behavioral changes," Smith said. "We can't see ionizing radiation, so you just walk right through it. But once you have seen what the radiation in your working environment looks like, you can't unsee it. AR provides a means to train people to have a better visceral understanding of how ionizing radiation behaves."

Performance data collected from about 40 participants supports this hypothesis by showing statistically significant behavioral changes after minimal training with AR representations of radiation fields.

Additionally, the method of coupling AR technologies with accurate radiation measurements has been demonstrated and experimentally validated in a study using cesium-137 in ORNL's Nuclear Radiation Protection Division demonstration facility.

ORNL senior commercialization manager Eugene Cochran negotiated the terms of the license. For more information about ORNL's intellectual property in analytical instrumentation, email ORNL Partnerships, call 865.574.1051, or subscribe to ORNL invention alerts. To connect with the Teletrix team, email [emailprotected] or call 412.798.3636.

UT-Battelle manages ORNL for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

See the rest here:

Teletrix licenses methods for ionizing radiation training using augmented reality - Newswise

PSVR 2 VR Cover accessory kit aims to ease comfort issues – MIXED Reality News

Image: VR Cover

From pressure points to excessive sweating: a set of accessories from VR Cover aims to alleviate many of the PlayStation VR2's comfort issues.

The PSVR 2 allows for amazing VR graphics, but suffers from comfort issues depending on the shape of your head. Some users have already hacked their way out of the halo strap's poor fit and pressure points. Sweat under the plastic padding can even compromise the VR headset's technology.

A three-piece accessory set from VR Cover is designed to alleviate all these problems without affecting the warranty. Instead of modifying the hardware itself, buyers simply wrap their PSVR 2 with two fabric covers. The two wraparound covers for the front and back padding are said to reduce and absorb sweat.

The two washable covers with Velcro fasteners are each made of two layers of tightly woven cotton to keep perspiration from soaking into the foam padding.

In the style of other VR headsets, there is also a length-adjustable headband. It attaches to the sides of the halo strap with Velcro and takes some weight and pressure off the front and back of the head. It also relieves pressure on the neck and shoulders, according to the accessory maker. The principle is similar to many other VR headsets with similar top head straps.

The PSVR 2, on the other hand, practically clamps the skull between the front and back air cushions. With the right head shape, such a halo strap can be very comfortable. After all, the VR headset hangs loosely in front of your eyes with no pressure on your face and enough room for your glasses.

But as is often the case, comfort in virtual reality is highly subjective. With Sonys new headband design in particular, some customers complained about an uncomfortable fit and sweat problems.

The three-piece Head Strap Cover Set for PlayStation VR2 has been available in the European VR Cover store for 29 euros since May 4 and sold out within a few hours of going on sale. Replenishment is expected early next week: VR Cover recommends that interested buyers try their luck on Monday, May 8, when a second batch is expected.

For more tips and support, see our PSVR 2 Getting Started Guide.

Note: Links to online stores in articles can be so-called affiliate links. If you buy through this link, MIXED receives a commission from the provider. For you the price does not change.

Visit link:

PSVR 2 VR Cover accessory kit aims to ease comfort issues - MIXED Reality News

Dundalk Institute student presents at virtual reality conference – Louth Live

Dundalk Institute of Technology (DkIT) said they are delighted to report that Michael Galbraith, an immersive technology specialist with Arup and a current student of the MSc in Computer Gaming and XR in DkIT, recently delivered a successful presentation at the Meta European HQ office in Dublin in conjunction with Eirmersive.

Michael showcased various virtual reality projects he contributed to as part of the company's Immersive Technology team.

These projects exemplified the potential of immersive technology to transform designs and engage the public with proposed solutions, reflecting the practical applications of the skills he is currently honing through the MSc program at DkIT.

DkIT's MSc in Computer Gaming and XR focuses on developing software engineering skills within 3D game engines and the skillset to model 3D characters and environments with a focus on Virtual Reality (VR) and Augmented Reality (AR) technologies.

The course equips students with the knowledge and experience needed to excel in the rapidly evolving world of immersive technology.

Michael's presentation at the Meta European HQ office in Dublin serves as a testament to the quality of education provided by DkIT.

As students like Michael continue to grow and achieve success in the immersive technology field, DkIT remains committed to offering innovative educational programs that prepare students for the dynamic developments ahead within this fast-moving pioneering industry.

Go here to see the original:

Dundalk Institute student presents at virtual reality conference - Louth Live

Memes, virtual reality used to train Home Team officers – The Straits Times

SINGAPORE - A photo of American actor Sylvester Stallone as Rambo sticking up both his thumbs is being used to train the next generation of Home Team officers.

The meme, also known as "Thumbs Up Rambo", is a reminder to officers that travellers use both their thumbs to clear biometric scans at immigration.

It is one of several memes being used at the Home Team Academy (HTA) to keep training relevant for younger officers, as well as help them better remember and develop the skills they need.

The memes used to train officers in immigration clearance can be scanned using an app that gives officers more information on how clearance should be done and how to spot suspicious characters at checkpoints.

These were unveiled on Tuesday at HTA's workplan seminar, where Second Minister for Home Affairs Josephine Teo also launched the second iteration of the Home Team Learning Management System.

The system, which was first used in 2016, has been enhanced to bring together training, assessment and social collaboration onto one platform.

Artificial intelligence-assisted assessment will also be used.

The plan is for the system to eventually become the primary training platform for more than 68,000 officers across the Home Team.

Mrs Teo said the HTA, as the corporate university in homefront safety and security, plays a crucial role in ensuring Home Team officers are future-ready.

She said: "Competency-building through training and learning will enable our officers to tackle emerging and future challenges effectively, and achieve our mission of keeping Singapore and Singaporeans safe and secure."

Read more:

Memes, virtual reality used to train Home Team officers - The Straits Times

The Global Augmented Reality In Agriculture Market to register … – Digital Journal

PRESS RELEASE

Published May 5, 2023

Factual Market Research has released a report on the global Augmented Reality In Agriculture market, including historical and current growth prospects and trends from 2022-2030. The Report utilizes unique research techniques that combine primary and secondary research to comprehensively analyze the global Augmented Reality In Agriculture market and draw conclusions about its future growth potential. This method helps analysts determine the quality and reliability of the data. The Report offers valuable insights on key market factors, including market trends, growth prospects, and expansion opportunities for the industry.

Augmented Reality (AR) technology is becoming increasingly prevalent in agriculture. AR in agriculture involves using digital images, video, or sound to enhance the real-world environment and provide farmers with valuable insights and information.

Market Dynamics:

Drivers and Restraints:

The Augmented Reality In Agriculture Market is being driven by several factors, including the increasing demand for precision agriculture, the need to optimize farming processes and reduce waste, and the growing adoption of smart farming technologies. AR technology can help farmers to make more informed decisions about planting, watering, and harvesting crops, as well as detect and treat plant diseases and pests more effectively.

Moreover, the use of AR in agriculture can improve worker safety by providing real-time data and alerts about potential hazards and risks. Additionally, the increasing availability of affordable AR devices such as smartphones and tablets is making this technology more accessible to farmers and agricultural workers.

However, some factors are restraining the growth of the AR in agriculture market. These include the limited adoption of advanced technologies by small-scale farmers, the lack of standardized practices and regulations for using AR in agriculture, and the high costs associated with implementing AR systems.

Any query regarding the Report:

https://www.factualmarketresearch.com/Reports/Augmented-Reality-In-Agriculture-Market

Key players:

Market Segmentation:

Augmented Reality in Agriculture Market, By Type

Augmented Reality In Agriculture Market, By End-User

Augmented Reality In Agriculture Market, By Application

Augmented Reality In Agriculture Market, By Region

Get a Free Sample Report:

https://www.factualmarketresearch.com/Reports/Augmented-Reality-In-Agriculture-Market

Market Trends:

Some of the key trends in the augmented reality in agriculture market include the development of new and innovative AR applications for farming, integrating AR with other smart farming technologies such as drones and sensors, and using AR in training and education programs for farmers and agricultural workers.

Another emerging trend is the use of AR to create virtual simulations of farming environments, which can help farmers test different strategies and scenarios safely and in a controlled manner. In addition, the increasing use of AR to improve the traceability and transparency of the agricultural supply chain is also driving the growth of augmented reality in the agriculture market.

For any customization:

https://www.factualmarketresearch.com/Inquiry/12856

The Report covers the following key elements:

Table of Contents: Augmented Reality In Agriculture Market

Chapter 1: Introduction to Augmented Reality In Agriculture Market

Chapter 2: Analysis of Market Drivers

Chapter 3: Global Market Status and Regional Forecast

Chapter 4: Global Market Status and Forecast by Types

Chapter 5: Competition Status among Major Manufacturers

Chapter 6: Introduction and Market Data of Major Manufacturers

Chapter 7: Upstream and Downstream Analysis

Chapter 8: PESTEL, SWOT, and PORTER 5 Forces Analysis

Chapter 9: Cost Analysis and Gross Margin

Chapter 10: Sales Channels, Distributors, Traders, and Dealers

Chapter 11: Analysis of Marketing Status

Chapter 12: Conclusion of Market Report

Chapter 13: Methodology and References for Augmented Reality In Agriculture Market Research

Chapter 14: Appendix

About Us:

Factual Market Research is a leading provider of comprehensive industry research that provides clients with actionable intelligence to answer their research questions. Our expertise covers over 20 industries, and we provide customized syndicated and consulting research services to cater to our clients' specific requirements. Our focus is on delivering high-quality Market Research Reports and Business Intelligence Solutions that enable clients to make informed decisions and achieve long-term success in their respective market niches. Additionally, FMR offers business insights and consulting services to further support our clients.

Visit our website to learn more about our services and how we can assist you.

Contact Us:

If you have any questions regarding our Augmented Reality In Agriculture report or require further information, please don't hesitate to contact us.

E-mail:[emailprotected]

Contact Person: Jaipreet Makked

US Toll-Free: +18007743961

UK (Tollfree): +448081897087

Web: https://www.factualmarketresearch.com/

Follow us on LinkedIn

View post:

The Global Augmented Reality In Agriculture Market to register ... - Digital Journal

Using augmented reality to guide bone conduction device … – Nature.com

Specimen preparation

Whole cadaveric heads were prepared with bilateral curvilinear post-auricular incisions with elevation of a soft tissue flap for exposure of the zygomatic root, posterior external auditory canal, and the mastoid tip. Eight 2 mm bone wells were drilled outside of the surgical field to act as fiducial references for eventual image guidance calibration within the experimental arm. Areas of placement included the zygomatic root, bony external auditory canal, and the mastoid tip.

Using a prototype intraoperative cone-beam computed tomography scanner (Powermobil, Siemens, Germany), scans of the cadaver heads were obtained with an isotropic voxel size of 0.78 mm12. Scans were evaluated for abnormal anatomy or evidence of previous surgery. Both the O-OSI and BB-FMT devices were imaged for surgical modelling, creating a virtual rendering of each hearing device for projecting the overlay during the procedure. Materialise Mimics Medical 19.0 (Materialise NV, Belgium) was used to identify optimal placement of the devices, with virtual heads rendered from CT imaging using pre-set bony segmentation sequencing.

Implants were imported into Materialise Mimics as optimized triangulated surface meshes that moved independently from the bone. The experimental design is outlined in Fig. 1. Each surgeon's pre-operative planning included placement of four O-OSI devices and four BB-FMT devices in two separate sessions. Bone depth and avoidance of critical structures, such as the sigmoid sinus, were major factors. O-OSIs were placed within the mastoid, and clearance around the implant was ensured to avoid inadvertent contact with underlying bone. The three possible placements of the BB-FMTs included the mastoid, retrosigmoid, and middle fossa areas. Each surgeon underwent a brief 10-minute session with surgical manuals to review optimal surgical technique for both implants. Each planning session lasted five minutes to allow surgeons to guide exact placement.

Study protocol (CBCT cone beam computed tomography; O-OSI Osia osseointegrated steady-state implant; BB-FMT BoneBridge floating mass transducer).

Implantation followed a standardized protocol beginning with the control arm followed by the experimental AR arm (Fig. 1). In the control arm, surgeons utilized Materialise Mimics' built-in measurement tool for intraoperative reference during implant placement. In the experimental arm, device placement was projected onto the surgical field using GTx-Eyes (Guided Therapeutics, TECHNA Institute, Canada) via a PicoPro projector (Cellon Inc., South Korea)7,11. The AR setup is demonstrated in Fig. 2 and in the supplementary video.

Integrated augmented reality surgical navigation system. (A) The projector and surgical instruments were tracked with the optical tracker in reference to the registered fiducials on the cadaveric head. Optical tracking markers attached to the projector allow for real-time adjustments to image projection. The surgical navigation platform displays a pre-operatively placed implant. (B) Experimental AR projection arm setup: surgeons were encouraged to align the projector to their perspective to reduce parallax.

Following implant placement, CT scans were obtained of the cadaveric heads to capture the location of implantation for eventual 3D coordinates measurement analysis. Each surgeon performed four O-OSI placements followed by four BB-FMTs.

The integrated AR surgical navigation system consists of a PicoPro projector (Cellon Inc., South Korea), a Polaris Spectra stereoscopic infrared optical tracker (NDI, Canada), a USB 2.0-megapixel camera (ICAN, China), and a standard computer. A 3D printed PicoPro projector enclosure enabled the attachment of four tracking markers, which provide real-time three-dimensional tracking information (Fig. 2). GTx-Eyes (Guided Therapeutics, TECHNA Institute, Canada) is a surgical navigation platform that utilizes open-source, cross-platform libraries including IGSTK, ITK, and VTK11,13,14,15,16. The developed AR system has demonstrated projection accuracy of 0.55 ± 0.33 mm and has been widely applied in Otolaryngologic and Orthopedic oncologic operations17,18,19,20. Recently, the software has evolved to include AR integration7,9.

The AR system requires two calibrations: (1) camera to instrument tracker, and (2) camera to projector, both of which are outlined by Chan et al.9,11. The result allows the tracked tool to be linked with the projector's spatial parameters, allowing for both translational and rotational movements.

The camera and tracking tool calibration defines the relationship between the camera's center and the tracking tool coordinates by creating a homogeneous transformation matrix, \({}^{Tracker}T_{Cam}\), consisting of a 3×3 rotation matrix (R) and a 3×1 translation vector (t). The rotational parameter was represented with Euler angles \((R_x, R_y, R_z)\). This calibration process requires photographing a known checkerboard pattern from various perspectives using the camera that is affixed to the projector's case. The instrument tracker's position and orientation are recorded to compute the spatial transformation. The grid dimensions from each photograph are compared with actual dimensions (30 mm × 30 mm in a 9×7 array) using an open-source Matlab camera calibration tool21. This calibration serves as the extrinsic parameter of the camera.

The intrinsic parameters (A) of the camera include the principal point \((u_0, v_0)\), the scale factors \((\alpha, \beta)\), and the skew of the two image axes \((c)\)22,23,24. This is denoted as:

$$\mathbf{A}=\left[\begin{array}{ccc}\alpha & c & u_{0}\\ 0 & \beta & v_{0}\\ 0 & 0 & 1\end{array}\right]$$

When combining the extrinsic (R, t) with intrinsic (A) parameters, a point in three-dimensional space \((\mathbf{M}=[X,Y,Z,1]^{T})\) can be mapped to a two-dimensional camera image \((\mathbf{m}=[u,v,1]^{T})\), where s is defined as the scale factor. This is represented by: \(s\mathbf{m}=\mathbf{A}\left[\mathbf{R}\ \mathbf{t}\right]\mathbf{M}\).

This link defines the spatial relationship between the camera's centre and the projector to create a homogeneous transformation matrix \(({}^{Cam}T_{Proj})\). A two-dimensional checkerboard image is projected onto a planar checkerboard surface, which was used in the previous calibration step. The camera captures both images from various perspectives. Using the projector-camera calibration toolbox, the transformation between the camera and projector \(({}^{Cam}T_{Proj})\) is established25. The calibration requires linking the camera and the projector tracking markers, both of which are mounted on the projector enclosure (Fig. 2). By combining both calibration processes, the resulting transformation matrix from the AR projector to the tracking marker is denoted by \({}^{Tracker}T_{Proj}={}^{Tracker}T_{Cam}\cdot{}^{Cam}T_{Proj}\).
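
The calibration chain can be sketched numerically. The following is an illustrative sketch, not the authors' code: the intrinsic values and the test point are invented. It builds an intrinsic matrix A, projects a 3D point via s m = A [R t] M, and composes homogeneous transforms the way the Tracker-to-projector matrix is formed:

```python
import numpy as np

def intrinsics(alpha, beta, u0, v0, c=0.0):
    """Build the 3x3 intrinsic matrix A from focal scales, principal point, skew."""
    return np.array([[alpha, c, u0], [0.0, beta, v0], [0.0, 0.0, 1.0]])

def project(A, R, t, M):
    """Map a 3D point M to pixel coordinates via s*m = A [R t] M."""
    m = A @ (R @ M + t)   # s * [u, v, 1]^T
    return m[:2] / m[2]   # divide out the scale factor s

def compose(T_a_b, T_b_c):
    """Chain 4x4 homogeneous transforms, e.g. Tracker<-Cam with Cam<-Proj."""
    return T_a_b @ T_b_c

# Invented intrinsics: 800 px focal scales, principal point at (320, 240).
A = intrinsics(800.0, 800.0, 320.0, 240.0)
R, t = np.eye(3), np.zeros(3)  # camera at the world origin, no rotation
print(project(A, R, t, np.array([0.1, 0.05, 1.0])))  # a point 1 m ahead
```

With an identity pose, the point (0.1, 0.05, 1.0) lands at pixel (400, 280): the focal scale magnifies the lateral offsets and the principal point shifts them into image coordinates.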

AR projection setup required confirmation of projection adequacy using an image guidance probe and the Polaris Spectra optical tracker (NDI) (Fig. 2). Using the image guidance probe, coordinates of the bony fiducials (drilled bone wells) and the projected fiducials (green dots) were captured. The difference between the coordinates served as the measurement of projection accuracy (Fig. 3).
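
The accuracy metric described here reduces to a per-fiducial Euclidean distance. A minimal sketch, with hypothetical coordinates in millimetres (the study's actual measurements are not reproduced here):

```python
import numpy as np

# Hypothetical fiducial coordinates (mm): drilled bone wells vs. the
# projected green dots captured with the tracking probe.
drilled = np.array([[10.0, 5.0, 2.0], [22.1, 8.4, 3.3]])
projected = np.array([[10.4, 5.3, 2.1], [21.8, 8.9, 3.0]])

# Per-fiducial Euclidean error, then the mean as a summary accuracy figure.
errors = np.linalg.norm(drilled - projected, axis=1)
print(errors, errors.mean())
```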

(A) Fiducial projections onto the surgical field were matched to the drilled wells and (B) subsequent accuracy measurements were obtained with a tracking pointer tool placed within the drilled wells, where x-, y-, and z-coordinates were captured.

Post-operative and pre-operative scans were superimposed in Materialise Mimics, and centre-to-centre distances as well as angular differences in the axial plane were measured (Figs. 4, 5). For O-OSI placements, the centre of the O-OSI was used, whereas for BB-FMT placements, the centre of the FMT was used.

Accuracy measurements for center-to-center distances and angular accuracy.

Post-operative CT scans of (A) BB-FMT and (B) O-OSI following AR projector-guided surgery, with paired pre-operative planning renderings seen in (C) and (D). In images (A) and (B), the pre-operative planning outline is superimposed. The blue arrow denotes post-operative placement, whereas the red arrow denotes pre-operative planning.

All participants completed a NASA Task Load Index (TLX) questionnaire assessing the use of AR in addition to providing feedback in an open-ended questionnaire26. TLX results were used to generate raw TLX (RTLX) scores for the six domains and subsequently weighted workload scores were generated27.

Continuous data were examined for normality by reviewing histograms, quantile-quantile plots, and the Shapiro-Wilk test. Given the lack of normality and the repeated measurements, Wilcoxon signed-rank testing was used for centre-to-centre (C-C) and angular accuracy comparisons between the control and experimental arms. All analyses were performed using SPSS 26 (IBM Corp., Armonk, NY).
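
As a sketch of this analysis pipeline, with made-up paired accuracy data and SciPy standing in for SPSS (the distributions and sample size are invented for illustration):

```python
import numpy as np
from scipy.stats import shapiro, wilcoxon

# Made-up paired centre-to-centre errors (mm): a skewed "control" sample
# and a paired "AR arm" sample that is uniformly somewhat better.
rng = np.random.default_rng(0)
control = rng.gamma(shape=2.0, scale=2.0, size=16)  # skewed, non-normal
ar_arm = control * rng.uniform(0.5, 0.9, size=16)   # paired improvement

print(shapiro(control).pvalue)        # normality check on one sample
stat, p = wilcoxon(control, ar_arm)   # paired, non-parametric comparison
print(stat, p)
```

Because every paired difference favours the AR arm in this toy data, the signed-rank statistic collapses to zero and the test reports a significant difference; with real data the normality checks above decide whether this non-parametric route is needed at all.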

All methods were carried out in accordance with relevant guidelines and regulations. This study was approved by the Sunnybrook Health Sciences Centre Research Ethics Board (Project Identification Number: 3541). Informed consent was obtained from all subjects and/or their legal guardian(s) by way of the University of Toronto's Division of Anatomy Body Donation Program. All subjects provided consent to the publication of identifying images in an online open-access publication.

Go here to see the original:

Using augmented reality to guide bone conduction device ... - Nature.com

Mixed Reality Music Prototype Turns Spotify Into Vinyl – UploadVR

Freelance Creative Director Bob Bjarke, formerly of Meta, shared an amusing new mixed reality concept on Twitter centered around discovering new music and creating playlists with virtual records.

Bjarke shared footage of Wreckommendation Engine, a prototype experience he created during the Meta Quest Presence Platform Hackathon last week with Unity developers @RJdoesVR and Jeremy Kesten, 3D artist and prototyper Joe Kane, and immersive sound designer David Urrutia.

Wreckommendation Engine presents users with a virtual record player and crate of records, positioned on a real-life surface using mixed reality passthrough on Quest Pro. The user can grab records out of the crate and listen to them by placing them on the turntable. If you like the music, you can throw it against a designated nearby wall to save it. If you hate it, you can throw it against a different wall to smash it into pieces.

If you smash too many tracks, they will eventually come back to life as a killer robot made up of vintage electronics and hi-fi equipment. You can destroy it by throwing more records at it.

The experience integrates with Spotify and uses its API to present you with new tracks, take note of your preferences and compile your saved tracks into a playlist for later.

This is just a proof-of-concept prototype and a bit of fun, so it's unlikely ever to see the light of day for Quest users. Nonetheless, it's an amusing concept and a cool way to bring more physicality into music discovery in the age of streaming. In a follow-up tweet, Bjarke said the team wanted to use the immersive tools of mixed reality to make a more fun and social music experience, given that formerly social activities like making mixtapes and burning CDs are now algorithmic utilities, done alone on a 2D screen.

Originally posted here:

Mixed Reality Music Prototype Turns Spotify Into Vinyl - UploadVR