PSVR 2 VR Cover accessory kit aims to ease comfort issues – MIXED Reality News

Image: VR Cover

From pressure points to excessive sweating: A set of accessories from VR Cover aims to alleviate many of the PlayStation VR2's comfort issues.

The PSVR 2 delivers amazing VR graphics but, depending on the shape of your head, can suffer from comfort issues. Some users have already hacked their way around the halo strap's poor fit and pressure points. Sweat under the plastic padding can even compromise the VR headset's technology.

A three-piece accessory set from VR Cover is designed to alleviate all these problems without affecting the warranty. Instead of modifying the hardware itself, buyers simply wrap their PSVR 2 in two fabric covers. The wraparound covers for the front and back padding are said to reduce and absorb sweat.

The two washable covers with Velcro fasteners are each made of two layers of tightly woven cotton to keep perspiration from soaking into the foam padding.

In the style of many other VR headsets, there is also a length-adjustable top head strap. It attaches to the sides of the halo strap with Velcro and takes some weight and pressure off the front and back of the head. According to the accessory maker, it also relieves pressure on the neck and shoulders.

The PSVR 2, on the other hand, practically clamps the skull between the front and back air cushions. With the right head shape, such a halo strap can be very comfortable. After all, the VR headset hangs loosely in front of your eyes with no pressure on your face and enough room for your glasses.

But as is often the case, comfort in virtual reality is highly subjective. With Sony's new headband design in particular, some customers complained about an uncomfortable fit and sweat problems.

The three-piece Head Strap Cover Set for PlayStation VR2 went on sale in the European VR Cover store for 29 euros on May 4 and sold out within a few hours. VR Cover expects a second batch early next week and recommends that interested buyers try their luck on Monday, May 8.

For more tips and support, see our PSVR 2 Getting Started Guide.

Note: Links to online stores in articles may be affiliate links. If you buy through one of these links, MIXED receives a commission from the provider. The price does not change for you.

Visit link:

PSVR 2 VR Cover accessory kit aims to ease comfort issues - MIXED Reality News

Dundalk Institute student presents at virtual reality conference – Louth Live

Dundalk Institute of Technology (DkIT) said it is delighted to report that Michael Galbraith, an immersive technology specialist with Arup and a current student on the MSc in Computer Gaming and XR at DkIT, recently delivered a successful presentation at the Meta European HQ office in Dublin in conjunction with Eirmersive.

Michael showcased various virtual reality projects he contributed to as part of the company's Immersive Technology team.

These projects exemplified the potential of immersive technology to transform designs and engage the public with proposed solutions, reflecting the practical applications of the skills he is currently honing through the MSc program at DkIT.

DkIT's MSc in Computer Gaming and XR focuses on developing software engineering skills within 3D game engines and the skillset to model 3D characters and environments with a focus on Virtual Reality (VR) and Augmented Reality (AR) technologies.

The course equips students with the knowledge and experience needed to excel in the rapidly evolving world of immersive technology.

Michael's presentation at the Meta European HQ office in Dublin serves as a testament to the quality of education provided by DkIT.

As students like Michael continue to grow and achieve success in the immersive technology field, DkIT remains committed to offering innovative educational programs that prepare students for the dynamic developments ahead within this fast-moving pioneering industry.


Go here to see the original:

Dundalk Institute student presents at virtual reality conference - Louth Live

Memes, virtual reality used to train Home Team officers – The Straits Times

SINGAPORE – A photo of American actor Sylvester Stallone as Rambo sticking up both his thumbs is being used to train the next generation of Home Team officers.

The meme, also known as "Thumbs Up Rambo", is a reminder to officers that travellers use both their thumbs to clear biometric scans at immigration.

It is one of several memes being used at the Home Team Academy (HTA) to keep training relevant for younger officers, as well as help them better remember and develop the skills they need.

The memes used to train officers in immigration clearance can be scanned using an app to provide officers more information on how clearance should be done, and also how to spot suspicious characters at checkpoints.

These were unveiled on Tuesday at HTA's workplan seminar, where Second Minister for Home Affairs Josephine Teo also launched the second iteration of the Home Team Learning Management System.

The system, which was first used in 2016, has been enhanced to bring together training, assessment and social collaboration onto one platform.

Artificial intelligence-assisted assessment will also be used.

The plan is for the system to eventually become the primary training platform for more than 68,000 officers across the Home Team.

Mrs Teo said the HTA, as the corporate university in homefront safety and security, plays a crucial role in ensuring Home Team officers are future-ready.

She said: "Competency-building through training and learning will enable our officers to tackle emerging and future challenges effectively, and achieve our mission of keeping Singapore and Singaporeans safe and secure."

Read more:

Memes, virtual reality used to train Home Team officers - The Straits Times

The Global Augmented Reality In Agriculture Market to register … – Digital Journal

PRESS RELEASE

Published May 5, 2023

Factual Market Research has released a report on the global Augmented Reality In Agriculture market, covering historical and current growth prospects and trends from 2022 to 2030. The report utilizes research techniques that combine primary and secondary research to comprehensively analyze the global Augmented Reality In Agriculture market and draw conclusions about its future growth potential. This method helps analysts determine the quality and reliability of the data. The report offers valuable insights on key market factors, including market trends, growth prospects, and expansion opportunities for the industry.

Augmented Reality (AR) technology is becoming increasingly prevalent in agriculture. AR in agriculture involves using digital images, video, or sound to enhance the real-world environment and provide farmers with valuable insights and information.

Market Dynamics:

Drivers and Restraints:

The Augmented Reality In Agriculture Market is being driven by several factors, including the increasing demand for precision agriculture, the need to optimize farming processes and reduce waste, and the growing adoption of smart farming technologies. AR technology can help farmers to make more informed decisions about planting, watering, and harvesting crops, as well as detect and treat plant diseases and pests more effectively.

Moreover, the use of AR in agriculture can improve worker safety by providing real-time data and alerts about potential hazards and risks. Additionally, the increasing availability of affordable AR devices such as smartphones and tablets is making this technology more accessible to farmers and agricultural workers.

However, some factors are restraining the growth of the AR in agriculture market. These include the limited adoption of advanced technologies by small-scale farmers, the lack of standardized practices and regulations for using AR in agriculture, and the high costs associated with implementing AR systems.

Any query regarding the Report:

https://www.factualmarketresearch.com/Reports/Augmented-Reality-In-Agriculture-Market

Key players:

Market Segmentation:

Augmented Reality in Agriculture Market, By Type

Augmented Reality In Agriculture Market, By End-User

Augmented Reality In Agriculture Market, By Application

Augmented Reality In Agriculture Market, By Region

Get a Free Sample Report:

https://www.factualmarketresearch.com/Reports/Augmented-Reality-In-Agriculture-Market

Market Trends:

Some of the key trends in the augmented reality in agriculture market include the development of new and innovative AR applications for farming, integrating AR with other smart farming technologies such as drones and sensors, and using AR in training and education programs for farmers and agricultural workers.

Another emerging trend is the use of AR to create virtual simulations of farming environments, which can help farmers test different strategies and scenarios safely and in a controlled manner. In addition, the increasing use of AR to improve the traceability and transparency of the agricultural supply chain is also driving the growth of augmented reality in the agriculture market.

For any customization:

https://www.factualmarketresearch.com/Inquiry/12856

The Report covers the following key elements:

Table of Contents: Augmented Reality In Agriculture Market

Chapter 1: Introduction to Augmented Reality In Agriculture Market

Chapter 2: Analysis of Market Drivers

Chapter 3: Global Market Status and Regional Forecast

Chapter 4: Global Market Status and Forecast by Types

Chapter 5: Competition Status among Major Manufacturers

Chapter 6: Introduction and Market Data of Major Manufacturers

Chapter 7: Upstream and Downstream Analysis

Chapter 8: PESTEL, SWOT, and PORTER 5 Forces Analysis

Chapter 9: Cost Analysis and Gross Margin

Chapter 10: Sales Channels, Distributors, Traders, and Dealers

Chapter 11: Analysis of Marketing Status

Chapter 12: Conclusion of Market Report

Chapter 13: Methodology and References for Augmented Reality In Agriculture Market Research

Chapter 14: Appendix

About Us:

Factual Market Research is a leading provider of comprehensive industry research that provides clients with actionable intelligence to answer their research questions. Our expertise covers over 20 industries, and we provide customized syndicated and consulting research services to cater to our clients' specific requirements. Our focus is on delivering high-quality Market Research Reports and Business Intelligence Solutions that enable clients to make informed decisions and achieve long-term success in their respective market niches. Additionally, FMR offers business insights and consulting services to further support our clients.

Visit our website to learn more about our services and how we can assist you.

Contact Us:

If you have any questions regarding our Augmented Reality In Agriculture report or require further information, please don't hesitate to contact us.

E-mail:[emailprotected]

Contact Person: Jaipreet Makked

US Toll-Free: +18007743961

UK (Tollfree): +448081897087

Web: https://www.factualmarketresearch.com/

Follow us on LinkedIn

View post:

The Global Augmented Reality In Agriculture Market to register ... - Digital Journal

Using augmented reality to guide bone conduction device … – Nature.com

Specimen preparation

Whole cadaveric heads were prepared with bilateral curvilinear post-auricular incisions, with elevation of a soft tissue flap for exposure of the zygomatic root, posterior external auditory canal, and mastoid tip. Eight 2 mm bone wells were drilled outside of the surgical field to act as fiducial references for eventual image guidance calibration within the experimental arm. Areas of placement included the zygomatic root, bony external auditory canal, and mastoid tip.

Scans of the cadaver heads were obtained using a prototype intraoperative cone-beam computed tomography scanner (Powermobil, Siemens, Germany), with an isotropic voxel size of 0.78 mm [12]. Scans were evaluated for abnormal anatomy or evidence of previous surgery. Both the O-OSI and BB-FMT devices were imaged for surgical modelling, creating a virtual rendering of each hearing device for projecting the overlay during the procedure. Materialise Mimics Medical 19.0 (Materialise NV, Belgium) was used to identify optimal placement of the devices, with virtual heads rendered from CT imaging using pre-set bony segmentation sequencing.

Implants were imported into Materialise Mimics as optimized triangulated surface meshes that moved independently from the bone. The experimental design is outlined in Fig. 1. Each surgeon's pre-operative planning included placement of four O-OSI devices and four BB-FMT devices in two separate sessions. Bone depth and avoidance of critical structures, such as the sigmoid sinus, were major factors. O-OSIs were placed within the mastoid, and clearance around the implant was ensured to avoid inadvertent contact with underlying bone. The three possible placements of the BB-FMTs included the mastoid, retrosigmoid, and middle fossa areas. Each surgeon underwent a brief 10-minute session with surgical manuals to review optimal surgical technique for both implants. Each planning session lasted five minutes to allow surgeons to specify exact placement.

Study protocol (CBCT: cone-beam computed tomography; O-OSI: Osia osseointegrated steady-state implant; BB-FMT: BoneBridge floating mass transducer).

Implantation followed a standardized protocol beginning with the control arm, followed by the experimental AR arm (Fig. 1). In the control arm, surgeons used Materialise Mimics' built-in measurement tool for intraoperative reference during implant placement. In the experimental arm, device placement was projected onto the surgical field using GTx-Eyes (Guided Therapeutics, TECHNA Institute, Canada) via a PicoPro projector (Cellon Inc., South Korea) [7, 11]. The AR setup is demonstrated in Fig. 2 and shown in the supplementary video.

Integrated augmented reality surgical navigation system. (A) The projector and surgical instruments were tracked with the optical tracker in reference to the registered fiducials on the cadaveric head. Optical tracking markers attached to the projector allow for real-time adjustments to image projection. The surgical navigation platform displays a pre-operatively placed implant. Experimental AR projection arm setup. (B) Surgeons were encouraged to align the projector to their perspective to reduce parallax.

Following implant placement, CT scans were obtained of the cadaveric heads to capture the location of implantation for eventual 3D coordinates measurement analysis. Each surgeon performed four O-OSI placements followed by four BB-FMTs.

The integrated AR surgical navigation system consists of a PicoPro projector (Cellon Inc., South Korea), a Polaris Spectra stereoscopic infrared optical tracker (NDI, Canada), a USB 2.0-megapixel camera (ICAN, China), and a standard computer. A 3D-printed PicoPro projector enclosure enabled the attachment of four tracking markers, which provide real-time three-dimensional tracking information (Fig. 2). GTx-Eyes (Guided Therapeutics, TECHNA Institute, Canada) is a surgical navigation platform built on open-source, cross-platform libraries including IGSTK, ITK, and VTK [11, 13-16]. The AR system has demonstrated projection accuracy of 0.55 ± 0.33 mm and has been widely adopted in otolaryngologic and orthopedic oncologic operations [17-20]. Recently, the software has evolved to include AR integration [7, 9].

The AR system requires two calibrations: (1) camera and instrument tracker, and (2) camera and projector, both of which are outlined by Chan et al. [9, 11]. The result allows the tracked tool to be linked with the projector's spatial parameters, allowing for both translational and rotational movements.

The camera and tracking tool calibration defines the relationship between the camera's centre and the tracking tool coordinates by creating a homogeneous transformation matrix, ${}^{Tracker}T_{Cam}$, consisting of a 3×3 rotation matrix $R$ and a 3×1 translation vector $t$. The rotational parameter was represented with Euler angles $(R_x, R_y, R_z)$. This calibration process requires photographing a known checkerboard pattern from various perspectives using the camera affixed to the projector's case. The instrument tracker's position and orientation are recorded to compute the spatial transformation. The grid dimensions from each photograph are compared with the actual dimensions (30 mm × 30 mm squares in a 9×7 array) using an open-source Matlab camera calibration tool [21]. This calibration serves as the extrinsic parameters of the camera.
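For readers who want to reproduce this step without the Matlab toolbox, the sketch below shows the same checkerboard procedure using OpenCV's calibration routines; the image folder and inner-corner grid size are our illustrative assumptions, not the authors' settings.

```python
# Hypothetical re-creation of the checkerboard calibration step with OpenCV.
# Folder name and inner-corner count are illustrative, not the study's values.
import glob
import cv2
import numpy as np

pattern = (8, 6)   # inner corners (a 9x7 array of squares has 8x6 inner corners)
square = 30.0      # mm, per the 30 mm x 30 mm squares

# 3D reference points for one view of the board (board plane at Z = 0)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

objpoints, imgpoints, img_size = [], [], None
for path in glob.glob("calib_views/*.png"):   # photos from various perspectives
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        objpoints.append(objp)
        imgpoints.append(corners)

# Returns the intrinsic matrix A, distortion coefficients, and per-view
# extrinsics (rotation and translation vectors)
ret, A, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, img_size, None, None)
```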

The intrinsic parameters $\mathbf{A}$ of the camera include the principal point $(u_0, v_0)$, the scale factors $(\alpha, \beta)$, and the skew of the two image axes $(c)$ [22-24]. This is denoted as:

$$\mathbf{A}=\begin{bmatrix}\alpha & c & u_{0}\\ 0 & \beta & v_{0}\\ 0 & 0 & 1\end{bmatrix}$$

When combining the extrinsic parameters $[\mathbf{R}\ \mathbf{t}]$ with the intrinsic parameters $\mathbf{A}$, a point in three-dimensional space, $\mathbf{M}=[X,Y,Z,1]^{T}$, can be mapped to a point in the two-dimensional camera image, $\mathbf{m}=[u,v,1]^{T}$, where $s$ is a scale factor. This is represented by $s\,\mathbf{m}=\mathbf{A}\left[\mathbf{R}\ \mathbf{t}\right]\mathbf{M}$.
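As a concrete illustration of that mapping (with made-up camera parameters, not the study's calibration values):

```python
# Pinhole projection s*m = A [R t] M with invented example values.
import numpy as np

alpha, beta, c = 800.0, 800.0, 0.0      # scale factors and skew (made up)
u0, v0 = 320.0, 240.0                   # principal point (made up)
A = np.array([[alpha, c,    u0],
              [0.0,   beta, v0],
              [0.0,   0.0,  1.0]])

R = np.eye(3)                           # rotation: identity for simplicity
t = np.array([[0.0], [0.0], [500.0]])   # translation: 500 mm along the optic axis

M = np.array([[30.0], [30.0], [0.0], [1.0]])  # homogeneous 3D point
s_m = A @ np.hstack([R, t]) @ M               # s * [u, v, 1]^T
u, v = (s_m[:2] / s_m[2]).ravel()             # divide out the scale factor s
print(u, v)  # pixel coordinates of the projected point
```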

The second calibration defines the spatial relationship between the camera's centre and the projector to create a homogeneous transformation matrix, ${}^{Cam}T_{Proj}$. A two-dimensional checkerboard image is projected onto the planar checkerboard surface used in the previous calibration step, and the camera captures both patterns from various perspectives. Using the projector-camera calibration toolbox, the transformation between the camera and the projector, ${}^{Cam}T_{Proj}$, is established [25]. The calibration requires linking the camera and the projector tracking markers, both of which are mounted on the projector enclosure (Fig. 2). By combining both calibration processes, the resulting transformation from the AR projector to the tracking marker is denoted by ${}^{Tracker}T_{Proj}={}^{Tracker}T_{Cam}\cdot{}^{Cam}T_{Proj}$.
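Chaining the two calibrations is plain matrix multiplication of 4×4 homogeneous transforms; a minimal sketch with placeholder values:

```python
# Composing homogeneous transforms: Tracker<-Proj = Tracker<-Cam @ Cam<-Proj.
# The rotations and offsets below are placeholders, not calibration results.
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 R and a length-3 t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

T_tracker_cam = make_transform(np.eye(3), [0.0, 40.0, 0.0])  # camera in tracker frame
T_cam_proj = make_transform(np.eye(3), [15.0, 0.0, 0.0])     # projector in camera frame
T_tracker_proj = T_tracker_cam @ T_cam_proj                  # projector in tracker frame
print(T_tracker_proj)
```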

AR projection setup required confirmation of projection adequacy using an image guidance probe and the Polaris Spectra tracker (NDI) (Fig. 2). Using the image guidance probe, coordinates from the bony fiducials (drilled bone wells) and the projected fiducials (green dots) were captured. The difference between coordinates served as the measurement of projection accuracy (Fig. 3).

(A) Fiducial projections onto the surgical field were matched to the drilled wells and (B) subsequent accuracy measurements were obtained with a tracking pointer tool placed within the drilled wells, where x-, y- and z-coordinates were captured.
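The accuracy metric itself reduces to Euclidean distances between the paired coordinates; a small sketch with invented probe readings:

```python
# Projection accuracy as the distance between probed bony fiducials and the
# corresponding projected fiducials (all coordinates invented for illustration).
import numpy as np

bony = np.array([[10.0, 5.0, 2.0],
                 [40.2, 8.1, 1.5]])        # probe tip in the drilled wells (mm)
projected = np.array([[10.4, 5.3, 2.1],
                      [39.8, 8.4, 1.2]])   # probe tip on the projected green dots
errors = np.linalg.norm(bony - projected, axis=1)  # per-fiducial error (mm)
print(errors.mean())
```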

Post-operative and pre-operative scans were superimposed in Materialise Mimics, and centre-to-centre distances as well as angular differences on the axial plane were measured (Figs. 4, 5). For O-OSI placements, the centre of the O-OSI was used, whereas for BB-FMT placements the centre of the FMT was used.

Accuracy measurements for center-to-center distances and angular accuracy.

Post-operative CT scans of (A) BB-FMT and (B) O-OSI following AR projector-guided surgery, with the paired pre-operative planning renderings seen in (C) and (D). In images (A) and (B), the pre-operative planning outline is superimposed. The blue arrow denotes post-operative placement, whereas the red arrow denotes pre-operative planning.

All participants completed a NASA Task Load Index (TLX) questionnaire assessing the use of AR, in addition to providing feedback in an open-ended questionnaire [26]. TLX results were used to generate raw TLX (RTLX) scores for the six domains, and weighted workload scores were subsequently generated [27].

Continuous data were examined for normality by reviewing histograms, quantile-quantile plots, and the Shapiro-Wilk test. Given the lack of normality and the repeated measurements, Wilcoxon signed-rank testing was used for centre-to-centre (C-C) and angular accuracy comparisons between the control and experimental arms. All analyses were performed using SPSS 26 (IBM Corp., Armonk, NY).
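The same paired comparison can be sketched with open-source tools (the study itself used SPSS); here with invented error values:

```python
# Wilcoxon signed-rank test on paired centre-to-centre errors (values invented).
from scipy.stats import wilcoxon

control = [3.1, 2.8, 4.0, 3.6, 2.9, 3.3, 4.2, 3.0]   # control arm errors (mm)
ar_arm = [2.4, 2.9, 3.1, 2.8, 2.6, 3.0, 3.5, 2.7]    # AR arm errors (mm)
stat, p = wilcoxon(control, ar_arm)                   # paired, non-parametric
print(stat, p)
```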

All methods were carried out in accordance with relevant guidelines and regulations. This study was approved by the Sunnybrook Health Sciences Centre Research Ethics Board (Project Identification Number: 3541). Informed consent was obtained from all subjects and/or their legal guardian(s) by way of the University of Toronto's Division of Anatomy Body Donation Program. All subjects consented to the publication of identifying images in an online open-access publication.

Go here to see the original:

Using augmented reality to guide bone conduction device ... - Nature.com

Mixed Reality Music Prototype Turns Spotify Into Vinyl – UploadVR

Freelance Creative Director Bob Bjarke, formerly of Meta, shared an amusing new mixed reality concept on Twitter centered around discovering new music and creating playlists with virtual records.

Bjarke shared footage of Wreckommendation Engine, a prototype experience he created during the Meta Quest Presence Platform Hackathon last week with Unity developers @RJdoesVR and Jeremy Kesten, 3D artist and prototyper Joe Kane, and immersive sound designer David Urrutia.

Wreckommendation Engine presents users with a virtual record player and crate of records, positioned on a real-life surface using mixed reality passthrough on Quest Pro. The user can grab records out of the crate and listen to them by placing them on the turntable. If you like the music, you can throw it against a designated nearby wall to save it. If you hate it, you can throw it against a different wall to smash it into pieces.

If you smash too many tracks, they will eventually come back to life as a killer robot made up of vintage electronics and hi-fi equipment. You can destroy it by throwing more records at it.

The experience integrates with Spotify and uses its API to present you with new tracks, take note of your preferences and compile your saved tracks into a playlist for later.
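The prototype's code isn't public, but the described flow maps onto Spotify's Web API in a straightforward way. A hedged sketch using the spotipy client (the playlist name, seed genre and "kept" tracks are all invented):

```python
# Hypothetical sketch of the described flow: pull recommendations to fill the
# record crate, then collect the tracks the user "saved" into a playlist.
# Requires Spotify API credentials in the environment; all specifics invented.
import spotipy
from spotipy.oauth2 import SpotifyOAuth

sp = spotipy.Spotify(auth_manager=SpotifyOAuth(scope="playlist-modify-private"))

# New tracks to present as records in the crate
crate = sp.recommendations(seed_genres=["funk"], limit=10)["tracks"]

# Pretend the user threw the first three at the "save" wall
kept = [track["uri"] for track in crate[:3]]

user_id = sp.current_user()["id"]
playlist = sp.user_playlist_create(user_id, "Wreckommendations", public=False)
sp.playlist_add_items(playlist["id"], kept)
```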

This is just a proof-of-concept prototype and a bit of fun, so it's unlikely to ever see the light of day for Quest users. Nonetheless, it's an amusing concept and a cool way to bring more physicality into music discovery in the age of streaming. In a follow-up tweet, Bjarke said that the team wanted to use the immersive tools of mixed reality to make a more fun and social music experience, given that formerly social activities like making mixtapes and burning CDs are now algorithmic utilities, done alone on a 2D screen.

Originally posted here:

Mixed Reality Music Prototype Turns Spotify Into Vinyl - UploadVR

Meta will release a new consumer-grade VR headset next year – TechCrunch

  1. Meta will release a new consumer-grade VR headset next year  TechCrunch
  2. Meta: the future of virtual reality  The Cryptonomist
  3. Meta confirms the next consumer Quest VR headset is coming in 2023  ZDNet
  4. Meta Quest Pro Begins Shipping Today  Virtual Reality Times
  5. Meta Wants To Guide The Future Of Work Through Virtual Reality, But Is It Sustainable?  Allwork.Space

Read more:

Meta will release a new consumer-grade VR headset next year - TechCrunch

The Global Augmented Reality & Virtual Reality In Manufacturing Market size is expected to reach $27.9 billion by 2028, rising at a market growth…

The Global Augmented Reality & Virtual Reality In Manufacturing Market size is expected to reach $27.9 billion by 2028, rising at a market growth of 27.9% CAGR during the forecast period  GlobeNewswire

See more here:

The Global Augmented Reality & Virtual Reality In Manufacturing Market size is expected to reach $27.9 billion by 2028, rising at a market growth...

Facial expressions could be used to interact in virtual reality – Science News for Students

When someone pulls on a virtual reality headset, they're ready to dive into a simulated world. They might be hanging out in VRChat or slashing beats in Beat Saber. Regardless, interacting with that world usually involves hand controllers. But new virtual reality (VR) technology out of Australia is hands-free. Facial expressions allow users to interact with the virtual environment.

This setup could make virtual worlds more accessible to people who can't use their hands, says Arindam Dey. He studies human-computer interaction at the University of Queensland in Brisbane. Other hands-free VR tech has let people move through virtual worlds using treadmills and eye-trackers. But not all people can walk on a treadmill. And most people find it a challenge to stare at one spot long enough for the VR system to register the action. Simply making faces may be an easier way for people with disabilities to navigate VR.

Facial expressions could enhance the VR experience for people who can use hand controllers, too, Dey adds. They can allow special interactions that we do with our faces, such as smiling, kissing and blowing bubbles.

Dey's team shared its findings in the April issue of the International Journal of Human-Computer Studies.


In the researchers new system, VR users wear a cap studded with sensors. Those sensors record brain activity. The sensors can also pick up facial movements that signal certain expressions. Facial data can then be used to control the users movement through a virtual world.

Facial expressions usually signal emotions. So Dey's team designed three virtual environments for users to explore. An environment called "happy" required participants to catch butterflies with a virtual net. "Neutral" had them picking up items in a workshop. And in the "scary" one, they had to shoot zombies. These environments allowed the researchers to see whether situations designed to provoke certain emotions affected someone's ability to control VR through expressions.

Eighteen young adults tested out the technology. Half of them learned to use three facial expressions to move through the virtual worlds. A smile walked them forward. A frown brought them to a stop. And to perform a task, they clenched their teeth. In the happy world, that task was swooping a net. In the neutral environment, it was picking up an item. In the scary world, it was shooting a zombie.
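In software terms, that control scheme is just a mapping from a classified expression to a locomotion command. A minimal sketch (ours, not the study's code; the classifier that labels the sensor data is assumed to exist elsewhere):

```python
# Mapping classified facial expressions to VR movement commands, as described:
# smile = walk forward, frown = stop, teeth clench = perform the scene's task.

ACTIONS = {
    "smile": "walk_forward",
    "frown": "stop",
    "clench": "perform_task",   # swoop net / pick up item / shoot zombie
}

def command_for(expression: str) -> str:
    """Translate an expression label into a movement command."""
    return ACTIONS.get(expression, "idle")  # unknown expressions do nothing

# Example stream of classified expressions from the sensor cap
for label in ["smile", "smile", "clench", "frown"]:
    print(command_for(label))
```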

The other half of participants interacted with the virtual worlds using hand controllers. This was the control group. It allowed the researchers to compare use of facial expressions with the more common form of VR interaction.

After some training, participants spent four minutes in each of the virtual worlds. After visiting each world, participants answered questions about their experience: How easy was their controller to use? How present did they feel in that world? How real did it seem? And so on.

Using facial expressions made participants feel more present inside the virtual worlds. But expressions were more challenging to use than hand controllers. Recordings from the sensor-laden cap showed that the brains of people using facial expressions were working harder than those who used hand controllers. But that could just be because these people were learning a new way to interact in VR. Perhaps the facial expression method would get easier with time. Importantly, virtual settings meant to trigger different emotions did not affect someone's ability to control their VR using facial expressions.

This research "pushes the boundary of hands-free interaction in VR," says Wenge Xu. He studies human-computer interactions at Birmingham City University in England. He was not involved with the study. "It's novel and exciting," he says. But further research is needed to improve the usability of facial expression-based input.

The researchers are planning more tests and improvements. For instance, everyone in this study was able-bodied. In the future, Dey hopes to test the tech with people who are disabled.

He also plans to explore ways to make interacting through facial expressions easier. That may involve swapping out the sensor-laden caps for some other face-reading technology. One idea: use cameras that capture face movements or facial gestures, he says. Or sensors could be embedded in the foam cushion of a VR headset. Dey imagines one day people will train VR systems using their own sets of expressions.

"Technology has helped me along the way [in a] world that wasn't necessarily made for me," says Tylia Flores. As a person with cerebral palsy, standard methods of interaction aren't always available to her. She feels the new technology could make VR more accessible for people whose physical movements are limited.

More:

Facial expressions could be used to interact in virtual reality - Science News for Students

How virtual reality can be used in poultry processing – Poultry World

The Agricultural Technology Research Programme at the Georgia Tech Research Institute is looking at ways to incorporate automation solutions into the challenging poultry processing environment which is beset by high turnover rates.

Food processing environments are often kept cold by design to prevent pathogen growth, but low temperatures and the physical demands of the job, coupled with Covid-19 outbreaks, have led to turnover rates of between 40 and 100% per year.

To address this, the Agricultural Technology Research Programme (ATRP) is exploring ways to combine virtual reality with factory-based robotics in certain poultry processing operations, such as cone loading, which involves putting chicken carcases that have had their legs and thighs removed onto a cone for further processing.

Konrad Ahlin, Georgia Tech Research Institute research engineer, said that while cone loading sounds easy, it isn't: "The problem is having a dedicated person doing that for extended periods. It's physically demanding on the person, and it's a menial, trivial task that's unfortunately just necessary."


ATRP's robotics solution would allow human workers to provide key information to robot systems performing the operation, all from a virtual reality environment. So far, attempts to fully automate common poultry processing operations have not been successful due to the birds' irregular and malleable shapes. But Ahlin believes this could change with virtual reality.

"Virtual reality is creating this bridge where information can intuitively pass between human operators and robotic devices in a way that hasn't been possible before," he said.

ATRP has filed a provisional patent for its virtual reality research and is also working with the Georgia Research Alliance to develop a commercialisation roadmap for the technology.

Gary McMurry, a Georgia Tech Research Institute principal research engineer, said virtual reality's potential could be transformative: "There are lots of reasons that this technology could have a big impact on manufacturing, which is struggling with finding people to do jobs. With this job you could be sitting in West Virginia, put on a VR headset and work from the comfort of your own home. You're no longer tied to geography, and that's really powerful."

See the rest here:

How virtual reality can be used in poultry processing - Poultry World

WIMI Hologram Academy: The application of virtual reality technology in balance function disorder after cerebral apoplexy – GlobeNewswire

HONG KONG, April 27, 2022 (GLOBE NEWSWIRE) -- WIMI Hologram Academy, working in partnership with the Holographic Science Innovation Center, has written a new technical article describing their exploration of their application of applying VR technology into the treatment for stroke patients' rehabilitation. This article follows below:

VR technology is gradually expanding its fields of application owing to its rapid development. The integration of VR technology and the treatment in balance function disorder, compared with conventional treatment, demonstrates explicit advantages in the aspects of life quality, feedback and outcomes, thus offering a promising future.

Cerebral apoplexy, also known as stroke or cerebrovascular accident (CVA), refers to an acute cerebrovascular disease lasting more than 24 hours, characterized by acute or focal brain dysfunction caused by any of various vascular causes (including hemorrhage and ischemia). Typically it includes cerebral hemorrhage, cerebral infarction, subarachnoid hemorrhage and other diseases. Scientists from WIMI Hologram Academy of WIMI Hologram Cloud Inc. (NASDAQ: WIMI) have studied the application of virtual reality technology in balance function disorder after cerebral apoplexy.

Virtual reality technology uses a computer to simulate the real world and allows the user to experience the virtual world. Compared with traditional rehabilitation training, virtual reality technology is fun and safe and can greatly mobilize the patient's initiative, giving it a competitive advantage in the rehabilitation of balance function and providing a new treatment for patients' rehabilitation.

1. Characteristics of balance function disorder after cerebral apoplexy

As one of the important physiological functions of the human body, balance is related to posture, transfer and motion control. From a mechanical perspective, balance means that an object is subjected to equal forces from all directions so that it can maintain a stable state. The balance of the human body is more complex than the balance of objects in nature. Balance is the ability of the body to maintain an upright standing posture in different environments and to adjust and maintain itself automatically when moving or subjected to external forces, including reactions such as balancing, protective stretching, striding and jumping. The human body depends on balance in daily life, movement and work, and the maintenance of balance is mainly influenced by the vestibular system, proprioception and vision, as well as the brain's balance reflex regulation and muscle strength. Impaired balance is often characterized by unstable sitting and standing, reduced ability to transfer and walk, and increased risk of falls, which can seriously affect daily life and quality of life. Statistics show that in recent years the disability rate of stroke has been high, and about 83% of stroke patients suffer from balance function disorder, which greatly increases the burden on families and society. Since the improvement of balance is fundamental to the recovery of movement function and daily activities, balance rehabilitation is particularly important.

2. The application of VR technology in the rehabilitation of balance function disorder after cerebral apoplexy

2.1 Comparison between VR technology and conventional rehabilitation treatment

Conventional rehabilitation methods for balance function disorder mainly involve the use of manual techniques or rehabilitation equipment for gravity-center transfer and movement control exercises in different postures, as well as strengthened sensory input through mirror therapy. Techniques such as suspension, core training, and neuromuscular facilitation focus on enhancing core stability and training trunk control, gravity-center transfer, and overall movement function. Neurophysiological treatments include movement relearning training methods, proprioceptive neuromuscular facilitation methods, and so on. Although these methods help improve patients' balance and movement functions to some extent, the therapeutic effects depend on the therapist's experience and skill level. Traditional Chinese rehabilitation therapies such as Taijiquan, Baduanjin and Wuqinxi lack the necessary feedback during training and are prone to causing abnormal movements and injuries. Equipment-assisted therapies, such as transcranial magnetic stimulation and balance apparatus, feature repetitive single movements and lack fun and objectivity, which makes it difficult to mobilize patients' enthusiasm and initiative.

Virtual reality technology uses computers to synthesize 3D environment models and applies these models to create and experience virtual worlds. It is a system simulation that fuses multi-source information with interactive 3D dynamic visuals and physical behavior, allowing users to enter virtual space and interact with objects in the virtual world with the help of scene displays, force/tactile sensing devices, position trackers and other devices to create a realistic sense of being there. Virtual reality training mainly combines sensory interference training and dual-task training to enhance the integration of vestibular sensory information and the organization of the vestibular center in stroke patients with hemiplegia, thus promoting postural stability. Studies indicate that by combining virtual reality rehabilitation exercises with conventional rehabilitation training, patients' limb function, neurological function, balance function and quality of life can be effectively improved.

2.2 The application of VR in the rehabilitation of stroke

VR can provide various virtual environments for patients, creating immersive feelings and experiences, while also providing technical means for rehabilitation training built on key factors such as repetitive exercises, real-time feedback, and motivation maintenance. Through repetitive active training in various environments and movement relearning, it helps accelerate the establishment of collateral circulation and improves neurological deficits, balance function and overall movement function. One study indicated that when subacute stroke patients received treatment combining VR technology with conventional rehabilitation training for canoe-paddling exercises, 30 minutes per day and three times per week, significant improvement was seen in the center of pressure during standing balance and in sway path, with better postural balance after 5 weeks of training.

In another study, chronic-phase stroke patients who underwent treadmill training in a community-based virtual environment for 30 minutes per day, three times per week, showed significant improvements in limb swing speed and in anterior-posterior and overall sway length after 5 weeks of training. The study also revealed that the use of VR technology can significantly improve Berg Balance Scale scores in patients with chronic stroke, improving both static and dynamic balance. Other studies have demonstrated that applying virtual reality-based training on top of conventional balance rehabilitation, for 30 minutes per day, 3 times per week for 6 weeks, to patients recovering from stroke with balance function disorder, with visual and auditory feedback, enabled them to shift their weight to the left and right sides. All these results reveal that patients given VR technology-related treatment can see a significant boost in balance function and postural control.

In summary, virtual reality technology is a means of integrating visual input and movement output, which can be widely used in the recovery of functional impairment after stroke and is effective in improving the balance function and the ability of daily activities of stroke patients. VR balance rehabilitation training, based on conventional rehabilitation therapy, is more vivid and interesting, and is conducive to better overall clinical outcomes.

Founded in August 2020, WIMI Hologram Academy is dedicated to holographic AI vision exploration and conducts research on basic science and innovative technologies, driven by human vision. The Holographic Science Innovation Center, in partnership with WIMI Hologram Academy, is committed to exploring the unknown technology of holographic AI vision, attracting, gathering and integrating relevant global resources and superior forces, promoting comprehensive innovation with scientific and technological innovation as the core, and carrying out basic science and innovative technology research.

Read this article:

WIMI Hologram Academy: The application of virtual reality technology in balance function disorder after cerebral apoplexy - GlobeNewswire

Peeka and HarperCollins Children’s Books Team Up on a Virtual Reality Licensing Deal – GlobeNewswire

SEATTLE, April 27, 2022 (GLOBE NEWSWIRE) -- Peeka, the world's first platform for virtual reality (VR) children's books and content, has teamed up with HarperCollins Children's Books to bring beloved storybooks to life in virtual reality.

What's in store?

Peeka's fully immersive experiences use mobile phones and are accessible to families of all backgrounds. With a simple cardboard or plastic VR headset, families can jump into the pages of storybooks and let the stories happen to them.

Peeka's studio in Seattle, WA, has already begun production on HarperCollins Children's Books' I Want to Be a Doctor by Laura Driscoll, illustrated by Catalina Echeverri, and will soon start preproduction on Zuri Ray Tries Ballet by Tami Charles, illustrated by Sharon Sordo, and the Christmas classic Peppermint Post by Bruce Hale, illustrated by Stephanie Laberis. These experiences are slated to hit the Peeka app later this year.

Why is this important?

In a recent Project Tomorrow research survey, 75 percent of parents and 71 percent of teachers expressed that effective use of technology is very important for the future success of students in a post-pandemic world.

In a screen-dominated world, Peeka helps bring kids back to books and reading using devices they love, with content that's comfortable and delightful for every family to dive into together.

Further, Peeka opens the doors to new mediums that publishers and authors can explore with their IP. For VR, this licensing deal fosters an understanding of how the VR ecosystem can contribute to building a love of book content.

Michael Wong, Peeka CEO, said: "I'm excited and honored to team up with HarperCollins in the wonderful world of immersive kids' entertainment. This is a milestone for Peeka, and for the VR industry."

Rachel Horowitz, Senior Director, Subsidiary Rights, HarperCollins Children's Books, said: "We are delighted that Peeka will be bringing three of HarperCollins Children's Books titles to life in a new and innovative way, and we are excited to be in this space with them."

ABOUT PEEKA

Peeka, a VR startup based in Seattle, WA, is the first and largest kid's VR company, primarily focusing on picture book-related and other educational, kid-friendly content to help motivate children to find a passion for learning and reading. A majority of Peeka's immersive content deals with important topics such as diversity, empathy, race, mindfulness, gender, and more. Find out more at peekavr.com.

ABOUT HARPERCOLLINS CHILDREN'S BOOKS:

HarperCollins Children's Books is one of the leading publishers of children's and teen books. Respected worldwide for its tradition of publishing quality, award-winning books for young readers, HarperCollins Children's Books is a division of HarperCollins Publishers, which is the second-largest consumer book publisher in the world, has operations in 17 countries, and is a subsidiary of News Corp (NASDAQ: NWS, NWSA; ASX: NWS, NWSLV). You can visit HarperCollins Children's Books at http://www.harpercollinschildrens.com and http://www.epicreads.com and HarperCollins Publishers at corporate.HarperCollins.com.

Related Images

Image 1: Peeka in the Library

Two students in Seattle enjoying immersive storybooks.

This content was issued through the press release distribution service at Newswire.com.

View post:

Peeka and HarperCollins Children's Books Team Up on a Virtual Reality Licensing Deal - GlobeNewswire

Using Virtual Reality To Bridge Gaps In Nursing – Texas A&M University Today

VR simulations provide immersive experiences for students to hone their skills before working with patients.

Keith Mitchell Photography

Before nursing students at Texas A&M University ever enter a clinic or engage in face-to-face clinical scenarios, they already have hours of experience interacting with patients. They're accomplishing this through new, innovative virtual reality (VR) simulations developed by their professors.

Since early 2020 (pre-pandemic), a team at the Texas A&M College of Nursing has been working on integrating VR simulations into the curriculum to help bridge the gap between classroom and clinic. They have launched two simulations so far that provide immersive experiences for students to hone their skills before working with real patients.

"VR simulation is in that area that we call a safe container," said Elizabeth Wells-Beede, clinical assistant professor at the College of Nursing. "We're all human and mistakes are going to be made. This is a place where we hope to create that psychologically safe environment for mistakes to be made, where we as the experts can help walk the students through the processes, and then they take that experience into practice and not make the mistake with a real-life patient."

Clinical simulation is not new. It has been used in nursing education for many years and allows students to apply the theory they've learned from books and the skills they've learned in labs (such as checking vital signs, inserting IVs and conducting evaluations) to patient scenarios they could encounter in a clinical setting. In a traditional simulation, a student is presented with a standardized patient (a trained actor), a mannequin or a computer-based program, to name a few options. The student must work through the case presented to them by reading the patient's chart, interviewing the patient and conducting an examination to decide what action to take.

Virtual reality is a new, emerging form of clinical simulation that provides more accessible and immersive experiences that don't require learners to travel to clinical settings, easing the increasing burden on clinical practice partners to place learners.

The technology used at Texas A&M is being developed in close collaboration with Jinsil Hwaryoung Seo, associate professor and director of the Institute for Applied Creativity at the Texas A&M College of Architecture. Nursing faculty write the clinical scenarios and then work with Seo and her students to turn those scenarios into immersive, virtual reality experiences.

To access the simulated world, nursing students put on VR headsets that transport them into a virtual setting that can be a clinic, home or school. There, they meet with a virtual patient and work through their case to make a decision while their instructor observes and provides feedback.

"I am convinced VR is the future of simulation," said Cindy Weston, associate dean for clinical and outreach affairs at the College of Nursing. "This is an immersive platform that's deeper than what we've been able to do in the past with simulation in the other variety of forms it takes. Student learners feel like they're in the environment, and it's a safe space for them to hone and develop skills with faculty guidance and feedback."

Currently, the College of Nursing has applied VR simulation in two areas: Screening, Brief Intervention and Referral to Treatment (SBIRT), and forensic nursing. SBIRT is an approach that health care providers use to quickly recognize when a patient uses drugs and/or alcohol in risky ways so that they can provide brief intervention and refer them to specialty care if more extensive treatment is needed. Forensic nurses are professionally trained to treat victims of violence through patient-centered, trauma-informed care. Both SBIRT and forensic nursing involve patients in vulnerable situations that require highly competent, compassionate and experienced care providers.

"We know that when confidence is high, nurses' performance, retention, and their ability to perform the skill is high," Weston said. "VR simulation builds their confidence and then we're able to assess their competence before they head into the clinical setting."

The SBIRT VR simulation has been in use for about a year. It is currently instructor-guided, meaning that when students are interacting with the patient inside the virtual world, an instructor monitoring the simulation from the outside answers on behalf of the patient. The team is working on taking this to the next level and is currently developing an artificial intelligence capability for the platform.

The first forensic nursing VR simulation, which launched last month, is self-contained. In it, nurses complete a number of tasks to learn how to conduct a sexual assault examination. The goal is to help them become comfortable performing the exam before working with a live standardized patient.

"We have had an overwhelming excitement with all of it," said Stacey Mitchell, DNP, MBA, MEd, RN, SANE, FAAN, clinical professor and director of the Texas A&M Health Center of Excellence in Forensic Nursing. "Most of the students, every time they put on the VR headset, they say, 'Oh my gosh, this is so amazing!' They are thrilled and excited that we're bringing this to them."

The VR simulations are not only designed to bridge gaps inside nursing school. As part of three Health Resources and Services Administration (HRSA) grants, they are helping to bridge gaps in rural and medically underserved areas as well. Specific areas of focus include mental health, chronic disease management, medication management, postpartum care, and forensic nursing.

"We're trying to meet the need for those areas, and that's really not where the big VR companies are. They're looking at the acute care setting, not the ambulatory care setting where we are," Wells-Beede said. "Although this whole VR world is building up around us, we are doing something in between that's going to meet the need for rurally underserved areas. This is where I feel our niche is; we are a land-grant institution and we're giving back to our community by doing these simulations that can actually be brought into the community setting."

The VR simulation team at the College of Nursing has been selected to receive a team Innovation Award from Texas A&M Technology Commercialization. This award recognizes individuals whose research exemplifies the spirit of innovation within The Texas A&M University System. Wells-Beede, Weston and Mitchell, along with Angela Mulcahy, PhD, RN, CMSRN, CHSE, will be presented with the award at the Patent and Innovation 2022 Awards luncheon on April 22.

Read the original here:

Using Virtual Reality To Bridge Gaps In Nursing - Texas A&M University Today

$5M gift to revolutionize retail, establish virtual reality learning lab at UArizona – University of Arizona News

By Rosemary Brandt, College of Agriculture and Life Sciences

Today

Supply chain woes, environmental sustainability, automation, information security, workforce retention and talent recruitment: these are just a handful of the issues facing the retail sector today. To address these and future retail challenges, a $5 million gift from Terry and Tina Lundgren will fuel innovative research and student opportunities in business, retail and consumer sciences at the University of Arizona.

"People often dismiss the importance of retailing, until retail is disrupted," said Laura Scaramella, head of the UArizona John and Doris Norton School of Family and Consumer Sciences. "Think about what happened during the COVID-19 pandemic; what other industry turned so quickly, on a dime, to meet the needs of consumers?"

The gift announcement coincides with the 26th annual UArizona Global Retailing Ideas Summit, which began Wednesday at the university's Student Union Memorial Center. Among conference presenters are senior executives from retailing, consumer brand and technology companies such as Macy's, Best Buy, Nike, PetSmart, and Levi Strauss and Company.

The gift will establish an endowed faculty chair in both the Eller College of Management and the Norton School of Family and Consumer Sciences to help meet the industry's demand for graduates who embrace change and spark innovation in retailing companies of all sizes. It also will provide enhanced scholarship opportunities for community college transfer students.

"I had to work full time and go to school full time. I was the only one out of six kids who went to college. I was desperate to figure out how to graduate because I wanted a better life. I wanted more opportunities," Lundgren said. "There are young people in community college in the same exact situation. If we just give them the opportunity, they will exceed everyone's expectations."

Lundgren, who retired as executive chairman of Macy's, Inc., in 2018, served as the retail company's CEO for 14 years. Recognized as a global leader both in stores and online, Lundgren also twice served as chairman of the National Retail Federation, the industry's leading voice. He is a longtime supporter of the university's retailing program and helped establish the Terry J. Lundgren Center for Retailing, housed in the College of Agriculture and Life Sciences Norton School of Family and Consumer Sciences.

"Terry's success and ongoing support have put the University of Arizona on the map as a destination for students who hope to work in retailing, and as a talent pipeline for the top brands who join us every year for the Global Retailing Ideas Summit," said UArizona PresidentRobert C. Robbins. "This new gift from Terry and Tina is pivotal to the future of a changing industry, and I am so grateful for their longstanding partnership."

The gift comes at a good time for the retailing program, said Lance Erickson, a consumer psychologist and associate professor of practice in the Norton School of Family and Consumer Sciences.

"In the last couple years, we've really refreshed the curriculum, rethinking where retailing is and where it's moving to in the future," Erickson said. "The pandemic revealed so many changes, in terms of how retailers incorporate new technologies and pivot to deliver necessary services. This gift allows us to be at the forefront of cutting-edge technology."

"Terry has always been at the forefront of retailing," said John-Paul Roczniak, president and CEO of the University of Arizona Foundation. "As an alumnus, he has been incredibly generous with his time and philanthropy, funding advances in research and practice that have helped build an incredible pipeline of Wildcat talent. This new gift will create campus partnerships and power innovation."

Part of the Lundgrens' contribution will be used to create a retail learning laboratory, equipped with the latest in virtual reality technology, eye-tracking and heat-sensing software, cameras, and display hardware to allow students to get hands-on practice in a variety of retailing scenarios.

"Biology students have labs where they can practice their craft and perform experiments to learn about biological processes. Retailing students need to be able to do the exact same thing," Scaramella said. "If students are working on product display, they can actually set up a shop, create virtual product, use different display strategies and then use eye-tracking software to track where people are visually attending."

The retailing laboratory will also allow retail and marketing researchers to study consumer behaviors, perceptions and technological adoption. An urgent demand in the retail sector is contactless point of sale, where a store is equipped with optical scanners, and products carry enhanced barcodes to track the items a consumer walks out with and automatically bill their preferred method of payment or credit card on file, Erickson explained.
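As a sketch of the flow being described (our illustration; no specific vendor system is implied, and the barcodes and prices are invented):

```python
# Toy model of contactless point of sale: exit scanners log each enhanced
# barcode a shopper carries out, and the total is billed to the card on file.
from dataclasses import dataclass, field

CATALOGUE = {"0001": 3.99, "0002": 12.50}  # barcode -> price (invented)

@dataclass
class Shopper:
    card_on_file: str
    scanned_items: list = field(default_factory=list)

def bill_on_exit(shopper: Shopper) -> float:
    """Charge everything the exit scanners recorded to the stored card."""
    total = sum(CATALOGUE[code] for code in shopper.scanned_items)
    print(f"Charged ${total:.2f} to card {shopper.card_on_file}")
    return total

shopper = Shopper(card_on_file="**1234", scanned_items=["0001", "0002"])
bill_on_exit(shopper)
```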

"To me, the question is: Are consumers going to adopt something like this? How do we educate consumers about how contactless point of sale works and help consumers overcome their hesitancy to walk out of a store without having physically paid for something or checked out?" Erickson said. "This will genuinely be a lab space where we can do research with real consumers."

In addition, the Lundgrens' contribution will fund collaborative research to identify and address other emerging and future retail challenges. As part of the summit announcement, the newly established Lundgren Retail Collaborative issued an open call for proposals from researchers across scientific disciplines to answer the question: What are the big hurdles retailers and companies should be addressing now to better prepare for the future?

"The goal of the Lundgren Retail Collaborative is to build a world-class hub, right here on the University of Arizona campus, that drives retail education, research and practice," said Yong Liu, marketing department head and Robert A. Eckert Endowed Chair in the Eller College of Management.

Business and consumer science students will play a role in the collaborative's research efforts as well. Each year, students will be encouraged to apply for fellowship opportunities and form teams spanning fields of study such as fashion, fine art, engineering, psychology or information technology to address the issues facing the retail sector.

Selected student teams will be provided with mentorships, professional development training, and the opportunity to present their solutions to top-level industry professionals as part of the annual Global Retailing Ideas Summits.

"If we're teaching students to really lead innovative change, they need to be able to come up with fresh ideas and learn to work outside of their own discipline," Scaramella said. "Giving students an opportunity to work in interdisciplinary teams now is only going to help them in the future."

Read the original here:

$5M gift to revolutionize retail, establish virtual reality learning lab at UArizona - University of Arizona News

Chinese start-up Nreal is launching its augmented reality glasses in the UK this spring – CNBC

Attendees look at NReal's augmented reality glasses, on the last day of CES 2019 in Las Vegas, Nevada.

Robyn Beck | AFP | Getty Images

Nreal, a Chinese augmented reality start-up, is planning to bring its smart glasses to the U.K.

The Beijing-based company said Tuesday it will launch its Nreal Air AR glasses in Britain later this spring through an exclusive deal with local carrier EE, which is owned by telecoms group BT.

Nreal's glasses allow users to watch movies or play games on large virtual displays. Users can do so by connecting the glasses to their smartphone through a cable. They're designed to look like sunglasses, similar to Snap's Spectacles line of smart glasses.

The Nreal Air has two main modes: "Air Casting" and "MR Space." Air Casting lets users view their phone screen on a 130-inch virtual display when standing four meters away, while MR Space combines digital objects with a user's surrounding space.
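
A quick back-of-the-envelope check, not taken from Nreal's spec sheet, shows the geometry is plausible: a 130-inch diagonal viewed from four meters subtends roughly 45 degrees, in line with the fields of view commonly cited for consumer AR glasses:

```python
# How large does a 130-inch display appear from four meters away?
import math

diagonal_m = 130 * 0.0254          # 130-inch diagonal in meters (~3.30 m)
distance_m = 4.0
angular_size = 2 * math.degrees(math.atan((diagonal_m / 2) / distance_m))
print(f"{angular_size:.1f} degrees")  # ~44.9 degrees diagonal
```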

Nreal did not give an exact release date or price for the device. A spokesperson said more details will be revealed "at a later date."

Nreal is one of countless companies hoping to bring augmented reality, which blends three-dimensional digital objects with the real world, to a more mainstream audience. The tech has been around for years but, like virtual reality, it has struggled to find commercial success.

Now, with the tech world abuzz with talk about the so-called "metaverse," technologies like AR and VR have been given a new lease on life. Companies like Microsoft and Facebook, or Meta as it's now known, want to build vast digital worlds in which millions of users can interact and transact with one another.

Peng Jin, co-founder of Nreal, said he believes AR "will start a revolutionary transformation just as the internet once did."

"AR will transcend the current mobile experience, especially when it comes to watching videos, exercising, and playing PC and cloud video games," he added.

Founded in 2017, the company has created two AR headsets to date: the Nreal Light and Nreal Air, the latter of which it debuted last year. The company has raised over $230 million to date from investors including Alibaba, Nio and Sequoia Capital China. It was most recently valued at $700 million.

See the rest here:

Chinese start-up Nreal is launching its augmented reality glasses in the UK this spring - CNBC

What Is the Metaverse? – Government Technology

The 1992 cyberpunk novel Snow Crash introduced the word "metaverse," a term now bandied about in technology circles and laden with high hopes of redefining the Internet.

This space is still forming, and gaps remain between the visions being woven about what a metaverse-infused future could look like and what exists today.

While definitions vary over what, exactly, constitutes a metaverse, they are commonly described as immersive, persistent, interactive digital environments, and often ones that mimic the real world to some extent.

People today commonly access the Internet as text and visuals on a flat screen, but metaverses aim to more heavily use augmented reality (AR) and virtual reality (VR) to create a 3D experience. And just like activities on social media platforms or in massively multiplayer online (MMO) games continue regardless of whether a particular user is logged in, metaverses, too, would be continuous and always on.

Metaverses are also expected to use blockchain technologies like cryptocurrencies and non-fungible tokens (NFTs) to support digital transactions.

Ensuring the various metaverses are interoperable will also be key to achieving the full vision, he said.

Tech Companies

Firms trying to create metaverses have presented their own visions of what they are building.

Meta CEO Mark Zuckerberg said in a 2021 public letter that the metaverse is the next chapter for the Internet, characterized by its immersive nature. His firm's Horizon Worlds, a gaming and (per the company) social VR experience, is seen as a metaverse project.

"The metaverse will be an embodied Internet where you're in the experience, not just looking at it. The defining quality of the metaverse will be a feeling of presence, like you are right there with another person or in another place," Zuckerberg wrote. He said this hypothetical new kind of Internet experience would enable activities like virtual gatherings, playing, e-commerce, design, work and more.

The e-commerce element will rely on technologies like cryptocurrencies and NFTs, while tools like AR, VR, phones and computers would enable different levels of digital immersion and functionalities, he said.

In its blog, computing platform company NVIDIA defines the metaverse as a shared virtual 3D world, or worlds, that are interactive, immersive, and collaborative. Another feature is that digital goods and avatars should persist and be transportable among different virtual platforms and, thanks to AR, between platforms and the physical world. The company offers Omniverse, a 3D simulation and design platform intended to be used by metaverse developers.

Public Sector

Some state and local governments have been watching the space as well and offer their own views.

Utah Chief Technology Officer Dave Fletcher told GovTech that the metaverse is "basically a virtual three-dimensional environment where people can socialize and interact and perform transactions" that replicates a real or unreal universe.

The National League of Cities (NLC) said in a recent report that, while there is no agreed-upon definition, "generally speaking, the metaverse is the next evolution of the Internet that will further integrate physical and digital experiences." For some, it's an online space that digitally recreates the real world. For others, it is a shift in how people interact with their world, using technologies like 3D computing, augmented reality, virtual reality and blockchain to form new immersive virtual world experiences where digital information can be overlaid on our physical world.

The Verge reports that computing power will need to advance, for one. As of December 2021, Horizon Worlds could support 20 simultaneous users in a virtual space, and even persistent, interactive platforms that avoid VR, and thus its greater computing requirements, still faced strict limits. The games Fortnite and Battlefield 2042, for example, reportedly could support only 100-128 simultaneous players.

Interoperability remains to be achieved as well, and plenty of questions remain over how or whether policymakers will try to regulate aspects of the space, such as commerce and taxation, privacy, hate speech and disinformation.

Find out what metaverses mean for the public sector in part 2 of this series.

Read more here:

What Is the Metaverse? - Government Technology

With virtual reality, my grandmother tells her story of surviving the Holocaust – Chicago Sun-Times

My grandmother, Fritzie Fritzshall, survived Auschwitz.

Czechoslovakian-born, my grandmother was just 13 years old when she was pushed out of a boxcar and into hell. She was torn from her mother and her two brothers, all of whom were murdered. She survived because of the selfless acts of others.

The man who pulled her from the train, a Jewish prisoner, whispered in her ear: "You're 15. Remember, you're 15." When the guards asked her age, she told them she was 15. They sorted her into the line of people who would live another day. The other line led straight to the gas chambers.

Toward the end of the war, my grandmother was transferred to a sub-camp of Auschwitz where she was one of 600 women working as a slave in a factory. Their sole meal for the day was a small chunk of bread. Every night, each of the other 599 women would give her, the youngest, a crumb of their bread in the hope she would survive. Together, the crumbs were the size of a large marble. She promised that if she made it out, she would tell the world what happened.

My grandmother spent the rest of her life fulfilling that promise.

She was dedicated to educating future generations about the Holocaust and helped found the Illinois Holocaust Museum and Education Center in response to a planned neo-Nazi march in Skokie in the late 1970s.

As a Holocaust survivor, she wanted people to hear her story, and even to be able to picture it and understand it as though she were taking them on a personal tour of the notorious concentration camp. She worried that Auschwitz might be left to decay or redeveloped, and that the stories of so many who didn't survive would be lost forever.

Thus, she became a driving force behind The Journey Back, a virtual reality experience with Holocaust survivors that just opened to the public at Illinois Holocaust Museum.

This project was her final contribution to the world, though sadly she didn't live to see her vision come to fruition. She died just months before the opening.

Shortly after her funeral, where I said goodbye to a woman who deeply impacted who I am today, I put on a VR headset. My grandmother's voice immediately echoed in my ears. I walked with her outside her childhood home. I joined her as she was forced into a claustrophobic cattle car. I stood by her as she recalled how, as a teenager, she saw the concentration camp and all its horror before her.

Fritzie Fritzshall, one of the founders of the Illinois Holocaust Museum.

I had heard my grandmother's story many times. This was the first time that I was immersed in it. Now, for years to come, people from all over the world will be able to put on a pair of virtual reality goggles and find themselves in present-day Auschwitz.

These stories, and the visceral way in which they are told, will have a major impact on the way Holocaust education is taught. While Holocaust education is required in multiple states nationwide, more needs to be done.

Stories like my grandmother's are essential in humanizing the Holocaust. The Journey Back allows everyone, especially students, to grasp the dark realities of history and understand that hatred and violence are never the answer.

Without learning and contextualizing history, the past can indeed be repeated. It was my grandmother's hope that this new application of technology would inspire the next generation not just to remember, but to develop empathy and speak out against injustice and hate.

My grandmother endured some of the most horrific conditions in history and never lost hope. We must decide not to let hate conquer. We must hear stories like hers and remember we can make a difference, even with small actions. Each one of us has agency and can use it to make a difference in the world. My grandmother has fulfilled her promise. She has told her story, and through it, the stories of many others who couldnt tell theirs. Now its up to us to listen.

Scott Fritzshall is a Los Angeles-based tech entrepreneur and the grandson of the late Holocaust survivor Fritzie Fritzshall, one of the founders of the Illinois Holocaust Museum and a driving force behind the museum's new virtual reality experience.

Send letters to letters@suntimes.com

See the original post:

With virtual reality, my grandmother tells her story of surviving the Holocaust - Chicago Sun-Times

Virtual Reality Therapy Promising for Agoraphobia – Medscape

A novel virtual reality (VR) intervention significantly reduces agoraphobia in patients with psychosis, new research suggests.

The cognitive behavioral therapy-based treatment was particularly effective for patients with the highest level of avoidance of everyday situations.

"Virtual reality is an inherently therapeutic medium which could be extremely useful in mental health services," study investigator Daniel Freeman, PhD, DClinPsy, professor of clinical psychology, University of Oxford, United Kingdom, told Medscape Medical News. "This intervention is coming; the question really is when."

The study was published online April 5 in The Lancet Psychiatry.

Immersive VR involves interactive three-dimensional computer-generated environments that produce the sensation of being in the real world.

For patients with psychosis, dealing with the real world can be an anxious experience, particularly if they experience verbal or auditory hallucinations.

Some may develop agoraphobia and start to avoid places or situations. A virtual environment allows patients to practice dealing with situations that make them anxious or uncomfortable and to learn to reengage in everyday situations.

The study included 346 patients diagnosed with schizophrenia or a related disorder. The mean age of the patients was 37.2 years (67% men, 85% White). Most were single and unemployed. All were receiving treatment for psychosis and had difficulty going out because of anxiety.

The researchers randomly assigned 174 participants to an automated VR cognitive therapy intervention (gameChange) plus usual care and 172 to usual care alone. Trial assessors were blinded to group allocation.

The gameChange intervention was delivered in six sessions that were conducted over a 6-week period. Each session involved 30 minutes of VR.

A session begins when participants enter the virtual therapist's office, where they are met by a coach who guides them through the therapy. They can choose from among six VR social situations: a cafe, a general practice waiting room, a pub, a bus, opening the front door of their home onto the street, or entering a small local shop.

Each scenario has five levels of difficulty that are based on the number and proximity of people in the social situation and the degree of social interaction. Users can work their way through these various levels.
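
The structure described above, six scenarios crossed with five graded exposure levels, can be modeled in a few lines of code. The sketch below is purely illustrative; the field names and numeric steps are assumptions, not gameChange's actual design:

```python
# Illustrative model of a graded-exposure scenario ladder (not gameChange's code).
from dataclasses import dataclass

SCENARIOS = ["cafe", "GP waiting room", "pub", "bus", "front door", "local shop"]

@dataclass
class DifficultyLevel:
    level: int            # 1 (easiest) to 5 (hardest)
    num_people: int       # crowd size grows with difficulty (assumed scaling)
    proximity_m: float    # virtual characters stand closer at higher levels
    interaction: str      # "none", "brief", or "sustained" social contact

def levels_for(scenario: str) -> list[DifficultyLevel]:
    """Generate five graded exposure levels for one scenario."""
    return [
        DifficultyLevel(
            level=i,
            num_people=2 * i,
            proximity_m=max(0.5, 3.0 - 0.5 * i),
            interaction=["none", "none", "brief", "brief", "sustained"][i - 1],
        )
        for i in range(1, 6)
    ]

for lvl in levels_for("pub"):
    print(lvl)
```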

The virtual sessions took place in patients' homes in about 50% of cases; the remainder were conducted in the clinic. A mental health worker was in the room during the therapy.

Between virtual sessions, participants were encouraged to apply what they learned in the real world, for example, by spending time in a pub.

Usual care typically included regular visits from a community mental health worker and occasional outpatient appointments with a psychiatrist.

The primary outcome was the eight-item Oxford Agoraphobic Avoidance Scale (O-AS) questionnaire. This scale assesses distress and avoidance related to performing increasingly difficult everyday tasks.

The researchers assessed patients at baseline, at the conclusion of the 6-week treatment, and at 26 weeks.

Compared with the group that received usual care alone, the VR therapy group demonstrated a significant reduction in both agoraphobic avoidance (O-AS adjusted mean difference, -0.47; 95% CI, -0.88 to -0.06; Cohen's d, -0.18; P = .026) and distress (-4.33; 95% CI, -7.78 to -0.87; Cohen's d, -0.26; P = .014) at 6 weeks.

This translates to being able to do about 1.5 more activities on the O-AS, such as going to a shopping center alone, said Freeman.
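
For readers less familiar with the effect-size notation, Cohen's d is a mean difference divided by a standard deviation, so the avoidance figures above imply a standardizing O-AS spread of roughly 2.6 points (the paper's exact SD basis is not restated here):

```python
# Cohen's d = mean difference / standard deviation, so the reported avoidance
# effect implies the SD used for standardization.
adjusted_mean_diff = -0.47   # O-AS avoidance, VR therapy vs usual care
cohens_d = -0.18
implied_sd = adjusted_mean_diff / cohens_d
print(f"Implied O-AS standard deviation: {implied_sd:.1f} points")  # ~2.6
```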

Further analyses showed that VR therapy was especially effective for patients with severe agoraphobia. On average, these patients could complete two more O-AS activities at 26 weeks, said Freeman.

The authors believe the intervention worked by reducing fearful thoughts and defense behaviors, such as avoiding eye contact.

There was no significant difference in the occurrence of adverse events between the study groups. These events, which were mild, transient, and did not affect the outcome, included side effects such as claustrophobia when using headsets.

The intervention would likely work for patients with agoraphobia who do not have psychosis, said Freeman. "Agoraphobia is often the final common pathway in lots of mental health conditions."

Automated VR not only addresses the problem of patients being too afraid to leave home for in-person treatment but may also help address the shortage of trained mental health care providers.

The intervention is currently available at pilot implementation sites in the UK and a few sites in the US, he said.

Commenting on the research for Medscape Medical News, Arash Javanbakht, MD, associate professor (clinical scholar), Wayne State University, Detroit, described the study as "cool and interesting."

However, he said, the findings were not surprising, because exposure therapy has been proven effective in treating phobias. Because of the significant lack of access to exposure therapy providers, "the more mechanized, the more automated therapies that can be easily used, the better," he said.

He noted the VR therapy did not require a high level of training; the study just used peer support staff who sat next to those using the technology.

He also liked the fact that the intervention "focused on things that in reality impair a person's life," for example, not being able to go to the grocery store.

However, he wondered why the investigators studied VR for patients with psychosis and agoraphobia and not for those with just agoraphobia.

In addition, he noted that the treatment's efficacy was partly due to having someone sitting next to participants and offering support, something the control group didn't have.

Javanbakht has researched augmented reality (AR) for delivering exposure therapy. This technology, which mixes virtually created objects with reality and allows users to move around their real environment, is newer and more advanced than VR but is more complicated, he said.

He explained that AR is more appropriate for delivering exposure therapy in certain situations.

"The basis of exposure therapy is 'extinction learning' exposing a person to a fear cue over and over again until the fear response is extinguished," and extinction learning is "context-dependent," said Javanbakht.

"VR is good when you need to create the whole context and environment, and AR is good when you need to focus on specific objects or cues in the environment," for example, spiders or snakes, he said.

The study was funded by the National Institute for Health Research. Freeman is a founder and a non-executive director of Oxford VR, which will commercialize the therapy. He holds equity in and receives personal payments from Oxford VR; holds a contract for his university team to advise Oxford VR on treatment development; and reports grants from the National Institute for Health Research, the Medical Research Council, and the International Foundation. Javanbakht holds a patent for an AR exposure therapy.

Lancet Psychiatry. Published online April 5, 2022. Full text

For more Medscape Psychiatry news, join us on Facebook and Twitter.

Here is the original post:

Virtual Reality Therapy Promising for Agoraphobia - Medscape

Global Learning Management System Market Forecast Report 2022-2028: Opportunities in the Adoption of Immersive Learning with Virtual Reality,…

DUBLIN--(BUSINESS WIRE)--The "Learning Management System Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Deployment Mode, Delivery Mode, and End-User" report has been added to ResearchAndMarkets.com's offering.

The global learning management system market is expected to grow from US$ 14,895.17 million in 2021 to US$ 50,995.16 million by 2028; it is estimated to grow at a CAGR of 19.2% during 2021-2028.
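
Those endpoints are arithmetically consistent with the quoted rate, as a quick compound-growth check shows (2021 to 2028 spans seven compounding periods):

```python
# Verify the quoted CAGR against the endpoint figures.
start_usd_m = 14_895.17   # market size in 2021, US$ millions
end_usd_m = 50_995.16     # projected size in 2028, US$ millions
years = 7
cagr = (end_usd_m / start_usd_m) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~19.2%, matching the report
```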

The current learning analytics landscape has dramatically expanded, especially for higher education. When students engage in gamified events, they can learn and practice better. Gaming features help create a fun and productive learning experience for learners. The implementation of gamification is most widespread in e-learning platforms meant for K-12 level students.

According to an article published by EducationWorld in December 2019, STEPapp launched India's first-of-its-kind Gamified Learning EdTech app, intending to revolutionize K-12 education in the country.

Further, since the introduction of virtual reality (VR) and augmented reality (AR) into education, the classroom learning experience has undergone a tremendous transformation. While VR immerses students in a fully constructed environment, AR enhances their view of the real one.

Thus, the surging integration of virtual reality, augmented reality, and gamification technologies across educational institutions offers better academic results and creates demand for LMS platforms that support their implementation, generating opportunities for future growth of the learning management system market.

Impact of COVID-19 Pandemic on Global Learning Management System Market

The COVID-19 pandemic has shaken several industries. The rapid spread of the virus urged governments worldwide to impose strict restrictions on the movement of vehicles and people.

Due to travel restrictions, mass lockdowns, and business shutdowns, the pandemic has affected economies and countless industries, such as manufacturing & construction, retail, transportation & logistics, and automotive.

In 2020, the COVID-19 pandemic positively impacted the learning management system (LMS) market due to an increase in e-learning across several countries. For instance, according to a World Economic Forum report of April 2020, more than 1.2 billion students in 186 countries were affected by school closures due to the pandemic. In Denmark, students up to the age of 11 were returning to nurseries and schools that had initially closed on March 12, 2020.

In South Korea, meanwhile, students responded to roll calls from their teachers through online platforms. Schools and universities have thus increasingly adopted e-learning systems since the start of the COVID-19 outbreak, and the overall impact of the pandemic on the learning management system (LMS) market in 2020 was positive.

Similarly, demand for LMS platforms increased from 2021 to 2022 due to the rising bring-your-own-device (BYOD) trend among enterprises. The pandemic created massive demand for BYOD as workers suddenly migrated to remote locations, resulting in new hybrid work environments. The growth of the global learning management system (LMS) market in 2021 and 2022 is therefore expected to be positive.

The key companies operating in the learning management system market include Blackboard Inc., Cornerstone, D2L Corporation, Docebo, International Business Machines Corporation, itslearning AS, LTG plc, Hurix, SAP SE, and Zoho Corporation Pvt. Ltd.


For more information about this report visit https://www.researchandmarkets.com/r/vr8do6

Read more:

Global Learning Management System Market Forecast Report 2022-2028: Opportunities in the Adoption of Immersive Learning with Virtual Reality,...