Researchers from Nanyang Technological University in Singapore have introduced a method for tracking human movements in the metaverse, signalling a potential shift in how we interact with digital environments. Utilizing WiFi sensors and advanced artificial intelligence, this new approach could pave the way for more intuitive experiences in virtual reality.
Accurately representing real-world movements within the metaverse is crucial for creating immersive virtual experiences. Traditionally, this has been achieved through device-based sensors and camera systems, each with limitations, according to the research. For example, handheld controllers with motion sensors provide limited data, capturing movement from a single point on the body. Camera-based systems, on the other hand, struggle in low-light conditions and can be obstructed by physical barriers.
Enter the innovative use of WiFi sensors for human activity recognition (HAR). Because WiFi signals propagate and reflect somewhat like radar, researchers have found that they can be used to detect and track objects and movements in space.
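To make the sensing principle concrete, the sketch below (hypothetical, not the NTU system) shows the basic intuition: human movement perturbs the multipath WiFi channel, so the variance of channel state information (CSI) amplitudes rises during activity. The data, window size, and threshold here are synthetic placeholders for illustration only.

```python
# Minimal sketch (hypothetical): flagging motion from WiFi CSI amplitudes.
# Movement disturbs the multipath channel, raising amplitude variance.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic CSI: 1,000 time samples x 30 subcarriers (stand-in for a real capture)
t = np.arange(1000)
csi_amplitude = 1.0 + 0.02 * rng.standard_normal((1000, 30))
# Inject a "movement" segment: larger fluctuations between samples 400-600
csi_amplitude[400:600] += 0.2 * np.sin(0.3 * t[400:600])[:, None]

def detect_activity(csi, window=50, threshold=0.01):
    """Flag windows whose mean per-subcarrier variance exceeds a threshold."""
    flags = []
    for start in range(0, csi.shape[0] - window, window):
        variance = csi[start:start + window].var(axis=0).mean()
        flags.append((start, variance > threshold))
    return flags

for start, active in detect_activity(csi_amplitude):
    if active:
        print(f"movement detected around samples {start}-{start + 50}")
```

Real systems work with far richer features and learned models, but the same signal-perturbation idea underpins WiFi-based HAR.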
Researchers have already utilized this technology for various purposes, including monitoring heart rate and breathing and detecting people through walls. By combining WiFi sensing with traditional tracking methods, the Nanyang Technological University team aims to overcome the limitations of previous systems.
Applying WiFi sensors to movement tracking in the metaverse requires sophisticated artificial intelligence (AI) models. The challenge lies in training these models, a process that demands large training datasets. Traditionally, creating and labelling these datasets has been a labour-intensive task, limiting the efficiency and scalability of the research.
Introducing MaskFi
To address these challenges, the research team developed MaskFi, a system based on unsupervised learning—a type of AI training that requires significantly less data. MaskFi has demonstrated remarkable efficiency, achieving approximately 97% accuracy in tracking human movements across two benchmarks. This system has the potential to dramatically reduce the time and resources needed to train AI models for HAR in the metaverse.
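As a rough illustration of the general idea behind unsupervised pretraining on unlabelled WiFi data (this is not the authors' MaskFi architecture, and the model, dimensions, and masking ratio below are assumptions), one can randomly hide parts of each signal sequence and train a model to reconstruct them, so no activity labels are needed:

```python
# Illustrative sketch: masked-reconstruction pretraining on unlabelled WiFi data.
# The model learns signal structure by filling in hidden time steps,
# avoiding the need for manually labelled activities.
import torch
import torch.nn as nn

class MaskedAutoencoder(nn.Module):
    def __init__(self, feat_dim=90, hidden=128):
        super().__init__()
        self.encoder = nn.GRU(feat_dim, hidden, batch_first=True)
        self.decoder = nn.Linear(hidden, feat_dim)

    def forward(self, x):
        h, _ = self.encoder(x)          # (batch, time, hidden)
        return self.decoder(h)          # reconstruct per-step features

model = MaskedAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic unlabelled batch: 8 sequences, 100 time steps, 90 CSI features
batch = torch.randn(8, 100, 90)

for step in range(10):
    mask = (torch.rand(8, 100, 1) < 0.3).float()   # hide ~30% of time steps
    corrupted = batch * (1 - mask)
    recon = model(corrupted)
    # Compute the loss only on masked positions: the model must infer what was hidden
    loss = ((recon - batch) ** 2 * mask).sum() / mask.sum().clamp(min=1)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

After pretraining of this kind, a small labelled set is typically enough to fine-tune the learned representations for activity recognition, which is what makes the approach far less data-hungry than fully supervised training.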
The implications of MaskFi and similar technologies are vast. By enabling accurate, real-time tracking of human movements without the need for cumbersome equipment or extensive data labelling, such systems bring us closer to a metaverse that closely mirrors the real world. Overall, this breakthrough could see a future where digital and physical realms converge more smoothly, offering users experiences that are more natural, intuitive, and immersive. As research and development continue, the dream of an advanced real-world representation in the metaverse inches closer to reality.