
AI-based tool creates simple interfaces for virtual and augmented reality


**Abstract:** A research team from Carnegie Mellon University's Human-Computer Interaction Institute (HCII) has introduced EgoTouch, a tool that uses artificial intelligence (AI) to control augmented reality (AR) and virtual reality (VR) interfaces by touching the skin with a finger. The system aims to provide tactile, on-skin input using only the sensors integrated into standard AR/VR headsets, eliminating the need for additional, cumbersome hardware.

**Background and Motivation:** Earlier approaches, such as OmniTouch, developed by Chris Harrison, an associate professor in the HCII and director of the Future Interfaces Group, came close to this goal but required a specialized depth-sensing camera, which limited the practicality and accessibility of such interfaces. Vimal Mollyn, a Ph.D. student advised by Harrison, proposed using a machine learning algorithm to teach ordinary cameras to recognize touch. The key observation was that a finger pressing on the skin produces distinctive shadows and local skin deformations, visual cues that an AI model can learn to detect and interpret.

**Development and Data Collection:** To develop EgoTouch, Mollyn used a custom touch sensor worn along the underside of the index finger and the palm. The sensor recorded various touch types and forces while remaining invisible to the camera, and its output was used to train a machine learning model to correlate the visual features with touch and force, without any human annotation. To make the system robust across users, the team collected data from 15 participants with diverse skin tones and hair densities, spanning multiple scenarios, activities, and lighting conditions, which helped the model perform consistently in varied settings.

**Performance and Capabilities:** EgoTouch detects touch with more than 96% accuracy and a false positive rate of around 5%. It recognizes pressing down, lifting up, and dragging, and it classifies touches as light or hard with 98% accuracy, which is particularly useful for implementing a right-click-style gesture on the skin and mimicking touchscreen interactions. Performance was consistent across different areas of the hand and forearm, including the front of the arm, back of the arm, palm, and back of the hand. It was less effective on bony areas such as the knuckles, where skin deformation is minimal, so interface designers are advised to avoid placing interactive elements on those regions.
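The training setup described above, camera frames paired with labels from the hidden finger-worn touch sensor, amounts to standard supervised learning. The sketch below illustrates that idea with a small PyTorch model that predicts a touch probability and a force estimate from a fingertip image crop; the model name, architecture, input size, and hyperparameters are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of sensor-supervised touch/force prediction (illustrative only,
# not the EgoTouch authors' architecture). Labels are assumed to come from the
# finger-worn touch sensor, so no human annotation is needed.
import torch
import torch.nn as nn

class TouchNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Tiny convolutional backbone over a 64x64 RGB crop around the fingertip.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.touch_head = nn.Linear(64, 1)  # logit: touching the skin vs. hovering
        self.force_head = nn.Linear(64, 1)  # scalar: estimated touch force

    def forward(self, x):
        feats = self.backbone(x)
        return self.touch_head(feats), self.force_head(feats)

model = TouchNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()

# Stand-in batch: real training would pair headset-camera crops with the
# touch/force labels logged by the on-finger sensor.
frames = torch.randn(8, 3, 64, 64)
touch_labels = torch.randint(0, 2, (8, 1)).float()
force_labels = torch.rand(8, 1)

optimizer.zero_grad()
touch_logit, force_pred = model(frames)
loss = bce(touch_logit, touch_labels) + mse(force_pred, force_labels)
loss.backward()
optimizer.step()
```

At inference time, thresholding the per-frame touch probability over the video stream would yield the press, drag, and lift events described above, with the force estimate distinguishing light from hard touches.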
**Future Enhancements:** Mollyn is exploring the use of night-vision cameras and nighttime illumination to enable EgoTouch to function in low-light conditions. He is also collaborating with other researchers to extend the touch-detection method to surfaces beyond the skin, broadening the technology's potential applications.

**Significance and Impact:** By using the cameras already present in AR/VR headsets, EgoTouch eliminates the need for additional hardware, making on-skin interfaces more practical and accessible. Its calibration-free design further improves ease of use, allowing it to work out of the box. This could lead to more intuitive and natural interactions in AR/VR environments and potentially change how users engage with these technologies.

**Conclusion:** EgoTouch represents a promising step toward more user-friendly and accessible AR/VR interfaces. Its ability to detect touch and force accurately using only the sensors found in standard headsets opens up new possibilities for on-skin user interfaces, making AR/VR interaction more intuitive and seamless. Ongoing work, including night-vision capability and extension to other surfaces, suggests that EgoTouch could significantly enhance the user experience in AR/VR applications.

**References:**
- Vimal Mollyn, Chris Harrison, et al. "EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras." *Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology* (UIST 2024). DOI: 10.1145/3654777.3676455
- Carnegie Mellon University, Human-Computer Interaction Institute (HCII)
- TechXplore. "AI-based tool creates simple interfaces for virtual and augmented reality." November 13, 2024. https://techxplore.com/news/2024-11-ai-based-tool-simple-interfaces.html
