Exploring Accessibility and Extended Reality at CES 2024

This post does not represent the views or opinions of Humanities Commons or Michigan State University. Additionally, mention of products, services, individuals, or companies within this post does not indicate support or preference.

Written by Stephanie E. Vasko, User Experience Researcher for Humanities Commons

As a Senior User Experience (UX) Researcher in Mesh Research and the College of Arts and Letters, my goal is to create the best experiences for those who use the products and services we provide. When I’m not running UX research activities, I think it’s important to keep up with emerging technologies as a way to plan future improvements and to develop methods of providing support, both with these tools and for those using new technologies.

In this vein, I recently used my professional development funding to attend CES 2024, a trade show highlighting emerging technologies and products. With 135,000+ attendees, over 4,300 exhibitors, and a variety of talks and panels, experiencing the entire breadth and depth of CES would be impossible. Instead of getting overwhelmed, I made a plan to see specific talks and areas focused on key aspects of my current and former daily work: accessibility and extended reality.

Accessibility

Accessibility is one of the most important considerations for me in any product or service. Professionally, I am interested in making sure all of our users have seamless experiences with our services and that we at Humanities Commons are committed to considering accessibility throughout our workflow. Personally, my own limited mobility has shifted my thinking, my use of technology, and my day-to-day life. My primary goals at CES were to learn about accessibility best practices, explore how companies are engaging with accessibility, and delve into technologies that may be coming to market (especially those with applications in academic and research settings).

I attended a variety of panels on accessibility and inclusivity, including “The Future of Inclusive Design,” “Driving Innovation Through Inclusive Design,” and “Building an Accessible World with Intel.” My biggest takeaways from these panels were 1) the importance of building and maintaining relationships with accessibility partners and 2) the value of including different voices and perspectives in the process from the beginning. I have brought these takeaways to Humanities Commons team meetings and hope to build on them as a team as we create new features and improvements.

From a products perspective, one area that was particularly interesting was devices and software that allow for input or control in expansive ways. Examples I saw included Cephable (I tried their demo, which let me change slides by moving my head), Augmental (a mouthguard-style device that allows for tongue-based interfacing), and the Lotus ring (not pictured; a ring that lets users point and click to control devices).

An open Macintosh laptop displaying a slide with a blue background and white text. In the upper right corner, there is a smaller window with controls for Cephable and an image of the author in those controls.
The author changing slides using the Cephable software and the position of her head
(Image used with permission from author)
Two hands display a clear mouthguard with electronic components
A close-up of the Augmental device
(Image used with permission from author)

Extended Reality (XR) and Related Tools

Within extended reality (XR), I am specifically interested in augmented reality (AR) from both research and user perspectives. As someone who works on creating phone-based AR experiences, I saw relatively few of these at CES. One trend that stood out to me was that companies at this year’s CES clearly took style into consideration. I tested several soon-to-market AR glasses, including the XREAL2 (currently pre-order only) and the RayNeo X2 (Indiegogo launch in February 2024).

In my previous work on AR, I used photogrammetry to create models for AR-based experiences. At CES, I was able to try several 3D capture and photogrammetry solutions, including RebuilderAI and VRIN3D. I look forward to seeing what our users do in this space as more tools become available and 3D capture becomes readily integrated into experiences and research.

Two hands hold a phone that is capturing images of an orange drawstring bag on a dark grey table on light grey carpet
VRIN3D in action; the purple squares on the phone indicate spaces that VRIN3D has already captured (Image used with permission from author)

Tying back to accessibility, within XR I focused more on exploring AR devices because I am prone to VR motion sickness. For this reason, I mostly limited my VR experiences to products that also had a haptic feedback component, and in some cases I got just a taste of the experience rather than a full-length demo. We’ve come a long way from rumble controllers, but my CES experience with haptic devices left me thinking that there is still a lot of work to be done in this area. Many of the haptic devices I had the opportunity to try still lacked nuance in their feedback, with little differentiation between the haptic responses for different events*. One of the most interesting devices I tried was from HaptX: I strapped on a backpack, gloves, and a VR headset, then walked around a demo environment trying tasks like petting a cat, moving a cup, and writing with a pen. My experience suffered a bit because only large gloves were available; I would definitely be interested in how the experience changes with smaller gloves.

A brunette woman dressed in all black is holding black VR goggles to her face, has a haptic glove on her right hand, and is wearing a grey box with black cords on her back
The author trying out the current generation HaptX device
(Image used with permission from author) 

Final Notes

This post barely skims the surface of the information and experiences at CES 2024. CES provides ample opportunities to hear from experts, engage with product and company representatives, and talk to individuals with different perspectives. From a research perspective, it is a space that could spark new ideas or collaborations. From my own perspective, I came away with ways to expand my network and with considerations for future iterations on product and process in my work at MSU.

* I did not get to try Razer’s Project Esther haptic cushion.