Wolvic 1.7 Released
Welcome to Wolvic 1.7. This release brings updated web engines, improved window handling, OpenXR-enabled hand tracking, and… OpenXR eye tracking!
Let’s take that last point first, because even in this early implementation it’s a huge leap forward in user interaction. You know how some XR devices use where you’re looking as a cue for what to do or activate? Wolvic now supports this on any device whose eye-tracking capabilities are exposed via OpenXR. Look at a link and you’ll see its “hover” styles; pinch your fingers or click a controller button, and you’ll follow the link you’re looking at.
You’ve probably seen this kind of thing in ads and reviews of Apple’s Vision Pro, and we’re beyond thrilled to have brought similar features to Wolvic. This is still in its early implementation, so there may be oddities and glitches. If you encounter them, please let us know using the feedback feature built into Wolvic! We plan to make eye tracking better and better, and knowing what goes awry is a big part of that.
Note: If you’re going to try out eye tracking, as we hope you do, it requires good calibration for best results. If the tracking target seems off, please use your device’s calibration routines to make sure you have good tracking. If you do and the tracking target is still off in Wolvic, we definitely want to know.
Beyond eye tracking, we’ve extended our OpenXR hand tracking with support for input profiles that let us track input from hands the same way we track input from physical controllers. This is currently used on the Magic Leap 2; here’s a quick capture of Wolvic’s OpenXR hand tracking in action on that device.
The web rendering engine in the Gecko-based version of Wolvic is now up to Gecko 128 ESR, and our Android Components version is also at v128. This brings Wolvic’s rendering up to par with the latest desktop and mobile browsers.
After some great feedback on the new window behaviors introduced in Wolvic 1.6.1, we’ve refined window sizing and placement, both “fullscreen” and otherwise. Along with that, we brought back the address bar for all windows, which includes media controls if the window is showing playable media.
In individual-device news, we’ve added support for various platform-specific features, such as voice search and input on VisionGlass. And, as is becoming tradition, we’ve added some new VR environments, from a fountain park in Japan to Notre-Dame on a rainy night to the sunny Dutch countryside to a windswept California shore. All four are now available through Wolvic’s Settings > Environment dialog, where any new additions to the Wolvic 3D Environments GitHub repository will appear and be available for download. Our thanks to Bob Dass, Alexandre Duret-Lutz, Aldo Hoeben, and Masato OHTA for releasing their works under the CC BY 2.0 or CC BY-NC 2.0 licenses.
Those are the highlights for 1.7, but as always, there are still more changes and updates listed in the notes below. If you’ve found a new problem or have another issue, please send us your feedback or, if you prefer, file an issue on GitHub. Thanks, and we hope you enjoy Wolvic 1.7!
(26 Aug 2024: Updated to more accurately describe the state of OpenXR hand tracking.)
Highlights
- Bring back the title bar widget.
- Add the hand interaction input profile.
- Support eye-tracking navigation.
- Update the Gecko version to v128.
- Many bugfixes and stability improvements.
Notes
UI
- Bring back the title bar widget, a small element that provides information and quick media actions for windows that don’t have the focus.
- Don’t reduce the window size when entering fullscreen, which used to happen in some cases.
Input
- Add the hand interaction input profile, via OpenXR’s XR_EXT_hand_interaction extension. This allows us to use the OpenXR input model for hand tracking and gestures, instead of relying on vendor-specific extensions.
- Support eye-tracking navigation via OpenXR’s XR_EXT_eye_gaze_interaction extension. Users are able to navigate with their gaze while using their hands or controllers for other input actions. Wolvic will ask for permission before enabling this feature. (See the sketch after this list.)
- Clean up and polish a lot of our input-related code.
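For the curious, here’s a rough idea of what eye-gaze input looks like at the OpenXR level. This is a minimal sketch rather than Wolvic’s actual code: it assumes the XR_EXT_eye_gaze_interaction extension was enabled at instance creation and that the instance, session, and action set already exist; the function and variable names are illustrative only. The hand interaction profile is wired up the same way, just with the XR_EXT_hand_interaction paths.

```cpp
#include <cstring>
#include <openxr/openxr.h>

// Minimal sketch: bind a pose action to the eye-gaze interaction profile.
// Assumes XR_EXT_eye_gaze_interaction was enabled at instance creation.
XrSpace SetUpEyeGaze(XrInstance instance, XrSession session,
                     XrActionSet actionSet) {
  XrAction gazeAction = XR_NULL_HANDLE;
  XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
  actionInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
  std::strcpy(actionInfo.actionName, "eye_gaze");
  std::strcpy(actionInfo.localizedActionName, "Eye Gaze");
  xrCreateAction(actionSet, &actionInfo, &gazeAction);

  // Suggest a binding against /interaction_profiles/ext/eye_gaze_interaction.
  XrPath profilePath, gazePosePath;
  xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction",
                 &profilePath);
  xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose",
                 &gazePosePath);

  XrActionSuggestedBinding binding{gazeAction, gazePosePath};
  XrInteractionProfileSuggestedBinding suggested{
      XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
  suggested.interactionProfile = profilePath;
  suggested.countSuggestedBindings = 1;
  suggested.suggestedBindings = &binding;
  xrSuggestInteractionProfileBindings(instance, &suggested);

  // Create a space for the gaze pose. Each frame, after xrSyncActions,
  // xrLocateSpace on this space yields the ray the user is looking along.
  XrActionSpaceCreateInfo spaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
  spaceInfo.action = gazeAction;
  spaceInfo.poseInActionSpace.orientation.w = 1.0f;  // identity pose
  XrSpace gazeSpace = XR_NULL_HANDLE;
  xrCreateActionSpace(session, &spaceInfo, &gazeSpace);
  return gazeSpace;
}
```

In practice, the located gaze ray is what gets hit-tested against windows and page content to drive things like the hover styles and gaze-plus-pinch activation described above.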
Content
- Update the Gecko version to v128. This required updating some GeckoView APIs, as well as Mozilla’s Android Components to the new version.
- Support OpenXR’s XR_EXT_view_configuration_depth_range extension, which provides additional depth-range information that will help us improve Wolvic’s performance. (See the query sketch after this list.)
- Fix layout errors when YouTube and other pages are displayed on a very large screen.
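For context, here’s roughly how that extension is queried. Again, this is a simplified sketch rather than Wolvic’s actual code, assuming the extension was enabled at instance creation and that instance and systemId are already available.

```cpp
#include <vector>
#include <openxr/openxr.h>

// Minimal sketch: fetch the runtime's recommended depth range for each view.
void QueryDepthRanges(XrInstance instance, XrSystemId systemId) {
  uint32_t viewCount = 0;
  xrEnumerateViewConfigurationViews(instance, systemId,
      XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, 0, &viewCount, nullptr);

  // Chain one XrViewConfigurationDepthRangeEXT onto each view struct.
  std::vector<XrViewConfigurationDepthRangeEXT> depthRanges(
      viewCount, {XR_TYPE_VIEW_CONFIGURATION_DEPTH_RANGE_EXT});
  std::vector<XrViewConfigurationView> views(
      viewCount, {XR_TYPE_VIEW_CONFIGURATION_VIEW});
  for (uint32_t i = 0; i < viewCount; ++i)
    views[i].next = &depthRanges[i];

  xrEnumerateViewConfigurationViews(instance, systemId,
      XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, viewCount, &viewCount,
      views.data());

  // depthRanges[i].recommendedNearZ / recommendedFarZ can now inform the
  // near/far planes used when building each eye's projection matrix.
}
```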
Huawei Vision Glass
- Voice input can be triggered from the phone UI. If the user is editing a text field, the spoken words will be added to it. Otherwise, the words will be used to launch a search.
- The phone UI can be used for seeking backward and forward in the currently playing media, even if it is not in fullscreen.
- The back button works as expected when used before Wolvic has entered immersive mode.
- Multiple improvements in stability and functionality.
Others
- Many bugfixes and stability improvements.