OpenXR Settings
OpenXR comes with its own set of settings that are applied when OpenXR starts. OpenXR extensions implemented through plugins may add additional settings, but here we only discuss the settings that are part of core Godot.
Enabled
This setting enables our OpenXR module when Godot starts. This must be enabled when the Vulkan backend is used. For other backends, OpenXR can be enabled at any time by calling initialize on the OpenXRInterface class.
This option also needs to be enabled to gain access to the action map editor.
You can use the --xr-mode on command line switch to force this setting on.
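For example, a minimal sketch of starting OpenXR manually from a script, assuming the Enabled setting is off and a non-Vulkan backend is used (the script and node placement are illustrative):

    extends Node3D

    func _ready() -> void:
        # Look up the OpenXR interface and try to start it at runtime.
        var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
        if xr_interface and xr_interface.initialize():
            # Output the rendered result to the headset.
            get_viewport().use_xr = true
        else:
            print("OpenXR failed to initialize")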
Default Action Map
This setting specifies the path of the action map file that OpenXR will load and communicate to the XR runtime.
Form Factor
This setting specifies the type of device your game is designed for:
Head Mounted devices, such as the Meta Quest, Valve Index, or Magic Leap.
Handheld devices, such as mobile phones.
If the device on which your game runs does not match this setting, OpenXR will fail to initialize.
View Configuration
This setting specifies the view configuration your game is designed for:
Mono, your game provides a single image output. For example phone-based AR.
Stereo, your game provides stereo image output. For example head-mounted devices.
If the device on which your game runs does not match this setting, OpenXR will fail to initialize.
Note
Godot does not yet support the additional view configurations OpenXR defines for specific devices. For example, Varjo headsets have a quad-view configuration that outputs two sets of stereo images. Support for these special cases may be added in the near future.
Reference Space
In XR, all elements such as the player's head and hands are tracked within a tracking volume. At the base of this tracking volume lies the origin point, which maps the virtual space onto the real-world space. Depending on the XR system used, this point is placed at different locations. In OpenXR these scenarios are well defined and are selected by setting the reference space.
Local
The local reference space places the origin point at the player's head by default. Some XR runtimes reset this location when the game starts, while others keep it persistent across sessions.
This reference space however does not prevent the user from walking away, so you will need to detect this if you want to prevent the user from leaving the vehicle they are controlling, which could break your game.
The local reference space is the best option for games such as flight simulators or racing simulators, where you typically want to place the XROrigin3D node where the player's head should be.
When the user enacts the recenter option on their headset, the method of which differs per XR runtime, the XR runtime will move the XRCamera3D to the XROrigin3D node. The OpenXRInterface will also emit the pose_recentered signal so your game can react accordingly.
Note
Other tracked XR elements, such as controllers and anchors, will be adjusted accordingly so they stay consistent with the recentered player position.
Warning
center_on_hmd should not be called when using this reference space.
Stage
The stage reference space is our default reference space and places our origin point at the center of our play space. For XR runtimes that allow you to draw out a guardian boundary, this location and its orientation are often set by the user. Other XR runtimes may decide on the placement of this point by other means. It is however a stationary point in the real world.
This reference space is the best option for room scale games where the user is expected to walk around a larger space, or for games where there is a need to switch between game modes. See Room Scale for more information.
When the user enacts the recenter option on their headset, the method of which differs per XR runtime, the XR runtime will not change the origin point. The OpenXRInterface will emit the pose_recentered signal and it is up to the game to react appropriately. Not doing so will prevent your game from being accepted on various stores.
In Godot you can do this by calling the center_on_hmd function on the XRServer:
Calling XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, false) will move the XRCamera3D node to the XROrigin3D node, similar to the Local reference space.
Calling XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true) will move the XRCamera3D node above the XROrigin3D node keeping the player's height, similar to the Local Floor reference space.
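For example, a minimal sketch of handling the recenter request from a script on the XROrigin3D node (the handler name is illustrative):

    extends XROrigin3D

    func _ready() -> void:
        var xr_interface := XRServer.find_interface("OpenXR") as OpenXRInterface
        if xr_interface:
            xr_interface.pose_recentered.connect(_on_pose_recentered)

    func _on_pose_recentered() -> void:
        # Keep the player's height, similar to the Local Floor reference space;
        # pass false instead to move the camera fully onto the origin (Local).
        XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true)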
Note
Other tracked XR elements, such as controllers and anchors, will be adjusted accordingly so they stay consistent with the recentered player position.
Local Floor
The local floor reference space is similar to the local reference space as it positions the origin point where the player is. In this mode however the height of the player is kept. Same as with the local reference space, some XR runtimes will persist this location over sessions.
It is thus not guaranteed that the player will be standing on the origin point; the only guarantee is that they were standing there when the user last recentered. The player is also free to walk away.
This reference space is the best option for games where the user is expected to stand in the same location, or for AR type games where the user's interface elements are bound to the origin node and are quickly placed at the player's location on recenter.
When the user enacts the recenter option on their headset, the method of which differs per XR runtime, the XR runtime will move the XRCamera3D above the XROrigin3D node but keep the player's height. The OpenXRInterface will also emit the pose_recentered signal so your game can react accordingly.
Warning
Be careful using this mode in combination with virtual movement of the player. Recentering in this scenario can be unpredictable unless you counter the move when handling the recenter signal. It can even be game breaking, as the effect would be the player teleporting to whatever location the origin point was placed at during virtual movement, including locations that should be off limits. It is better to use the Stage mode in this scenario and to limit resetting to orientation only when a pose_recentered signal is received.
Note
Other tracked XR elements, such as controllers and anchors, will be adjusted accordingly so they stay consistent with the recentered player position.
Warning
center_on_hmd should not be called when using this reference space.
Environment Blend Mode
The environment blend mode defines how our rendered output is blended into “the real world” provided this is supported by the headset.
Opaque means our output obscures the real world; we are in VR mode.
Additive means our output is added to the real world; this is an AR mode where the optics do not allow us to fully obscure the real world (e.g. Hololens).
Alpha means our output is blended with the real world using the alpha output (the viewport should have a transparent background enabled); this is an AR mode where the optics can fully obscure the real world (Magic Leap, all passthrough devices, etc.).
If a mode is selected that is not supported by the headset, the first available mode will be selected.
备注
Some OpenXR devices have separate systems for enabling/disabling passthrough. From Godot 4.3 onwards selecting the alpha blend mode will also perform these extra steps. This does require the latest vendor plugin to be installed.
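The blend mode can also be selected at runtime through the XRInterface API. A minimal sketch, assuming the interface has already been initialized and this method lives on a node in the active scene:

    func enable_passthrough(xr_interface: XRInterface) -> void:
        # Ask the runtime which blend modes the headset supports.
        var modes := xr_interface.get_supported_environment_blend_modes()
        if XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND in modes:
            xr_interface.environment_blend_mode = XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND
            # Alpha blending needs a transparent viewport background.
            get_viewport().transparent_bg = true
        else:
            xr_interface.environment_blend_mode = XRInterface.XR_ENV_BLEND_MODE_OPAQUE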
Foveation Level
Sets the foveation level used when rendering, provided this feature is supported by the hardware used. Foveation is a technique where content is rendered at a lower resolution the further away it is from the center of the viewport. Most XR runtimes only support fixed foveation, but some will take eye tracking into account and use the focal point for this effect.
The higher the level, the better the performance gains, but also the greater the reduction in quality in the user's peripheral vision.
Note
Compatibility renderer only. For the Mobile and Forward+ renderers, set the vrs_mode property on the Viewport to VRS_XR instead.
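For example, a minimal sketch of the Viewport property mentioned above, placed in the _ready() of any node in the running scene:

    func _ready() -> void:
        # Mobile and Forward+ renderers: use XR-based variable rate shading
        # instead of the Compatibility-only foveation level setting.
        get_viewport().vrs_mode = Viewport.VRS_XR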
Warning
This feature is disabled if post effects such as glow, bloom, or DOF are used.
Dynamic Foveation
When enabled, the foveation level will be adjusted automatically depending on the current GPU load. It will be adjusted between low and the foveation level selected in the previous setting. It is therefore best to combine this setting with a foveation level of high.
Note
Compatibility renderer only
Submit Depth Buffer
If enabled, an OpenXR-supplied depth buffer will be used while rendering and submitted alongside the rendered image. The XR runtime can use this for improved reprojection.
Note
Enabling this feature will disable stencil support during rendering. Not many XR runtimes make use of this; it is advised to leave this setting off unless it provides noticeable benefits for your use case.
Startup Alert
If enabled, this will result in an alert message presented to the user if OpenXR fails to start. We don’t always receive feedback from the XR system as to why starting fails. If we do, we log this to the console. Common failure reasons are:
No OpenXR runtime is installed on the host system.
Microsoft's WMR OpenXR runtime is currently active; this runtime only supports DirectX and will fail if OpenGL or Vulkan is used.
SteamVR is used but no headset is connected/turned on.
Disable this if you support a fallback mode in your game so it can be played in desktop mode when no VR headset is connected, or if you're handling the failure condition yourself by checking OpenXRInterface.is_initialized().
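A minimal sketch of such a fallback check, assuming OpenXR is started automatically via the Enabled setting and the alert is turned off:

    extends Node3D

    func _ready() -> void:
        var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
        if xr_interface and xr_interface.is_initialized():
            get_viewport().use_xr = true
        else:
            # No working OpenXR runtime or headset found; continue in desktop mode.
            print("OpenXR not available, running in desktop mode.")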
Extensions
This subsection provides access to various optional OpenXR extensions.
Hand Tracking
This enables the hand tracking extension when supported by the device used. This is on by default for legacy reasons. The hand tracking extension provides access to data that allows you to visualise the user's hands with correct finger positions. Depending on platform capabilities, the hand tracking data can be inferred from controller inputs, come from data gloves, come from optical hand tracking sensors, or come from any other applicable source.
If your game only supports controllers this should be turned off.
See the chapter on hand tracking for additional details.
Eye Gaze Interaction
This enables the eye gaze interaction extension when supported by the device used. When enabled we will get feedback from eye tracking through a pose situated between the user’s eyes orientated in the direction the user is looking. This will be a unified orientation.
In order to use this functionality you need to edit your action map and add a new pose action, say eye_pose. Now add a new interaction profile for the eye gaze interaction and map the eye_pose:
Don’t forget to save!
Next add a new XRController3D node to your origin node, set its tracker property to /user/eyes_ext, and set its pose property to eye_pose.
Now you can add things to this controller node such as a raycast, and control things with your eyes.
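For reference, the same setup can also be created from a script on your XROrigin3D node; a minimal sketch, where eye_pose is the example action name used above:

    extends XROrigin3D

    func _ready() -> void:
        # Create a controller node driven by the eye gaze tracker.
        var eyes := XRController3D.new()
        eyes.tracker = "/user/eyes_ext"
        eyes.pose = "eye_pose"
        add_child(eyes)

        # Cast a ray along the gaze direction (-Z) to pick what the user looks at.
        var gaze_ray := RayCast3D.new()
        gaze_ray.target_position = Vector3(0, 0, -10)
        eyes.add_child(gaze_ray)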