The Jetpack XR SDK supports the playback of stereoscopic side-by-side video onto flat surfaces. With stereoscopic video, each frame consists of a left-eye and a right-eye image to give viewers a sense of depth.
You can render non-stereoscopic 2D video in Android XR apps with the standard media APIs used for Android development on other form factors.
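For 2D content, Media3 ExoPlayer works the same way it does on phones and tablets. The snippet below is a minimal sketch of that standard setup; the PlayerView ID (player_view) and the raw resource name (video_2d) are placeholder names for illustration, not part of the SDK.

// Minimal sketch of standard 2D playback with Media3 ExoPlayer.
// R.id.player_view and R.raw.video_2d are hypothetical placeholders.
val exoPlayer = ExoPlayer.Builder(this).build()
findViewById<PlayerView>(R.id.player_view).player = exoPlayer
exoPlayer.setMediaItem(
    MediaItem.fromUri("android.resource://$packageName/${R.raw.video_2d}")
)
exoPlayer.prepare()
exoPlayer.play()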
Play side-by-side video using Jetpack XR SDK
With side-by-side video, each stereoscopic frame is presented as two images arranged horizontally adjacent to each other. In top-and-bottom video, the two images are instead arranged vertically adjacent to each other.
Side-by-side video is not a codec but rather a way of organizing stereoscopic frames, which means it can be encoded in any of the codecs supported by Android.
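Because the frame arrangement is just a packing convention, you tell the renderer which layout to expect when you create the rendering surface. The following sketch assumes the SDK also exposes a StereoMode.TOP_BOTTOM constant for vertically packed frames, alongside the SIDE_BY_SIDE mode used in the example later in this section; isTopBottom is a placeholder for however your app determines the layout.

// Sketch only: choose the stereo mode that matches how each frame packs its
// left-eye and right-eye images. TOP_BOTTOM is assumed to exist in the SDK.
val stereoMode = if (isTopBottom) {
    StereoSurfaceEntity.StereoMode.TOP_BOTTOM   // images stacked vertically
} else {
    StereoSurfaceEntity.StereoMode.SIDE_BY_SIDE // images arranged horizontally
}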
You can load side-by-side video using Media3 ExoPlayer and then render it using the new StereoSurfaceEntity. To create a StereoSurfaceEntity, call createStereoSurfaceEntity() on your Session, as shown in the following example.
stereoSurfaceEntity = xrSession.createStereoSurfaceEntity(
    StereoSurfaceEntity.StereoMode.SIDE_BY_SIDE,
    Dimensions(2.0F, 2.0F, 0.0F),
    // Position 1.5 meters in front of user
    Pose(Vector3(0.0f, 0.0f, -1.5f), Quaternion(0.0f, 0.0f, 0.0f, 1.0f))
)

// Build a URI pointing to the raw resource that contains the side-by-side video
val videoUri = Uri.Builder()
    .scheme(ContentResolver.SCHEME_ANDROID_RESOURCE)
    .path(R.raw.sbs_test_video.toString())
    .build()
val mediaItem = MediaItem.fromUri(videoUri)

// Route ExoPlayer's video output to the entity's surface and start playback
exoPlayer = ExoPlayer.Builder(this).build()
exoPlayer.setVideoSurface(stereoSurfaceEntity!!.getSurface())
exoPlayer.setMediaItem(mediaItem)
exoPlayer.prepare()
exoPlayer.play()
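When playback finishes or your activity is destroyed, release the player so its codec and surface resources are freed. The teardown sketch below assumes the entity exposes a dispose() method, as other SceneCore entities do; adapt it to your own lifecycle handling.

// Teardown sketch, for example in onDestroy(); dispose() is an assumption
// about the entity API, not confirmed by this guide.
exoPlayer.release()
stereoSurfaceEntity?.dispose()
stereoSurfaceEntity = null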