The AI glasses experience is built on the existing Android Activity framework
API and includes additional concepts to support the unique aspects of
AI glasses. Unlike XR headsets that run a full APK on the device, AI glasses use
a dedicated activity that runs within your phone's existing app. This activity
is projected from the host device to the AI glasses.
To create your app's AI glasses experience, you extend your existing phone app
by creating a new projected Activity for AI glasses. This activity serves
as the main launch entry point for your app on AI glasses. This approach
simplifies development because you can share and reuse business logic between
your phone and AI glasses experiences.
Declare your activity in your app's manifest
Just like other types of activities, you must declare your activity in your app's manifest file so the system can recognize and launch it.
<application>
  <activity
      android:name=".AIGlassesActivity"
      android:exported="true"
      android:requiredDisplayCategory="xr_projected"
      android:label="Example AI Glasses activity">
    <intent-filter>
      <action android:name="android.intent.action.MAIN"/>
    </intent-filter>
  </activity>
</application>
Key points about the code
- Specifies xr_projected for the android:requiredDisplayCategory attribute to tell the system that this activity should use a projected context to access hardware from a connected device.
Create your activity
Next, you'll create a small activity that can display something on the AI glasses whenever the display is turned on.
/**
 * When this activity launches, it stays in the started state.
 */
class AIGlassesActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val projectedWindowManager = ProjectedWindowManager.create(this)
        setContent {
            GlassesComposeContent {
                GlimmerTheme {
                    TopBarScaffold(modifier = Modifier.background(color = Color.Black)) {
                        // Hold the presentation mode in Compose state so the
                        // UI recomposes when the display turns on or off.
                        var areVisualsOff by remember { mutableStateOf(false) }
                        LaunchedEffect(Unit) {
                            ProjectedDisplayController.create(this@AIGlassesActivity)
                                .addPresentationModeChangedListener { presentationModeFlags ->
                                    // Check whether visuals are on or off.
                                    areVisualsOff =
                                        !presentationModeFlags.hasPresentationMode(VISUALS_ON)
                                }
                        }
                        // Conditional UI based on presentation mode.
                        if (areVisualsOff) {
                            // Implementation for when the display is off.
                        } else {
                            DisplayUi()
                        }
                    }
                }
            }
        }
    }

    override fun onStart() {
        super.onStart()
        // Make the user aware that this activity is active while the display
        // is off (for example, play audio periodically).
    }

    override fun onStop() {
        super.onStop()
        // Stop all data source access.
    }
}
Key points about the code
- AIGlassesActivity extends ComponentActivity, just as you would expect in mobile development.
- The setContent block within onCreate() defines the root of the composable UI tree for the activity.
- Initializes the UI during the activity's onCreate() method (see projected activity lifecycle).
- Holds the presentation mode in Compose state so the UI recomposes when the glasses' display turns on or off.
- Sets up a TopBarScaffold base layout with a black background for the UI using Jetpack Compose Glimmer.
Implement the composable
The activity that you created references a DisplayUi composable function that
you need to implement. The following code uses Jetpack Compose Glimmer to
define a composable that can display some text on the AI glasses' display:
@Composable
fun DisplayUi() {
    Box(
        modifier = Modifier.fillMaxSize(),
        contentAlignment = Alignment.Center
    ) {
        Text("Hello World!")
    }
}
Key points about the code
- As you defined in your activity earlier, the DisplayUi function includes the composable content that the user sees when the AI glasses' display is on.
- The Jetpack Compose Glimmer Text component displays the text "Hello World!" on the glasses' display.
Start your activity
Now that you've created a basic activity, you can launch it onto your glasses. To access the glasses' hardware, your app must start your activity with specific options that tell the system to use a projected context, as shown in the following code:
val options = ProjectedContext.createProjectedActivityOptions(context)
val intent = Intent(context, AIGlassesActivity::class.java)
context.startActivity(intent, options.toBundle())
The createProjectedActivityOptions() method in ProjectedContext
generates the necessary options to start your activity in a projected context.
The context parameter can be a context from the phone or the glasses
device.
Check whether AI glasses are connected
If you want to determine whether a user's AI glasses are connected to their
phone before launching your activity, use the
ProjectedContext.isProjectedDeviceConnected() method. This method
returns a Flow<Boolean> that your app can observe to get real-time updates on
the connection status.
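For example, you can collect this flow from a coroutine and launch your activity only once the glasses connect. The following is a minimal sketch that assumes it runs inside an activity where lifecycleScope and a context are available; the launch-on-connect logic is illustrative, not part of the API:
lifecycleScope.launch {
    ProjectedContext.isProjectedDeviceConnected().collect { isConnected ->
        if (isConnected) {
            // Glasses are connected; start the projected activity.
            val options = ProjectedContext.createProjectedActivityOptions(context)
            context.startActivity(
                Intent(context, AIGlassesActivity::class.java),
                options.toBundle()
            )
        }
    }
}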
Next steps
Now that you've created your first activity for AI glasses, explore other ways that you can extend its functionality:
- Handle audio output using Text to Speech
- Handle audio input using Automatic Speech Recognition
- Build UI with Jetpack Compose Glimmer
- Access AI glasses' hardware