Meeting Moohan?
Consider a task as simple as turning a recorded meeting into a summary e-mailed to the attendees. Without AI, that means:
- Opening the app that recorded the meeting.
- Opening an app that can create a transcript of the meeting, then creating and saving the transcript.
- Opening a word processor and loading the transcript.
- Editing the transcript into a summary and saving it.
- Opening an e-mail app.
- Creating a cover letter.
- Defining the send list on the cover letter.
- Attaching the summary to the cover letter and sending it.
- Closing all apps.

With AI, the same task shrinks to (see the sketch after this list):
- Telling the AI what to do.
- Checking the summary and cover letter and editing them if necessary.
- Doing other stuff…
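In code terms, the "with AI" flow collapses into a single natural-language request. The sketch below is purely illustrative: the `Assistant` interface and `sendInstruction()` are hypothetical stand-ins we invented for this example, not a real Samsung or Google API.

```kotlin
// Purely illustrative: Assistant and sendInstruction() are hypothetical
// stand-ins, not a real Samsung or Google API.
fun interface Assistant {
    // Takes one natural-language instruction, returns a draft for review.
    fun sendInstruction(instruction: String): String
}

fun summarizeMeeting(assistant: Assistant): String {
    // One instruction replaces the nine manual steps listed above.
    return assistant.sendInstruction(
        "Transcribe this morning's recorded meeting, summarize it, " +
        "draft a cover e-mail to all attendees, and attach the summary."
    )
}

fun main() {
    // Stub assistant so the sketch runs; a real one would call an on-device model.
    val stub = Assistant { instruction -> "DRAFT (for user review): $instruction" }
    println(summarizeMeeting(stub))
}
```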
We continue to focus on Samsung's level of AI integration because Samsung's upcoming XR headset (aka 'Project Moohan' (프로젝트 무한), which translates to 'Project Infinity') seems to follow a similar path toward deep AI/application integration, particularly through its use of Android XR, Google's (GOOG) new XR operating system, which will debut on the Moohan headset. While details of Android XR are still sparse, the overall objective is to create a platform based on open standards for XR devices and smart glasses (AR) that uses existing Android frameworks, development tools, and UI elements (buttons, menus, fields, etc.), making it able to run existing Android applications in a spatial environment.
In practical terms, it would allow existing 2D Android applications to open as 2D windows in a 3D environment and, with modification, to become fully 3D (Google is redesigning YouTube, Google TV, Google Photos, Google Maps, and other major 2D applications for a 3D setting). Android XR will also support a variety of input modes, including voice and gestures via head, eye, and hand tracking, and will support some form of spatial audio.
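For developers, the "existing 2D app in a spatial window" idea maps onto Google's Jetpack Compose for XR, part of the Android XR developer preview. The sketch below is a minimal example based on that preview's `Subspace` and `SpatialPanel` composables; since the SDK is pre-release, package names, modifiers, and signatures may change, so treat it as illustrative rather than copy-paste ready.

```kotlin
// Sketch based on the Android XR developer preview's Jetpack Compose for XR.
// The SDK is pre-release, so names and signatures may change.
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.subspace.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatializedApp() {
    // Subspace opens a 3D volume in the user's environment.
    Subspace {
        // SpatialPanel hosts ordinary 2D Compose content as a floating
        // window the user can move and resize in space.
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)
                .height(640.dp)
                .movable()
                .resizable()
        ) {
            // An unmodified 2D screen from the existing phone app.
            Text("Existing 2D app content")
        }
    }
}
```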
One feature of the Samsung XR headset that we believe will be well received is its visual integration with the AI. Siri can hear what you say and respond, and while it can 'see' what the user sees, it cannot analyze that information on the fly and act on it. Meta's headsets can hear what the user hears and perform a level of contextual analysis, but that function is primarily for parsing voice commands. The Meta system typically does not access camera data unless the user requests it, at which point it takes a snapshot. It can perform limited scene analysis (object recognition, lighting, depth, etc.) on that snapshot to allow for virtual object placement, but it works only on that single frame and 'sees' only the real world, excluding virtual objects.
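To make that snapshot limitation concrete, here is a hypothetical sketch of the on-request pattern; `SceneAnalyzer`, `DetectedObject`, and `captureFrame` are names we invented for this illustration, not a real Meta API.

```kotlin
// Hypothetical illustration of the on-request, snapshot-based pattern;
// all names here are invented for the sketch, not a real Meta API.
data class DetectedObject(val label: String, val confidence: Float)

class SceneAnalyzer {
    // Camera data is only touched when the user explicitly asks.
    fun onUserRequest(captureFrame: () -> ByteArray): List<DetectedObject> {
        val snapshot = captureFrame() // a single frame, captured now
        return analyze(snapshot)      // analysis is limited to this frame
    }

    private fun analyze(frame: ByteArray): List<DetectedObject> {
        // Limited scene analysis: object recognition, lighting, depth, etc.
        // Rendered (virtual) objects never appear in the camera frame,
        // so they are invisible to this pass.
        return emptyList() // placeholder for a real detector
    }
}
```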
If the recent demo versions of the Samsung XR headset carry through to the final product, the headset will hear and see, both real-world and virtual objects, and analyze that information continuously. This allows the user to say, "What kind of dog is that?" to the AI at any time and have it respond based on a continuous analysis of the user's visual focus. The user can also 'Circle to Search' an object in view with a gesture, since the AI recognizes virtual objects (the circle) as well as real-world data. According to Google, the embedded AI in the Samsung headset also has a rolling ~10-minute memory that lets it retain key details of what the user has seen, which means you can ask, "What kind of dog was that in the window of the store we passed 5 minutes ago?" without having to go back to the store.
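A rolling memory like the one Google describes can be thought of as a time-windowed buffer of visual observations. The sketch below is a minimal Kotlin illustration under that assumption; the observation model, eviction policy, and keyword-based recall are our own simplifications, not Google's published design.

```kotlin
// Minimal sketch of a rolling ~10-minute visual memory; the data model and
// eviction policy are assumptions, not Google's published design.
data class VisualObservation(
    val timestampMs: Long,
    val description: String // e.g. "beagle in a pet-store window"
)

class RollingVisualMemory(private val windowMs: Long = 10 * 60 * 1000L) {
    private val observations = ArrayDeque<VisualObservation>()

    // Called continuously as the scene-understanding model emits observations.
    fun record(obs: VisualObservation) {
        observations.addLast(obs)
        evictExpired(obs.timestampMs)
    }

    // "What kind of dog was that 5 minutes ago?" becomes a search over
    // whatever still falls inside the window.
    fun recall(query: String, nowMs: Long): List<VisualObservation> {
        evictExpired(nowMs)
        return observations.filter { it.description.contains(query, ignoreCase = true) }
    }

    private fun evictExpired(nowMs: Long) {
        while (observations.isNotEmpty() &&
            nowMs - observations.first().timestampMs > windowMs
        ) {
            observations.removeFirst()
        }
    }
}
```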
We know there will be limitations to all of the features we expect on both the Samsung Galaxy S series smartphones and the Samsung XR headset, but, as we noted yesterday, Samsung seems to understand that both the AI's functionality and the user's AI experience depend on how tightly the AI is integrated into the device OS and the applications themselves. That understanding has led Samsung to work closely with Google, which lets users keep familiar Android apps alongside those specifically designed or remodeled for the spatial environment. Hopefully Samsung prices it right at the outset, learning from the Vision Pro's poor results, but we will have to wait a few more weeks to find out.