The test system we saw relied on a touch screen for all input, although a production system might add hard buttons along the edges of the screen. Unlike standard touchscreen applications, this system supports multitouch, making iPhone-style gesture control possible. Jensen demonstrated by tracing a lower-case 'h' on the display with his fingertip, which brought up the home screen. Similarly, tracing an 'n' brought up the navigation screen.
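The gesture behavior Jensen demonstrated amounts to mapping a recognized letter to a screen. The system's actual software is not public, so the names and structure below are purely illustrative assumptions, not the real implementation:

```python
# Hypothetical sketch: dispatch a recognized gesture letter to a screen.
# The mapping of 'h' -> home and 'n' -> navigation comes from the demo;
# everything else (function names, the "current" fallback) is invented
# here for illustration.
GESTURE_SCREENS = {
    "h": "home",        # tracing a lower-case 'h' opens the home screen
    "n": "navigation",  # tracing an 'n' opens the navigation screen
}

def dispatch_gesture(letter: str) -> str:
    """Return the screen to show for a recognized gesture letter.

    Unrecognized gestures leave the driver on the current screen.
    """
    return GESTURE_SCREENS.get(letter.lower(), "current")

print(dispatch_gesture("h"))  # home
print(dispatch_gesture("x"))  # current
```

A lookup table like this keeps gesture handling trivially extensible, which matters if third-party widgets were ever allowed to register their own gestures.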
At this stage of development, the system offered applications for navigation and music, but Jensen explained that it could also serve as a platform for third-party developers to build useful widgets that end users could install themselves, much as iPhone owners load apps from iTunes.