Autonomous wearable displays.
Funding | FFG, bm:vit FIT-IT Embedded Systems
Project no. | 809450
Duration | 2005-2008
Consortium | Johannes Kepler Universität Linz*, Universität Salzburg, Silhouette International Schmied AG, Research Studios Austria Forschungsgesellschaft mbH
Role | Proposer, Coordinator
The SPECTACLES project aims at creating an autonomous computing platform that is integrated into the frame structure of eyeglasses. The system includes components for communication, interaction, and sensing to support different application areas.
Wearable see-through displays that overlay the user's real view with computer-generated output have gained attention as a potentially effective means for a variety of mixed reality applications (e.g. in medicine, industrial maintenance, mobile information systems, or even tourism and sports). Such multimedia, wearable see-through spectacle systems create a visual perception in which the real world is merged with a virtual one by annotating real-world objects with computer-generated data. The user can access any kind of information, unobtrusively adapted to his or her current situation, without having to stop paying attention to the environment or interrupt the task at hand.
Spectacles ISWC 09 Video
Key Issues
- Flexible toolkit for different application domains
- Component-oriented design for design-time and run-time adaptation (see the sketch after this list)
- Support for different media classes
- Custom hardware component integration
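To make the component-oriented design a little more concrete, the following is a minimal sketch of what a plug-and-play toolkit interface of this kind could look like. The names (MediaClass, SpectaclesComponent, ComponentRegistry) and the Java form are assumptions chosen purely for illustration and do not reflect the actual SPECTACLES toolkit.

```java
// Illustrative sketch only: names such as SpectaclesComponent and MediaClass
// are assumptions made for this example, not the actual SPECTACLES toolkit API.
import java.util.ArrayList;
import java.util.List;

/** Media classes the platform is described as supporting. */
enum MediaClass { VIDEO, AUDIO, IMAGE, TEXT }

/** A pluggable hardware or software building block with a simple lifecycle. */
interface SpectaclesComponent {
    String name();
    void start();
    void stop();
}

/** Run-time registry: components can be added or removed while the system runs. */
class ComponentRegistry {
    private final List<SpectaclesComponent> active = new ArrayList<>();

    void plugIn(SpectaclesComponent c) {
        c.start();
        active.add(c);
        System.out.println("plugged in: " + c.name());
    }

    void unplug(SpectaclesComponent c) {
        active.remove(c);
        c.stop();
        System.out.println("unplugged: " + c.name());
    }
}

public class ToolkitDemo {
    public static void main(String[] args) {
        ComponentRegistry registry = new ComponentRegistry();
        // A trivial text renderer standing in for a real display component.
        SpectaclesComponent textRenderer = new SpectaclesComponent() {
            public String name() { return "text renderer (" + MediaClass.TEXT + ")"; }
            public void start() { System.out.println("renderer ready"); }
            public void stop()  { System.out.println("renderer stopped"); }
        };
        registry.plugIn(textRenderer);   // design-time choice ...
        registry.unplug(textRenderer);   // ... revisable at run time
    }
}
```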
The SPECTACLES project targets a modular, autonomous, lightweight, wirelessly communicating wearable display device that can be integrated into the physical structure of an eyeglasses frame. A modular and reconfigurable system design approach is followed in both hardware and software, supporting a plug-and-play configuration of SPSs ("Special Purpose Spectacles") that meet the individual requirements of a specific use case scenario.
As an autonomous, wearable display system, an SPS can communicate with its environment wirelessly (technologies such as GPRS, Bluetooth, and WiFi are being addressed), sense different environmental parameters, and display different kinds of media (video, audio, image, text). Besides these output facilities, the computational platform of SPECTACLES is designed to be flexible enough to allow the integration of additional input devices such as cameras, accelerometers, and other sensor units, which can serve both as a means for natural human-computer interaction and as a source for recognizing the user's context and focus of attention.
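The plug-and-play idea can be illustrated with a small, hypothetical assembly example: one SPS instance is put together from a wireless link, a sensor module, and an output module, and a single polling step routes sensor events to both. All module names used here (WirelessLink, SensorModule, OutputModule, SpsConfiguration) are illustrative assumptions, not actual SPECTACLES components.

```java
// Illustrative sketch only: these module names are assumptions used to show
// the plug-and-play configuration idea, not actual SPECTACLES components.
import java.util.ArrayList;
import java.util.List;

/** A wireless link the SPS can use; GPRS, Bluetooth, or WiFi stacks would implement this. */
interface WirelessLink {
    void send(String message);
}

/** An input module producing events, e.g. a camera or accelerometer. */
interface SensorModule {
    String readEvent();
}

/** An output module, e.g. a micro-display or audio unit. */
interface OutputModule {
    void show(String content);
}

/** One Special Purpose Spectacles (SPS) instance assembled from modules. */
class SpsConfiguration {
    private final List<SensorModule> sensors = new ArrayList<>();
    private final List<OutputModule> outputs = new ArrayList<>();
    private WirelessLink link;

    SpsConfiguration withLink(WirelessLink l)    { this.link = l; return this; }
    SpsConfiguration withSensor(SensorModule s)  { sensors.add(s); return this; }
    SpsConfiguration withOutput(OutputModule o)  { outputs.add(o); return this; }

    /** Poll all sensors once, report each event over the link, and mirror it to all outputs. */
    void step() {
        for (SensorModule s : sensors) {
            String event = s.readEvent();
            if (link != null) link.send(event);
            for (OutputModule o : outputs) o.show(event);
        }
    }
}

public class SpsDemo {
    public static void main(String[] args) {
        SpsConfiguration sps = new SpsConfiguration()
            .withLink(msg -> System.out.println("[BT] " + msg))            // stand-in Bluetooth link
            .withSensor(() -> "head tilt: 12 degrees")                     // stand-in accelerometer
            .withOutput(content -> System.out.println("[display] " + content));
        sps.step();
    }
}
```

In a real SPS, the link would be a GPRS, Bluetooth, or WiFi stack and the modules would wrap actual hardware; the sketch only shows how a use-case-specific configuration could be composed from interchangeable parts.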
The project was awarded the "Landespreis für Innovation" (Upper Austria) in 2009 and the Innovation Award of the National Award for Multimedia and e-Business in 2011.
A. Ferscha and S. Vogl, "Wearable Displays for Everyone!", in IEEE Pervasive Computing, vol. 9, no. 1, pp. 7-10, Jan.-March 2010.
This paper presents the development of Spectacles, a modular, autonomous, wireless display platform for integration with eyeglass frames. The Spectacles hardware-software platform includes local computation and communication facilities, an integrated power supply, and modular system building blocks such as sensors, voice-to-text and text-to-speech components, localization and positioning units, and micro-camera and micro-display units. Developers can easily assemble these system building blocks into application-specific instances of wearable displays.