Tutor: Marcus Handte
More and more mobile devices are equipped with a broad variety of sensors such as gyroscopes, accelerometers, video cameras, and microphones. To provide better task support, future applications will have to use these sensors to determine the state of their environment unobtrusively. As a simple example, consider smartphones that deactivate the touch screen when they detect that the user is holding them close to the ear.
In more complicated scenarios, this requires applications to “make sense” of the multitude of inputs received at any point in time. To simplify this task, the participants will develop a software system that is able to automatically categorize different situations. This system will be developed for and tested on Android-based mobile phones, which will be handed out to the participants. Since Android is used as the operating system, participants should be able to program in Java.
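To give a rough idea of what "categorizing situations" from sensor input can look like, the following is a minimal sketch in plain Java (not Android-specific; on a phone the samples would come from the `SensorManager` API). All class names, situation labels, and prototype values here are illustrative assumptions, not part of the course material: the sketch assigns a label by comparing the mean accelerometer magnitude of a window of samples to hand-picked prototypes.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of situation categorization: a nearest-centroid
// classifier over a single feature, the mean magnitude of 3-axis
// accelerometer samples. Labels and prototype values are illustrative.
public class SituationClassifier {

    // Example "situation" prototypes: typical mean acceleration magnitude (m/s^2).
    private static final Map<String, Double> PROTOTYPES = new LinkedHashMap<>();
    static {
        PROTOTYPES.put("stationary", 9.81); // device at rest: gravity only
        PROTOTYPES.put("walking", 11.5);    // moderate extra acceleration
        PROTOTYPES.put("running", 15.0);    // strong extra acceleration
    }

    // Euclidean magnitude of one 3-axis accelerometer sample.
    static double magnitude(double[] xyz) {
        return Math.sqrt(xyz[0] * xyz[0] + xyz[1] * xyz[1] + xyz[2] * xyz[2]);
    }

    // Assign the situation whose prototype is closest to the mean
    // magnitude of the given window of samples.
    static String classify(double[][] samples) {
        double mean = 0;
        for (double[] s : samples) mean += magnitude(s);
        mean /= samples.length;
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<String, Double> e : PROTOTYPES.entrySet()) {
            double d = Math.abs(mean - e.getValue());
            if (d < bestDist) {
                bestDist = d;
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // A window of two samples from a device lying still on a table.
        double[][] atRest = { {0.1, 0.2, 9.8}, {0.0, 0.1, 9.8} };
        System.out.println(classify(atRest)); // prints "stationary"
    }
}
```

A real solution would of course use richer features (variance, frequency content, multiple sensors) and a learned classifier rather than fixed prototypes, but the pipeline shape is the same: sample, extract features, map to a situation label.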
The project is suitable for students at both the bachelor and the master level. However, the course contents and the requirements for passing differ depending on the level. As a consequence, it is not possible to create mixed teams. If you are not sure whether you fulfill the requirements, or if you want to participate in the project, please send an email to email@example.com.
The kickoff meeting for this project takes place on October 19, 2010, from 9:00 to 10:30 in room BC 504. Participation in this meeting is mandatory.