Tutor: Marcus Handte
More and more mobile devices are equipped with a broad variety of sensors such as gyroscopes, accelerometers, video cameras, and microphones. To provide better task support, future applications will have to use these sensors to determine the state of their environment unobtrusively. As a simple example, consider smartphones that deactivate the touch screen when they detect that the user is holding them close to the ear.
In more complicated scenarios, this requires applications to “make sense” of the multitude of inputs received at any point in time. To simplify this task, the participants will develop a software system that is able to automatically categorize different situations. This system will be developed for and tested on Android-based mobile phones, which will be given out to the participants. Since Android is used as the operating system, participants should be able to program in Java.
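To illustrate the kind of categorization such a system performs, the following is a minimal sketch in Java (hypothetical, not the project's actual code): a nearest-prototype classifier that maps a sensor feature vector, e.g. derived from accelerometer readings, to a named situation. The class name, situation labels, and feature choices are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: categorize a situation by comparing a sensor
// feature vector against stored prototype vectors (nearest prototype
// by squared Euclidean distance).
public class SituationClassifier {

    // Maps situation name -> prototype feature vector (illustrative).
    private final Map<String, double[]> prototypes = new HashMap<>();

    // Store a labeled prototype feature vector for a situation.
    public void learn(String situation, double[] features) {
        prototypes.put(situation, features);
    }

    // Return the situation whose prototype is closest to the input.
    public String classify(double[] features) {
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<String, double[]> e : prototypes.entrySet()) {
            double[] proto = e.getValue();
            double dist = 0.0;
            for (int i = 0; i < features.length; i++) {
                double diff = features[i] - proto[i];
                dist += diff * diff;
            }
            if (dist < bestDist) {
                bestDist = dist;
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        SituationClassifier classifier = new SituationClassifier();
        // Hypothetical features: [mean acceleration magnitude, variance].
        classifier.learn("phone resting on table", new double[] {9.81, 0.01});
        classifier.learn("user walking", new double[] {10.5, 4.0});
        System.out.println(classifier.classify(new double[] {10.4, 3.6}));
    }
}
```

On an actual Android device, the feature vectors would be computed from live readings delivered by the platform's sensor framework rather than hard-coded as above.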
If you are not sure whether you fulfill the requirements or if you want to participate in the project, please send an email to firstname.lastname@example.org.
The kickoff meeting for this project takes place on Thursday, April 15th, from 11:00 to 12:00 in room BC504. Participation in this meeting is mandatory for everyone.