Hi there!
I’m a second-year PhD student in Computer Science at the University of St. Gallen in Switzerland, in the lab for Interaction- and Communication-based Systems.
My research combines the following areas:
- Mixed Reality
- Ubiquitous Computing
- Personalization
- Privacy
- Internet of Things
- Computer Vision
- Technology Acceptance
For updates on what I’m doing, have a look at the Publications by my colleagues and me,
follow me on the Fediverse: https://hci.social/@jannis,
or contact me via email: jannisrene.strecker@unisg.ch. 😀
📑 Recent Publications
ShoppingCoach: Using Diminished Reality to Prevent Unhealthy Food Choices in an Offline Supermarket Scenario
In
Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24)
Date
May 11, 2024
Authors
Jannis Strecker, Jing Wu, Kenan Bektaş, Conrad Vaslin, and Simon Mayer
Abstract
Non-communicable diseases, such as obesity and diabetes, have a significant global impact on health outcomes. While governments worldwide focus on promoting healthy eating, individuals still struggle to follow dietary recommendations. Augmented Reality (AR) might be a useful tool to emphasize specific food products at the point of purchase. However, AR may also add visual clutter to an already complex supermarket environment. Instead, reducing the visual prevalence of unhealthy food products through Diminished Reality (DR) could be a viable alternative: We present ShoppingCoach, a DR prototype that identifies supermarket food products and visually diminishes them dependent on the deviation of the target product’s composition from dietary recommendations. In a study with 12 participants, we found that ShoppingCoach increased compliance with dietary recommendations from 75% to 100% and reduced decision time by 41%. These results demonstrate the promising potential of DR in promoting healthier food choices and thus enhancing public health.
Jannis Strecker, Jing Wu, Kenan Bektaş, Conrad Vaslin, and Simon Mayer. 2024. ShoppingCoach: Using Diminished Reality to Prevent Unhealthy Food Choices in an Offline Supermarket Scenario. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3613905.3650795
BibTeX Reference
@inproceedings{strecker2024,
  title     = {{{ShoppingCoach}}: {{Using Diminished Reality}} to {{Prevent Unhealthy Food Choices}} in an {{Offline Supermarket Scenario}}},
  booktitle = {Extended {{Abstracts}} of the {{CHI Conference}} on {{Human Factors}} in {{Computing Systems}} ({{CHI EA}} '24)},
  author    = {Strecker, Jannis and Wu, Jing and Bekta{\c s}, Kenan and Vaslin, Conrad and Mayer, Simon},
  year      = {2024},
  langid    = {english},
  doi       = {10.1145/3613905.3650795},
  publisher = {ACM},
  location  = {Honolulu, HI, USA},
  series    = {CHI EA '24}
}
QR Code Integrity by Design
In
Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24)
Date
May 11, 2024
Authors
Luka Bekavac, Simon Mayer, and Jannis Strecker
Abstract
As QR codes become ubiquitous in various applications and places, their susceptibility to tampering, known as quishing, poses a significant threat to user security. In this paper we introduce SafeQR codes that address this challenge by introducing innovative design strategies to enhance QR code security. Leveraging visual elements and secure design principles, the project aims to make tampering more noticeable, thereby empowering users to recognize and avoid potential phishing threats. Further, we highlight the limitations of current user-education methods in combating quishing and propose different attacker models tailored to address quishing attacks. In addition, we introduce a multi-faceted defense strategy that merges design innovation with user vigilance. Through a user study, we demonstrate the efficacy of ‘Integrity by Design’ QR codes. These innovatively designed QR codes significantly raise user suspicion in case of tampering and effectively reduce the likelihood of successful quishing attacks.
Luka Bekavac, Simon Mayer, and Jannis Strecker. 2024. QR Code Integrity by Design. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 9 pages. https://doi.org/10.1145/3613905.3651006
BibTeX Reference
@inproceedings{bekavac2024,
  title     = {{QR Code Integrity by Design}},
  booktitle = {Extended {{Abstracts}} of the {{CHI Conference}} on {{Human Factors}} in {{Computing Systems}} ({{CHI EA}} '24)},
  author    = {Bekavac, Luka and Mayer, Simon and Strecker, Jannis},
  year      = {2024},
  langid    = {english},
  doi       = {10.1145/3613905.3651006},
  publisher = {ACM},
  location  = {Honolulu, HI, USA},
  series    = {CHI EA '24}
}
GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work
In
Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24)
Date
May 11, 2024
Authors
Kenan Bektaş, Adrian Pandjaitan, Jannis Strecker, and Simon Mayer
Abstract
Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among the collaborators, whereas there is limited research on using non-verbal cues such as gaze or head direction alongside their main communication channel. Our system – GlassBoARd – permits collaborators to see each other’s gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent shared Augmented Reality interface that is situated in-between two users, allowing face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar that is located behind the GlassBoARd and whose eye movements are contingent on the remote collaborator’s instant eye movements. In three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd facilitates an environment for conducting future user experiments to study the effect of sharing eye gaze on the communication bandwidth.
Kenan Bektaş, Adrian Pandjaitan, Jannis Strecker, and Simon Mayer. 2024. GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3613905.3650965
BibTeX Reference
@inproceedings{bektas2024,
  title     = {{GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work}},
  booktitle = {Extended {{Abstracts}} of the {{CHI Conference}} on {{Human Factors}} in {{Computing Systems}} ({{CHI EA}} '24)},
  author    = {Bekta\c{s}, Kenan and Pandjaitan, Adrian and Strecker, Jannis and Mayer, Simon},
  year      = {2024},
  langid    = {english},
  doi       = {10.1145/3613905.3650965},
  publisher = {ACM},
  location  = {Honolulu, HI, USA},
  series    = {CHI EA '24}
}