Comparing visualizations and text on smartwatches while running in a realistic environment

dc.contributor.author: Kashanj, Sarina
dc.contributor.supervisor: Perin, Charles
dc.date.accessioned: 2024-11-29T00:29:44Z
dc.date.available: 2024-11-29T00:29:44Z
dc.date.issued: 2024
dc.degree.department: Department of Computer Science
dc.degree.level: Master of Science (MSc)
dc.description.abstract: In today’s digital age, smartwatches have become popular tools for millions of runners, providing metrics such as pace, heart rate, and distance. Despite the widespread use of text-based displays on these devices, research suggests that visualizations could offer a more effective alternative. However, little is known about how visualizations perform in real-world running scenarios. This study addresses this gap by investigating how visualizations compare to text in aiding runners’ performance and experience. In a study in which 20 runners completed running tasks on an outdoor track, we found that visualizations significantly outperformed text, with participants completing tasks 1.5 to 8 times faster. Moreover, participants expressed a strong preference for visualizations and indicated a willingness to use them during their runs if available on their smartwatches. These findings highlight the potential of visualizations to enhance the usability and effectiveness of smartwatches for runners.
dc.description.scholarlevel: Graduate
dc.identifier.uri: https://hdl.handle.net/1828/20809
dc.language: English
dc.language.iso: en
dc.rights: Available to the World Wide Web
dc.subject: visualization
dc.subject: wearables
dc.subject: smartwatch
dc.subject: HCI
dc.title: Comparing visualizations and text on smartwatches while running in a realistic environment
dc.type: Thesis

Files

Original bundle
Name: Master_Thesis_Sarina_Kashanj__2024-3.pdf
Size: 9.29 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.62 KB
Format: Item-specific license agreed upon at submission