Improving Typing Experiences Through the Use of a Keyboard Interface With Integrated Gesture Recognition

dc.contributor.author: Norrie, Samantha
dc.date.accessioned: 2023-03-17T13:05:15Z
dc.date.available: 2023-03-17T13:05:15Z
dc.date.copyright: 2023
dc.date.issued: 2023-03-17
dc.description.abstract: The current keyboard typing experience is inefficient because users must rely on external pointing devices such as mice or trackpads. The Keyboard Interface With Integrated Gesture Recognition (KIWIGR) addresses this by mapping common word-processing actions to keyboard gestures. The gestures explored in this research project support document navigation and text highlighting. A sensor tablet and a machine learning model were used to implement these gestures in a keyboard-like system. The current KIWIGR prototype uses image amplification, image combination, and the aforementioned machine learning model to predict keyboard gestures.
dc.description.reviewstatus: Reviewed
dc.description.scholarlevel: Undergraduate
dc.description.sponsorship: Jamie Cassels Undergraduate Research Awards (JCURA)
dc.identifier.uri: http://hdl.handle.net/1828/14835
dc.language.iso: en
dc.title: Improving Typing Experiences Through the Use of a Keyboard Interface With Integrated Gesture Recognition
dc.type: Poster
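
The abstract above describes a recognition pipeline: sensor-tablet images are amplified, combined, and then classified by a machine learning model. The poster itself includes no code, so the following is only a minimal, hypothetical sketch of how such a pipeline might be structured. It assumes small grayscale pressure images from the sensor tablet, and it substitutes a simple nearest-centroid classifier for the actual trained model; every function name, parameter, and the `gain` value are illustrative, not taken from the source.

```python
import numpy as np


def amplify(frame, baseline, gain=4.0):
    """Hypothetical 'image amplification' step: exaggerate per-pixel
    deviation from a resting baseline frame so subtle pressure changes
    become visible to the classifier."""
    return np.clip(baseline + gain * (frame.astype(float) - baseline), 0, 255)


def combine(frames):
    """Hypothetical 'image combination' step: collapse a short window of
    amplified frames into one image via the per-pixel maximum."""
    return np.max(np.stack(frames), axis=0)


class NearestCentroidGestureModel:
    """Stand-in for the trained model: a nearest-centroid classifier
    over flattened combined images."""

    def fit(self, images, labels):
        self.labels = sorted(set(labels))
        self.centroids = {
            lab: np.mean(
                [img.ravel() for img, l in zip(images, labels) if l == lab],
                axis=0,
            )
            for lab in self.labels
        }
        return self

    def predict(self, image):
        # Return the label whose centroid is closest in pixel space.
        v = image.ravel()
        return min(self.labels, key=lambda lab: np.linalg.norm(v - self.centroids[lab]))
```

In use, each captured window of tablet frames would be amplified against a baseline, combined into a single image, and passed to `predict`; the real project presumably trains a more capable model than this nearest-centroid stand-in.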

Files

Original bundle
Name: Samantha Norrie-JCURAposter-2023.pdf
Size: 10.04 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2 KB
Description: Item-specific license agreed upon to submission