SENSEable Shoes – Hands-free and Eyes-free Mobile Interaction

SENSEable Shoes is a hands-free and eyes-free foot-computer interface that supports on-the-go interaction with the surrounding environment. We recognize different low-level activities by measuring the user's continuous weight distribution over the feet with twelve Force Sensing Resistor (FSR) sensors embedded in the insoles of the shoes. Using the sensor data as input, a Support Vector Machine (SVM) classifier identifies up to eighteen mobile activities and four-directional foot-control gestures at approximately 98% accuracy. By understanding the user's present activities and foot gestures, the system offers a nonintrusive and always-available input method. We present the design and implementation of our system and several proof-of-concept applications.

A person’s weight is not distributed evenly over the plantar surface. Because the sole is not flat but arched, the weight centers mainly on the hallux, the first metatarsal and the calcaneus. When sitting, the weight of a person’s upper body rests mostly on the chair and the load on the feet is relatively small. When standing, the whole body’s weight is carried roughly evenly by both feet. Leaning left or right shifts the weight distribution between the feet. When walking, the weight distribution changes with the pace; the load on the front and rear parts of the foot alternately increases and decreases because not all parts of the sole contact the ground at once. These changes in weight distribution reflect one’s activity, and different activities leave different weight-distribution signatures.
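
To make this concrete, here is a small illustrative sketch (not part of the original system) that computes one such weight-distribution feature, the center of pressure along the length of a foot, from a set of FSR readings; the sensor positions and pressure values are invented for the example.

    # Hypothetical example: center of pressure along one foot, computed from
    # six FSR readings. Sensor positions (0.0 = heel, 1.0 = toes) and the
    # pressure values below are invented for illustration.

    def center_of_pressure(readings, positions):
        """Average sensor position weighted by the pressure at each sensor."""
        total = sum(readings)
        if total == 0:
            return None  # the foot is off the ground
        return sum(r * p for r, p in zip(readings, positions)) / total

    positions = [0.05, 0.20, 0.50, 0.75, 0.85, 0.95]  # heel to toes
    standing = [400, 350, 120, 300, 380, 260]         # weight spread over the sole
    leaning = [50, 60, 150, 420, 500, 480]            # weight shifted to the forefoot

    print(center_of_pressure(standing, positions))    # around the middle of the foot
    print(center_of_pressure(leaning, positions))     # closer to the toes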

We observed people’s common low-level activities in a mobile context and classified them as static or dynamic. Static activities include sitting (with variations: sitting straight, stretching out and shaking the legs) and standing (with variations: standing straight, leaning to the left or right, swinging and slouching); dynamic activities include walking (slow and fast, walking backward, carrying a bag, turning left and right), running, jumping and climbing stairs (up and down). Some activities are similar, such as sitting straight and stretching out, but distinguishing these minor differences tells us more about a user’s status in a mobile environment. For instance, sitting straight could imply the user is in a relatively formal setting, while stretching out may suggest a relaxed one.
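
For concreteness, these classes can be written out as a label set, shown below; the grouping follows the description above, but the exact class names used in the project’s code are our assumption.

    # Label set inferred from the description above; the exact class names used
    # in the project's code are assumptions made for illustration.
    ACTIVITIES = {
        "static": [
            "sitting straight", "sitting stretched out", "sitting with legs shaking",
            "standing straight", "standing leaning left", "standing leaning right",
            "standing swinging", "standing slouching",
        ],
        "dynamic": [
            "walking slowly", "walking fast", "walking backward", "walking with a bag",
            "turning left", "turning right", "running", "jumping",
            "climbing stairs up", "climbing stairs down",
        ],
    }

    assert sum(len(labels) for labels in ACTIVITIES.values()) == 18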

The hardware platform consists of a pair of ordinary canvas shoes, a pair of 4 mm sponge insoles, six FSRs per shoe (each with a round sensing area 12.7 mm in diameter), two microcontrollers, two wireless transceivers and power supplies. The FSRs are mounted underneath the insole and wired to the microcontroller. The sensor signals are sent to Arduino UNO boards with ATmega328 microcontrollers, one per foot, which are connected to XBee wireless transceivers running the ZigBee protocol. Because of their size, the boards are attached to the tops of the shoes and powered by 9 V alkaline batteries. Each microcontroller uses six analog-to-digital channels to gather raw data from the six FSRs, which represent the pressure distribution under the insole. The data from each sensor are labeled and transmitted to a laptop. A Python program on the PC reads the raw data, uses a Support Vector Machine (SVM) classifier to identify activities, and displays the data on a graphical user interface.
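
The serial format and port names are not given above, so the following is only a hedged sketch of how the laptop side might read one labeled frame per line from the XBee receiver using pySerial; the line format (a foot label followed by six comma-separated readings) and the device name are assumptions.

    import serial  # pySerial; reads from the XBee receiver attached to the laptop

    PORT = "/dev/ttyUSB0"  # assumed device name; on Windows this might be "COM3"
    BAUD = 9600            # assumed baud rate

    def read_frames(port=PORT, baud=BAUD):
        """Yield (foot, readings) tuples, assuming one text line per frame:
        a foot label followed by six comma-separated FSR values."""
        with serial.Serial(port, baud, timeout=1) as link:
            while True:
                line = link.readline().decode("ascii", errors="ignore").strip()
                parts = line.split(",")
                if len(parts) != 7:
                    continue  # skip empty or malformed frames
                try:
                    readings = [int(v) for v in parts[1:]]
                except ValueError:
                    continue
                yield parts[0], readings

    if __name__ == "__main__":
        for foot, readings in read_frames():
            print(foot, readings)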

We implemented the SVM classifier with Python’s Orange API. New samples are passed to the trained classifier to predict activities. Ten-fold cross-validation is used to estimate the accuracy: the dataset is split into ten parts, and each part in turn serves as the test set while the remaining nine form the training set. Our tests achieved about 98% accuracy overall.
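
The project’s own code uses the Orange API; since only the procedure (an SVM evaluated with ten-fold cross-validation) is described here, the sketch below shows the equivalent steps with scikit-learn and placeholder data rather than the actual Orange calls.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Placeholder data: one row per sample, twelve columns (six FSRs per foot).
    # In the real system these rows come from the shoes and carry activity labels.
    X = np.random.rand(200, 12)
    y = np.random.choice(["sitting", "standing", "walking", "running"], size=200)

    classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(classifier, X, y, cv=10)  # ten-fold cross-validation
    print("mean accuracy: %.3f" % scores.mean())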

We also employ the same training procedure to classify simple four-directional foot gestures. These gestures are based on the weight distribution over different parts of the right foot. They require only subtle foot movement and are barely noticeable to a casual observer.
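
How an application consumes these gestures is not described above; one plausible, purely hypothetical pattern is to debounce the classifier’s per-frame predictions so that a gesture fires only after it has been held for several consecutive frames, as in the sketch below.

    from collections import deque

    class GestureDebouncer:
        """Fire a gesture only after it has been predicted for `hold` consecutive frames."""

        def __init__(self, hold=5):
            self.hold = hold
            self.recent = deque(maxlen=hold)
            self.last_fired = None

        def update(self, prediction):
            self.recent.append(prediction)
            stable = len(self.recent) == self.hold and len(set(self.recent)) == 1
            if stable and prediction != "neutral" and prediction != self.last_fired:
                self.last_fired = prediction
                return prediction  # e.g. "left", "right", "front" or "back"
            if prediction == "neutral":
                self.last_fired = None  # re-arm once the foot returns to rest
            return None

    # Feeding per-frame predictions (hypothetical labels) into the debouncer:
    debouncer = GestureDebouncer(hold=5)
    frames = ["neutral", "left", "left", "left", "left", "left", "left", "neutral"]
    for prediction in frames:
        gesture = debouncer.update(prediction)
        if gesture:
            print("gesture:", gesture)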

We also present several proof-of-concept applications: slideshow control, a pedometer, mobile music control, a Snake game, and robot control.
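
As an illustration of how one of these applications could be wired up (the actual implementation is not shown here), a minimal gesture-to-keystroke bridge for the slideshow control might use the pyautogui library; the gesture labels and key mapping are assumptions.

    import pyautogui  # sends keystrokes to the presentation software

    # Hypothetical mapping from the four foot gestures to slideshow keys.
    KEY_FOR_GESTURE = {
        "right": "right",  # next slide
        "left": "left",    # previous slide
        "front": "f5",     # start the slideshow
        "back": "esc",     # leave the slideshow
    }

    def on_gesture(gesture):
        key = KEY_FOR_GESTURE.get(gesture)
        if key is not None:
            pyautogui.press(key)

    # Example: a short stream of recognized gestures drives the slides.
    for gesture in ["front", "right", "right", "left"]:
        on_gesture(gesture)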

The shoes could also be applied to other applications, such as gaming platforms, secret-message senders, and abnormal gait detection.

The methods in Horst Rittel’s “The Reasoning of Designers” were also used to rethink the design problems in this project.

This is an open-source project released under the MIT license:
https://github.com/legenddolphin/SENSEable-Shoes

Team Members
Huaishu Peng – Original Ideas and Design Research
Yen-Chia Hsu – Machine Learning Algorithm and Technical Support

Project Advisor
Prof. Mark D. Gross

