EXGbuds is a wearable device that allows users to control smart-home devices.
EXGbuds is a compact earbud-style headset that turns simple eye movements and facial gestures into actionable commands, letting users interact with surrounding smart devices hands-free.
Not advertised; preorders are possible.
EXGbuds combines customizable hardware and software: biosensors placed above the ears feed machine-learning algorithms that measure a range of physiological signals, allowing the device to be adapted to a variety of user needs.
The team developed its own patented electroencephalogram (EEG) dry-electrode sensors and microscale Bluetooth communication modules.
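The device's actual Bluetooth protocol is not published. As a purely illustrative sketch, the snippet below shows how one window of EEG samples might be packed into a compact little-endian payload small enough for a Bluetooth LE notification; the field layout (sequence number, channel id, eight 16-bit samples) is an assumption, not the product's real format.

```python
import struct

# Hypothetical frame layout (an assumption for illustration only):
#   uint16 sequence number, uint8 channel id, 8 x int16 samples.
# "<" forces little-endian with no padding, so the payload is 19 bytes.
FRAME_FMT = "<HB8h"

def pack_frame(seq: int, channel: int, samples: list) -> bytes:
    """Pack one window of 8 signed 16-bit samples into a byte payload."""
    if len(samples) != 8:
        raise ValueError("expected exactly 8 samples per frame")
    return struct.pack(FRAME_FMT, seq, channel, *samples)

def unpack_frame(payload: bytes):
    """Recover (sequence, channel, samples) from a packed frame."""
    seq, channel, *samples = struct.unpack(FRAME_FMT, payload)
    return seq, channel, list(samples)

frame = pack_frame(42, 1, [10, -20, 30, -40, 50, -60, 70, -80])
print(len(frame))           # 19 bytes: 2 + 1 + 8*2
print(unpack_frame(frame))
```

A 19-byte payload fits comfortably inside the 20-byte attribute value allowed at the default BLE ATT MTU, which is why compact fixed-width layouts like this are common on small wearables.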
Available directly from the manufacturer.
By arranging sensors and electronic modules in a compact and ergonomic way, EXGbuds offers a human-centered product design. Users can customize sensor placement to measure different physiological signals.
Assist people with disabilities and improve productivity with augmented sensing.
The machine-learning algorithm classifies eye and facial gestures with above 95% accuracy.
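The patented classification pipeline itself is not public. As a minimal stand-in sketch, the snippet below classifies synthetic signal windows with a nearest-centroid rule over two simple window features (mean and variance); the signal model, feature set, and gesture labels ("blink", "rest") are all assumptions for illustration, not the product's method.

```python
import math
import random

def features(window):
    """Mean and variance of one signal window."""
    m = sum(window) / len(window)
    v = sum((x - m) ** 2 for x in window) / len(window)
    return (m, v)

def train_centroids(labelled_windows):
    """Average the feature vectors per gesture label."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl])
            for lbl, s in sums.items()}

def classify(centroids, window):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], f))

# Synthetic data: "blink" windows contain a large spike, "rest" windows do not.
rng = random.Random(0)
def blink(): return [rng.gauss(0, 1) + (200 if 10 <= i < 15 else 0) for i in range(50)]
def rest():  return [rng.gauss(0, 1) for _ in range(50)]

train = [("blink", blink()) for _ in range(20)] + [("rest", rest()) for _ in range(20)]
model = train_centroids(train)
print(classify(model, blink()))  # blink
print(classify(model, rest()))   # rest
```

On cleanly separated synthetic data like this, even a trivial classifier is near-perfect; the product's reported 95%+ accuracy on real eye and facial gestures reflects a far harder problem and a more sophisticated model.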
Wang, K. J., Tung, H. W., Huang, Z., Thakur, P., Mao, Z. H. and You, M. X., 2018, EXGbuds: Universal Wearable Assistive Device for Disabled People to Interact with the Environment Seamlessly, Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, pp. 369-370.
Wang, K. J., Liu, Q., Zhao, Y., Zheng, C. Y., Vhasure, S., Liu, Q. and Mao, Z. H., 2018, Intelligent Wearable Virtual Reality (VR) Gaming Controller for People with Motor Disabilities, 2018 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), pp. 161-164.
Wang, K. J., You, K., Chen, F., Thakur, P., Urich, M., Vhasure, S. and Mao, Z. H., 2018, Development of Seamless Telepresence Robot Control Methods to Interact with the Environment Using Physiological Signals, Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, p. 44.
Wang, K. J., Zhang, A., You, K., Chen, F., Liu, Q., Liu, Y. and Mao, Z. H., 2018, Ergonomic and Human-Centered Design of Wearable Gaming Controller Using Eye Movements and Facial Expressions, 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), pp. 1-5.
Wang, K. J., Liu, Q., Vhasure, S., Liu, Q., Zheng, C. Y. and Thakur, P., 2018, EXG Wearable Human-Machine Interface for Natural Multimodal Interaction in VR Environment, Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, p. 49.
Academic research: several papers evaluate the accuracy of the machine-learning algorithm that classifies the user's eye and facial gestures into device commands.