Created on November 17, 2019 · Updated on November 17, 2019

EXGbuds

Upcoming Update

EXGbuds is a wearable device that allows users to control smart-home devices.

Developed By

Unknown

Content Partners

Unknown

Product Description

EXGbuds is a compact headset/earbud that generates actionable commands from simple eye movements and facial gestures. It allows users to interact with surrounding smart devices hands-free.

Distributors / Implementing Organizations

Manufacturing/Building Method

EXGbuds consists of customizable hardware and software: biosensors placed above the ears, combined with machine learning algorithms, measure various physiological signals to meet a range of user needs. The team developed its own patented electroencephalogram (EEG) dry-electrode sensors and microscale Bluetooth communication modules.
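As a loose illustration only (not EXGbuds' actual patented algorithm, which is not published here), a gesture classifier over windowed biosignal features might follow a nearest-centroid pattern: extract simple features from each signal window, average them per gesture during training, then assign new windows to the closest centroid. The feature choices and gesture labels below are hypothetical.

```python
import math

def features(window):
    """Simple per-window features: mean amplitude and variance."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return (mean, var)

def train_centroids(labeled_windows):
    """Average the feature vectors of each gesture's training windows."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: tuple(v / counts[lbl] for v in s) for lbl, s in sums.items()}

def classify(window, centroids):
    """Assign the gesture whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lbl: math.dist(f, centroids[lbl]))
```

For example, training on a few spiky "blink" windows and flat "rest" windows separates the two gestures cleanly in this feature space; a production system would use richer features and a trained model.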

Intellectual Property Type

Trademarked

User Provision Model

Directly from manufacturer

Distributions to Date Status

Unknown

Design Specifications

Sensors and electronic modules are arranged in a compact, ergonomic, human-centered design. Users can customize sensor placement to measure different physiological signals.

Technical Support

From manufacturer

Replacement Components

None

Lifecycle

Unknown

Manufacturer Specified Performance Parameters

Assists people with disabilities and improves productivity through augmented sensing.

Vetted Performance Status

The developed machine learning algorithm classifies eye and facial gestures with above 95% accuracy.
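For context, a classification accuracy figure like the one above is simply the fraction of gesture windows labeled correctly against ground truth. A minimal evaluation helper (with hypothetical gesture labels) could be:

```python
def accuracy(predicted, actual):
    """Fraction of predictions matching ground-truth labels."""
    if len(predicted) != len(actual):
        raise ValueError("label lists must be the same length")
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)
```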

Safety

N/A

Complementary Technical Systems

None

Academic Research and References

Wang, K. J., Tung, H. W., Huang, Z., Thakur, P., Mao, Z. H. and You, M. X., 2018, EXGbuds: Universal Wearable Assistive Device for Disabled People to Interact with the Environment Seamlessly, Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, pp. 369-370.

Wang, K. J., Liu, Q., Zhao, Y., Zheng, C. Y., Vhasure, S., Liu, Q. and Mao, Z. H., 2018, Intelligent Wearable Virtual Reality (VR) Gaming Controller for People with Motor Disabilities, 2018 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), pp. 161-164.

Wang, K. J., You, K., Chen, F., Thakur, P., Urich, M., Vhasure, S. and Mao, Z. H., 2018, Development of Seamless Telepresence Robot Control Methods to Interact with the Environment Using Physiological Signals, Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, p. 44.

Wang, K. J., Zhang, A., You, K., Chen, F., Liu, Q., Liu, Y. and Mao, Z. H., 2018, Ergonomic and Human-Centered Design of Wearable Gaming Controller Using Eye Movements and Facial Expressions, 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), pp. 1-5.

Wang, K. J., Liu, Q., Vhasure, S., Liu, Q., Zheng, C. Y. and Thakur, P., 2018, EXG Wearable Human-Machine Interface for Natural Multimodal Interaction in VR Environment, Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, p. 49.

Compliance with regulations

Unknown

Other Information

None

