Augmentative and Alternative Communication (AAC) Advances: A Review of Configurations for Individuals with a Speech Disability

Yasmin Elsahar, Sijung Hu, Kaddour Bouazza-Marouf, David Kerr, Annysa Mansor

Abstract

High-tech augmentative and alternative communication (AAC) methods continue to advance; however, the interaction between the user and the assistive technology still falls short of an optimal user experience centered around the desired activity. This review presents a range of signal sensing and acquisition methods utilized in conjunction with existing high-tech AAC platforms for individuals with a speech disability, including imaging methods, touch-enabled systems, mechanical and electro-mechanical access, breath-activated methods, and brain-computer interfaces (BCI). The listed AAC sensing modalities are compared in terms of ease of access, affordability, complexity, portability, and typical conversational speeds. An examination of the associated AAC signal processing, encoding, and retrieval highlights the roles of machine learning (ML) and deep learning (DL) in the development of intelligent AAC solutions. The demands and the cost of most systems hinder the wider adoption of high-tech AAC. Further research is needed to develop intelligent AAC applications that reduce the associated costs and enhance the portability of the solutions for real user environments. The consolidation of natural language processing with current solutions also needs to be explored further to improve conversational speeds. Recommendations for prospective advances in high-tech AAC are addressed in terms of developments to support mobile health communicative applications.

Keywords: assistive technologies; augmentative and alternative communication; machine learning; mobile health; sensing modalities; signal processing; speech disability; voice communication.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
The four components of the Human Activity Assistive Technology (HAAT) model presented in [4]. The interaction between the human and the assistive technology (AT) is emphasized to highlight the relationship between the needs of the AAC users and the elements of development of high-tech solutions discussed in this review.
Figure 2
Components of a typical eye gaze system, adapted from [22,38]. The optical and the visual axes are used for the calibration process commonly required to set up the eye gaze system [22,39].
Figure 3
(a) A sample visual scanning interface activated via switch scanning. The yellow box moves vertically across the lines until a selection is made, followed by a gliding green box moving horizontally across the highlighted line until a letter is also selected. In (b), two scanning button switches are displayed.
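The row-column scanning described in the caption can be sketched as a small state machine, assuming a single switch that advances the highlight and a second press that selects; the grid contents and event names here are illustrative, not taken from the reviewed systems.

```python
# Illustrative row-column scanning over an on-screen letter grid.
# A "next" event advances the highlight; "select" commits it.

GRID = [
    ["A", "B", "C", "D", "E", "F"],
    ["G", "H", "I", "J", "K", "L"],
    ["M", "N", "O", "P", "Q", "R"],
    ["S", "T", "U", "V", "W", "X"],
    ["Y", "Z", "_", ".", ",", "?"],
]

def scan(events):
    """Consume a stream of 'next'/'select' switch events; return the chosen letter.

    Phase 1: the highlight moves down the rows (the yellow box).
    Phase 2: after a row is selected, it moves across that row (the green box).
    """
    row = col = 0
    row_locked = False
    for ev in events:
        if ev == "next":
            if not row_locked:
                row = (row + 1) % len(GRID)       # vertical scan over rows
            else:
                col = (col + 1) % len(GRID[row])  # horizontal scan within the row
        elif ev == "select":
            if not row_locked:
                row_locked = True                 # lock the highlighted row
            else:
                return GRID[row][col]             # commit the letter
    return None

# Two "next" presses reach row 3, a select locks it, one "next" and a select pick "N".
print(scan(["next", "next", "select", "next", "select"]))  # → N
```

Conversational speed with this access method is bounded by the scan interval, which is why the review compares modalities on typical conversational speeds.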
Figure 4
Examples of (a) a dedicated touch-based device and (b) a non-dedicated smart device running an AAC application (APP), usually with predictive language model and speech generation capabilities.
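The predictive language model feature mentioned in the caption can be illustrated by a minimal prefix-based word predictor; the vocabulary and frequency counts below are invented for the example and do not come from any reviewed AAC application.

```python
# Illustrative prefix-based word prediction of the kind an AAC APP uses to
# raise conversational speed. VOCAB maps words to (made-up) usage counts.

VOCAB = {"hello": 120, "help": 300, "hungry": 80, "home": 150, "how": 200}

def predict(prefix, k=3):
    """Return the k most frequent vocabulary words starting with `prefix`."""
    matches = [w for w in VOCAB if w.startswith(prefix.lower())]
    return sorted(matches, key=lambda w: -VOCAB[w])[:k]

print(predict("h"))   # → ['help', 'how', 'home']
print(predict("he"))  # → ['help', 'hello']
```

Real AAC apps extend this idea with n-gram or neural language models so that predictions depend on the sentence context, not just the typed prefix.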
Figure 5
Examples of (a) discrete breath encoding, where soft and heavy breathing blows are recorded to encode combinations of zeros and ones, or Morse codes, representing the intended messages, and (b) continuous breath encoding, where the speed, amplitude, and phase of breathing are modulated to create patterns representing the intended message.
Figure 6
Examples of (a) the training mode and (b) the live mode of continuous breath encoding, used to store and retrieve breathing patterns linked to user phrases via a mobile APP.
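The training/live workflow shown in Figure 6 amounts to storing labelled breathing signals and matching an incoming pattern against them. A minimal sketch follows, using cosine similarity between resampled signals as the matching criterion; this similarity measure, the class names, and the threshold are illustrative choices, not necessarily the published method.

```python
# Minimal sketch of continuous breath-pattern storage (training mode) and
# retrieval (live mode). Similarity is cosine similarity between
# fixed-length resampled signals -- an illustrative choice.

def resample(signal, n=32):
    """Linearly index-resample a breathing signal to n points for comparison."""
    m = len(signal)
    return [signal[min(int(i * m / n), m - 1)] for i in range(n)]

def cosine(a, b):
    """Cosine similarity between two equal-length sequences."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

class BreathStore:
    def __init__(self):
        self.patterns = []                      # (template, phrase) pairs

    def train(self, signal, phrase):            # training mode: store pattern
        self.patterns.append((resample(signal), phrase))

    def retrieve(self, signal, min_sim=0.9):    # live mode: best match wins
        probe = resample(signal)
        best = max(self.patterns, key=lambda p: cosine(probe, p[0]))
        return best[1] if cosine(probe, best[0]) >= min_sim else None

store = BreathStore()
store.train([0, 1, 0, 1, 0], "yes")
store.train([1, 1, 1, 0, 0], "help")
print(store.retrieve([0, 1, 0, 1, 0]))  # → yes
```

Production systems would replace the similarity measure with something robust to tempo variation (e.g., dynamic time warping or a learned classifier), which is where the ML/DL methods discussed in the review come in.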
Figure 7
The components and flow diagram of a Brain–Computer Interface (BCI) system, adapted from [66,67].

References

    1. García-Méndez S., Fernández-Gavilanes M., Costa-Montenegro E., Juncal-Martínez J., Javier González-Castaño F. Automatic natural language generation applied to alternative and augmentative communication for online video content services using simple NLG for Spanish; Proceedings of the 15th Web for All Conference: Internet of Accessible Things; Lyon, France. 23–27 April 2018.
    2. Kerr D., Bouazza-Marouf K., Gaur A., Sutton A., Green R. A breath controlled AAC system; Proceedings of the CM2016 National AAC Conference; Orlando, FL, USA. 19–22 April 2016; pp. 11–13.
    3. Schultz Ascari R.E.O., Pereira R., Silva L. Mobile Interaction for Augmentative and Alternative Communication: A Systematic Mapping. SBC J. Interact. Syst. 2018;9:105–118.
    4. Cook A.M., Polgar J.M. Assistive Technologies: Principles and Practice. 4th ed. Elsevier; New York, NY, USA: 2015.
    5. Smith A. Speech motor development: Integrating muscles, movements, and linguistic units. J. Commun. Disord. 2006;39:331–349. doi: 10.1016/j.jcomdis.2006.06.017.
    6. van de Sandt-Koenderman M.W.M.E. High-tech AAC and aphasia: Widening horizons? Aphasiology. 2004;18:245–263. doi: 10.1080/02687030344000571.
    7. Light J., McNaughton D. The Changing Face of Augmentative and Alternative Communication: Past, Present, and Future Challenges. Augment. Altern. Commun. 2012;28:197–204. doi: 10.3109/07434618.2012.737024.
    8. Hodge S. Why is the potential of augmentative and alternative communication not being realized? Exploring the experiences of people who use communication aids. Disabil. Soc. 2007;22:457–471. doi: 10.1080/09687590701427552.
    9. Mirenda P. Toward Functional Augmentative and Alternative Communication for Students With Autism. Lang. Speech Hear. Serv. Sch. 2003;34:203–216. doi: 10.1044/0161-1461(2003/017).
    10. National Academies of Sciences, Engineering, and Medicine. The Promise of Assistive Technology to Enhance Activity and Work Participation. The National Academies Press; Washington, DC, USA: 2017. Augmentative and Alternative Communication and Voice Products and Technologies; pp. 209–310.
    11. Smith E., Delargy M. Locked-in syndrome. Br. Med. J. 2005;330:406–409. doi: 10.1136/bmj.330.7488.406.
    12. Simion E. Augmentative and Alternative Communication—Support for People with Severe Speech Disorders. Procedia-Soc. Behav. Sci. 2014;128:77–81. doi: 10.1016/j.sbspro.2014.03.121.
    13. Arthanat S., Bauer S.M., Lenker J.A., Nochajski S.M., Wu Y.W.B. Conceptualization and measurement of assistive technology usability. Disabil. Rehabil. Assist. Technol. 2007;2:235–248. doi: 10.1080/17483100701343665.
    14. Giesbrecht E. Application of the human activity assistive technology model for occupational therapy research. Aust. Occup. Ther. J. 2013;60:230–240. doi: 10.1111/1440-1630.12054.
    15. Iacono T., Lyon K., Johnson H., West D. Experiences of adults with complex communication needs receiving and using low tech AAC: An Australian context. Disabil. Rehabil. Assist. Technol. 2013;8:392–401. doi: 10.3109/17483107.2013.769122.
    16. McNaughton D., Light J. The iPad and mobile technology revolution: Benefits and challenges for individuals who require augmentative and alternative communication. AAC Augment. Altern. Commun. 2013;29:107–116. doi: 10.3109/07434618.2013.784930.
    17. Shane H.C., Blackstone S., Vanderheiden G., Williams M., Deruyter F. Using AAC technology to access the world. Assist. Technol. 2012;24:3–13. doi: 10.1080/10400435.2011.648716.
    18. Baxter S., Enderby P., Evans P., Judge S. Barriers and facilitators to the use of high-technology augmentative and alternative communication devices: A systematic review and qualitative synthesis. Int. J. Lang. Commun. Disord. 2012;47:115–129. doi: 10.1111/j.1460-6984.2011.00090.x.
    19. Glennen S.L. The Handbook of Augmentative and Alternative Communication. Cengage Learning; Boston, MA, USA: 1997. Augmentative and alternative communication systems; pp. 59–69.
    20. Tobii Dynavox PCEye Plus. [(accessed on 10 February 2019)]; Available online:
    21. Chennamma H.R., Yuan X. A Survey on Eye-Gaze Tracking Techniques. Indian J. Comput. Sci. Eng. 2013;4:388–393.
    22. Kar A., Corcoran P. A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms. IEEE Access. 2017;5:16495–16519. doi: 10.1109/ACCESS.2017.2735633.
    23. Hansen D.W., Ji Q. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. IEEE Trans. Pattern Anal. Mach. Intell. 2010;32:478–500. doi: 10.1109/TPAMI.2009.30.
    24. Townend G.S., Marschik P.B., Smeets E., van de Berg R., van den Berg M., Curfs L.M.G. Eye Gaze Technology as a Form of Augmentative and Alternative Communication for Individuals with Rett Syndrome: Experiences of Families in The Netherlands. J. Dev. Phys. Disabil. 2016;28:101–112. doi: 10.1007/s10882-015-9455-z.
    25. Chen S.-H.K., O’Leary M. Eye Gaze 101: What Speech-Language Pathologists Should Know About Selecting Eye Gaze Augmentative and Alternative Communication Systems. Perspect. ASHA Spec. Interest Groups. 2018;3:24–32. doi: 10.1044/persp3.SIG12.24.
    26. Ball L., Nordness A., Fager S., Kersch K., Mohr B., Pattee G.L., Beukelman D. Eye-Gaze Access to AAC Technology for People with Amyotrophic Lateral Sclerosis. J. Med. Speech. Lang. Pathol. 2010;18:11–23.
    27. Corno F., Farinetti L., Signorile I., Torino P. A cost-effective solution for eye-gaze assistive technology; Proceedings of the IEEE International Conference on Multimedia and Expo; Lausanne, Switzerland. 26–29 August 2002; pp. 433–436.
    28. Majaranta P., Aoki H., Donegan M., Hansen D.W., Hansen J.P. Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies. IGI Publishing; Hershey, PA, USA: 2011.
    29. Bates R., Donegan M., Istance H.O., Hansen J.P., Räihä K.J. Introducing COGAIN: Communication by gaze interaction. Univ. Access Inf. Soc. 2007;6:159–166. doi: 10.1007/s10209-007-0077-9.
    30. Bates R., Istance H., Oosthuizen L., Majaranta P. Survey of De-Facto Standards in Eye Tracking. Information Society Technologies; Tallinn, Estonia: 2005. Communication by Gaze Interaction.
    31. Al-Rahayfeh A., Faezipour M. Eye Tracking and Head Movement Detection: A State-of-Art Survey. IEEE J. Transl. Eng. Health Med. 2013;1:2100212. doi: 10.1109/JTEHM.2013.2289879.
    32. Janthanasub V. Ophapasai: Augmentative and Alternative Communication Based on Video-Oculography Control Interface. Appl. Mech. Mater. 2016;848:60–63. doi: 10.4028/.
    33. Tai K., Blain S., Chau T. A Review of Emerging Access Technologies for Individuals With Severe Motor Impairments. Assist. Technol. 2008;20:204–221. doi: 10.1080/10400435.2008.10131947.
    34. Harezlak K., Kasprowski P. Application of eye tracking in medicine: A survey, research issues and challenges. Comput. Med. Imaging Graph. 2018;65:176–190. doi: 10.1016/j.compmedimag.2017.04.006.
    35. van der Geest J.N., Frens M.A. Recording eye movements with video-oculography and scleral search coils: A direct comparison of two methods. J. Neurosci. Methods. 2002;114:185–195. doi: 10.1016/S0165-0270(01)00527-1.
    36. Robinson D. A Method of Measuring Eye Movement Using a Scleral Search Coil in a Magnetic Field. IEEE Trans. Bio-Med. Electron. 1963;10:137–145.
    37. Tobii Technology. Accuracy and Precision Test Method for Remote Eye Trackers—Test Specification Report. [(accessed on 7 February 2011)]; 2011. Available online:
    38. Farivar R., Michaud-Landry D. Construction and Operation of a High-Speed, High-Precision Eye Tracker for Tight Stimulus Synchronization and Real-Time Gaze Monitoring in Human and Animal Subjects. Front. Syst. Neurosci. 2016;10:1–10. doi: 10.3389/fnsys.2016.00073.
    39. Schwiegerling J.T. Eye Axes and Their Relevance to Alignment of Corneal Refractive Procedures. J. Refract. Surg. 2013;29:515–516. doi: 10.3928/1081597X-20130719-01.
    40. Salvucci D.D., Goldberg J.H. Identifying fixations and saccades in eye-tracking protocols; Proceedings of the Symposium on Eye Tracking Research & Applications; Palm Beach Gardens, FL, USA. 6–8 November 2000; pp. 71–78.
    41. Poole A., Ball L.J. Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects. [(accessed on 1 January 2005)]; Available online:
    42. Kunka B., Kostek B. Non-intrusive infrared-free eye tracking method; Proceedings of the Signal Processing Algorithms, Architectures, Arrangements, and Applications Conference Proceedings (SPA); Poznan, Poland. 24–26 September 2009; pp. 105–109.
    43. Talk To Me Technologies Eyespeak. [(accessed on 9 April 2019)]; Available online:
    44. Alea Technologies IntelliGaze. [(accessed on 9 April 2019)]; Available online:
    45. EagleEyes. [(accessed on 9 April 2019)]; Available online:
    46. MacKenzie I.S., Ashtiani B. BlinkWrite: Efficient text entry using eye blinks. Univ. Access Inf. Soc. 2011;10:69–80. doi: 10.1007/s10209-010-0188-6.
    47. Bhalla M.R., Bhalla A.V. Comparative Study of Various Touchscreen Technologies. Int. J. Comput. Appl. 2010;6:12–18. doi: 10.5120/1097-1433.
    48. Lee D. The State of the Touch-Screen Panel Market in 2011. Inf. Disp. 2011;27:12–16. doi: 10.1002/j.2637-496X.2011.tb00364.x.
    49. Qin H., Cai Y., Dong J., Lee Y.-S. Direct Printing of Capacitive Touch Sensors on Flexible Substrates by Additive E-Jet Printing With Silver Nanoinks. J. Manuf. Sci. Eng. 2017;139:31011. doi: 10.1115/1.4034663.
    50. Intuary Inc. Verbally. [(accessed on 9 April 2019)]; Available online:
    51. AssistiveWare Proloquo2Go. [(accessed on 9 April 2019)]; Available online:
    52. Therapy Box Predictable. [(accessed on 9 April 2019)]; Available online:
    53. Massaroni C., Venanzi C., Silvatti A., Lo Presti D., Saccomandi P., Formica D., Giurazza F., Caponero M., Schena E. Smart textile for respiratory monitoring and thoraco-abdominal motion pattern evaluation. J. Biophotonics. 2018;11:e201700263. doi: 10.1002/jbio.201700263.
    54. Itasaka Y., Miyazaki S., Tanaka T., Shibata Y., Ishikawa K. Detection of Respiratory Events during Polysomnography—Nasal-Oral Pressure Sensor Versus Thermocouple Airflow Sensor. Pract. Oto-Rhino-Laryngol. 2010;129:60–63. doi: 10.5631/jibirinsuppl.129.60.
    55. Zhang X., Ding Q. Respiratory rate monitoring from the photoplethysmogram via sparse signal reconstruction. Physiol. Meas. 2016;37:1105–1119. doi: 10.1088/0967-3334/37/7/1105.
    56. Yahya O., Faezipour M. Automatic detection and classification of acoustic breathing cycles; Proceedings of the 2014 Zone 1 Conference of the American Society for Engineering Education; Bridgeport, CT, USA. 3–5 April 2014.
    57. Elsahar Y., Bouazza-Marouf K., Kerr D., Gaur A., Kaushik V., Hu S. Breathing pattern interpretation as an alternative and effective voice communication solution. Biosensors. 2018;8:48. doi: 10.3390/bios8020048.
    58. Shorrock T., MacKay D., Ball C. Deterministic and Statistical Methods in Machine Learning. Springer; Heidelberg/Berlin, Germany: 2005. Efficient Communication by Breathing; pp. 88–97.
    59. Plotkin A., Sela L., Weissbrod A., Kahana R., Haviv L., Yeshurun Y., Soroker N., Sobel N. Sniffing enables communication and environmental control for the severely disabled. Proc. Natl. Acad. Sci. USA. 2010;107:14413–14418. doi: 10.1073/pnas.1006746107.
    60. Fager S., Bardach L., Russell S., Higginbotham J. Access to augmentative and alternative communication: New technologies and clinical decision-making. J. Pediatr. Rehabil. Med. 2012;5:53–61.
    61. Garcia R.G., Ibarra J.B.G., Paglinawan C.C., Paglinawan A.C., Valiente L., Sejera M.M., Bernal M.V., Cortinas W.J., Dave J.M., Villegas M.C. Wearable augmentative and alternative communication device for paralysis victims using Brute Force Algorithm for pattern recognition; Proceedings of the 2017 IEEE 9th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM); Manila, Philippines. 1–3 December 2017; pp. 1–6.
    62. Voiceitt. [(accessed on 9 April 2019)]; Available online:
    63. Chaudhary U., Birbaumer N., Curado M.R. Brain-Machine Interface (BMI) in paralysis. Ann. Phys. Rehabil. Med. 2015;58:9–13. doi: 10.1016/j.rehab.2014.11.002.
    64. Birbaumer N., Murguialday A.R., Cohen L. Brain-computer interface in paralysis. Curr. Opin. Neurol. 2008;21:634–638. doi: 10.1097/WCO.0b013e328315ee2d.
    65. Yeo M., Jiang L., Tham E., Xiong W. Evaluation of a low-cost alternative communication device with brain control; Proceedings of the 2015 10th IEEE Conference on Industrial Electronics and Applications, ICIEA 2015; Auckland, New Zealand. 15–17 June 2015; pp. 229–232.
    66. Kaiser V., Bauernfeind G., Kreilinger A., Kaufmann T., Kübler A., Neuper C., Müller-Putz G.R. Cortical effects of user training in a motor imagery based brain-computer interface measured by fNIRS and EEG. Neuroimage. 2014;85:432–444. doi: 10.1016/j.neuroimage.2013.04.097.
    67. Hippe Z.S., Kulikowski J.L., Mroczek T., Wtorek J. A Robust Asynchronous SSVEP Brain-Computer Interface Based On Cluster Analysis of Canonical Correlation Coefficients. Adv. Intell. Syst. Comput. 2014;300:3–14.
    68. Chen X., Wang Y., Nakanishi M., Gao X., Jung T.-P., Gao S. High-speed spelling with a noninvasive brain–computer interface. Proc. Natl. Acad. Sci. USA. 2015;112:E6058–E6067. doi: 10.1073/pnas.1508080112.
    69. Tan P., Tan G., Cai Z. Dual-tree complex wavelet transform-based feature extraction for brain computer interface; Proceedings of the 12th International Conference on Fuzzy Systems and Knowledge Discovery, FSKD 2015; Zhangjiajie, China. 15–17 August 2015; pp. 1136–1140.
    70. Thomas J., Maszczyk T., Sinha N., Kluge T., Dauwels J. Deep learning-based classification for brain-computer interfaces; Proceedings of the 2017 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2017; San Diego, CA, USA. 5–8 October 2017; pp. 234–239.
    71. Gupta A., Parameswaran S., Lee C.H. Classification of electroencephalography (EEG) signals for different mental activities using Kullback Leibler (KL) divergence; Proceedings of the ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing; Taipei, Taiwan. 19–24 April 2009; pp. 1697–1700.
    72. Lotte F., Congedo M., Lécuyer A., Lamarche F., Arnaldi B. A review of classification algorithms for EEG-based brain-computer interfaces. J. Neural Eng. 2007;4:R1–R13. doi: 10.1088/1741-2560/4/2/R01.
    73. Zhang Y., Ji X., Zhang Y. Classification of EEG signals based on AR model and approximate entropy; Proceedings of the 2015 International Joint Conference on Neural Networks (IJCNN); Killarney, Ireland. 12–17 July 2015.
    74. Guger C., Schlögl A., Neuper C., Walterspacher D., Strain T., Pfurtscheller G. Rapid prototyping of an EEG-based brain-computer interface (BCI). IEEE Trans. Neural Syst. Rehabil. Eng. 2001;9:49–58. doi: 10.1109/7333.918276.
    75. Ortiz-Rosario A., Adeli H. Brain-computer interface technologies: From signal to action. Rev. Neurosci. 2013;24:537–552. doi: 10.1515/revneuro-2013-0032.
    76. Choi B., Jo S. A Low-Cost EEG System-Based Hybrid Brain-Computer Interface for Humanoid Robot Navigation and Recognition. PLoS ONE. 2013;8:e74583. doi: 10.1371/journal.pone.0074583.
    77. Nijboer F., Plass-Oude Bos D., Blokland Y., van Wijk R., Farquhar J. Design requirements and potential target users for brain-computer interfaces–recommendations from rehabilitation professionals. Brain-Comput. Interfaces. 2014;1:50–61. doi: 10.1080/2326263X.2013.877210.
    78. McFarland D.J., Wolpaw J.R. Brain–computer interface use is a skill that user and system acquire together. PLoS Biol. 2018;16:10–13. doi: 10.1371/journal.pbio.2006719.
    79. Perdikis S., Tonin L., Saeedi S., Schneider C., Millán J. del R. The Cybathlon BCI race: Successful longitudinal mutual learning with two tetraplegic users. PLoS Biol. 2018;16:1–28. doi: 10.1371/journal.pbio.2003787.
    80. Nuyujukian P., Albites Sanabria J., Saab J., Pandarinath C., Jarosiewicz B., Blabe C.H., Franco B., Mernoff S.T., Eskandar E.N., Simeral J.D., et al. Cortical control of a tablet computer by people with paralysis. PLoS ONE. 2018;13:e0204566. doi: 10.1371/journal.pone.0204566.
    81. Yu T., Li Y., Long J., Gu Z. Surfing the Internet with a BCI mouse. J. Neural Eng. 2012;9:036012. doi: 10.1088/1741-2560/9/3/036012.
    82. Karim A.A., Hinterberger T., Richter J., Mellinger J., Neumann N., Flor H., Kübler A., Birbaumer N. Neural Internet: Web surfing with brain potentials for the completely paralyzed. Neurorehabil. Neural Repair. 2006;20:508–515. doi: 10.1177/1545968306290661.
    83. Pennington C., McCoy K.F., Trnka K., McCaw J., Yarrington D. The effects of word prediction on communication rate for AAC; Proceedings of NAACL HLT 2007; Rochester, NY, USA. 26 April 2007; pp. 173–176.
    84. Alomari M.H., Abubaker A., Turani A., Baniyounes A.M., Manasreh A. EEG Mouse: A Machine Learning-Based Brain Computer Interface. Int. J. Adv. Comput. Sci. Appl. 2014;5:193–198.
    85. Higginbotham D.J., Lesher G.W., Moulton B.J., Roark B. The application of natural language processing to augmentative and alternative communication. Assist. Technol. 2012;24:14–24. doi: 10.1080/10400435.2011.648714.
    86. Trnka K., Yarrington D., McCoy K., Pennington C. Topic modeling in fringe word prediction for AAC; Proceedings of the 11th International Conference on Intelligent User Interfaces; Sydney, Australia. 29 January–1 February 2006; pp. 276–282.
    87. Müller K.R., Krauledat M., Dornhege G., Curio G., Blankertz B. Human Interface and the Management of Information. Methods, Techniques and Tools in Information Design. Volume 4557. Springer; Berlin/Heidelberg, Germany: 2007. Machine Learning and Applications for Brain-Computer Interfacing; p. 132.
    88. Shenoy P., Krauledat M., Blankertz B., Rao R.P.N., Müller K.R. Towards adaptive classification for BCI. J. Neural Eng. 2006;3:R13–R23. doi: 10.1088/1741-2560/3/1/R02.
    89. McFarland D.J., Wolpaw J.R. Brain-Computer Interfaces for Communication and Control. Commun. ACM. 2011;54:60–66. doi: 10.1145/1941487.1941506.
    90. Mainsah B.O., Collins L.M., Colwell K.A., Sellers E.W., Ryan D.B., Caves K., Throckmorton C.S. Increasing BCI communication rates with dynamic stopping towards more practical use: An ALS study. J. Neural Eng. 2015;12:16013. doi: 10.1088/1741-2560/12/1/016013.
    91. Hussein A., Adda M., Atieh M., Fahs W. Smart home design for disabled people based on neural networks. Procedia Comput. Sci. 2014;37:117–126. doi: 10.1016/j.procs.2014.08.020.
    92. Alamsaputra D.M., Kohnert K.J., Munson B., Reichle J. Synthesized speech intelligibility among native speakers and non-native speakers of English. Augment. Altern. Commun. 2006;22:258–268. doi: 10.1080/00498250600718555.
    93. Beukelman D.R., Mirenda P. Augmentative and Alternative Communication: Supporting Children and Adults with Complex Communication Needs. 4th ed. Paul H. Brookes Pub.; Baltimore, MD, USA: 2013.
    94. Zhang X., Kulkarni H., Morris M.R. Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities; Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems; Denver, CO, USA. 6–11 May 2017; pp. 2878–2889.
    95. Villanueva A., Cabeza R., Porta S. Eye tracking system model with easy calibration; Proceedings of the 2004 Symposium on Eye Tracking Research & Applications; San Antonio, TX, USA. 2004; p. 55.
    96. Sellers E.W., Vaughan T.M., Wolpaw J.R. A brain-computer interface for long-term independent home use. Amyotroph. Lateral Scler. 2010;11:449–455. doi: 10.3109/17482961003777470.
    97. Brumberg J.S., Pitt K.M., Mantie-Kozlowski A., Burnison J.D. Brain–computer interfaces for augmentative and alternative communication: A tutorial. Am. J. Speech-Lang. Pathol. 2018;27:1–12. doi: 10.1044/2017_AJSLP-16-0244.
    98. Abdulkader S.N., Atia A., Mostafa M.-S.M. Brain computer interfacing: Applications and challenges. Egypt. Inform. J. 2015;16:213–230. doi: 10.1016/j.eij.2015.06.002.
    99. Kumar M. Reducing the Cost of Eye Tracking Systems. [(accessed on 1 January 2006)]; Citeseer. 2008. Available online:
    100. Light J., McNaughton D., Beukelman D., Fager S.K., Fried-Oken M., Jakobs T., Jakobs E. Challenges and opportunities in augmentative and alternative communication: Research and technology development to enhance communication and participation for individuals with complex communication needs. AAC Augment. Altern. Commun. 2019;35:1–12. doi: 10.1080/07434618.2018.1556732.
    101. Venker C.E., Kover S.T. An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders. J. Speech Lang. Hear. Res. 2015;58:1719–1732.
    102. Kok E.M., Jarodzka H. Before your very eyes: The value and limitations of eye tracking in medical education. Med. Educ. 2017;51:114–122. doi: 10.1111/medu.13066.
    103. Wang Y.T., Wang Y., Cheng C.K., Jung T.P. Developing stimulus presentation on mobile devices for a truly portable SSVEP-based BCI; Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS; Osaka, Japan. 3–7 July 2013; pp. 5271–5274.
    104. Waller A. Telling tales: Unlocking the potential of AAC technologies. Int. J. Lang. Commun. Disord. 2019:1–11. doi: 10.1111/1460-6984.12449.
    105. Tauroza S., Allison D. Speech rates in British English. Appl. Linguist. 1990;11:90–105. doi: 10.1093/applin/11.1.90.
    106. Wilkinson K.M., Mitchell T. Eye Tracking Research to Answer Questions about Augmentative and Alternative Communication Assessment and Intervention. Augment. Altern. Commun. 2015;30:106–119. doi: 10.3109/07434618.2014.904435.
    107. Costigan F.A., Newell K.M. An analysis of constraints on access to augmentative communication in cerebral palsy. Can. J. Occup. Ther. 2009;76:153–161. doi: 10.1177/000841740907600304.
    108. Kumar S., Aishwaraya B.K., Bhanutheja K.N., Chaitra M. Breath to speech communication with fall detection for Elder/Patient with take care analytics; Proceedings of the 2016 IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT); Bangalore, India. 20–21 May 2016; pp. 527–531.
    109. Moore M.M. Real-World Applications for Brain–Computer Interface Technology. IEEE Trans. Neural Syst. Rehabil. Eng. 2003;11:162–165. doi: 10.1109/TNSRE.2003.814433.
    110. Ruan S., Wobbrock J.O., Liou K., Ng A., Landay J. Speech is 3x faster than typing for English and Mandarin text entry on mobile devices. arXiv. 2016:1608.07323.
    111. Leo M., Furnari A., Medioni G.G., Trivedi M., Farinella G.M. Deep Learning for Assistive Computer Vision; Proceedings of the European Conference on Computer Vision (ECCV); Munich, Germany. 8–14 September 2018; p. 11134.
    112. Baxter S., Enderby P., Evans P., Judge S. Interventions using high-technology communication devices: A state of the art review. Folia Phoniatr. Logop. 2012;64:137–144. doi: 10.1159/000338250.

Source: PubMed
