Imagine being unable to speak, yet still wanting to communicate with your loved ones. This is the reality for many ICU patients, including a family member of Reddit user buck746. After a spinal cord injury, this individual has been in the ICU since July 7th and cannot talk due to a tracheotomy tube. Despite the challenges, they can still form words with normal lip movements, but they can't produce any sound.
This is where lip-reading technology comes in. buck746 is desperately seeking a solution that can run locally on a laptop, allowing their family member to communicate more than just yes/no answers. The ICU's Wi-Fi restrictions and cell-signal shielding make an offline solution the only viable option.
One potential solution is Chaplin, an open-source lip-reading model. However, getting it to work on a MacBook Pro with 64GB of RAM running macOS 15.1 has proven to be a challenge. buck746 is not familiar with Python, but is comfortable working from the command line in the terminal.
The question remains: are there alternative lip-reading models that could be up and running sooner rather than later? This technology has the potential to greatly improve the quality of life for ICU patients who are unable to speak. It's a reminder that technology can be a powerful tool for improving human connection and communication.
If you're interested in exploring this topic further, there are many resources available, including research papers and open-source projects focused on lip-reading technology.