The vOICe augmented reality app for the BLIND works GREAT on my 25.1-week-old Xiaomi Redmi 9T Android smartphone with my 22.3-week-old BOSE QuietComfort 35 Series II stereo headphones!
Hello World! Ever since I became totally BLIND in early 2018, I have been wishing for wearable smart-glasses, only to discover I will never be able to afford them for the remainder of my life. So, I decided to purchase a generic virtual reality headset and create my own smart-glasses with The vOICe! ☺️

For my ongoing computer vision (computer SCIENCE) project, I am using The vOICe Android app on my Xiaomi Redmi 9T smartphone with my Bose QuietComfort 35 Series II headphones and my NEW virtual reality goggles (with a sliding door for the camera).
For those who are curious, my new VR goggles came from Lazada, an online eCommerce site in the Philippines, and cost under 🇵🇭₱400 (🇺🇸$7.78). They're cheaply molded in plastic from (mainland) China. This is the alternative to purchasing the Vision 800 glasses from the Huawei online store, costing 🇵🇭₱22,000 (🇺🇸$428), approximately the same price as identical products imported via eBay.
My wife’s fully-sighted nephew, Josh, helped me set up my VR goggles with my Xiaomi smartphone, with some difficulty; the difficult part was inserting the smartphone into the tray that slides into the headset. Since my smartphone can NOT be manipulated while inside the goggles, I had to pre-launch The vOICe app, shake to mute, and have Josh put it inside the goggles for me. After initial testing, which was successful, I had Josh take a photo of me wearing the set-up for this blog post. Josh turns 22 years old 12 days after I turn 52 next month. He’s presently in the 4th year of an electrical engineering course at his college here in the Philippines. I taught him my computer skills before I became blind. ☺️
At approximately 8 AM this morning, I conducted my initial testing in my air-conditioned bedroom at my mini SCIENCE lab (or my “man cave”). I stood up from my chair while wearing my VR goggles and rotated my head to random positions while identifying changes in the soundscape output. I intentionally walked to the wall-divider where I sometimes bump my head, and noticed the edge-detection sound warning me of a potential collision. I then walked to the wall near my door, and the soundscapes informed me I was about to collide with the wall. When I turned my head toward my door, the door-knob was detected, and the soundscapes informed me of its location relative to my head’s position. I was able to maneuver back to my chair WITHOUT colliding with anything! ☺️ Throughout the initial test procedure, my Bose headphones’ noise-cancelling feature was set to maximum. My initial testing, which lasted approximately five minutes, was witnessed by Josh (who was standing) and my wife (lying on my bed), both of whom were detected by The vOICe with face detection toggled off. This was my first HANDS-FREE computer vision science project! ☺️
To note: previously, before The vOICe, I was unable to SAFELY navigate between my chair and my door with my Bose headphones’ noise-cancelling at maximum! I often required “open-ears” to avoid colliding with the wall-divider, side-wall, bed-frame, or my chair when it was not under my table.
Though my initial science experiment was short, I did LEARN a lot! ☺️ The most important thing I LEARNED is that I am able to safely navigate in a noisy environment! ☺️ Though I hear The vOICe soundscapes as variable tones, subliminally my brain is converting those tones into a navigational map for me to move or walk; normally, I required “open-ears” (no noise-cancelling) to do the same tasks.
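For readers curious about what those variable tones actually encode: as far as I understand The vOICe’s published design (by its inventor, Peter Meijer), the camera image is scanned left to right about once per second, and each image column becomes a chord of tones, with vertical position mapped to pitch (higher up = higher pitch) and pixel brightness mapped to loudness. Here is a minimal toy sketch of that principle in Python; the function name `sonify` and all of its parameters are my own illustration, NOT the app’s actual code:

```python
import numpy as np

def sonify(image, duration=1.0, sample_rate=22050, f_lo=500.0, f_hi=5000.0):
    """Scan a grayscale image left to right, one column at a time.
    Each column becomes a chord: row position sets pitch (top = high),
    and pixel brightness sets loudness."""
    rows, cols = image.shape
    freqs = np.linspace(f_hi, f_lo, rows)           # top row = highest pitch
    samples_per_col = int(duration * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    chunks = []
    for c in range(cols):
        # One sinusoid per row, weighted by that pixel's brightness.
        tones = np.sin(2 * np.pi * freqs[:, None] * t[None, :])
        chunks.append(image[:, c] @ tones)
    return np.concatenate(chunks)

# A single bright pixel in the top-left corner: a short, high-pitched
# tone at the very start of the soundscape, followed by silence.
img = np.zeros((8, 8))
img[0, 0] = 1.0
signal = sonify(img)
```

Playing `signal` through a sound device would give a brief high beep followed by silence, matching the bright dot’s top-left position. The real app is far more sophisticated, of course, but this is the basic mapping my brain is apparently learning to decode. ☺️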
Yes, my VR goggle set-up is slightly heavy on my head, and the goggles are not comfortable to wear, but my original & current intention is to conduct hands-free computer vision SCIENCE experiments in lieu of unaffordable wearable smart-glasses. It’s basically one of my “proof of concept” SCIENCE experiments! ☺️
As of this writing, I’m considering using my old Android smartphone, the Samsung Galaxy Core Duo running Android 4.4, for a future The vOICe SCIENCE experiment, since the app can be configured to run automatically when the phone boots up. The Core Duo was my Android smartphone until six months before I became blind, when I upgraded to an Android 7 phone. I may purchase more expensive VR goggles, costing between 🇵🇭₱2,000 (🇺🇸$39) and 🇵🇭₱3,000 (🇺🇸$59), for my own wearable smart-glasses invention. ☺️
And no, I have no plans to wear VR goggles outside my home, as I know it looks ugly in public! ☺️
Update (April 8, 2022): I conducted a 90-minute test that included my Logitech K480 Bluetooth QWERTY keyboard for remote control of The vOICe app. I tested the digital compass function and was surprised at the useful information it provides. I toggled text identification and was surprised at how fast it read my LCD digital thermometer/hygrometer module, the Google Chromecast screen on my HDTV, the words on my mug, the word “Amazon” on my Echo smart-speaker, and the temperature setting of my air conditioner, all by just pointing my head. The object identifier was not accurate, as it reported objects I don’t have, so I toggled it off. Keyboard control helped me locate the speed controls, which I could not find using finger gestures. My wife remarked on me shaking my head to activate the mute & unmute function. The volume buttons on my Bose headphones do control the volume level of soundscapes & speech at the same time; however, their pause button does not toggle the mute function. Overall, the “hands-free” experience was very amazing! ☺️
As a side-note, I removed the lenses from the VR goggles so I can physically touch the smartphone’s touchscreen without having to remove it from the structure. Towards the end of my SCIENCE experiment, using my hardware keyboard, I was able to exit The vOICe app and launch the Envision AI app. Using Envision, Instant Text reading of my LCD digital thermometer/hygrometer module was slightly less accurate, while the Describe Scene function was somewhat accurate yet very slow, as it requires access to a remote server in the Netherlands; this was my “test simulation” of the Envision Glass, which I will never be able to afford.
On a technical note, I spent approximately 45 minutes manipulating my Xiaomi Redmi 9T Android smartphone & my Apple iPhone 12 mini onto the tray that slides into the VR goggles. I slightly damaged the screen-protector of the Xiaomi smartphone, but this does not affect Google TalkBack finger gestures. Using some physical force, I got my Xiaomi smartphone inside my VR goggles, and it did not fall out during the shake tests at my table; this test ensures the smartphone will not drop to the floor when conducting science experiments. All phone manipulations were done with the devices turned off to prevent undesired operations.
Thanks for reading my latest blog post! Have a Great Day!
🇵🇭🇺🇸👨🦯🦽 📱⌨️📻🎧 📚🪀🧮