I’m a bit of a paradox.
I love technology for its advances, but I’m far from a proponent of a life full of automation. I realize there’s no way to 100% avoid living life with computers controlling things, but I like some old-school technology… like CDs (or, more appropriately, those things that are used as coasters these days, thanks to streaming services).
But when it comes to navigating my environment or receiving visual information, I would be at a huge disadvantage without technology. I’m grateful that I have lots of choices for receiving visual information. If I need real-time help with information that’s hard to read or interpret, I use services like Aira (which I’ve written about a couple of times before) and Be my Eyes (a free service that I will write about soon). But technology has improved so much that there are now applications that can do some of these things more efficiently, or more confidentially, by using OCR to read text from a variety of sources.
I’ve used visual interpreting services like Aira and Be my Eyes to travel, navigate, and get information about a variety of items, but I wanted to try a few other options to see what I could find out about my environment. Last week, I tried using Google’s Lookout app to read the new bus stop signs in my neighborhood. One of Lookout’s features is the ability to scan your surroundings and read any text it comes across. Because of the height of the bus stop signs, I had a hard time angling myself and my phone in such a way to read a sign, if I could find the stop at all. I did approach one bus stop at such a flawless angle that Lookout read the entire sign, though I haven’t been able to repeat that success since. I was, however, able to read the Apartment for Rent sign across the street.
Even before my bus stop experiment, I’d used Lookout for a variety of tasks, like reading labels on cans, bottles, or jars. In addition to the text-finding feature, it has a Documents mode, which gives you very clear directions on where to situate your phone in relation to the document, telling you whether to move your phone closer or farther away, or to the right or left, to capture all of the text. Because of those directions, I’ve become much more confident in how to angle my phone to capture text, which will save time in instances where I do need visual interpreting services like Aira or Be my Eyes.
I was using this newly earned knowledge and confidence yesterday when a new bank card arrived in the mail. I flattened the paper that surrounded the card, positioned my phone almost perfectly, and heard the “Hold still” and the click that signify a photo of the text had been taken. I read the text on the paper about how to activate my card, and then I heard an amazing sound… a series of digits. I counted them: 16. I scrolled down and heard the expiry date. Because OCR isn’t perfect, I snapped another photo to double-check the digits. The same 16 digits, the same expiry date. I called the bank and activated my card, and I felt something balloon inside me.
For the first time in my entire life, I was able to completely independently verify my banking information. I didn’t need a bank employee to tell me my card number. I didn’t have to ask a family member or friend for the information. I didn’t even need a visual interpreting service. I just needed a phone and a camera and the ability to take a solid snapshot of the card itself… and off I went. It has literally never been that easy.
Today, I’m thankful for Google Lookout, for the gift of autonomy.
Now, please excuse me while I find the one CD player in the house to play my favourite coaster.