The Future of Tech is In and Near Our Hands

It seems like every time I take a long car trip with my girlfriend, we start to discuss the latest in technical innovations and what might be the technology of the future. We each have our own sources of information about what is up and coming in the tech world. She listens to the Science Friday program on NPR, and I read whatever interesting articles happen to appear in my Twitter timeline from the various tech sites/people I follow. I obviously have an upper hand when it comes to the latest news, while she gets a more curated form of information. With our combined knowledge, I think we have a good idea of what is to come for the technology we use day to day.

First, a little primer on what other people are talking about before we take a peek into my imagination. Many of us have seen videos like Microsoft's Vision of the Future. This video has such a spectacular display of computer graphics and simulated innovation that it's hard not to believe that it could one day be real-- especially if big companies like Microsoft are dreaming of it, too. I was in that same boat of believing that one day we could video chat halfway across the world with our transparent smartphones powered by a holographic display-- that is, until I read a fascinating article by Bret Victor called A Brief Rant on the Future of Interaction Design. Bret presents an interesting point about Microsoft's vision: he argues that the video shows a stagnant view of the way we already use things, and that it continues to limit the way we use our technology. Why should we continue to learn new and unnatural gestures such as swiping, tapping, or pinching with just one or two of our fingers, when there is so much more that we can do with the rest of our hands-- things we have been doing for hundreds of years? Even in the rare cases when we do use our whole hand, like switching applications or screens on an iPad or Mac, we still accomplish just one task.

There is also a future in the innovation of screens. I don't mean increasing the resolution until there is an IMAX movie theatre sitting in my pocket. I'm talking about the machines that run the plethora of screens around us all the time. Right now, my two favorite screens are the one on my wrist and the one in my pocket. I backed the Pebble Smartwatch since the beginning of its Kickstarter campaign and have been wearing it nearly non-stop since February of this year (and I have the tan-lines to prove it). But when all is said and done, the Pebble is a convenience-- it saves me from pulling my phone out of my pocket every time I get a notification. My iPhone has become more of a necessity as I dive deeper and deeper into an always-online life-- it is the one thing I insist on having with me at all times, and it powers some of the most important decisions I make every day. Now enough with the primer, let's paint a pretty picture of the future.


Picture this:

Wake up in the morning feeling like P-Diddy….Kidding, let's not.

I wake up to the sound of my phone's morning alarm on my nightstand dock and the vibration of my smartcuff. The tap of my hand on the stand snoozes the alarm because the smartcuff around my forearm is within Bluetooth range. I roll out of bed and flick the air. My bedroom lights fade on, without blinding me. I swing my arms like I'm conducting an invisible orchestra and my morning playlist blasts from the bedroom speakers. My music follows me as I shuffle into the bathroom, and then I grip the air in front of me and turn my wrist clockwise. Water starts to pour out of the shower head, and I adjust the heat based on the temperature readings on my cuff's display.

Alright stop- explanation time.

There are technologies coming to fruition that allow us to re-imagine how we interact with the technology around us, like the Leap Motion controller. Leap is essentially a tiny metal box of sensors that can track your hands and fingers in the air, so developers can program their desktop/web apps to use gestures like grabbing, thumbs up, or writing in the air for increased functionality. Elon Musk of Tesla Motors and SpaceX has a great video demonstrating how this can be used. Leap is still primarily for developers but, as you can see from that video, there is positive work being done to integrate this device with commercial software. However, this display interaction is not engaging enough because there still isn't any tactile feedback from the device.
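To make the idea concrete, here is a minimal sketch of how an app might consume gesture events from a sensor like the Leap. Everything here is invented for illustration-- the `GestureEvent` class, the gesture names, and the confidence threshold are not the real Leap Motion SDK, just a plausible shape for this kind of API.

```python
# Hypothetical sketch: dispatching air gestures to app actions.
# GestureEvent and the gesture names are invented for illustration;
# this is not the real Leap Motion SDK.

from dataclasses import dataclass

@dataclass
class GestureEvent:
    name: str          # e.g. "grab", "thumbs_up", "swipe_left"
    confidence: float  # 0.0 - 1.0, how sure the sensor is

def handle(event: GestureEvent, bindings: dict) -> str:
    """Run the action bound to a gesture, ignoring low-confidence reads."""
    if event.confidence < 0.8:
        return "ignored"            # avoid accidental triggers
    action = bindings.get(event.name)
    return action() if action else "unbound"

# Example bindings an app might register
bindings = {
    "grab": lambda: "window moved",
    "thumbs_up": lambda: "liked",
}
```

The confidence check matters: with no tactile feedback, the sensor has to be conservative about what counts as a deliberate gesture, or every stray hand movement becomes a command.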

Thalmic Labs has been working on an amazing gadget to fix the tactile problem called Myo, and the video on the landing page perfectly displays the incredible potential behind it. Myo is a forearm cuff packed with sensors that track the minute movements of your arm muscles and, by literal extension, your hand muscles. This allows for more finely-tuned gestures that motion sensors and cameras just can't compete with. The Myo could allow for haptic (tactile) feedback through the use of a vibration pack or micro-electric signals that simulate touch.

Now let's take a look back at the smartcuff that seemed to make magic happen during my morning routine. This cuff could be the future combination of technologies used by Myo, Pebble, Nike and Disney today. As an extension of your phone, the cuff would be connected to it via the latest evolution of Bluetooth, or similar tech, and constantly communicating your movements back to your phone for analysis and review later. Sensors similar to the ones found in the Leap Motion controller could be placed all over a house or building to translate your gestures into commands for the devices around you. This is particularly promising for the germaphobes who will never have to touch a questionable surface ever again.
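The morning routine above boils down to a home hub mapping (room, gesture) pairs to device commands. A sketch of that lookup, with all room names, gestures, and command strings invented to match the scenario:

```python
# Hypothetical sketch: a home hub translating gestures into device
# commands based on which room's sensors saw them. All names invented
# to mirror the morning-routine scenario above.

COMMANDS = {
    ("bedroom", "flick"): "lights:fade_on",
    ("bedroom", "conduct"): "speakers:play_morning_playlist",
    ("bathroom", "twist_clockwise"): "shower:start",
}

def dispatch(room: str, gesture: str) -> str:
    """Return the command for a gesture seen in a given room, or noop."""
    return COMMANDS.get((room, gesture), "noop")
```

Keying on the room is what lets the same simple gesture do different things in different places, so the vocabulary of motions you have to learn stays small.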

Let's continue with my day:

On my way into the garage, I quickly brush my cuff against the fridge handle to receive an immediate inventory of its contents and what I may need to pick up at the grocery store later. I plug my phone into the dock compartment between my e-bike's handlebars and start pedaling to work. Once I get to the office and hang up my bike, I dock my phone to my desk and take a seat. Both of my monitors immediately light up with my desktop home screen-- my Bluetooth keyboard and trackpad glow briefly to signal they're active, and I get to work. About an hour later, my phone rings, and I see a caller notification appear on one of my monitors. I mime a phone with my cuffed hand, bring it up to my face, and begin speaking into my pinky to answer the call. Once the call has ended, I hang up my invisible phone and return to work. When it's time to go home, I pick my phone up off the dock, and the monitors go dark.

So a few things happened in that last part. First, let's discuss the phone situation:

Over the years, the hub of the Apple ecosystem has moved from the desktop computer to the mobile device, thanks to iCloud. When iCloud came along, it became the wireless hub that keeps all your devices in sync, storing your settings, notes, photos, etc. As cloud storage continues to increase (you can get over a terabyte with some services!) and the age of the Superphone begins (a 64-bit smartphone is a big deal!), what's to stop consumers from using just one device across multiple empty screens? While this idea is nothing new, the OS and implementation are not quite ready yet. Imagine owning a quad-core 64-bit phone with a couple gigs of RAM that you can dock to multiple screens via a universal connector and run truly responsive applications which give additional functionality for larger screens. This device would be constantly connected to the cloud to back up and store all the information you require, and always online to communicate with the world of Internet-connected devices around it. With this technology, tablets and laptops would become portable docking stations containing enormous batteries that charge your phone while in use. Imagine never again worrying about your files not syncing between devices, or mobile apps not having desktop counterparts.
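The "truly responsive applications" piece is the same idea web designers already use with breakpoints, just applied at dock time. A sketch of how a docked app might pick its layout from the attached display's width-- the breakpoint values are invented for illustration:

```python
# Hypothetical sketch: one app, many screens. When the phone docks,
# the app picks a layout tier from the display width. Breakpoints
# are invented for illustration.

def layout_for(width_px: int) -> str:
    """Choose a layout tier for the screen the phone is docked to."""
    if width_px < 800:
        return "phone"    # single column, large touch targets
    if width_px < 1400:
        return "tablet"   # split view, richer navigation
    return "desktop"      # full multi-pane UI with extra functionality
```

Undocking is just a transition back to the "phone" tier, which is why the monitors in the story can simply go dark when the device leaves the desk.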

Now about that phone call. When your phone is connected to another screen, what do you do when you actually have to use it as a phone? That's where the cuff comes in as the natural extension of the phone's capabilities. The theory behind using the hand-phone is the same one employed by Bluetooth headsets today. The vibrations from your jawbone travel through your hand and up your arm to be picked up by the tiny microphone/speaker on the cuff, which then relays the audio to your phone. When the other person speaks, the process is reversed, and you have seamless communication.

Finally, the first interaction with the fridge. RFID is a small but powerful technology employed by many companies to relay information between devices, much like NFC except that it is strictly one-way communication and works at greater distances. The theory behind the fridge's inventory system is an RFID reader in the door of the appliance that reads tags embedded in the food packaging to take stock of your latest grocery trip. It would be able to register expiration dates and nutritional information, as well as take the initial weight from the scales integrated in all of the refrigerator's shelves and compartments, to let you know if you're running low on milk, or if that milk has expired. Simply tapping the RFID area on the handle of the fridge with a capable reader like the smartcuff allows instant transfer of information. (This is what we need in our refrigerators, Samsung/LG/GE-- not the ability to tweet from the door.) Notifications from our kitchen and washroom appliances could aid in saving water, power, and pots and pans forgotten on an active stove.
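Putting the pieces together-- RFID tags for identity and expiration, shelf scales for quantity-- the fridge's report is a simple pass over its inventory. A sketch under those assumptions, with the item fields and the low-stock threshold invented for illustration:

```python
# Hypothetical sketch of the fridge inventory idea: RFID tags supply an
# item's name and expiration date; shelf scales supply current vs. full
# weight. The data layout and 25% low-stock threshold are invented.

from datetime import date

def fridge_report(items, today, low_fraction=0.25):
    """Split inventory into items to buy (running low) and to toss (expired).

    items: list of dicts with keys name, expires (date), weight, full_weight.
    """
    buy, toss = [], []
    for item in items:
        if item["expires"] < today:
            toss.append(item["name"])
        elif item["weight"] / item["full_weight"] < low_fraction:
            buy.append(item["name"])
    return {"buy": buy, "toss": toss}
```

Tapping the cuff to the handle would just trigger this report and push the result to your phone, which is why the whole interaction takes a second on the way to the garage.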

On to the final part of a day in the life of future me.

The work day is done and now it is time to enjoy a night out with some friends. We decide to meet at a local bar. On the way, I pass an ad-board offering 20% off if I like/follow/add their [insert new social media application here] page. Interested in saving a bit of money, I give the board a thumbs up with my cuffed hand and receive an instant notification of the coupon being sent to my email. At the bar, we all order our meals, followed by a couple rounds of drinks. When the time comes to settle our bills, the waitress brings over the e-bill pads, and we all tap our cuffs to the pad's reader and press our index fingers to the adjacent scanner to confirm the payment. After returning home from a pleasant evening out, I prepare for bed and wonder how we ever could have lived without such convenient technological advances.

OK, final overview. First things first--the interactive ad board delivering that helpful coupon:

Engaging and responsive ads are found all over the web, but they don't have to stop there. Seeing an ad you like doesn't have to mean taking a picture of the URL or an ugly QR code to remember it-- instead, giving it a natural reaction like a thumbs up will delight users and advertisers alike. The instant feedback will also be enjoyable for the loads of impatient consumers in the world who can't stand to wait for a delayed email confirmation.

The payment process would be a much quicker, more secure, and more environmentally friendly solution than our current situation. Instead of handing over cash or a piece of plastic with personal account numbers out in the open and waiting for the staff member to return with it, we can shorten and secure the process by using the protective assets of our smart cuffs/phones and unique fingerprints directly. Fingerprint technology is evolving rapidly to become faster and more secure than ever before-- just take a look at the newest iPhone. This two-step form of verification would be much harder to replicate and would save a lot of paper by reducing the use of wasteful receipts. Of course there will be speculation about reliability-- but there always is when such a radical paradigm shift is introduced.
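The two-step check at the e-bill pad amounts to: does the cuff present the token registered to this account, and does the finger on the scanner match the enrolled print? A minimal sketch of that logic, assuming the biometric match can be reduced to comparing hashes (real fingerprint matching is fuzzier than this-- the hashes stand in for a proper template comparison):

```python
# Hypothetical sketch: two-factor payment approval. Both the cuff's
# device token and the fingerprint must match before the payment goes
# through. hmac.compare_digest avoids timing leaks; SHA-256 hashes
# stand in for real biometric template matching.

import hashlib
import hmac

def approve_payment(presented_token: str, registered_token: str,
                    finger_scan: bytes, enrolled_print_hash: str) -> bool:
    """Approve only if the cuff token AND the fingerprint both check out."""
    token_ok = hmac.compare_digest(presented_token, registered_token)
    scan_hash = hashlib.sha256(finger_scan).hexdigest()
    print_ok = hmac.compare_digest(scan_hash, enrolled_print_hash)
    return token_ok and print_ok  # both factors required
```

Requiring both factors is the point: a stolen cuff without the owner's finger, or a lifted print without the cuff, fails the check.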

I hope you enjoyed this little adventure into the (hopefully) not-so-distant future of consumer technology. I don't claim originality for any of these ideas; they are merely a collection of today's technology used in more advanced and convenient ways. Let me know what you think by reaching out to me on Twitter, shooting me an email from my contact page, or commenting below. I look forward to hearing your thoughts on the future as well.