A Day In The Life: Going Somewhere

Dan Furman
4 min read · Apr 26, 2021

How Brain Decoding Technology May Transform Our Everyday World.

“Found my coat and grabbed my hat, made the bus in seconds flat…”

Continuing my previous post (link), Jill has finished breakfast, puts on her smart glasses, and walks to her car. Her vehicle unlocks for her effortlessly. She has a 20-minute ride to the office for a meeting. The ride is super comfy, and nothing distracts her from the few things she planned to do on the commute. First, she pays the bills that arrived during the night, and then for the rest of the ride her AI voice assistant reads her a dynamic meeting brief. She notices that since the new system was installed she has become much better at remembering things, not only because of the better sleep but for some other reason… her brain just feels sharper than ever these days.

How it works

Lightweight, comfortable, polarized smart glasses, among other AR/VR devices entering the market in the next several years, will most likely carry BCI sensors onboard by default. Brain ID might be the most natural way for users to effortlessly gain access to things like cars and office buildings, in addition to all forms of digital access points.

Biometric authentication based on brain signals will probably become standard for new generations of head-mounted devices: you don't want to show your face to your glasses or touch them for fingerprints each time you want to do something, right? Besides, those other biometrics offer weaker security in this setting. Extremely low energy consumption, plus all the other functionality embedded alongside the sensors, makes Brain ID even more attractive.

Combining computer vision systems that recognize objects, Brain ID that authenticates users, and BCI software that reads simple commands will allow device manufacturers and developers to implement passwordless access systems that are both convenient and secure. Similar to what happened with sleep in Jill's bedroom environment, the smart controllers in her vehicle coordinate climate, route, acceleration, sound, and other ride parameters to deliver the most comfortable ride possible.
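
To make that passwordless access flow concrete, here is a minimal sketch of how the three signals might be combined. It is purely illustrative: `ObjectRecognizer`, `BrainID`, `CommandDecoder`, and their methods are hypothetical stand-ins, not any real vendor API, and the threshold values are made up.

```python
# Hypothetical sketch: unlock an access point only when three independent
# signals agree. None of these classes correspond to a real product API.
from dataclasses import dataclass

@dataclass
class AccessPoint:
    name: str               # e.g. "jills-car" or "office-front-door"
    min_match_score: float  # minimum Brain ID confidence to accept

class ObjectRecognizer:     # stand-in for the glasses' vision system
    def recognize_target(self) -> str:
        return "jills-car"  # what the wearer is currently looking at

class BrainID:              # stand-in for the brain-signal matcher
    def match_score(self) -> float:
        return 0.97         # similarity to the enrolled brain signature

class CommandDecoder:       # stand-in for the BCI command reader
    def read_command(self) -> str:
        return "unlock"     # a simple, deliberate decoded intent

def try_unlock(vision, brain_id, decoder, point: AccessPoint) -> bool:
    """Grant access only when target, identity, and intent all check out."""
    if vision.recognize_target() != point.name:
        return False  # not looking at this access point
    if brain_id.match_score() < point.min_match_score:
        return False  # wearer is not the enrolled user
    if decoder.read_command() != "unlock":
        return False  # merely glancing at the car never unlocks it
    return True

car = AccessPoint(name="jills-car", min_match_score=0.95)
print(try_unlock(ObjectRecognizer(), BrainID(), CommandDecoder(), car))  # True
```

The point of requiring all three checks is that no single signal is trusted on its own: object recognition triggers the flow, the brain signature establishes who is asking, and the decoded command establishes that they actually meant to ask.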

The same functionality used for car unlock and navigation personalization works for Jill's morning incoming bill information, with a pre-scheduled prompt acting as the trigger instead of the computer vision system. For all her invoice payments and checkouts, Jill prefers to use her Brain ID.
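
In terms of the sketch above, that just means swapping the trigger: a scheduled prompt starts the flow, and Brain ID plus a decoded "confirm" intent authorize each payment. Again, every name here is a hypothetical placeholder.

```python
# Same three-signal pattern, with a scheduled prompt replacing computer
# vision as the trigger. Stubs are redefined so this sketch runs on its own.
class BrainID:
    def match_score(self) -> float:
        return 0.97        # similarity to the enrolled brain signature

class CommandDecoder:
    def read_command(self) -> str:
        return "confirm"   # decoded, deliberate payment intent

def pay_bills_on_prompt(brain_id, decoder, bills, min_score=0.95):
    paid = []
    for bill in bills:
        print(f"Assistant: pay {bill}?")  # the morning prompt is the trigger
        if (brain_id.match_score() >= min_score
                and decoder.read_command() == "confirm"):
            paid.append(bill)             # authorized without a password
    return paid

print(pay_bills_on_prompt(BrainID(), CommandDecoder(),
                          ["electricity", "internet"]))
```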

Jill might indeed be remembering and understanding things better: her BCI has now worked with her for months, monitoring signals such as memorability and confusion and repeating material whenever she was distracted. She not only learned more in the moment but also built a stronger base for information storage and retrieval as the system tuned itself to her specific memory signals.

When she did not understand something, or simply had a lower probability of moving a key piece of information from working memory to long-term declarative memory (for her work purposes, for example), her AI voice assistant would brief her more circularly, repeating the important items exactly the right number of times for her brain to absorb the information optimally. This sort of intelligent information titration is at the core of augmented cognition programs aimed at maximizing cognitive performance, and it is also central to ecosystems of technologies like those being tested and developed at Toyota's Woven City (link).
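
As a toy illustration of what that titration might look like, the loop below re-queues each briefing item while a hypothetical confusion signal stays high, capping the repeats so the brief still fits the ride. The `confusion_score` function is a placeholder for whatever the BCI would actually estimate; here it is just random.

```python
# Toy sketch of information titration: repeat each briefing item until a
# (hypothetical) confusion signal drops below a threshold, with a repeat cap.
import random
from collections import deque

def confusion_score(item: str) -> float:
    """Placeholder for a real-time BCI estimate of confusion; random here."""
    return random.random()

def deliver_brief(items, threshold=0.6, max_repeats=3):
    queue = deque((item, 0) for item in items)
    while queue:
        item, repeats = queue.popleft()
        print(f"Assistant: {item}")
        # Re-queue for a later pass if the listener still seems confused
        # and the repetition cap has not been hit yet.
        if confusion_score(item) > threshold and repeats < max_repeats:
            queue.append((item, repeats + 1))

deliver_brief([
    "Meeting moved to room 4B.",
    "Q2 revenue is up 8 percent.",
    "Action item: confirm the vendor contract.",
])
```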

External impact

While it was only a 20-minute ride, Jill's whole ecosystem has changed :) She got into the car in a much better state than she would have been in without BCI technology orchestrating a sound night of sleep, and to enter the car she used a new standard for high security: Brain ID.

Authentication methods are changing fast as our devices evolve: effortless, passwordless authorization is one of the main trends now, and Brain ID meets all of its criteria. There is no doubt that Brain ID capabilities will drive faster BCI sensor adoption, enabling easier single sign-on services and other identification needs across the technology stack.

Smart cars (from L3 autonomous vehicles onward), as one example, will require significantly less human attention than they do now. Car manufacturers may lose many of their main points of competition, like the driving experience when you are the driver. The competition will move to how well car companies take care of the passenger who is not driving, the 'occupants' of the interior car space.

The car essentially becomes a room where you spend time. It's both a challenge and an opportunity for automotive companies: will they lose their brand connection with riders and become no-name OEMs for entertainment and internet companies, or will they keep direct relationships with customers and coordinate everything happening in the vehicle?

It seems to me that car companies are now in a position similar to the telco industry's before user relationships and value were captured by internet companies, which, of course, never gave either back.

Voice assistants are another area we touched on, as is education technology (more on this later in the day). What's important here, in the 'going places' part of her example morning, is that Jill's BCI provided reliable identification and silent command capabilities along the way, removing important barriers to her voice banking and checkout needs while creating a range of entirely new experiences.

How we get around is poised to change dramatically with electric power, autonomous vehicles, and smart cities all advancing and combining to orchestrate new experiences of comfort, safety, and personalization.

I’ll continue describing Jill’s Day in my next post… stay tuned.
