What If Tesla Had Its Own Proprietary AI Assistant?

DEC 16 2017 BY EVANNEX

Source: Tesla

TESLA SHOULD LAUNCH AN AI ASSISTANT

When Tesla unveiled the Model 3 in early 2016, people were shocked at the minimalism and simplicity of the car’s interior. Everything in the car is controlled through a single touchscreen.

When compared to a 2016 BMW 3 series, the difference is striking.

*This article comes to us courtesy of EVANNEX (which also makes aftermarket Tesla accessories). Guest Blog Post: Galileo Russell*. Posted by Matt Pressman.

Source: BMW

The reason for Tesla’s departure from industry norms can be analyzed from two different angles.

Why no buttons?

The first reason is Tesla’s newcomer status. The company has no legacy to attach itself to — therefore they were able to start from scratch when designing the Model 3. This is a big reason Tesla was able to (so easily) depart from the status quo.

The second reason is Tesla’s big bet on software. Every little nuance on the Model 3 is controllable via software through the touchscreen. If everything is activated via tapping, some may construe this as burdensome — perhaps even a step backward. Certain tasks require several swipes/taps, where a button could be simpler. Beyond a clunkier user interface, it’s conceivable this could add to driver distraction.

But when you start to think about where the future of transport (autonomous) and technology (voice) are headed, the Silicon Valley automaker’s strategy starts to make a lot of sense.

Voice is coming

In 2014, Amazon created a new tech product category with the launch of its Alexa-enabled smart speaker. Since then, the company has sold an estimated 15M units and boasts an impressive 75% market share.

Source: Amazon

But it’s not just Amazon. As this chart from Recode shows, the cat is out of the bag and the era of voice is coming quickly. Most large tech companies have accepted that voice is the future and are beginning to arrange their product lineups accordingly.

Source: Recode via Jackdaw Research

As the functionality and software behind smart speakers improves, their value is becoming undeniable.

A hands-free voice approach is quickly becoming the de facto interface for how we control our homes. Everything from turning on the lights, to ordering an Uber, to checking the news can be done with a smart speaker.

The selling point to consumers is clear. Voice saves time and energy, and therefore it’s incredibly convenient for a huge array of tasks.

Beyond the home, the car seems like the next logical step for voice interactions. Fiddling with buttons, knobs (or even a touchscreen) while driving can be distracting, and seems far more burdensome than simply asking for what you want to happen.

  • “Roll down the front two windows”
  • “Play my favorite Spotify playlist”
  • “Take me to the closest Starbucks”
  • “Put on my favorite podcast”

These commands aren’t that complicated and make for a much more frictionless driving experience. When voice commands are combined with an autonomous car, the synergies are very compelling.
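A minimal sketch of what the intent matching behind commands like these might look like. Everything here is hypothetical — the patterns, intent names, and slots are invented for illustration and don’t reflect any real Tesla API:

```python
import re

# Hypothetical command patterns -- invented for illustration, not a real
# vehicle API. Each pattern maps a spoken phrase to an intent name plus
# any extracted "slots" (e.g. the destination).
INTENTS = [
    (re.compile(r"roll down the (?P<which>front|rear) (?P<count>two|all) windows"),
     "windows.open"),
    (re.compile(r"play my favorite (?P<service>spotify) playlist"),
     "media.play_playlist"),
    (re.compile(r"take me to the closest (?P<place>.+)"),
     "nav.route_to_nearest"),
    (re.compile(r"put on my favorite podcast"),
     "media.play_podcast"),
]

def parse_command(utterance: str):
    """Return (intent, slots) for a recognized utterance, or (None, {})."""
    text = utterance.lower().strip()
    for pattern, intent in INTENTS:
        match = pattern.search(text)
        if match:
            return intent, match.groupdict()
    return None, {}

print(parse_command("Take me to the closest Starbucks"))
# ('nav.route_to_nearest', {'place': 'starbucks'})
```

A real assistant would of course sit behind a speech-recognition front end and a far richer language model, but the shape — utterance in, intent plus parameters out — is the same.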

Imagine in 2025 hopping into your car and telling it where to take you and what entertainment to play.

Enter the AI assistant: Jarvis

Tesla’s theoretical assistant Jarvis (which I’m naming after Tony Stark’s AI assistant in Iron Man) fits seamlessly into this future.

Source: One Reach

Talking to your car about what you want seems like the most natural and frictionless riding experience. As a luxury brand, Tesla should settle for nothing less.

Fiddling with radio stations, entering directions into a navigation system and even controlling temperatures will seem archaic in this new AI assistant era.

The beauty of the automaker’s software-first interior design approach is that it’s set up perfectly to adapt to an AI assistant when the company is ready.

Source: Tesla

With everything in the car controlled by software, an AI assistant will eventually have access to the most useful features and settings that passengers need.

Tesla can upgrade cars remotely via over-the-air software updates and slowly introduce an AI with increasing functionality over time.
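As a rough illustration of that staged rollout idea, here is a sketch assuming a hypothetical table of features gated behind minimum software versions. The version numbers and feature names are invented, not real Tesla releases:

```python
# Hypothetical staged rollout: each feature unlocks once the car's installed
# software version reaches a minimum. Versions follow a "year.week" scheme
# purely for illustration.
FEATURE_ROLLOUT = {
    "voice_nav": "2018.10",       # voice-controlled navigation
    "voice_climate": "2018.24",   # voice-controlled climate settings
    "full_assistant": "2019.08",  # full conversational assistant
}

def parse_version(v: str) -> tuple:
    """Turn '2018.24' into (2018, 24) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def enabled_features(installed_version: str) -> list:
    """Features unlocked once the car has installed a given software version."""
    installed = parse_version(installed_version)
    return sorted(
        name for name, minimum in FEATURE_ROLLOUT.items()
        if installed >= parse_version(minimum)
    )

print(enabled_features("2018.30"))
# ['voice_climate', 'voice_nav']
```

The point of the sketch: the same fleet can carry dormant assistant code, with capabilities switched on incrementally as each over-the-air update lands.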

Where will Tesla get the money!?!

I can already foresee the biggest complaint — Tesla doesn’t have the money. They’re already capital constrained with Model 3 production and have a million other things on their plate. For the most part, I agree, and it’s likely impractical for Tesla to dedicate meaningful resources to this right now.

That said, Elon’s vision for Tesla and the future of transportation is grandiose (to say the least) and a world with autonomous electric cars taking us from point A to B seems a natural fit for voice.

Tesla needs to think about this future and invest in a program like this if the Model 3 continues to be successful. So even if now is not the time to ramp up spending on Jarvis, the day is quickly approaching.

Tesla should launch an AI assistant

Given Elon Musk’s visionary nature, and this interesting clue in Tesla’s Q3 2017 shareholder letter, my personal guess is that Tesla’s already working on an AI assistant.

Source: Tesla

But even if they’re not, I strongly believe it should be on the company’s roadmap.

Tesla’s brand image is that of a software-first car company of the future. This means staying one step ahead of the curve and delivering a superior driving experience.

Having a proprietary AI assistant will become a necessary piece of this puzzle as cars continue to become more and more like smartphones.

“Now Jarvis take me to the Gigafactory! I think Elon still needs help on Model 3 production!”

Video

Source: HyperChange TV

===

*Guest Blog Post: Galileo Russell is a 24-year-old Tesla shareholder based in NYC. He has been blogging about Tesla since 2012, and is the founder of HyperChange TV, a new YouTube channel about tech and finance news for millennials.*

Categories: Tesla

28 Comments on "What If Tesla Had Its Own Proprietary AI Assistant?"


hmmmm… Tencent sunk $1.78 Billion into buying TSLA shares early this year:

https://insideevs.com/chinas-tencent-invest-1-78-billion-in-tesla-gets-5-stake/

Tencent now has an Alexa-like voice assistant:

https://www.techinasia.com/tencent-voice-assistant-launch

Would cost money, when this is supposed to understand several languages…
Just ask Apple and Google. They only cover a few languages, years after their assistants were released.
Buttons, dials and switches are universal.
But I understand they save cost by doing this. Quality buttons and switches are expensive if you buy millions of them.
I also think voice commands are more a gimmick in most situations, just like hand gestures.
There are situations where it may be OK, and even an advantage – but they are few.

I’ve never personally used gesture control, but from what I’ve read it sounds like it holds great promise. And unlike voice interface, it would seem to be a “universal language” (at least for those who have a full set of fingers) and would be far less prone to outside interference and variances between different people, which are problems which plague voice interface.

To call gesture control a “gimmick” seems to be about as forward-thinking as claiming motorcars are a passing fad. 😉

Well, Motor Cars did a lot of passing of Horses, along the way!

Hand gestures (not a gimmick for the few) come in many forms, especially if one’s vision is compromised due to many outlying circumstantial factors.
These types of command inputs will be a most welcome surprise for many vision-handicapped individuals (current non-drivers). Drivers with line-of-sight problems and grayscale impairment, along with elderly drivers navigating at night, will welcome these input choices. The more all-inclusive AP driving control the better, no matter how it gets initiated into any car’s UI.

Yeah. I just watched a video of gesture controls in the BMW 7-Series. Looks very promising, and almost certainly not prone to the same fallibility as voice recognition.

This appears to be the solution to complaints about people wanting knobs for certain controls such as radio/sound system volume or vent fan speed or intermittent wiper controls. The BMW 7-Series gesture control can recognize the action of a hand turning, as if there’s an invisible knob being turned, to control volume.

I wonder if it could detect the motion of a single finger waving, and turn on the windshield wipers?

Or depending upon which finger, honk the horn…

LOL!

“Or depending upon which finger, honk the horn…”

Presumably, if it’s the “one-finger salute”, that would be a horn mimicking the sound of a
“Bronx cheer”? 😀

Galileo needs to think about doing his podcast sitting on his hands.

Driving is fun.

This is nonsense. Tesla shouldn’t be dissipating its resources on developing advanced voice control systems. It’s bad enough that Tesla is spending so much time and energy on developing a self-driving system. I have no doubt that as voice interfaces for expert systems (falsely labeled “A.I.”) advance, they will be integrated into Tesla’s cars. I also have no doubt that, as LiDAR systems come down in price, they will also be integrated into Tesla’s self-driving car sensor systems. It’s as silly to claim that Tesla should spend its resources on developing cheap solid-state LiDAR systems as to claim it should develop voice interfaces for expert systems software.

“Jarvis” in the Iron Man movies is fantasy, or at best science fiction. It’s not real. Tesla can’t put fantasy into its cars; it has to put actual real-world systems into them. In the real world, no Jarvis exists.

Just a day or two ago I read a long complaint from someone about how problematic the voice-activated controls were in their car. He (or was it she?) said that he had to turn down the music and tell the kids in the back seat to be quiet, before… Read more »

Yes, A. D. D. If focus == 0, Fail

Voice recognition in the Model S is really good. I use it all the time for navigation and making phone calls. It’s extremely fast and accurate, and much better than typing it in. I would not call this a work in progress.

In a quieter environment voice control works fine. How about with the windows down? Or your daughter streaming her Spotify playlist via Bluetooth? It’s not that voice control doesn’t work; it’s just that it’s not 100% accurate and there are limitations to its use. Buttons always work regardless of the environment.

Maybe in the future buttons in cars will be little more than small LCD screens that can be reprogrammed based on driver preference, but still something you could physically press.

“Voice recognition in the Model S is really good. I use it all the time for navigation and making phone calls. Its extremely fast and accurate and much better than typing in. I would not call this a work in progress”

I don’t at all want to deny your experience; I’m glad it works for you, and obviously it must work to some extent or Tesla wouldn’t bother to include it in their cars. But from other reports, many other reports, voice recognition systems in general — not necessarily Tesla’s own system — are clearly problematic for some others.

Among the problems reported with voice recognition in general — again, not necessarily Tesla’s voice recognition system — is that the systems have a harder time interpreting women’s voices than men’s. Men’s voices are richer in harmonics. Women’s tend to be purer and lacking in harmonics, so there is less data there for the voice recognition system to interpret. I’ve also read that people who had their voice recognition system “trained” for their particular voice had problems when they had a cold, since stuffed-up sinuses change the sound of their voice.

But those are older problems, and perhaps they have been… Read more »

Without something other than that screen with thousands of options available to the driver, it’s a distracted-driving accident waiting to happen. Mark my words.

It always takes longer to touch a touchpad area than it does a knob or control. I know; my Bolt has both.

A brainstorm idea I had the other day, and I haven’t seen this suggested elsewhere, would be to put a panel on the dash with just two knobs, and a few icons in a row on the panel above the knobs. Touching one of the icons would cue up the controls for that specific task. This would enable a few of the more commonly used functions to be controlled with knobs, where people prefer those.

From reading what various people have requested or complained about, those icons might include radio volume, wiper speed, and climate control fan speed and temperature. A truly advanced system might allow the user to select which controls could be controlled by that panel, and perhaps even display different icons based on what the user selected.

Now, all I have to do is convince every EV maker that this is the right approach to eliminating most buttons and knobs, while still allowing drivers the comfort and positive control of tactile interface with a real hardware knob. 🙂

I was thinking along those lines as so many functions, virtually all, are done through the screen.
Not too surprising.

I couldn’t tell a word Galileo said in the Moonshot Monday YouTube video!!!
Slow down, man.

Look at those awesome heads-up displays by Jarvis. 😀

NIO just launched the ES8 and it already has this. It’s called NOMI, not Jarvis, though.

“Model 3, go faster”
“I can’t do that Dave, it’s not safe”
“Model 3, take me to McDonald’s”
“I can’t do that Dave, your over weight”

*You’re

Sorry, I couldn’t help myself.

Maybe interacting with an AI is too newfangled for me, but I can certainly imagine asking my car if I have enough range to go to X, or asking where there is a charger between where I am and where I’m going, and whether there is a non-Starbucks coffee shop near that charger.

I can also see myself asking it to play my Highway Playlist.

I can also see myself asking the car to remind me to stop at the grocery on the way home to pick up specific items.

Even solve “The Traveling Salesman Problem” (look it up). But in all sincerity, if you have to ask the car to raise and lower the windows, there is, in my opinion, a definite design problem with the car itself. In the same way that the hazard lights still have a physical button, I believe certain other functions should still have physical buttons in the most logical locations, such as windows, mirrors, and locks. If you want an operate-by-wire supplement, that’s fine, but in my not-so-humble opinion, until we’re at level 5 autonomy, I still want a few buttons.

“Even solve ‘The Traveling Salesman Problem’ (look it up)…”

Hopefully the car’s computer would not, as in all too many bad TV shows of the sixties and seventies, short-circuit itself in a shower of sparks merely because you had posed a problem beyond the capacity of the computer to resolve. 😉

No need to get all sci-fi. Use voice or gesture to set the “mode” and have the thumbwheels on the steering wheel do the actual control.

RADIO – one thumbwheel selects station, the other controls volume

MIRRORS – left thumbwheel controls left mirror, right controls right mirror

HEAT – left controls fan speed, right controls temperature

AC, DEFROST, etc. – same

You can do this for wipers, cruise control, etc., though rain sensing wipers and advanced cruise control shouldn’t need much fiddling.
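The mode-plus-thumbwheel scheme above can be sketched as a simple mapping. All mode and setting names here are illustrative — this is not a real vehicle interface:

```python
# Sketch of the commenter's idea: voice (or gesture) selects a "mode", and
# the two steering-wheel thumbwheels then adjust that mode's two settings.
# Mode and setting names are invented for illustration.
MODE_BINDINGS = {
    "radio":   ("station", "volume"),
    "mirrors": ("left_mirror", "right_mirror"),
    "heat":    ("fan_speed", "temperature"),
    "defrost": ("fan_speed", "temperature"),
}

class ThumbwheelController:
    def __init__(self):
        self.mode = "radio"  # sensible default mode

    def set_mode(self, mode: str):
        """Switch what the thumbwheels control (triggered by voice/gesture)."""
        if mode not in MODE_BINDINGS:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode

    def scroll(self, wheel: str, ticks: int) -> str:
        """Map a left/right thumbwheel scroll to the active mode's setting."""
        left_setting, right_setting = MODE_BINDINGS[self.mode]
        setting = left_setting if wheel == "left" else right_setting
        return f"adjust {setting} by {ticks:+d}"

ctrl = ThumbwheelController()
ctrl.set_mode("heat")
print(ctrl.scroll("right", 2))
# adjust temperature by +2
```

The appeal of the design is that it keeps tactile, eyes-free input (the wheels never move or change feel) while voice handles only the infrequent mode switch.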

That is perhaps more likely to become commonplace than my suggestion of a separate panel on the dash with two knobs, and a row of icons above it to select the function of those knobs.

My suggestion would have the advantage of allowing either the driver or the front seat passenger to access the controls, which you might want for either the radio/music player, or the climate control. But perhaps my suggestion is too limited, not sufficiently forward-thinking.

Physical buttons are more expensive, but easier to find. A microphone with software behind it is cheaper yet, but even less reliably accessible than a touch screen.

A trade-off.