Musk Personally Pushes Autopilot To The Limit To Improve System

Tesla Autopilot


The Tesla CEO is very hands-on. And off!

Tesla CEO Elon Musk takes a very hands-on approach to running the electric vehicle manufacturer. From acting as product architect to spending time on the assembly line torquing bolts, his fingerprints are all over every part of the company. We’re not surprised, then, to learn of the depth of his involvement with the development of the company’s Autopilot feature: a suite of advanced driver-assistance systems (ADAS) that control a vehicle with oversight from the driver. According to The Information, the entrepreneur drives a Model S equipped with a special development version of the software that he can use to judge the latest changes and suggest new ones.

He doesn’t just drive the car and make suggestions, though. He, at times, actually holds the weekly meeting with the Autopilot team’s senior managers — there are about a dozen of them, and our source article discusses their individual roles — from behind the wheel. While interacting with the system, he can point out any flaws or give real-time feedback on slight setting changes. You see, besides having the latest features, his development version of the software allows him to tweak how aggressively the car reacts in certain situations.

For instance, he has more latitude over the space between his car and the one it’s following, and can even have the system change lanes in tighter situations. This allows him to experience bugs that normal Autopilot users would never encounter. It’s this flexibility, they say, that came into play when his experiences led the team to have the feature “…steer away from big vehicles, such as trucks, that may unintentionally cross into the Tesla vehicle’s lane as they drive alongside it.”

Besides talking about the Autopilot team and Musk’s deep involvement, The Information also gives us a heads up about upcoming improvements to the system. According to the publication, we might see Autopilot-equipped cars recognize stop signs and traffic lights, and stop accordingly, sometime next year. They may even begin taking right turns on their own. As always, this technology should be interesting to watch as it progresses toward eventual full self-driving capability.

Source: The Information

Categories: Tesla



21 Comments on "Musk Personally Pushes Autopilot To The Limit To Improve System"


And everything else too.

The ability to recognize street signs and traffic lights is a big thing, because the system can then function on city streets, along with helping distracted/unaware drivers do better with stop signs and stop lights. I also understand allowing for right turns before left turns, which are more critical since they cross opposing traffic.

With thousands of drivers offering real-time feedback on a daily basis, Tesla has to be so far ahead of any other ‘company’ attempting to create an autonomous driving system using a few test vehicles in controlled environments. The difference in the real-time feedback from an actual consumer vs a professional driver has to be huge.

Still, I don’t get why automakers have not come together to build one standard system that they all develop and share, instead of several individual systems. Tesla had the clear lead, and it seems as if the other automakers resisted Tesla’s efforts because the company would also be eating into the ICE automakers’ consumer base with EV technology as well. But then again, with combustion engines going away, along with transmissions, cooling systems, etc, ICE automakers need to find something different to market their EVs with, which I guess… Read more »
Tesla and the other companies are essentially starting from different locations. Tesla have picked up the low-hanging fruit (driving steady on main roads, within defined lanes) while most of the other companies started with what is arguably the much more complex side of things – city navigation.

It’s also worth pointing out that the original Autopilot was designed by Mobileye, which is a large AV company providing products to multiple manufacturers. Tesla then decided to move away from them a few years ago and have now got back to where they were (using Mobileye tech) with their own tech, so arguably Tesla is the one resisting others’ efforts, not the other way round (they’re just about the only one insisting LIDAR isn’t needed, for example).

Tesla don’t have as much of a lead (if any) as many seem to believe. They do have a different ethos though. The other manufacturers (Waymo, “Uber”, Ford, GM etc.) are testing and using products that will start off in fleets within towns and cities (“ride sharing” AKA taxis), where they can control and monitor the technology before subsequently moving it out to the public. Tesla has gone the other way (again) and is… Read more »

The Tesla approach may be a really long and tortuous path. Tesla is already back-tracking on their on-board computers by making them easily replaceable. Everyone else is clean-sheeting the design whilst Tesla thinks they can incrementally improve existing driver-assist tech into FSD. If it turns out to be impossible, then Tesla will be far behind the end-game.

Tesla will have a really bad PR day when people start litigation because FSD is never delivered.

“Tesla is already back-tracking on their on-board computers by making them easily replaceable.”

They made them replaceable when they ditched Mobileye a few years ago and started in-house chip development, and they said all along that the computers were modular. How is that backtracking? That’s just planning.

“Everyone else is clean-sheeting the design whilst Tesla thinks they can incrementally improve existing driver-assist tech into FSD.”

Their software and their upcoming chip are a clean-sheet design, aren’t they? Do you mean the cameras? What’s wrong with their cameras? What is a clean-sheet camera? Who else has their own software and computer hardware in the first place?

I dunno, it looks to me like everyone working on this except Waymo is foolishly trying to incrementally improve from Level 2 to Level 4.

Nobody is going to succeed without using active scanners to scan the entire environment 360° around the car, and build up a real-time SLAM simulation so the car can “see” stationary obstacles. Passive scanners like cameras aren’t gonna cut it for multiple reasons, the most obvious of which is that they can’t “see” at night any better than our eyes can.
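The mapping half of the SLAM idea described above can be sketched very minimally. This is a hypothetical toy, not any manufacturer's actual system: it assumes the vehicle pose is already known and simply marks occupancy-grid cells hit by 360° active-scanner range returns.

```python
import math

# Hypothetical minimal occupancy-grid update: mark grid cells hit by
# 360-degree range returns around a known vehicle pose. Real SLAM also
# estimates the pose itself; this shows only the mapping half.

GRID = 21          # 21x21 cell grid, vehicle at the center
CELL = 1.0         # metres per cell

def update_grid(pose_x, pose_y, returns):
    """returns: list of (bearing_rad, range_m) active-scanner hits."""
    occupied = set()
    for bearing, rng in returns:
        hit_x = pose_x + rng * math.cos(bearing)
        hit_y = pose_y + rng * math.sin(bearing)
        col = int(round(hit_x / CELL)) + GRID // 2
        row = int(round(hit_y / CELL)) + GRID // 2
        if 0 <= col < GRID and 0 <= row < GRID:
            occupied.add((row, col))
    return occupied

# A stationary obstacle 5 m straight ahead registers as an occupied cell
# regardless of lighting -- the point the comment makes about active sensing.
cells = update_grid(0.0, 0.0, [(0.0, 5.0)])
print(cells)  # {(10, 15)}
```

Because the range comes from the sensor itself rather than from recognizing the object in an image, the obstacle lands in the map even when the camera pipeline would fail to classify it.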

So long as Tesla keeps trying to depend on cameras as the primary sensors, it’s going to be stuck in the equivalent of second gear.

Just my opinion, of course.

Infrared cameras can see better than me at night so I disagree on that count. Also, Teslas are equipped with headlights.

Knowing where objects are around you with cameras is a simple math problem once you’re able to identify objects and track a point of reference as you move at a known velocity. The real issue is whether you can actually identify objects and traffic signals, and that goes for whatever sort of sensors you have. Knowing an object is on a sidewalk isn’t the same as knowing it’s a pedestrian about to cross with full right of way, and you can scan a traffic light all you want, but it won’t matter if you can’t tell which lane it’s for. That’s the difference between Level 2 and 3.
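The “simple math” the comment alludes to is triangulation: once a stationary point is identified and tracked across two frames taken a known distance apart, its range follows from the standard stereo relation. A hedged sketch, with a made-up focal length and assuming an ideal lateral baseline (real structure-from-motion geometry is messier):

```python
# Hypothetical triangulation sketch: distance from two views of a tracked,
# stationary point. Numbers are illustrative, not from any real camera.

FOCAL_PX = 1000.0   # assumed focal length in pixels (camera intrinsics)

def depth_from_motion(baseline_m, disparity_px):
    """Distance to a stationary tracked point.

    baseline_m: how far the camera moved between the two frames
    disparity_px: shift of the point's image between the frames
    """
    return FOCAL_PX * baseline_m / disparity_px

# Car at 20 m/s, frames 50 ms apart -> 1 m baseline; a 25 px shift -> 40 m
print(depth_from_motion(20.0 * 0.05, 25.0))  # 40.0
```

As the comment itself notes, the geometry is the easy part; the hard part is the object-identification step that produces a reliable point to track in the first place.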

Also, from personal experience, Waymo is NOT ready for prime time. Watching them in action is like watching a nervous squirrel try to decide how to cross a highway.

“Infrared cameras can see better than me at night so I disagree on that count.”

That would be compounding the problem with the unreliability of optical object recognition. Then you’d need two sets of cameras and two software packages with unreliable ability to detect objects in video pictures. Infrared cameras also require active cooling systems, which would make miniaturization difficult and expensive. There are good reasons nobody is trying to use infrared video cameras for self-driving cars.

“Also, Teslas are equipped with headlights.”

Headlights that only illuminate the path the car is pointed in, and that inadequately. The problem with cars over-running their headlights is something that safety regulators are only now starting to address. Are you really suggesting that self-driving cars be limited like that, and only be able to “see” what the headlights are pointed at? That would be rather foolish.

“Knowing where objects are around you with cameras is a simple math problem once you’re able to identify objects…”

May I suggest, politely, that you read up on this subject. What you’re asserting broadly ignores the problems with optical object recognition.

“Also, from personal experience, Waymo is NOT ready for prime time.”

I agree, but they… Read more »

I didn’t ignore the problems with optical recognition. I think it’s more that you ignored the rest of my paragraph identifying optical recognition as the primary problem.

As for your assertions about infrared cameras being impractical in self-driving applications, having two software packages and a cooling system is not untenable. Surely you must be alluding to some other reason, like perhaps that they can’t distinguish color, which makes image recognition more difficult. I think a good reason that they aren’t needed is because there isn’t a requirement that cameras have super-human visual prowess to begin with, even though with ultrasonics and 8 cameras and a radar, the car clearly beats a pair of human eyes anyway.

I do think that headlights solve some of the problem of things being too dark along the path of a vehicle. Having cars only see things with headlights does sound foolish, but who’s suggesting it? Not me. 🙂

As for Waymo, I would not rate their system as anything approaching reliable. Not even nearly approaching. Going at speeds that preclude there being damage if things go awry and simply refusing to merge onto highways in traffic is not a self-driving system.

Funny that you are claiming having two sets of sensors is compounding the problem, while at the same time claiming an additional set of sensors (LIDAR) is indispensable…

I think the main thing that Pushmi-Pullyu is missing is that if the problem is the recognition of stationary objects, you can only run into a stationary object directly in front of you when you’re not in reverse. You’re not going to be t-boned or rear-ended by a stationary object. Tesla has multiple forward-facing cameras, ultrasonics, and a radar so LIDAR adds exactly zero new pieces of information over Tesla’s system in that situation.

In engineering, the ‘start with something simple and then gradually add the more advanced stuff’ approach is usually more successful than trying to launch a perfect product. Nothing beats getting your ‘hands dirty’ in ‘the real world’.

Trying to incrementally improve Level 2 autonomous driving to Level 4 is about as useless as trying to improve the sails on a sailing ship when what you need is the steamship. Sure, there were advances to be made in the shape of the hull and in the steering gear, but no amount of fiddling around with the sails is gonna produce steam power.

Until self-driving cars get reliable SLAM systems using real-time active scanning, they will never stop running into stationary obstacles.

Tesla autopilot does not continuously run into stationary obstacles. You’re underestimating their system and overstating the benefit of LIDAR(?) it seems. Obviously Tesla’s current system already does simultaneous localization and mapping (SLAM) of its environment as is.

It would seem that LIDAR-based systems will never reach autonomy without extremely sophisticated object recognition and the same is true of Tesla’s system. However, extremely sophisticated object recognition enables an accurate distance measurement so either both systems will enable autonomy in the future or neither. LIDAR only carries the advantage of an extra level of collision avoidance at Level 2, which masks the deficiencies of the object recognition software. It is an extremely dangerous path. A car that can act to avoid collisions, but has no real understanding of the objects in its environment.

With Tesla’s system, one can at least be confident that the car is acting because it is aware of what is in its surroundings and its awareness will only improve over time.

I don’t know if Tesla is doing aerospace-level simulations off the street, or only relying on shadow driving and machine learning. If they are not running simulations, they are going to have to get about 1 trillion shadow-driving miles. That’s about 10 years with 250,000 vehicles driving 24/7. Their fleet is growing fast, of course, but at last count I think they said they are getting 3 billion miles per year? That’s roughly 333 years.
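The comment's back-of-the-envelope numbers check out under one added assumption (an average speed of ~45 mph for the round-the-clock fleet, which the comment leaves implicit):

```python
# Sanity check of the comment's own figures. The 45 mph average speed is
# an assumption made here to reproduce the "about 10 years" claim; the
# 1 trillion miles and 3 billion miles/year come from the comment.

target_miles = 1e12                       # "about 1 trillion shadow driving miles"

fleet_rate = 250_000 * 45 * 24 * 365      # 250k cars, 24/7, ~45 mph average
print(target_miles / fleet_rate)          # ~10.1 years

print(target_miles / 3e9)                 # ~333 years at 3 billion miles/year
```

So both of the comment's timelines are internally consistent; the gap between them is just fleet size times utilization.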

Miles driven in the real world (even in shadow mode) are way more valuable than miles driven in a simulation.

Presumably a lot of other employees of Tesla have the same software as well?

The article claims that other employees have special software too, but Elon’s supposedly is extra special…

I get that Tesla hopes to do everything with cameras, but there has to be some plan to improve their dynamic range. It’s well known that humans are able to handle a much wider brightness range, which is why HDR took off to actually try and replicate what we actually see (to varying degrees of success.)

Anyone know anything more on that front?
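To put rough numbers on the gap the comment describes: dynamic range is often expressed in photographic “stops” (factors of two of brightness). A small sketch using commonly cited ballpark contrast ratios, not measured specs for any particular sensor:

```python
import math

# Illustrative dynamic-range comparison in "stops" (log base 2 of the
# brightest:darkest ratio). Both ratios are ballpark figures, not specs.

def stops(contrast_ratio):
    """Dynamic range in stops for a given brightest:darkest ratio."""
    return math.log2(contrast_ratio)

human_eye = stops(1_000_000)   # ~1,000,000:1 with adaptation
typical_cmos = stops(16_000)   # ~16,000:1 for a decent conventional sensor
print(round(human_eye, 1), round(typical_cmos, 1))  # 19.9 14.0
```

HDR techniques close some of that ~6-stop gap by fusing multiple exposures of the same scene, which is essentially what the comment means by replicating what we actually see.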

The cameras need to have sufficient dynamic range, no matter whether they are assisted by other sensors or not… I haven’t seen it mentioned as a problem anywhere.
