Issue #43: Driving Mr. Jason, My First Week with Tesla FSD — Jason Michael Perry

Howdy👋🏾! Tesla’s Q1 earnings missed the mark, so the company’s offering everyone a free 30-day trial of its $15,000 Full Self-Driving beta. If you read my newsletter, you know I own a Tesla. I’ve documented my experience facing range anxiety driving from Baltimore to Winston-Salem, NC, and shared my thoughts from the backseat of Waymo, Google’s fully autonomous self-driving taxi.

Even with experience in robotics and a keen interest in autonomous driving, $15k on top of my car’s purchase price (FSD is also available as a $200-a-month subscription) felt like a leap too far, so I was more than excited to take Tesla up on its offer.

Before I share my thoughts on Tesla’s Full Self-Driving, or FSD, let me explain some of Tesla’s lingo so we’re all on the same page. All Teslas ship with Autopilot for free, which is best described as autonomous cruise control. This feature is meant for highway driving and enables the car to stay in its lane and adjust its speed relative to the cars ahead. Tesla also offers Enhanced Autopilot for an additional $6k, which adds the ability to navigate from on-ramp to off-ramp, switch lanes to maintain speed, auto park, and summon, which can help get a car out of a tight parking spot or pick you up in a parking lot.

My car has Autopilot, which I enjoy and use on the interstate regularly, but Autopilot is far from perfect. Many have reported that the car can slow down in response to brake lights or hazard lights in far lanes. It gets very confused when passing an on-ramp, and Autopilot can deactivate suddenly with little notice. This so-so experience led me to assume Tesla’s FSD mode would be similarly lacking.

Back to the lingo: FSD promises to fully drive a car from point A to point B by navigating city streets, highways, interstates, and parking lots. The goal is that you sit in the car, put in a destination, and the car handles the rest of the trip, all while abiding by traffic lights and signs. The language Tesla’s trial uses is supervised FSD, meaning the car can do these things, but the driver must stay alert the entire time.

I enabled FSD about a week ago. I generally drive very little, mainly to the office two days a week, and that trip is about 3.5 miles all through city streets with very few turns—not the best test. So, to get a better feel, I added two round-trip visits to Delaware and drove to DC overnight to help out my little brother. I also squeezed in errands like the grocery store and a dinner date.

FSD is pretty darn amazing. I’m truly impressed. 

One of the features I couldn’t wait to try was summon. I have long dreamed of walking out of a mall or business meeting and having the car pull up to the front door with the cabin at the perfect temperature. Sadly, my 2023 Tesla Model 3 does not appear to support this feature yet. The reason requires some quick backstory.

A few years ago, Tesla moved away from its radar and ultrasonic sensors to a camera-only implementation known as Tesla Vision. Having dabbled in robotics with Mindgrub’s Snax project, I found this surprising, to say the least. The reality is that video alone is not the most reliable system, especially in dark or rainy conditions, so the combination of cameras and other sensors offers a level of fault tolerance that seems necessary for a $50k self-driving vehicle. Rumor has it that the sensors were removed to cut per-car costs, even though internal engineers saw the change as a setback.

Sadly, features like summon are being rewritten to work with Tesla Vision, and I don’t know when they’ll be supported.

The driving experience is, at first, jarring. I would compare it to the anxiety I felt teaching my daughter to drive. It just doesn’t drive like I drive. I constantly found myself acting like a nervous passenger, asking if the car noticed the light had turned red or if the car ahead had its brake lights on. I had to bite my lip several times during those first few drives and learn how the car reacted instead of always jumping to grab the wheel. This was nerve-racking, but the more I learned about the car, the more I began to trust it, for better or worse. Here are a few situations that surely spiked my blood pressure but that the car handled flawlessly:

  • Approaching construction on a city street that narrowed the road to one lane and then obstructed that lane, requiring the car to drive around the obstacle.
  • Making a right turn onto a city street, only to immediately discover a car blocking the right lane with its hazards on.
  • Navigating a two-lane city street with a parked Amazon delivery truck blocking one lane and two cars with their hazards on delivering food in the other.
  • Moving into the middle lane with its blinker on just as another car, without a blinker, moved into the same lane.
At least FSD is not afraid of driving over bridges…

In each of these cases, I kept my hands on the wheel, ready to intervene, but was surprised to find that the Tesla handled them with little issue. In the case of the car merging into the same lane, the Tesla noticed the problem before I did and moved back into its original lane before I could react.

For better or for worse, the more it handled situations like this, the more complacent I felt while letting the car drive. Things that once kept my toes curled and my back rigid, I began to expect the car to handle. That made driving long distances much more relaxing, and it also made me aware that, supervised or not, it would likely be too late for me to react if the car made a mistake.

So now, let’s talk about the things I’m not a fan of.

Baltimore has terrible streets: many roads are littered with potholes, and plenty of two-lane neighborhood streets no longer have lane markings. In FSD, the car slowed for speed bumps and, when it could, for particularly bad potholes, but it didn’t attempt to drive around them. On one very bad stretch of road near Johns Hopkins Hospital, I wondered if the car was going out of its way to hit every single one. I also found the car sometimes drove down the middle of a two-lane street lacking any painted lines, and you can’t correct the car or explain that it should keep left or right. It did, however, correct itself when it noticed other cars driving on that same street.

Speed limits are much more confusing than I realized. Beyond the fact that streets (looking at you, DC) will shift from 45 MPH to 20 MPH, many areas require you to just know the speed limit. For example, an on-ramp is meant for acceleration, but beyond that, it rarely has a posted limit of its own. In one part of Baltimore, entering I-95’s express lanes involves a long set of “on-ramps” before you truly reach the expressway. On my first drive, the car was convinced the speed limit was 35 MPH, the same as the street we had turned from, forcing me to accelerate manually by hitting the gas.

After this and some Googling, I discovered that FSD enables a new feature called Automatic Speed Offset, which determines the target speed based on the speed of other cars or the type of street the car is on. Turning this feature on helped a ton and made the car drive more naturally, but it sometimes created weird situations, like the car not decelerating as quickly as I would like on off-ramps or taking curves without slowing down.

The car’s lane changes are focused on optimizing speed, and it often waits longer than I would to get over to the right lane before an exit. This led to the car (and sometimes me) needing to get extra aggressive about reaching the right lane to avoid missing an exit.

Left turns can be challenging, and apparently it’s not just me, especially when the view is partially obscured by parked cars. I think timid is the best way to describe FSD when making a left turn. I often needed to press the accelerator to get it to put some oomph into the turn. On obscured left turns, the car would inch out, but its timidness when the street was obviously clear made me a little wary, so I often took control to complete the turn.

The last issue, which is a feature request, is how the car treats reaching its destination. Let me explain.

My house, a row home in Baltimore, has on-street parking in front but a garage accessible through the alley behind the house. The streets that access that alley are both one-way, so the best route to my house is often different if you know the real destination is the garage in the alley and not the street address. As I get closer to home, I often ignore the GPS directions, knowing that what it thinks is the destination is simply wrong.

No matter how many times I do this, I have yet to find a way to tell the car, Apple, or Google Maps that my actual destination is the unaddressed alley behind my home. FSD magnifies this issue because you’re locked into the path the mapping decides for the car, and you can’t adjust it to say take this street instead of that one.

On my drives to frequent destinations, I often make routing decisions based not on the optimal route but on feel. Maybe I like to pass in front of a favorite spot to see if it’s busy, or I prefer a street that feels safer, or, in Baltimore, it’s not unusual to take a road that’s paved better than the one that would get me someplace quicker.

You can’t tweak how the car gets to a destination, and you certainly can’t tell it to remember how you go from point A to B. You typically get one to three route options, and anything else means leaving FSD and taking control. While that sounds simple, you have to stay in control long enough for the car to understand that this is what’s happening. Otherwise, it will try to readjust for the best speed and sometimes actively work to get back onto the road you’re trying to avoid.

So far in my testing, I’m surprised that FSD feels SO much better than the default Autopilot. I can only assume that FSD uses a different set of code or that Autopilot was purposely limited. Many of the issues I frequently ran into with Autopilot just didn’t happen in FSD. I don’t get it, but I’m enjoying my trial. Now, here are my thoughts on tech & things:

⚡️ Speaking of autonomous driving, it looks like Cruise is bringing back its robotaxis with humans ready to intervene as needed, all while Waymo expands its own service across California. Maybe this is why Tesla has said it may delay its planned $25k car to focus on an upcoming Robotaxi announcement.

⚡️ Apple’s world is getting complicated. Last week, I mentioned the soon-to-be-released AltStore in the EU, which will allow retro game emulators on Apple’s platform. Like clockwork, Apple announced support for this (though limited) on the App Store. Is this a sign that the DMA is actually working?

⚡️ I did not see this coming. Mindgrub’s partner Automattic just purchased Beeper, the same messaging app that reverse-engineered Apple’s iMessage. Between this and some of Automattic’s moves into the Fediverse, I can’t help but wonder what future plans they might have.

My experience with FSD makes me raise an eyebrow at the word supervised. As with Autopilot, the cameras in the car watch to make sure you’re focused on the road, and if the car thinks you’re not, it sometimes sounds an alert and requires you to gently tug on the wheel to let it know you’re paying attention.

All of this exists for good reason, documented in plenty of articles about Tesla accidents in Autopilot and other modes and in safety concerns about how well these features work. But I must say it makes for an annoying driving experience, and it’s even worse with FSD. In Autopilot, the car generally goes in a straight line, but FSD switches lanes, takes turns, and does much more, and tugging the wheel to prove I’m alert while the car is in the middle of a lane change feels unsafe.

I also realize that once you get comfortable with these features, the nature of how they work causes you to trust the car more and more, and with that trust, you relax and rely on it more. I found myself setting mental markers on the map, like getting off the interstate, as places where I knew to focus and give my fullest attention, while playing second fiddle to the car for most of the actual driving. The result: even with my hands on the wheel, if the car didn’t react to a situation, I’m not sure I would be able to understand it, take control, and react in a way superior to whatever decision the car would make on its own.

This Friday, I’m kicking off the Anne Arundel Community College Chesapeake AI Hackathon with a virtual intro-to-AI workshop. Sign up for a chance to see my talk for free. Next week is also stacking up to be pretty busy. I’m at HIMSS Maryland on Monday. On Tuesday, I’m joining Mindgrub’s partner NowSecure for a webinar, Navigating the Pitfalls of AI-Generated Code: Security Best Practice, at 11 AM ET, and that’s all followed by a virtual panel conversation on Wednesday about the impacts of AI on workforce development, starting around 3 PM ET. Of course, my workshop schedule is also current, so sign up for a class or reach out for a private session for your team.

-jason

p.s. How could we end this without considering the many promises Tesla has made about FSD and other product announcements? Look, y’all, I love my car, but man, watching this video really makes you wonder how this all shakes out. I’ll let you know when I cash in my trillions from my Tesla Robotaxi fleet!