January is over and I hope you’ve had a lovely time breaking your New Year’s resolutions! Personally, I got myself a sports injury last week which has been a great excuse not to work out …
But enough about you and me – let’s look at what the rest of the automotive industry has been up to this month:
Volvo partners with Luminar and Zenseact to bring autonomous driving feature to new e-SUV – via TechCrunch
Volvo Cars originally proclaimed it wouldn’t build a Level 3 automated driving system: The safety risks surrounding the handover of driving functions from car to driver were deemed too high. However, the OEM’s goal of going directly for L4 has proven impractical for the time being: With no system deemed safe enough for fully autonomous operation in certain areas, and no regulation that would allow one, Volvo Cars has chosen to go for L3 after all.
Their new system is called “Ride Pilot” and aims for limited self-driving “on highways that Volvo has validated and at lower speeds” – which makes it a direct competitor to Mercedes-Benz’s “Drive Pilot” and Honda’s “Sensing Elite” – especially if and when those systems become available outside their respective home markets.
It seems understandable that Volvo Cars does not want to miss out on a Level 3 go-to-market: For all we know, L3 could be as good as it gets for a while – and when unsupervised self-driving really arrives, it will surely be advantageous for OEMs to have proven themselves on the lower rungs. It will be interesting to see if the Volvo brand can establish itself as a pioneer for safety in this new domain as well.
Elon Musk says he’s hiking the price of “full self driving” by another $2,000 – via Ars Technica
Tesla’s Full Self-Driving (FSD) package now costs $12,000 in the US. The price hike comes amid an all-time high of criticism of the gap between the system’s proclaimed and actual capabilities:
The Dawn Project took out a full-page ad in the New York Times this month, warning of safety risks not only for drivers but for the general public: It considers all of us unwilling test subjects in what is essentially FSD’s (paid) public beta phase. Additionally, the California DMV is revisiting its decision to consider the test program to be outside its regulatory mandate.
Also worth noting is the suspected business case for Tesla – which Brad Templeton (ex-Waymo) described in an article for Forbes last month: “As an automaker, Tesla plans to lease Teslas to customers, then buy them back at the end of 3 years and convert them to robotaxis. That’s easy with the sleek interior of the Model 3 and Y — just pull out the wheel and pedals and put a wooden plate in the hole. Then [Musk] gets to supply his robotaxi fleet with 3 year old cars where customers ate almost half the depreciation cost.”
The Deadly Myth That Human Error Causes Most Car Crashes – via The Atlantic
Let’s get ethical: We all know the argument that human error is responsible for over 90% of car accidents. It is one of the more popular talking points used to advocate for allowing autonomous vehicles on the road: the claim that they would actually be safer than human drivers. This article does a good job of challenging that belief by looking at the systemic conditions within which human drivers operate:
The article focuses on the USA, where more than 20,000 people died in traffic accidents in the first half of 2021 alone. In this rather car-centric society, new drivers often take to the roads at 16 years old. They then get behind the wheel of vehicles frequently weighing over 3 metric tons and packing up to 400 horsepower – vehicles whose hoods easily tower above children and small adults.
Additionally, road design is not taken into account by the statistic: Crashes on accident-prone roads are still chalked up exclusively to human error if the driver fails to compensate for shortcomings in civil engineering.
These issues of course exist to some degree in most modern societies, not only in the US: In their efforts to electrify fleets, many OEMs have focused on large, heavy, high-margin models such as SUVs. So is it really honest to blame human error alone when drivers frequently operate under systemic conditions that make accidents and fatalities more likely? The law says yes, but there may be other ways to look at it …
Alright, this ended up being a lot of text per item, so I’ll compensate by leaving it at those three articles this time.
I hope you appreciate the read – and if you have any thoughts to share, I’d love to hear them! You can leave a comment here or tag me on LinkedIn.
All the best