As you have probably heard, NVIDIA has been trying to acquire chip designer ARM for the last year. Just when it seemed like a done deal, the US Federal Trade Commission (FTC), a competition watchdog, filed a lawsuit this month to stop the merger.
The FTC (and other regulators) worry that one-party ownership might lead to Nvidia “[using] its control over Arm to advance its own interests in emerging markets like data centers and autonomous vehicles, instead of working to ensure that all the companies […] can continue to do so on an equal playing field.”
However this suit will develop, it brings up an interesting point: In this day and age, high-tech has become such an integrated part of our daily lives that we might need to rethink how to regulate access to it (especially in times of strained supply chains). There’s already a movement lobbying to declare the internet a public utility and regulate it like electricity and water – and I suppose it’s indeed hard for most of us to argue that the internet is anything less than integral to our way of life. I wonder if we will see similar ideas come up in the hardware/software domains during this decade …
I found this a rather interesting look at Honda’s two-part strategy to eliminate traffic collision fatalities involving its vehicles by 2050:
The Intelligent Driver-Assistive Technology will focus on individual human drivers’ risk profiles: Taking into account the results of Honda’s “original functional magnetic resonance imaging (fMRI)-based study of the human brain and analysis of risk-taking behaviors”, the goal is to create a kind of safety profile unique to every driver and to predict (and mitigate) driver errors “based on information obtained through a driver monitoring camera and pattern of the driving operations.”
The Safe and Sound Network Technology, on the other hand, focuses on “potential risks in the traffic environment, which are detected based on information obtained from roadside cameras, on-board cameras and smartphones” – and continuously aggregated into a comprehensive digital twin of traffic situations.
The latter sounds a little bit like science fiction at this point, but this is a long-term plan, after all: The idea is to analyze traffic scenarios and simulate how they might develop – faster than they actually play out – with the goal of identifying (and mitigating) risks before they even occur. It reminds me of the Tom Cruise movie Minority Report, but with a less dystopian outcome. 🙂
Congratulations, Mercedes-Benz: The brand became the first to gain approval for a Level 3 self-driving series production function. Daimler will outfit its next-generation S-Class model with the traffic jam assist feature, which can take over control on certain highway routes (13 000 km to start with) when driving at speeds of up to 60 km/h.
Unlike Tesla, which has removed all medium- and long-range sensors other than cameras from its cars with a less-than-ideal outcome (see the new Model Y review in last month’s blog), Daimler is not only keeping radar but also using a LiDAR sensor from French Tier 1 supplier Valeo in its hardware stack.
There’s been a debate for several years now about whether “full” autonomous driving (Level 4 or even 5) can be achieved via an evolutionary path – going from L2 to L3 and then continuously improving and upgrading the tech until it’s L4-ready – or whether you need to find an entirely new, revolutionary path to develop an L4/L5 system. While that debate is far from decided, the pathway to an actual go-to-market that Daimler (as well as Honda) has now paved could go a long way in gaining the trust of the public – as well as that of shareholders and investors – for more developments to follow.
Speaking of Tesla, the NYT came out with a worthwhile investigative piece this month: It casts a very unflattering light on Elon Musk’s promise of “Full Self Driving” and the way he is reportedly conducting himself and his company on the way there.
Quoting anonymous sources, including three people who were part of the project from its origin, the article describes continuous friction between executives pushing “Autopilot” and the lofty promises surrounding it, and engineers more cautious about what the system actually can and cannot do – as well as about the potentially misleading and dangerous nature of its name and marketing.
With the NHTSA now officially probing Tesla for safety risks connected to their ADAS stack, this article is definitely worth reading (or listening to): both as a recap of the developments since Tesla’s FSD announcement, and as a behind-the-scenes look at how Time Magazine’s Person of the Year 2021 runs the company that made him the richest man in the world.
I hope you enjoy the read and wish you all the best for the holidays!
Winter is here! Driving to work through the snow and rain this morning, I once more realized what a luxury it is to drive a personal automobile: I honestly hope this industry can find a sustainable way for us to keep these comforts …
I also hope you are staying warm and cozy at this time – so curl up with a hot beverage of your choosing and let’s dive into this month’s read:
As you may remember, Tesla announced earlier this year that they’d be removing radar sensors from all upcoming models – presumably because they were interfering with the superior environmental perception provided by their cameras. I took a critical look at the camera vs. radar argument in this guest article for AV International in July – and now the first radar-less Tesla is coming out. CNET’s editor in chief, Tim Stevens, experienced multiple false positives during his test drive and comes to a clear conclusion:
“This is a massive problem. It happens on both the highway and on secondary roads, any time the cruise control is engaged even without Autosteer. It means the car’s cruise control is patently unsafe, which means the entirety of Autopilot is unsafe. And that means the car itself is unsafe.”
The semiconductor crisis affects everyone? Yes, but actually no: A new EY study says automotive OEMs seem to have successfully mitigated the shortage by prioritizing their high-end, high-margin models – and posted record profits for the last quarter. Their suppliers, on the other hand, are negatively affected by the lower-volume, more volatile orders from carmakers, with over 40% of Tier1s “now in a financially tense situation” according to PwC.
As someone working in sales along the automotive value chain, I negotiate quite a few contracts with both OEMs and suppliers. Next time I get the argument that “this will never happen, anyway, why do you care if we leave the clause in the contract”, I might just reply with a link to this article …
Speaking of chips and Tier1s: Qualcomm’s play to outbid Magna for Swedish sensor maker Veoneer was completed this month. People in the industry were confident that Qualcomm’s real acquisition target was Arriver: a subsidiary of Veoneer working on cutting-edge ADAS/autonomous driving technology, formed when Zenuity, the supplier’s former joint venture with Volvo Cars, was disbanded.
Now, Qualcomm was able to announce BMW as a new flagship customer for its automotive-grade chips – and Arriver will ride that wave as a supplier of ADAS technology to the OEM. I guess both Mobileye and Nvidia will keep a close eye on the developments to follow …
Let’s close on a note about autonomous trucks: Aurora released a first commercial version of their self-driving stack this month. In a collaboration with truck maker PACCAR and logistics carrier FedEx, the “Aurora Driver” is going to service a route between Dallas and Houston: a highway that’s seeing a lot of interest in this field, with TuSimple, Kodiak Robotics, Embark and other players also having chosen Texas for testing and commercial debuts in the autonomous trucking space.
Aurora uses cameras, radars and a proprietary LiDAR for perception and claims that the Driver is ready for commercial use, mastering “unprotected left and right turns, high-speed merges, and various forms of construction” in its present beta form.
That’s it for this month; I hope you enjoyed the selection – be careful on winter roads and have a lovely advent season!
In-person events are back! Over the past weeks, I’ve had the pleasure of attending a number of face-to-face meetings, conferences and expos in Germany and Sweden. I can tell you: It’s been bliss for my contact-starved, mostly remote-working self.
And the fun’s just getting started – because I can already tell you that the good news is about to carry on into the winter season:
One Brexit referendum and a global pandemic later, it’s finally time to go back: back to “MOVE: Mobility Re-Imagined” in London, that is. I last attended this conference just before COVID hit in 2020 and have fond memories of it: Not only did we receive a lot of interest at the atlatec booth, the expo was also notable for its integrated approach to mobility, bringing together stakeholders from all kinds of organizations. If you’re there, come visit me at our pod S98 on November 9 – 10!
We’ve seen it before: Tesla rolls out a new increment of their “Autopilot” system, users upload videos of it failing to YouTube, and the social media audience gets upset. Elon Musk has become infamous among the ADAS community for beta testing the safety-critical FSD system on the general public. This time, however, the latest release was actually rolled back after Musk himself announced “some issues” via Twitter. A possible change of pace related to changes in NHTSA leadership? You tell me …
Luminar is a major player in the LiDAR space, having won over OEMs such as Volvo Cars for use of their sensors in ADAS/AV systems. So it’s quite interesting to see how a company this well-positioned looks at what actually constitutes their value-add – and what revenue models they envision in this space. From the article: “While [Luminar’s Vice President of Product and Strategy] Jefferson predicts Luminar will sell physical lidar units for the foreseeable future, he also envisions that driver assistance technology may eventually be sold to consumers on a subscription basis, resulting in monthly recurring revenue for Luminar.”
Tesla again! The company is not only shifting paradigms in testing safety-critical software, but is also looking to disrupt the auto insurance sector: Elon Musk has announced a policy model with premiums based on real-time driver behavior analytics. This carries the potential for another ethics-based debate: Is it acceptable to penalize individual drivers for mistakes we can all be expected to make now and again? And: Could this make drivers hesitate a split second too long about a life-saving full brake, for fear of negatively affecting their safety score – and thus raising their insurance costs? This month’s news should be enough for a debate or two in my opinion …
Welcome back to the automotive industry news for August 2021! I hope you’ve been having lovely weather where you’re spending your summer – but if not (like in Germany), then perhaps I might interest you in a bit of light reading and a hot beverage instead. We’ll supply the reading of course – let’s get to it:
We’re super excited to return to real-world events, starting as exhibitors at the re-booted IAA Mobility in Munich next week! To make things even better, we are offering a live, in-vehicle demo:
Together with our partners Artisense and NNG we will demonstrate how HD maps boost ADAS functions such as lane-level navigation/guidance, AR HUD style. Places will be limited, but you are more than welcome to let us know you’d like to reserve a seat – or drop by the atlatec booth A418 in hall B2 to inquire about ad-hoc availability.
Or, if you can’t make it to Munich, we are also running a joint webinar on the topic tomorrow, September 1 at 11 AM.
Sadly, a Tesla using its “Full Self Driving (FSD)” or “Autopilot” system struck an emergency vehicle this month – the 12th time such a collision has been reported. The incident occurred right after the US NHTSA announced a probe into Tesla and its ADAS functions. As mentioned in this newsletter last month, Tesla is the only OEM not using HD maps for its hands-free driving system – which, like LiDAR, Elon Musk has called “a crutch.” The article, wondering what the NHTSA probe might lead to, also takes a closer look at the background for context.
I thought I’d include this review of Ford’s new hands-free driving system, comparing it to GM’s “Super Cruise”, because the two companies’ approach to Level 2+/Level 3 ADAS is so different from Tesla’s – as the article says:
“Both […] use high-definition maps of divided highways across the US and Canada to limit where their respective systems can be used. GM’s maps now include more than 200,000 miles of roads while Ford is initially limited to 130,000 miles although expansions are promised from both. If you aren’t on one of the approved roads, the systems simply won’t engage. While Tesla tells drivers that Autopilot is only meant for use in divided highways, it does nothing to prevent enabling the system anywhere.”
In the electric vehicles space, Rivian seems to be eyeing an IPO – after already having raised USD 10.5 billion so far: Building a car company is a costly business!
The desired valuation of USD 80 billion seems like a moonshot at first, considering that Ford, as one of their investors, has a USD 53 billion market cap – and Rivian has yet to begin regular vehicle deliveries. But then again, people were saying similar things about Tesla – and now they’re worth more than the next 9 car makers combined … Apparently the IPO is supposed to happen around Thanksgiving, so this will be one to watch over the next weeks!
Just a few months after NVIDIA announced their acquisition of DeepMap, Toyota’s Woven Planet has confirmed they’ll be buying mapping company Carmera (bonus article here). And, just in time, Mitsubishi Fuso has announced they’ll be integrating Woven Planet’s HD maps into their ADAS stack.
To me, this is two interesting pieces of news in one: First, the consensus that HD maps will be required for next-level ADAS features (Level 2+ and above) is once again confirmed across vehicle categories, leaving Tesla as basically the only OEM to double down on not using them (see also the article on Autonomous Vehicle International above). Second, the field of HD mapping is facing consolidation – and investor-backed players might have to opt for a profitable sell-off in the short term, rather than waiting for the elusive moment in time when the AV business model will fully take off.
Waymo is arguably one of the front runners in autonomous driving, and they have always been vocal about their view of simulation as a cornerstone of it: According to the company, Waymo has simulated 15 billion miles of driving, compared to “only” 20 million miles of real-world driving that have been completed.
In addition to their older simulator Carcraft, Waymo has now presented their new software, called “Simulation City” – which they hope will bridge some gaps, such as simulating new vehicle models. The article contains a video and a GIF giving some visual impressions, too, so feel free to steal a glance!
This month was rife with mapping-related news! Tier1 supplier BOSCH is partnering up with Volkswagen to create a “crowdsourcing” solution, aiming to produce/update HD maps by leveraging the data of sensors on board series production vehicles – in this case the VW Golf 8. The article quotes BOSCH’s Dr. Mathias Pillin as saying “The more vehicles that provide information now and in the future, the larger and more robust the database will be for automated and assisted driving”, which seems to be in line with what other suppliers are banking on, most famously perhaps Mobileye.
It looks like crowdsourcing will truly be the holy grail of HD mapping in this decade – and it will be very interesting to see the differences in how automotive players approach this (traditional Tier1s vs. tech companies such as NVIDIA) as well as which strategies and platforms we’re going to see.
Welcome back to the monthly industry newsletter from atlatec. I hope you had a great start into the summer. In this issue, we are covering Tesla’s latest decision to completely remove radar sensors, DeepMap’s future with NVIDIA, and Volkswagen’s idea of pay-per-use self-driving cars. Enjoy the read and make sure to watch our latest fireside chat on YouTube.
Since atlatec is a computer vision company, you can expect us to be strong supporters of what this type of sensor technology can do. With Elon Musk’s most recent headline news, however, many industry experts believe he is putting too much trust in cameras – by completely removing radar sensors from future Tesla models.
Tesla’s head of AI, Andrej Karpathy, has shed some more light on the reasons behind this move, explaining that in Tesla’s view, their camera system has become so superior to radar that adding “mixed signals” only makes for errors and added sensor fusion effort, rather than higher safety and better performance through redundancy.
If the past is any indication for the future, Elon won’t care much about what other EV/AV companies think about this move – but it remains to be seen if his shareholders will, depending on what results Tesla will be able to deliver regarding its self-driving functionalities.
Last year, we were actually speculating about whether or not this would happen: NVIDIA, a key investor in DeepMap, has announced they’ll be acquiring the HD mapping company completely.
According to official quotes, the reason is to leverage DeepMap’s IP in order to boost NVIDIA’s mapping – and thus self-driving – capabilities, presumably enabling them to widen their footprint as a supplier of ADAS/AV technology among automotive OEMs. The one comparison that comes to mind as a direct competitor is Mobileye: It will be interesting to see if NVIDIA will try to establish a similar offering with an end-to-end solution encompassing sensors, processors (remember their pending acquisition of ARM, announced in 2020) and software.
Who will use autonomous vehicles, and who will pay for them? Underlying business cases are one of the pillars that will have to carry self-driving technology if it is ever to become mainstream. Volkswagen has shared some ideas: A pay-per-use model, where the public can rent AVs by the hour.
Klaus Zellmer, VW board member for sales, marketing and after-sales, even shared a possible price: 7 Euros per hour might buy you access to a self-driving Volkswagen model.
On the one hand, easy access to autonomous mobility when you need it seems neat – on the other hand, 7 Euros per hour is a hefty price compared to the public transportation many city dwellers around the globe are used to. For more rural areas, however, this might solve a real mobility problem; especially if co-financed by municipalities as an addition to public transportation.
When we opened up registrations for our first-ever webinar, our team made some bets on how many people would sign up. Most of us guessed at around 20 to 30 registrations – and we’re very glad we didn’t put up any money: We ended up with over 500 registered participants, the majority of which joined us live to watch the discussion between atlatec CEO Henning Lategahn and our partners at GeneSys, MdynamiX and the Kempten University of Applied Sciences.
Apparently the topic, scaling precise ADAS testing on public roads and putting measured performance into meaningful context, was quite relevant for many: We are still getting daily requests for the video recording as well as our panelists’ presentations. If you, too, would like to watch it, you can access all the material right here on our website.
As we slowly enter the summer season, let’s look back and recap the latest automotive news of May. This month, SAE updated the official names for ‘Autonomous Driving’ levels, Germany passed legislation allowing autonomous vehicles to drive without a safety driver present, and Ford revealed its first electric truck – the F-150 Lightning.
Apart from that, we kindly invite you to join our live panel discussion on ADAS testing that will take place in just a week. You can register for both German and English sessions – pick the one that fits you best.
There’s a lot of debate around what “autonomous driving” really is, and some pretty diametrically opposed viewpoints – sometimes within one and the same company (looking at you, Elon Musk and Tesla’s legal department). One framework that’s proven useful in differentiating between what does and does not constitute self-driving technology is the SAE Levels of Driving Automation (L0 – L5).
This standard, formally known as SAE J3016, has now been updated to more accurately separate driver support features (L0 – L2) from automated driving features (L3 – L5). It also clearly classifies the simultaneous use of modern ADAS features like ACC and LKA as a Level 2 system – and thus firmly places it in the driver support domain. So get the latest “cheat sheet” and you’ll be well prepared for the next heated ADAS vs. AD debate – which we probably all get into at some point.
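As a toy illustration of that classification logic (the function and feature names are my own invention, not from SAE J3016, and real levels also depend on factors like operational design domain and fallback responsibility):

```python
# Hypothetical sketch of the driver support boundary described above;
# the names and simplified logic are illustrative, not from SAE J3016.

def driver_support_level(features: set, driver_supervises: bool) -> int:
    """Rough SAE level for sustained driver support features."""
    lateral = "LKA" in features        # sustained lateral control
    longitudinal = "ACC" in features   # sustained longitudinal control
    if not (lateral or longitudinal):
        return 0                       # L0: warnings/momentary help only
    if lateral and longitudinal:
        # Simultaneous ACC + LKA is classified as Level 2 while the
        # driver must supervise at all times; without supervision it
        # would fall into the automated driving domain (L3+).
        return 2 if driver_supervises else 3
    return 1                           # one axis of sustained support

print(driver_support_level({"ACC", "LKA"}, driver_supervises=True))  # -> 2
```

The key takeaway mirrors the updated standard: combining two L1-style features does not produce automated driving, just a Level 2 support system.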
Speaking of automated driving: In a move sure to surprise many, Germany’s national parliament has voted to allow testing of Level 4 systems on public roads from 2022 – without a safety driver on board. Some restrictions such as proper insurance and remote shutdown options apply, but those hardly seem like roadblocks for companies serious about this type of technology.
With several OEMs in the country as well as players like Argo AI and Mobileye already testing their cutting-edge systems on public roads in Germany, it will certainly be interesting to see what to actually expect on and off the Autobahn next year – and how the public will react.
When is a car not a car? When it is a truck – or perhaps even something else entirely. Ford has revealed the battery electric version of its best-selling truck, dubbed the F-150 Lightning – and while thorough, independent reviews will need to be considered, it seems that competitors such as Rivian or Tesla’s Cybertruck will be facing formidable competition:
The product management at Ford seems to have applied impressive user-centricity, completely rethinking what a truck actually is – or can be. The F-150 Lightning is not only powered by a battery: It can also power external appliances, from work crews’ tools all the way to complete households if necessary, in a blackout – something that will arguably be a selling point for citizens of US states regularly threatened by flooding, tornadoes or wildfires. For companies that run work crews (perhaps the most important customer segment for this vehicle), the BEV version of the F-150 may well turn the question of “Why go electric?” into “Why not?”: A vehicle that makes both the job and fleet management easier (thanks to improved telematics) and that can easily be recharged back at the company lot every night seems compelling – even more so if it also brings down maintenance costs, which are typically higher for combustion engine vehicles.
You may remember our article about ADAS testing from last month’s newsletter, exploring a solution to create and leverage reference data at scale that we co-created for Porsche. Now we bring the contents to you live, as a webinar together with our partners:
Join atlatec CEO Henning Lategahn as well as representatives from GeneSys, MdynamiX and the Kempten University of Applied Sciences on June 8th or 10th:
We are offering sessions in both English and German. So far, the response has been amazing, which is why we’ve upgraded our webinar hosting package to allow for additional registrations. So if you haven’t already, you are warmly invited to sign up – we hope to see you next week!
I hope this overview helps you stay on top of industry news. Make sure to watch the latest fireside chat with the atlatec team – the video is already available on YouTube.
Stay tuned for the atlatec industry newsletter coming at the end of June!
As usual at the end of each month, we’ve prepared a brief automotive news overview to help you to keep track of the hottest headlines.
This time, we’re especially happy to include some news of our own! Additionally, we found interesting developments at Volvo Trucks and their partnership with Aurora, Polestar’s long-term commitment to the first climate-neutral EV, and NVIDIA’s DRIVE Orin AI-computing platform, which Volvo Cars has opted to use for their AVs.
I hope you enjoy the read – and make sure to watch our latest fireside chat, which is already available on YouTube.
While actual autonomous vehicles may still be a few years out, the L1/L2 ADAS domain is already going stronger than ever. That’s why we’re happy to publish some news of our own this week: A detailed look at a solution for ADAS validation that brings capabilities and fidelity previously limited to proving grounds to public road testing.
Take a look at this solution for creating and leveraging reference data at scale as it was piloted by Porsche and built together by atlatec and our partners GeneSys, MdynamiX and the Kempten University of Applied Sciences. We’re quite excited to share this and hope you will take an interest, too: If you have any thoughts, we’d love to hear them!
Autonomous trucks are often regarded as perhaps the first instance of actual production AVs we’re likely to encounter on the open road. Reasons include the focus on (relatively non-complex) highway routes, saving human drivers the grind of long-distance trips, as well as the clear business case to be made in the logistics domain.
The latter, of course, relies on actual commercialization – towards which Volvo Trucks may have just taken another step, announcing a partnership with AV stack provider Aurora. The mutual goal: To bring autonomous hub-to-hub truck operations to North America – and thus bring everyone a step closer to encountering actual AVs on public roads.
Basically every car maker and their suppliers are currently asking themselves, “How can we reduce carbon emissions a bit more – and perhaps offset the rest?” This is, of course, a relevant effort; and it continues to produce reductions of CO2, NOx and other pollutants by a few percent every year (or at least every time a new emissions standard is announced).
However, instead of asking for 10% less, Geely-owned Polestar has chosen to question everything about themselves, aiming for 100% elimination of emissions – covering not only the operational lifecycle of their new “Polestar 0” model, but also the entire supply chain and production, moving away from toxic materials for chassis and batteries.
That asking bigger questions leads to bigger answers is something tech companies like Google have known for a long time (see “10X thinking”) – it will be exciting to see its effects on automotive and manufacturing, and whether others will follow suit!
More news from Sweden, and thus from Geely, who are also the proud owners of Volvo Cars: As was announced during NVIDIA’s GTC this month, the car maker has chosen their “DRIVE Orin” system to enable its passenger cars to drive themselves.
As Volvo Cars has previously announced, they’re skipping Level 3 entirely, instead aiming for L4 operations on highways as their debut on the autonomous vehicles stage. The first vehicle to come with the new NVIDIA SoC is the next-gen XC90; in which it will work hand in hand with ADAS software developed by Zenseact and LiDAR sensors supplied by Luminar.
Stay tuned for the atlatec industry newsletter coming at the end of May! In the meantime, feel free to reach out to us if you have any questions.
Get automotive industry news directly to your mailbox – sign up for the atlatec newsletter.
Advanced Driver Assistance Systems (ADAS) have been around for quite a while: Modern-day vehicles inevitably come with assistance features such as Adaptive Cruise Control (ACC), Lane-Keep Assist (LKA), Blind Spot Monitors, Traffic Sign Recognition and others supposed to make driving safer and more comfortable. In contrast to Autonomous Vehicles, ADAS is a huge and profitable market today, and will likely remain so for the foreseeable future.
If you’ve driven a car with such features in the last few years, though, you might have realized that the performance of these systems can vary – sometimes by quite a lot: While some LKA systems do a good job of keeping your vehicle on track, others tend to react too late, or overcorrect and send you right across the opposite lane border rather than properly centering the car between them. Similarly, some ACCs make for a smooth ride, while others may apply the brakes when another vehicle cuts you off after overtaking – rather than just letting their higher speed widen the gap between you, as most human drivers would do.
Join our live panel discussion “How to scale ADAS testing with objective KPIs”. Register here.
Why ADAS Performance Varies So Much
One of the reasons for the varying performance of ADAS features is that reliable, objective testing procedures for public roads have been rare.
During real-world testing, car makers and their suppliers routinely record all onboard data from sensors, actuators and more – allowing for in-depth analysis of system failures, near-misses and similar incidents after a drive. However: These onboard systems only record their own version of events – figuratively speaking, what the car thinks happened during the drive. If you want to iron out false positives/negatives, you need to compare this questionable version of events to a more trusted data set – reference data, or “ground truth” data.
Using Lane-Keep Assist as an example: If a test vehicle missed a lane border, overcorrected, or failed completely, you need to closely examine the actual environment/lane borders in that exact position – as well as the vehicle’s relative position and pose in that specific moment.
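As a minimal sketch of that comparison (the function name and coordinates are hypothetical; real pipelines work with full 3D poses and map geometry): the signed lateral offset between the recorded vehicle position and a ground-truth lane centerline segment quantifies how far the LKA let the car drift at a given moment.

```python
import math

def lateral_offset(vehicle_xy, seg_start, seg_end):
    """Signed perpendicular distance (metres) from the vehicle to a
    ground-truth centerline segment; positive = left of travel direction."""
    (px, py), (ax, ay), (bx, by) = vehicle_xy, seg_start, seg_end
    dx, dy = bx - ax, by - ay
    # The 2D cross product of the segment direction and the vector to
    # the vehicle, divided by segment length, gives the signed distance.
    return (dx * (py - ay) - dy * (px - ax)) / math.hypot(dx, dy)

# Vehicle 0.3 m left of a straight west-east centerline segment:
print(lateral_offset((5.0, 0.3), (0.0, 0.0), (10.0, 0.0)))  # -> 0.3
```

Comparing this offset (from ground truth) against what the onboard perception logged at the same timestamp is exactly how false positives and negatives can be separated from genuine system failures.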
In the past, this has been a massive undertaking, requiring lots of on-site engineering manpower and high-precision measurements that were only possible in closed-off, controlled environments – in other words, on proving grounds.
And while proving grounds are an amazing asset for automotive testing, the total scope and variability of their test routes are by definition limited – which makes it a challenge to optimize a system for use on hundreds of thousands of kilometers of open road. Similarly, standardized tests as defined by EuroNCAP or ISO don’t come close to capturing the variety of roads and scenarios a vehicle is sure to encounter during its lifecycle.
To use another metaphor: Imagine driver training only taking place on a single safety course, with drivers unable to learn better performance after passing the test and entering public roads.
To better optimize for the real world, more real-world testing is required – and it needs to happen without sacrificing the precision and fidelity of known approaches: It is not feasible to rely on subjective feedback from test drivers that “it felt maybe a little bit strange somewhere back there.”
This leads us to a second reason why ADAS features differ so much from brand to brand: The lack of an objective standard for how they should perform – and what the criteria for optimal driving pleasure might be. The following portion of this article describes a field-tested approach to solving both of these problems.
From Subjective Feelings To Objective KPIs and Measurements
In a collaboration with the performance car maker Porsche, the companies GeneSys, MdynamiX and atlatec as well as research partner Adrive Living Lab of the Kempten University of Applied Sciences have created a solution to bridge this gap. Together, the partners have introduced a testing process that allows for objective ADAS validation at scale and on open roads, as featured in ATZ magazine (Automobiltechnische Zeitschrift).
The approach consists of 4 steps – in the aforementioned project, it was applied to validate LKA performance:
The definition of objective criteria for ADAS performance
The creation of ground truth environment/HD map data
The accurate recording of vehicle position/pose reference data during test drives
In-depth analysis of relevant driving situations and their recreation in the virtual space/simulation
Defining Objective Criteria For Driving Pleasure
The first challenge is already one of the hardest: How do you quantify an emotional quality criterion such as driving pleasure, or the feeling of safety? To solve this challenge, the Kempten University of Applied Sciences has developed a three-layer model: subjective customer assessment, subjective expert assessment and, finally, a layer of objective vehicle signals.
A series of test drives, workshops, benchmarking campaigns and more produce insights into the subjective preference of customers for how a feature (e. g. an LKA) should perform. These insights are then refined into categories and sub-categories by automotive experts. Finally, the results are matched with related vehicle-level signals and the expected intensity to be measured for each of them, on a scale from none (0) to high (9).
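In code, the three-layer mapping could be represented as a simple lookup from expert categories to vehicle signals with target intensities on the 0–9 scale. This is only a rough sketch: all category and signal names below are invented examples, not the actual model developed by the project partners.

```python
# Layers 1/2: subjective customer feedback, refined into expert categories.
# Each category is linked to objective vehicle-level signals (layer 3).
# All names here are illustrative assumptions.
categories = {
    "lane_centering_smoothness": ["steering_wheel_angle_rate", "lateral_jerk"],
    "feeling_of_safety": ["lateral_offset_from_center", "time_to_line_crossing"],
}

# Layer 3: expected intensity per vehicle signal, from none (0) to high (9).
expected_intensity = {
    "steering_wheel_angle_rate": 2,   # smooth LKA: low steering activity expected
    "lateral_jerk": 1,
    "lateral_offset_from_center": 3,
    "time_to_line_crossing": 8,       # large margin to the lane line feels safe
}

def signals_for(category: str) -> dict:
    """Return the objective signals and target intensities for one category."""
    return {s: expected_intensity[s] for s in categories[category]}

print(signals_for("feeling_of_safety"))
```

With such a table in place, a recorded signal can be scored against its target intensity instead of against a test driver’s gut feeling.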
Generating Ground Truth Data At Scale
Automotive OEMs and suppliers of ADAS technology already do test drives on a defined set of routes: These routes are chosen for factors like their variety, internationality, likelihood of certain events and more – and can cover hundreds or thousands of kilometers of public roads across multiple continents.
These routes are a great resource for objective ADAS testing on public roads – if you have access to high-accuracy measurements of their features and a way to generate reference data for the trajectories your test vehicles drive over them.
To this effect, the High Definition (HD) mapping capabilities of atlatec are leveraged to create 3-dimensional maps of test routes – in this case, on public roads around Stuttgart and Kempten in Southern Germany.
The finished maps are exported into a multi-layered data format allowing for localization and matching of vehicle poses in real time. For the described collaboration, a variation of OpenDRIVE was used.
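To illustrate what matching a vehicle pose against such a map might involve, here is a minimal sketch that computes a vehicle’s signed lateral offset from a mapped lane centerline. The polyline and the sample position are made-up data; a real OpenDRIVE-based map would provide per-lane geometry and a proper reference line.

```python
import math

# Lane centerline as a polyline of (x, y) points in metres (invented sample data).
centerline = [(0.0, 0.0), (10.0, 0.1), (20.0, 0.4), (30.0, 0.9)]

def lateral_offset(px: float, py: float) -> float:
    """Signed lateral distance from a point to the nearest centerline segment.

    Positive means the point lies to the left of the driving direction.
    """
    best = None
    for (x1, y1), (x2, y2) in zip(centerline, centerline[1:]):
        dx, dy = x2 - x1, y2 - y1
        # Project the point onto the segment, clamped to the segment ends.
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
        cx, cy = x1 + t * dx, y1 + t * dy
        dist = math.hypot(px - cx, py - cy)
        # Cross product decides which side of the segment the point is on.
        sign = 1.0 if (dx * (py - y1) - dy * (px - x1)) > 0 else -1.0
        if best is None or dist < abs(best):
            best = sign * dist
    return best

print(lateral_offset(15.0, 0.8))  # positive: vehicle is left of the lane center
```

The same projection logic, run against every map layer, is what lets recorded trajectories be evaluated against lane markings with centimetre-level consistency.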
Test Drive Recording At High Fidelity
Accurately recording and recreating all trajectories driven during testing is made possible by an Automotive Dynamic Motion Analyzer (ADMA) unit by GeneSys: a high-precision motion sensor which allows for differential GNSS correction and is designed specifically for vehicle dynamics testing.
Based on this technology, new test methods had to be developed for the objective evaluation of driving characteristics in the ADAS/AD context. Driver input, road and traffic input, control interventions and the resulting vehicle reaction/movement are evaluated in all six degrees of freedom. For automated lateral control, detailed knowledge of the road excitation (essentially road markings and surface geometry) and of the driver input is necessary to properly attribute the resulting vehicle reaction. For assisted longitudinal guidance, detailed knowledge of the surrounding traffic is required.
Like all sensors, environmental sensors such as cameras, radar or lidar are error-prone and not available or sufficiently accurate in every situation. This can have a significant impact on driving characteristics: For example, a camera may not reproduce the curvature of a road accurately, which can cause difficulties for the lane-keeping controller. This repeatedly raises the question of whether the experienced driving characteristics result from poor performance of sensors, trajectories, controllers and actuators – or from a poor response of the vehicle itself, influenced by steering, axles, tires and chassis control systems.
To investigate this cause-and-effect chain, a much more accurate reference measurement method is used as the ground truth.
In addition, an optimized measuring steering wheel allows for precise recording of steering speed/angle and torque/gradient. In the collaboration with Porsche, an original steering wheel was used to fully preserve the brand- and model-specific haptics, control functions and other details, ensuring realistic driver/vehicle interaction.
For the test drives, a comprehensive catalog of defined maneuvers and situations is created by MdynamiX and the University of Applied Science Kempten, ensuring that all relevant scenarios are encountered and recorded.
Turning Data Into Insights And Reproducible Scenarios For Simulation
The use of suitable algorithms makes for precise calculations and the automatic generation of KPI values from the recorded data. For example, the yaw rate and lateral acceleration recorded by the reference system – based on the ground truth curvature – can be matched with the measurements from the onboard system, allowing for accurate measurement of the production system’s deviation from the actual/reference values.
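The yaw-rate comparison described above can be sketched in a few lines: for a vehicle following the road perfectly, the kinematic reference yaw rate is speed times ground-truth curvature, and the deviation of the measured signal from that reference yields a KPI. All sample values below are invented for illustration.

```python
# Each sample: (speed in m/s, ground-truth curvature in 1/m from the HD map,
# yaw rate in rad/s recorded by the reference sensor). Invented values.
samples = [
    (20.0, 0.002, 0.041),
    (20.0, 0.004, 0.079),
    (19.5, 0.004, 0.080),
]

def yaw_rate_deviation(speed: float, curvature: float, measured: float) -> float:
    """Deviation from the kinematic reference yaw rate = v * kappa."""
    return measured - speed * curvature

deviations = [yaw_rate_deviation(v, k, yr) for v, k, yr in samples]
rms = (sum(d * d for d in deviations) / len(deviations)) ** 0.5
print(f"RMS yaw-rate deviation: {rms:.4f} rad/s")
```

The same pattern – reference value from the map, measured value from the sensor, deviation aggregated into a KPI – applies equally to lateral acceleration or lateral offset.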
To gain further insights from the data, the digitalized test routes can be imported into automotive simulation tools. This allows for additional MIL/SIL/HIL tests as well as immersive Driver-in-the-Loop simulations.
Additionally, select scenarios encountered and recorded on real-world test drives can be reproduced – allowing for variation of parameters to further narrow down performance limitations.
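Such parameter variation could be sketched as follows: starting from one recorded scenario, selected parameters are swept around their recorded values to generate simulation variants. The parameter names and values are illustrative assumptions, not taken from the actual project.

```python
import itertools

# One recorded scenario, reduced to a few parameters (invented example).
recorded = {"entry_speed_kmh": 90.0, "curve_radius_m": 250.0, "marking_quality": 1.0}

def variations(base: dict, factors: dict):
    """Yield scenario dicts with every combination of scaling factors applied."""
    keys = list(factors)
    for combo in itertools.product(*factors.values()):
        scenario = dict(base)  # copy, so the recorded baseline stays untouched
        for key, factor in zip(keys, combo):
            scenario[key] = base[key] * factor
        yield scenario

# Sweep entry speed +/-10% and tighten the curve by up to 20%.
sweep = {"entry_speed_kmh": (0.9, 1.0, 1.1), "curve_radius_m": (0.8, 1.0)}
scenarios = list(variations(recorded, sweep))
print(len(scenarios))  # 3 speed factors x 2 radius factors = 6 variants
```

Running each variant through the simulation then shows, for example, at which combination of speed and curve radius the controller first fails to hold the lane.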
If you’d like to explore this topic further and in more scientific detail, we recommend the following resources:
M. Höfer, F. Fuhr, B. Schick and P. E. Pfeffer, “Attribute-based development of driver assistance systems” in 10th International Munich Chassis Symposium 2019, P. E. Pfeffer, Ed., Wiesbaden: Springer Fachmedien Wiesbaden, 2020, pp. 293–306.
J. Nesensohn, S. Levéfre, D. Allgeier, B. Schick and F. Fuhr, “An Efficient Evaluation Method for Longitudinal Driver Assistance Systems within a Consistent KPI based Development Process”.
S. Keidler, D. Schneider, J. Haselberger, K. Mayannavar and B. Schick, “Entwicklung fahrstreifengenauer Ground Truth Karten für die objektive Eigenschaftsbewertung von automatisierten Fahrfunktionen” (Development of lane-accurate ground truth maps for the objective assessment of automated driving functions) in 17. VDI-Fachtagung, Hannover, 2019.
B. Schick, C. Seidler, S. Aydogdu and Y.-J. Kuo, “Driving experience vs. mental stress with automated lateral guidance from the customer’s point of view” in Proceedings, 9th International Munich Chassis Symposium 2018, P. Pfeffer, Ed., Wiesbaden: Springer Fachmedien Wiesbaden, 2019, pp. 27–44, doi: 10.1007/978-3-658-22050-1_5.
S. Aydogdu, B. Schick and M. Wolf, “Claim and Reality? Lane Keeping Assistant – The Conflict Between Expectation and Customer Experience” in 27. Aachener Kolloquium, Aachen, 2018.
D. Schneider, B. Schick, B. Huber and H. Lategahn, “Measuring Method for Function and Quality of Automated Lateral Control Based on High-precision Digital ‘Ground Truth’ Maps” in 34. VDI/VW-Gemeinschaftstagung Fahrerassistenzsysteme und Automatisiertes Fahren 2018, Wolfsburg, November 2018.
B. Schick, F. Fuhr, M. Höfer and P. E. Pfeffer, “Eigenschaftsbasierte Entwicklung von Fahrerassistenzsystemen” (Attribute-based development of driver assistance systems), ATZ Automobiltech Z, vol. 121, no. 4, pp. 70–75, 2019, doi: 10.1007/s35148-019-0006-2.
To learn more from the parties involved, feel free to reach out directly:
Title image: „Eigenschaftsbasierte Entwicklung von Fahrerassistenzsystemen“, ATZ Automobiltech Z
As we approach the end of March, let’s look back at the headlines that made noise this month. In this issue: Tesla, Honda, and Volvo. This month we are especially excited about the release of atlatec’s brand-new website. We would like to thank everyone who participated in this challenging project and contributed to the result that we are ready to present. Feel free to check out atlatec.de and let us know what you think.
Tesla is one of those companies that tends to polarize people: You’re either a real fan or a pronounced sceptic, with little middle ground between “Teslaratis” and outspoken critics.
One large reason for that is Tesla’s “Full Self Driving” (FSD) feature – on which, apparently, Tesla is pretty divided itself: While Elon Musk has repeatedly praised the system as an actual self-driving feature on Twitter, his lawyers argue the polar opposite in front of the DMV: A new trove of emails, revealed after a public records request, shows that Tesla’s lawyers adamantly claim FSD to be nothing but an L2 driver assist feature – with no perspective or even a plan to turn it into anything resembling autonomous driving, under any conditions.
The article contains a link to the emails if you want to dive in yourself. An additional takeaway that was very interesting to us: Tesla lawyer Eric Williams references the Model 3 handbook, clarifying that FSD will indeed have trouble in areas for which proper map data is not available and may very well be unable to recognize stop signs and traffic lights due to inaccurate maps. Once again, quite the contrast to the messages of Musk himself, who has called reliance on (HD) maps “a really bad idea” before.
So there it is: the first Level 3 system on the market that will actually let you take your hands off the wheel while the car takes over responsibility for driving.
Honda debuted its first L3 feature this month: the “Traffic Jam Pilot”, which can drive autonomously in bumper-to-bumper highway traffic while the “driver” is free to enjoy the infotainment system or otherwise occupy themselves – provided they remain able to take back control whenever the system notifies them that this is required.
Honda reports they’ve driven 1.3 million kilometers for testing and have simulated around 10 million scenarios in preparation. Still, the company wants to make sure they’re not moving too fast: The feature will only be available to 100 leasing customers to start with, and it is limited to speeds of up to 50 km/h rather than the 60 km/h the regulation allows for.
Volvo Cars is one company that has been behind some massive innovations in automotive over the decades: The three-point safety belt, SIPS/side airbags and limiting all new vehicles to a 180 km/h top speed, to name a few. The first and the last were pretty controversial in their time (the latter as recently as 2020), but Volvo did what they thought was right anyway.
The next chapter in that legacy may be ahead: Volvo Cars has announced they see “no long-term future for cars with an internal combustion engine” and will sell nothing but electric vehicles by 2030. By 2025, half of the fleet shall be fully electric already, with hybrids making up the other 50%.
In addition to this massive overhaul, they also want to modernize the customer experience in order to make car sales more digital and mainly online-driven, only offering in-person assistance where customers really want it (e. g. around test drives and delivery).
This month, we have some news of our own, and we’re pretty excited about it: After loads of discussions, drafting, designing and reworking, we are happy to announce the launch of our all-new atlatec.de website.
So, why the do-over? First of all, we wanted to reflect the degree of maturity that we’ve achieved over time: Working for international automotive OEMs and Tier1 suppliers as well as other leading companies in the mobility sector, we thought it was high time to get rid of what our CEO lovingly called “Mickey Mouse animations” and replace similar young-blood gimmicks with actual footage of our work.
Secondly, we wanted to present said work in a more customer-oriented manner: Rather than focusing on what we find interesting ourselves, the new website breaks down our solutions by customer use cases, such as HD maps/scenarios for simulation or maps for AV/ADAS production systems. For those and more, atlatec.de now offers dedicated pages focusing on specific, related parts of our portfolio: All the relevant info is curated in one place, the rest left to explore elsewhere, for those who want to do so.
If you decide to take a look at the new website, we’d love to hear your thoughts on it: Let us know by simply replying to this email or shoot us a message on LinkedIn!
I hope this overview helps you to stay on top of the industry news. Make sure to watch the latest fireside chat with the atlatec team on YouTube.
Stay tuned for the atlatec industry newsletter coming end of April!