The case for automotive GNSS cyber security measures
Since the backbone of automotive position, navigation, time and velocity depends on real and secure GNSS data, spoofing is considered a serious threat to the safety of ADAS-equipped and driverless cars.
As the use of ADAS and Autonomous Driving technology grows in automotive, so does the dependency on navigation and safety technologies that accommodate it. One of the core technologies used for navigation is GNSS – Global Navigation Satellite System.
GNSS plays a critical role in next-generation positioning systems as the only source of absolute position, navigation, and time.
GNSS is a general term for the satellite technologies that provide positioning, navigation and timing (PNT), or position, velocity and time (PVT). These satellite networks include GPS, BeiDou, Galileo and GLONASS, among others, and all of them are commonly used across multiple commercial applications.
Automotive and road applications are the largest users of GNSS technology, accounting for about 50% of cumulative GNSS revenue, according to the European GNSS Agency.
GNSS – Safety-critical automotive positioning, and the spoofing problem
Spoofing is the act of replicating GNSS signals. A spoofer can fool a receiver into believing it is elsewhere in either time or location. By generating and transmitting falsified GNSS signals at a slightly stronger level than the authentic ones, an attacker causes the targeted GNSS receiver to accept the fabricated signals. Furthermore, the arrival of cheap Software Defined Radios (SDRs), costing as little as $300, combined with the availability of open-source code, has made spoofing accessible to almost anyone.
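Why the slightly stronger counterfeit wins can be illustrated with a toy model of signal capture, assuming a simplified receiver that simply locks onto the strongest signal claiming a given satellite ID. All names and power figures below are illustrative, not taken from any real receiver:

```python
# Toy model of GNSS signal capture: a tracking loop tends to lock onto the
# strongest correlation peak for a given satellite PRN. Illustrative only.
from dataclasses import dataclass

@dataclass
class Signal:
    prn: int          # satellite ID the signal claims to be
    power_dbm: float  # received power at the antenna
    authentic: bool   # ground truth, unknown to the receiver

def acquire(signals, prn):
    """Return the strongest received signal matching the requested PRN."""
    candidates = [s for s in signals if s.prn == prn]
    return max(candidates, key=lambda s: s.power_dbm)

# Authentic GPS L1 signals arrive at roughly -130 dBm; a spoofer only
# needs to be a few dB stronger to win the correlation peak.
on_air = [
    Signal(prn=7, power_dbm=-130.0, authentic=True),   # real satellite
    Signal(prn=7, power_dbm=-127.0, authentic=False),  # spoofer, +3 dB
]
locked = acquire(on_air, prn=7)
print(locked.authentic)  # -> False: the receiver locks onto the counterfeit
```

Real receivers correlate against spreading codes rather than picking a single "signal object", but the capture-by-power behavior this sketch models is exactly what makes low-power spoofing effective.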
Since the backbone of automotive position, navigation, time and velocity depends on real and secure GNSS data, spoofing is a serious threat to the safe use of ADAS and driverless cars. In automotive, positioning and time information is increasingly used for safety-critical features. GNSS is now the main input for safety-critical automotive positioning and is used in systems such as speed control, so GNSS resiliency is becoming an aspect of functional safety in cars. GNSS is an important part of both the ADAS systems available today and Level 2-5 autonomous vehicles. GNSS also determines the exact timing of actions and communications (V2V, V2X and other functions), and is therefore critical beyond the single car: for the network, fleet and infrastructure supporting the vehicles.
In our ongoing independent tests of several current cars from leading manufacturers with advanced ADAS capabilities, disturbing vulnerabilities in the different GNSS systems have been found. Using a simple off-the-shelf software defined radio, Regulus' researchers were able to remotely affect different aspects of the driving experience, including navigation, mapping, power calculations, speed control (including autonomous acceleration and deceleration) and even the car's suspension system.
The cost of ignoring GNSS vulnerability
The Fiat Chrysler/Harman cybersecurity lawsuit
During the summer of 2015, Wired reporter Andy Greenberg was driving a Jeep Cherokee while it was being hacked by Chris Valasek and Charlie Miller, two cybersecurity experts. This exposed a cybersecurity vulnerability across multiple vehicles, leading to 1.4 million cars being recalled.
Almost four years later, in January 2019, Fiat Chrysler Automobiles and HARMAN (a Samsung Electronics subsidiary) found themselves at the center of the biggest automotive cybersecurity lawsuit in history (est. $440M). Both companies are accused of knowing about a cybersecurity vulnerability in their cars and still releasing them to the public. The US court system sent a clear message to the automotive industry: a car should never be sold without proper cybersecurity.
Spoofing incident at Geneva Motor Show
On March 14, 2019, a GNSS spoofing attack was carried out inside the Geneva Motor Show. According to the report, the affected companies included Audi, Peugeot, Renault, Rolls-Royce, Volkswagen, Mercedes-Benz and BMW. This was the largest-scale GNSS spoofing attack on cars ever recorded.
The risk of not being prepared
These two recent occurrences were a wake-up call for automakers worldwide. The mass spoofing of vehicles demonstrates a vulnerability that exists regardless of make or model. As the Fiat Chrysler/Harman (Samsung) lawsuit over known automotive cyber vulnerabilities shows, such incidents expose automakers to substantial legal liability, requiring them to implement solutions for their GNSS vulnerabilities so that drivers and passengers know they are safe from GNSS hacking.
An example of dangerous spoofing – Autonomous vehicle spoofing experiments
In the first week of June, Regulus Cyber experts test-drove the Tesla Model 3 using Navigate on Autopilot (NOA). An active guidance feature of Tesla's Enhanced Autopilot platform, NOA is meant to make following a route to a destination easier, including suggesting and making lane changes and taking interchange exits, all under driver supervision. While it initially required drivers to confirm lane changes using the turn signals before the car moved into an adjacent lane, current versions of Navigate on Autopilot allow drivers to waive the confirmation requirement, meaning the car can activate the turn signal and start turning on its own.
The Tesla Model S was also tested, and during the spoofing experiment it showed different results, such as wrong navigation cues, incorrect battery power warnings (when calculating remaining distance) and changes to the suspension. However, the spoofing did not affect the actual driving, as the Model S does not have the NOA feature. The Model S test revealed that there is a link between the car's navigation and air suspension systems: the height of the car changed unexpectedly while moving because the suspension system "thought" it was driving through various locations during the test. On smooth roadways the car lowers itself for greater aerodynamics, while on "off-road" surfaces it elevates its undercarriage to avoid obstacles on the road.
The Tesla Model 3 was successfully spoofed in several attack scenarios. The Navigate on Autopilot feature is highly dependent on GNSS reliability, and spoofing resulted in multiple high-risk scenarios for the driver and passengers. Spoofing the Model 3 while Navigate on Autopilot was engaged led to extreme deceleration and acceleration, rapid lane-changing suggestions, unnecessary signaling, multiple attempts to exit the highway at incorrect locations and extreme driving instability. This test demonstrates the crucial dependence of any Level 2+ autonomous navigation on GNSS and the serious threat spoofing poses to drivers and passengers using these features.
Equipment used for spoofing and hacking
Jammer – ADALM-PLUTO configurable SDR manufactured by Analog Devices ($150).
Spoofer – bladeRF SDR manufactured by Nuand ($400) with external PPS sync, connected to a laptop.
*The equipment used was purchased online and is easily accessible to anyone.
Important terms regarding Tesla mentioned in this research
Cruise – a mode in which the driver designates a maximum speed and the car maintains it.
Autopilot – in addition to cruise, a mode that can only be activated when the car's cameras recognize lane markings. In this mode, the car is responsible for three additional activities: maintaining a safe distance from the car in front, adjusting speed according to road conditions, and keeping the car in the middle of the lane. The driver has to hold the wheel momentarily every few seconds.
Navigate on Autopilot (NOA) – Tesla's semi-autonomous mode, which can only be activated when the car is driving on a road with two lanes in each direction and has a set destination. This mode includes all the features of cruise and Autopilot, with two additional activities: changing lanes to maintain maximum speed and pass slow vehicles (the driver must approve the lane change with the blinker), and autonomously exiting highways at the relevant interchange. The highway-exit feature engages on its own and does not require driver confirmation: the car automatically engages the blinker, changes lanes and physically turns off the highway into the exit, up to a distance of 250m before requiring the driver to regain control.
During the Tesla Model 3 experiment, the spoofing antenna was mounted on the roof. This was done to simulate an external attack and to see whether the car is capable of isolating itself against spoofing; this is the typical way an outside attacker would try to influence the car. It also prevented the spoofing from affecting nearby cars or other GNSS receivers.
A spoofer can easily use an off-the-shelf high-gain directional antenna to reach a range of up to a mile, and with an added amplifier a range of a few miles is very much possible. It has already been proven that spoofing can occur across dozens of miles, as in the Black Sea spoofing attack of June 2017.
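A rough free-space link budget shows why antenna gain and amplification stretch the usable range so dramatically. The sketch below uses illustrative transmit powers and an assumed capture threshold near the authentic signal level (GPS L1 reaches Earth at roughly -130 dBm); free-space math ignores obstructions, pointing losses and receiver filtering, so real-world ranges are far shorter than these upper bounds:

```python
import math

GPS_L1_MHZ = 1575.42  # GPS L1 carrier frequency

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB for distance d_km and frequency f_mhz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

def max_range_km(eirp_dbm, f_mhz, rx_threshold_dbm):
    """Largest free-space distance at which received power still
    exceeds rx_threshold_dbm (e.g. slightly above the authentic signal)."""
    max_loss = eirp_dbm - rx_threshold_dbm
    return 10 ** ((max_loss - 32.44 - 20 * math.log10(f_mhz)) / 20)

# Illustrative assumed setups: a bare SDR at 0 dBm EIRP, and the same SDR
# with a +15 dBi directional antenna plus a +20 dB amplifier (35 dBm EIRP).
for label, eirp in [("bare SDR", 0.0), ("antenna + amplifier", 35.0)]:
    print(label, round(max_range_km(eirp, GPS_L1_MHZ, -125.0), 1), "km")
```

The key takeaway is the scaling: every +20 dB of EIRP multiplies the free-space range by 10, which is why a cheap directional antenna and amplifier turn a bench-top spoofer into a multi-mile threat.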
Regulus Cyber initially discovered the Tesla vulnerability during its ongoing study of the threat that easily accessible spoofing technology poses to GNSS (global navigation satellite systems, also known as GPS) receivers.
The researchers found that spoofing attacks on the Tesla GNSS (GPS) receiver could easily be carried out wirelessly and remotely.
Tesla emphasizes that “in both of these scenarios until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain ready to take manual control of their car at all times.” It appears Tesla's Autopilot has no reliance on GPS for actual physical driving decisions. It relies on its own visual sensors, just like a human driver.
There is one exception. The feature “Navigate on Autopilot” (NOA) uses GPS and Google map data, because the point of that feature is to follow a route. The car uses GPS and map data only to determine what lanes it should be in and what exits to take; actual control of the car is still the job of the onboard sensors. That means that spoofing essentially manipulates the car's autonomous turning decisions, so an attacker can remotely make the car turn while driving with NOA engaged.
Sensor fusion combined the GNSS data with the camera data to make the mistake of turning off the highway. Tesla's computer uses the GPS position to understand where it is, the cameras to identify lanes and exits, and the radar to avoid collisions and keep its distance from other cars. Tesla does not use LiDAR, and even if it did, it wouldn't matter: the experiment shows that using GNSS for autonomous navigation makes spoofing a wireless threat that manipulates the car's path.
Tesla Model 3 experiment description
The test was designed to reveal how the semi-autonomous Model 3 would react to a spoofing attack. The Regulus Cyber test began with the car driving normally and Navigate on Autopilot (NOA) activated, maintaining a constant speed and position in the middle of the lane.
The drive took place on a main highway at 95 KPH with Navigate on Autopilot engaged. The navigation destination was a nearby town, requiring the car to autonomously exit at an interchange 2.5 km ahead.
Using a small antenna with about a 1 meter (3 foot) range mounted on the roof, the researchers transmitted fake satellite coordinates that were picked up by the Model 3's receiver. These coordinates corresponded to a location on the highway 150 meters before the exit.
At the exact moment the car was spoofed to the new location, it was passing a dotted white line on its right-hand side, leading onto a small road into an emergency pit stop.
Although the car was still well before the planned exit when the spoofing attack began, it reacted as if the exit were just 500 feet (150 meters) away: it slowed from 95 KPH (60 MPH) to 24 KPH, activated the right turn signal, and made a right turn off the main road into the emergency pit stop. During the sudden turn the driver's hands were in his lap, since he was not prepared for the turn to happen so fast, and by the time he grabbed the wheel and regained manual control it was too late to maneuver back to the highway safely.
The testing was designed to assess the impact of spoofing with low-cost, open-source hardware and software: the same kind of technology that is accessible to anyone via e-commerce websites and open-source projects online, and the very same hardware teenagers use to cheat at Pokémon GO or Uber drivers use to fake their commutes. This dangerous technology is everywhere.
Taking control of Tesla's GPS with off-the-shelf tools took less than one minute. The researchers were able to remotely affect various aspects of the driving experience, including navigation, mapping, power calculations, and the suspension system. Under attack, the GNSS system displayed incorrect positions on the maps, making it impossible to plot an accurate route to the destination.
Prior to the Model 3 road test, Regulus Cyber provided its Model S research results to the Tesla Vulnerability Reporting Team, which responded with the following points at that time:
“Any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime. Even though this research doesn’t demonstrate any Tesla-specific vulnerabilities, that hasn’t stopped us from taking steps to introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.
The effect of GPS spoofing on Tesla cars is minimal and does not pose a safety risk, given that it would at most slightly raise or lower the vehicle’s air suspension system, which is not unsafe to do during regular driving or potentially route a driver to an incorrect location during manual driving.
While these researchers did not test the effects of GPS spoofing when Autopilot or Navigate on Autopilot was in use, we know that drivers using those features must still be responsible for the car at all times and can easily override Autopilot and Navigate on Autopilot at any time by using the steering wheel or brakes, and should always be prepared to do so.”
“This is a distressing answer by a car manufacturer that is the self-proclaimed leader in the autonomous vehicle race,” Yoav Zangvil, Regulus Cyber CTO and co-founder, comments. “As drivers and safety/security experts, we’re not comforted by vague hints towards future safeguards and statements that dismiss the threats of GPS attacks.” He offers the following counterpoints in response:
Attacks against any GPS system are indeed considered a crime because their effects are dangerous, as we’ve shown; yet the same devices we used to simulate the attacks are legally accessible to anyone online via e-commerce sites.
Taking steps to “introduce safeguards for the future” indicates that spoofing is, in fact, a major issue for Tesla, which relies heavily on GNSS.
In the case of cars, a spoofing attack is confusing in the best case, and a threat to safety in more severe scenarios.
The more GPS data is leveraged in automated driver assistance systems, the stronger and more unpredictable the effects of spoofing become.
The fact that spoofing causes unforeseen results like unintentional acceleration and deceleration, as we’ve shown, clearly demonstrates that GNSS spoofing raises a safety issue that must be addressed.
In addition, the spoofing attack made the car engage in a physical maneuver off the road, providing a dire glimpse into the troubled future of autonomous cars that would have to rely on unsecure GNSS for navigation and decision-making.
Given that the trust of the public still has to be earned as the automotive industry moves towards autonomy, the leading players are accountable for a responsible deployment of new technology.
As Tesla clearly stated, drivers are responsible for overriding Autopilot under a spoofing attack, so it appears its Autopilot system can’t be trusted to function safely under such an attack.
Because every GNSS/GPS broadcast system can be affected by GNSS/GPS spoofing, the issue is everyone’s problem and shouldn’t be ignored; furthermore, governments and regulators that have a mandate to protect the public’s safety must engage in proactive measures to ensure only safe GNSS receivers are used in cars.
According to Tesla, it will soon release completely autonomous cars utilizing GNSS, which means that, in theory, an attacker could remotely control the car’s route planning and navigation. We are therefore obligated to ask what steps Tesla is taking to address this threat, and whether new safeguards will be implemented in its next generation of entirely autonomous cars.
Although the researchers tested only the Model S and Model 3, they concluded that the “disturbing vulnerability” of Tesla’s GNSS system is most likely an industry-wide issue, since “the most common GNSS chipsets used are all vulnerable to our testing – you can learn more about it in the Regulus Cyber resiliency report.”
Spoofing detection and mitigation
One of the key concepts for negating the harmful effects of spoofing is detection. If a GNSS receiver is capable of telling whether the signals received are real or false, this provides the first line of defense against spoofing: the receiver can stop feeding fake data into the car’s navigation sensor fusion, preventing false information from corrupting the system.
The next step in GNSS cybersecurity is mitigation. This means the GNSS receiver can differentiate between real and fake signals and lock on to the real signals coming from satellites, even under a spoofing attack.
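As a minimal sketch of the detection idea, a receiver (or the fusion layer behind it) can run plausibility checks on each new fix before trusting it. The checks and thresholds below are illustrative assumptions, not values from any production receiver or from Regulus' Pyramid technology:

```python
# Sketch of two classic receiver-side spoofing-detection heuristics:
# an implausible position jump, and a sudden signal-strength increase
# (a spoofer must overpower the authentic signal to capture the receiver).
import math

def looks_spoofed(prev_fix, new_fix, dt_s,
                  max_speed_mps=70.0, max_cn0_jump_db=10.0):
    """Flag a new GNSS fix as suspicious.

    prev_fix / new_fix: dicts with 'x', 'y' position in meters (local
    frame) and 'cn0' carrier-to-noise density in dB-Hz; dt_s is the time
    between fixes. Thresholds are illustrative assumptions.
    """
    # 1. Position-jump check: implied speed beyond any plausible vehicle.
    dx = new_fix['x'] - prev_fix['x']
    dy = new_fix['y'] - prev_fix['y']
    implied_speed = math.hypot(dx, dy) / dt_s
    if implied_speed > max_speed_mps:
        return True
    # 2. Signal-strength check: an abrupt C/N0 jump is a common tell
    # of a nearby transmitter overpowering the satellites.
    if new_fix['cn0'] - prev_fix['cn0'] > max_cn0_jump_db:
        return True
    return False

prev = {'x': 0.0, 'y': 0.0, 'cn0': 45.0}
# A fix teleported kilometers away in one second, as in the Model 3 test:
print(looks_spoofed(prev, {'x': 3000.0, 'y': 0.0, 'cn0': 45.0}, 1.0))  # True
```

Production-grade detection cross-checks many more observables (clock drift, Doppler consistency, inertial sensors), but even simple gates like these would reject the instantaneous "jump" the Model 3 accepted.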
Regulus has been developing its Pyramid GNSS technology, which enables the detection and mitigation of spoofing. Pyramid GNSS addresses the different GNSS product levels: from a resilient, stand-alone Pyramid GNSS receiver fortified to defend against spoofing attacks, to a software solution compatible with the most common commercial GNSS receivers on the market, and even GNSS chip-level detection and mitigation.