Whether or not you welcome the advent of the driverless car—which, several manufacturers say, is just around the bend—it's a fact that in-car technology continues to transform the motor industry and the driving experience.
Safety is at the heart of most technological innovation, and University of Cambridge researchers have been at the forefront of these developments for some time. One key area in which Cambridge researchers led the way was a Head Up Display (HUD) incorporating laser holographic techniques, an industry first.
HUD technology enables critical information about speed, lane position and road geometry to appear on the windscreen, so drivers do not have to take their eyes off the road to look at dashboard instruments. This makes driving safer. Recent high-profile court cases in which drivers distracted by their mobile phones caused fatal crashes underline the benefits of displaying information in this non-intrusive way.
The holographic HUD technology—conceptualised in the University’s Department of Engineering more than a decade ago—originated with Prof Bill Crossland in 2001 and was licensed by Cambridge Enterprise to Alps Electric. It was developed by Alps and then by Two Trees Photonics Ltd at Milton Keynes (acquired by DAQRI in March 2016), in collaboration with researchers at Cambridge’s Centre for Advanced Photonics and Electronics (CAPE). Two Trees and Alps designed the products, and eventually Alps manufactured them for integration into Jaguar Land Rover vehicles, with the HUD becoming an available option in September 2014.
The commercially available Cambridge HUD projected information onto the windscreen, in full colour and in two dimensions. According to Prof Daping Chu, the current Director of CAPE, sophisticated in-car technology was just getting started then. At the time, he correctly predicted that this Augmented Reality (AR) approach would become part of the “immersive experience” of driving.
The HUD has joined dashboard cameras, reverse parking and anti-collision systems, proximity sensors and similar features that remove various, fallible human elements from driving and take us closer to the autonomous vehicle. The technology continues to evolve, accelerating progress towards driverless cars.
Now AR is destined "to redefine the automotive user interface while mitigating the transition to autonomous driving", according to business analysts at ABI Research. Their report in August 2016 predicted that AR and Virtual Reality (VR) applications would redefine the driving experience, making it both safer and more intuitive.
"Specifically, AR Heads-Up Displays (HUDs) will allow advanced autonomous operation by 'painting' 3D navigation instructions onto road geometry, highlighting moving obstacles like crossing pedestrians, and enhancing driver awareness of and trust in autonomous operation… By 2025, more than 15 million AR HUDs will ship, with more than 11 million to be embedded solutions."
For a technology that not long ago was the stuff of science fiction or cheesy television drama (remember Knight Rider?), the HUD has a surprisingly long history.
The earliest HUDs were developed during the Second World War to help pilots hit their targets while manoeuvring. The modern HUD became commonplace in military aircraft in the 1960s and in commercial aircraft in the 1970s, typically providing information such as airspeed, altitude, heading and a horizon line, with additional information for military applications. In 1988, General Motors introduced the first production car with a HUD.
More than a decade later, the Cambridge team pioneered the use of laser holographic techniques for HUDs, offering better colour, brightness and contrast than other systems in a smaller, lighter package.
Prof Chu understood how HUDs could enhance situational awareness. He suggested that AR could also encourage good driver behaviour, and recent developments have proven his point.
Now AR software can detect, pinpoint, display and even tell the driver about potential obstacles, from preoccupied pedestrians to animals at the roadside. It can project satnav directions on the windscreen, showing virtual objects—arrows, lane markings and so on—while the road stays visible in the driver’s field of view. Additional information may also be added, from weather updates and traffic reports to night-driving assistance.
And while some manufacturers favour embedding the technology into the windscreen, others are developing devices or apps for smartphones to mount on the dashboard, or even introducing wearables—glasses or wristbands that communicate with cars. Touch-screen navigation is also on the horizon.
“The car will evolve,” said Chu. “In 50 years’ time, everything in cars will be controlled by computers”. The way that the human- and computer-controlled elements are integrated, however, remains a moving target. “When these systems are integrated, who ultimately makes critical decisions, the car or the driver? And in the case of disagreement, who wins?”
Cambridge's laser holographic HUD was an important step in the right direction. Lee Skrypchuk, Human Machine Interface Technical Specialist at Jaguar Land Rover, said: "Incorporating a laser holographic light engine was a true world-first application of technology for Jaguar Land Rover, and I'm delighted that the technology has worked so well in our vehicles".