The Future of Driving Begins with Trust

People need to be able to quickly and reliably gauge what an autonomous vehicle is going to do next

For many people, the theme of “digital transformation” conjures up a vague feeling somewhere between fascination and uncertainty. Fascination because digital technologies can drastically simplify complex activities, find amazing solutions and offer unique opportunities. Uncertainty because futuristic visions of complete automation seem to leave little room for human individuality and our ability to act.

“At Mercedes-Benz we are convinced that the digital transformation can only be designed successfully if it is deeply anchored within society. Humans and access to data must be at the heart of a digital transformation,” says Jasmin Eichler, Head of Research Future Technologies at Daimler AG. “That is why we are also working on solutions in the field of digitalisation which place the freedom, decision-making autonomy and individuality of human beings at their centre. We aim to create a balance between humans and technology. The approach we are following here is ‘Human first’.”

Because the issues around digital transformation are so diverse, Mercedes-Benz is basing its endeavours on “open innovation”. Stakeholders from different fields – business, research, art, industry and biology – are brought together for shared research. The results are projects that consider the future of mobility from new perspectives and produce exceptional problem-solving approaches. Mercedes-Benz presented some of these collaborative projects at the FutureInsight event in Berlin.

A future with autonomous vehicles

How do we establish trust between humans and machines? Autonomous driving is going to be an integral part of our future. When it comes to this topic, Mercedes-Benz regards empathy and trust as central factors for the success and acceptance of the transformation. The concept of “informed trust” takes on great importance here: “People need to be able to quickly and reliably gauge what an autonomous vehicle is going to do next. The vehicle must therefore provide information about its intentions in a way that people can grasp immediately and intuitively,” says Alexander Mankowsky, a futurologist at Daimler. Based on this information, people need to be able to decide what they are going to do and how they are going to respond to the situation. For this purpose, Mercedes-Benz introduced concepts for a “cooperative vehicle” at FutureInsight, among other innovations. Projects with external partners demonstrate further possibilities for how future autonomous vehicles could communicate and cooperate with their surroundings.

The cooperative vehicle – know intuitively what the car intends to do

The cooperative vehicle, based on an S-Class, features 360-degree light signalling. Turquoise light strips in the windscreen, the radiator grille, the headlamps, the exterior mirrors and the lower area of the windows indicate to pedestrians and surrounding traffic that the vehicle is operating in autonomous mode. Lamps on the roof provide information about the actions the vehicle is about to perform: a steady light shows that the vehicle is in autonomous driving mode, regardless of whether it is driving or at a standstill; slow flashing means that the vehicle is braking; rapid flashing indicates that it is about to move off. The lights on the roof also follow the movements of people at the side of the road and in front of the vehicle to signal that the vehicle is aware of their presence. In doing so, the cooperative vehicle recreates the natural eye contact that would otherwise take place between driver and pedestrians.
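
The signalling logic described above amounts to a small state machine that maps the vehicle's current intention onto a light pattern. The following Python sketch illustrates that mapping; the state names, the pedestrian-tracking convention and the function interfaces are illustrative assumptions, not the actual Mercedes-Benz implementation.

```python
from enum import Enum, auto

class DrivingState(Enum):
    """Hypothetical intentions the cooperative vehicle wants to signal."""
    AUTONOMOUS_IDLE = auto()     # autonomous mode, driving or at a standstill
    BRAKING = auto()             # slowing down
    ABOUT_TO_MOVE_OFF = auto()   # will start moving shortly

def roof_light_pattern(state: DrivingState) -> str:
    """Map a driving state to the roof-light pattern described in the article.

    Returns a simple label; a real controller would drive the LED hardware.
    """
    if state is DrivingState.BRAKING:
        return "slow flashing"
    if state is DrivingState.ABOUT_TO_MOVE_OFF:
        return "rapid flashing"
    return "steady turquoise"    # autonomous-mode indicator

def track_pedestrian(bearing_deg: float) -> str:
    """Aim the roof lights at a detected pedestrian to mimic eye contact.

    `bearing_deg` is the pedestrian's direction relative to the vehicle's
    longitudinal axis (0 degrees = straight ahead); an assumed convention.
    """
    return f"roof lights oriented towards {bearing_deg:+.0f} deg"

if __name__ == "__main__":
    print(roof_light_pattern(DrivingState.BRAKING))   # slow flashing
    print(track_pedestrian(-15.0))                    # pedestrian slightly to the left
```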

The cooperative S-Class also informs its surroundings that it is about to enter into operation while it is still at the side of the road. The light strips around the vehicle emit an appropriate light signal. The exterior mirrors fold out and first the rear of the vehicle lifts up followed by the front. These movements resemble a living thing that is waking up and stretching. People can understand this communication intuitively.

Study shows pedestrians want 360-degree communication in turquoise

360-degree light signalling is particularly important when it comes to keeping pedestrians informed. This finding is the result of several light studies that Mercedes-Benz has conducted at its test facility in Sindelfingen, as well as at the recently opened site in Immendingen, under the direction of Stefanie Faas from Daimler’s Innowerkstatt (innovation workshop). The research looked at how pedestrians react to autonomous vehicles with different light signalling in various traffic situations. It became clear that light signalling has a strong effect on the acceptance of autonomous vehicles, as well as on how safe pedestrians feel. In particular, people want light signalling in situations where they would previously have interacted with the driver. For example, people are used to seeking eye contact with a driver when they wish to cross the road. If light signalling communicates that a vehicle is in autonomous driving mode, pedestrians can feel safe even if the vehicle occupants are obviously not paying attention to what is happening in traffic. The majority of participants in the study preferred turquoise as the signalling colour, and all participants favoured a 360-degree display. Mercedes-Benz is also contributing the results of the study on autonomous driving to SAE International, an international organisation dedicated to advancing mobility technology, and recommends the use of turquoise – a colour which has not previously been used in the automotive sector – for 360-degree signalling.

Visions of the future: the vehicle body as a means of communication

Going beyond the studies and the light signalling demonstrated with the cooperative vehicle, Mercedes-Benz is already working on longer-range visions intended to enable “informed trust” between humans and machines. Informed trust contrasts with blind trust and requires a certain knowledge of the object. Here the entire outer skin of the vehicle becomes a medium for 360-degree communication: the conventional body is transformed into a “digital exterior”.

Mercedes-Benz showed a first step in this direction back in 2015 with the F015 research vehicle, which features, among other things, a digital grille that can be used as a communication medium. A year later the Vision Van, an electrically powered van with integral delivery drones for transporting parcels over the last mile, picked up on this motif. It is fitted with digital LED grilles at the front and rear, which the vehicle can use to warn traffic behind with messages such as “Vehicle stopping”. In 2018 the Vision URBANETIC, a concept for on-demand, efficient and sustainable mobility, took this design further. The concept, comprising an autonomous drive platform with interchangeable modules for transporting cargo and passengers, can communicate with its surroundings by means of “digital shadowing” on the body: for example, the shadow of a pedestrian is displayed when the vehicle’s 360-degree sensors perceive someone nearby. Thanks to this interaction, the pedestrian can feel confident that the vehicle has detected them and can act accordingly. Building on these innovations, Mercedes-Benz is now working on further solutions that provide vehicle occupants and passers-by with the same information about the vehicle’s perceptions and subsequent actions. In addition, the vehicle occupants should be able to decide what the vehicle communicates outwardly. This creates a cocooning effect inside the vehicle, so that it feels like a protected space for its passengers.

Groove – interaction via reactive surfaces

The “Groove” project – a collaboration between Mercedes-Benz and the designers at Studio 7.5 in Berlin – is exploring the communicative potential of reactive surfaces, with a focus on collaboration between humans and autonomous vehicles. The team developed a mobile, manoeuvrable membrane which perceives its environment and responds to it in a similar way to a sea anemone. The aim of the project is to use these modes of expression to communicate an autonomous system’s processes and intentions to its surroundings and thereby improve the interaction between humans and machines.

Polygon – different dimensions of informed trust based on animations

In collaboration with the Japanese animation studio Polygon Pictures, Mercedes-Benz has designed animations of different scenarios in which autonomous vehicles could build informed trust with humans. The basic idea of anime is that a lot of emotion is expressed in a few strokes. The key question this project addresses is therefore: how can these principles be used for intuitive communication between human and machine?

Answers are provided by, among other scenarios, “AICAR”, which shows an autonomous vehicle as an animated character. In addition to light signalling that announces actions such as stopping, moving off or turning, the vehicle has various communication features that can express emotions.

Eye contact is the focus of a second scenario. Studies have shown that people intuitively seek eye contact with autonomous vehicles. In order to leverage this behaviour for active interaction between human and machine, Polygon Pictures has created a stylised eye design for autonomous vehicles. This makes it possible to present actions that the vehicle is about to perform and allows people to intuitively grasp what is going on.

A third scenario known as “AIMY” approaches the same topic in a somewhat more abstract way. Here the vehicle communicates with its environment via a target pointer. This target pointer consists of optical signals such as crosses or rays, which announce actions such as turning, accelerating or braking.

“See like a pony” – using the sensory system of animals as a model

The “SLAP – See like a pony” project, orchestrated by Sabine Engelhardt from the Future Technologies division at Daimler AG, looks at the interaction between human and machine from a very unusual perspective. She is studying how ponies perceive their surroundings and drawing conclusions from that to assist with communication between people and autonomous cars. The approach originated with Stanford professor Clifford Nass. In one of his lectures the sociologist compared autonomous cars to domestic animals: their behaviour is predictable to a certain extent, but there are also actions that even humans cannot predict. Furthermore, communication between humans and animals chiefly involves body language – similar to the way it could take place between humans and machines. At the same time, neither animals nor machines can fully predict and understand human actions. With the help of cameras, “SLAP” enables the researchers to see the world from a pony’s perspective and thus learn how the animals behave with humans. A familiar example: horses show attention through the direction of their ears, and knowing where their attention lies helps considerably when interacting with them. These findings can be transferred to the design and technology of self-driving cars, whose sensory attention can likewise be made externally visible and therefore comprehensible.

Maya Ganesh – the ethics of autonomous vehicles

Researcher and author Maya Ganesh considers the topic of empathy in the context of the mobility of the future from a meta-perspective: she engages with the ethics of autonomous vehicles. In her lecture “Insight on Ethics; Society & AI” she discusses different aspects of ethics in the interaction between human and machine. Among other issues, she addresses the question of whether an autonomous vehicle can really be regarded as “autonomous”, i.e. whether it can really be understood as a “being” that has intelligence, consciousness, sensory perception and free will. At the same time she asks why people assume that all intelligence must be modelled on human intelligence, which is itself an arbitrary and changeable benchmark, and whether it is meaningful to apply human standards to the appraisal of machines. On this basis, she argues for reclassifying the relationships between humans and machines – whether they be hybrids, cyborgs or robots – and, as a consequence, for a re-evaluation of the ethical standards that apply to their interaction.

Mercedes partners with Uber

The two companies are joining forces to develop autonomous vehicle technology together and to eventually put self-driving Mercedes-Benzes on the Uber ridesharing network

In terms of established automotive brands, Daimler, Mercedes-Benz’s parent company, is already at the forefront of autonomous driving technology. So much so that it built the first series-production car — the Mercedes E-Class — to qualify for an autonomous driving test licence in the state of Nevada.

However, the rate at which changes are taking place means that the company will need to forge alliances and partnerships if it intends to stay ahead of the pack and navigate the move from being solely a car manufacturer into the realm of mobility services.

“As the inventor of the automobile, Daimler aims to be a leader in autonomous driving — one of the most fascinating aspects of reinventing mobility,” said Daimler chairman Dieter Zetsche. “Mobility service providers offer an ideal platform for autonomous driving technology and Uber is a leading mobility platform company.”

As well as having a mature digital ride-sharing platform that crosses continents, Uber has very quickly developed its own in-house engineering division dedicated to the advancement of self-driving vehicle technology, making the company a perfect potential partner for Mercedes.

“Self-driving technology holds the promise of creating cities that are safer, cleaner and more accessible,” said Uber CEO, Travis Kalanick. “But we can’t get to that future alone. That’s why we’re opening up the Uber platform to auto manufacturers like Daimler.”

Mercedes Self-Driving Cars: The Way of the Future

Mercedes-Benz is on its way to developing autonomous vehicles for a safer driving experience

Imagine you’re driving down the 405: the turn signal clicks on and your Mercedes S-Class accelerates to 65 mph as it changes into the left lane, passing two slower vehicles, all while you’re focused on the latest New York Times best-seller. Once you arrive at your destination, your self-driving car finds a parking space, drops you off and, with the push of an electronic key, parks itself in the space.

On your way home you hit rush hour, but your S-Class keeps a safe and continuous distance in stop-and-go traffic, minimizing your accident risk and letting you focus on your phone call. Your S-Class then effortlessly finds its way home through the dense traffic of the city, navigating its way around other cars, trucks, buses, bicyclists and pedestrians, all unpredictable and moving at their own pace. In traffic-calmed areas the Mercedes adheres to the prescribed walking pace because it can read traffic signs, and thanks to radar sensors and stereo cameras it can also keep an eye on pedestrians.

This is the future of driving: this is autonomous driving.

Up until a few years ago, engineers, computer scientists and film directors developed such science-fiction scenarios to provide a visionary outlook on the mobility of the 21st century. Think back to movies like Demolition Man or Minority Report. Thanks to innovations from Mercedes, reality is catching up with those movies: all the driving described above is already possible and is being tested under real-life conditions with the help of the latest assistance systems from Mercedes-Benz. At the same time, researchers and development engineers from electronics companies, automotive suppliers and universities are working on intelligent hardware and software intended to gradually make vehicles autonomous.

Consequently, everyday life faces a profound revolution. Even though the vision of autonomous driving goes back many decades, it is only now that steadily increasing computing power and innovations in sensor technology and environment perception, paired with the rapid digitisation and networking of everyday life, make driverless mobility attainable. There are many possibilities to enhance traffic safety, to make mobility more efficient and environmentally compatible and to create unimaginable freedoms for all road users. However, before the goal of highly or even fully autonomous driving is reached, several development obstacles must be overcome in order to make the hardware and software faster, more intelligent and more affordable. At the same time, the infrastructure, legislative bodies and society will have to prepare for this new dimension of motoring.

“Autonomous driving will gradually become reality”, states Ralf Guido Herrtwich, Head of Driving Assistance and Chassis Systems in Group Research and Advanced Engineering at Daimler. “Initially we will drive autonomously on certain classes of roads, starting with the freeways and maybe only under certain weather or lighting conditions. In the beginning you will also have to monitor the system rather than grabbing a book and tuning out completely.”

Herrtwich warns that placing too high an expectation on autonomous vehicles that manage without any human intervention would be dangerous. At slow speeds, in stop-and-go traffic or in parking situations, driverless mobility is just a few years away. But at high speeds and in complex situations it will be necessary for the driver to be involved for at least the next ten years. Assistance systems already on the market have shown that partly autonomous vehicles are capable of lowering the number of accidents by compensating for human errors and reacting faster than humanly possible.

Driving assistance systems, like the ones on the new 2014 Mercedes S-Class and others found as standard equipment on Mercedes-Benz models, play a crucial role in autonomous driving. These technologies merge comfort and safety. They include DISTRONIC PLUS proximity control, which keeps the desired distance from the vehicle travelling ahead. In addition, the STEER CONTROL steering assistance system, for example in the new Mercedes-Benz E- and S-Class, keeps the vehicle in the centre of the lane; however, drivers need to keep their hands on the steering wheel at all times. Active Lane Keeping Assist can intervene when the driver unintentionally crosses a dotted line and the adjacent lane is occupied; the previous generation of the lane-keeping assistance system was already capable of detecting when a solid line was crossed. BAS PLUS Brake Assist with Cross-Traffic Assist can not only prevent rear-end collisions, but can also intervene in the event of impending collisions with crossing traffic at junctions, if need be even with a full emergency stop. The latest version can now recognise pedestrians walking in front of the vehicle, warn the driver visually and audibly or, in emergency situations, even initiate autonomous braking.

These intelligent systems are made possible by an array of sensors that provide the vehicle with a 360-degree view of what is going on around it. Radar sensors of different ranges can “see” for a distance of up to 200 metres. Their input is complemented by a stereo camera behind the windscreen: thanks to its two lenses, the camera perceives a three-dimensional image of the area up to about 50 metres in front of the car and, beyond that, a two-dimensional image out to infinity, similar to human eyes.

All the data constantly streaming in are processed by various on-board systems, for instance to calculate the trajectories of crossing vehicles or pedestrians in anticipatory fashion, to “read” traffic signs and to issue appropriate warnings or initiate reactions. This makes it possible, for example, for a vehicle to drive, and even overtake other vehicles, safely and autonomously at high speeds with the Mercedes-Benz Motorway Pilot system, which has already undergone successful testing under real-life conditions.
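
To make the idea of “anticipatory” trajectory calculation concrete, here is a deliberately simplified Python sketch: it extrapolates a detected road user's motion under a constant-velocity assumption and checks how close the predicted paths come. Production systems fuse many sensors and use far richer motion models; the data structure, prediction horizon and warning threshold below are assumptions chosen purely for illustration.

```python
from dataclasses import dataclass
import math

@dataclass
class TrackedObject:
    """Minimal state of a detected road user (positions in metres, velocities in m/s)."""
    x: float
    y: float
    vx: float
    vy: float

def predict(obj: TrackedObject, t: float) -> tuple[float, float]:
    """Constant-velocity extrapolation of the object's position after t seconds."""
    return obj.x + obj.vx * t, obj.y + obj.vy * t

def min_separation(ego: TrackedObject, other: TrackedObject,
                   horizon_s: float = 4.0, step_s: float = 0.1) -> float:
    """Smallest predicted distance between the ego vehicle and another road user
    over the prediction horizon."""
    best = math.inf
    t = 0.0
    while t <= horizon_s:
        ex, ey = predict(ego, t)
        ox, oy = predict(other, t)
        best = min(best, math.hypot(ex - ox, ey - oy))
        t += step_s
    return best

if __name__ == "__main__":
    ego = TrackedObject(0.0, 0.0, 13.9, 0.0)         # ego car at ~50 km/h, straight ahead
    crossing = TrackedObject(30.0, -3.0, 0.0, 1.5)   # pedestrian 30 m ahead, stepping onto the road
    if min_separation(ego, crossing) < 2.0:
        print("warn driver / prepare autonomous braking")
```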

Ideally, autonomous automobiles equipped with the necessary sensor package, detailed map data and sufficient computing power can travel virtually any route. One of the milestones for autonomous driving was the DARPA Grand Challenge, organised by the research and development branch of the US Department of Defense in the Nevada desert in 2004 and 2005. Only on the second attempt did some of the expensive, adventurously retrofitted vehicles manage to complete the route that stretched over 240 kilometres of very rough terrain.

“These two competitions inspired an entire research community that went to work with passion. This led to a quantum leap in technology, for sensors as well as applications. It is astonishing how far we have come this past decade”, says William “Red” Whittaker, professor of robotics at Carnegie Mellon University (CMU) in Pittsburgh and, together with his team, one of the DARPA winners. Pioneers like Whittaker also know about the obstacles the researchers and engineers still have to eliminate. First, there is the question of when the necessary technology will be powerful, compact and affordable enough for series production. The LIDAR laser scanners used, for instance, in Google’s driverless cars are too expensive for series-production use: these precision-mechanical devices, rotating constantly on the roof, provide a detailed 360-degree view of the surroundings, but they cost several times the value of the cars on which they are mounted.

“Many of the hardware and software components are still too expensive. They are plainly and simply unaffordable for normal consumers. If I had that much money, I’d buy a great sports car and drive myself”, jokes Emilio Frazzoli, professor of aerospace engineering at the Massachusetts Institute of Technology (MIT), who normally deals with autonomous vehicles travelling by land or air.

This is why Daimler researchers like Ralf Guido Herrtwich are working on an intelligently assembled array of radar sensors and cameras that collects the required information even without costly lasers, so that the vehicle can travel safely, efficiently and comfortably. “This technology ultimately mustn’t cost any more than today’s driving assistance systems, that is to say, a couple of thousand euros”, Herrtwich stresses. This also includes a continuously updated digital map that provides significantly more detail and remains more up to date than those of conventional navigation systems. Otherwise an autonomous vehicle will flounder if it encounters a new, unregistered construction zone or a mapped bend that deviates from the values measured by the on-board sensors. However, vehicles can assist each other in creating such real-time maps, because in theory every car is able to record the route it travels and feed the route data into databases.
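
The notion that every car can record the route it travels and feed the data into shared databases is, in essence, crowd-sourced map maintenance. The Python sketch below shows one minimal way such aggregation could work; the grid resolution, class names and confirmation threshold are assumptions made for illustration and do not describe Daimler's actual map pipeline.

```python
from collections import defaultdict

# Shared map keyed by a coarse grid cell; each cell counts how many
# vehicles have recently reported driving through it.
CELL_SIZE_DEG = 0.0005  # roughly 50 m at mid latitudes (illustrative choice)

def cell_of(lat: float, lon: float) -> tuple[int, int]:
    """Quantise a GPS fix onto the shared map grid."""
    return round(lat / CELL_SIZE_DEG), round(lon / CELL_SIZE_DEG)

class CrowdMap:
    def __init__(self) -> None:
        self.observations: dict[tuple[int, int], int] = defaultdict(int)

    def report_trace(self, trace: list[tuple[float, float]]) -> None:
        """A vehicle uploads the GPS trace of a route it has just driven."""
        for lat, lon in trace:
            self.observations[cell_of(lat, lon)] += 1

    def is_drivable(self, lat: float, lon: float, min_reports: int = 3) -> bool:
        """Treat a cell as confirmed road once enough vehicles have reported it,
        which also lets new construction zones appear (or disappear) quickly."""
        return self.observations[cell_of(lat, lon)] >= min_reports

if __name__ == "__main__":
    crowd_map = CrowdMap()
    for _ in range(3):
        crowd_map.report_trace([(48.7758, 9.1829), (48.7760, 9.1831)])
    print(crowd_map.is_drivable(48.7758, 9.1829))  # True after three reports
```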

Experts like CMU professor Whittaker expect autonomous vehicles to see the world differently. Their navigation aids have little in common with the combination of conventional maps and superimposed images we know from today’s assistance systems. “We are already able to create three-dimensional models of our environment that are better and more detailed than the human eye would ever be able to perceive”, says Whittaker of the initial prototypes. Such super-realistic models of the environment are generated partly on board and – thanks to mobile broadband access to the internet in future vehicles – partly in the Cloud.

It is not only the vehicles that need to evolve; the surrounding infrastructure does as well. Companies like Daimler have long been researching so-called car-to-x communications, which allow vehicles to exchange data with each other and with their surroundings, including road signs and traffic cameras mounted above the road.
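
Car-to-x communication of this kind ultimately comes down to vehicles and roadside infrastructure exchanging small, timestamped status and hazard messages. The sketch below shows the rough shape such a message might take; the field names and the JSON encoding are illustrative assumptions rather than the standardised formats actually used in the field (such as ETSI's CAM/DENM messages).

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class HazardMessage:
    """Illustrative car-to-x broadcast: 'there is a hazard at this position'."""
    sender_id: str
    latitude: float
    longitude: float
    hazard_type: str   # e.g. "stationary_vehicle", "icy_road" (hypothetical labels)
    timestamp: float

    def encode(self) -> bytes:
        """Serialise for broadcast; real systems use standardised binary encodings."""
        return json.dumps(asdict(self)).encode("utf-8")

    @staticmethod
    def decode(payload: bytes) -> "HazardMessage":
        return HazardMessage(**json.loads(payload.decode("utf-8")))

if __name__ == "__main__":
    msg = HazardMessage("vehicle-4711", 48.776, 9.183, "stationary_vehicle", time.time())
    received = HazardMessage.decode(msg.encode())
    # A receiving vehicle or roadside unit could now warn approaching drivers
    # or feed the hazard into its own route planning.
    print(received.hazard_type, received.latitude, received.longitude)
```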

In April the Los Angeles metropolitan area became the first city in the world to synchronise all its 4500 traffic lights. Magnetic sensors in the road and hundreds of cameras feed their data into a central computer that dynamically controls all traffic lights to speed up the traffic flow of seven million daily commuters. During rush hour the system can phase the traffic lights only for bus lanes while other vehicles have to wait. “Especially for driving in an urban area surrounded by hundreds of thousands of other vehicles we already have a wealth of information as well as the infrastructure for lowering the costs and complexity of autonomous driving”, MIT researcher Frazzoli reflects. “A car can use its surroundings and other vehicles for its eyes and ears”.

Besides all the technical advances, which are happening rapidly, this also requires another change that has already begun: society at large and legislative bodies have to rethink what constitutes the nature of a vehicle and of the modern transportation system overall, because much of what would be technically possible is legally impermissible. The Vienna Convention on Road Traffic from 1968 determines who may steer a car: “Every driver must have control of his vehicle at all times or be able to lead his animals”. Nobody was thinking of a computer of any kind at the wheel 45 years ago, and thus questions about certification and insurance, as well as liability in the event of accidents, remain a grey area.

Some legislators have tackled the issue. The US states of Nevada, California and Florida were the first to pass laws governing the certification and operation of autonomous automobiles. This gives companies an incentive to test their prototypes there and sets a precedent in one of the world’s largest automobile markets. Should the US establish national rules for autonomous vehicles, the EU and China would soon follow suit. Until then, autonomous driving will remain relegated to narrowly defined areas of application in which people may never really take their hands off the steering wheel and their eyes off the road.

“We build all the systems in a way that ensures the driver regains full control the moment he or she wants to take over. Our systems are fully dedicated to providing support and relief”, says Daimler researcher Herrtwich. In his mind the transition from partly to fully autonomous systems is not only a matter of the systems’ technical capabilities, but goes hand in hand with the driver’s growing trust. “Once you personally experience that such a system works, you trust it in more and more situations.”

That is precisely what members of the group known as ‘digital natives’ seem to do: all those who grew up surrounded by digital devices and services and who in many cases willingly and completely rely on technology. They hope that autonomous vehicles will relieve them of tedious routine tasks, such as commuting to and from work. People who try to talk on the phone, type on their smartphone or even read their emails while driving will mostly be thrilled by the prospect of soon leaving the driving largely to the vehicle. Designers are already sketching driver’s seats for concept vehicles that swivel to let drivers direct their attention to a tablet computer or the newspaper instead of watching traffic.

Many senior citizens will also put their hope in the next vehicle generation, or the one after that, because its sensor systems and algorithms can make up for their declining abilities. This promises to increase the mobility radius of millions of people who had previously been severely limited by old age, illness or disability.

Against this background it is understandable that Google promotes the prototypes of its autonomous vehicles with a video showing a blind man regaining mobility he had thought long lost. “For people with disabilities and senior citizens, autonomous driving is a question of human dignity”, robotics researcher Whittaker believes. “For that we by no means need vehicles that drive autonomously under all conditions.” He envisions fully automatic people-mover systems for public transportation, such as already exist at many airports, and some local authorities are considering their use in inner cities.

Autonomous driving also creates new freedoms in a much wider sense. For MIT professor Frazzoli, for instance, it is not about automatically steered vehicles that drive occupants from Point A to Point B, but about the opportunity to reinvent the transportation system and make it more efficient. “Today our cars are only utilised 5 to 10 per cent. The rest of the time they sit around. That is not a sustainable model”, says the scientist working in Singapore. “That’s why I believe that the ‘sharing economy’ and autonomous driving are two sides of the same coin”. ‘Sharing economy’ refers to a culture of sharing services and objects.

Instead of waiting for all-capable, fully autonomous vehicles to arrive, says Frazzoli, carsharing services should be outfitted with vehicles that have a limited list of capabilities, such as finding the way to the nearest filling or charging station, picking up a waiting customer at a specific address or, if needed, moving to another location. Such cars would solve several problems of autonomous driving at once, the scientist argues: “Since they’re driving without human passengers, they can always take the easiest route. For example, like a municipal commercial vehicle they could initially drive slowly at the edge of the road, and even if their steering and braking manoeuvres were a bit jerky, it wouldn’t bother any occupants. In this way it is possible to lower the requirements on autonomous vehicles and at the same time expand their fields of application.” With growing experience the autonomous carsharing fleet could increase its effective radius.

There is still the question of how people at the wheel will come to terms with vehicles that act ever more independently. Experts agree that for the foreseeable future there will be mixed operation: some vehicles will be steered by people, while others drive partly or highly autonomously. Vehicles will enter and exit parking spaces at the push of a button, or learn an oft-travelled route and derive independent actions from it. The urban infrastructure will increasingly exchange data with road users. At the same time, however, there will be older vehicles on the road with far less electronics and intelligence.

To William Whittaker this interaction of man and machine is not a problem. “When we drive on the motorway, we don’t have any direct contact with other drivers even at high speeds. You observe and interpret the behaviour of other road users. This works for all kinds of driving situations without having to draw a distinction between man and machine. Only one thing is for sure: autonomous driving is already a done deal today and will continue to advance steadily”. Via: Daimler