Assisting the Driver: From Android Phones to Android Cars (Google I/O ’17)


Welcome to Google I/O 2017. I'm Patrick Brady from the
Android team at Google, and I’m here with AJ and
Chen to give you guys an update on Android in cars. We have a lot to
cover today, and I see there’s still
people streaming in. But let’s buckle
up and get driving. The automotive industry
is really heating up. I think that’s no secret. We see cars rapidly transforming
from disconnected mechanical machines to powerful
connected computers on wheels. The controls are shifting from
mechanical knobs and dials to software defined interfaces
accessible through touch screens and voice controls. Cars are getting more sensors
and screens, microphones, connectivity, and compute power. We also see a convergence
between the mobile and automotive worlds, creating
exciting opportunities to revolutionize the way
we interact with our cars today, through open platforms,
artificial intelligence, and vibrant
developer ecosystems. Android Auto is an effort
from Google and our partners to bring all of these
advances together and create a safe and seamless experience
for drivers everywhere. It’s still early days, but today
anyone with an Android Auto compatible car can
see the beginnings of this vision at
work with access to their favorite Android apps
right from the car’s display. We continue to expand the number
of Android Auto compatible models available through
great partnerships with over 50 car brands. There are over 300 Android Auto compatible car models available today, including first launches from Chrysler, Renault, and Volvo in the last few months alone. And that's triple the number
from just one year ago. It’s well on its way to
becoming a standard feature in every new car. But, of course, we know that
not everyone has a new car. And so we’ve also
made the Android Auto app available as a standalone
experience on Android phones. This opens up the
platform and ecosystem to many millions of drivers
no matter what kind of car they drive. Whether you’re looking to
stay connected with WhatsApp, Facebook Messenger or Allo,
or pass the time with NPR One and audiobooks from
Audible, or the latest album release on Spotify,
the apps drivers want on the road are available
through Android Auto. And before anyone asks,
Waze is now hitting beta, so we're very excited to welcome
them into Android Auto as well. We’re seeing strong growth on
the developer front, as well. In fact, we’ve seen a
50% increase in auto enabled apps in the Play
Store since last year. But none of this would matter
if we didn’t have users, right? Well, I’m happy to say
that Android Auto usage is growing like crazy, as well. Here’s where we were
at Google I/O 2016. And here’s where we
are today at I/O 2017. We’ve seen a 10x growth in
usage over the last year alone. And the strong
trajectory continues. But this is really
just the beginning, and we’re really more
excited about where Android is going on the road ahead. Last year at I/O we
announced that we’re building automotive features
right into the Android platform to make it a turnkey
solution for cars every bit as much an automotive platform
as it is a mobile platform today. We’re doing this
because we believe open platforms are powerful
enablers of ecosystems. They reduce the
cost of development, and they speed time to market. In a sense, they
accelerate innovation. We’re also doing this
because it enables us to deliver an even
more seamless driving experience with all the
information, controls, and connectivity available
through a single system that’s infused with the driver’s
identity, their digital life, and context. We listened and learned from partners who had already brought Android into automotive, and we worked hard to solve the challenges they hit along the way: a robust and full-featured Bluetooth stack, a standard way to connect with the cabin controls for windows and air conditioning and with car sensors, and a system that boots instantly. At Google I/O 2016, we showed
this all off with Android Nougat powering the entire
in-car experience of a concept Maserati– the windows, the doors,
the air conditioning, the instrument cluster. And as much as we really loved that Maserati (and I'd really like to take it home), it was a concept, and
I think what we want– and what we think you all want– is to see real cars
running modern, Android powered systems. The traction we’re seeing in
the industry is pretty amazing. Just a few short months after
the official Android Nougat release, several of
our partners were demoing Android powered
infotainment systems at CES 2017, including
Chrysler's Android powered Uconnect system. Earlier this week, the
automotive industry heavyweight with a
reputation for building some of the most advanced
systems on the market– Audi– announced that they will
build their next generation infotainment systems on Android. Please join me in
welcoming Alfons Pfaller, Head of Connected Car and
Infotainment at Audi AG. [MUSIC PLAYING] ALFONS PFALLER: Patrick, and our friends from Google, thank you all very, very much for being here, and for the invitation to Google I/O. For Audi, it's the third time that we are here at I/O, and for Audi, Google I/O is really a special event– a great event. We have a very long-term relationship with Google, and together with Patrick and your team, we have been able to launch
a lot of really cool services over the last years. We started with Google
Earth– amazing. [INAUDIBLE], BUI, Street
View, a lot of things. And now the really big
next step is coming. If you look at this picture, we see a brand new Audi interior design. This design is based on best-in-class digital cockpit technology. We see here three absolutely high-end automotive displays in the front, and additional displays in the rear. Based on this digital cockpit, we are able to deliver a really amazing design and a straightforward, excellent user experience. And in this interior
design, we are doing a seamless integration
of embedded Android. And I think that is
really, really a big step. To give a short summary of what's going on in automotive: together with our silicon [INAUDIBLE] partners, we are bringing really high-end computing inside the car. The next step, with our friends from mobile communications, is connecting all cars with high-speed LTE, and very soon also 5G, to the whole cloud ecosystem. Now the point is: with this embedded Android, we are able to provide an open platform for automotive. And this enables
us to invite you– to invite all of you– to participate in automotive
development, and that’s great. So thanks very much,
and looking forward to seeing this concept
on our show car outside. And have [INAUDIBLE]
force of discussion. Thank you very much. Thanks. PATRICK BRADY: Thanks. Thank you Alfons. We couldn’t be more
excited to partner with Audi again to bring
Android into their future cars. But you can’t build an ecosystem
with just one partnership, right? And that’s why I’m happy to say
that one of the industry’s most innovative companies– Volvo– is also putting their
weight behind the Android in-car ecosystem. I’m happy to welcome up
Henrik Green, Senior Vice President of Research and
Development at Volvo Car Group to tell you a bit
about their new system. HENRIK GREEN:
Thank you, Patrick. Good afternoon, and
thank you to our friends at Google for inviting us
here to this great event. Thank you for the
sun, and most of all, thank you for the sunscreen. I haven’t looked at the
mirror yet, so I’m not sure. But I’m definitely
burned on the right side. I mean, we have worked with Google for a number of years on a diverse set of projects, and, of course, the recent Android Auto is a fantastic success of that. But today we're here to
announce that our partnership is going to the next level. We are, together
with the Google team here, going to develop the next
generation of connected car infotainment. And as always for Volvo,
it’s very important for us to create a completely
seamless solution where the user experience is
at the heart of everything, and that we can provide a better
life, and a better experience through the integrated use
of this infotainment system. Now, the design of a UX
system plays, of course, an enormously important
role, but at the heart of it is the operating system. And here we strongly believe
that together with Google, we will develop the next
generation based on the Android platform. So, again, it's up to you out there– you are the developers. You are now the people
and the brain power that will bring this experience
to life in the future cars. So be with us here
and show that we can make the best connected
infotainment system and car experience out there. Thank you so much. [MUSIC PLAYING] PATRICK BRADY: It’s
pretty cool, right? I definitely
encourage you to go check out the demos that
Audi and Volvo are showing in the Android
and IoT booths out there. As good as the cars
look up on the screens, they look even better in person. Sadly, I have to say
we’re not giving out any free cars this
year at I/O, so you’ll have to make do with a t-shirt. But Audi and Volvo are
really true pioneers in automotive electronics
and infotainment, and we couldn’t have
picked better partners to usher in the future
of connected cars. We’re really, really
excited about that. With the Android
Auto app on phones, with 300 Android Auto
compatible car models, and with Android built in
systems hitting the road soon, we’re pushing forward
to revolutionize the way we stay
connected in our cars, creating a safe and seamless
connected experience for drivers everywhere. But a big part of the
driver’s digital life– as Henrik and Alfons said– is the applications
and services they use throughout their digital
life on their other devices. And we need to bring the
ecosystem along for the ride. So I’d like to
welcome up onstage AJ Kimbembe from the Android Auto
UX team to talk about what this all means for developers. AJ KIMBEMBE: Thank you, Patrick. And hi, everyone. My name is AJ, and I am the
lead designer for Android in-car infotainment. Patrick already detailed how we've been hard at work to make Android the best possible infotainment platform. We can plug in to Bluetooth and car controls– such as HVAC, radio, car sensors, and so on. But Android is not
only a great platform, it’s also an entire ecosystem
of apps and services. But first, I would like to talk
about why we are doing this. So this is from our field research. This driver is trying to catch Pokemon in Pokemon Go while at the wheel. And you can see the family crossing in the background. So don't worry, everybody is safe. And again, this is not a stock image. This is our researchers in the car with the drivers out on the road. And if this is happening with our researchers in the vehicle, the gist of it is our internal research shows that people– drivers– want to stay connected, and they may resort to using their phone to do so, and that can lead to situations that are fundamentally unsafe. We want to help drivers stay connected with interfaces that are safe and predictable. So I'd like to [INAUDIBLE]
three things here– why it's complex to design services for the car, what Android is doing to help as a platform, and finally, the various ways that you can integrate with Android Auto as a developer. OK, what makes designing apps and services for the car specific, exciting, but sometimes challenging? Android is a platform for infotainment across a wide range of possible surfaces with different shapes, aspect ratios, densities, and fidelities. A car interior is a
meticulously crafted experience. Everything is optimized
for safety, of course, but also to provide
a specific experience for each brand and
each price point. This space can be big, small, wide, or tall, [INAUDIBLE] close or far from the driver. Android is a platform that
supports all of these. But along with the
outputs, the input methods are multiple, as well. Touchscreens are more and more common in cars, but there are also D-pads, touchpads, rotary inputs, and many more. And some systems are even
hybrids of multiple input mechanism, and
Android is a platform that supports that, as well. Safety– safety, of
course, is our priority. We are working
really, really hard to reduce driver distraction,
visual distraction, manual distraction,
cognitive distraction. Our internal user research team
works closely and interactively with the broader team to
strengthen understanding of in-vehicle
driver interaction, and also to guide our feature development. Our team conducts
foundational research– studies, needs assessments, usability evaluations, quantitative
working relationship with [INAUDIBLE] organization
to stay up to date with regulations and standards. And we do all that so our
partners and developers don’t have to do it. So you can see that our driving
simulator is an expensive setup that lets us, at least,
simulate driving conditions. We use it for many
things, including the detection response task, which is one way that we use to evaluate cognitive load. We also use it for eye tracking. You can see in the picture Katie
with the apparatus on her face. And we also use the
occlusion glasses– a little tool that we use to simulate a driver focusing on the road and glancing at the UI: they close and they open. What I would like to say is,
it’s a complex and expensive endeavor. And again, we are doing all
that so our partners and developers don't have to worry
is doing to add developers. So, like we said
earlier, we know that drivers want to
stay connected, and want to have them with interface
that are safe and predictable, but the content– the context and the
[INAUDIBLE] free nature of modern digital services
can make that a challenge. The content of infotainment
used to be more predictable. Now with Android
we are designing a system that is ready to
safely bring into cost services that we know autos use on other
platforms, and that we like, as well. Android is already
a great platform for more digital services– on phones, on tablets, on
TVs, on wear-ables, and now we are bringing that
into cars, as well. The key to adapt
quickly and efficiently to the wide diversity of
human machine interfaces that we find in the
industry is modularity. Android is abstracting
away the inputs and outputs so developers don’t have
to worry about screen size or shapes, or whether
something is touch or rotary. We take care of that. Android also provides
an in-car platform so developers can continue to innovate and focus on building innovative apps without having to worry about all the details. One way Android does the heavy lifting for developers is through templates. We've designed a system that enables automotive-grade templates to safely and effectively perform the most common driving tasks. Those templates are designed to conform to a wide variety of regulatory concerns. So, for instance, Spotify doesn't have to worry about what the target size is for some screens, or how many items can be displayed on one screen in a given country or region. We take care of that. And another way we
make Android in the car a predictable experience is with the way apps are organized. Although powerful, the app grid is not the most suitable metaphor for the automotive space. We identified the key areas for the car and made it easy to switch between them. And it's not only in our own Android Auto platform. You can see this today, for example, in the best-in-class systems designed by our partners– Audi and Volvo. And you can check them out at the IoT and Android booths right outside. Here, the most
commonly used functions are elevated so drivers can quickly and safely switch between them without having to make their way through an app grid. We also want the experience to feel personal and tailored. We work hard to make sure that Android understands when a driver is in the car and can optimize itself to safely bring up the driver's apps and services. Android already has multi-user support built in– it's been there a few years now. And we are fully
leveraging this framework in the vehicle to enable
frictionless personalization. We want to get to
the point where a driver can step
into a rental car and have an experience that
is personalized and ready to drive. Now onto the last portion– what are the different ways
that you can bring your Apple service into cars. Android is a common platform
that millions of Americans already know how to build for. If you’re already invested
in Android system, we want to make
things easy for you. Android, as a platform,
does the heavy lifting, and provides a set
of tools and APIs to integrate very
easily with the car. Our goal is to have one single
platform across every car. Let's look at that in more detail. We want to be respectful of the time and resources that are required to bring apps and services to the car. That's why there are multiple ways you can integrate with Android Auto, depending on the nature of your service, of course. For a few of those apps, it will require a lot more effort– a lot more design time, and possibly more cost, as well– but for most apps, less effort and less time will be involved. What we call full
apps are services that need full control of what is displayed on the screen– I'll take the example here, in the [INAUDIBLE] the Volvo S90 that is on demo. Google Maps is using dynamic, scrollable, pinchable maps. Bringing a full app experience like this [INAUDIBLE] a huge amount of human and technical resources. You have to think about target sizes, type sizes, day and night modes, and focus-based systems– such as rotary inputs and touchpads. All of these then require full testing capabilities to ensure that the resulting app is up to formal standards regarding driver distraction. Waze is another example of an application that has full pixel control. As you can see here, it is presented in its Android Auto for your car screen version. Not all the apps are
mapping applications, though. For audio apps, we've created a set of templates that offer all the necessary functionality in a simple, predictable, and safe manner. Here we take the example of the Audi Q8 sport concept that is on demo, as well. You can see how Audi elegantly implemented the media session API that apps can use to bring their media experience into cars. The templates are working
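As a rough sketch of what such a media integration looks like (this is the generic 2017-era support-library pattern, not Audi's or any partner's actual code; the service and item names are invented):

```kotlin
import android.os.Bundle
import android.support.v4.media.MediaBrowserCompat
import android.support.v4.media.MediaBrowserServiceCompat
import android.support.v4.media.MediaDescriptionCompat

// Minimal sketch of the media browse integration. Android Auto connects
// to this service and renders the returned hierarchy in the car's
// templated media UI.
class MyMediaService : MediaBrowserServiceCompat() {

    // Called when a client (such as Android Auto) connects; returning a
    // non-null root grants it access to the browse tree.
    override fun onGetRoot(
        clientPackageName: String, clientUid: Int, rootHints: Bundle?
    ): BrowserRoot? = BrowserRoot("root", null)

    // Called to populate one level of the browse tree.
    override fun onLoadChildren(
        parentId: String,
        result: Result<List<MediaBrowserCompat.MediaItem>>
    ) {
        val episode = MediaDescriptionCompat.Builder()
            .setMediaId("episode_1")
            .setTitle("Episode 1")
            .build()
        result.sendResult(listOf(
            MediaBrowserCompat.MediaItem(
                episode, MediaBrowserCompat.MediaItem.FLAG_PLAYABLE)))
    }
}
```

Because the car UI draws the list itself, the app never has to reason about screen sizes, input devices, or distraction rules.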
across all surfaces, screen densities, and inputs, and they are rigorously tested for driver distraction. And the great thing is, all of the applications that already work with Android Auto for your phone's screen or for your car screen will work here as well, without extra work from developers. And you can also see
how Audi implemented the media browse API, here presented with Pocket Casts, so you can find your content easily and safely, [INAUDIBLE] and
right in place in the car. Messaging is one of the major use cases in the car– for SMS, but also for apps such as WhatsApp, Facebook Messenger, or Allo. Integration with Auto is simple and based on Android notifications. All that is necessary is to integrate with our CarExtender object for a safe, voice-centric messaging experience for your users. This is another example of
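Concretely, that CarExtender integration is small. The following is a minimal sketch against the 2017-era support library; the receiver classes and reply key are hypothetical names for your app's own components, not APIs from the talk:

```kotlin
import android.app.Notification
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.support.v4.app.NotificationCompat
import android.support.v4.app.RemoteInput

// Sketch: surface an unread conversation to Android Auto through
// NotificationCompat.CarExtender. MessageReadReceiver and
// MessageReplyReceiver are hypothetical BroadcastReceivers in your app.
fun buildCarNotification(ctx: Context, sender: String, text: String): Notification {
    val readPi = PendingIntent.getBroadcast(
        ctx, 0, Intent(ctx, MessageReadReceiver::class.java),
        PendingIntent.FLAG_UPDATE_CURRENT)
    val replyPi = PendingIntent.getBroadcast(
        ctx, 1, Intent(ctx, MessageReplyReceiver::class.java),
        PendingIntent.FLAG_UPDATE_CURRENT)
    val voiceReply = RemoteInput.Builder("extra_voice_reply")
        .setLabel("Reply by voice").build()

    val conversation = NotificationCompat.CarExtender.UnreadConversation
        .Builder(sender)
        .addMessage(text)                              // latest message(s)
        .setLatestTimestamp(System.currentTimeMillis())
        .setReadPendingIntent(readPi)                  // fired when Auto marks it read
        .setReplyAction(replyPi, voiceReply)           // voice reply plumbing
        .build()

    return NotificationCompat.Builder(ctx)
        .setSmallIcon(android.R.drawable.sym_action_chat)
        .setContentTitle(sender)
        .setContentText(text)
        .extend(NotificationCompat.CarExtender()
            .setUnreadConversation(conversation))
        .build()
}
```

Android Auto then reads the message aloud and captures the spoken reply through the RemoteInput, so the driver never touches the screen.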
the messaging integration, implemented with [INAUDIBLE] one. But here you can see a WhatsApp notification showing up. We've made these work great for safe and practical use on a phone that is mounted on the dashboard of your car. And the great thing is that [INAUDIBLE] for WhatsApp we can make it work on different form factors– Android Auto for your car screen, and that works for touch, but
also focus-based systems, such as rotary inputs. Another way to integrate with Auto is through the Google Assistant. Assistant-based apps and
services are inherently cross-platform, and their
voice centric nature makes them ideally
suited for in-car use. The next speaker is going
to talk more about that, so I will not explain it in more detail here. So that's what we
are working hard on to enable a healthy
ecosystem with Android. We work hard to tuck away the complexity and provide developers
and car manufacturers with a robust, up-to-date,
and secure platform. Now we’d like to welcome Chen
Xiao to talk more about how you can integrate with the Google Assistant on Android Auto. Thank you. CHEN XIAO: Thank you, AJ. Hello, I'm Chen,
Technical Lead and Manager on the Google Assistant
on Android Auto Team, and I’m here to tell you
about how you, the developers, can engage with
users in the car. See if this scenario
sounds familiar. You’re running late for work
and you rush out the door. On your drive, you
think to yourself, did I actually lock
the front door? Wouldn’t it be awesome if you
could talk to your car and say, OK Google, did I lock the door? Or maybe on that same
commute, you’re thinking, what meetings am I
actually rushing to? Or, I better not forget to
buy milk before I go home. It’d be great if you could
talk to your car and say, what’s my next meeting? Or, remind me to buy
milk before I go home, these are all things
that the Google Assistant can help you with. And more importantly,
you as a developer, can make your services
available via the assistant. As AJ has mentioned
before, voice input is critical in the car. For safety reasons,
we want to make sure the users keep
their eyes on the road and hands on the wheel. While the users are
on this journey, we want to make
sure the user stays connected, they can get things
done, and stay entertained. The Google Assistant can
provide that intelligent and conversational voice as we
move from a mobile first to an AI first world. We want it to be
the ultimate copilot for the user in the car. Users are spending
precious time in the car, and we’re making
the assistant help the user complete the mission
and make the journey fun. With the feature called My Day,
you can say, OK Google, tell me about my day. And the assistant will tell you
about upcoming calendar events, reminders, commute traffic, so
you can hit the ground running when you get to work. On the road, your assistant can
tell you about traffic ahead, or show you gas stations if
you need to make a pit stop. If you’re bored, you
can ask the assistant to play that favorite
song you’ve downloaded or listen to a variety
of music and podcasts. We know that staying connected
is very important for users on the road, so we make
calling and messaging easy with voice actions like,
call mom, send a WhatsApp message, or read my messages. Many of these
experiences are only possible because
developers like you have implemented the right APIs. So thank you for helping
us create that great user experience. Since the assistant
is cross surface, it will help the users get
things done wherever they are– in the home, and on the go. Patrick and AJ have already
talked about how messaging and media apps can
integrate with Android Auto for a seamless and consistent
user experience in the car. Those are the templated
and notification apps in AJ’s pyramid
that want to show a UI. But there are tons of
other apps that don’t fall under these categories. How can you, as a developer,
get your experience in the car without having to deal with
the complexity of building a full automotive app and having
to accommodate different screen sizes, various
inputs, and outputs? If your service can be
voice centric in the car, and you want a
lightweight integration for reaching these users, then the Google Assistant
then the good Google Assistant is a good option. So how do you build for
the Google Assistant? Actions on Google
is the platform for developers to build
for the Google Assistant. The platform allows developers
to reach users wherever the assistant is available. The platform uses
cloud execution so users don’t need
to preinstall any app. When they ask for
your service by name, you’ll be connected immediately. Many developers are already
building for Actions on Google. When building for
Actions on Google, there are three key
pieces that need to be defined by the developer– invocation triggers,
dialog, and fulfillment. Let me illustrate
with an example. I like coding, and I
like fortune cookies. So I want to build
a service that gives me a daily dose of
coding fortune cookies. So the first step is I need to
define my invocation trigger. How does the user
connect to my service? What do they
actually have to say? In this case, I’ll
make it simple. Whenever the user says, talk
to Coding Fortune Cookie, then the Google Assistant will
connect the user to my service. The next step is dialogues. So once the user is
connected to my service, how do I talk to them? What kind of
information am I asking, and what kind of
responses am I expecting? What are valid responses? In this case, I'm going to
make it a simple question– do they want a Java or C++ fortune cookie? So there are only two valid
responses in this case. So let’s suppose that
the user picked Java. Now we can move on to the third
step, which is fulfillment. For fulfillment, I have
all the information that I need, so how do
I go about committing that transaction or executing
that action on behalf of the user? So in this case,
I’m going to look in my bank of Java
fortunes, and I'm going to pick one out for today– a random one. Today's fortune says: Confucius says, for true happiness, prefer interfaces over abstract classes. That's one I actually use every day. That's a high-level view of Actions on Google. The Google Assistant
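The three pieces Chen describes can be modeled in plain Kotlin. This is only an illustrative sketch of the control flow; the real platform defines triggers and dialogs in an action package and runs fulfillment in a cloud webhook, and the fortune texts here are made up:

```kotlin
// Plain-Kotlin model of the fortune cookie agent's three pieces:
// invocation trigger, dialog, and fulfillment.
val fortunes = mapOf(
    "java" to listOf("Confucius says: for true happiness, prefer interfaces over abstract classes."),
    "c++" to listOf("Confucius says: he who forgets to delete will be haunted by leaks.")
)

// 1. Invocation trigger: does this utterance connect the user to the service?
fun isInvocation(utterance: String): Boolean =
    utterance.trim().lowercase() == "talk to coding fortune cookie"

// 2. Dialog: one question with exactly two valid responses.
fun prompt(): String = "Do you want a Java or C++ fortune cookie?"

fun parseChoice(answer: String): String? = when (answer.trim().lowercase()) {
    "java" -> "java"
    "c++", "cpp" -> "c++"
    else -> null  // invalid; a real agent would re-prompt
}

// 3. Fulfillment: execute the action for the validated choice.
fun fulfill(choice: String): String = fortunes.getValue(choice).random()
```

In the real platform the same shape holds: the trigger and the expected responses are declared up front, and only fulfillment runs your code.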
and Actions on Google are coming to Android Auto and
will open up a wide possibility of use cases. Agents and services that are
built on top of the platform will transfer over. If you think that users will
interact with your agent or service in the
car, it’s a good idea to start thinking
about, what is the user experience you want to build
for that special context? One car-specific consideration
is safety and driver distraction. How can you build
a user experience that both minimizes driver
distraction and allows users to get things done
and stay entertained? Let’s think about
some potential use cases for engaging
with users in the car. What is possible here? One idea is reservations. It’s a Friday night,
and I’m meeting up with my husband for dinner. We pick out a restaurant,
and on the way there, I would love to be
able to say, book a table– and have that table waiting
for me by the time I get there. What would be even more
awesome and magical is if the Google Assistant
and that third party service can suggest a good
reservation time that takes into account my
location, traffic, and maybe parking availability so I
make a reservation for a time that I can definitely make. Another idea is food ordering. I’m working late on a weeknight
and I’m too lazy to cook. On my way home, I
would love to be able to put in an order
at my favorite restaurant and have that food delivered
and arriving at my home by the time I get there. Saves on time, saves on cooking. Like I mentioned
earlier, how many times have you driven
away from your house and can’t remember
whether you’ve actually locked the front door? It’d be very cool if you could
say, OK Google, check my door. Or, did I lock my door? Just lock it. Another one is lights. I’m driving home and I actually
hate walking into a dark house because I’m actually
afraid of the dark. It would be great if from
the car I could just say, turn on my living room lights. Or turn on my
dining room lights, and I never have to walk
into a dark house again. Games and entertainment–
how many times have you been on long
road trips with kids where you struggle to find a
way to keep them entertained? Wouldn’t it be awesome if you
could say, OK Google, entertain my kids and have
games that are age appropriate for
different kids so they can play whatever they want? They stay entertained and
I can focus on the road. These are just some ideas
on ways developers can engage with users in the car. I look forward to what creative
ideas you come up with. Please check out the
Actions on Google site where there are actually
code samples and tools to help you build the
dialogue, and triggers, and fulfillment steps
so that makes it easier to bring up your service. We also have demos at
the IoT and Android domes where we'll actually be
showcasing some of the use cases that I just talked about,
including controlling Hue lights and August door locks
directly from inside the car. Thank you, and back to Patrick. PATRICK BRADY: Thanks, Chen. It’s really exciting to see
a new way for developers to easily extend their
services into cars. With conversational voice,
powered by the Google Assistant, we hope to see many
new types of apps and use cases by next I/O. We
also heard from AJ earlier on how
media and messaging apps can extend into the
car through a scalable way with Android Auto templates
and notification extensions. Android Auto and
the Google Assistant abstract most of the complexity
of automotive development so that you can just worry
about building a great app and experience for your drivers. This model has helped
Android Auto grow to be the largest
in-car ecosystem today, and it’s about to get
a whole lot bigger with the Google Assistant. With 10x growth in users over
the last year, and new partners like Audi and Volvo
building Android directly into their cars, we
know that developers are going to have
plenty of eager drivers out there who want to stay
connected on the road. So we’re excited to continue
this road ahead with all of you to build the future of
automotive infotainment and create a safe and seamless
experience for drivers everywhere. Thanks, and enjoy
the rest of the show.
