Hands-On with AWS DeepRacer Evo Autonomous Race Car!

Hey everybody, it's Norm from Tested, and I'm in a futuristic office park that turns out to be a headquarters for Amazon. In one of these office buildings is the home of AWS DeepRacer, the platform for their autonomous race car system, developed to teach developers about machine learning and reinforcement learning. They've just announced a new DeepRacer Evo system as well as upgrades to the software platform, and I got a chance to check both out. Let's go take a look.
All right, I'm really excited to be checking out the DeepRacer here with Sahika Genc, applied scientist on the project. This is the racer; tell me about how it was designed.

It was designed as part of an effort to teach machine learning to people out there who are interested; we really wanted to put that hands-on experience in front of them. What we were trying to figure out was how to create educational content around machine learning that people could apply hands-on using a vehicle. The vehicle was the essential core for us: it makes the connection between what you're coding and what you're observing. It's a totally hands-on product that physicalizes all the modeling that can be done with machine learning.
So not only is it a car, you designed it to race: it needs to drive autonomously on a track. Tell me about the parameters of that. What are your users configuring here? What input do you get from the sensors, and what are they tweaking to have it run autonomously?

In the console there are many things you can adjust, and one of the things we kept hearing from academia and from our developers was: be careful what you wish for. One of the most important and critical parts is the reward function. Users can change the reward function, and they can get parameters from the simulation such as the speed, the steering, whether a turn is coming up, and things like: am I staying within the track, within the white lines? Am I steering off to the left of the yellow line? All of those are exposed, and then you can design your own reward function.
Right. So the idea is that it's not only the one camera here doing computer vision, analyzing the scene on a frame-by-frame basis; there's also telemetry. It's tracking its own speed, where its wheels are turned, and all of that is simulated in a virtual environment.

Exactly. The other key component, as we moved to the device, was making the inference fast enough. This is where the racing comes into the picture, and it's what we see in the industry: making machine learning work in the real world really requires you to move fast. So one thing we did was use an optimizer. As the neural network makes inferences frame by frame, we put an optimizer on the neural network so it makes those decisions fast and can act as quickly as possible. We use Intel OpenVINO to do that.
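To see why per-frame inference speed matters, here is a small illustrative timing loop; the model is a stub and nothing here is the actual DeepRacer code. At, say, 15 frames per second, the whole perceive-and-decide cycle has a budget of roughly 66 ms per frame:

```python
import time

FPS = 15
FRAME_BUDGET_S = 1.0 / FPS  # ~0.066 s to process a frame and pick an action

def infer_stub(frame):
    """Stand-in for the optimized neural network; returns (steering, throttle)."""
    return (0.0, 0.5)

def drive_loop(frames):
    """Run inference frame by frame and count how many frames met the budget."""
    on_time = 0
    for frame in frames:
        start = time.perf_counter()
        steering, throttle = infer_stub(frame)
        if time.perf_counter() - start <= FRAME_BUDGET_S:
            on_time += 1
    return on_time
```

If the network's forward pass regularly blows this budget, the car is effectively steering on stale information, which is the problem the inference optimizer addresses.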
Can we take a look under the hood?

Sure, yeah. You can see how the whole thing is designed. It's obviously inspired by a scale model car, it has wheels, but it's really a computer system.

Exactly. What you see here is that the top of the car is our compute environment, while the bottom is a very low-cost chassis. We actually experimented with different scales, and what we found is that we wanted to go smaller for the racing experience, but we also wanted to give people the opportunity to try different tracks. That's why we have rubber wheels: you can use carpet, you can use a hard floor, so if you're racing your opponents in the office, it's as ready to go and as versatile as you can get.

There are so many variables in the physics of the real world: the friction of the track, the reflectivity of the material. All of that needs to be accounted for.
Yeah, it's actually funny that you mention that. Even the springs, the suspension. In our earlier models, one of our cars came with a different suspension, and while we were testing, the car started bobbing. I thought, this car is so excited to be on the track! No, it just had the wrong springs, and we were wondering why the model wasn't transferring properly. So you really need a high-fidelity simulation environment. We've now modeled the steering too; in our console we expose 30 degrees as the maximum, because that's what the vehicle can do. I can also take this off and show you.
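For context on that steering limit: training in the console is typically set up over a discrete action space of steering and speed pairs. A toy sketch of that idea follows; the specific angles and speeds are illustrative, not the console's defaults:

```python
# Build a simple discrete action space: every combination of steering angle
# (capped at +/-30 degrees, matching the physical steering limit) and speed.
MAX_STEERING_DEG = 30

def build_action_space(steering_angles, speeds):
    """Enumerate all steering/speed combinations as the agent's action set."""
    actions = []
    for angle in steering_angles:
        assert abs(angle) <= MAX_STEERING_DEG, "beyond what the vehicle can do"
        for speed in speeds:
            actions.append({"steering_angle": angle, "speed": speed})
    return actions

actions = build_action_space([-30, -15, 0, 15, 30], [0.5, 1.0])
# 5 steering angles x 2 speeds = 10 discrete actions to choose from each frame
```

Keeping the simulated angles within what the physical servo can actually reach is part of why the model transfers from simulation to the car.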
Yeah, let's look at the computer system, because you can see the heat sink there. It's running a computer, obviously processing the visuals from the single-camera system. You get a live view on your control panel as well, but you're only adjusting speed there; this is as truly autonomous as can be.

Yes, it's actually learning both steering and speed. We wanted to give racers the opportunity to fine-tune their cars, so they can calibrate them the way they want based on their model. For example, they can calibrate it such that it decides to go a little slower on the curves.

And that's a big difference between the virtual environment and the real world. As you drive a car through many laps, things will change; there will be wear on the physical hardware. How robust is it?
Yes. We actually inject noise in the simulation environment. What we do is inject noise into the steering and speed signals as they're sent, and that accommodates some of the variation in the mechanical components. The neural network in the simulation learns to deal with some of that, which creates robustness when the model moves from simulation to the real world.
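A hedged sketch of that idea (my own illustration, not AWS's code): perturb each commanded action with small Gaussian noise before it reaches the simulated car, so the policy cannot overfit to perfect actuation:

```python
import random

def noisy_action(steering_deg, speed, steering_sigma=1.5, speed_sigma=0.05, rng=random):
    """Add Gaussian actuation noise to a (steering, speed) command.

    This mimics slop in the physical steering linkage and drivetrain, so a
    policy trained in simulation tolerates imperfect hardware.
    """
    return (steering_deg + rng.gauss(0.0, steering_sigma),
            speed + rng.gauss(0.0, speed_sigma))

rng = random.Random(0)  # seeded for reproducibility
jittered = noisy_action(10.0, 1.0, rng=rng)
# jittered is close to (10.0, 1.0) but not exactly equal
```

The sigma values above are placeholders; in practice they would be tuned to match the measured variation of the physical components.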
There's a funny story about the heat sink. We didn't have a really big heat sink initially, and then on the next generation of cars we were developing, the new heat sink came in. It turned out it was really heavy, and none of our models were working, so we had to go back and change the mass in the simulation.

So mass, friction, Newtonian physics. That's a lot of mass, too, and it's something electric vehicles and autonomous cars will have to contend with, because at this scale it's a big portion of the whole.
But you do have a sensor, this camera. You chose this one obviously for cost, but also, again, to test the robustness of the system? It didn't need to be ultra-wide or ultra-high-fidelity. Did some of those limitations help the model?

They did, yes. We actually reduced the resolution on the camera for the learning. We were thinking we were going to do 640 by 480, but it turned out we could get by with just 160 by 120 pixels, and we actually do grayscale. It turns out the models are more robust to changes in texture that way, and the convolutional neural network really picks up what matters in the image, even at the lower resolution.
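As an illustration of that preprocessing step (a sketch, not the actual DeepRacer pipeline): convert each RGB pixel to a single luminance value, which cuts the input the network has to digest to one channel instead of three:

```python
def to_grayscale(image):
    """Convert an RGB image (nested lists of (r, g, b) tuples) to grayscale.

    Uses the standard Rec. 601 luma weights; a 160x120 grayscale frame carries
    one channel instead of three, shrinking the network's input by two thirds.
    """
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in image]

frame = [[(255, 255, 255), (0, 0, 0)],
         [(255, 0, 0), (0, 0, 255)]]
gray = to_grayscale(frame)
# white -> 255, black -> 0; pure red and blue map to their luma weights
```

Dropping color also discards exactly the texture and lighting variation the scientist describes, which is one intuition for why the grayscale models transferred more robustly.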
That's very cool. Obviously this platform was used for the first season of the DeepRacer League, but what you're announcing at re:Invent is an upgrade to it. Can we take a look?

Yes. We have a new experience coming up; we're previewing it at re:Invent, and what we're planning is that you'll now have the option to add more sensors to your car. We think this is really critical, because one of the important things we learned from our developers is that they usually have to work with various types of sensors, and those sensors and inputs change. It started when we saw people out there trying to race multiple DeepRacers on one track, and we thought, this looks really fun, maybe we should try it. It turned out that the theory of it is really difficult, and the literature on the topic is still catching up: as more and more systems come out, you need to coordinate many different components, and you have this multitude of sensors with different modalities. So we thought this was a big challenge that would be really fun to work on and make happen.

It really does sound like a challenge.
Even though you're only adding a second camera and lidar, it's orders of magnitude more complex, because all that sensor fusion then needs to be accounted for in the model, again on a frame-by-frame basis. But it's what people wanted; they were already putting these DeepRacers next to each other. So this lets you go head-to-head.

Yes. Basically, with stereo cameras alone the car can have depth perception, and that means that in a head-to-head race it can see how far it is from the other car and navigate through. From that raw data the models train, and they develop near-field awareness, so the car isn't just looking at the contrast of the lines and staying within them as the most basic task.
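The depth part of that is classic stereo geometry: distance falls out of the disparity between the two cameras' views. As a sketch (the focal length and baseline below are made-up numbers, not the Evo's actual optics):

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.06):
    """Estimate distance to a point from its stereo disparity.

    depth = focal_length * baseline / disparity. The focal length (pixels)
    and camera baseline (metres) here are illustrative placeholders.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A larger disparity means the object (say, the car ahead) is closer.
near = depth_from_disparity(84.0)   # 700 * 0.06 / 84 = 0.5 m
far = depth_from_disparity(21.0)    # 700 * 0.06 / 21 = 2.0 m
```

In practice the network consumes the two raw frames rather than an explicit depth map, but this is the geometric information the second camera makes available.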
Moving obstacles are one thing, and I know stereo lets you navigate around static obstacles, but moving obstacles, other DeepRacers, that seems very complex.

Yes, it's very complicated, because you have to account for, well, people in the Navy will know this better: when you have two battleships, each has to predict where the other is going to be in order to make its maneuver. It's very similar here. The other car is moving, and your car has to learn that it will follow a certain trajectory and try to avoid that trajectory. One of the things we figured out very quickly was that we could do these maneuvers and predict where the opponent car is moving using a stereo camera, but then we started seeing lots of crashes, and we had to analyze why they were happening. You see the car in front of you, you're going really fast, and you steer around it; but we have a multi-objective reward function, and even as the car is passing the moving car, it's coming out of a corner, so it really needs to slow down.
A blind spot.

Exactly. What we realized is that the car actually needs to look back in order to decide: there's a car coming up behind me, another racer, the competition, another collaborator in a multi-agent system. It really needs to make a decision: there's an opponent coming from behind and a curve in front of me, should I move fast or should I slow down? So we actually mounted the lidar slightly tilted to provide maximum coverage.

People are going to get access to this, and it's going to be a whole other season, with obstacles on the tracks. What does this look like going forward? More sensors? Things have changed and surprised you in setting this program up; where do you see it going?
In the platform, in the console, you'll see that you're going to learn how to race against another car by training against a kind of very nasty racer, so you're really preparing to do well in the real world. We're also going to give people the opportunity to pick their level: you can adjust the car you're racing against, which we call the bot car, and you'll see different versions of it coming. Some of them will change lanes, and some of them may try to block you. Over the course of the year you'll see varying levels of opponents to race against. Time to get ready for the next challenge.

That's awesome. Thank you so much for sharing this new version, the DeepRacer Evo, with us.

Thank you, and thank you for having me.

About the Author: Michael Flood


  2. What a joke. Why should I be impressed by this? These cars are barely moving. Is Amazon really wasting money on this shit?

  3. The modern neural network with an $80 billion valuation is so slow compared to the $30 line followers we had 20 years ago, it's more like AWS deep equity.

  4. bleh… tacky amazon trying to look like a tech innovator while being a human abuser. I cant stand tech tubers doing videos for clicks and somehow turning a blind eye to shitty things companies like that do… amazon should be a write off… but guess not for norm… brought to you by peanut butter m&m's.

  5. the placement of the lidar is a bit odd. It looks like the body covers would occlude most of the road in front of the racer.

  6. Not only are they not complying with the FTC's guidelines for announcing this is a sponsored vid but they cant even level their audio! Literally would take a few mins to do these things, wow

  7. https://www.youtube.com/watch?v=X2h_yHnTwVw
    Hi guys, thought you might be interested in this video. Colin is a legend in the UK! Check out his hoverbike video. Awesome!

    Geoff Govey

  8. Uh-huh. Hm. So, looks like they haven’t been able to get very slow RC cars to drive remotely (Punintentional) safely on the clearest and plainest possible type of road. Is that… cool? 😐 This looks like shit you could do with LEGO Mindstorms.

  9. I told Wilbur and I told Orville, that thing will never fly. Needs a few more laps– I wish I could drive all over the road and still get the big applause.
    Robot scientist has a very engaging laugh. Nice.
