
Look out for Ghost in the Fast Lane!

What's new at Ghost: scaling highway testing, integrating radar, and taking the wraps off our new Dallas office!

By Matt Kixmoeller

October 31, 2022


It’s been an exciting few months at Ghost as we accelerate toward our goal of enabling auto OEMs to deliver L4 highway autonomy that scales. Let’s recap what the fall has brought:

  • We transitioned this summer from the test track to highway validation of our technology, and now drive roughly 5,000 miles per month autonomously on Bay Area freeways. While these drives are monitored by test and safety drivers, the Ghost Autonomy Engine is executing fully autonomous driving at highway speeds, handling traffic jams, stop-and-go traffic, varying cruising speeds, cut-ins, and the myriad other complex scenarios that present themselves on the open road.
  • We’ve grown our test fleet, which is sporting a fresh new look (see above!). You may be asking: where’s the “wedding cake” of sensors that sits atop most AV development and test vehicles? Ghost sensors (surround stereo cameras plus radar) are designed to be easily and beautifully integrated by OEMs into any vehicle. And our Qualcomm-based compute platform packages into almost no space, leaving the trunk empty for cargo and gear! (You can see the Ghost Driving Computer hiding at the top in the picture below…)
  • Ghost now drives with multimodal perception. We’ve integrated our KineticFlow visual neural network with HD radar perception to provide scene and lane understanding plus redundant, precise distance and velocity measurement for objects and actors, all without the aid of HD maps. The shot below, for example, is an “inside Ghost” view taken from a recent drive on US 101.

There’s a lot going on in this view, so let’s break it down. At the top you can see the driving scene out the front window, showing three vehicles in the surrounding lanes. The bottom-left pane visualizes how Ghost “sees” through its two front-mounted stereo cameras (you can see our front camera module affixed to the windshield, just below the rearview mirror) and the resulting per-pixel disparity map generated by the KineticFlow neural network. The bottom-right pane shows the detected center of the lane we’re following (green line), as well as the measured distances of the relevant surrounding vehicles. This information is projected into the top-down lane view at the right, showing each vehicle’s distance and velocity relative to our ego position. And finally, the numbers at the bottom left show the vehicle’s steering and accelerator/brake control state, speed, autonomy state (self-driving in this case), and whether the safety driver is providing any input to override the vehicle.
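For those who want to geek out on that disparity map: converting per-pixel disparity into metric depth is classic stereo geometry (depth = focal length × baseline / disparity). Below is a minimal Python sketch of that relationship. The focal length and camera baseline are illustrative placeholders, not Ghost’s actual calibration, and a production pipeline would also handle rectification and sub-pixel disparity with far more care.

```python
import numpy as np

# Illustrative stereo parameters (assumptions, not Ghost's actual calibration):
FOCAL_LENGTH_PX = 1400.0   # focal length, in pixels
BASELINE_M = 0.30          # distance between the two stereo cameras, in meters

def disparity_to_depth(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a per-pixel disparity map to metric depth.

    Standard pinhole stereo geometry: depth = f * B / disparity.
    Pixels with zero (invalid) disparity are mapped to infinity.
    """
    depth = np.full_like(disparity_px, np.inf, dtype=np.float64)
    valid = disparity_px > 0
    depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity_px[valid]
    return depth

# Example: a vehicle whose pixels show ~10 px of disparity would sit
# roughly 1400 * 0.30 / 10 = 42 m ahead.
demo = np.array([[10.0, 5.0], [0.0, 20.0]])
print(disparity_to_depth(demo))
```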
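And on the “redundant” part of redundant distance/velocity measurement: the value of two independent sensing paths is that each can sanity-check the other. Here’s a toy sketch of one common pattern, a consistency gate followed by inverse-variance fusion; it’s purely illustrative and not a description of how the Ghost Autonomy Engine actually combines camera and radar estimates.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RangeEstimate:
    distance_m: float  # estimated distance to the target vehicle
    sigma_m: float     # 1-sigma uncertainty of that estimate

def fuse_ranges(camera: RangeEstimate, radar: RangeEstimate,
                gate_sigma: float = 3.0) -> Optional[RangeEstimate]:
    """Cross-check two independent range estimates, then fuse them.

    If the estimates disagree by more than `gate_sigma` combined standard
    deviations, return None so the caller can fall back to the more trusted
    sensor. Otherwise combine them with inverse-variance weighting.
    """
    combined_sigma = (camera.sigma_m**2 + radar.sigma_m**2) ** 0.5
    if abs(camera.distance_m - radar.distance_m) > gate_sigma * combined_sigma:
        return None  # inconsistent measurements; flag for fallback handling
    w_cam = 1.0 / camera.sigma_m**2
    w_rad = 1.0 / radar.sigma_m**2
    fused = (w_cam * camera.distance_m + w_rad * radar.distance_m) / (w_cam + w_rad)
    return RangeEstimate(fused, (1.0 / (w_cam + w_rad)) ** 0.5)

# Example: camera estimates 42 m ± 2 m, radar estimates 41.2 m ± 0.5 m.
# The radar's tighter uncertainty dominates the fused result (~41.25 m).
print(fuse_ranges(RangeEstimate(42.0, 2.0), RangeEstimate(41.2, 0.5)))
```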

  • We’ve taken the wraps off our new center for radar development: Dallas. Over the past year we’ve assembled a team of world-class radar experts there, mostly hailing from aerospace, defense, and autonomy backgrounds. The upgraded site and the radar effort are led by Dr. Matt Markel, and if you’d like to really geek out, Matt has quite literally written the book on radar in autonomous driving. And speaking of the office, check out the new digs below, complete with a full suite of design and test capabilities, including multiple anechoic chambers with robots for radar articulation.

That’s it for now – stay tuned over the coming weeks as we start to show off more videos and views of Ghost in action. And if you’re in the Bay Area (or, very soon, Detroit!), we’ll see you out there cruising.
