Monday, November 9, 2015

Bringing a bird’s eye view to a car near you

QNX and TI team up to enable surround-view systems in mass-volume vehicles

Paul Leroux
Uh-oh. You are 10 minutes late for your appointment and can’t find a place to park. At long last, a space opens up, but sure enough, it’s the parking spot from hell: cramped, hard to access, with almost no room to maneuver.

Fortunately, you’ve got this covered. You push a button on your steering wheel, and out pops a camera drone from the car’s trunk. The drone rises a few feet and begins to transmit a bird’s eye view of your car to the dashboard display — you can now see at a glance whether you are about to bump into curbs, cars, concrete barriers, or anything else standing between you and parking nirvana. Seconds later, you have backed perfectly into the spot and are off to your meeting.

Okay, that’s the fantasy. In reality, cars with dedicated camera drones will be a long time coming. In the meantime, we have something just as good and a lot more practicable — an ADAS application called surround view.

Getting aligned
Approaching an old problem from a new perspective. Credit: TI
Surround-view systems typically use four to six fisheye cameras installed at the front, back, and sides of the vehicle. Together, these cameras capture a complete view of the area around your car, but there’s a catch: the video frames they generate are highly distorted. So, to start, the surround-view system performs geometric alignment of every frame. Which is to say, it irons all the curves out.
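To make the "ironing out" concrete, here is a minimal sketch of radial distortion correction in Python with NumPy. The single-coefficient polynomial model and the `k1` value are illustrative assumptions for this post, not TI's actual calibration pipeline, which fits per-camera lens parameters; real systems also interpolate rather than snapping to the nearest pixel.

```python
import numpy as np

def undistort(frame, k1=-0.3):
    """Correct simple radial (barrel) distortion by remapping pixels.

    For each pixel of the corrected output image, compute where it came
    from in the distorted source using a one-term polynomial model:
    r_d = r_u * (1 + k1 * r_u**2), with radii normalized to half the
    image diagonal. k1 < 0 pulls the stretched edges back inward.
    """
    h, w = frame.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    norm = np.hypot(cx, cy)                      # normalization radius
    ys, xs = np.indices((h, w), dtype=np.float64)
    dx, dy = (xs - cx) / norm, (ys - cy) / norm  # normalized offsets
    r_u = np.hypot(dx, dy)                       # undistorted radius
    scale = 1.0 + k1 * r_u**2                    # distortion model
    src_x = np.clip(cx + dx * scale * norm, 0, w - 1).round().astype(int)
    src_y = np.clip(cy + dy * scale * norm, 0, h - 1).round().astype(int)
    return frame[src_y, src_x]                   # nearest-neighbor remap
```

The same remap table works for every frame from a given camera, so production systems precompute it once at calibration time and replay it per frame in hardware.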

Next, the system stitches the corrected video frames into a single bird’s eye view. Mind you, this step isn’t simply a matter of aligning pixels from several overlapping frames. Because each camera points in a different direction, each will generate video with unique color balance and brightness levels. Consequently, the system must perform photometric alignment of the image. In other words, it corrects these mismatches to make the resulting output look as if it were taken by a single camera hovering over the vehicle.
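A toy version of photometric alignment can be sketched as a per-camera gain that equalizes brightness across the regions where neighboring cameras overlap. This is a deliberately simplified, hypothetical model (one scalar gain per camera, matched to the global overlap mean); real systems fit per-channel gains across every overlapping camera pair.

```python
import numpy as np

def photometric_gains(overlaps):
    """Compute a per-camera gain that equalizes brightness.

    `overlaps` maps a camera name to the pixels that camera captured in
    a region shared with its neighbors. Each camera is scaled so its
    overlap mean matches the global mean, so the stitched result looks
    as if a single camera produced it.
    """
    means = {cam: float(np.mean(px)) for cam, px in overlaps.items()}
    target = sum(means.values()) / len(means)
    return {cam: target / m for cam, m in means.items()}

def apply_gain(frame, gain):
    """Scale a frame by its gain, clipping to the valid 8-bit range."""
    return np.clip(frame.astype(np.float64) * gain, 0, 255).astype(np.uint8)
```

For example, if the left camera's overlap averages 100 and the right camera's averages 200, the left frame is brightened by 1.5x and the right dimmed by 0.75x before blending.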

Moving down-market
If you think that all this work takes serious compute power, you’re right. The real trick, though, is to make the system affordable so that luxury car owners aren’t the only ones who can benefit from surround view.

Which brings me to QNX Software Systems’ support for TI’s new TDA2Eco system-on-chip (SoC), which is optimized for 3D surround view and park-assist applications. The TDA2Eco integrates a variety of automotive peripherals, including CAN and Gigabit Ethernet AVB, and supports up to eight cameras through parallel, serial and CSI-2 interfaces. To enable 3D viewing, the TDA2Eco includes an image processing accelerator for decoding multiple camera streams, along with graphics accelerators for rendering virtual views.

Naturally, surround view also needs software, which is where the QNX OS for Safety comes in. The OS can play several roles in surround-view systems, such as handling camera input, hosting device drivers for camera panning and control, and rendering the processed video onto the display screen, using QNX Software Systems’ high-performance Screen windowing system. The QNX OS for Safety complies with the ISO 26262 automotive functional safety standard and has a proven history in safety-critical systems, making it ideally suited for collision warning, surround view, and a variety of other ADAS applications.

Okay, enough from me. Let’s look at a video, hosted by TI’s Gaurav Agarwal, to see how the TDAx product line can support surround-view applications:

For more information on the TDAx product line, visit the TI website; for more on the QNX OS for Safety, visit the QNX website.

Tuesday, November 3, 2015

An ADAS glossary for the acronym challenged

If you’ve got ACD, you’ve come to the right place.

Paul Leroux
Someday, in the not-so-distant future, your mechanic will tell you that your CTA sensor has gone MIA. Or that your EDA needs an OTA update. Or that the camera system for your PLD has OSD. And when that day comes, you’ll be glad you stumbled across this post. Because I am about to point you to a useful little glossary that takes the mystery out of ADAS acronyms. (The irony being, of course, that ADAS is itself an acronym.)

Kidding aside, acronyms can stand in the way of clear communication — but only when used at the wrong time and place. Otherwise, they serve as useful shorthand, especially among industry insiders who have better things to do than say “advanced driver assistance system” 100 times a day when they can simply say ADAS instead.

In any case, you can find the glossary here. And when you look at it, you’ll appreciate my ulterior motive for sharing the link — to demonstrate that the ADAS industry is moving apace. The glossary makes it abundantly clear that the industry is working on, or has already developed, a large variety of ADAS systems. The number will only increase, thanks to government calls for vehicle safety standards, technology advances that make ADAS solutions more cost-effective, and growing consumer interest in cars that can avoid crashes. In fact, Visiongain has estimated that the global ADAS market will experience double-digit growth between 2014 and 2024, from a baseline estimate of $18.2 billion.

And in case you’re wondering, ACD stands for acronym challenged disorder. ;-)

Wednesday, October 28, 2015

Five reasons why they should test autonomous cars in Ontario

Did I say five? I meant six…

Paul Leroux
It was late and I needed to get home. So I shut down my laptop, bundled myself in a warm jacket, and headed out to the QNX parking lot. A heavy snow had started to fall, making the roads slippery — but was I worried? Not really. In Ottawa, snow is a fact of life. You learn to live with it, and you learn to drive in it. So I cleared off the car windows, hopped in, and drove off.

Alas, my lack of concern was short-lived. The further I drove, the faster and thicker the snow fell. And then, it really started to come down. Pretty soon, all I could see out my windshield was a scene that looked like this, but with even less detail:

That’s right: a pure, unadulterated whiteout. Was I worried? Nope. But only because I was in a state of absolute terror. Fortunately, I could see the faintest wisp of tire tracks immediately in front of my car, so I followed them, praying that they didn’t lead into a ditch, or worse. (Spoiler alert: I made it home safe and sound.)

Of course, it doesn’t snow every day in Ottawa — or anywhere else in Ontario, for that matter. That said, we can get blanketed with the white stuff any time from October until April. And when we do, the snow can play havoc with highways, railways, airports, and even roofs.

Roofs, you say? One morning, a few years ago, I heard a (very) loud noise coming from the roof of QNX headquarters. When I looked out, this is what I saw — someone cleaning off the roof with a snow blower! So much snow had fallen that the integrity of the roof was being threatened:

When snow like this falls on the road, it can tax the abilities of even the best driver. But what happens when the driver isn’t a person, but the car itself? Good question. Snow and blowing snow can mask lane markers, cover street signs, and block light-detection sensors, making it difficult for an autonomous vehicle to determine where it should go and what it should do. Snow can even trick the vehicle into “seeing” phantom objects.

And it’s not just snow. Off the top of my head, I can think of four other phenomena common to Ontario roads that pose a challenge to human and robot drivers alike: black ice, freezing rain, extreme temperatures, and moose. I am only half joking about the last item: autonomous vehicles must respond appropriately to local fauna, not least when the animal in question weighs half a ton.

To put it simply, Ontario would be a perfect test bed for advancing the state of autonomous technologies. So imagine my delight when I learned that the Ontario government has decided to do something about it.

Starting January 1, Ontario will become the first Canadian province to allow road testing of automated vehicles and related technology. The provincial government is also pledging half a million dollars to the Ontario Centres of Excellence Connected Vehicle/Automated Vehicle Program, in addition to $2.45 million already provided.

The government has also installed some virtual guard rails. For instance, it insists that a trained driver stay behind the wheel at all times. The driver must monitor the operation of the autonomous vehicle and take over control whenever necessary.

Testing autonomous vehicles in Ontario simply makes sense, but not only because of the weather. The province also has a lot of automotive know-how. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here, as do 350 parts suppliers. Moreover, the province has almost 100 companies and institutions involved in connected vehicle and automated vehicle technologies — including, of course, QNX Software Systems and its parent company, BlackBerry.

So next time you’re in Ontario, take a peek at the driver in the car next to you. But don’t be surprised if he or she isn’t holding the steering wheel.

A version of this post originally appeared on the Connected Car Expo blog.

Tuesday, October 20, 2015

ADAS: The ecosystem's next frontier

At DevCon last week, Renesas showcased their ADAS concept vehicle. It was just what you would expect from an advanced demonstration, combining radar, lidar, cameras, V2X, algorithms, multiple displays and a huge amount of software to make it all work. They were talking about sensor fusion and complete surround view and, well, you get the picture.

What isn’t readily obvious as you experience the demo is the investment made and the collaboration required by Renesas and their ADAS ecosystem.

Partnership is a seldom-recognized cornerstone of what will ultimately become true sensor fusion. It seems, to me at least, unlikely that anyone will be able to develop the entire system on their own. As processors become more and more powerful, the discrete ECUs will start to collapse into less distributed architectures with much more functionality on each chip. The amount of data coming into and being transmitted by the vehicle will continue to grow, and the need to secure it will grow alongside. V2X, high definition map data, algorithms, specialized silicon, vision acceleration and more will become the norm in every vehicle.

How about QNX Software Systems? Are we going to do all of this on our own? I doubt it. Instead, we will continue to build on the same strategy that has helped take us to a leadership position in the infotainment market: collaborating with best-of-breed companies to deliver a solution on a safety-certified foundation that customers can leverage to differentiate their products.

The view from above at Renesas DevCon.

Wednesday, October 14, 2015

What does a decades-old thought experiment have to do with self-driving cars?

Paul Leroux
Last week, I discussed, ever so briefly, some ethical issues raised by autonomous vehicles — including the argument that introducing them too slowly could be considered unethical!

My post included a video link to the trolley problem, a thought experiment that has long served as a tool for exploring how people make ethical decisions. In its original form, the trolley problem is quite simple: You see a trolley racing down a track on which five people are tied up. Next to you is a lever that can divert the trolley to an empty track. But before you can pull the lever, you notice that someone is, in fact, tied up on the second track. Do you do nothing and let all five people die, or do you pull the lever and kill the one person instead?

The trolley problem has undergone criticism for failing to represent real-world problems, for being too artificial. But if you ask Patrick Lin, a Cal Poly professor who has delivered talks to Google and Tesla on the ethics of self-driving cars, it can serve as a helpful teaching tool for automotive engineers — especially if its underlying concept is framed in automotive terms.

Here is how he presents it:

“You’re driving an autonomous car in manual mode—you’re inattentive and suddenly are heading towards five people at a farmer’s market. Your car senses this incoming collision, and has to decide how to react. If the only option is to jerk to the right, and hit one person instead of remaining on its course towards the five, what should it do?”

Of course, autonomous cars, with their better-than-human driving habits (e.g. people tailgate, robot cars don’t), should help prevent such difficult situations from happening in the first place. In the meantime, thinking carefully through this and other scenarios is just one more step on the road to building fully autonomous, and eventually driverless, cars.

Read more about the trolley problem and its application to autonomous cars in a recent article in The Atlantic.

Speaking of robot cars, if you missed last week's webinar on the role of software when transitioning from ADAS to autonomous driving, don't sweat it. It's now available on demand at Techonline.

Wednesday, October 7, 2015

The ethics of robot cars

“By midcentury, the penetration of autonomous vehicles... could ultimately cause vehicle crashes in the U.S. to fall from second to ninth place in terms of their lethality ranking.” — McKinsey

Paul Leroux
If you saw a discarded two-by-four on the sidewalk, with rusty nails sticking out of it, what would you do? Chances are, you would move it to a safe spot. You might even bring it home, pull the nails out, and dispose of it properly. In any case, you would feel obliged to do something that reduces the probability of someone getting hurt.

Driver error is like a long sharp nail sticking out of that two-by-four. It is, in fact, the largest single contributor to road accidents. Which raises the question: If the auto industry had the technology, skills, and resources to build vehicles that could eliminate accidents caused by human error, would it not have a moral obligation to do so? I am speaking, of course, of self-driving cars.

Now, a philosopher I am not. I am ready to accept that my line of thinking on this matter has more holes than Swiss cheese. But if so, I’m not the only one with Emmenthal for brain matter. I am, in fact, in good company.

Take, for example, Bryant Walker Smith, a professor in the schools of law and engineering at the University of South Carolina. In an article in MIT Technology Review, he argues that, given the number of accidents that involve human error, introducing self-driving technology too slowly could be considered unethical. (Mind you, he also underlines the importance of accepting ethical tradeoffs. We already accept that airbags may kill a few people while saving many; we may have to accept that the same principle will hold true for autonomous vehicles.)

Then there’s Roger Lanctot of Strategy Analytics. He argues that government agencies and the auto industry need to move much more aggressively on active-safety features like automated lane keeping and automated collision avoidance. He reasons that, because the technology is readily available — and can save lives — we should be using it.

Mind you, the devil is in the proverbial details. In the case of autonomous vehicles, the ethics of “doing the right thing” is only the first step. Once you decide to build autonomous capabilities into a vehicle, you often have to make ethics-based decisions as to how the vehicle will behave.

For instance, what if an autonomous car could avoid a child running across the street, but only at the risk of driving itself, and its passengers, into a brick wall? Whom should the car be programmed to save? The child or the passengers? And what about a situation where the vehicle must hit either of two vehicles — should it hit the vehicle with the better crash rating? If so, wouldn’t that penalize people for buying safer cars? This scenario may sound far-fetched, but vehicle-to-vehicle (V2V) technology could eventually make it possible.

The “trolley problem” captures the dilemma nicely:

Being aware of such dilemmas gives me more respect for the kinds of decisions automakers will have to make as they build a self-driving future. But you know what? All this talk of ethics brings something else to mind. I work for a company whose software has, for decades, been used in medical devices that help save lives. Knowing that we do good in the world is a daily inspiration — and has been for the last 25 years of my life. And now, with products like the QNX OS for Safety, we are starting to help automotive companies build ADAS systems that can help mitigate driver error and, ultimately, reduce accidents. So I’m doubly proud.

More to the point, I believe this same sense of pride, of helping to make the road a safer place, will be a powerful motivator for the thousands of engineers and development teams dedicated to paving the road from ADAS to autonomous. It’s just one more reason why autonomous cars aren’t a question of if, but only of when.

Wednesday, September 30, 2015

A low-down look at the QNX concept cars

Paul Leroux
It’s that time of year again. The QNX concept team has set the wheels in motion and started work on a brand new technology concept car, to be unveiled at CES 2016.

The principle behind our technology concept cars is simple in theory, but challenging in practice: Take a stock production vehicle off the dealer’s lot, mod it with new software and hardware, and create user experiences that make driving more connected, more enjoyable, and, in some cases, even safer.

It’s always fun to guess what kind of car the team will modify. But the real story lies in what they do with it. In recent years, they’ve implemented cloud-based diagnostics, engine sound enhancement, traffic sign recognition, collision warnings, speed alerts, natural voice recognition — the list goes on. There’s always a surprise or two, and I intend to keep it that way, so no hints about the new car until CES. ;-)

In the meantime, here is a retrospective of QNX technology concept cars, past and present. It’s #WheelWednesday, so instead of the usual eye candy, I’ve chosen images to suit the occasion. Enjoy.

The Maserati Quattroporte GTS
From the beginning, our technology concept cars have demonstrated how the QNX platform helps auto companies create connected (and compelling) user experiences. The Maserati, however, goes one step further. It shows how QNX can enable a seamless blend of infotainment and ADAS technologies to simplify driving tasks, warn of possible collisions, and enhance driver awareness. The car can even recommend an appropriate speed for upcoming curves. How cool is that?

The Mercedes CLA 45 AMG
By their very nature, technology concept cars have a short shelf life. The Mercedes, however, has defied the odds. It debuted in January 2014, but is still alive and well in Europe, and is about to be whisked off to an event in Dubai. The car features a multi-modal user experience that blends touch, voice, physical buttons, and a multi-function controller, enabling users to interact naturally with infotainment functions. The instrument cluster isn’t too shabby, either. It will even warn you to ease off the gas if you exceed the local speed limit.

The Bentley Continental GT
I dubbed our Bentley the “ultimate show-me car,” partly because “show me” is exactly what people would say when you put them behind the wheel. The digital cluster was drop-dead gorgeous, but the head unit was the true pièce de résistance — an elegantly curved 17” high-definition display based on TI’s optical touch technology. And did I mention? The car’s voice rec system spoke with an English accent.

The Porsche 911 Carrera
Have you ever talked to a Porsche? Well, in this case, you could — and it would even talk back. We outfitted our 911 with cloud-based voice recognition (so you could control the nav system using natural language) and text-to-speech (so you could listen to incoming BBMs, emails, and text messages). But my favorite feature was one-touch Bluetooth pairing: you simply touched your phone to an NFC reader in the center console and, hey presto, the phone and car were automatically paired.

The Chevrolet Corvette
I have a confession to make: The Corvette is the only QNX technology concept car that I got to drive around the block. For some unfathomable reason, they never let me drive another one. Which is weird, because I saw the repair bill, and it wasn’t that much. In any case, the Corvette served as the platform for the very first QNX technology concept car, back in 2010. It included a reconfigurable instrument cluster and a smartphone-connected head unit — features that would become slicker and more sophisticated in our subsequent concept vehicles. My favorite feature: the reskinnable UI.

The Jeep Wrangler
Officially, the Wrangler serves as the QNX reference vehicle, demonstrating what the QNX CAR Platform can do out of the box. But it also does double-duty as a concept vehicle, showing how the QNX platform can help developers build leading-edge ADAS solutions. My favorite features: in-dash collision warnings and a fast-booting backup display.

Well, there you have it. In just a few months’ time, we will have the honor of introducing you to a brand new QNX technology concept car. Any guesses as to what the wheels will look like?

If you liked this post, you may also be interested in... The lost concept car photos