Monday, January 26, 2015

New to 26262? Have I got a primer for you

Driver error is the #1 problem on our roads — and has been since 1869. In August of that year, an amateur scientist named Mary Ward became the first person to die in an automobile accident, after being thrown from a steam-powered car. Driver error was a factor in Mary’s death and, 145 years later, it remains a problem, contributing to roughly 90% of motor vehicle crashes.

Can ADAS systems mitigate driver error and reduce traffic deaths? The evidence suggests that, yes, they help prevent accidents. That said, ADAS systems can themselves cause harm, if they malfunction. Imagine, for example, an adaptive cruise control system that underestimates the distance of a car up ahead. Which raises the question: how can you trust the safety claims for an ADAS system? And how do you establish that the evidence for those claims is sufficient?

Enter ISO 26262. This standard, introduced in 2011, provides a comprehensive framework for validating the functional safety claims of ADAS systems, digital instrument clusters, and other electrical or electronic systems in production passenger vehicles.

ISO 26262 isn’t for the faint of heart. It’s a rigorous, 10-part standard that recommends tools, techniques, and methodologies for the entire development cycle, from specification to production. In fact, to develop a deep understanding of 26262 you must first become versed in another standard, IEC 61508, which forms the basis of 26262.

ISO 26262 starts from the premise that no system is 100% safe. Consequently, the system designer must perform a hazard and risk analysis to identify the safety requirements and residual risks of the system being developed. The outcome of that analysis determines the Automotive Safety Integrity Level (ASIL) of the system, as defined by 26262. ASILs range from A to D, where A represents the lowest degree of hazard and D the highest. The higher the ASIL, the greater the rigor that must be applied to reduce the system’s residual risk to an acceptable level.
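To make the ASIL concept a little more concrete, here is a minimal sketch (in C++) of how the classification works. ISO 26262-3 rates each hazardous event for severity (S1–S3), exposure (E1–E4), and controllability (C1–C3), and a lookup table maps the combination to an ASIL; the sketch uses the well-known “sum” rule of thumb that reproduces that table. It is an illustration only; for real work, the normative table in the standard is what counts.

// Illustrative only: deriving an ASIL from hazard and risk analysis ratings.
// ISO 26262-3 rates each hazardous event for Severity (S1-S3), Exposure (E1-E4),
// and Controllability (C1-C3). The switch below encodes the common "sum" rule of
// thumb that reproduces the standard's lookup table; consult the normative table
// for actual projects.

#include <iostream>

enum class Asil { QM, A, B, C, D };

// severity: 1..3, exposure: 1..4, controllability: 1..3
Asil classify(int severity, int exposure, int controllability)
{
    switch (severity + exposure + controllability) {
        case 10: return Asil::D;   // e.g., S3 + E4 + C3
        case 9:  return Asil::C;
        case 8:  return Asil::B;
        case 7:  return Asil::A;
        default: return Asil::QM;  // quality management only; no ASIL assigned
    }
}

int main()
{
    static const char *names[] = { "QM", "ASIL A", "ASIL B", "ASIL C", "ASIL D" };
    // Hypothetical hazard: unintended acceleration at highway speed,
    // rated S3 (life-threatening), E4 (high exposure), C3 (hard to control).
    std::cout << names[static_cast<int>(classify(3, 4, 3))] << std::endl;  // "ASIL D"
}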

Having determined the risks (and the ASIL), the designer selects an appropriate architecture. The designer must also validate that architecture, using tools and techniques that 26262 either recommends or highly recommends. If the designer believes that a recommended tool or technique isn’t appropriate to the project, he or she must provide a solid, defensible rationale for that decision.

The designer must also prepare a safety case. This document provides the evidence, necessary for 26262 assessment, that the design is sufficiently safe. The safety case comprises three main components: 1) a clear statement of what is claimed about the system, 2) the argument that the claim has been met, and 3) the evidence that supports the argument. Of course, no system is safe unless it is deployed and used correctly, so the system designer must also produce a safety manual that sets the constraints within which the product must be deployed.

Achieving 26262 compliance is a major undertaking. That said, any conscientious team working on a safety-critical project would probably apply most of the recommended techniques. The standard was created to ensure that safety isn’t treated as an afterthought during final testing, but as a matter of due diligence in every stage of development.

If you’re a system designer or implementer, where do you start? I would suggest “A Developer’s View of ISO 26262”, an article recently authored by my colleague Chris Hobbs and published in EE Times Automotive Europe. The article provides an introduction to the standard, based on experience of certifying software to ISO 26262, and covers key topics such as ASILs, recommended verification tools and techniques, the safety case, and confidence from use.

I also have two whitepapers that may prove useful: Architectures for ISO 26262 systems with multiple ASIL requirements, written by my colleague Yi Zheng, and Protecting software components from interference in an ISO 26262 system, written by Chris Hobbs and Yi Zheng.

Tuesday, January 20, 2015

Driving simulators at CES

CES was just 15 minutes from closing when I managed to slip away from the very busy QNX booth to try out an F1 simulator. Three screens, 6 degrees of freedom, and surround sound came together for the most exciting simulated driving experience I have ever had. I was literally shaking when they dragged me out of the driver’s seat (I didn’t want to stop :-). Mind you, at around $80K for the system, it seems unlikely I will ever own one.

The experience got me thinking about the types of vehicles currently in simulation or in the lab that I fully expect to drive in my lifetime: cars that are virtually impossible to crash, cars that make it painless to travel long distances, and, ultimately, cars that worry about traffic jams so I can read a book.

Re-incarnated: The QNX reference vehicle.
QNX Software Systems had a very popular simulator of its own at CES this year. You may have seen some details on it already, but to recap, it is a new incarnation of our trusty QNX reference vehicle, extended to demonstrate ADAS capabilities. We parked it in front of a 12-foot display and used video footage captured on California’s fabled Highway 1 to provide the closest thing to real-world driving we could create.

The resulting virtual drive showcased the capabilities not only of QNX technology, but of our ecosystem as well. Using the video footage, we provided camera inputs to Itseez’ computer vision algorithms to demonstrate a working example of lane departure warning and traffic sign recognition. By capturing GPS data synchronized with the video footage, and feeding the result through Elektrobit’s Electronic Horizon Solution, we were able to generate curve speed warnings. All this was running on automotive-grade Jacinto 6 silicon from Texas Instruments. LiDAR technology from Phantom Intelligence rounded out the offering by providing collision feedback to the driver.

The lane departure and curve speed warnings in action. Screen-grab from video by Embedded Computing Design.
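The production computer vision from Itseez is, of course, far more sophisticated than anything that fits in a blog post, but to give a flavor of the approach, here is a minimal lane-marking sketch built on the open-source OpenCV library (which Itseez maintains): edge detection followed by a Hough transform over the lower half of each frame. The file name and tuning parameters are purely illustrative.

// A minimal, illustrative take on vision-based lane-marking detection, loosely in
// the spirit of what the demo's camera pipeline does. This is NOT Itseez's
// implementation; it simply uses open-source OpenCV with a classic
// Canny-edge + Hough-line approach. File name and parameters are made up.

#include <opencv2/opencv.hpp>
#include <vector>

int main()
{
    cv::VideoCapture cap("highway1.mp4");   // hypothetical recorded drive footage
    cv::Mat frame, gray, edges;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::GaussianBlur(gray, gray, cv::Size(5, 5), 0);
        cv::Canny(gray, edges, 50, 150);

        // Lane markings normally appear in the lower half of the frame.
        const int half = edges.rows / 2;
        cv::Mat roi = edges(cv::Rect(0, half, edges.cols, half)).clone();

        std::vector<cv::Vec4i> lines;
        cv::HoughLinesP(roi, lines, 1, CV_PI / 180, 50, 40, 20);

        for (const auto &l : lines) {
            cv::line(frame, cv::Point(l[0], l[1] + half),
                            cv::Point(l[2], l[3] + half),
                            cv::Scalar(0, 255, 0), 2);   // overlay detected segments
        }

        cv::imshow("lane markings", frame);
        if (cv::waitKey(1) == 27) break;     // Esc to quit
    }
    return 0;
}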

Meeting the challenge
While at CES, I also had the opportunity to meet with companies that are working to make advanced ADAS systems commercially viable. Phantom Intelligence is one example but I was also introduced to companies that can provide thermal imaging systems and near-infrared cameras at a fraction of what these technologies cost today.

These are all examples of how the industry is rising up to meet the challenge of safer, more autonomous vehicles at a price point that allows for widespread adoption in the foreseeable future. Amazing stuff, really — we are finally entering the era of the Jetsons.

By the way, I can’t remember what booth I was in when I drove the simulator. But I’m willing to bet that the people who experienced the Jeep at CES will remember they were in the QNX booth, seeing technology from QNX and its key partners in this exciting new world.

Tuesday, January 13, 2015

Tom’s Guide taps QNX concept car with CES 2015 award

Have you ever checked out a product review on Tom’s Guide? If so, you’re not alone. Every month, this website attracts more than 2.5 million unique visitors — that’s equivalent to the population of Toronto, the largest city in Canada.

The folks at Tom’s Guide test and review everything from drones to 3D printers. They love technology. So perhaps it’s no surprise that they took a shine to the QNX technology concept car. In fact, they liked it so much, they awarded it the Tom’s Guide CES 2015 Award, in the car tech category.

To quote Sam Rutherford of Tom’s Guide, “After my time with QNX’s platform, I was left with the impression there’s finally a company that just ‘gets it’ when it comes to the technology in cars. The company has learned from the success of modern mobile devices and brought that knowledge to the auto world…”.

I think I like this Sam guy.

Engadget was also impressed...
A forward-looking approach to seeing behind you.
The Tom’s Guide award is the second honor QNX picked up at CES. We were also shortlisted for an Engadget Best of CES award, for the digital rear- and side-view mirrors on the QNX technology concept car.

If you haven’t seen the mirrors in action, they offer a complete view of the scene behind and to the sides of the vehicle — goodbye to the blind spots associated with conventional reflective mirrors. Better yet, the side-view digital mirrors have the smarts to detect cars, bicycles, and other objects, and they will display an alert if an object is too close when the driver signals a lane change.

In addition to the digital mirrors, the QNX technology concept car integrates several other ADAS features, including speed recommendations, forward-collision warnings, and intelligent parking assist. Learn more here.

Thursday, January 8, 2015

A behind-the-scenes look at creating an integrated driving experience

Lynn Gayowski
To quote the timeless lyrics of Britney Spears, “You want a Maserati? You better work.” This is exactly what the QNX team did to get our 2015 technology concept car ready for this year’s CES. And we had the cameras rolling throughout the build process.

The video below not only gives a behind-the-scenes look at the making of our latest technology concept car, based on a Maserati Quattroporte GTS, but also features team members talking about the technology behind the car and what QNX brings to the table (or garage, in this case) to enable a customized car in mere months.

Yes, the QNX CAR Platform has cool features and amazing reliability. But another draw for our customers is the platform’s pre-integrated partner technologies. The platform gets silicon, apps, and services working together so OEMs don’t have to solve this problem for themselves. It shortens development time and keeps the focus on branding the user experience. As Alex, one of the software engineers interviewed, says after seeing what we did in the Maserati, “Just imagine what our customers can do!”

We targeted an integrated driving experience for this vehicle and I think this focus is evident in the finished product. The user interface and ADAS features are intuitive, and let’s be real, gorgeous. Check out this video that summarizes the making of the 2015 QNX technology concept car:



You’ll see many members of the concept team working hard throughout this video, but a shout out as well to all of the developers who contributed to the QNX CAR Platform, QNX operating system, and acoustics technologies that made this amazing vehicle possible. Congratulations to all of you for a job well done!

Wednesday, January 7, 2015

Finalist for Engadget Best of CES Awards 2015

By Lynn Gayowski

*Fist pump!* The accolades from CES just keep coming. I'm excited to share the news that the digital mirrors implemented in our 2015 QNX technology concept car have been selected by Engadget as a finalist for their Best of CES Awards 2015, in the Best Automotive Technology category!

With advanced driver assistance systems (ADAS) shaping the design of this year's QNX vehicle, it was a natural choice to replace the Maserati's mirrors with digital screens that warn of possible collisions and enhance visibility for the driver.

Not only do the side-view screens eliminate blind spots, they also give a red warning overlay if an obstacle is in the way when making a lane change. If the coast is clear, the overlay is green.
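The decision behind that overlay is conceptually simple. Here is a hypothetical sketch of the logic, assuming the ADAS layer supplies the turn-signal state and the distance to the nearest object detected beside the car; the names and the 3-metre threshold are invented for illustration, not taken from the concept car.

// Hypothetical sketch of the side-view overlay decision: red when a lane change is
// signaled and an obstacle is too close, green when the coast is clear. Types,
// names, and the 3.0 m threshold are illustrative, not from the actual car.

#include <iostream>

enum class Overlay { None, Green, Red };

struct SideViewInput {
    bool   laneChangeSignaled;      // turn signal active on this side
    bool   obstacleDetected;        // car, bicycle, etc. seen by the camera
    double obstacleDistanceMetres;  // range to the nearest detected object
};

Overlay chooseOverlay(const SideViewInput &in)
{
    if (!in.laneChangeSignaled)
        return Overlay::None;                                  // nothing to show
    if (in.obstacleDetected && in.obstacleDistanceMetres < 3.0)
        return Overlay::Red;                                   // too close: warn
    return Overlay::Green;                                     // coast is clear
}

int main()
{
    SideViewInput in{ true, true, 2.1 };   // signaling a lane change with a cyclist 2.1 m away
    std::cout << (chooseOverlay(in) == Overlay::Red ? "red overlay" : "no warning") << std::endl;
}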

The rear-view display offers a wide-angle view behind the car, giving the driver a broader picture than a typical mirror would.

Powered by the reliable QNX OS, these digital mirrors could be a feature that helps drivers of the future avoid accidents.

The rear- and side-view video displays in the 2015 QNX technology concept car based on a Maserati Quattroporte GTS offer a complete view behind and to the sides of the vehicle, eliminating blind spots.

If you're attending CES, check out the digital mirrors and the many other ADAS and infotainment demos in the QNX booth: North Hall, Booth 2231.


Now with ADAS: The revamped QNX reference vehicle

Tina Jeffrey
Since 2012, our Jeep has showcased what QNX technology can do out of the box. We decided it was time to up the ante...

I walked into the QNX garage a few weeks ago and did a double take. The QNX reference vehicle, a modified Jeep Wrangler, had undergone a major overhaul both inside and out — and just in time for 2015 CES.

Before I get into the how and why of the Jeep’s metamorphosis, here’s a glimpse of its newly refreshed exterior. Orange is the new gray!



The Jeep debuted in June 2012 at Telematics Detroit. Its purpose: to show how customers can use off-the-shelf QNX products, like the QNX CAR Platform for Infotainment and QNX OS, to build a wide range of custom infotainment systems and instrument clusters, using a single code base.

From day one, the Jeep has been a real workhorse, making appearances at numerous events to showcase the latest HMI, navigation, speech recognition, multimedia, and handsfree acoustics technologies, not to mention embedded apps for parking, internet radio streaming, weather, and smartphone connectivity. The Jeep has performed dependably time and time again, and now, in an era where automotive safety is top of mind, we’ve decided to up the ante and add leading-edge ADAS technology built on the QNX OS.

After all, what sets the QNX OS apart is its proven track record in safety-certified systems across market segments — industrial, medical, and automotive. In fact, the QNX OS for Automotive Safety is certified to the highest level of automotive functional safety: ISO 26262, ASIL D. Using a pre-certified OS component is key to the overall integrity of an automotive system and makes system certification much easier.

The ultimate (virtual) driving experience
What better way to showcase ADAS in the Jeep than with a virtual drive? At CES, a 12-foot video screen in front of the Jeep plays a pre-recorded driving scene, while the onboard ADAS system analyzes the scene to detect lane markers, speed signs, and preceding vehicles, and to warn of unintentional lane departures, excessive speed, and imminent crashes with vehicles on the road ahead. Onboard computer vision algorithms from Itseez process the image frames in real time to perform these functions simultaneously.

Here’s a scene from the virtual drive, in which the ADAS system is tracking lane markings and has detected a speed-limit sign:



If the vehicle begins to drift outside a lane, the steering wheel provides haptic feedback and the cluster displays a warning:



The ADAS system includes EB Assist eHorizon, which uses map data with curve-speed information to provide warnings and recommendations, such as reducing your speed to navigate an upcoming curve:
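The arithmetic behind a curve-speed recommendation is straightforward physics: for a curve of radius R, keeping lateral acceleration below a comfort limit a gives a maximum speed of v = sqrt(a × R). Here is a hypothetical sketch of that calculation; the comfort limit eHorizon actually uses isn't something I know, so the 2.0 m/s² figure is purely illustrative.

// Hypothetical sketch of a curve-speed recommendation from map data. For a curve
// of radius R (metres), keeping lateral acceleration below a comfort limit a_lat
// gives v_max = sqrt(a_lat * R). The 2.0 m/s^2 limit is an illustrative value.

#include <cmath>
#include <cstdio>

double recommendedSpeedKmh(double curveRadiusMetres, double lateralAccelLimit = 2.0)
{
    const double vMetresPerSecond = std::sqrt(lateralAccelLimit * curveRadiusMetres);
    return vMetresPerSecond * 3.6;   // m/s to km/h
}

int main()
{
    // A 150 m radius curve ahead (from the map data) suggests slowing to ~62 km/h.
    std::printf("Recommended speed: %.0f km/h\n", recommendedSpeedKmh(150.0));
    return 0;
}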



The Jeep also has a LiDAR system from Phantom Intelligence (formerly Aerostar) to detect obstacles on the road ahead. The cluster displays warnings from this system, as well as warnings from the vision-based collision-detection feature. For example:



POSTSCRIPT:
Here’s a short video of the virtual drive, taken at CES by Brandon Lewis of Embedded Computing Design, in which you can see curve-speed warnings and lane-departure warnings:



Fast-boot camera
Rounding out the ADAS features is a rear-view camera demo that can cold boot in 0.8 seconds on a Texas Instruments Jacinto 6 processor. As you may recall, NHTSA has mandated that, by May 2018, most new vehicles must have rear-view technology that can display a 10-by-20-foot area directly behind the vehicle; moreover, the display must appear no more than 2 seconds after the driver throws the vehicle into reverse. Backup camera and other fast-boot requirements, such as time-to-last-mode audio, time-to-HMI visible, and time-to-fully-responsive HMI, are critically important to automakers. Be sure to check out the demo — but don’t blink or you’ll miss it!

Full-featured infotainment
The head unit includes a full-featured infotainment system based on the QNX CAR Platform for Infotainment. It provides information such as weather, the current song, and turn-by-turn directions to the instrument cluster, where it’s easier for the driver to see.



Infotainment features include:

Qt-based HMI — Can integrate other HMI technologies, including EB Guide and Crank Storyboard.

Natural language processing (NLP) — Uses Nuance’s VoCon Hybrid solution in concert with the QNX NLP technology for natural interaction with infotainment functions. For instance, if you ask “Will I need a jacket later today?”, the Weather Network app will launch and provide the forecast. (A simple sketch of this kind of intent routing follows the list.)

EB street director — Provides embedded navigation with a 3D map engine; the map is synched up with the virtual drive during the demo.

QNX CAR Platform multimedia engine — An automotive-hardened solution that can handle:
  • audio management for seamless transitions between all audio sources
  • media detection and browsing of connected devices
  • background synching of music for instant media playback — without the need for the synch to be completed

Support for all smartphone connectivity options — DLNA, MTP, MirrorLink, Bluetooth, USB, Wi-Fi, etc.

On-board application framework — Supports Qt, HTML5, APK (for Android apps), and native OpenGL ES apps. Apps include iHeart, Parkopedia, Pandora, Slacker, and Weather Network, as well as a Settings app for phone pairing, over-the-air software updates, and Wi-Fi hotspot setup.
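As promised above, here is a toy sketch of the intent-routing idea behind the NLP feature: an utterance is reduced to an intent, and the intent is mapped to an app action. The keyword spotting, intents, and app actions are invented for illustration; the real Nuance VoCon Hybrid plus QNX NLP pipeline performs genuine speech understanding rather than string matching.

// Toy sketch of intent routing, standing in for a real NLP pipeline. The keywords,
// intents, and app actions below are invented; the production system performs real
// speech understanding rather than substring matching.

#include <iostream>
#include <string>

std::string routeUtterance(const std::string &utterance)
{
    if (utterance.find("jacket") != std::string::npos ||
        utterance.find("weather") != std::string::npos)
        return "launch Weather Network: show today's forecast";
    if (utterance.find("play") != std::string::npos)
        return "launch media player: resume last audio source";
    if (utterance.find("navigate") != std::string::npos)
        return "launch EB street director: start route guidance";
    return "no matching intent: ask the user to rephrase";
}

int main()
{
    std::cout << routeUtterance("Will I need a jacket later today?") << std::endl;
}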

So if you’re in the North Hall at CES this week, be sure to take a virtual ride in the QNX reference vehicle in Booth 2231. Beneath the fresh paint job, it’s the same workhorse it has always been, but now with new ADAS tech automakers are thirsting for.

Monday, January 5, 2015

QNX and Qualcomm Technologies give showgoers another stunner at 2015 CES

Guest post by Nilesh Parekh, Director of Product Management at Qualcomm Technologies

Year after year, CES attendees are amazed by the advances in automobile infotainment. Not so long ago, it was about having a great stereo in the car and maybe a tiny screen in the center stack with a primitive navigation system. Then came Bluetooth connectivity… multiple multimedia screens… front and rear displays… gaming… 3G and 4G connectivity… Wi-Fi hotspots…

This year, QNX Software Systems and Qualcomm Technologies are bringing you something really special — a “mashup,” you could say, of a Maserati Quattroporte GTS, the QNX OS, the QNX CAR Platform, and the Snapdragon™ Automotive Solutions (SAS) platform, all working together in a show-stopping technology concept car.



The QNX concept team worked closely with Qualcomm Technologies to create an immersive in-vehicle experience using advanced technologies for infotainment, digital instrument clusters, and driver assistance systems. These systems feature high-resolution UIs with multi-touch support, 3D graphics for navigation, and LiDAR-based obstacle detection. And note the side mirrors have been swapped for smart displays that eliminate typical vehicle blind spots and present relevant color-coded overlay information to promote safer driving.

Inner beauty
Admittedly, the car is a thing of beauty. But being in the tech field, I find the real beauty inside the car — deep inside. There, working hand-in-hand with the field-proven QNX OS, is the Snapdragon Automotive Solutions (SAS) platform. The SAS platform manages all infotainment features; it also processes vital vehicle safety information, collected via a myriad of camera, ultrasonic, and LIDAR sensors, and delivers all relevant information to the driver in real time — that’s a lot of computing and processing power.

What’s so special about the SAS platform? First, let me define what it is (put on your tech hats): a highly integrated, thermally efficient, automotive-grade platform that incorporates an optimized combination of CPU, GPU, 4G LTE modem, GPS/GNSS, Bluetooth, and Wi-Fi. What’s special is that it is engineered not only to enhance the driver and passenger experiences with the infotainment features we know today, but also to future-proof the vehicle for next-generation features — some of which haven’t even been dreamt of yet. More important, it allows automakers and tier ones to accelerate development schedules and to focus on creating feature-rich, reliable infotainment and safety systems built with solutions such as the QNX CAR Platform.

Let’s take a closer look at the three areas of this special technology concept car where I think the presence of SAS makes the biggest impact: the instrument cluster, the infotainment system, and the driver assistance system. And keep in mind that this vehicle is more than a showcase of what’s “out there” and possible — it’s a test bed we’ll use to gain relevant experience and knowledge that we can apply to future technologies in real cars.

The all-digital, reconfigurable instrument cluster
The cluster, the driver’s go-to information display, can cycle through a number of views on the technology concept car, providing the driver with relevant data on what’s going on in and around the car in real time. Rear-view park assist, current audio track, navigation data, forward-collision warnings, and vehicle data are all examples of information rendered in the cluster:



The infotainment system
You can’t help but notice the 12” portrait touchscreen next to the instrument cluster. The system is built using the QNX CAR Platform for Infotainment — an automotive-hardened software platform built on the QNX OS. The QNX CAR Platform runs on the SAS hardware and implements a sophisticated UI that supports voice recognition and touch (including tap, swipe, and pinch-to-zoom on the map). It also synchronizes with the rear-seat control system, allowing rear-seat passengers to manage navigation, song selection, and temperature settings.

Here's a photo of the touchscreen in action. As you can see, it's displaying map info, an incoming call, and a "Now Playing" section. If you simply tap the map, which is powered by Elektrobit (EB) street director navigation and works with EB electronic horizon, it will automatically take over two-thirds of the screen:



Driver assistance system
The car's driver assistance system uses LIDAR and ultrasonic sensors to detect obstacles around the vehicle and renders warnings to the driver through the cluster and side-view displays, as well as through an obstacle awareness system made up of dashboard LEDs. This system projects color-coded warnings onto the windshield to indicate the location and proximity of the object.
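To make that a little more concrete, here is a hypothetical sketch of how a detection might be reduced to an LED zone and a color. The zones, distance thresholds, and names are invented for illustration; they are not the values used in the concept car.

// Hypothetical sketch of the obstacle awareness mapping: each detection from the
// LIDAR/ultrasonic sensors is reduced to an LED zone (where around the car) and a
// color (how close). Zones, thresholds, and names are illustrative only.

#include <cstdint>
#include <cstdio>

enum class Zone  : std::uint8_t { FrontLeft, Front, FrontRight, RearLeft, Rear, RearRight };
enum class Color : std::uint8_t { Off, Green, Amber, Red };

struct Detection {
    Zone   zone;            // which sector of the car the sensor covers
    double distanceMetres;  // range to the obstacle
};

Color colorFor(const Detection &d)
{
    if (d.distanceMetres < 1.0) return Color::Red;    // imminent: strong warning
    if (d.distanceMetres < 3.0) return Color::Amber;  // close: caution
    if (d.distanceMetres < 6.0) return Color::Green;  // detected, not yet a concern
    return Color::Off;
}

int main()
{
    // A pedestrian detected 2.4 m behind the car would light the rear LEDs amber.
    Detection d{ Zone::Rear, 2.4 };
    std::printf("zone %d -> color %d\n", static_cast<int>(d.zone),
                static_cast<int>(colorFor(d)));
    return 0;
}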

Other highlights include:
  • “Always On” rear-view display — The rear-view mirror has been converted into a display that renders a wide-angle perspective of the area behind the car
  • Elektrobit electronic horizon — Topographical map data is used to provide curve-speed recommendations and warnings that are displayed in the cluster

If you have the opportunity to see this car at CES, I highly recommend it — it really is an amazing technology concept vehicle that showcases the next generation of automobile infotainment and safety. You can find it in the Qualcomm booth in Central Plaza #21A, Jan 6-9. If you cannot make it to CES, you can learn more here.