Wednesday, June 8, 2016

Bringing the power of “and” to the car

QNX unveils a new platform at TU-Automotive Detroit and celebrates an acoustics milestone

Paul Leroux
Some people assume that, when it comes to cars, QNX is mostly about infotainment. Or telematics. Or safety. Or security. But in reality, QNX is about all of these things. So, for a better picture of what QNX brings to the car, simply replace all of those ‘or’s with ‘and’s. For an even better picture, add more things to the list. Like instrument clusters. And hands-free systems. And virtualization.

When you put all of these ‘and’s together, you begin to realize that QNX is a platform for the entire automotive cockpit. So why is that important? Well, more than ever, cars are defined by their software. In fact, automakers are now building cars in which half a dozen systems need a high-level OS. Using a single OS platform for all of those systems can consolidate development efforts, increase interoperability, encourage code reuse, reduce training costs, boost productivity, and just plain make things easier. Of course, it doesn’t hurt if that same platform is also secure, standards-based, and production-proven in over 60 million cars.

So why am I going on about this? Because this week, at TU-Automotive Detroit, QNX is showcasing the full breadth of its automotive technology. Visitors to our booth will see demonstrations of ADAS, instrument clusters, infotainment, acoustics, smartphone integration, V2X, remote SIM management — the list goes on. Highlights include the latest QNX technology concept vehicle, which boasts a voice-controlled instrument cluster (man, I’d love one of those) and acoustics technology that allows a driver to talk to back-seat passengers without having to raise their voice or turn around — even at highway speeds. How cool is that?

That’s me, in the driver’s seat of an SUV, speaking to my colleague Tina, who is sitting in the back row. Thanks to
QNX acoustics technology, she can hear me clearly, even though I am speaking normally and looking straight ahead.

New platform for instrument clusters
Of course, we can’t show up at a major auto event without bringing something new for developers. And so, today, we are unveiling the latest addition to our portfolio of automotive safety products, the QNX Platform for Instrument Clusters.

QNX is already a proven player in the digital cluster market. Since 2009, our OS technology has been powering clusters in brands like Alfa Romeo, Audi, Corvette, Jaguar, and Range Rover. (Check out my recent post for a retrospective on QNX-powered clusters.) The new platform builds on this experience, enabling QNX to offer a comprehensive solution for cluster developers, which includes:

  • The QNX OS for Safety, an ISO 26262-certified OS and toolchain that supports all the automotive safety integrity levels, from ASIL A to D, required for clusters and other critical systems
  • A 2D/3D graphics framework based on the OpenGL standard and set to be certified to the ISO 26262 functional safety standard
  • A software framework that protects safety-critical cluster functions from interference by other software components, enabling greater reliability and easier system-level certification (see the sketch below this list)
  • A reference implementation, with source code, that gives developers a jumpstart on building fully digital instrument clusters
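
To make that isolation idea concrete, here is a minimal POSIX-style sketch — to be clear, not the actual QNX reference implementation, and the binary paths and priority values are hypothetical. The point is simply that the safety-critical telltale renderer runs as its own higher-priority process, so a fault in the non-critical HMI process cannot corrupt its memory or starve it of CPU time.

```cpp
#include <spawn.h>
#include <sched.h>
#include <sys/types.h>
#include <cstdio>

extern char **environ;

// Hypothetical helper: launch one cluster component as a separate OS process
// with its own address space and a fixed scheduling priority.
static pid_t launch(const char *path, int priority)
{
    posix_spawnattr_t attr;
    posix_spawnattr_init(&attr);

    sched_param sp{};
    sp.sched_priority = priority;
    posix_spawnattr_setschedpolicy(&attr, SCHED_RR);
    posix_spawnattr_setschedparam(&attr, &sp);
    posix_spawnattr_setflags(&attr, POSIX_SPAWN_SETSCHEDULER);

    char *const argv[] = { const_cast<char *>(path), nullptr };
    pid_t pid = -1;
    if (posix_spawn(&pid, path, nullptr, &attr, argv, environ) != 0)
        std::perror("posix_spawn");
    posix_spawnattr_destroy(&attr);
    return pid;
}

int main()
{
    // Safety-critical telltales (hypothetical binary) run at the higher priority...
    launch("/cluster/bin/telltale_renderer", 50);
    // ...while the non-critical HMI runs lower and can simply be restarted if it fails.
    launch("/cluster/bin/hmi", 20);
    return 0;
}
```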

To get the full story, check out this morning’s press release.

The digital instrument cluster in the QNX concept vehicle, which is based on a Toyota Highlander. QNX has just
unveiled a new platform that allows instrument clusters with ISO 26262 safety requirements to leverage the
full power of accelerated 2D/3D graphics.

50 million systems, you say?
Hands-free systems may be common, but delivering a high-quality hands-free experience can be notoriously difficult. Cars are noisy beasts, and the cacophony created by tires, fans, vents, and open windows can play havoc with any system that has to process voice signals.

What to do? Well, for over 50 million infotainment and telematics systems, automakers have solved the problem with QNX acoustics technology. QNX acoustics offers patented echo cancellation, noise reduction, and other voice-processing algorithms to ensure crisp, clear voice communications, even in the harsh sonic environment of the car. In fact, it has become so popular that, on average, it ships in an automotive system every 2.5 seconds. (So, can you do the math and tell me how many systems that adds up to each month?)
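
For the curious, here is the back-of-the-envelope math, assuming a steady rate and a 30-day month:

$$ \frac{86{,}400\ \text{s/day}}{2.5\ \text{s/system}} = 34{,}560\ \text{systems/day}, \qquad 34{,}560 \times 30 \approx 1{,}036{,}800\ \text{systems/month} $$

In other words, roughly a million systems every month.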

Did I mention? The QNX acoustics portfolio does far more than process voice signals. For instance, it includes the QNX Acoustics Management Platform, which offers unified management of all acoustics in the car, enabling customers to reduce the cost, complexity, and time-to-production of audio signal-processing systems. For more details, read this morning’s press release.

Tuesday, June 7, 2016

Everything but the kitchen sink

Using a single SoC to drive a full-blown infotainment stack, 3D surround view, driver monitoring, smartphone connectivity, and dual HD displays.

TI and QNX have been working together in the infotainment space for a long time. The nice thing about this ongoing relationship is that lots of cool technology gets built along the way. Speaking of which, TI has put together a compelling demo to show at the TU-Automotive Detroit conference on June 8 and 9. I’m pretty pumped about it and invite you to take the time to check it out.

The demo is built on the DRA75x (Jacinto 6 EP) SoC, which includes dual ARM Cortex-A15 processors, an Imagination SGX544MP2 GPU, dual TI C66x DSPs, and an IVA-HD video processing core. For starters, it runs the QNX CAR Platform for Infotainment with navigation, multimedia, speech recognition — all the goodies you’ve come to expect. Not surprising, as the platform has been running on Jacinto 6 longer than on any other SoC.

One SoC, two displays, many applications
It’s what they’ve managed to pile on beyond the QNX CAR Platform that makes this demo so exciting. You may not know it, but we also work closely with TI on the informational ADAS (infoADAS) front. A full port of the infoADAS stack is available today on the QNX platform, and it’s included in the demo. Using four camera inputs and the processing power available on the C66x and SGX, TI can demonstrate a full 3D surround view concurrently with everything else. And if that weren’t enough, they’ve added a fifth camera and partnered with FotoNation to add driver monitoring and identification, which runs on the second C66x.

Normally, smartphone projection runs on the ARM cores, but for this demo, it runs on the IVA-HD to further demonstrate the capabilities of the chip. And to top it all off, the demo drives two HD displays. One display shows the QNX CAR Platform and the other shows the 3D surround view, along with the driver monitoring and identification.

So, to summarize, on one dual-core ARM A15 part, TI is showing a full-blown high-end infotainment system, driver monitoring, the ability to see everything around the car in real time, and the ability to connect to pretty much any smartphone in the world. Take a second to think back to just 5 years ago. It’s amazing how fast this industry moves.

If you aren’t going to be at TU this year, reach out to TI. I’m willing to bet they’d be happy to show it to you…

Tuesday, May 31, 2016

NXP i.MX 8 DV — alive and kicking

The folks at NXP have really impressed me with how quickly they brought up the new i.MX 8 DV. If you haven’t heard about it, the DV is a development vehicle that NXP introduced in advance of their upcoming family of i.MX 8 processors, and this thing is a beast.

Mapping closely to the upcoming production device, the DV sports dual A72 and quad A53 cores, along with a host of M-cores and dual Vivante GC7000XSVX GPUs. Combined graphics processing jumps sixfold over the previous generation of i.MX devices. The device also has a strong hardware isolation story: 16 partitions are available to map the various hardware blocks on the device and guarantee isolation between them. This architecture greatly facilitates virtualization and even makes it possible to partition hardware independently of a hypervisor.

Why is this so great? Chips this powerful can span multiple displays in the vehicle. You could have an infotainment system and a digital instrument cluster running on a single i.MX8. Because you don’t have to worry about virtualizing a single GPU (which is quite the challenge), you can carve up the chip’s graphics and processing power to isolate the infotainment system from the cluster. This, in turn, minimizes your scope of certification. Achieving ISO 26262 for a cluster is daunting enough; achieving it for a complex infotainment system as well is off the scale.

This device marks a change in how QNX Software Systems and NXP work together. For the first time, NXP is bringing up a new chip on the QNX OS and Linux in parallel. Usually, Linux comes first, but not this time. I am, needless to say, delighted by this level of cooperation between our two companies.

At FTF, NXP demonstrated the i.MX 8 DV, and it looked great.

Advanced 3D graphics on an i.MX 8 DV.

Tuesday, May 24, 2016

A matter of convergence: building digital instrument clusters with Qt on QNX

Tuukka Turunen
Guest post by Tuukka Turunen, Head of R&D at The Qt Company

The Qt application framework is widely used in automotive infotainment systems with a variety of operating system and hardware configurations. With digital instrument clusters becoming increasingly common in new models, there are significant synergies to be gained from using the same technologies for both the infotainment system and the cluster. To be able to do this, you need to choose technologies, such as Qt and QNX, that can easily address the requirements of both environments.

Qt is the leading cross-platform technology for the creation of applications and user interfaces for desktop, mobile, and embedded systems. Based on C++, the Qt framework provides fast native performance via a versatile and efficient API. It’s easy to create modern, hardware-accelerated user interfaces using Qt Quick user interface technology and its QML language. Qt also comes with an integrated development environment (IDE) tailored for application and embedded device development. Leveraging the QNX Neutrino Realtime OS to run Qt provides significant advantages for addressing the requirements of functional safety.

There is a strong trend in the automotive industry to create instrument clusters using digital graphics rather than traditional electromechanical and analog gauges. Unlike the first digital clusters in the 70s, which used 7-segment displays to indicate speed, today’s clusters typically show a digital representation of the analog speedometer along with an array of other information, such as RPM, navigation, vehicle information, and infotainment content. The benefits compared to analog gauges are obvious; for example, it is possible to adapt the displayed items according to the driver’s needs in different situations, or easily create regional variants, or adapt the style of the instrument cluster to the car model and user’s preferences.

A unified experience — for both developers and users
Traditionally, the speedometer and radio have been two very different systems, but today their development paths are converging. Convergence will drive the need for consistency; otherwise, the user experience will become fragmented. To meet the needs of tomorrow’s vehicles, it is essential that the two screens be aware of each other and interoperate. It is also likely that, even as these systems converge, certain items will remain specific to each domain. Furthermore, convergence will help accelerate time-to-market for car manufacturers by offering simplified system design and faster development cycles.

Qt, which is already widely used in state-of-the-art in-vehicle infotainment systems and many other complex systems, is an excellent technology to unify the creation of these converging systems. By leveraging the same versatile Qt framework and tools for both the cluster and the infotainment system, it is possible to achieve synergies in the engineering work as well as in the resulting application. With the rich graphics capabilities of Qt, creating attractive user interfaces for a unified experience across all screens of the vehicle cockpit becomes a reality.


Cluster demonstrator built with Qt 5.6.

Maximal efficiency
Qt has been used very successfully in QNX-based automotive and general embedded systems for a long time. To show how well Qt 5.6 and our latest Qt-based cluster demonstrator run on top of the QNX OS, which is pre-certified to ISO 26262 ASIL D, we brought them together on NXP’s widely used i.MX 6 processor. As the cluster HMI is made with Qt, it runs on any platform supported by Qt, including the QNX OS, without having to be rewritten.

The cluster demonstrator leverages Qt Quick for most of the cluster and Qt 3D for the car model. The application logic is written in C++ for maximal efficiency. By using the Qt Quick Compiler, the QML parts run as efficiently as if they too were written in C++, speeding up the startup time by removing the run-time compilation step.
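
As a rough illustration of that split — a generic Qt pattern, not the demonstrator’s actual source, and the QML file name and property are hypothetical — the C++ side owns the vehicle data and exposes it to QML as a property, while the Qt Quick scene simply binds its gauges to that property:

```cpp
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQmlContext>
#include <QObject>
#include <QTimer>

// Hypothetical backend object: owns the vehicle data and exposes it to QML.
class VehicleData : public QObject
{
    Q_OBJECT
    Q_PROPERTY(double speed READ speed NOTIFY speedChanged)
public:
    double speed() const { return m_speed; }
    void setSpeed(double v) { if (v != m_speed) { m_speed = v; emit speedChanged(); } }
signals:
    void speedChanged();
private:
    double m_speed = 0.0;
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    VehicleData vehicle;
    QQmlApplicationEngine engine;
    // The QML scene (hypothetical file name) binds its gauges to vehicleData.speed.
    engine.rootContext()->setContextProperty("vehicleData", &vehicle);
    engine.load(QUrl(QStringLiteral("qrc:/Cluster.qml")));

    // Simulated data feed so the gauge moves; a real cluster would read the vehicle buses.
    QTimer feed;
    QObject::connect(&feed, &QTimer::timeout, [&vehicle] {
        vehicle.setSpeed(vehicle.speed() < 120.0 ? vehicle.speed() + 1.0 : 0.0);
    });
    feed.start(100);

    return app.exec();
}

#include "main.moc"
```

On the QML side, a gauge needle would bind its rotation to vehicleData.speed, so the HMI updates automatically whenever the C++ backend emits speedChanged().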

The following video presents the cluster demonstrator running on the QNX OS and the QNX Screen windowing system:



The QNX OS for Safety has been certified to both IEC 61508 SIL 3 and ISO 26262 ASIL D, so it provides a smooth and straightforward path for addressing the functional safety certification of an automotive instrument cluster.

Qt 5.6 has been built for the QNX OS using the GCC toolchain provided by QNX Software Systems. The display of the cluster is a 12.3" HSXGA (1280×480) screen and the CPU is NXP’s i.MX 6 processor, which is well-suited to automotive instrument clusters.

Our research and development efforts continue with the goal of making it straightforward to build sophisticated digital instrument clusters with Qt. We believe that Qt is the best choice for building infotainment systems and clusters, and that it is particularly beneficial when used for both. Please contact us to discuss how Qt can be used in automotive, as well as in other industries, or to evaluate the latest Qt version on the QNX platform.

Visit qt.io for more information on Qt.



About Tuukka
Tuukka Turunen leads R&D at The Qt Company. He holds a Master of Science in Engineering and a Licentiate of Technology from the University of Oulu, Finland. He has over 20 years of experience working in a variety of positions in the software industry, especially around connected embedded systems.

Thursday, April 28, 2016

When the rubber ducky hits the road

Paul Leroux
Rubber duckies are born multitaskers. They can serve as bath toys. Or race for charity. Or track ocean currents. Heck, they can even act as crash-test dummies in tiny autonomous vehicles. Don’t believe me? Then check out the following video from MIT’s Computer Science and Artificial Intelligence Laboratory, otherwise known as CSAIL.

Kidding aside, CSAIL has launched a graduate course on the science of autonomy. This spring, students were tasked with creating a fleet of miniature robo-taxis that could autonomously navigate roads using a single on-board camera and no pre-programmed maps. Here is the (impressive) result:



The course looks like fun (and I’m sure it is), but along the way, students learn to integrate multiple disciplines, including control theory, machine learning, and computer vision. Which, to my mind, is just ducky. :-)


Thursday, April 21, 2016

Autonomous cars that can navigate winter roads? ‘Snow problem!

A look at what happens when you equip a Ford Fusion with sensor fusion.

Paul Leroux
Let’s face it, cars and snow don’t mix. A heavy snowfall can tax the abilities of even the best driver — not to mention the best automated driving algorithm. As I discussed a few months ago, snow can mask lane markers, obscure street signs, and block light-detection sensors, making it difficult for an autonomous car to determine where it should go and what it should do. Snow can even trick the car into “seeing” phantom objects.

Automakers, of course, are working on the problem. Case in point: Ford’s autonomous research vehicles. These experimental Ford Fusion sedans create 3D maps of roads and surrounding infrastructure when the weather is good and visibility clear. They then use the maps to position themselves when the road subsequently disappears under a blanket of the white stuff.

How accurate are the maps? According to Ford, the vehicles can position themselves to within a centimeter of their actual location. Compare that to GPS, which is accurate to about 10 yards (9 meters).

To create the maps, the cars use LiDAR scanners. These devices collect a ginormous volume of data about the road and surrounding landmarks, including signs, buildings, and trees. Did I say ginormous? Sorry, I meant gimongous: 600 gigabytes per hour. The scanners generate so many laser points — 2.8 million per second — that some can bounce off falling snowflakes or raindrops, creating the false impression that an object is in the way. To eliminate these false positives, Ford worked with U of Michigan researchers to create an algorithm that filters out snow and rain.
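
To give a feel for how such filtering can work, here is a deliberately naive sketch. It is not the Ford/University of Michigan algorithm, just an illustration of the underlying intuition: returns from snowflakes and raindrops tend to be sparse and isolated, while returns from real surfaces cluster together.

```cpp
#include <cstddef>
#include <vector>

struct Point { float x, y, z; };

// Keep only points that have at least minNeighbours other returns within
// `radius` metres; isolated points are treated as precipitation and dropped.
std::vector<Point> filterPrecipitation(const std::vector<Point>& cloud,
                                       float radius = 0.3f,        // assumed value
                                       std::size_t minNeighbours = 2)
{
    std::vector<Point> kept;
    const float r2 = radius * radius;
    for (std::size_t i = 0; i < cloud.size(); ++i) {
        std::size_t neighbours = 0;
        for (std::size_t j = 0; j < cloud.size() && neighbours < minNeighbours; ++j) {
            if (i == j) continue;
            const float dx = cloud[i].x - cloud[j].x;
            const float dy = cloud[i].y - cloud[j].y;
            const float dz = cloud[i].z - cloud[j].z;
            if (dx * dx + dy * dy + dz * dz <= r2)
                ++neighbours;
        }
        if (neighbours >= minNeighbours)
            kept.push_back(cloud[i]);  // dense neighbourhood: likely a real surface
    }
    return kept;
}
```

A production filter would use a spatial index (and far smarter heuristics) rather than this O(n²) brute-force search, but the principle is the same: drop points that have no dense neighbourhood.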

The cars don’t rely solely on LiDAR. They also use cameras and radar, and blend the data from all three sensor types in a process known as sensor fusion. This “fused” approach compensates for the shortcomings of any particular sensor technology, allowing the car to interpret its environment with greater certainty. (To learn more about sensor fusion for autonomous cars, check out this recent EE Times Automotive article from Hannes Estl of TI.)
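
In its simplest textbook form, fusing two independent measurements comes down to a variance-weighted average: the noisier sensor contributes less to the combined estimate. The snippet below is a generic illustration of that idea, not how Ford or any particular supplier implements it:

```cpp
// One measurement of, say, the distance to the car ahead, with its noise variance.
struct Estimate {
    double value;     // measured distance (metres)
    double variance;  // sensor noise (metres squared); lower means more trustworthy
};

// Variance-weighted fusion of two independent estimates (e.g. radar and camera).
Estimate fuse(const Estimate &a, const Estimate &b)
{
    const double wa = 1.0 / a.variance;
    const double wb = 1.0 / b.variance;
    return { (wa * a.value + wb * b.value) / (wa + wb),
             1.0 / (wa + wb) };  // fused variance is never worse than either input
}
```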

Ford claims to be the first automaker to demonstrate robot cars driving in the snow. But it certainly won’t be the last. To gain worldwide acceptance, robot cars will have to prove themselves on winter roads, so we are sure to see more innovation on this (cold) front. ;-)

In the meantime, dim the lights and watch this short video of Ford’s “snowtonomy” technology:



Did you know? In January, QNX announced a new software platform for ADAS and automated driving systems, including sensor fusion solutions that combine data from multiple sources such as cameras and radar processors. Learn more about the platform here and here.

Tuesday, March 15, 2016

Goodbye analog, hello digital

Since 2008, QNX has explored how digital instrument clusters will change the driving experience.

Paul Leroux
Quick: What do the Alfa Romeo 4C, Audi TT, Audi Q7, Corvette Stingray, Jaguar XJ, Land Rover Range Rover, and Mercedes S Class Coupe have in common?

Answer: They would all look awesome in my driveway! But seriously, they all have digital instrument clusters powered by the QNX Neutrino OS.

QNX Software Systems has established a massive beachhead in automotive infotainment and telematics, with deployments in over 60 million cars. But it’s also moving into other growth areas of the car, including advanced driver assistance systems (ADAS), multi-function displays, and, of course, digital instrument clusters.

Retrofitting the QNX reference
vehicle with a new digital cluster.
The term “digital cluster” means different things to different people. To boomers like myself, it can conjure up memories of 1980s dashboards equipped with less-than-sexy segment displays — just the thing if you want your dash to look like a calculator. Thankfully, digital clusters have come a long way. Take, for example, the slick, high-resolution cluster in the Audi TT. Designed to display everything directly in front of the driver, this QNX-powered system integrates navigation and infotainment information with traditional cluster readouts, such as speed and RPM. It’s so advanced that the folks at Audi don’t even call it a cluster — they call it the virtual cockpit instead.

Now here’s the thing: digital clusters require higher-end CPUs and more software than their analog predecessors, not to mention large LCD panels. So why are automakers adopting them? Several reasons come to mind:

  • Reusable — With a digital cluster, automakers can deploy the same hardware across multiple vehicle lines simply by reskinning the graphics.
  • Simple — Digital clusters can help reduce driver distraction by displaying only the information that the driver currently requires.
  • Scalable — Automakers can add functionality to a digital cluster by changing the software only; they don’t have to incur the cost of machining or adding new physical components.
  • Attractive — A digital instrument cluster can enhance the appeal of a vehicle with eye-catching graphics and features.
     
In addition to these benefits, the costs of high-resolution LCD panels and the CPUs needed to drive them are dropping, making digital instrument clusters an increasingly affordable alternative.

2008: The first QNX cluster
It’s no coincidence that so many automakers are using the QNX Neutrino OS in their digital clusters. For years now, QNX Software Systems has been exploring how digital clusters can enhance the driving experience and developing technologies to address the requirements of cluster developers.

Let’s start with the very first digital cluster that the QNX team created, a proof-of-concept that debuted in 2008. Despite its vintage, this cluster has several things in common with our more recent clusters — note, for example, the integrated turn-by-turn navigation instructions:



For 2008, this was pretty cool. But as an early proof-of-concept, it lacked some niceties, such as visual cues that could suggest which information is, or isn’t, currently important. For instance, in this screenshot, the gauges for fuel level, engine temperature, and oil pressure all indicate normal operation, so they don’t need to be so prominent. They could, instead, be shrunk or dimmed until they need to alert the driver to a critical change — and indeed, we explored such ideas soon after we created the original design. As you’ll see, the ability to prioritize information for the driver becomes quite sophisticated in subsequent generations of our concept clusters.
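
In code, that prioritization could be as simple as the following hypothetical rule (not the actual 2008 implementation): render a gauge at full prominence only when its reading leaves the normal band.

```cpp
// Hypothetical prioritization rule: a gauge earns full prominence only when
// its reading leaves the normal operating band.
struct Gauge {
    double value;
    double normalMin;
    double normalMax;
};

double gaugeOpacity(const Gauge &g)
{
    const bool normal = (g.value >= g.normalMin && g.value <= g.normalMax);
    return normal ? 0.3 : 1.0;  // dim when nothing needs the driver's attention
}
```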

Did you know? To create this 2008 cluster, QNX engineers used Adobe Flash Lite 3 and OpenGL ES.

2010: Concept cluster in a Chevrolet Corvette
Next up is the digital cluster in the first QNX technology concept car, based on a Chevrolet Corvette. If the cluster design looks familiar, it should: it’s modeled after the analog cluster that shipped in the 2010-era ‘Vettes. It’s a great example of how a digital instrument cluster can deliver state-of-the-art features, yet still honor the look-and-feel of an established brand. For example, here is the cluster in “standard” mode, showing a tachometer, just as it would in a stock Corvette:



And here it is again, but with something that you definitely wouldn’t find in a 2010 Corvette cluster — an integrated navigation app:



Did you know? The Corvette is the only QNX technology concept car that I ever got to drive.

2013: Concept cluster in a Bentley Continental GT
Next up is the digital cluster for the 2013 QNX technology concept car, based on a Bentley Continental GT. This cluster took the philosophy embodied in the Corvette cluster — honor the brand, but deliver forward-looking features — to the next level.

Are you familiar with the term Trompe-l’œil? It’s a French expression that means “deceive the eye” and it refers to art techniques that make 2D objects appear as if they are 3D objects. It’s a perfect description of the gorgeously realistic virtual gauges we created for the Bentley cluster:



Because it was digital, this cluster could morph itself on the fly. For instance, if you put the Bentley in Drive, the cluster would display a tach, gas gauge, temperature gauge, and turn-by-turn directions — the cluster pulled these directions from the head unit’s navigation system. And if you threw the car into Reverse, the cluster would display a video feed from the car’s backup camera. The cluster also had other tricks up its digital sleeve, such as displaying information from the car’s media player.

Did you know? The Bentley came equipped with a 616 hp W12 engine that could do 0-60 mph in a little over 4 seconds. Which may explain why they never let me drive it.

2014: Concept cluster in a Mercedes CLA45 AMG
Plymouth safety speedometer, c. 1939
Up next is the 2014 QNX technology concept car, based on a Mercedes CLA45 AMG. But before we look at its cluster, let me tell you about the Plymouth safety speedometer. Designed to curb speeding, it alerted the driver whenever he or she leaned too hard on the gas.

But here’s the thing: the speedometer made its debut in 1939. And given the limitations of 1939 technology, the speedometer couldn’t take driving conditions or the local speed limit into account. So it always displayed the same warnings at the same speeds, no matter what the speed limit.

Connectivity to the rescue! Some modern navigation systems include information on local speed limits. By connecting the CLA45’s concept cluster to the navigation system in the car’s head unit, the QNX team was able to pull this information and display it in real time on the cluster, creating a modern equivalent of Plymouth's 1939 invention.

Look at the image below. You’ll see the local speed limit surrounded by a red circle, alerting the driver that they are breaking the limit. The cluster could also pull other information from the head unit, including turn-by-turn directions, trip information, album art, and other content normally relegated to the center display:



Did you know? Our Mercedes concept car is still alive and well in Germany, and recently made an appearance at the Embedded World conference in Nuremberg.

2015: Concept cluster in a Maserati Quattroporte
Up next is the 2015 QNX technology concept car, based on a Maserati Quattroporte GTS. Like the cluster in the Mercedes, this concept cluster provided speed alerts. But it could also recommend an appropriate speed for upcoming curves and warn of obstacles on the road ahead. It even provided intelligent parking assist to help you back into tight spaces.

Here is the cluster displaying a speed alert:



And here it is again, using input from a LiDAR system to issue a forward collision warning:



Did you know? Engadget selected the “digital mirrors” we created for the Maserati as a finalist for the Best of CES Awards 2015.

2015 and 2016: Concept clusters in QNX reference vehicle
The QNX reference vehicle, based on a Jeep Wrangler, is our go-to vehicle for showcasing the latest capabilities of the QNX CAR Platform for Infotainment. But it also does double duty as a technology concept vehicle. In early 2015, for instance, we equipped the Jeep with a concept cluster that provides lane departure warnings, collision detection, and curve speed warnings. In the image below, the cluster is recommending that you reduce speed to safely navigate an upcoming curve:



Just in time for CES 2016, the Jeep cluster got another makeover that added crosswalk notifications to the mix:



Did you know? Jeep recently unveiled the Trailcat, a concept Wrangler outfitted with a 707 hp Dodge Hellcat engine.

2016: Glass cockpit in a Toyota Highlander
By now, you can see how advances in sensors, navigation databases, and other technologies enable us to integrate more information into a digital instrument cluster, all to keep the driver aware of important events in and around the vehicle. In our 2016 technology concept vehicle, we took the next step and explored what would happen if we did away with an infotainment system altogether and integrated everything — speed, RPM, ADAS alerts, 3D navigation, media control and playback, incoming phone calls, etc. — into a single cluster display.

On the one hand, this approach presented a challenge because, well… we would be integrating everything into a single display! Things could get busy, fast. On the other hand, it puts everything of importance directly in front of the driver, where it is easiest to see. No more glancing over at a centrally mounted head unit.

Simplicity was the watchword. We had to keep distraction to a minimum, and to do that, we focused on two principles: 1) display only the information that the driver currently requires; and 2) use natural language processing as the primary way to control the user interface. That way, drivers can access infotainment content while keeping their hands on the wheel and eyes on the road.

For instance, in the following scenario, the cockpit allows the driver to see several pieces of important information at a glance: a forward-collision warning, an alert that the car is exceeding the local speed limit by 12 mph, and map data with turn-by-turn navigation:



This design also aims to minimize the mental translation, or cognitive processing, needed on the part of the driver. For instance, if you exceed the speed limit, the cluster doesn’t simply show your current speed. It also displays a red line (visible immediately below the 52 mph readout) that gives you an immediately recognizable hint that you are going too fast. The more you exceed the limit, the thicker the red line grows.
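
A hypothetical sketch of that mapping (the parameter values are invented for illustration, not the concept cockpit’s actual tuning):

```cpp
#include <algorithm>

// Thickness of the red warning line, in pixels, as a function of how far the
// driver is over the limit. It grows with the excess and is capped at a maximum.
double warningLineThickness(double speedMph, double limitMph,
                            double pxPerMph = 0.5, double maxPx = 20.0)
{
    const double excess = speedMph - limitMph;
    if (excess <= 0.0)
        return 0.0;  // at or under the limit: no line at all
    return std::min(excess * pxPerMph, maxPx);
}
```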

The 26262 connection
Today’s digital instrument clusters require hardware and software solutions that can support rich graphics and high-level application environments while also displaying critical information (e.g. engine warning lights, ABS indicators) in a fast and highly reliable fashion. The need to isolate critical from non-critical software functions in the same environment is driving the requirement for ISO 26262 certification of digital clusters.

QNX OS technology, including the QNX OS for Safety, is ideally suited for environments where a combination of infotainment, advanced driver assistance system (ADAS), and safety-related information are displayed. Building a cluster with the ISO 26262 ASIL-D certified QNX OS for Safety can make it simpler to keep software functions isolated from each other and less expensive to certify the end cluster product.

The partner connection
Partnerships are also important. If you had the opportunity to drop by our booth at 2016 CES, you would have seen a “cluster innovation wall” showcasing QNX OS technology integrated with user interface design tools from the industry’s leading cluster software providers, including 3D Incorporated’s REMO HMI Runtime, Crank Software’s Storyboard Suite, DiSTI Corporation’s GL Studio, Elektrobit’s EB GUIDE, HI Corporation’s exbeans UI Conductor, and Rightware’s Kanzi UI software. This pre-integration with a rich choice of partner tools enables our customers to choose the user interface technologies and design approaches that best address their instrument cluster requirements.

For some partner insights on digital cluster design, check out these posts: