Automated Test Tools

Part 2 of the blog series on Test Automation
Read Part 1 and continue to Part 3


The first part of this blog series focused on the reasons to introduce test automation in automotive projects, and on how those reasons shape its implementation. This second part continues directly from the lessons learned there: the need to maintain stable test interfaces while keeping product software development and test implementation tightly coupled, and while still allowing product variation. Here we focus on understanding the tools: what we require of them, and how to incorporate the lessons we have learned.

There are common requirements for test automation tools that are just as valid in the automotive field as in any other: ease of use, a well-known test development language, support for all areas that need to be tested and good performance in executing the tests, to name just a few.


Having done away with the separate test development team, and with variants and various features in development concurrently, we face the next challenge: scaling the tools used to develop and run the tests. This is especially important in a company like ours, which works in a distributed manner, in offices and remotely around the world. Distributing the test development requires development tools that all developers can use, as well as access to the test equipment. Developers can either have local test setups or, where possible, remote access to shared test setups. Remote test setups reduce the number of setups needed considerably, but sufficiently tailored remote management features require significant resources.

There are further scalability challenges due to product variants and feature-specific test setups. Concurrent variants may differ slightly in both HW and SW, so each will likely require at least its own test run content, and possibly also a separate HW configuration. Feature-specific test setups are required unless every test setup has all test setup components and is able to run all tests. For example, one setup might use a camera to verify that the display renders without artefacts, while the rest of the setups use a software framework with much better performance for increased test coverage.
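The setup-to-test matching described above can be sketched as a simple capability lookup. This is a minimal illustration, not a real tool: the rig names, capability tags and test names are all hypothetical.

```python
# Hypothetical sketch: route each test only to setups that have the
# equipment it needs (e.g. a camera rig for display checks), so
# feature-specific tests do not block the faster software-only rigs.

SETUPS = {
    "rig-01": {"can", "ethernet", "camera"},      # camera rig for display checks
    "rig-02": {"can", "ethernet", "sw-capture"},  # faster software-framework rig
    "rig-03": {"can", "ethernet", "sw-capture"},
}

TESTS = {
    "test_display_artefacts": {"camera"},
    "test_can_signals": {"can"},
    "test_media_playback": {"sw-capture"},
}

def assign(tests, setups):
    """Map each test to every setup whose capabilities cover its needs."""
    plan = {}
    for test, required in tests.items():
        # set inclusion: the setup must provide all required capabilities
        plan[test] = [rig for rig, caps in setups.items() if required <= caps]
    return plan

plan = assign(TESTS, SETUPS)
# Only rig-01 can run the display test; the CAN test can run anywhere.
```

A real system would also weigh load balancing and queueing on top of this, but even the bare lookup makes the trade-off visible: every capability that only one rig has turns that rig into a bottleneck.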

Lesson learned: Scalability is one major criterion in the selection of test automation tools. We need to be able to answer at least the following questions: How many test systems need to be available for running the tests? How many systems need to be available for test development at the same time? Are the test development systems provided locally to all of the test developers, or are they shared and accessed remotely? Are different variants tested on dedicated or shared test setups? Are there feature-specific test setups, or can every setup run all tests equally?

Test Area Coverage

The next area of interest in test automation tools is the capabilities of the test system and the test case coverage those capabilities allow. Complex automotive devices, such as carputers, are participants in several large ecosystems at the same time. Multiple CAN/LIN buses, Ethernet, various audio interfaces and GPIO link the device with other automotive components. Internet connectivity, USB, Bluetooth and WiFi connect the device to the outside world and users’ other devices. And at the core of the system is the human-computer interaction, linking all of the previous interfaces in one way or another to the whole user experience provided.

Off-the-shelf tools, on the other hand, usually focus on application testing, Hardware-in-the-Loop testing or testing a specific connectivity type. Finding one tool with all the capabilities needed for a complex automotive system is highly unlikely. In practice, all test systems are assembled piecemeal, which leads to not one but multiple "make or buy" decisions, and to more integration effort with more risks, more ways to implement tests and so on.

Lesson learned: No single automated test tool covers all current and future needs. Tools must be extensible, either to cover more ground themselves or to integrate with other tools that can handle the gaps.

Test Automation System Composition and Hidden Work

Test automation systems are like comets from another perspective as well. From afar they look like singular objects, but on closer inspection they are formed not of chunks of ice and rock, but of several features that may each be implemented by one or more tools. A test automation system consists of at least:

  • Test run management
  • Device (pool) management
  • Integration to test planning and reporting tools
  • Integration to device (e.g. flashing, booting, stopping)
  • Test framework (controlling the device behaviour in test cases – anything from software scripts to robots)
  • Test cases
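The features listed above can be pictured as separate components behind small interfaces, which is what makes the piecemeal "make or buy" decisions possible. The sketch below is one hypothetical decomposition; all class and function names are illustrative, not taken from any real tool.

```python
# Hypothetical sketch of three of the listed features as components:
# device integration, device pool management, and test run management.
from abc import ABC, abstractmethod

class DeviceIntegration(ABC):
    """Integration to device: flashing, booting, stopping."""
    @abstractmethod
    def flash(self, image: str) -> None: ...
    @abstractmethod
    def boot(self) -> None: ...
    @abstractmethod
    def stop(self) -> None: ...

class DevicePool:
    """Device (pool) management: lease and release shared test setups."""
    def __init__(self, devices):
        self._free = list(devices)
    def lease(self):
        return self._free.pop() if self._free else None
    def release(self, device):
        self._free.append(device)

class TestRunManager:
    """Test run management: run test cases on a leased device and
    hand each result to a reporting integration (here a callback)."""
    def __init__(self, pool, report):
        self.pool, self.report = pool, report
    def run(self, tests):
        device = self.pool.lease()
        if device is None:
            raise RuntimeError("no free test setup")
        try:
            for test in tests:
                self.report(test.__name__, test(device))
        finally:
            self.pool.release(device)   # return the setup to the shared pool

# Usage: one shared rig, results collected by a simple callback.
pool = DevicePool(["rig-01"])
results = []
runner = TestRunManager(pool, lambda name, ok: results.append((name, ok)))

def smoke_test(device):
    return device == "rig-01"

runner.run([smoke_test])
```

Swapping any one component, for instance replacing the callback with a real reporting-tool integration, leaves the others untouched, which is exactly the extensibility argued for in the previous section.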

Of these features, the integration to test planning and reporting tools, the integration to the device, and the test cases usually need to be implemented, and that effort must be included in any plans or comparisons, no matter whether the test automation system is bought or made, and no matter how the needed adaptation is done.

Lesson learned: Any tool selected will require considerable effort to take into active use. Test management, result reporting and test case implementation form a considerable chunk of work, so they must be factored into any "make or buy" decision.

To sum up

So far we have seen that a test automation system is fragmented: highly likely a mix of different tools held together by a lot of integration effort. There are many roles to fill and many distinct areas that need to be tested, and the industry is undergoing change, so it will take time before integrated, well-standardized solutions become available. Test automation tool development needs to be managed just like any other development area, and it needs to be an ongoing effort, not a one-shot decision. Test tools, whether purchased or developed in-house, need to be evaluated against the planned scope of testing and their suitability to the development model and company practices. Test tools are just that: tools. They do not do the work by themselves once acquired.

Markku Tamski has been connecting the dots from software and systems technology development and industry standardization all through to product solutions for more than 20 years. During this time the work has led Markku to explore a wide variety of technology areas and roles in consumer-facing, software-driven products. In his current role he heads the Delivery Operations teams at Link Motion, including Quality Assurance, Software Builds and Releasing, giving him a most interesting viewpoint on the automotive industry's software development practices in transition. He holds an M.Sc. from Tampere University of Technology.