Challenges and Issues in RF testing for wireless products

Designers often want to test the performance of RF products such as Bluetooth modules as part of the product selection process. However, they may find that the final performance of their products in the field bears little relationship to their lab tests. This article discusses some of the issues around testing RF products, and in particular the way the antenna and the surrounding system interact to affect any measurements.

Read the article in Electronic Product Design & Test online magazine

Importance of testing

It is natural for a designer to want to test a component before committing to using it in a design. This applies particularly to RF components, as firstly they can be critical to the functioning of the end device, and secondly, they possess “analogue” properties, meaning that performance is hard to summarise in a few key figures in a specifications table.

However, what is measured in a simple test setup may bear little resemblance to how the end product performs; this article aims to explain why that might be, and how Design and Test need to go hand in hand to get the results one expects.

A simple test

A simple test of a wireless component, for example a Bluetooth Low Energy module with an integrated antenna, might involve taking a couple of the manufacturer’s evaluation kits or test boards, placing them on tables a suitable distance apart, and then either reading the RSSI (Received Signal Strength Indicator), which is usually easy to obtain from the device, or simply seeing at what distance the boards stop communicating as a range test.
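
As a point of reference for such a test, a rough estimate of the signal strength to expect at a given separation can be made from the Friis free-space equation. The sketch below is a minimal illustration only, assuming free-space propagation, isotropic antennas and an illustrative 0 dBm transmit power (all assumptions on our part, not figures from any particular module); real measurements in an office will typically deviate from it by many dB, which is rather the point of this article.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis): 20*log10(4*pi*d*f/c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def expected_rssi_dbm(distance_m, freq_hz=2.44e9, tx_power_dbm=0.0,
                      tx_gain_dbi=0.0, rx_gain_dbi=0.0):
    """Idealised received power; real boards deviate from this by many dB."""
    return tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m, freq_hz)

# Roughly -40 dBm at 1 m under these idealised assumptions
for d in (1, 5, 10, 50):
    print(f"{d:>3} m: {expected_rssi_dbm(d):6.1f} dBm")
```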

Challenges

Unfortunately, this will tell you rather little about the real-world performance of your final product. The first point is that an RF device has a complex three-dimensional radiation pattern. Whilst antenna designers would typically, for a general-purpose device, aim for an omnidirectional radiation pattern (meaning performance is equal in all directions), this is practically impossible to achieve. Most RF modules have a dip in output power in the plane of the PCB they are mounted on. In fact, even the basic dipole antenna has a null along the axis of the dipole, whilst being omnidirectional around that axis.
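
To make the dipole null concrete, the sketch below evaluates the textbook field pattern of an ideal half-wave dipole (a standard result, not the measured pattern of any specific module): the level is maximal broadside to the element and falls away to a null along its axis.

```python
import math

def dipole_pattern(theta_deg: float) -> float:
    """Normalised field pattern of an ideal half-wave dipole.
    theta is measured from the dipole axis; the pattern is
    omnidirectional around the axis but has a null along it."""
    theta = math.radians(theta_deg)
    if math.isclose(math.sin(theta), 0.0, abs_tol=1e-9):
        return 0.0  # null along the axis
    return math.cos(math.pi / 2 * math.cos(theta)) / math.sin(theta)

for angle in (0, 30, 60, 90):
    f = dipole_pattern(angle)
    level_db = 20 * math.log10(f) if f > 0 else float('-inf')
    print(f"theta = {angle:3d} deg: relative level = {level_db:6.1f} dB")
```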

The example test described above, with two test boards sitting flat on a table, therefore represents something of a worst-case scenario for most devices.

The complexity does not stop there. Some modules will radiate well on the side of the module where the antenna is placed, but poorly in the opposite direction. Unless you do a full three-dimensional characterisation of performance, it is hard to know whether you are seeing a component at its best or its worst.

Test environment matters

The environment in which tests are carried out can also have a major impact. The ideal environment for RF testing is an anechoic chamber, where there are no reflections from the walls, floor or ceiling, but few OEMs have such a facility, and whilst chambers can be rented, it is unlikely to make economic sense to do so. Nevertheless, it is important to bear in mind that a suspended metal floor, as is common in many modern office buildings, can have a big impact on transmission between devices.

Additional factors

We previously noted the three-dimensional nature of an antenna’s performance in a device, but in fact there are more dimensions to account for. Electromagnetic radiation is polarised, so the relative alignment of the polarisation of the transmitting and receiving devices will have another major impact on results. Think of two pairs of polarised sunglasses: turn one through 90 degrees relative to the other and, combined, they become almost opaque. Similarly, in the typical case of two modules with linearly polarised antennas, if one is turned so that they become “cross-polarised”, the RSSI will drop dramatically, even if the two are otherwise favourably aligned.
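
The effect of a polarisation mismatch between two linearly polarised antennas can be quantified with the standard polarisation loss factor, the square of the cosine of the misalignment angle. A small, purely illustrative sketch:

```python
import math

def polarisation_loss_db(mismatch_deg: float) -> float:
    """Polarisation loss factor for two linearly polarised antennas
    misaligned by the given angle: PLF = cos^2(angle), in dB."""
    plf = math.cos(math.radians(mismatch_deg)) ** 2
    if plf < 1e-12:
        return float('-inf')  # in practice reflections and imperfect antennas limit the loss
    return 10 * math.log10(plf)

for angle in (0, 30, 60, 85, 90):
    print(f"{angle:2d} deg mismatch: {polarisation_loss_db(angle):6.1f} dB")
```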

An additional factor is that an RF device will typically operate over a range of frequencies. Bluetooth, for example, operates in the 2.4 to 2.4835 GHz band, with Bluetooth Low Energy hopping across 40 channels (Classic Bluetooth uses 79). Since the protocol specifies frequency hopping across these channels, any RF component needs to be capable of working in any of them. Whilst this is not an enormous range, a particular device may nevertheless work better at one end of it than the other.
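
For reference, the Bluetooth Low Energy channel plan can be written down in a couple of lines; it shows the span over which the antenna and matching network need to remain well behaved.

```python
# BLE defines 40 RF channels at 2 MHz spacing: f_k = 2402 + 2*k MHz, k = 0..39
ble_channels_mhz = [2402 + 2 * k for k in range(40)]
print(ble_channels_mhz[0], ble_channels_mhz[-1])   # 2402 MHz .. 2480 MHz

# The hopping span as a fraction of the (approximate) band centre of 2440 MHz
span_mhz = ble_channels_mhz[-1] - ble_channels_mhz[0]
print(f"hopping span: {span_mhz} MHz (~{100 * span_mhz / 2440:.1f}% fractional bandwidth)")
```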

Testing in isolation

There are also limitations as to what testing an RF component in isolation or on a test board will tell you. The board an RF module is mounted on will also affect its performance, particularly the “ground plane”, the area of grounded conductor on the board. Performance drops off if there is less than about a quarter of a wavelength of ground plane in linear extent (roughly 3 cm for a 2.4 GHz device such as a Bluetooth or Wi-Fi component, or around 9 cm for a sub-GHz device such as a LoRa module). The radiation pattern will also be affected by a ground plane that is many multiples of a quarter-wavelength across, giving rise to a more complex pattern with further dips in transmission performance.
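
The quarter-wavelength figures quoted above follow directly from λ/4 = c/(4f). A two-line check, using 868 MHz as a representative sub-GHz LoRa frequency (an assumption on our part):

```python
def quarter_wavelength_cm(freq_hz: float) -> float:
    """Quarter of the free-space wavelength, in centimetres."""
    c = 3.0e8  # speed of light, m/s
    return 100 * c / freq_hz / 4

print(f"2.44 GHz (Bluetooth/Wi-Fi): {quarter_wavelength_cm(2.44e9):.1f} cm")  # ~3.1 cm
print(f"868 MHz  (LoRa, EU band) : {quarter_wavelength_cm(868e6):.1f} cm")    # ~8.6 cm
```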

The above is a simple rule of thumb, but a real product, especially a complex design with, say, two interconnected boards, may have more complex RF signal paths that influence performance. Other elements, such as casings or potting compounds added for mechanical stability, will also have an impact.

Overall product working environment

The final step is to consider the overall environment in which the product will be expected to work. For example, a wearable device will inevitably be close to the human body. Unfortunately, the human body (to a first approximation, conductive salt water, which has a high dielectric constant and loss factor) is a particularly good absorber of 2.4 GHz radiation, so a device close to the body will perform drastically differently from one sitting in free space. Other kinds of devices will be affected by any metal or other conductors in the environment in which they are intended to operate.

Recommended test approaches

So, given these challenges, what can be done? There are a few possible approaches. Simulation tools, if correctly used, can be a highly effective way to predict the real-world performance of a device. The device and the key features of its board, housing and environment can be modelled and simulated to calculate a link budget between the device and whatever it is intended to connect to. This approach can avoid multiple trial-and-error cycles and show up weak points to be addressed in the design.
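
As a sketch of what a link-budget calculation involves, simplified here to free-space loss plus a lump sum for other losses, and using entirely illustrative numbers rather than values from any real design:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3.0e8)

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   rx_sensitivity_dbm, distance_m, freq_hz, extra_losses_db=0.0):
    """Margin between predicted received power and receiver sensitivity.
    A positive margin suggests the link should close under these assumptions."""
    rx_power = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                - fspl_db(distance_m, freq_hz) - extra_losses_db)
    return rx_power - rx_sensitivity_dbm

# Illustrative only: 0 dBm TX, -90 dBm sensitivity, 10 dB of body/polarisation losses
print(f"{link_margin_db(0, 0, 0, -90, 30, 2.44e9, extra_losses_db=10):.1f} dB margin at 30 m")
```

In a real simulation the antenna gains, pattern dips and losses come from the modelled board and housing rather than being entered as flat numbers, which is precisely where the value of the tool lies.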

This kind of approach can be particularly useful if the device is complex, is required to operate in a challenging environment, or has specification requirements that are close to theoretical limits. In simulation it is much quicker to change aspects of the design and find out whether any particular design choice is causing the performance degradation. There can be cases where small changes have big impacts on performance, but with RF this is not a matter of randomness or some kind of mysterious art, just rather subtle electromagnetic interactions governed by well-defined laws of physics.

For example, in one case we were asked to investigate the poor performance of an RF device. There was no obvious flaw in the board design, so we asked for a full sample of the product. What we discovered through simulation was that a metal clip holding the product together was significantly detuning the antenna. The clip was quite small and not obviously acting as a shield, yet it was having a big impact on performance.

The other approach is simply to conduct real-world tests of prototypes that are close in the key respects to the final product. As explained above, it is important to conduct tests that cover the range of probable or possible relative orientations of the device and whatever it is intended to connect to, and to make sure the environment is realistic; for example, if the product is a wearable, it should be either worn or mounted on something that simulates a body. In many cases the performance may be non-ideal but perfectly good enough for the desired application. For relatively simple applications, or ones where the requirements are not especially demanding, this may be the quickest route to verifying that a device will work.

 

Authors: Nick Wood, VP Sales & Marketing and Chris Barratt, CTO, Insight SiP
