Low Bandwidth Devices

mwp and Configurator low bandwidth device performance evaluation

Documented / instrumented tests of mwp and the INAV Configurator against low bandwidth devices. Tested with an F405 flight controller using a hardware FC UART. Hosted on Arch Linux x86_64, June 2024.

Note that these are artificial bench tests at short range and may not fully represent real world performance.

For the INAV Configurator tests, "wireless" mode was used, as data integrity was a key requirement.

Mission Upload

The test mission comprises 120 points, and the test simply involves:

  • Upload the mission to the flight controller
  • Download the mission from the flight controller
  • Verify that the mission has been correctly "round-tripped"
  • Record the times taken (and any errors).
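
The upload and download steps above boil down to one MSP request/response per waypoint. As a rough illustration only (this is not mwp's code; the message IDs and the 21-byte waypoint layout are my reading of INAV's MSP_SET_WP / MSP_WP and should be checked against the INAV source), the sketch below builds the frames that would go over the wire for a single point, which is why per-packet latency rather than raw baud rate tends to dominate on these links:

```go
// Hypothetical sketch of the per-waypoint MSP exchange behind the mission
// round-trip test. NOT mwp's code; the payload layout is assumed from INAV's
// MSP_SET_WP / MSP_WP and should be verified against the INAV source.
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

const (
	mspWP    = 118 // request one stored waypoint back from the FC
	mspSetWP = 209 // push one waypoint to the FC
)

// frameV1 wraps an MSP v1 request: $M< size cmd payload checksum,
// where the checksum is the XOR of size, cmd and all payload bytes.
func frameV1(cmd byte, payload []byte) []byte {
	buf := []byte{'$', 'M', '<', byte(len(payload)), cmd}
	buf = append(buf, payload...)
	ck := byte(0)
	for _, b := range buf[3:] {
		ck ^= b
	}
	return append(buf, ck)
}

// wpPayload builds the assumed 21-byte MSP_SET_WP body:
// wp_no, action, lat*1e7, lon*1e7, alt_cm, p1, p2, p3, flag.
func wpPayload(no, action byte, lat, lon float64, altCm int32, last bool) []byte {
	var b bytes.Buffer
	b.WriteByte(no)
	b.WriteByte(action)
	binary.Write(&b, binary.LittleEndian, int32(lat*1e7))
	binary.Write(&b, binary.LittleEndian, int32(lon*1e7))
	binary.Write(&b, binary.LittleEndian, altCm)
	binary.Write(&b, binary.LittleEndian, int16(0)) // p1
	binary.Write(&b, binary.LittleEndian, int16(0)) // p2
	binary.Write(&b, binary.LittleEndian, int16(0)) // p3
	flag := byte(0)
	if last {
		flag = 0xa5 // marks the final point of the mission
	}
	b.WriteByte(flag)
	return b.Bytes()
}

func main() {
	// Upload: one MSP_SET_WP per point, typically waiting for the FC's ack
	// before the next is sent (stop-and-wait, hence the sensitivity to latency).
	up := frameV1(mspSetWP, wpPayload(1, 1 /* WAYPOINT */, 50.9, -1.5, 5000, false))
	// Download: one MSP_WP request per index; the FC answers with the stored
	// point, and the round-trip check compares the downloaded list with what was sent.
	down := frameV1(mspWP, []byte{1})
	fmt.Printf("upload frame   %d bytes: % x\n", len(up), up)
	fmt.Printf("download frame %d bytes: % x\n", len(down), down)
}
```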

Test 1 CC2541 BLE

Probably the worst serial device I own. Ridiculously small h/w buffers, very slow. Tests are performed using mwp-ble-bridge -t to generate a TCP server, as the Configurator appears not to recognise BLE devices on Linux. This means all applications are using the same infrastructure to access the device (mwp can access the BLE device directly; using the bridge just makes it the same everywhere, including WSL access to the device).

mwp

Version ID
$ mwp --version --build-id
7.135.494 7997d2bc (development)
Upload to FC
14:35:54.078829 Start mission upload for 120 points
14:36:30.068279 mission uploaded for 120 points

So c. 36 seconds to upload the mission.

Download from FC
14:37:02.154259 Received nav_wp_multi_mission_index: 1
14:37:21.466484 Download completed #120 (0)

So c. 18 seconds to download the mission.

Configurator

Version ID
$ git rev-parse --short HEAD
b717b2b3

Master (8.0.0) as of 2024-05-14

Upload to FC

Visually timed from clicking the icon until it becomes sensitive again: 101 seconds.

Download from FC

Visually timed from clicking the icon until it becomes sensitive again: 44 seconds.

Summary

Both applications coped with the very weak CC2541 device.

Application            Upload   Download
mwp (Linux)            36s      18s
mwp (WSL)              37s      20s
Configurator (Linux)   101s     44s

The Configurator is consistently slower and both applications take longer to upload than download. The WSL overhead is insignificant, particularly considering that there is double virtualisation (Win11 in Linux KVM host running the WSL VM for Ubuntu 22.04).

Test 2 HC-12 Radio

A c. 2016 vintage small form factor / low bandwidth data radio.

Tested with a CP2102 USB to UART Bridge at 9600 baud.

mwp

Version ID
$ mwp --version --build-id
7.135.494 7997d2bc (development)
Upload to FC
15:21:04.409928 Start mission upload for 120 points
15:21:19.521475 mission uploaded for 120 points

So c. 15 seconds to upload the mission.

Download from FC
15:21:40.003163 Received nav_wp_multi_mission_index: 1
15:21:55.023486 Download completed #120 (0)

So c. 15 seconds to download the mission.

Configurator

Version ID
$ git rev-parse --short HEAD
b717b2b3

Master (8.0.0) as of 2024-05-14

Upload to FC

Visually timed from clicking the icon until it becomes sensitive again: 38 seconds.

Download from FC

Visually timed from clicking the icon until it becomes sensitive again: 39 seconds.

Test 3 E45-TTL-100 LoRa

A c. 2017 vintage LoRa transparent serial radio. One of the earlier small form factor commercial LoRa radios, so probably not as efficient as more modern examples.

Tested via a CP2102 USB to UART Bridge at 9600 baud (9600 airspeed).

mwp

Version ID
$ mwp --version --build-id
7.140.407 752b6ad7
Upload to FC
15:50:18.048333 Start mission upload for 120 points
15:50:53.106723 mission uploaded for 120 points

So c. 35 seconds to upload the mission.

Download from FC
16:16:32.215014 Start download for 120 WP
16:16:32.215035 Received nav_wp_multi_mission_index: 1
16:17:07.119474 Download completed #120 (0)

So c. 35 seconds to download the mission.

Configurator

Version ID
$ git rev-parse --short HEAD
71817afc

Master (8.0.0) as of 2024-05-19

Upload to FC

Visually timed from clicking the icon until it becomes sensitive again: 142 seconds.

Download from FC

Visually timed from clicking the icon until it becomes sensitive again: 141 seconds.

Summary

Both applications coped with the old E45-TTL-100 LoRa.

Application            Upload   Download
mwp (Linux)            35s      35s
Configurator (Linux)   142s     141s

The Configurator is again consistently slower. WSL was not tested, but based on earlier results, I would expect mwp / WSL to be similar to the Linux performance.

Test 4 E220-900T22D LoRa

A more recent LoRa transparent serial radio.

Tested via a CP2102 USB to UART Bridge at 9600 baud (9600 airspeed).

Only mwp was tested, requiring 30s for both upload and download, an improvement over the older LoRa device.

Test 5 SiK / 3DR

The old faithful. Rock solid, well buffered. 64k air speed / 115k ground speed via HC05 Bluetooth (legacy RFCOMM).

mwp was tested, requiring 15s for both upload and download.

The INAV Configurator was also tested, requiring c. 160 seconds for both upload and download.

Still the short range wireless champion, at least for mwp.

Test 6 pl2303 USB-TTL at 2400 baud

To establish some ground truth: a wired device running at 2400 baud.

mwp was tested, requiring 20s for both upload and download.

The INAV Configurator was also tested, requiring c. 75 seconds for both upload and download.

So wired is faster at 2400 baud than many wireless options at notionally faster speeds.
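
As a rough sanity check (assuming roughly 27 bytes on the wire for each MSP v1 MSP_SET_WP frame, a ~6 byte acknowledgement, and 10 bits per byte at 8N1; these sizes are my estimate, not measured), the pure serial time for the 120-point mission at 2400 baud is about:

$$ t_{\mathrm{serial}} \approx \frac{120 \times (27 + 6) \times 10}{2400} \approx 16.5\,\mathrm{s} $$

which is close to the observed 20 seconds, so the wired link runs near the line-rate floor; the slower wireless results above are presumably dominated by per-packet latency and buffering rather than raw baud rate.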

Overall impression

Device          Application      Upload   Download
CC2541          mwp              36s      18s
CC2541          mwp (WSL)        37s      20s
CC2541          Configurator     101s     44s
HC-12           mwp              15s      15s
HC-12           Configurator     38s      39s
E45-TTL-100     mwp              35s      35s
E45-TTL-100     Configurator     142s     141s
E220-900T22D    mwp              30s      30s
E220-900T22D    Configurator     -        -
SiK (3DR)       mwp              15s      15s
SiK (3DR)       Configurator     161s     160s
pl2303 (2400)   mwp              20s      20s
pl2303 (2400)   Configurator     76s      75s
  • Both applications work with low bandwidth / inadequately buffered devices.
  • No errors / timeouts / retries were shown in any test or application.
  • WSL performs acceptably. Little impact from either double virtualisation or ser2udp serial bridge.

In all cases, the mission was correctly "round-tripped" through the FC.

To my surprise, Configurator 7.1.1 was also able to complete the HC-12 test (the only device tested with 7.1.1):

Device   Application          Upload   Download
HC-12    Configurator 7.1.1   35s      35s

Faster than Configurator 8.0.0, slower than mwp.

Large message test

In order to test the performance with a single message, the tool msp_testmsg was used. This application sends a single MSP request, logging the payload (typically for analysis). It has a 30 second response timeout.

The following messages are used:

  • MSP2_ADSB_VEHICLE_LIST : Id 0x2090, fixed size of 152 bytes
  • MSP_BOXNAMES : Id 0x74, variable-sized response, typically around 400 bytes, depending on FC configuration.
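
Conceptually, such a probe frames one MSP v2 request, writes it to the device and waits for the matching reply (msp_testmsg uses a 30 second timeout). The sketch below is not msp_testmsg's actual code, just an illustration of standard MSP v2 framing with the two message IDs above; the serial read/write itself is omitted and left to whatever serial library is preferred:

```go
// Rough sketch of a single-message MSP probe: build one MSP v2 request frame
// and print it. A real probe would write the frame to the port and read until
// a complete $X> reply arrives (or the timeout expires), then report the payload.
package main

import (
	"encoding/binary"
	"fmt"
)

// crc8DVBS2 is the checksum MSP v2 uses (polynomial 0xD5, initial value 0).
func crc8DVBS2(crc byte, data []byte) byte {
	for _, b := range data {
		crc ^= b
		for i := 0; i < 8; i++ {
			if crc&0x80 != 0 {
				crc = (crc << 1) ^ 0xd5
			} else {
				crc <<= 1
			}
		}
	}
	return crc
}

// frameV2 builds an MSP v2 request: $X< flag function(u16) size(u16) payload crc,
// with the CRC covering everything from the flag byte to the end of the payload.
func frameV2(function uint16, payload []byte) []byte {
	hdr := make([]byte, 5)
	hdr[0] = 0 // flag
	binary.LittleEndian.PutUint16(hdr[1:3], function)
	binary.LittleEndian.PutUint16(hdr[3:5], uint16(len(payload)))
	body := append(hdr, payload...)
	frame := append([]byte{'$', 'X', '<'}, body...)
	return append(frame, crc8DVBS2(0, body))
}

func main() {
	// MSP2_ADSB_VEHICLE_LIST (0x2090) and MSP_BOXNAMES (0x74), as in the tests above.
	for _, id := range []uint16{0x2090, 0x74} {
		f := frameV2(id, nil)
		fmt.Printf("0x%04x request, %d bytes: % x\n", id, len(f), f)
	}
}
```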

3DR

115200 ground speed, 64k air speed.

$ ./msp_testmsg -d /dev/rfcomm2 0x74
2024/05/29 11:24:40 Using device /dev/rfcomm2
INAV v8.0.0 MATEKF405 (49008f47) API 2.5
 "MISS PIGGY"
MSP 0x74/116 returns OK with payload 392
$ ./msp_testmsg -d /dev/rfcomm2 0x2090
2024/05/29 11:24:54 Using device /dev/rfcomm2
INAV v8.0.0 MATEKF405 (49008f47) API 2.5
 "MISS PIGGY"
MSP 0x2090/8336 returns OK with payload 152

As expected, no problem.

E220 LoRa

9600 ground and air speeds.

$ ./msp_testmsg  -d /dev/ttyUSB0@9600 0x2090
2024/05/29 11:49:12 Using device /dev/ttyUSB0
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x2090/8336 returns OK with payload 152

$ ./msp_testmsg  -d /dev/ttyUSB0@9600 0x74
2024/05/29 11:53:13 Using device /dev/ttyUSB0
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x74/116 returns OK with payload 435

No problem.

E45 LoRa

9600 ground and air speeds.

$ ./msp_testmsg  -d /dev/ttyUSB0@9600 0x2090
2024/05/29 13:03:11 Using device /dev/ttyUSB0
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x2090/8336 returns OK with payload 152
$ ./msp_testmsg  -d /dev/ttyUSB0@9600 0x74
2024/05/29 13:03:32 Using device /dev/ttyUSB0
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x74/116 returns OK with payload 435

No problem.

HC-12

9600 ground and air speeds.

$ ./msp_testmsg  -d /dev/ttyUSB0@9600 0x2090
2024/05/29 13:08:30 Using device /dev/ttyUSB0
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x2090/8336 returns OK with payload 152

$ ./msp_testmsg  -d /dev/ttyUSB0@9600 0x74
2024/05/29 13:08:50 Using device /dev/ttyUSB0
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x74/116 returns OK with payload 435

No problem.

BLE CC2541

9600 UART speed, via mwp-ble-bridge as msp_testmsg is not BLE aware.

$ mwp-ble-bridge -k -a BleTest00
2024-05-29T13:13:06+0100 BLE chipset CC2541, mtu 23 (may not end well)
BleTest00 <=> /dev/pts/2
$ ./msp_testmsg  -d /dev/pts/2 0x2090
2024/05/29 13:13:10 Using device /dev/pts/2
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
MSP 0x2090/8336 returns OK with payload 152
$ ./msp_testmsg  -d /dev/pts/2 0x74
2024/05/29 13:13:29 Using device /dev/pts/2
INAV v8.0.0 WINGFC (27194727) API 2.5
 "BENCHYMCTESTY"
Timeout reached, read 280 (272 payload)

I was surprised that this device survived the 0x2090 test. However, if the UART speed is set to 115200 baud, it fails the 0x2090 test as well.

Large message test summary

Surprisingly good results for all tested devices. I had not really expected many devices other than the 3DR and E220 to complete these tests successfully.

LTM Soak tests

There is a suite for LTM Soak Testing that provides an LTM sink and a source (soak) generator. The soak generator emulates INAV LTM in rate and content, and thus may be used in a bench test environment to validate a low bandwidth device's handling of LTM from a desktop computer.
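
As an illustration, a minimal soak-style generator could look like the sketch below. This is not the mwptools tool itself; the 'A' (attitude) frame layout and the XOR-of-payload checksum reflect my reading of the LTM specification, and the fixed 10 Hz tick is only a stand-in for the real rate scheduling:

```go
// Minimal sketch of an LTM soak-style generator: emit attitude ('A') frames at
// a fixed rate on stdout (which could be redirected to the link under test).
// LTM frame: '$', 'T', function letter, little-endian payload, XOR checksum
// over the payload bytes (as I read the LTM spec).
package main

import (
	"encoding/binary"
	"os"
	"time"
)

// ltmFrame wraps a payload in an LTM frame for the given function letter.
func ltmFrame(fn byte, payload []byte) []byte {
	frame := append([]byte{'$', 'T', fn}, payload...)
	crc := byte(0)
	for _, b := range payload {
		crc ^= b
	}
	return append(frame, crc)
}

func main() {
	tick := time.NewTicker(100 * time.Millisecond) // ~10 Hz, a stand-in for "fast rate"
	defer tick.Stop()
	roll := int16(0)
	for range tick.C {
		// 'A' (attitude) payload: pitch, roll, heading as signed 16-bit degrees.
		p := make([]byte, 6)
		binary.LittleEndian.PutUint16(p[0:2], uint16(int16(5)))  // pitch
		binary.LittleEndian.PutUint16(p[2:4], uint16(roll))      // roll
		binary.LittleEndian.PutUint16(p[4:6], uint16(int16(90))) // heading
		os.Stdout.Write(ltmFrame('A', p))
		roll = (roll + 1) % 180
	}
}
```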

All the viable telemetry devices listed above (3DR, HC-12, Ebyte E45 LoRa, Ebyte E220 LoRa) can sustain LTM fast rate without issues.

The CC2541 cannot sustain LTM at any rate when the generator is directed at the BT side, but can (at all rates) when the generator is directed at the serial side (as would be the usual use case). In any case, it is not a practical telemetry device.
