Categories: DX Commander YouTube

How to A/B Test Antennas Without Fooling Yourself


This video is not about which antenna is best. It’s about how to test antennas in a way that removes bias, guesswork, and misleading results.

– Why time of day matters
– What must stay fixed in a fair A/B test
– Why I used this method
– How I alternated antennas cleanly
– How TX and RX are handled differently

The entire unedited WSPR session is linked so you can see exactly how the test was run.

In the next video, I’ll show the results of a 40 m night-time test where the behaviour of the two antennas really diverges.

ChatGPT Prompt:

Hello ChatGPT.
I have conducted a controlled A/B antenna comparison using WSPR only.
Both antennas were tested:
• at the same station
• on the same band
• at the same power
• within clearly defined time windows
Nothing else was changed.
You are provided with:
1. A WSJT-X WSPR RX log covering the entire session, including timestamps that identify which antenna was in use at each time.
2. WSPRnet TX and RX report exports for two callsigns, each corresponding to one antenna.
Your tasks:
TX analysis:
• Identify which antenna was in use for each transmit period.
• For each antenna, calculate:
o total number of reports
o number of unique reporters
o median SNR (not maximum)
o median distance
• Group TX results into distance buckets:
o short haul (0–2000 km)
o medium haul (2000–4000 km)
o long haul (>4000 km)
• For each bucket, show report count and median SNR.
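The TX tasks above (totals, unique reporters, medians, distance buckets) can be sketched in Python. The column names `SNR`, `Distance`, and `Reporter` are assumptions for illustration; check them against the header of your actual WSPRnet export before use.

```python
from statistics import median

# Assumed column names in the WSPRnet export rows (illustrative only;
# verify against the real CSV header).
SNR_COL, DIST_COL, REPORTER_COL = "SNR", "Distance", "Reporter"

def bucket(km):
    """Map a distance in km to the haul buckets used in the prompt."""
    if km <= 2000:
        return "short haul (0-2000 km)"
    if km <= 4000:
        return "medium haul (2000-4000 km)"
    return "long haul (>4000 km)"

def tx_summary(rows):
    """Per-antenna TX stats: report count, unique reporters,
    median SNR, median distance, and per-bucket counts/medians."""
    snrs = [int(r[SNR_COL]) for r in rows]
    dists = [float(r[DIST_COL]) for r in rows]
    buckets = {}
    for r in rows:
        buckets.setdefault(bucket(float(r[DIST_COL])), []).append(int(r[SNR_COL]))
    return {
        "reports": len(rows),
        "unique_reporters": len({r[REPORTER_COL] for r in rows}),
        "median_snr": median(snrs),
        "median_distance_km": median(dists),
        "buckets": {b: {"count": len(s), "median_snr": median(s)}
                    for b, s in sorted(buckets.items())},
    }
```

Run this once per antenna (one set of rows per callsign) and compare the two summaries side by side.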
RX analysis:
• Using the RX log, separate decodes by antenna and time window.
• For each antenna, calculate:
o total decode count
o median RX SNR
o median distance (if available)
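Separating RX decodes by antenna comes down to mapping each decode’s timestamp onto the switching schedule. A minimal sketch, assuming you have already parsed the RX log into (timestamp, SNR) pairs; the window times below are hypothetical placeholders for your actual switching times:

```python
from datetime import datetime
from statistics import median

# Hypothetical antenna schedule: (start, end, antenna) windows.
# Replace with the real switching times noted during the session.
WINDOWS = [
    (datetime(2024, 1, 1, 20, 0), datetime(2024, 1, 1, 20, 30), "A"),
    (datetime(2024, 1, 1, 20, 30), datetime(2024, 1, 1, 21, 0), "B"),
]

def antenna_for(ts):
    """Return which antenna was in use at timestamp ts, or None."""
    for start, end, ant in WINDOWS:
        if start <= ts < end:
            return ant
    return None

def rx_summary(decodes):
    """decodes: iterable of (timestamp, snr) pairs from the RX log.
    Returns per-antenna decode count and median RX SNR."""
    by_ant = {}
    for ts, snr in decodes:
        ant = antenna_for(ts)
        if ant is not None:
            by_ant.setdefault(ant, []).append(snr)
    return {ant: {"decodes": len(s), "median_snr": median(s)}
            for ant, s in by_ant.items()}
```

Decodes that fall outside any window (e.g. during the changeover itself) are deliberately discarded, which keeps the comparison clean.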
Output requirements:
• Present results in clear tables.
• Do not declare a single winner.
• Explain what each antenna appears to be better suited for – based on the data.
• Focus on differences in behaviour, coverage, and application rather than preference or opinion.
• Base conclusions strictly on the data provided, not on predicted behaviour.