How we test, and our graphs
For our protection ratings this month we draw on real-world tests carried out by not one but two specialist antivirus testing labs – namely AV-Comparatives.org (pcpro.link/322comp) and AV-Test.org (pcpro.link/322test). The scores shown on the preceding pages, and in our graphs below, represent the average of the ratings reported by each lab in their latest reports, dated March 2021 and February 2021 respectively. The number of false positives represents the total number of incorrect alerts raised by each package across both labs’ tests.
To measure the impact of each security product on system responsiveness, AV-Comparatives uses a combination of its own in-house test suite and the PCMark 10 benchmark, while AV-Test times a series of standardised tasks, including launching applications, opening websites and copying files, on both a typical and a high-end PC. We convert these scores into a percentage and take the average to reach our performance rating for each suite on test.
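As a minimal sketch of that final averaging step (the per-lab percentages below are invented for illustration; the magazine's exact conversion from raw lab scores is not specified here):

```python
def performance_rating(lab_percentages):
    """Average per-lab scores that have already been converted to a 0-100 scale."""
    return sum(lab_percentages) / len(lab_percentages)

# Hypothetical figures: AV-Comparatives impact converted to 92%,
# AV-Test timing results converted to 88%.
rating = performance_rating([92.0, 88.0])
print(rating)  # → 90.0
```

An equal-weight mean is assumed here; the labs' raw outputs (PCMark points, task timings) would each need their own normalisation before this step.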
We also run our own speed test, using each suite to scan through an external hard disk containing 55GB of assorted files, including documents, images, application installers – and the EICAR test file, a harmless bit of code that all security packages identify as faux malware for testing purposes. We saw a huge variation in scanning speed: again, you’ll find the details below and next to each review.
While protection and performance are of great importance when choosing a security suite, those aren’t the only things we consider. We extensively road-test each package ourselves, and take into account the accessibility and user-friendliness of the interface, as well as the breadth and usefulness of features beyond the standard scanning component. Naturally, we also weigh up value for money, and all of these factors are combined to award each security suite a star rating out of five.