Author Topic: Why I think AV-Comparatives is more reliable than AV-Test  (Read 756 times)

snadge

  • Guest
Why I think AV-Comparatives is more reliable than AV-Test
« on: 24 September 2014, 11:45:16 »
This is just my opinion:

1) I know first-hand just how much impact McAfee has on a computer, because I install several copies every day at work and watch those machines (especially AMD ones, quad-core or not) slow to a crawl once it is installed. Intel machines handle it better, but the slowdown is still noticeable. AV-Test reports that McAfee has almost no impact on system resources, which is wrong in my experience, while AV-C reports the opposite, which matches what I see at work. AV-Test also rates Avira (ranked no. 1 by both AV-T and AV-C) as having a large impact on performance, the complete opposite of what I see when I install it, and AV-C reports that it has almost no impact.

2) When I check how they do their tests, AV-Test gives only a basic bit of info amounting to a single paragraph; see here:
http://www.av-test.org/en/test-procedures/test-modules/performance/
whereas AV-C publishes full reports and breakdowns of how its procedures are carried out, running to as many as 13 pages; see here:
http://www.av-comparatives.org/wp-content/uploads/2014/06/avc_per_201405_en.pdf

3) ISO certification: http://www.av-comparatives.org/iso-certification/ - AV-Test has no such certification. This means AV-C's testing has been properly and thoroughly examined and deemed fit for purpose.

4) The number of "samples" used: in its detection tests AV-T uses 2,000-20,000, whereas AV-C uses more than 200,000, which gives a much better picture of what is actually catching what (see the rough sketch below).
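To put a rough number on point 4, here is a quick Python sketch (my own illustration, not taken from either lab's methodology; it assumes a simple binomial model and a made-up 99% detection rate) of how the margin of error on a measured detection rate shrinks as the sample set grows:

Code:
import math

def detection_rate_margin(detected: int, total: int, z: float = 1.96) -> float:
    """Half-width of an approximate 95% confidence interval for an
    observed detection rate of detected/total (normal approximation)."""
    p = detected / total
    return z * math.sqrt(p * (1 - p) / total)

# Hypothetical figures: the same 99% observed detection rate, measured
# on a small sample set and on a large one.
for total in (2_000, 200_000):
    detected = int(total * 0.99)
    moe = detection_rate_margin(detected, total)
    print(f"{total:>7} samples: 99.0% +/- {moe * 100:.2f} percentage points")

With 2,000 samples that 99% figure is only good to within roughly +/- 0.4 percentage points, while with 200,000 it is good to within roughly +/- 0.04, so the bigger sample sets carry a lot more statistical weight.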

Off-topic:
It might be worth mentioning VB100 testing and what the VB100 award actually means: an anti-virus earns it by detecting every sample in the test's in-the-wild set without producing a single false positive. To me this should be taken lightly, because there is nothing wrong with having a couple of false positives; it is better to have something flagged for you to check than to let something slip through the net (because heuristics were lowered or tuned in a way that reduces false positives).

Offline CappySpectrum

  • Super-Hero Member
  • ******
  • Posts: 1453
  • Gender: Male
Re: Why I think AV-Comparatives is more reliable than AV-Test
« Reply #1 on: 24 September 2014, 14:34:20 »
I found it shocking, snadge. I had never used McAfee until I sorted out a friend's friend's laptop. The crap people install from bundled software is frightening.

McAfee was draining the battery like nothing else, but worst of all, it wasn't picking up the viruses/spyware.
Originally known as Quasimoto

steve195527

  • Guest
Re: Why I think AV-Comparatives is more reliable than AV-Test
« Reply #2 on: 24 September 2014, 15:21:25 »
Quote from: snadge on 24 September 2014, 11:45:16
The biggest issue I have with the VB100 test is that the vendors know the date of the test, so they are basically pre-warned and can make sure their definitions etc. are perfect on the day. The products that don't attain the perfect score are either bad products or products from vendors who aren't bothered about this test, because they know it isn't really worth worrying about.

 
