When I started reading a post in which a guy simply put 60 Web Application Scanners to the test, I figured he had run a few basic checks. I was completely wrong: the level of detail and the comparisons he put together are genuinely impressive.

Here is how the tests were carried out:

The benchmark focused on testing commercial & open source tools that are able to detect (and not necessarily exploit) security vulnerabilities on a wide range of URLs, and thus, each tool tested was required to support the following features:
· The ability to detect Reflected XSS and/or SQL Injection vulnerabilities.
· The ability to scan multiple URLs at once (using either a crawler/spider feature, URL/Log file parsing feature or a built-in proxy).
· The ability to control and limit the scan to internal or external host (domain/IP).
The testing procedure of all the tools included the following phases:
· The scanners were all tested against the latest version of WAVSEP (v1.0.3), a benchmarking platform designed to assess the detection accuracy of web application scanners. The purpose of WAVSEP’s test cases is to provide a scale for understanding which detection barriers each scanning tool can bypass, and which vulnerability variations can be detected by each tool. The various scanners were tested against the following test cases (GET and POST attack vectors):
  · 66 test cases that were vulnerable to Reflected Cross Site Scripting attacks.
  · 80 test cases that contained Error Disclosing SQL Injection exposures.
  · 46 test cases that contained Blind SQL Injection exposures.
  · 10 test cases that were vulnerable to Time Based SQL Injection attacks.
  · 7 different categories of false positive RXSS vulnerabilities.
  · 10 different categories of false positive SQLi vulnerabilities.
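To make the kind of detection these test cases measure more concrete, here is a minimal sketch (not from the benchmark itself) of the check a scanner typically runs against a Time Based SQL Injection test case: send a benign request, send the same request with a delay payload, and compare response times. The target URL, parameter name, payload, and threshold below are all hypothetical, chosen purely for illustration; WAVSEP itself is a Java/JSP application, and this sketch only illustrates the idea.

```python
import time
import urllib.parse
import urllib.request

# Hypothetical target and parameter -- placeholders, not a real WAVSEP path.
TARGET = "http://localhost:8080/wavsep/testcase"
PARAM = "username"

# Delay-based probe: if the parameter is concatenated into a SQL query,
# the injected sleep should add roughly DELAY seconds to the response.
DELAY = 5
PAYLOAD = f"john' AND SLEEP({DELAY})-- -"  # MySQL-style syntax; other DBs differ


def timed_get(value: str) -> float:
    """Send one GET request with the given parameter value and return elapsed seconds."""
    query = urllib.parse.urlencode({PARAM: value})
    start = time.monotonic()
    with urllib.request.urlopen(f"{TARGET}?{query}", timeout=DELAY * 3) as resp:
        resp.read()
    return time.monotonic() - start


if __name__ == "__main__":
    baseline = timed_get("john")   # benign value to measure normal latency
    probed = timed_get(PAYLOAD)    # same request carrying the delay payload

    # Crude decision rule: flag the parameter if the probe took noticeably
    # longer than the baseline. Real scanners repeat the measurement and use
    # statistical checks to keep false positives down.
    if probed - baseline > DELAY * 0.8:
        print(f"[!] {PARAM} looks injectable (baseline {baseline:.2f}s, probe {probed:.2f}s)")
    else:
        print(f"[-] no timing anomaly (baseline {baseline:.2f}s, probe {probed:.2f}s)")
```

The other WAVSEP test cases differ mainly in how the input is embedded in the page or in the SQL statement (quoted strings, numeric contexts, different encodings and error-handling behaviors), which is exactly what makes them useful for showing which detection barriers each scanner can get past.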

He split the whole post into more than 30 sections; take a look:

1. Prologue
2. List of Tested Web Application Scanners
3. Benchmark Overview & Assessment Criteria
4. Test I – The More The Merrier – Counting Audit Features
5. Test II – To the Victor Go the Spoils – SQL Injection
6. Test III – I Fight (For) the Users – Reflected XSS
7. Test IV – Knowledge is Power – Feature Comparison
8. What Changed?
9. Initial Conclusions – Open Source vs. Commercial
10. Morale Issues in Commercial Product Benchmarks
11. Verifying The Benchmark Results
12. Notifications and Clarifications
13. List of Tested Scanners
14. Source, License and Technical Details of Tested Scanners
15. Comparison of Active Vulnerability Detection Features
16. Comparison of Complementary Scanning Features
17. Comparison of Usability and Coverage Features
18. Comparison of Connection and Authentication Features
19. Comparison of Advanced Features
20. Detailed Results: Reflected XSS Detection Accuracy
21. Detailed Results: SQL Injection Detection Accuracy
22. Drilldown – Error Based SQL Injection Detection
23. Drilldown – Blind & Time Based SQL Injection Detection
24. Technical Benchmark Conclusions – Vendors & Users
25. So What Now?
26. Recommended Reading List: Scanner Benchmarks
27. Thank-You Note
28. Frequently Asked Questions
29. Appendix A – Assessing Web Application Scanners
30. Appendix B – A List of Tools Not Included In the Test
31. Appendix C – WAVSEP Scan Logs
32. Appendix D – Scanners with Abnormal Behavior

More than recommended reading for anyone working in information security, penetration testing, and web applications.
