7,617 Tests Later, and Juniper’s Firewall Stops Threats Better

By Elevate posted 03-18-2015 18:25




As Jonathan Davidson, executive vice president and general manager of the Juniper Development and Innovation organization, mentioned in his November 2014 blog “Juniper’s Security Strategy: Make the Network Resilient, No Matter What,” Juniper’s approach to security is layered and multi-faceted. Our goal is to ensure both protection and resiliency through the use of analytics, security intelligence, multi-threat feeds, and advanced threat protection. Prevention, specifically through firewalls, is still key to cyber security.


With that in mind, we decided to do some in-house testing to see how our firewall solutions measured up against some of our competitors in detecting and stopping attacks. You may find the results interesting.


In January, we took off-the-shelf security testing tools such as Metasploit, BreakingPoint (now owned by Ixia), and Telus Security Labs exploit feeds, and compared our firewall solution’s performance to others in the market. We used the latest software and signature packs from the firewall vendors tested, and we configured each vendor’s product to detect and block all critical, major, and minor attacks.


These tests focused on how well the appliances detected and blocked malicious threats. The size of the competitor’s system, the amount of memory, and the CPU power were irrelevant: whether you test a competitor’s virtual appliance or hardware appliance, the result is the same when it comes to security efficacy. These types of tests are straightforward: do you detect and block the malicious threats or not? 7,617 tests later, the results showed that Juniper’s solutions stop threats. Faster.
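The scoring behind this kind of pass/fail efficacy testing is simple to sketch. The snippet below is a minimal illustration only, not Juniper’s actual test harness; the result format and the CVE identifiers are assumptions made for the example:

```python
# Hypothetical tally of per-test outcomes from an efficacy run.
# Each result is an (attack_id, blocked) pair: one test, one verdict.
from collections import Counter

def block_rate(results):
    """Return the fraction of attacks the device under test blocked."""
    counts = Counter(blocked for _, blocked in results)
    total = counts[True] + counts[False]
    return counts[True] / total if total else 0.0

# Illustrative sample: 3 of 4 attacks blocked -> 75% efficacy.
sample = [("CVE-2014-0160", True), ("CVE-2012-0158", True),
          ("CVE-2008-4250", False), ("CVE-2014-6271", True)]
print(f"block rate: {block_rate(sample):.1%}")
```

The same tally, run over all 7,617 tests per vendor, is what produces the comparative charts below.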


See the charts below. The results may surprise you. We are laser-focused on security here at Juniper, and this is just the tip of the iceberg. Stay tuned to this blog for more exciting insights into Juniper’s security.


Testing Methodology Details: HW/SW version/signature pack:

SRX 3400/ 12.1X46D30/ Juniper IDP Signature Database 2454

PAN 500/ 6.0.3/ Signature pack 454-2355

Fortinet VM/ 5.2.2/ Extended IPS DB: 5.00590






03-22-2015 07:00

@Security First,


Not sure if you missed it, but I did provide a very detailed response to the previous questions about 12 hours before your post.  If you have any further questions or feel that there was something I didn't address, please let me know and I'll be happy to respond.


Best Regards,

Brad Woodberg

03-21-2015 19:40

Juniper should respond to these comments soon or take down this blog post. Shame on you for rigging these tests and then making tall claims.

03-21-2015 11:06

Hi Everyone,


First let me thank you for your profound interest in this blog and our products!  I really appreciate the engagement and the excellent questions.  I'd like to take a moment to provide clarity and address each inquiry above.


Q.  Does it matter that different hardware series (and VMs) were used in this test?

A.  No, this should have no material impact because we were not testing performance (where this would obviously matter) but rather security efficacy.  We used the platforms mentioned out of convenience.  PAN and Fortinet do not advertise any different detection capabilities on their branch or VM products than on their high-end models, and we have not seen any difference in detection (other than the fact that, by default, Fortinet loads only a partial database, so for these tests we enabled their extended database).


Q.  Why were Juniper's and Fortinet's databases from December, while PAN's was from September?

A.  As mentioned, all testing was done in December and early January.  We noticed something strange with PAN's signature download page.  At least according to their download site, sig pack 454-2355 was posted 09/09/2014, but there is a gap between that and the next posted signature pack, 481-2524, which was dated 01/13/2015.  We confirmed this with a third-party PAN customer who saw the same thing.  We were not able to obtain an explanation for the gap, and it was not seen with other updates, such as Anti-Virus, which showed updates during this period.


Q.  Doesn't it matter that PAN's database was 3 months older than Juniper's at the time of the test?

A.  We did not feel that the slight gap was going to have a material impact on the results, and here's why.  Even if we give PAN the benefit of the doubt and assume there was some minor error in their web site's listing of all of their downloads, the fact is that the vulnerabilities in the corpuses from each test tool range from roughly 1998 to 2015.  Though it is clear that the volume of new vulnerabilities is concentrated in the last 5 years, there were probably fewer than 50 new network-based vulnerabilities posted in that 3-month window, and these had a negligible impact compared to the volume of misses we saw.


I'd like to point out another piece of information as food for thought.  At the time of the testing, we also measured how many unique attack objects (signatures + anomalies) each vendor had.  There isn't a 1:1 relationship between the number of attack objects a vendor has and how many vulnerabilities it can cover, nor are the vendors' counts directly comparable because each vendor has a different implementation, but it is still very telling (and supported by the results) to look at how many attack objects each vendor ships (as of December '14).


# of attack objects per vendor:

Juniper: 10,300

PAN: 6,100

Fortinet Default: 4,700

Fortinet Extended Database: 7,700


Also, keep in mind that PAN has had a shipping product for over 7 years now.  Unless they actually improved their IPS between September and December by releasing thousands of new attack objects, the slight gap will have had no material impact.


Q.  What do we mean by “stop threats faster”?  Does this have something to do with the models tested?

A.  What we meant by “stop threats faster” is that our detection technologies are superior and can detect more threats, especially in conjunction with Juniper's automation capabilities.


Thanks again for all of the interest, and I look forward to further discussion!


Best Regards,

Brad Woodberg

03-20-2015 10:06

Where are the metrics to show how you were faster? Were you faster because the Juniper systems being used in the test were high-end models while the Fortinet and Palo systems were VM series or SOHO models?


03-20-2015 08:17

Wow. What a bunch of FUD. Out of date signatures and Juniper, clearly, doesn't know how to use test tools. 7617 "attacks" doesn't account for a corollary number of "tests". Try doing this at load Juniper and share the results - and, oh, might want to update your competition's signature bases / firmware.


As a customer who tests vendors we've found quite the opposite of what's presented here.

03-20-2015 07:12

Not sure about Fortinet, but here's what I've found about Juniper and Palo Alto. Doesn't seem a fair test, but then vendors running their own tests never is!



IDP Signature Database 2454 - 24 Dec 2014

Junos firmware 12.1X46D30 - 15 Jan 2015


Palo Alto

App&Threat Update 454-2355 - 9 Sep 2014

PAN-OS firmware 6.0.3 - 11 June 2014

03-20-2015 06:09

When was this testing done? I'm not that familiar with Fortinet code, but the PAN-OS software version and threat database are over 6 months old. If you completed this test recently using the latest threat database for Juniper -- but old databases for Palo Alto Networks and Fortinet -- I'm not surprised at the results. It's basically a rigged test.

03-19-2015 16:40

Does the hardware series matter when it comes to threat prevention and detection? I see here that SRX3400s were used against PA500 and Fortinet VM. It would be interesting to see a comparison between the 3 on their fully licensed respective VM. vSRX, PA-VM-100, and Fortinet VM.