It’s time to shed light on the following question: are the Next Generation Firewall (NGFW) growth projections up to 2016 realistic? The assertion in question was made just a year ago. Channelnomics wrote the following on the topic:
“By 2016, NGFWs will secure 35 percent of all Internet connections, in what will become a $10 billion market (including VPNs and intrusion prevention systems).”
That’s a feasible outward projection, but it misses the important inward projection: where will NGFWs evolve by 2016? The answer to this question may not be as bright. Here are a number of factors:
(1) NGFWs are complex. You’d be surprised by the number of subsystems that need to work in sync to perform well at Gigabit speeds across all kinds of traffic mixes. Some of them can be specific to a single application type. This is not a bad thing per se, but vendors and some customers tend to forget it. Points (2) and (3) follow from this fact.
(2) NGFWs are not UTMs (see Unified Threat Management). Cramming basic network features into a firewall is a terrible idea, because it makes developing and maintaining them harder for both sides. Something as simple as DNS can eat up a lot of development time, which means higher prices for the customer. The common price-tag argument doesn’t hold up against this; the loss of control, the loss of security and the extra cash needed (especially for things like High Availability) are not worth the trade.
(3) NGFWs are fully automated gatekeepers that don’t really perform well enough against threats, and, let’s be honest, never will. They rely on static intelligence and offer only predefined collection of reporting information. Aggregation slowly crumbles away the last bits of useful information needed in security evaluations. I would argue that’s why you see so many extra features and so much intelligence crammed into one appliance in the first place: to make it look more valuable than it really is.
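The point about aggregation can be made concrete with a toy sketch (my own illustration, not from any vendor’s product; the event fields are hypothetical): once events are rolled up into a predefined counter, the per-host detail an analyst would need for a security evaluation is unrecoverable.

```python
# Illustrative sketch: predefined aggregation discards investigative detail.
# Field names ("src", "verdict", "reason") are hypothetical examples.
from collections import Counter

# Raw events: each one carries the context needed for an investigation.
events = [
    {"src": "10.0.0.5",  "dst_port": 443, "verdict": "blocked", "reason": "bad-cert"},
    {"src": "10.0.0.5",  "dst_port": 443, "verdict": "blocked", "reason": "bad-cert"},
    {"src": "10.0.0.99", "dst_port": 22,  "verdict": "blocked", "reason": "brute-force"},
]

# A predefined report keeps only a count per verdict.
report = Counter(e["verdict"] for e in events)
print(report)  # Counter({'blocked': 3})
# The aggregate says "3 blocked" -- which host, which port and why are gone.
```

The appliance can only answer the questions its report schema anticipated; everything else is lost at aggregation time.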
And how will the next cycle of innovation in the field of NGFW affect the products of the future, the products that will actually be sold as early as 2014? I’ll make two assertions of my own:
(I) Network virtualisation is here to stay and will soon take care of basic networking again, which will in turn shift the focus back to a firewall’s packet processing and inspection capabilities. Most routing, VPN and reporting features will be rendered obsolete, which will also let the new products breathe and grow in directions that were not possible with all of that baggage still in place.
(II) The focus on the firewall itself will shrink gradually in future products. I’ve seen months wasted on irrelevant subsystems and second-tier features. There’s simply no room to expand beyond fine-tuning application detection and pulling in tremendous amounts of external intelligence for content filters, anti-malware capabilities and the like. A more interesting angle will be data analytics (small and big), visibility tools and real-time dynamic monitoring. Even more opportunities to grow may loom in the Cloud, but how to tap them is going to be the integral question of 2013.
Questions and conclusions
Working on such complex projects carries certain risks: is the minor feature I am working on worth the time or not? How many equally good solutions are already out there? Doing it merely for the sake of a competitive edge is probably not worth the time; that’s a lesson I have learned a few times over by now.
And then there’s the ever-evolving market itself, which doesn’t wait for one to be on par with the competition. So why does the market evolve at all? Because people are willing to take the time and chances to build something new. To leave the past behind and tell everyone: Look, this never even worked.
And, hopefully, that’s where the innovation begins to happen. In the case of network security, or the plain NGFW, giving customers the tools to get to know their network may reap far greater benefits than trying to manage the network automatically. The next zero-day attack is never far away and people will always be at risk. What needs to change is how well a customer can understand the impact of an attack on their network infrastructure and data. This needs to be the focus of 2013: bringing the customer and their network closer together. This may also mean cutting out infrequently used features or parting with established ways in usability and visualisation.
Going back to 2011’s assertion, this task is in no way a linear progression over the next few years. What we now see in the NGFW market feels like a dead end in customer usability and network visibility. This, plus the vast amount of closed-source, open-source, proprietary and even free software, will make it impossible to reach the predicted market convergence in the NGFW franchise without redefining or redesigning the NGFW franchise itself from the ground up. However, what follows this dead end is an endless stream of possible futures.
So while others still argue about the question of capitalisation, one should really worry about how to stay competitive and relevant in the market. The tide will come in and wash away all the rusty and dysfunctional parts of current NGFW products. What remains afterwards needs to be polished and caressed to stand out. Are you ready?