Blog: The Critical Need for Accurate Traffic Flow Classification in Your Network

By Ritesh Mukherjee

With nearly half of data breaches originating from within the network [Forrester, Intel], detection only at the network edge is not enough. Encryption of Internet traffic is growing roughly 90% year over year, with about half of today's websites encrypting traffic by default [NSS]. It is expected that 75% of all web traffic will be encrypted by 2019. Most large websites, such as Google, Twitter, and Facebook, use SSL/TLS encryption today.

Accurate classification of traffic flows is an essential step for network administrators: it enables tasks such as enforcing quality of service, detecting threats, and restricting forbidden applications.

There are three broad classification methods used today:
  1. Port Numbers: This method is fast and consumes few resources, but it is only useful for applications and services that use fixed port numbers. It is easily defeated by applications that alter their port numbers.
  2. Deep Packet Inspection (DPI): This method inspects the actual payload of the packet. It is slow and requires a great deal of computing resources. It can be very accurate, except for applications that lack strict signatures, and those signatures must be kept up to date. It does not work when the payload is encrypted unless a proxy decrypts the traffic, and decryption has legal implications because sensitive user information becomes available in plaintext. Financial and healthcare regulations in many countries require PII to remain encrypted and hidden.
  3. Statistical Classification: This method relies on statistical analysis of attributes such as byte frequencies, packet sizes, and packet inter-arrival times. It can be very accurate, but it requires machine learning algorithms, and applications have begun padding packets to defeat it. (A simplified sketch of all three approaches follows below.)
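
To make these three approaches concrete, here is a minimal Python sketch (illustrative only, not production or 128T code) of a classifier that tries a port lookup first, then a payload signature, and finally falls back to a simple statistical heuristic over packet sizes and inter-arrival times. The signatures, thresholds, and field names are hypothetical.

```python
# Illustrative only: a toy three-stage flow classifier.
# Signatures, thresholds, and field names are hypothetical examples.

from dataclasses import dataclass, field
from statistics import mean
from typing import List, Optional

# Stage 1: well-known port numbers (fast, easy to evade).
PORT_MAP = {80: "http", 443: "https", 53: "dns", 25: "smtp"}

# Stage 2: payload signatures for DPI (fails on encrypted payloads).
SIGNATURES = {
    b"\x16\x03": "tls-handshake",   # TLS record header
    b"GET ":     "http",
    b"SSH-":     "ssh",
}

@dataclass
class Flow:
    dst_port: int
    first_payload: bytes                        # first bytes of the first packet
    packet_sizes: List[int] = field(default_factory=list)
    inter_arrival_ms: List[float] = field(default_factory=list)

def classify_by_port(flow: Flow) -> Optional[str]:
    return PORT_MAP.get(flow.dst_port)

def classify_by_payload(flow: Flow) -> Optional[str]:
    for sig, app in SIGNATURES.items():
        if flow.first_payload.startswith(sig):
            return app
    return None

def classify_statistically(flow: Flow) -> str:
    # Toy stand-in for a trained model: small packets at short, steady
    # intervals look interactive; large packets look like bulk transfer.
    if not flow.packet_sizes:
        return "unknown"
    avg_size = mean(flow.packet_sizes)
    avg_gap = mean(flow.inter_arrival_ms) if flow.inter_arrival_ms else 0.0
    if avg_size < 300 and avg_gap < 40:
        return "interactive-or-voip"
    if avg_size > 1000:
        return "bulk-transfer"
    return "unknown"

def classify(flow: Flow) -> str:
    return (classify_by_port(flow)
            or classify_by_payload(flow)
            or classify_statistically(flow))

if __name__ == "__main__":
    flow = Flow(dst_port=3478, first_payload=b"\x00\x01",
                packet_sizes=[120, 140, 130], inter_arrival_ms=[20, 21, 19])
    print(classify(flow))   # no port or signature match; statistical stage decides
```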

The trouble with how traditional firewalls deal with encrypted traffic is especially clear with QUIC (Quick UDP Internet Connections). For example, traditional firewalls are unable to detect Google applications when they operate over QUIC, which results in a loss of visibility and control of those applications.
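
As a rough illustration of why QUIC is awkward for port- and payload-based tools, here is a sketch of the common "UDP to port 443" rule of thumb for spotting likely QUIC flows. This is a simplification, not how any particular firewall implements it; a common mitigation is to block such flows so browsers fall back to TLS over TCP, where existing inspection still works.

```python
# Illustrative only: a coarse heuristic for spotting likely QUIC flows.
# Real engines use far richer analysis than this rule of thumb.

def looks_like_quic(protocol: str, dst_port: int) -> bool:
    """Return True if a flow is plausibly QUIC (UDP to port 443)."""
    return protocol == "udp" and dst_port == 443

# A firewall that cannot classify QUIC can flag or block such flows so
# that browsers fall back to TLS over TCP, restoring visibility.
print(looks_like_quic("udp", 443))   # True: plausibly QUIC
print(looks_like_quic("tcp", 443))   # False: ordinary HTTPS over TCP
```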

Incorporate Best Practices from Multiple Detection Techniques

At 128 Technology, we’ve designed a Session Smart router that analyzes data and control traffic using a rich set of heuristics, such as HTTPS identification, DNS-based classification, and well-known application identification (for example, O365), to identify traffic.
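
To give a flavor of what one such heuristic involves (an illustrative sketch, not 128T's implementation), DNS-based classification can work roughly like this: observe DNS answers, remember which domain each resolved IP belongs to, and later label flows by destination IP without touching the encrypted payload. The cache lifetime and domain rules below are hypothetical.

```python
# Illustrative only: DNS-based flow classification.
# Watch DNS answers, map resolved IPs back to domain names, and label
# later flows by destination IP.

import time

DNS_TTL_SECONDS = 300          # hypothetical cache lifetime
ip_to_domain = {}              # resolved IP -> (domain, learned_at)

def observe_dns_answer(domain: str, resolved_ip: str) -> None:
    """Called whenever a DNS response passes through the router."""
    ip_to_domain[resolved_ip] = (domain, time.time())

def classify_flow(dst_ip: str) -> str:
    """Label a new flow using previously seen DNS answers."""
    entry = ip_to_domain.get(dst_ip)
    if entry is None:
        return "unknown"
    domain, learned_at = entry
    if time.time() - learned_at > DNS_TTL_SECONDS:
        return "unknown"          # stale mapping; re-learn from DNS
    if domain.endswith("office365.com") or domain.endswith("outlook.com"):
        return "o365"
    if domain.endswith("googlevideo.com"):
        return "youtube"
    return domain                 # fall back to the raw domain name

observe_dns_answer("outlook.office365.com", "192.0.2.10")  # example IP
print(classify_flow("192.0.2.10"))                         # -> "o365"
```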

One way to keep the load on the router manageable is to limit the amount of traffic that needs to be analyzed. Any flow through a router only needs to be analyzed once (for a given period). Session Smart routing fits this model very well because it operates on flows rather than on packets. The router can share its analysis with neighboring routers so that they can perform early detection as well.
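
The "classify once, then reuse and share" idea can be sketched as follows (again illustrative, not actual 128T code): keep the result keyed by the flow's 5-tuple, expire it after a period, and advertise fresh results to peers. The publish_to_peers callback is a hypothetical stub for whatever transport the routers use to exchange this information.

```python
# Illustrative only: classify each flow at most once per period and
# share the result with neighboring routers. Sharing is a stub.

import time

CLASSIFICATION_LIFETIME = 600      # hypothetical reuse window, seconds
cache = {}                         # 5-tuple -> (app, classified_at)

def classify_and_share(five_tuple, classify, publish_to_peers):
    """Return the application for a flow, running the expensive
    analysis at most once per CLASSIFICATION_LIFETIME and pushing
    fresh results to peers."""
    now = time.time()
    entry = cache.get(five_tuple)
    if entry and now - entry[1] < CLASSIFICATION_LIFETIME:
        return entry[0]                    # reuse earlier analysis
    app = classify(five_tuple)             # expensive analysis, run once
    cache[five_tuple] = (app, now)
    publish_to_peers(five_tuple, app)      # let neighbors skip the work
    return app

def learn_from_peer(five_tuple, app):
    """Accept a classification advertised by a neighboring router."""
    cache[five_tuple] = (app, time.time())
```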

Fine-Tune Network Service and Performance

Classifying application traffic is an essential step in the network management process for providing highly available network services. The ability to classify applications enables the network to fine-tune performance per service and deliver superior end-user experiences. Early detection of applications, combined with information sharing among routers, gives network administrators superior detection capabilities and the ability to guarantee performance per application type.
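
As a simple illustration of what per-application tuning can look like once classification is in place, the sketch below maps an application label to a DSCP marking and a bandwidth policy. The specific values are common textbook choices (EF for voice, AF classes for business and bulk traffic), not 128T defaults.

```python
# Illustrative only: once a flow is classified, apply a per-application
# policy. DSCP values and limits here are examples, not 128T defaults.

QOS_POLICY = {
    "voip":          {"dscp": 46, "min_kbps": 100, "max_kbps": 200},    # EF
    "o365":          {"dscp": 26, "min_kbps": 500, "max_kbps": None},   # AF31
    "bulk-transfer": {"dscp": 10, "min_kbps": 0,   "max_kbps": 5000},   # AF11
    "unknown":       {"dscp": 0,  "min_kbps": 0,   "max_kbps": None},   # best effort
}

def policy_for(app: str) -> dict:
    """Look up the QoS treatment for a classified application."""
    return QOS_POLICY.get(app, QOS_POLICY["unknown"])

print(policy_for("voip"))   # {'dscp': 46, 'min_kbps': 100, 'max_kbps': 200}
```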

Enter the 128T Networking Platform (128T). The solution does not rely on decryption, which would introduce performance bottlenecks and legal complications. The intelligent use of heuristics, and the ability to continuously add new ones, ensures the 128T solution remains a best-of-breed approach to application classification.

Add Session Smart Technology to your network today to gain full visibility and control of your network.
