From Vulnerable Code to Vulnerable Services & Users


October 1, 2008 02:00 PM

Dark Reading

The New Defense Battle


In a conventional battlefield scenario, military doctrine holds that an attacking force needs a numerical advantage of roughly 3-to-1 to overcome a prepared defense; in other words, defenders can hold their ground against an attacker up to three times their size. Of course, this holds only as long as both the attacking and defending forces are well organized, well synchronized, and controlled by commanders on the battlefield.

In the realm of information security, protection technologies are typically deployed and controlled in a layered-defense approach, much like defensive forces on a real battlefield. Until fairly recently, however, attackers were not nearly as well organized, and most observed cyber attacks were executed by "troops" acting sporadically. Their effectiveness was therefore far from optimal, and defenders did not need the doctrinal force ratios mentioned above to win in combat.

The situation has changed dramatically. Cyber attacks have become increasingly well organized and controlled. Hackers have largely switched from a primary goal of "fun" to one of "profit," and we now see large criminal organizations placing businesses and governments under siege. Organized cyber crime now better resembles how attacks are executed on a real battlefield, and the general success rate of attacks is reaching new heights.

Alongside these developments in attack organization, new attack methods are being developed. Traditional information security methods are usually based on research of application code: the objective is to find flaws in the code that could open a security hole exploitable by hackers, and then to develop a corresponding patch. Sometimes hackers find these holes before the security community does. Either way, standard protection centers on locating vulnerable application code and issuing a patch for it. Hackers are now shifting from vulnerability-based attacks to "non-vulnerability-based attacks" precisely to bypass security technologies that focus on patching vulnerable code. These new attacks do not exploit code flaws, so patching cannot prevent them. They are executed against Internet-connected services and against users; they go unnoticed by existing protection technologies; and they can result in severe service disruption, information theft, and fraud.

One industry that will be considerably impacted by this new attack strategy (organized cyber crime launching non-vulnerability-based attacks) is the financial services market, which we will discuss further throughout this article. Altogether, the newly organized nature of cyber attacks and the new attack methods being developed are leading us to rethink our information security defense strategies and the technologies needed to balance the odds in favor of the defending forces. Otherwise we risk losing battle after battle under this new threat landscape.

How Does it Work and What Is the Impact?

Organized cyber crime has already proven its effect over the last year through attacks such as those on Estonia and, most recently, the distributed denial-of-service (DDoS) attacks on Georgia that overloaded and effectively shut down Georgian servers. These attacks were well organized and demonstrate the power that controlled cyber attacks can bring to bear. When such attacks use new methods, such as the non-vulnerability attack, their impact is even greater. The financial services industry is most at risk, as it currently accounts for over 90% of industry-targeted attacks such as organized phishing scams. Non-vulnerability threats raise the stakes exponentially for financial institutions that are already under concentrated attack.

To understand why, let's first define exactly what a non-vulnerability threat is.

Non-vulnerability-based threats aim to exploit weaknesses in server applications that cannot be defined as vulnerabilities; essentially, they are attempts to misuse software without exploiting any flaw in it. They typically consist of a sequence of legitimate events, generally not associated with unusually large traffic volumes, used to break authentication mechanisms or to scan the application for hidden, confidential files. More sophisticated non-vulnerability application attacks issue well-chosen, repeated sets of legitimate application requests that misuse a server's CPU and memory resources. The effect is a full or partial denial-of-service condition in the application.
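
As an illustration of why such attacks are hard to see, consider the defender's side: no single request is malformed, so only aggregate behavior gives the attack away. Below is a minimal, hypothetical sketch (all class names, fields, and thresholds are invented for illustration, not any product's implementation) of per-client resource-cost accounting that could flag this kind of misuse.

```python
# Hypothetical sketch: per-client accounting of server processing cost,
# to spot clients that obey every protocol rule and send few requests,
# yet repeatedly trigger expensive operations.
import time
from collections import defaultdict

class CostTracker:
    """Tracks cumulative server-side processing cost per client."""

    def __init__(self, window_seconds=60.0, cpu_budget=5.0):
        self.window = window_seconds      # sliding window length
        self.budget = cpu_budget          # allowed CPU-seconds per window
        self.samples = defaultdict(list)  # client_ip -> [(timestamp, cost)]

    def record(self, client_ip, cpu_seconds):
        """Record the measured cost of one (legitimate-looking) request."""
        now = time.monotonic()
        entries = self.samples[client_ip]
        entries.append((now, cpu_seconds))
        # Keep only samples that fall inside the sliding window.
        self.samples[client_ip] = [(t, c) for t, c in entries
                                   if now - t <= self.window]

    def is_suspicious(self, client_ip):
        """True if this client's aggregate cost is abnormal, even though
        its request rate may look perfectly legitimate."""
        return sum(c for _, c in self.samples[client_ip]) > self.budget
```

In use, the server would time each request handler and call record(); even a client sending only a few requests per minute would be flagged once its cumulative cost exceeded the budget.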

To emphasize the difference between a traditional vulnerability-based attack (whether known or zero-minute) and a non-vulnerability-based attack: for the former, it is always possible (sooner or later) to create a signature that represents the malicious code and can be used to block the attack, or to develop an application patch that fixes the underlying flaw. In a non-vulnerability attack, no malicious code exists, so neither an attack signature nor an application patch can exist. Non-vulnerability-based attacks can be executed against server applications, such as online financial transaction services, unnoticed by today's protection technologies, and can thus seriously damage their availability and their trustworthiness in the eyes of users.

Other types of non-vulnerability attacks, such as phishing and pharming, directly impact users. As with the non-vulnerability threats discussed above, a phishing attack does not aim to exploit a vulnerability in a client application but rather to exploit the "vulnerable user": an unsuspecting person who is tempted (fooled) into exposing his credentials or becoming a vehicle for fraud against third parties.

Phishing is certainly not new, and some tools, mainly educational ones, can help prevent it. However, because phishing is no longer practiced only by sophisticated fraudsters who build their own tools, it has become a mainstream type of fraud, with correspondingly greater odds of hitting vulnerable, unsuspecting users. Today, most phishing attacks are hosted on compromised PCs that are part of a botnet, whose owners are unaware that their machines are carriers of fraudulent activity. These botnets are owned by organized crime rings that rent them out for profit, taking fraud to a level not previously seen.

Challenges & Applicable New Technologies

The main state-of-the-art security technologies include high-performance firewalls, anti-virus software, host-based and network-based intrusion prevention systems, and authentication and encryption technologies. Together they form the required layered defense against cyber threats. However, all of these technologies were designed from the bottom up to prevent attacks that violate some kind of deterministic rule.

Firewalls are responsible for allowing or blocking access to applications, servers, and networks based on pre-defined rules. Authentication technologies are responsible for enforcing strict rules on what a user is or is not allowed to do, based on the authorization level issued.

Encryption technologies keep data safe from eyes that are not supposed to see it. Lastly, anti-virus, host-based, and network-based intrusion prevention technologies are responsible for preventing exploitation of known and sometimes unknown (zero-minute) application vulnerabilities. Anti-virus and intrusion prevention technologies rely on pre-defined attack signatures that represent malicious code, or on pre-defined protocol rules whose violation flags attack activity.
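
To make the deterministic model concrete, here is a minimal, hypothetical sketch of signature matching as this paragraph describes it; the byte patterns and function are invented for illustration and do not represent any product's rule set.

```python
# Minimal sketch of deterministic, signature-based inspection.
# The patterns below are invented placeholders, not real IPS rules.
KNOWN_BAD_PATTERNS = [
    b"\x90\x90\x90\x90",   # e.g., a fragment of a NOP sled
    b"' OR '1'='1",        # e.g., a classic SQL-injection probe
]

def matches_signature(payload: bytes) -> bool:
    """Return True if the payload contains any pre-defined pattern."""
    return any(pattern in payload for pattern in KNOWN_BAD_PATTERNS)

# A non-vulnerability attack carries no such pattern: every byte of it
# belongs to a legitimate request, so this check never fires.
```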

Unfortunately, all of these good technologies fall short against the aforementioned non-vulnerability threats, which let hackers blend in with legitimate transactions and comply with every protocol and application rule. Most importantly, for malicious activities such as illegal access, unauthorized use of resources, and vulnerability-based attacks (known and zero-day), it is always possible to create a pre-defined rule or an attack signature that represents the violation or the malicious code. In a non-vulnerability attack, no such violation or malicious code exists.

Another weakness worth mentioning concerns mobile users. In recent years the use of mobile client devices has increased, bringing broader Internet connectivity options (e.g., WiFi hot spots and mobile data services such as GPRS and HSPA in 3G networks). However, these mobile network environments usually lack the structured layers of defense built from the state-of-the-art security technologies described above. Mobile users are therefore more exposed to various attacks, and often become unwitting carriers of malware such as bots (parts of a botnet) and Trojans.

The challenges are substantial. However, newer security technologies, designed with a different approach in mind, are being deployed against these newer threats. The two key requirements for these technologies can be defined as follows.

  1. New technologies should adopt an approach that learns and identifies abnormal service-usage behavior, rather than one that simply looks for illegal access or for malicious code carried in traffic via attack signatures. Abnormal behavior should be analyzed using both rate-based and rate-invariant traffic parameters, so that low-rate abnormal activities are also detected. Upon identifying abnormal behavior, the technology must be able to automatically create, in real time, a new type of "attack signature" that represents behavior rather than code. This "automatic real-time signature" differs from the traditional attack signature in two major ways. First, it must be generated automatically in real time; otherwise the attack will have time to execute successfully. Second, it is effective only in a particular network and application environment, because usage that is abnormal in one environment can be completely normal in another. This reduces false positives, i.e., unwanted prevention of legitimate activity.
  2. The same logic should be applied to already-infected mobile users. The security technology should not only analyze traffic toward the online service residing on the application servers, but should also be able to identify mobile users who are unwitting carriers of bots and Trojans.

To identify them effectively, the new security technology must detect the abnormal activities that already-infected hosts generate, flagging unusual client-side behavior. These activities are usually non-vulnerability attacks such as phishing (i.e., there is no malicious code to identify), so the "automatic real-time signature" defined above is also needed to mitigate them. The sketch below illustrates the overall behavioral approach.
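
The following minimal sketch illustrates the idea behind both requirements: learn an environment-specific baseline of rate and rate-invariant parameters, then emit a behavioral "real-time signature" when traffic deviates from it. Thresholds, field names, and the signature format are assumptions made for the example, not a description of any vendor's implementation.

```python
# Illustrative sketch of behavioral detection: learn a per-environment
# baseline, flag deviations, and emit a "signature" describing the
# abnormal behavior rather than any malicious code.

class BehavioralDetector:
    def __init__(self, deviation_factor=3.0):
        self.deviation_factor = deviation_factor
        self.baseline_rate = None  # requests/second (rate parameter)
        self.baseline_mix = None   # URL share distribution (rate-invariant)

    def learn(self, rate, url_counts):
        """Record normal behavior for this specific environment."""
        total = sum(url_counts.values())
        self.baseline_rate = rate
        self.baseline_mix = {u: n / total for u, n in url_counts.items()}

    def check(self, rate, url_counts):
        """Return a behavioral 'real-time signature' dict if abnormal."""
        total = sum(url_counts.values())
        mix = {u: n / total for u, n in url_counts.items()}
        # Rate-invariant check: has the request mix shifted sharply toward
        # one URL, regardless of overall volume? Catches low-rate attacks.
        for url, share in mix.items():
            normal = self.baseline_mix.get(url, 0.0)
            if share > max(normal * self.deviation_factor, 0.2):
                return {"type": "behavioral", "url": url,
                        "observed_share": round(share, 3),
                        "baseline_share": round(normal, 3)}
        # Rate check: gross volume anomaly (the classic flood case).
        if rate > self.baseline_rate * self.deviation_factor:
            return {"type": "volumetric", "rate": rate}
        return None
```

Because the baseline is learned per environment, the same traffic pattern can yield a signature on one network and pass unremarked on another, which is exactly the environment-specific property described in requirement 1.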

Summary

Companies recognize that they must be prepared to respond to new attack methods that hackers are constantly developing. In the past, the odds of a successful defense were in their favor, but the tide has turned. The rise of highly organized cyber-crime rings, combined with the emergence of non-vulnerability attacks, has changed the balance between attacking and defending forces in a way that demands radically new technological approaches to information security. Organizations once relied on the odds that their networks and services would not be the next target; today that assumption is ever more likely to be proven wrong.

Security technologies must become "smarter" and faster to keep communication lines and online services secure and reliable in the new threat landscape. That said, meeting these new requirements does not call for a replacement of existing solutions, but for a complement to them. Existing, more deterministic solutions such as access control, encryption, and signature-based systems remain crucial to maintaining a secure network. Together with behavioral-based solutions that adapt to the new attack approaches, they cover more threats than either could cover on its own.

Lastly, it is important to understand that information security technologies are not a panacea and cannot address every security risk; there is no such thing as a "complete" security solution. The technologies described in this article are a relatively new development and should be complemented by sound internal security policies, user education and training, and proactive research. Applied correctly, however, this new approach can help even the odds in the coming battles.

Note: Rate-invariant traffic parameters are behavioral parameters whose characteristics are not influenced by the dynamics of traffic volume; for example, the relative mix of request types tends to stay roughly constant as overall volume rises and falls.
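
As a concrete (and entirely invented) numerical illustration of the note above: the request-type mix is rate-invariant because it stays roughly constant when traffic volume grows tenfold, whereas the raw request rate does not.

```python
# Invented numbers illustrating a rate-invariant parameter: the request
# mix barely changes when volume grows tenfold, so it makes a stable
# baseline that is independent of load.
quiet_hour = {"GET": 900,  "POST": 80,  "HEAD": 20}    # 1,000 requests
busy_hour  = {"GET": 9100, "POST": 780, "HEAD": 120}   # 10,000 requests

def mix(counts):
    total = sum(counts.values())
    return {k: round(v / total, 2) for k, v in counts.items()}

print(mix(quiet_hour))  # {'GET': 0.9,  'POST': 0.08, 'HEAD': 0.02}
print(mix(busy_hour))   # {'GET': 0.91, 'POST': 0.08, 'HEAD': 0.01}
```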
