
Showing posts from 2008

Malware Trend in 2007

I read the report IBM Internet Security Systems X-Force 2007 Trend Statistics, which describes trends for various threats in 2007. The X-Force team has been tracking these trends since 2000, and I found the report quite interesting. In the rest of this post, I highlight some of its interesting points and what they mean in the context of malware detection. (I) The X-Force team reports continued growth in Web browser exploitation. This clearly shows that the infection vector is shifting to the Web; earlier, the primary infection vectors were email and the network. Therefore, for detecting malware, drive-by downloads (DBD) and other threats aimed at compromising the Web browser need a lot of attention. (II) X-Force also reports a marked increase in obfuscated exploits, i.e., exploits that use various code obfuscation techniques (such as encryption). Here is a quote: "X-Force estimated that nearly 80 percent of Web exploits used obfuscation and/or self decryption …
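To make the obfuscation point concrete, here is a minimal sketch of my own (not taken from the report) of a XOR-based self-decoding payload: the plaintext a signature writer would target never appears in the bytes that actually cross the wire.

    # Toy illustration of a self-decrypting payload (harmless placeholder text, not a real exploit).
    def xor_encode(data: bytes, key: int) -> bytes:
        """Obfuscate (or, applied again, de-obfuscate) data with a single-byte XOR key."""
        return bytes(b ^ key for b in data)

    plaintext = b"document.write('<iframe src=evil>')"   # hypothetical payload text
    key = 0x5A
    delivered = xor_encode(plaintext, key)               # what actually crosses the wire

    # A naive signature match against the wire bytes fails ...
    assert plaintext not in delivered

    # ... because the payload only reappears once the embedded decoder runs in the browser.
    assert xor_encode(delivered, key) == plaintext       # XOR is its own inverse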

Zero Day Threat by Acohido and Swartz

I read the book Zero Day Threat (ZDT) by Byron Acohido and Jon Swartz. I really liked the book! Zero Day Threat is about the underground cyber-economy, and it makes some surprising points grounded in real truths. I liked that the book paints a complete picture, i.e., how malware, identity theft, and "drop off" gangs collaborate to sustain a well-oiled cyber-economy. Since my research area is security, I was very familiar with the different types of malware brought up in Zero Day Threat; however, this book gave me a complete picture of the problem. I particularly appreciated two features of the book. Structure: each chapter is broken into three sections: exploiters, enablers, and expediters. Exploiter sections focus on crooks (such as scam artists and drug addicts) and how they benefit from the underground economy. The Enabler sections focus on credit card companies, banks, and credit bureaus, and how their current practices enable the underground cyber-economy. Expediters a…

Botnets in USA Today

I got a call from Byron Acohido over at USA Today last weekend, and we had an interesting talk about botnets. Byron and Jon Swartz ended up writing an article about botnets, which appeared as the cover story in the Money section of USA Today on March 17, 2008. Here's a link to the full story (link). I found the entire article to be a fascinating read on the nature of botnets. Here are some of the highlights, but definitely go and read the entire article. On a typical day, 40% of the 800 million computers connected to the Internet are bots engaged in various nefarious activities, such as spamming, stealing sensitive data, and mounting denial-of-service attacks. Think about it: approximately 320 million computers are engaged in these illicit activities! Later on, the article describes various features of Storm, the state of the art for botnets. Storm introduced various innovations into the bot landscape, such as using P2P-style communication to converse with the bo…
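As a rough, hypothetical illustration of why the P2P design matters (a toy model of my own, not Storm's actual protocol): with a single command-and-control server, one takedown silences every bot, whereas bots that keep peer lists can still reach commands through surviving peers.

    # Toy comparison of centralized vs. P2P-style command channels (not Storm's real protocol).
    def reachable_centralized(bots, cc_server, taken_down):
        """Bots that can still fetch commands from a single C&C server."""
        return set() if cc_server in taken_down else set(bots)

    def reachable_p2p(peer_lists, seeds, taken_down):
        """Bots that can still reach a command source by walking their peer lists."""
        reached = set(seeds) - taken_down
        frontier = list(reached)
        while frontier:
            bot = frontier.pop()
            for peer in peer_lists.get(bot, []):
                if peer not in reached and peer not in taken_down:
                    reached.add(peer)
                    frontier.append(peer)
        return reached

    bots = {"a", "b", "c", "d"}
    peers = {"a": ["b", "c"], "b": ["c", "d"], "c": ["d", "a"], "d": ["a", "b"]}

    print(reachable_centralized(bots, "cc", taken_down={"cc"}))   # set(): one takedown silences everything
    print(reachable_p2p(peers, seeds={"a"}, taken_down={"b"}))    # {'a', 'c', 'd'}: commands route around the loss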

Model Checking and Security

Model checking is a technique for verifying temporal properties of finite-state systems. One of the attractive features of model checking over other techniques (such as theorem proving) is that if a property is not true, a model checker provides a counter-example explaining why. The inventors of model checking, Edmund Clarke, Allen Emerson, and Joseph Sifakis, won the 2008 ACM Turing Award (see the announcement here). I have a personal connection to two of the recipients: Edmund Clarke was my adviser at Carnegie Mellon, and Allen Emerson and I have collaborated on a few projects and he has supported me throughout my career. In this note I try to summarize various applications of model checking to security. Protocol verification: protocols in the realm of security (henceforth referred to as security protocols) are very tricky to get right. For example, flaws in authentication protocols have been discovered several years after they were published. Techniqu…
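To illustrate the counter-example feature, here is a tiny explicit-state reachability check of my own (not one of the tools alluded to above): it explores a small, made-up access-control model and, when a bad state is reachable, returns the trace leading to it, which is exactly the kind of diagnostic a model checker hands back.

    from collections import deque

    def check_safety(initial, transitions, is_bad):
        """Breadth-first reachability over a finite-state system.
        Returns None if no bad state is reachable, otherwise a
        counter-example trace (list of states) leading to one."""
        queue, visited = deque([[initial]]), {initial}
        while queue:
            path = queue.popleft()
            if is_bad(path[-1]):
                return path                               # the counter-example
            for nxt in transitions.get(path[-1], []):
                if nxt not in visited:
                    visited.add(nxt)
                    queue.append(path + [nxt])
        return None

    # A made-up access-control model: states are (location, authenticated?).
    # Safety property: the resource is never reached while unauthenticated.
    transitions = {
        ("start", False): [("login", True), ("guest", False)],
        ("login", True):  [("resource", True)],
        ("guest", False): [("resource", False)],          # the flaw: guests reach the resource
    }
    trace = check_safety(("start", False), transitions,
                         is_bad=lambda s: s[0] == "resource" and not s[1])
    print(trace)   # [('start', False), ('guest', False), ('resource', False)]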

Cooperating Detectors

A malware detector tries to determine whether a program is malicious (examples of malicious programs are drive-by downloads, botnets, and keyloggers). Malware detection is primarily performed at two vantage points: the host and the network. This post explains why cooperation between host-based and network-based detectors is a good thing. Traditionally, detection has been performed either at the network or the host level, but not both. First, let me examine both approaches separately. A network-based detector monitors events by examining a session or network flow and tries to determine whether it is malicious. The advantage of a network-based detector is ease of deployment -- there are not that many points of deployment for a network-based detector (typically they are deployed behind border routers). Unfortunately, network-based detectors have a limited view of each network session. In fact, if a session happens to be encrypted, as is common with VPNs, Skype, and some bots, a network-based de…
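Here is a minimal sketch of what cooperation could look like; the detectors, fields, and thresholds below are all hypothetical, but the idea is that the network side contributes a per-flow suspicion score, the host side contributes process-level context for the same connection, and a combined verdict draws on both, which covers the encrypted-session case where the network view alone is blind.

    # Toy sketch of host/network detector cooperation (all fields and thresholds are made up).
    from dataclasses import dataclass

    @dataclass
    class FlowReport:                  # produced by a network-based detector
        dst_port: int
        encrypted: bool
        suspicion: float               # 0.0 (benign) .. 1.0 (malicious)

    @dataclass
    class HostReport:                  # produced by a host-based detector on the endpoint
        process: str
        signed_binary: bool
        reads_keystrokes: bool

    def combined_verdict(flow: FlowReport, host: HostReport) -> bool:
        """Flag a connection as malicious using evidence from both vantage points."""
        score = flow.suspicion
        if flow.encrypted:
            # The network view is limited here, so lean on host-side context.
            score = max(score, 0.9 if host.reads_keystrokes else 0.2)
        if not host.signed_binary:
            score += 0.3
        return score >= 0.8

    flow = FlowReport(dst_port=443, encrypted=True, suspicion=0.1)
    host = HostReport(process="updater.exe", signed_binary=False, reads_keystrokes=True)
    print(combined_verdict(flow, host))    # True: host context exposes what the network view missed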

Case for kernel-level detection

Why kernel-level detection? These are my thoughts on why malware detection should be performed at the kernel level. In general, the lower in the system hierarchy your detector resides, the harder it is for an attacker to evade it. For example, if a detector uses system-call interposition, an attacker can evade it by directly using kernel calls. System-call interposition can be done on Windows using the following package. In my conversations with a guy from the NSA (name withheld for obvious reasons :-)), he confirmed that the new malware they are observing in their lab uses kernel calls directly. Also, look at the following article. The semantic-gap problem: a natural question that comes to mind is: why not perform detection at an even lower layer in the hierarchy, say the VM layer, or even better, in hardware? As you move down the system hierarchy, you lose some high-level semantics. Let me explain. Let's say you are doing detection at the VM layer. A high-leve…
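To make the evasion argument concrete, here is a small sketch that uses Python as a stand-in for user-level interposition (not the Windows package mentioned above; it assumes a Linux/glibc system): a monitor that wraps the high-level file-open API only sees calls that pass through the wrapper, while a caller that drops down to the C library directly slips underneath the hook, which is the same layering argument that favours sitting in the kernel.

    # Why hooking a high-level API can be bypassed (Python stand-in; assumes a Linux/glibc system).
    import builtins
    import ctypes

    observed = []                       # what the "detector" gets to see
    _real_open = builtins.open

    def monitored_open(path, *args, **kwargs):
        """Interposed open(): log the access, then forward to the real call."""
        observed.append(str(path))
        return _real_open(path, *args, **kwargs)

    builtins.open = monitored_open      # install the hook at the high-level API

    # A well-behaved program goes through the hook and is observed.
    with open("/etc/hostname") as f:
        f.read()

    # A program that calls the C library directly slips underneath the hook.
    libc = ctypes.CDLL(None)
    fd = libc.open(b"/etc/hostname", 0)     # 0 == O_RDONLY
    libc.close(fd)

    print(observed)                     # ['/etc/hostname'] -- only the hooked call was seen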