There are several problems with malware protection that relies on signatures.
Where does this leave us? Host-based antivirus products are using up more and more CPU cycles to process an ever-growing list of viruses, yet are still unable to keep up with the onslaught of new malware. To make matters worse, the constant creation and release of new definition files is stressing the quality assurance (QA) process for antivirus vendors. We have reached the point where IT professionals are considering turning off automatic AV updates and deploying labs to test the updates before release.
In short, the odds of timely detection continue to drift downward ever so slowly, while the risk of friendly fire from the AV solution itself creeps upward ever so steadily. (The McAfee update issue had an impact on its clients that rivaled a major virus attack.)
We are long overdue for a different approach.
Companies such as Bit9, CoreTrace, and Lumension have been pushing application whitelisting for years now. Microsoft has also provided this capability via Software Restriction Policies, and more recently via AppLocker in Windows 7 and Server 2008 R2. Even some of the major AV players have purchased or developed application whitelisting technology, but they have not been actively pushing it into the mainstream. They need to start.
Better yet, we as IT leaders and professionals need to start evaluating and deploying the technologies that better address information security concerns in 2010 and beyond, allowing us to make better use of our limited budgets and resources.
Application whitelisting is a good idea because, in any environment, far fewer items fall into the "known good" category than into the universe of bad code you don't want to run. Just consider the difference between a firewall rule-set that assumes a "deny all that has not been explicitly opened" stance and one that tries to explicitly block every bad protocol and port.
The frequency of change in the "allow list", particularly in corporate environments, will be greatly reduced as compared to the "bad list". This automatically minimizes the chance for error. It also means that the processing power needed to evaluate the former list will be far less than that needed to evaluate the current lists of malware in today's signature-based AV products.
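The argument above can be made concrete with a minimal sketch. The hash value and file name below are illustrative, not taken from any real product: an allow-list check is a single constant-time set lookup, regardless of how much malware exists in the world, whereas a signature scanner's work grows with the size of its definition file.

```python
import hashlib

# Hypothetical allow list: SHA-256 digests of the few binaries an
# organization has approved. The entry below is the SHA-256 of empty
# input, used here purely as a placeholder value.
ALLOWED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_allowed(path: str) -> bool:
    """Default-deny check: a binary runs only if its hash is on the allow list."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # One set-membership test, no matter how many new malware strains
    # appeared today -- the list that must change is the small "good" one.
    return digest in ALLOWED_HASHES
```

Anything not on the list is simply refused, which inverts the burden of proof that signature-based scanning assumes.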
Mitigating Code-Enabled Data
I think that we really have to weigh the risks posed by code-enabled data files and either abandon them outright (cue lots of whining), or at least ensure that there are centrally controlled configuration options for enabling or disabling the automation features of productivity applications.
For instance, consider how diminished the threat of macro-embedded documents has become since Microsoft enabled much better controls over macro security, including turning them off by default, and allowing them to be set via policies. Remember when macro viruses were the most common threat vector? We need to do the same for PDF exploits.
Getting a better handle on security at the host level entails not only controlling which applications can run, but determining in what context, and with what functionality, they can run at any given time. If we can get vendors to provide us with centralized controls regulating all of the features they integrate into their apps, then each person and each organization can determine what level of risk to assume for any given application - and in the event of an emergency, the problematic feature can be disabled or otherwise restricted as a stopgap.
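A minimal sketch of what such a centrally pushed policy could look like. The application and feature names here are hypothetical, not any vendor's actual settings: the point is that an administrator flips one central flag and the risky feature is off everywhere at the next policy refresh.

```python
# Hypothetical central policy, as an admin might push it to all hosts.
# Unknown applications and unlisted features default to "off".
CENTRAL_POLICY = {
    "pdf_reader": {"javascript": False, "embedded_files": False},
    "word_processor": {"macros": "signed_only"},
}

def feature_enabled(app: str, feature: str) -> bool:
    """Default-deny lookup: a feature runs only if policy explicitly allows it."""
    return bool(CENTRAL_POLICY.get(app, {}).get(feature, False))
```

In an emergency such as a new PDF exploit, setting one value (`"javascript": False`) impairs the attack vector across the fleet without uninstalling the application.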
All of these options will sufficiently mitigate external risks without simultaneously increasing risks from errors. And they will consume less processing power and generate fewer application conflicts than our current antimalware solutions.
Using the Right Technology
Signature-based security devices still have their place within the enterprise - mostly at the perimeter. (And even there, their days are numbered.) But at the desktop, they are increasingly causing more pain than gain, and it is time for us to change our approach, lest we find ourselves slipping further and further behind the malware writers.
And whitelisting need not concern itself with every executable. Each organization can determine just how much to watch and keep track of, balancing performance, productivity, and security according to its own risk profile.