What is File Integrity Monitoring and Why You Need It Part 2

May 10, 2017 | Sacha Dawes

In recent weeks, point-of-sale (POS) breaches have dominated the headlines of cyber security news. For example, just after my last blog on file integrity monitoring (FIM), news of a malware breach at InterContinental Hotels Group (IHG) hit the headlines. Unfortunately, the malware ran undetected from September through December 2016, and it took until March 2017 to get final confirmation that it was fully eradicated. More recently, Chipotle announced a similar POS breach, with the malware likewise remaining in place and active for at least a month.

There is no question that the bad actors out there often have the advantage of surprise, but all businesses (not just retailers or hotel chains, but across every industry) need to ensure that all their devices and assets, from their POS and endpoint devices to their central systems, are locked down and monitored.

But, the question remains: how do you protect yourself against malware and other nefarious hacks against your systems and applications?

While it’s not a silver bullet against malware and other attack vectors, a well-configured File Integrity Monitoring (aka FIM) deployment can go a long way toward identifying anomalous changes across your IT environment, such as changes to legitimate binaries, configuration files, and the like. Simply put, every organization should consider file integrity monitoring one of the essential tools in its security arsenal.
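To make the idea concrete, here is a minimal sketch in Python of the hash-and-compare approach that underlies most FIM tools: record a cryptographic hash of each monitored file as a known-good baseline, then periodically recompute the hashes and flag anything that differs or has disappeared. The file names and baseline path are hypothetical; a real deployment would also protect the baseline itself from tampering.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash the file in chunks so large binaries don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths, baseline_file="fim_baseline.json"):
    """Record the known-good hash of every monitored file."""
    baseline = {str(p): sha256_of(Path(p)) for p in paths if Path(p).is_file()}
    Path(baseline_file).write_text(json.dumps(baseline, indent=2))
    return baseline

def check_integrity(baseline_file="fim_baseline.json"):
    """Recompute hashes and report files that changed or went missing."""
    baseline = json.loads(Path(baseline_file).read_text())
    alerts = []
    for name, known_hash in baseline.items():
        p = Path(name)
        if not p.is_file():
            alerts.append((name, "MISSING"))
        elif sha256_of(p) != known_hash:
            alerts.append((name, "MODIFIED"))
    return alerts
```

In practice you would run `check_integrity` on a schedule (or hook into OS change-notification APIs) and route any alerts into your monitoring pipeline rather than just returning them.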

In my last blog on File Integrity Monitoring (part 1 of 3), I introduced the ‘what’ and the ‘why’ behind file integrity monitoring (FIM) as one invaluable approach to monitoring for malicious changes to files. In this week’s blog, I’ll introduce some best practices for FIM, including what files to monitor and how to get the best value from your file integrity monitoring solution.

What Files Should I Monitor?

It can be a challenge to determine what files to monitor. Frankly, operating systems and applications can be a bit of a minefield when it comes to understanding what files they deploy and where. Just when you think you’ve nailed it, a new release of the software can introduce new files and folders that you need to monitor.

In general, it’s better to monitor too many files rather than too few. That said, striking the right balance is important as file monitoring can be taxing on system resources, particularly when there are a lot of files to monitor, when files are in a constant state of flux (e.g. log files, virtual memory swap files, the Windows Registry), or when the file size is so big that analyzing it takes extra time.

File integrity monitoring solutions often come preconfigured with recommendations. In many cases, the authors of these packages are very well informed, and those recommendations may suffice for your needs. However, there is no standard IT environment, so you may want to refer to guidance from trusted entities like the Center for Internet Security (CIS), whose security benchmarks provide recommended settings for operating systems, middleware, software applications, and network devices.

In essence, the following file types should be monitored across your environment. Note that default installation directories can typically be modified at installation, and some applications use their own custom locations, so administrators should verify with their software providers and document for later reference if non-default locations are used:

  • Operating system directories and files. It’s important to ensure that your base operating system is functioning as expected, so monitoring the system binaries and libraries should be your first step.
    • On Windows, the core OS binaries and key configuration files are typically located under the C:\Windows\System32 directory
    • On Linux, the critical directories to monitor include:
      • /bin
      • /usr/bin
      • /sbin
      • /usr/sbin
  • Application directories and files. The system is the foundation on which the applications sit; however, it is the applications that your employees, partners, and customers interact with, and that store and manage your data. Thus, you should monitor application binaries accordingly.
    • On Windows systems, most applications (by default) store their binaries and configuration files under:
      • C:\Program Files
      • C:\Program Files (x86)
    • Linux systems typically install applications into:
      • /usr/bin
      • /usr/sbin
      • /opt

Depending on the type of server and applications being run, additional files and/or directories may also need to be monitored. For example, if the server is a web server, the directory where the web site files reside should be monitored as well. This will vary by organization based on web server used and configuration of the web server.
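Putting the directory lists above into practice, a monitoring script might select its watch list by platform and walk each tree recursively. The sketch below assumes the default install locations listed above; adjust the lists to match any custom paths you documented at installation time.

```python
import platform
from pathlib import Path

# Default locations from the lists above; verify against your own installs
# and add custom or web server directories as needed.
MONITORED_DIRS = {
    "Windows": [r"C:\Windows\System32", r"C:\Program Files",
                r"C:\Program Files (x86)"],
    "Linux": ["/bin", "/usr/bin", "/sbin", "/usr/sbin", "/opt"],
}

def enumerate_monitored_files(system=None):
    """Yield every regular file under the monitored directories for this OS."""
    system = system or platform.system()
    for root in MONITORED_DIRS.get(system, []):
        root_path = Path(root)
        if not root_path.is_dir():
            continue  # directory absent on this host; nothing to walk
        for p in root_path.rglob("*"):
            # Skip symlinks to avoid double-counting and traversal loops.
            if p.is_file() and not p.is_symlink():
                yield p
```

The enumerated paths can then be fed into whatever hashing or baselining routine your FIM tooling uses.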

  • Configuration files. Modifying system and application binaries can be challenging for an attacker, since those files are often locked once the system starts up or while the services / daemons are running. Configuration files, by contrast, define how the system and its applications function, and are typically read only when a service or application starts up, making them an attractive target. Configuration settings can be stored in many ways. On Windows platforms, the Windows Registry is typically used for configuration purposes. Text-based configuration files can be found across Windows, Unix/Linux, and OS X. Attackers may target any of these configuration locations for a planned attack, or an administrator may inadvertently misconfigure a system, causing that system to be exposed and putting the data on that system and the rest of your infrastructure at risk.
  • Log files. Log files contain the transaction and activity history for the core operating system, its subsystems, and applications that reside on the system. They are often the first place an attacker will look to hide their tracks. While actively written log files will continually change, only the system or application should be writing to them. To ensure that log files are not tampered with, you should establish an active log management collection method to pull (or push) the logs from the system to a separate log management solution for centralized monitoring and tamper-proof storage. Archived log files are static in nature, so you can also monitor for any changes or deletions of those files.
  • Digital keys and credentials. Even with the availability of directory systems and hardware security modules, many systems and applications store the keys and credentials they use for authentication and encryption locally. Monitoring those credential / key stores is also important to ensure your system is protected. For example, Unix systems store their password files under /etc, and Windows stores its account database under C:\Windows\System32\config. If you use other authentication applications such as Secure Shell (SSH), their key files should be monitored as well.
  • Content files. Corporate and customer data is the lifeblood of most organizations, and data leakage remains one of the top security concerns of many organizations. Even content as simple as your website is mission-critical. The effects on your brand and reputation can be significant should an attacker deface your public presence. Monitoring content files for unauthorized changes within the web server is critical to ensure the integrity and confidentiality of that data.
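Beyond content changes, FIM tools commonly watch file metadata as well: a credential store that suddenly becomes group- or world-readable is a red flag even if its contents are untouched. A sketch of that idea, assuming a hypothetical watch list mapping each file to the most permissive mode your policy allows:

```python
import os
import stat

# Hypothetical policy: each credential / key store mapped to the most
# permissive mode it should ever have (e.g. 0o600 for private keys).
CREDENTIAL_POLICY = {
    "/etc/shadow": 0o640,
    os.path.expanduser("~/.ssh/id_rsa"): 0o600,
}

def check_permissions(policy):
    """Flag files whose mode bits exceed what the policy allows."""
    alerts = []
    for path, max_mode in policy.items():
        try:
            mode = stat.S_IMODE(os.stat(path).st_mode)
        except FileNotFoundError:
            continue  # absence may itself warrant a separate alert
        if mode & ~max_mode:  # any bit set beyond the allowed mask
            alerts.append((path, oct(mode)))
    return alerts
```

This complements hash-based monitoring: the hash catches content tampering, while the mode check catches exposure of secrets that were never meant to be readable by other accounts.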

While relatively long, this list represents the key file types that attackers will look to modify, or even delete, when they try to steal data or disrupt your operations. However, it’s not a case of monitoring one type and not another: applications and operating systems typically have interdependencies across all the above-mentioned file types, and any unauthorized access or modification could lead to disastrous consequences.

There’s no question: security is hard. But implementing a file integrity monitoring solution can go a long way toward delivering on your security goals, helping assure the trust of your customers, and ensuring the viability and continuity of your business. The key is to start somewhere, and then tune as needed to best meet both your security and performance requirements.

In my next and final blog in the series on FIM, I’ll discuss what you need to consider when selecting a file integrity monitoring solution as well as several shortcomings with the different approaches to the technology (a quick shout out to my friend Michael Gough for a great conversation on this the other week), and how to overcome those shortcomings with a unified approach to threat detection and response.

Sacha Dawes

About the Author: Sacha Dawes, AlienVault
Sacha joined AlienVault in Feb 2017, where he is responsible for the technical marketing of the AlienVault Unified Security Management (USM) family of solutions. He brings multiple years of experience from product management, product marketing and business management roles at Microsoft, NetIQ, Gemalto and Schlumberger where he has delivered both SaaS-delivered and boxed-product solutions that address the IT security, identity and management space. Originally from the UK, Sacha lives in Austin, TX.
