Audits from hell
Find out how to avoid those audit nightmares
Audit. The very word produces groans from system administrators at any company you care to name. Quite often, an audit is more intrusive and consumes more resources than a hostile break-in. This month, Carole examines some audits she's participated in, to learn from what went wrong -- or right. (2,800 words)
An audit often becomes a political contest between the audit department and the information technology (IT) department. Ensuring a secure architecture becomes almost secondary. Generally, about a month is spent attending meetings and writing memos to explain, clarify, or refute the audit findings. Sometimes, security improvements are actually implemented. Please note that such an unproductive scenario is not always the case -- nor should it be. Hopefully, the real-world experiences I offer here will help to prevent hellish audits from happening in the future.
Ugly audits
Each of the following events actually happened at some point during
the 15 years I've been in this business. There have been many others,
but these examples demonstrate some major points.
Case #1: Auditor with a tool, but no clue
An internal audit department performed its own audits, believing
it could save money by having its people certified and buying an
auditing tool. I think the auditor in this case had received
mainframe certification about 20 years back.
The department on the receiving end wasn't very interested in security, much as I tried to change that. When the audit was announced, I was asked to cover the exposures.
The auditor had a tool he needed installed and run on all the servers. The tool turned out to be a commercial version of COPS and flagged insignificant items with the same weight as major problems. I hoped to use this as an opportunity to implement better security practices without getting my department in too much trouble.
The auditor and I spent one minute reviewing the "root can log in from anywhere" exposure and two hours on the user directory permission problems. Some items flagged as exposures really depended on policy. When I asked to see the policy, I was told that, as a consultant, I could not. I spent two hours every day for three weeks trying to teach the auditor Unix security. When he announced that he wanted to run the tool on the production systems with me, I stalled. Since the systems were downtown, I said I would have to load the software over the network, which could slow down production. We would have to schedule this out of hours.
I went back to my desk, loaded and installed the software on the production systems, and fixed most of the problems. When we met to do the run "together," he never noticed that I did a tar -tvf, or that the software was already installed.
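The stall worked because of a small but crucial difference in tar's flags: -t merely lists an archive's contents, while -x extracts them. On screen, the two look nearly identical to a casual observer. A quick sketch (the file names are made up for illustration):

```shell
# Create a scratch archive to demonstrate with.
mkdir -p /tmp/audit-demo/src
echo "config" > /tmp/audit-demo/src/settings.cfg
tar -cf /tmp/audit-demo/tool.tar -C /tmp/audit-demo src

# -tvf only *lists* the archive; nothing is written to disk.
tar -tvf /tmp/audit-demo/tool.tar

# -xvf would actually extract -- the scrolling output looks
# much the same either way.
```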
The upshot is that neither of us really improved security, despite my minor corrections. Explaining every item the audit tool reported was so time consuming that I quietly fixed problems and reran the report. Had I left it alone, upper management might have dictated the security overhaul I originally wanted.
Case #2: Unauthorized audit
An organization within a company brought in a small company to
perform an audit of its Web server. The corporate audit department
wasn't involved. The outside company produced a report that
extolled the wonderful security of the Web server
it was contracted to audit. (The audit department later found this
to be less than true.) In an apparent excess of zeal, the outside
company decided to audit the Web server of another organization
within the company. Here's what happened:
My job was to shoot down the report.
It wasn't hard. Despite the outside company's pledge to provide straight answers, the report was full of smug innuendo and few verifiable facts. It was clearly biased to make the paying organization look good. While there were indeed security problems on the system in question, the pathnames were incorrect and statements were based on the SunOS operating system rather than Solaris. It was easy to see that they were false.
The most damning statement was the auditor's claim to have downloaded software onto the corporate Web server. The company that performed the audit wasn't local and wouldn't make a special trip to discuss the report or provide detailed documentation. Because they claimed to have downloaded software onto the system, and because the audit department had no contract with them, a decision was made to reload the system from scratch.
But this procedure would cause about two days of downtime for the corporate Web server and would attract the attention of the CEO. So in order to keep the whole mess quiet, the organization that contracted for the audit agreed to buy a new system for the attacked organization. This way the old system stayed up until the new one was ready, limiting the downtime.
Case #3: Uncontrolled audit
Believe it or not, I actually look at log files. Automatic intrusion
detection may be preferable, but in this instance we didn't
yet have a system in place. Besides, looking at log files can fill in the time
spent waiting for people. One day, I was looking at the tcp wrappers
log file while waiting for the router to come back up. I noticed
about six attempts to get in from one site. I left messages for the
originating site contact and for my manager. No big deal.
The next day, while waiting for some people, I decided to check the logs. We were actively being hit about six times per second. I told everyone to drop what they were doing because we had an active incident. The VP said he would call the internal audit team to find out whether or not it was them. We tried to contact the originating ISP with no luck. The logs showed that every external system was being actively attacked. We called other organizations to no avail. There was no response from the audit department, and the ISP didn't return our calls. I logged everything with appropriate times and had my partner initial them. The senior VP walked in and said he spoke to the general auditor who had no knowledge of an audit.
With no word from the ISP or the audit department, we considered the ISP an accomplice in an attack. We contacted its provider, who was very responsive. We provided logs and prepared to have the ISP shut down. Our management authorized us to contact the FBI to report an intrusion. We finally reached the contact at the originating ISP and informed him that we planned to report him to the FBI unless he had an explanation. He called us back in a few minutes and gave us the home number of a person at a well known big-company auditor. After seven hours of about 20 people dropping everything else, the incident was finally acknowledged to be an audit. The follow-up meeting required earplugs.
Bottom line: A lot of time was wasted by a lot of people -- some of whom were working on production problems during primetime. The audit department wasn't prepared for detection and didn't inform its upper management of the test. Also, the tool that was run caused a massive amount of network traffic during production hours.
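Spotting an attack like this by hand boils down to counting repeated tcp wrappers refusals per source host. A minimal sketch, using a fabricated log excerpt (real log locations and formats vary by system -- often /var/log/secure or /var/adm/messages):

```shell
# Hypothetical tcp_wrappers syslog excerpt, fabricated for illustration.
cat > /tmp/tcpd-sample.log <<'EOF'
Jan 12 10:00:01 web1 in.telnetd[2101]: refused connect from probe.example.net
Jan 12 10:00:01 web1 in.ftpd[2102]: refused connect from probe.example.net
Jan 12 10:00:02 web1 in.telnetd[2103]: refused connect from probe.example.net
Jan 12 10:07:30 web1 in.fingerd[2150]: refused connect from other.example.org
EOF

# Count refused connections per originating host, busiest first.
awk '/refused connect from/ { n[$NF]++ }
     END { for (h in n) print n[h], h }' /tmp/tcpd-sample.log | sort -rn
# → 3 probe.example.net
#   1 other.example.org
```

One host generating a disproportionate share of refusals -- or, as in the case above, several refusals per second -- is the signal worth escalating.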
The good audits
Yes, there actually were a few! The two examples I give here had one
thing in common: both sysadmins and auditors were truly
interested in security and had an appreciation of each other's work.
In the first situation, I was the auditor. In the second, I was the
audited.
As the auditor
I don't really like doing audits. Having spent too many years as a
system administrator, I have a healthy respect for the demands on
sysadmins' time. Therefore, if I do an audit, I like
to show every way that the administrator is doing a good job. In
this case, I was surprised to be brought to a workstation and logged
in as root. The VP shrugged and stated that he didn't want to waste
time and assumed I could get into a system with physical access. He
wanted to know if I could get into the trusted servers. I actually
felt bad when I showed him I could. Rather than try to backpedal,
he acknowledged the problem and outlined a plan of action to correct
it. I still wrote up the problem in my report, but was able to state
that it had already been addressed.
As the audited
From my experience, big-company audits are a pain and aren't to be taken
very seriously. Therefore, I was caught off guard when a big company
brought in a known hacker to perform the technical analysis on one
of my systems. Realizing the system wasn't as hardened as it could have
been, I was a bit wary. The hacker clearly expected an argument. We
sat down over lunch and talked about security in general. When we
went back to the lab, we both reviewed the system and came up with
areas that could be improved. I actually enjoyed this audit.
Unfortunately, we never received a written report so management
didn't count the audit. However, it enabled me to improve security
of the system and taught me to consider "impossible" scenarios.
Therefore, I consider it a successful audit.
Audit recommendations
Here are a few important points to keep in mind when dealing
with external auditors, reports, auditing departments, and system
administrators.
The external auditors
Qualification checks
Don't take the company's word for it. If possible, have one of the
system administrators interview the auditors to make sure they understand
the architecture. Many companies will object to this, stating that it
interferes with keeping the audit a secret. I don't see the value in
surprise audits.
Audit parameters
A contract should detail exactly which address spaces and/or
telephone exchanges are to be probed. It should also specify the
time of day and duration of the audit. It should also set realistic
expectations for results, distinguishing hard exposures from
theoretical ones.
Authorization and indemnification
The auditors must have proper written authorization to perform the
audit. This protects the auditors from being shut down by their ISP
if the audit is detected.
The report
The audit department
Control
During an audit (especially a secret one) the audit department must
be available 24 hours a day. The entire audit department must know
that there is an audit in progress and be prepared for detection.
Tools
Understand the impact of every tool that will be used during the audit.
Demonstrate these tools on a test network to see how "noisy" they
are. Make sure the tool will not impact production.
The system administrators
Don't strike back
Whether you know this is an audit or not, resist the temptation to
attack back. Follow an appropriate incident response procedure.
Documentation
Keep a log of everything -- phone calls, logs checked, etc. And make
sure you log times.
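One low-tech way to keep such a log is a small shell function that stamps each entry. The function name and log location below are my own illustration, not from any standard tool:

```shell
# Minimal timestamped incident log; the entries shown are examples.
INCIDENT_LOG=/tmp/incident.log

note() {
    # Prefix every entry with a sortable date/time stamp.
    echo "$(date '+%Y-%m-%d %H:%M:%S')  $*" >> "$INCIDENT_LOG"
}

note "Noticed repeated connection attempts from outside ISP"
note "Left voicemail for ISP abuse contact; no answer"
note "Escalated to VP; internal audit department contacted"
```

Print the log at the end of the incident and have a second person initial it, as described above.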
Cover your assets
In these days of "hacking," the more mundane aspects of asset
protection seem to have fallen by the wayside. A company can suffer
just as much, if not more, if it's found to be running software without
proper licensing. Many companies fail to consider that "freeware" may be
for noncommercial use only. Source control is another boring but important
factor in asset protection. Also, it's crucial that all necessary software
can either be reloaded from distribution or recompiled from source. Don't
neglect the traditional aspects of asset protection for the trendy ones.
Followup to last month
There have been several e-mail messages in response to last month's
column regarding padded cells using the chroot system call. Some
made the valid point that chroot is by no means infallible. Indeed,
if someone manages to become root inside the chrooted environment,
he or she can break out of the cell. This does not, however, render
chroot pointless. By limiting and carefully screening the programs
placed in the padded cell, the administrator reduces the probability
that any of them will be exploitable.
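To make that screening concrete, here is a minimal sketch of a padded-cell directory tree. The paths and file contents are illustrative only; actually entering the cell requires root, and a real cell also needs any shared libraries its binaries link against:

```shell
# Build a minimal padded-cell tree containing only screened programs.
CELL=/tmp/padded-cell
mkdir -p "$CELL/bin" "$CELL/etc" "$CELL/tmp"

# Copy in only the programs the captive session needs -- nothing else.
cp /bin/ls "$CELL/bin/ls"

# A stripped-down passwd so programs that look up users don't choke.
echo "guest:x:1001:1001:cell user:/:/bin/ls" > "$CELL/etc/passwd"

# Entering the cell (root only); once inside, / is $CELL:
#   chroot "$CELL" /bin/ls /
# A process that becomes root *inside* the cell can still break out,
# so screen every binary you copy in.
```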
Thanks to Simon Burr of Uunet in the UK for bringing up some very interesting points.
Bug of the month
The most notable alert this month has been the Trojan horse version
of TCP Wrappers (http://www.cert.org/advisories/).
It demonstrates the importance of verifying the source of any software.
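Verifying a distribution against a checksum published by its maintainer is the simplest defense against a trojaned copy. The sketch below fabricates its own file and checksum purely to show the mechanics; in practice the published checksum must come from a trusted, out-of-band source, and a PGP signature is stronger still:

```shell
# Stand-in for a downloaded source distribution.
echo "pretend source distribution" > /tmp/tcp_wrappers.tar

# The maintainer publishes a checksum; here we generate our own
# only to demonstrate the verification step.
md5sum /tmp/tcp_wrappers.tar > /tmp/tcp_wrappers.tar.md5

# Verify before building or installing -- a tampered copy fails here.
md5sum -c /tmp/tcp_wrappers.tar.md5
```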
In the news: Frontier justice
This month divergent groups are promoting the use of vigilante tactics to
achieve their goals. According to a CNN article, corporations are
employing "strike back" tactics to discourage hackers.
(See Resources below).
According to the article, one company, with management approval,
traced an attack to its physical location, broke in, and stole the
attacker's computers. According to Hacker News Network, a group
claiming to be hackers in the Legion of the Underground declared
"cyberwar" on the computer infrastructure of China and Iraq. In both cases,
it's difficult to verify the stories. I've never seen a company
approve "strike back" tactics. Many even have policies against such
behavior. The Legion of the Underground (faced with a remarkable
lack of support from the hacker community) stated that the
declaration of war was blown out of proportion by the media. Both stories
demonstrate a dangerous mindset for revenge at any
cost. The hacker community was quick to condemn this mindset. There
has been no conclusive response from the corporate community
regarding corporate vigilantes.
An Irish ISP, Connect Ireland (www.connect.ie), claims to have been targeted for cyber-warfare by the government of Indonesia for hosting the top-level domain of East Timor. Connect Ireland was forced to shut down for two days.
Disclaimer: The information and software in this article are provided as-is and should be used with caution. Each environment is unique and the reader is cautioned to investigate with his or her company as to the feasibility of using the information and software in the article. No warranties, implied or actual, are granted for any use of the information and software in this article and neither author nor publisher is responsible for any damages, either consequential or incidental, with respect to use of the information and software contained herein.
Resources
About the author
Carole Fennelly is a partner in Wizard's Keys Corporation, a company
specializing in computer security consulting. She has been a Unix
system administrator for more than 15 years on various platforms and has
particularly focused on Sendmail configurations of late. Carole
provides security consultation to several financial institutions in
the New York City area.
Reach Carole at carole.fennelly@sunworld.com.
If you have technical problems with this magazine, contact webmaster@sunworld.com
URL: http://www.sunworld.com/swol-02-1999/swol-02-security.html