

Airport security lessons

Ubiquity, Volume 2002 Issue May, May 1- May 31, 2002 | BY M. E. Kabay 



False alarms, loose lips and trash bin data mines


As I write this essay at 33,000 feet on my way home to Vermont, I am thinking about several security issues that I noticed within a 20-minute period at an airport today. I hope that readers will agree that the experiences have lessons for all of us who need to protect our information and resources.

The first security issue was at the security inspection station, where the agent tried to get me to put my computer through the X-ray machine before my bag went through. I made a point of putting the bag through first and only putting the portable computer on the belt when I was the very next person to go through the metal detector. This is a well-established strategy to prevent the notorious scam by which pairs of criminals steal laptops: one accomplice steps in front of the victim after the portable has entered the X-ray machine and causes an obstruction, carrying enough metal to stock a portable pawn shop and setting off repeated alarms on the detector. By the time the victim gets through the detector, the laptop is long gone.

While I'm on the topic of metal detectors, here are some practical suggestions on shortening your security check: put all your metal stuff in your shoulder bag before you get to the security checkpoint; do not carry any questionable sharp objects -- I now leave even my relatively harmless computer toolkit at home instead of clipping it onto my belt as I usually do (geek fashion, you know). Now that the warmer weather is here, I even make a point of switching to sandals before going to the airport -- saves time because they're quick to take off and put on, they don't usually smell bad, and they often don't get checked at all.

As I was packing up my computer after it got swabbed and checked for explosives residues, strobe lights began flashing, horrible grating noises rent the air, and nobody seemed to pay the slightest attention to the ruckus. "What's THAT?" I asked the security guard. He shrugged and said casually, "Fire alarm."

Lesson: If you're going to have an alarm in a place where people are not trained to recognize it, make the alarm self-explanatory. Why bright white strobe lights and a low-pitched loud retching noise should automatically be understood as a fire alarm is beyond my comprehension. In my house, the fire alarm shrieks, "FIRE!" over and over at the top of its electronic lungs.

I inquired politely, "Why isn't the terminal being evacuated?" "Oh," replied the bored guard, "it's probably just another false alarm." "So what?" I enquired. "What's the point of having an alarm at all if no one pays any attention to it?" Resentful silence.

I thought about an incident a few years ago when a meeting of about 300 security experts was interrupted by a fire alarm in the hotel. Without a word, all 300 of us collected our laptop computers (minus the power cords and bags) and calmly walked out into the street, where we waited for the all-clear or for the building to burn down. When we were informed that everything was safe, we walked back in. Not one person asked, "Is this a test?" or "Do you think it's a false alarm?"

Lesson: Teach your own staff (and family) that an emergency alarm is not a TV quiz show in which you get points for guessing correctly that it's actually a false alarm. It doesn't matter if it's a false alarm: get out of the building at once. Similarly, in network security, if there's an alert from your intrusion detection system, your firewall, or your operator, act on the alarm. That doesn't necessarily mean you have to shut down your system, but it does mean that every alarm should be taken seriously and not simply ignored or shut off.
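The "act on every alarm" rule can be expressed as a design constraint: no code path may silently discard an alert. Here is a minimal sketch of such a policy; the `Alert` fields, the 1-to-5 severity scale, and the escalation threshold are illustrative assumptions, not anything from the article or a particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Alert:
    source: str    # e.g. "ids", "firewall", "operator" (hypothetical labels)
    message: str
    severity: int  # assumed scale: 1 = informational .. 5 = critical


def handle_alert(alert: Alert, audit_log: list) -> str:
    """Record and acknowledge every alert; decide only HOW to respond.

    There is deliberately no branch that drops the alert: even a
    suspected false alarm is logged and triaged, never ignored.
    """
    action = "escalate" if alert.severity >= 4 else "triage"
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "source": alert.source,
        "message": alert.message,
        "severity": alert.severity,
        "action": action,
    })
    return action
```

The design point is that "respond" is the only outcome; silencing an alarm without investigation is simply not a reachable state, which is the coded equivalent of evacuating first and asking about false alarms later.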

The alarm did stop after about 10 minutes, and I enquired at the bookstore (where I picked up the latest novels by P. D. James and Elizabeth Peters, yum) whether such alarms were frequent. "Oh yes," said the attendant, "they happen all the time." She also confirmed that no one pays any attention to the alarms now.

Lesson: Becoming habituated to false alarms will cause a tragedy if delayed reactions cause victims to be trapped in dangerous circumstances. Similarly, in network security, if operations and security staff habitually wait before acting on computer emergency response plans, hackers will succeed, fires will burn things, and water will soak things. Inserting an unnecessary delay in emergency response is just asking for trouble.

The agents at the airline club said the same about habituation. I asked them, "How will you distinguish a real alarm from a false alarm?" They explained, "We call the control point and ask them if it's for real."

Lesson: If false alarms happen often, fix the system, don't just work around the problem.

In the club, I sat down at a cubicle to work on my laptop. The oaf in the next cubicle was talking so loudly on the phone that everyone could hear every word from 50 feet away. Apparently he was angry at his insurance agent because the insurance company had excluded back problems from his medical policy; he explained that his son had jumped on his back and put it out so he'd gone to the chiropractor and gotten it fixed and it wasn't fair to exclude that and he'd like to think about canceling the policy unless they put the back coverage back in and it was making him mad. Then he called a colleague and discussed whether they should fire Henry Foster (not the real name) even though he really liked him, and so on. The kind of information being bellowed by this person would be very useful for social engineering by a criminal hacker. Names, details of problems -- just the kinds of issues that let criminals sound convincing as they misrepresent themselves.

Lesson: Train your colleagues to keep their voices down in a public place and to think about the level of detail appropriate for discussing confidential information where others, possibly competitors, hackers and extortionists, can hear them.

Finally, as I threw away an apple core, I noticed a thick wad of papers at the top of the garbage bin. Succumbing to curiosity, I did a bit of scavenging and examined the documents. They consisted of 50 pages of detailed e-mails and database records concerning holders of health insurance policies. The data included full names, dates of birth, and discussions of various problems, including procedural screw-ups such as refusing policies because no one could find the registration information for the insurance broker who booked the policies. The level of detail and the number of employees named in these documents would be a goldmine for social engineering by criminal hackers. "Hi, I was talking with Jeb Holden over in the San Diego office about this screw-up with Mr Watkins and I wondered if you knew who's been taking care of the database that had problems?" In addition, there was a separate printout from the same company that contained detailed, confidential information about the financial difficulties the publicly traded company is experiencing. The information might be useful for stock manipulation; for example, were it to be released, it could cause a dip in the stock price, allowing a criminal to make a profit by selling futures. Ironically, the airline club has a high-capacity shredder. Too bad the person who discarded the confidential files didn't think to use it.

Lesson: Train your colleagues to think before discarding (or even printing) confidential information. Destroy what should not be made public.

At this point, I'm going to be writing to the FAA and the airport's chief of security about the alarm debacle. I'll also send a polite note to the chief of security, if there is one, at the company whose confidential files were dumped in a public trash can, and will return the documents to them. I hope there will be some good from those actions, and I hope that some of you readers will take the lessons from my stroll through the airport and apply them in your own work environments.

* * *

Check out the new Computer Security Handbook, 4th Edition, edited by Seymour Bosworth and Michel E. Kabay; Wiley (New York), ISBN 0-471-41258-9. Available now at your technical bookstore or from Amazon and Barnes & Noble.

M. E. Kabay, PhD, CISSP is associate professor in the Department of Computer Information Systems at Norwich University in Northfield, VT. Mich can be reached by e-mail at [email protected]. He invites inquiries about his information security and operations management courses and consulting services. Visit his Web site for papers and course materials on information technology, security and management.

