Tuesday, May 23, 2006

Vets Data Stolen

A laptop with the names, Social Security numbers, and other private information of 26.5 million veterans (that is one tenth the population of America!) was stolen from a "data analyst's" house. According to the Department of Veterans Affairs, the employee was "not authorized" to do this.

This is the latest in a long string of incidents over the past year in which personal information has been stolen or lost -- lost backup tapes, stolen laptops, and so on.

The VA putting the data analyst "on leave" for this is, actually, ridiculous -- the real question is: How could the private information of 26.5 million people end up on anyone's laptop?

With advances in mobile technology and data storage, data is more mobile than ever before. With wireless laptops, we can now work at our local coffee house. With ever-increasing storage capacity, we could keep detailed information on every man, woman, and child on the planet on a single laptop computer.

With this incredible increase in computer capability comes responsibility, and we run right into one of Uncle Mark's technology maxims:

Just because we can do something does not mean we should do something.

You can take your computer to the beach and work there. That does not mean you should. You are at the beach, for crying out loud! We can check email with our BlackBerries or Treos while tooling along the freeway at 75 MPH. That does not mean we should.

In the case of the VA, while it is true that certain administrators and technicians can access data and get around certain safeguards built into technologies like database servers and business applications, that does not mean they should. In this case, I would take it further: the systems that hold this information should be built to make it virtually impossible for even skilled internal people to get access to mass data.

This is true across the board, at all companies and government agencies. If there is data that should be treated confidentially, then the systems that access and store this information must be built to always treat it that way. Appropriate access must be given, but not carte blanche access. In other words, the systems should be built in such a way that copying 26.5 million names and Social Security numbers to a laptop is extremely difficult and out of the ordinary. To skilled internal people, copying any data anywhere is always going to be possible, mainly because you must be able to manipulate and work with data and data structures to build and maintain systems. But skilled designers and developers can make bulk access difficult and well-defined, and can also create an "audit trail" of access, so you know who did the deed.
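
To make the idea concrete, here is a minimal sketch of what such a gatekeeper might look like. It is purely illustrative -- the service name, the export limit, and the approval-ticket field are all invented for this example, not taken from any real VA or commercial system -- but it shows the two ingredients: every access leaves an audit entry, and bulk extraction is treated as an exceptional, explicitly approved event.

```python
import logging
from datetime import datetime, timezone

# Hypothetical illustration: a thin access layer that logs every lookup
# and refuses bulk extracts above a hard limit. Names and limits are
# invented for the example.

BULK_EXPORT_LIMIT = 100  # records per request without special approval

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("record_access_audit")


class RecordAccessError(Exception):
    pass


class PersonnelRecordService:
    def __init__(self, datastore):
        self._datastore = datastore  # any mapping of record_id -> record

    def get_record(self, user_id, record_id, reason):
        """Fetch one record; every call leaves an audit entry."""
        audit_log.info("%s user=%s record=%s reason=%s",
                       datetime.now(timezone.utc).isoformat(),
                       user_id, record_id, reason)
        return self._datastore[record_id]

    def export_records(self, user_id, record_ids, reason, approval_ticket=None):
        """Bulk export is the out-of-the-ordinary case: cap it and
        require an explicit approval reference before it proceeds."""
        if len(record_ids) > BULK_EXPORT_LIMIT and approval_ticket is None:
            audit_log.warning("DENIED bulk export user=%s count=%d reason=%s",
                              user_id, len(record_ids), reason)
            raise RecordAccessError("bulk export exceeds limit; approval required")
        audit_log.info("bulk export user=%s count=%d approval=%s",
                       user_id, len(record_ids), approval_ticket)
        return [self.get_record(user_id, rid, reason) for rid in record_ids]
```

A determined insider can still work around a layer like this, of course, but now doing so is conspicuous: the normal path is logged, limited, and attributable to a person.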

Given the extreme mobility of data, IT departments (and audit committees, CEOs, CFOs, and HR personnel) really need to evaluate their own data security, and act accordingly. First, you need to know if the data you are keeping is sensitive or not. Are you keeping SSNs of employees on a computer system, or customer tax information or credit card data? What would be the impact of someone stealing this data? Who can access this data internally? Who should have access?

On the other side of the ledger, the government and financial services industry must be responsible as well. Why does it matter if someone has your Social Security Number? Who cares? Well, the problem is that someone can open credit card accounts and other financial products with that information, without your ever being directly involved. People can pretend to be you and gain access to your existing accounts. This is a big problem.

The cat is already out of the bag, really, on stolen information. As Scott McNealy, then CEO of Sun Microsystems, said back in 1999, "You have zero privacy anyway. Get over it." The answer is not only to make it harder for people to steal information, because the information of millions of people has already been stolen. Rather, the answer is to make the stolen information useless. Financial services companies must be absolutely sure the people they are dealing with are who they say they are, and the government should back that up by making the firms liable for charges made in any other way. In other words, if a credit card issuer opens an account because someone sent in a form with your SSN on it, without absolutely verifying that the person applying was you, then they are liable for all charges made on that card -- you are not. Likewise, if a retailer does not really know it is you they are selling to, they, too, should be liable for the full amount.

Why should you be responsible for their lack of adequate security? They opened the account without your knowledge or consent. They allowed access to your accounts without fully verifying your identity. It is a breakdown in their processes, not yours. If a bank is robbed, the bank doesn't reduce your account balance -- why is this any different?

Technology exists to fully identify you to whomever needs to know who you are. There are no foolproof methods, but there are methods that reduce the risks considerably, and that reduce to zero the risk that someone can open accounts with just your SSN and mother's maiden name. There are security token devices that, in conjunction with a personal identification number (PIN), can identify you online or in person, and that are virtually impossible to break. Someone would have to both steal your security token device and know your PIN to gain access.
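
As a rough illustration of how the token-plus-PIN combination works under the hood, here is a small sketch in the spirit of a time-based one-time code. The shared secret, the PIN handling, and the six-digit format are example choices of mine, not a description of any particular vendor's token:

```python
import hashlib
import hmac
import struct
import time

# Illustrative only: secret, PIN handling, and 30-second step are
# example choices, not any specific product's scheme.

TIME_STEP_SECONDS = 30


def one_time_code(secret: bytes, timestamp: float) -> str:
    """Derive a six-digit code from the shared secret and the current time step."""
    counter = int(timestamp // TIME_STEP_SECONDS)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{value % 1_000_000:06d}"


def verify_login(secret: bytes, stored_pin_hash: str,
                 submitted_pin: str, submitted_code: str) -> bool:
    """Both factors must check out: the PIN you know and the code
    produced by the token you physically hold."""
    pin_ok = hmac.compare_digest(
        hashlib.sha256(submitted_pin.encode()).hexdigest(), stored_pin_hash)
    code_ok = hmac.compare_digest(one_time_code(secret, time.time()), submitted_code)
    return pin_ok and code_ok
```

The point of the pairing is that a thief needs both pieces: stealing the token alone, or learning the PIN alone, gets them nothing.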

There is also "in person." You show up at the office of the bank and establish your ID with your birth certificate, passport, or other documents, plus a thumbprint. You can add retina scans or other ID technology. Soon, no doubt, you will be able to use your DNA.

This is why I have been a proponent of strong authentication and identification technologies where it matters. If you have money in the bank, it matters that only you or your designates have access to it. It matters that only you can open a credit card account, or PayPal account, or any other account, in your name. It matters that stores really know it is you buying books or CDs from them online, if they are charging the purchase to you. It matters that when you send an email, people know it is you who sent it, not someone masquerading as you. It is really easy to be anonymous on the web (sort of, anyway), but really hard to be, well, you.

Financial services and retailers will balk at this, because their position is that it would be "onerous" to make this type of identification work. In other words, it is hard. Fine. My view is that the price for their convenience is the cost of fraud. If they are unwilling or unable to fully establish the ID of their own customers, it should be they, not the customer, who pays for the inevitable fraud that will occur. They may say that "the customer" does not want this. My view is that "the customer" has no problem with being correctly identified, and has a big problem with having to recover from "identity theft."

The current issue of Reader's Digest has a cover article on "ID Thieves' New Tricks." The fact of the matter is that technology, such as the security token, can defeat most of these tricks.

The bottom line is that companies and government agencies need to put in place systemic safeguards for personal (or any sensitive) information, so that a stolen laptop or backup tape doesn't affect millions of people; and companies and agencies that work with people need to put in place identification safeguards that will make the theft of "personal" information a non-event. The technology exists today to do both. This is one case where "just because we can do something, we should!"
