officers to ensure compliance with government regulations and company standards.
stakes in a global, online, real-time business world. Now that data is currency and network access ubiquitous, there's more to making privacy work than a wink and a nod. Privacy must go deeper into a company's culture, until it's part of how a company thinks and acts with its customers, partners, and the public.
Getting there isn't a mystery, even if it's hard work. The many failures have shown what needs to be done. Here are nine truths about privacy that companies must live.
It's A Strategy, Not Just A Policy
Facebook, the popular social networking site for students, thought it was offering its users a cool new feature when it introduced News Feed in September. News Feed automatically updates Facebook users about changes to the pages of people in their social networks, such as someone adding a friend or posting to a discussion group. CEO Mark Zuckerberg was unprepared for the howls of protest from Facebook users who, instead of seeing a new networking opportunity, saw the shadow of Big Brother. In an Internet posting on Sept. 5 prompted by an increasingly hostile user community, Zuckerberg defended the new product: "We didn't take away any privacy options. ... The privacy rules haven't changed." Zuckerberg was out of touch with his own community. In a posting three days later, he was forced to admit: "We really messed this one up." Facebook reworked News Feed, offering users new ways to control their personal data, such as the ability to nix the broadcast of specific updates and to remove the time stamp many found particularly onerous.
Companies must watch the letter of their privacy policies as well. They're legal contracts between a company and its customers, so violations can lead to litigation. AOL is being sued by three unidentified individuals who claim the Web portal violated its privacy policy when it publicly released members' search records. There was enough personal information in the queries--addresses, phone numbers, medical conditions--that it was possible to tie it to individuals.
AOL apologized for the gaffe immediately after it was discovered ("This was a screwup, and we're angry and upset about it," a spokesman said), and there were career consequences: It fired the researchers responsible, and its chief technology officer resigned shortly afterward.
Privacy Laws Will Change--Often
"California puts privacy laws into effect every week," says Parry Aftab, only partly tongue-in-cheek. Aftab's a privacy lawyer and executive director of WiredSafety.org. "I can't stay on top of them," she says.
But you must. More than half the states have laws that require organizations to notify consumers if their personal data is involved in a security breach. At the federal level, several privacy bills are percolating through both houses of Congress, though the feds have shown no real urgency to act on those bills.
Smart companies don't just stay on top of privacy legislation, they also seek to influence it. Kirk Hareth, chief privacy officer for Nationwide Insurance, served as an industry lobbyist for several years in the 1990s. He helped draft HIPAA, the Health Insurance Portability and Accountability Act. Hareth keeps in touch with Nationwide's lobbyists to stay current with pending legislation.
Case in point: On Oct. 13, President Bush signed a bill, S. 2856, that includes a provision that requires financial institutions to make their privacy statements "comprehensible to consumers, with a clear format and design." The Federal Trade Commission has 180 days after enactment of the bill to develop the new privacy model and will seek input from financial institutions.
Nationwide's privacy statement already complies with the new regulation, Hareth says. "We've gotten ours to an eighth-grade reading level," he says. That's because California law requires that all public documents be written below a ninth-grade reading level, and insurance companies are regulated by the states. Dealing with federal and state regs is a constant juggling act. "You need to have time to do that reconciliation," Hareth says.
You Can Excel--Don't Just Avoid Screwups
Despite the raised consciousness around privacy, there's still a lot of data being treated, let's just say, cavalierly. Yet privacy can be something companies excel at, giving them competitive advantage by letting them derive full value out of data while keeping their customers' trust.
Too few companies take this view. In June, a Department of Health and Human Services employee logged on to a computer at a Baltimore hotel to check his e-mail, and when he went to clear out the temporary cache folder he found a spreadsheet containing the names and personal information of 17,000 people enrolled in Medicare plans through Louisville, Ky., insurer Humana. A Humana employee had downloaded the spreadsheet to that hotel PC a month earlier and failed to delete it.
That incident came to light just a month after it was revealed that the laptop of a Veterans Affairs data manager containing personal information on more than 26.5 million vets and their spouses was stolen from his home. It turned out the manager had been taking the data home regularly, most likely in violation of VA policy, for more than two years.
What's going on here? "Data is moving into the wild," says Richard Purcell, chief executive of the Corporate Privacy Group, a privacy consulting firm. The more employees work from home or on the road, the more companies have to make it easier to access the corporate system. That means data, including customer and employee information, is moving beyond company walls. Mobile devices such as cell phones, flash drives, laptops, and PDAs make that migration easy.
The Golden Rules
When collecting confidential data, make sure customers:
>> Know it's being collected
>> Give permission for it to be collected (opt-in)
>> Have (some) control over how it's used
>> Know it'll be used in a reasonable manner
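The rules in the sidebar above can be made mechanical: record consent explicitly and check it before any confidential field is used. A minimal sketch--every name here is hypothetical, not drawn from any real system:

```python
from dataclasses import dataclass, field

# Hypothetical consent record: one per customer, capturing the
# golden rules as explicit, checkable facts.
@dataclass
class Consent:
    customer_id: str
    notified: bool = False    # customer knows data is being collected
    opted_in: bool = False    # customer gave permission (opt-in)
    allowed_uses: set = field(default_factory=set)  # uses the customer controls

def may_use(consent: Consent, purpose: str) -> bool:
    """Use confidential data only when every rule is satisfied."""
    return consent.notified and consent.opted_in and purpose in consent.allowed_uses

c = Consent("cust-001", notified=True, opted_in=True,
            allowed_uses={"billing", "support"})
print(may_use(c, "billing"))    # allowed purpose
print(may_use(c, "marketing"))  # customer never opted in to this use
```

The point of the sketch is that "reasonable manner" stops being a judgment call at the moment of use: the allowed purposes were written down when consent was given.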
It's not as if companies haven't put a lot of work into making their systems secure. Billions of dollars have been poured into creating secure networks. So why the never-ending string of laptop disappearances that involve a mind-boggling amount of personal data? Purcell says companies are ignoring known risks, and he suggests asking the question another way: Why are companies putting confidential information into the wild deliberately?
It's because many companies still don't take privacy seriously. "Privacy is a barrier to exploiting every opportunity a company has with its customers," Purcell says. Purcell was Microsoft's first chief privacy officer, and he says one of his hardest jobs was to convince the software company that ensuring privacy was actually a business enabler over time--"but not in stock-ticker time."
Without that mind-set, intelligent companies can wind up with egg on their faces--or maybe coffee grounds. Earlier this month, Starbucks reported that four laptops were missing from its corporate support center in Seattle, two of which contained names and Social Security numbers of nearly 60,000 U.S. employees. Starbucks said in a statement that the laptops were "retired" (not in regular use), there was no evidence the employee information had been compromised, and that in fact the laptops "may still be in the possession of Starbucks; however, we cannot currently locate them."
All Data Is Sensitive
The definition of confidential is expanding. As with the AOL search data debacle, just because a piece of information isn't a name or a Social Security number doesn't mean it's not "personally identifiable" and doesn't have privacy implications, says Fran Maier, executive director of TRUSTe, a nonprofit organization working on privacy standards and best practices.
That's why it's important that companies know exactly what kind of confidential data they're collecting, who has access to it, and how they're using it. Confidential data takes different forms, depending on your industry, Purcell says.
For instance, if a health spa requires a customer to fill out a form before getting a massage--one that asks questions like "Do you have high blood pressure?" or "Do you take prescription medicine?"--that process puts the company into a heavily regulated area, says WiredSafety's Aftab. With a realistic assessment of the data in hand, companies must ask: Do I still want to collect that?
The recent flap around Hewlett-Packard's investigation of leaks to the press by members of its board highlighted a practice known as "pretexting," in which telephone records are obtained through the subterfuge of posing as someone else. And while pretexting involves fraud, the ease with which it's been done shows how lightly some companies still take their responsibility for personal data.
It's also important that companies know exactly how potentially sensitive information is being used company-wide. For instance, some developers still use real customer data in testing their applications, says Christopher Grillo, director of information security at Medica, a health insurance provider. It's a risky shortcut that could expose confidential data. Instead, they should create fictitious data for testing, or acquire fictitious data from vendors. "Companies need to use the same controls for the test environment as for the development environment," Grillo says.
Retain The Right Data, For The Right Time
Most companies value data enough to collect it, lots of it--on customers, partners, and employees--and keep it around for a long time. Not good.
The risk isn't just disclosing it and soiling your company's reputation, though that should be enough. If you ask for more information than the customer thinks is appropriate, as some health care providers or financial services firms do, "people tend to lie to you," Purcell says. Then you start making decisions based on bad information. Collect only relevant information appropriate to the relationship.
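Purcell's advice reduces to a simple discipline: keep an explicit list of the fields the relationship actually justifies, and drop everything else at intake so it never reaches storage. A hypothetical sketch--the field names are illustrative:

```python
# Fields this customer relationship actually justifies collecting.
# Anything submitted outside this set is dropped before storage.
RELEVANT_FIELDS = {"name", "email", "policy_number"}

def minimize(record: dict) -> dict:
    """Keep only the fields appropriate to the relationship."""
    return {k: v for k, v in record.items() if k in RELEVANT_FIELDS}

submitted = {"name": "A. Customer", "email": "a@example.com",
             "policy_number": "P-123", "ssn": "000-00-0000",
             "income": 52000}
stored = minimize(submitted)  # ssn and income are never retained
print(stored)
```

Filtering at the point of collection, rather than cleaning up later, also sidesteps the bad-data problem Purcell describes: fields customers would lie about are never asked for in the first place.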
Companies don't spend enough time classifying their data, either, says Larry Ponemon, chairman of the Ponemon Institute, a privacy consulting firm. If they do data classification at all, it usually involves a simple "risky versus not risky" distinction that isn't enough to help companies protect the really sensitive data. "They need to be more surgical about data that can cause the most amount of grief," he says.
How long is too long to keep information around? "That's the million-dollar question," says David Stark, North America privacy officer for TNS, a research firm based in London. There are regulations that dictate how long corporate financial information must be retained, as well as data involved in litigation. Stark's rule of thumb: Check out the regs that apply to your business or industry, and keep information only as long as you have to.
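Stark's rule of thumb can also be made mechanical: record, per data category, the longest retention period any applicable regulation demands, and purge on schedule. A hypothetical sketch--the categories and periods below are placeholders, not real regulatory figures:

```python
from datetime import date, timedelta

# Hypothetical retention periods in days -- the real numbers come
# from the regulations that apply to your business or industry.
RETENTION_DAYS = {
    "financial": 7 * 365,   # e.g., corporate financial records
    "marketing": 180,       # keep only as long as you have to
}

def should_purge(category: str, collected: date, today: date) -> bool:
    """True once a record has outlived its category's retention period."""
    return today - collected > timedelta(days=RETENTION_DAYS[category])

# A marketing record from January is past its 180-day limit by October.
print(should_purge("marketing", date(2006, 1, 1), date(2006, 10, 1)))  # True
```

The table is the policy: when a regulation changes, one number changes, and the purge logic stays the same.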
Here's something else to factor in when assessing the cost and value of data: If you plan to keep it, know where it is and how to access it. Last April, the Supreme Court approved amendments to the federal rules of civil procedure that require companies involved in litigation to present to the judge an inventory of electronic data relevant to the case. The new rules take effect Dec. 1. The acronym used by the legal community is ESI, electronically stored information, "and it pertains to everything from Oracle databases to Excel spreadsheets," says Michael Gold, a senior partner with Jeffer Mangels Butler & Marmaro and co-chairman of the firm's technology discovery group.
The new rules may cause some companies to look at their technology assets in a different light. Nationwide has a detailed data retention policy, Hareth says, and the insurer has essentially banned the use of instant messaging because of regulatory compliance. "IM is unmanageable, and it creates incredible storage and retention issues," he says.
Consultant Ponemon attended a meeting where executives were informed that copies of confidential documents had been found in the sizable memory of the company's copy machines. "They were dumbfounded," he says.
Helping Can Hurt You--Even With The Feds
Companies want to be good citizens, and never was that more true than in the wake of Sept. 11, 2001, when corporate America was "liberal" in opening up its data stores to federal investigators, says Nationwide's Hareth. In 2002, airlines including Northwest and JetBlue gave a contractor working with the Transportation Security Administration--at the TSA's request--passenger records to test a flight-screening system, angering customers and running afoul of the airlines' own privacy policies.
That liberal sentiment has worn off, and the public is certainly less tolerant of it now, as evidenced by the uproar over newspaper reports of telecom companies providing the NSA with data on international calls originating in the United States.
So what do you do when federal, state, or local agencies ask for confidential information? "Get it in writing," says Kristen Mathews, an attorney with Brown Raysman Millstein Felder & Steiner. Companies should require government agencies to legally compel them to turn over data. If the government agency goes through the bureaucracy of a court order or a subpoena and it blows up later on, "the company can point to the paper trail," Mathews says. Adds Nationwide's Hareth: "Ninety-eight percent of the time, the government is totally cool with that."
And companies need to include their privacy officials in these decisions. Ponemon says some companies deliberately keep their privacy officers out of the loop about government requests for data to create a plausible deniability around any privacy concerns. That's a bad move. The first question that'll be asked is: Did you run this by your privacy officer?
Partners Can Be Your Biggest Problem
"Where I see good companies getting in trouble," says attorney Mathews, "is that they got in bed with the wrong people."
An Internet business relationship may involve many downstream partners, says TRUSTe's Maier, and companies must do their due diligence. That's especially true when it comes to online advertising. Companies looking to spend money advertising on the Web might enter an unfamiliar world of pop-ups, page counts, and click-throughs. One especially tricky distinction is between adware and spyware, and it's one that can come back to haunt an unwitting participant. Companies should study the business practices and learn the reputations of all their Internet partners.
The increasing use of third-party service providers raises the potential for privacy violations. "You're as strong as your weakest business partner," says Medica's Grillo. While many companies are content with writing the expectation of privacy controls into their contracts with outsourcers, smart companies will do more. Grillo says he goes on-site to assess the security and privacy policies of outsourcing companies--at least the ones that handle the most sensitive data, such as claims data--before the contract is signed. "You have to identify your high-risk vendors," he says. "Contractually, it might be their fault, but all the publicity will be coming to your company." That's what happened when CardSystems Solutions, a processor of payment card data, put millions of cardholders' data at risk last year in a security breach. MasterCard, Visa, and American Express absorbed much of the heat.
Pointing fingers doesn't cut it with customers. In fact, the backlash can be worse if a company trusts a third party that ends up disclosing personal data, says privacy consultant Ponemon, who adds that customer churn rates are twice as high when a third party causes a data breach.
Technology Can Create New Problems
Privacy and security are inextricably linked, so secure technology plays a central role in guarding confidential data. Except it doesn't always work out that way.
[Photo: Privacy leaks in health care apps have Lobenstein spooked. Photo by Sacha Lecca]
Companies must do their own testing, not rely on compliance or certification, to know if an application will protect their customers. "If I had a dollar for every health care app that was HIPAA-compliant when it doesn't meet basic 101 programming requirements, I could retire," says Ken Lobenstein, CTO and chief information security officer for Continuum Health Partners, a consortium of five New York hospitals. Vendors that write health care apps often create "privacy leaks," Lobenstein says. He's working with an application right now that has "wonderful security built in," but it also has a significant programming flaw: "The login process opens a back door into the database." Lobenstein says he'll have to create a kludgy workaround on the off chance someone using the application might discover the flaw and use it to access confidential information.
Some technologies, chief among them radio frequency identification, carry their own privacy concerns. For many consumers, RFID is synonymous with surveillance, a way to track individual buying habits and maybe more. However, RFID promises supply chain efficiencies--or it comes as a mandate from a key customer, such as Wal-Mart--that make it just too compelling to pass up.
Even if you think the technology poses little privacy risk, consumer concerns demand that you make privacy an integral part of your RFID strategy from the start, says Harriet Pearson, IBM's chief privacy officer. Also, many companies want to keep their RFID strategies secret, for competitive reasons or fear of a negative reaction. That's a mistake. Several companies, Wal-Mart the most prominent example, have had RFID experiments blow up in their faces when word got out to consumers through the media. It's much better to be proactive with that information.
The latest privacy concerns around RFID have to do with the ability of ill-intentioned third parties to pick up data transmitted from RFID chips as a consumer stands in line to pay for merchandise with an RFID-embedded credit card or leaves a store with RFID-tagged items. "You can't just think about what you're doing with the technology. You have to think about what somebody else might be able to do with it," says Mark Roberti, editor of RFID Journal.
One Privacy Approach Can't Cover All
Global business demands the sharing of data across borders--and complying with the privacy requirements of each country along the way.
David Hoffman, director of privacy and security policy for Intel, based his operation in Munich, Germany, two years ago mainly to establish policies and procedures around the "hugely variant" European privacy regs. "In the U.K. and the Czech Republic, they want you to tell them how you're processing customer data," Hoffman says. "In Spain and France, they'd like you to register each individual application, to provide details about each individual database." Hoffman says he and his staff spent a lot of time creating processes to ensure compliance with those privacy mandates, but that work is mostly finished and he'll be heading back to the United States in a few months.
To blunt criticism that its privacy laws are too lax, and to counteract the negative publicity of some recent data breaches, India is working on tightening up its regulations. Lawmakers early next year will discuss proposed amendments to India's Information Technology Act of 2000 that impose heavy fines for data security breaches, identity theft, and e-commerce fraud, according to published reports. That's good news for companies that outsource data-oriented operations to India, but they need to stay abreast of the potential impact of those changes.
As data gets more mobile and abundant, the list of privacy risks and rules will keep evolving. Some companies will take them more seriously than others and be willing to do the hard work to avoid privacy lapses and ensure the goodwill of their customers and partners. Therein lies the opportunity for companies willing to grab it.