
Privacy and Security Law Report®

Reproduced with permission from Privacy & Security Law Report, 13 PVLR 1387, 08/11/2014. Copyright © 2014 by The Bureau of National Affairs, Inc. (800-372-1033) http://www.bna.com
Privacy and Data Security 15 Do’s and Don’ts: Tips for Avoiding FTC Enforcement
BY ALYSA Z. HUTNIK AND CRYSTAL N. SKELTON
Alysa Z. Hutnik is a partner in Kelley Drye & Warren LLP's Washington office. Her practice includes representing clients in all forms of privacy, data security and advertising matters, from counseling to defending clients in Federal Trade Commission and state attorneys general investigations and litigation. Many of her matters focus in particular on emerging technologies, including cloud, mobile, calling/texting practices and big data-related services.

Crystal N. Skelton, an associate at Kelley Drye's Washington office, practices in the areas of advertising and marketing, privacy and data security and consumer product safety. She counsels clients on all aspects of regulatory compliance, with a focus on statutes and regulations enforced by the FTC, Consumer Product Safety Commission and state attorneys general.

The Federal Trade Commission remains laser-focused on protecting consumers' privacy and security online, offline and in the mobile environment. To date, the FTC has obtained more than 60 privacy settlements and brought over 50 data security cases (two of which are currently in litigation). At least 20 of these cases involved children's personal information; 11 involved the collection of precise geolocation information; eight involved mobile applications or mobile devices; three were against online or mobile platforms; and one involved the Internet of things. These enforcement examples represent one or more business practices that the FTC views as unfair and/or deceptive pursuant to Section 5 of the FTC Act.1
While nearly any company that financially benefits from the use of consumer personal data is a potential FTC target, understanding what particular practices the FTC views as unlawful can be challenging. Indeed, for those who do not regularly practice in this area, closely reviewing the more than 100 privacy and security FTC enforcement cases and related business guidance (as well as the broader body of FTC consumer protection law) is not realistic. To help navigate that task, this article summarizes 15 do's and don'ts to consider when implementing privacy and data security practices, based on past FTC enforcement.
Privacy Do’s and Don’ts
1. DO Build in Privacy Protections From the
Beginning.
Build in consumer privacy protections—privacy by design—from the beginning and at each key stage of a product's development. A cursory "legal" review just
before a product launch is not likely to be effective in
identifying and resolving problematic data practices.
Rather, the emphasis should be on proactively incorporating a privacy analysis as to each product, service
or platform that is capable of collecting, accessing, storing or transmitting personal or individual device information. Such analysis often includes (a) assessing
whether there are legitimate business reasons for collecting each type of personal or device information; (b)
understanding all the ways the information will be
used; (c) ensuring reasonable limits on the collection
and retention of such data; (d) implementing reasonable procedures to promote data accuracy and integrity;
and (e) employing appropriate security and access restrictions. In an initial investigation into a company's practices, the FTC will often look to see whether, and to what extent, a company applied this type of analysis from the beginning.2

1 15 U.S.C. § 45(a). In addition to the FTC Act, there are 33 other laws, rules and guides that provide the agency with enforcement authority to protect consumers' privacy and security. These include, but are not limited to, the Children's Online Privacy Protection Act (COPPA) (15 U.S.C. § 6501 et seq.; 16 C.F.R. pt. 312), the Gramm-Leach-Bliley Act (15 U.S.C. § 6801 et seq.; 16 C.F.R. pts. 313–314) and the Fair Credit Reporting Act (15 U.S.C. § 1681 et seq.; 16 C.F.R. pts. 602–698).
2. DO Understand What Information Will Be
Collected From or About the User.
Companies should understand precisely what information will be collected, actively and passively, from and about the user, and classify which of that information is personal (and, of that bucket, what should be treated as highly sensitive, such as children's information, health information, financial account information, Social Security numbers and precise geolocation information). It is also useful to proactively audit the proposed data flow to confirm whether any additional individual data are being collected unintentionally.
In analyzing what information is being collected, it
helps to cast a broad net and include data that are obviously individually identifiable, as well as data that
might seem anonymous but which the FTC views as
personal, such as: (1) a persistent identifier, such as a
customer number held in a cookie, a static Internet Protocol (IP) address, a mobile device ID or a processor serial number; (2) precise geolocation data of an individual or mobile device, including GPS-based, Wi-Fi-based or cell-based location information; and (3) an
authentication credential, such as a user name or password.3
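To make that classification exercise concrete, a minimal sketch of a data-inventory check follows (in Python). The field names, sensitivity tiers and the proposed_collection list are hypothetical illustrations rather than an FTC-defined taxonomy; the point is simply that sensitive or unclassified fields should surface for human review before any code ships.

# Illustrative sketch only: tier proposed data fields by sensitivity so that
# sensitive or unexpected collection is flagged before launch. The field names
# and tiers below are hypothetical, not an FTC-defined taxonomy.

SENSITIVE = {
    "precise_geolocation", "health_condition", "financial_account_number",
    "social_security_number", "child_birthdate",
}
PERSONAL = {
    "email_address", "username", "static_ip_address",
    "cookie_customer_id", "mobile_device_id",
}

def classify(field: str) -> str:
    if field in SENSITIVE:
        return "sensitive"   # expect just-in-time notice and affirmative express consent
    if field in PERSONAL:
        return "personal"    # describe in the privacy policy; limit collection and retention
    return "needs review"    # unclassified fields get a human privacy review

proposed_collection = ["email_address", "precise_geolocation", "accelerometer_raw"]
for field in proposed_collection:
    print(f"{field}: {classify(field)}")

In practice the inventory would be generated from the product's actual data flows, including any third-party SDKs, but even a simple mapping like this makes unintentional collection easier to spot.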
3. DO Clearly and Accurately Communicate Data
Practices With Consumers.
At a minimum, companies should clearly and accurately describe their data collection, use, disclosure and
protection measures in a prominent, plain language privacy policy that is readily accessible prior to the consumer’s purchase or download of the product or service. Companies should also assess what other
materials—from advertising, press releases, user manuals, etc.—describe data practices, and confirm that the
representations are accurate (as well as whether there
are any material omissions about the data practices,
which should be conveyed to provide an accurate understanding of the data use by the consumer). The only way to ensure that privacy practices are described accurately is to know what information the company (or those acting on its behalf) actually collects, and how it is used, shared and protected.4

2 See, e.g., Complaint, In re HTC America Inc., File No. 122-3049 (FTC Feb. 22, 2013), available at http://www.ftc.gov/sites/default/files/documents/cases/2013/02/130222htccmpt.pdf (12 PVLR 377, 3/4/13).
3 See, e.g., Agreement Containing Consent Order, In re Snapchat, Inc., File No. 132-3078 (FTC proposed May 8, 2014), available at http://www.ftc.gov/system/files/documents/cases/140508snapchatorder.pdf (13 PVLR 832, 5/12/14); Complaint, In re Credit Karma, Inc., File No. 132-3091 (FTC Mar. 28, 2014), available at http://www.ftc.gov/system/files/documents/cases/140328creditkarmacmpt.pdf (13 PVLR 557, 3/31/14).
4. DON’T Forget to Update Your Representations
Along the Way to Match Your Privacy Practices.
Companies should confirm that all representations
made about a product and how personal information is
handled and secured (whether in the privacy policy or
in other materials) remain consistent, even as business
practices evolve. This is especially important when
there is an update that (intentionally or unintentionally)
causes more or new types of information to be collected
from or about the user, or when the company has new
and materially different ways in which it is using or disclosing such information.
Business practices singled out by the FTC for failing
to do this include: (a) automatically collecting users’
mobile address books, without providing notice to users
or obtaining consent;5 (b) collecting and transmitting
the device’s precise geolocation to ad networks along
with persistent device identifiers that can be used to
track a user’s location over time, when it was represented that the app collected only limited information;6
(c) representing to certain users that the company
would not place tracking cookies or serve targeted advertisements based on those tracking cookies, when, in
actuality, those users did receive tracking cookies and
targeted advertisements;7 and (d) representing that the
company was committed to protecting users’ identities,
data and privacy with reasonable and appropriate security practices, when in fact the security practices were
disabled.8
5. DO Provide Users With User-Friendly Choice
Options.
The FTC believes that companies should offer clear
and concise choice mechanisms that are easy to use and
are delivered at a time and in a context that are relevant
to the consumer’s decision about whether to allow the
data collection or use.9 If the website, app, device or
other product will be collecting sensitive personal information from users,10 or it will involve data practices that are likely to surprise consumers (such as charges in connection with a product described as "free"11 or logging computer keystrokes to capture confidential, personal information12), the FTC urges companies to offer "just-in-time" disclosures and obtain "affirmative express consent."13 What a reasonable consumer may expect to be collected (and how his or her data will be used) likely depends on the content being offered and what benefits are provided to consumers. This can help companies proactively issue-spot and prevent a privacy problem.

4 See, e.g., Complaint, In re Apple Inc., File No. 112-3108 (FTC Jan. 15, 2014), available at http://www.ftc.gov/sites/default/files/documents/cases/140115applecmpt.pdf.
5 See, e.g., In re Snapchat, Inc., FTC File No. 132-3078; Complaint for Civil Penalties, Permanent Injunction, and Other Relief, United States v. Path, Inc., No. 3:13-cv-00448 (N.D. Cal. Jan. 31, 2013), available at http://www.bloomberglaw.com/public/document/United_States_of_America_v_Path_Inc_Docket_No_313cv00448_ND_Cal_J/1 (12 PVLR 188, 2/4/13).
6 Complaint, In re Goldenshores Techs., LLC, File No. 132-3087 (FTC Dec. 5, 2013), available at http://www.ftc.gov/sites/default/files/documents/cases/131205goldenshorescmpt.pdf (12 PVLR 2027, 12/9/13).
7 Complaint for Civil Penalties and Other Relief, United States v. Google Inc., No. 5:12-cv-04177 (N.D. Cal. Aug. 9, 2012), available at http://www.ftc.gov/sites/default/files/documents/cases/2012/08/120809googlecmptexhibits.pdf (11 PVLR 1255, 8/13/12).
8 In re Credit Karma, Inc., FTC File No. 132-3091.
9 FTC, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (March 2012), available at http://www.ftc.gov/reports/protecting-consumer-privacy-era-rapid-change-recommendations-businesses-policymakers (11 PVLR 590, 4/2/12).
6. DON'T Disregard a Consumer's Choice to Limit Data Collection.

If a website, app, product, service or platform provides consumers with the option to limit or prohibit the collection of personal information, ensure that the consumer's choice is honored. For instance, the FTC settled allegations that the developer of the "Brightest Flashlight Free" app deceived consumers by presenting them with an option to not share their information, even though the information was shared automatically, effectively rendering the option meaningless.14

Data Security Do's and Don'ts

7. DO Implement a Reasonable Security Program and Have Procedures in Place to Respond to a Data Breach.

A data breach—large or small—can often prompt the FTC to investigate a company's data and security practices. Ideally, before a breach occurs, a company will have implemented a security program that is reasonably designed to (a) address security risks related to the development and management of new and existing products and services for consumers and (b) protect the security, integrity and confidentiality of consumer information.15

Once a company discovers a potential data breach, it should promptly and competently investigate and, once the breach is confirmed, take appropriate steps to send consumer notice (in accordance with state laws) and review and address existing vulnerabilities to prevent a similar recurrence. The FTC has informally closed many of its data security investigations following a showing by the company of a reasonable and prudent incident response to the breach and a reasonable security program in place.16 Conversely, companies that have failed to take sufficient action in response to a data breach, or that have failed to implement a reasonable security program, are more likely to be the subject of FTC enforcement.17
10 In re Goldenshores Techs., LLC, FTC File No. 132-3087.
11 In re Apple Inc., FTC File No. 112-3108.
12 See, e.g., Complaint, In re DesignerWare, LLC, File No. 112-3151 (FTC Sept. 25, 2012), available at http://www.ftc.gov/sites/default/files/documents/cases/2012/09/120925designerwarecmpt.pdf (FTC alleged that rent-to-own companies spied on consumers using rented computers, capturing screenshots of confidential and personal information, logging computer keystrokes and taking webcam pictures of users in their homes without notice to or consent from the consumer) (11 PVLR 1461, 10/1/12).
13 See, e.g., Decision and Order, In re Apple Inc., File No. 112-3108 (FTC Mar. 25, 2014), available at http://www.ftc.gov/system/files/documents/cases/140327appledo.pdf.
14 In re Goldenshores Techs., LLC, FTC File No. 132-3087.
15 See, e.g., Agreement Containing Consent Order, In re Credit Karma, Inc., File No. 132-3091 (FTC proposed Mar. 28, 2014), available at http://www.ftc.gov/system/files/documents/cases/140328creditkarmaorder.pdf.
8. DON'T Store or Transmit Personal Data Without Ensuring the Data Are Reasonably Secured.

Companies should take reasonable steps to secure personal information stored or transmitted from the website, app, product, service or platform. Although the FTC has not set a specific standard for securing such information, it recognizes that overriding default Secure Sockets Layer (SSL) settings without providing additional layers of protection;18 failing to follow well-known and commonly accepted secure coding or programming practices;19 or transmitting or storing user login credentials in clear, readable text on mobile devices or over the Internet20 may be considered unlawful, unfair and/or deceptive acts or practices.
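As one concrete illustration of these points, and in particular the warning against keeping login credentials in clear, readable text, the following minimal sketch stores only a salted, iterated hash of a password using Python's standard library. The hash function and iteration count are reasonable assumptions for illustration, not an FTC-prescribed standard, and any transmission of credentials should separately occur over properly validated TLS.

import hashlib
import hmac
import os

# Illustrative sketch: never persist the clear-text password; store a per-user
# salt and a slow, iterated hash instead. Parameter choices are assumptions.

def hash_password(password: str, iterations: int = 200_000) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes,
                    iterations: int = 200_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, expected)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)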
9. DO Test and/or Audit the Program’s Security
Before Product Launch.
The FTC has settled with a number of companies
over allegations that they failed to employ reasonable
and appropriate security in the design, development,
testing or maintenance of their website, app, product,
service or platform.21 Oftentimes, a company will conduct beta testing in which it disables or does not include
certain security measures. Before releasing a product to
the marketplace, companies should identify and prevent such vulnerabilities by performing an adequate security review prior to launch to ensure these security
measures are reinstated.22
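The Credit Karma allegation described in footnote 22 (an SSL certificate-validation bypass intended for testing that shipped in the production app) suggests one simple safeguard: any test-only relaxation of certificate checking should hinge on an explicit, loudly named switch that defaults to off, so a pre-launch review need only confirm the switch is unset in release builds. The sketch below is illustrative; the APP_ENV and ALLOW_INSECURE_TLS environment variable names are hypothetical.

import os
import ssl

# Illustrative sketch: certificate and hostname validation stay on by default and
# can be relaxed only when two explicitly named, test-only switches are both set.

def make_tls_context() -> ssl.SSLContext:
    if (os.environ.get("APP_ENV") == "test"
            and os.environ.get("ALLOW_INSECURE_TLS") == "1"):
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # test-only: accepts any certificate
        return ctx
    return ssl.create_default_context()  # default/production path: full validation

context = make_tls_context()
print("verifying certificates:", context.verify_mode == ssl.CERT_REQUIRED)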
16 See, e.g., FTC Closing Letter from Joel Winston, Associate Director, FTC Division of Privacy and Identity Protection, to Orrick, Herrington & Sutcliffe LLP Regarding NovaStar Financial, Inc. and NovaStar Mortgage, Inc. (Apr. 4, 2008), available at http://www.ftc.gov/sites/default/files/documents/closing_letters/novastar-financial-inc.and-novastar-mortgage-inc./080404novastar.pdf.
17 See, e.g., In re Credit Karma, Inc., FTC File No. 132-3091; Complaint for Injunctive and Other Equitable Relief, FTC v. Wyndham Worldwide Corp., No. 2:13-CV-01887-ES-SCM (D. Ariz. Jun. 26, 2012), available at http://www.ftc.gov/sites/default/files/documents/cases/2012/06/120626wyndamhotelscmpt.pdf (11 PVLR 1069, 7/2/12).
18 See, e.g., In re Credit Karma, Inc., FTC File No. 132-3091; Complaint, In re Fandango, LLC, File No. 132-3089 (FTC Mar. 28, 2014), available at http://www.ftc.gov/system/files/documents/cases/140328fandangocmpt.pdf (13 PVLR 557, 3/31/14).
19 In re HTC America Inc., FTC File No. 122-3049.
20 Complaint, In re TRENDnet, Inc., File No. 122-3090 (FTC Sept. 4, 2013), available at http://www.ftc.gov/sites/default/files/documents/cases/2013/09/130903trendnetcmpt.pdf (12 PVLR 1532, 9/9/13).
21 See, e.g., In re TRENDnet, Inc., FTC File No. 122-3090; In re Fandango, LLC, FTC File No. 132-3089.
22 See, e.g., In re Credit Karma, Inc., FTC File No. 132-3091 (The FTC alleged that Credit Karma Inc. disabled its app's SSL certificate validation. This was intended to be disabled "in testing only," but the FTC alleged that the company failed to ensure this code's removal from the production version of the application. The FTC asserted that the company could have identified and prevented this vulnerability by performing an adequate security review prior to launch).
10. DON’T Forget to Provide Adequate Training
and Proper Oversight.
Companies should provide adequate privacy and security guidance or training for those developing products or services that collect and use personal information and implement reasonable oversight procedures
for any third-party service providers that will have similar access and use privileges.23 On this front, FTC guidance suggests that companies include within their security program: (a) employee training and management,
including in secure engineering and defensive programming; (b) development and use of reasonable steps for
selecting and retaining service providers capable of
maintaining adequate security practices; and (c) a requirement that service providers by contract implement
and maintain appropriate data and security safeguards.24
11. DO Confirm Users’ Identities Through a
Verification Process, as Necessary.
When a website, app, product, service or platform allows a user to register or sign up, the company should
have a reasonable method to verify the identity of the
user. This is especially important when users are sharing personal information with one another. For example, the FTC brought an enforcement action against
Snapchat Inc. after it failed to verify users' phone numbers during registration.25 Because of that failure, Snapchat users were sending personal snaps to complete strangers who had registered with phone numbers that did not belong to them. This failure resulted in a security breach that permitted attackers to compile a database of 4.6 million Snapchat user names and phone numbers.
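A minimal sketch of the kind of registration check at issue appears below: the service sends a one-time code to the claimed phone number and treats the number as verified only after the user returns it. The send_sms helper is a hypothetical stand-in for an SMS gateway, and the flow is a simplified illustration, not a description of Snapchat's implementation or an FTC-mandated design.

import secrets

# Illustrative sketch of phone-number verification at registration. send_sms()
# is a hypothetical stand-in for a real SMS gateway.

def send_sms(phone_number: str, message: str) -> None:
    print(f"SMS to {phone_number}: {message}")  # placeholder transport

def start_verification(phone_number: str) -> str:
    code = f"{secrets.randbelow(1_000_000):06d}"  # random six-digit code
    send_sms(phone_number, f"Your verification code is {code}")
    return code  # in a real system, store server-side with a short expiry

def confirm_verification(submitted: str, expected: str) -> bool:
    return secrets.compare_digest(submitted, expected)

expected = start_verification("+15555550123")
print("verified:", confirm_verification(expected, expected))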
When a product, service or platform provides a user with a purchase option, companies should take reasonable measures to verify that the purchase option and associated charge(s) are clearly and accurately presented, that the authorized (adult) user confirms the charge and that there are appropriate safeguards to prevent, detect and respond to potential fraud or unauthorized charges. FTC settlements and litigation specifically highlight the FTC's concern in this area relating to charges for in-app purchases26 and text-based subscriptions for content such as flirting tips, horoscope information or celebrity gossip.27

23 See, e.g., In re HTC America Inc., FTC File No. 122-3049; In re Credit Karma, Inc., FTC File No. 132-3091.
24 See, e.g., In re Credit Karma, Inc., FTC File No. 132-3091.
25 See, e.g., In re Snapchat, Inc., FTC File No. 132-3078.
26 In re Apple Inc., FTC File No. 112-3108; Complaint for Permanent Injunction and Other Equitable Relief, FTC v. Amazon.com, Inc., No. 2:14-cv-01038 (W.D. Wash. Jul. 10, 2014), available at http://www.bloomberglaw.com/public/document/Federal_Trade_Commission_v_Amazoncom_Inc_Docket_No_214cv01038_WD_.
General Do’s and Don’ts
In addition to specific privacy and data security considerations, there are other general considerations that
companies might consider from the outset.
12. DO Apply Extra Scrutiny When It Comes to
Kids.
For any online or mobile destination that is targeted
to, or likely to be attractive to and used by, children, developers and marketers should confirm that it complies
with the Children’s Online Privacy Protection Act and
the FTC’s COPPA Rule.28 Even if children are not intentionally targeted, the same scrutiny should be applied
where it is reasonably anticipated that the online destination may collect children’s information. If it will
likely involve children, developers should also consider
whether the data collection or in-app practices make
sense from a business standpoint—weighing both the
risks and benefits.
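Where a general-audience service may nonetheless attract children, one common mechanism is a neutral age screen that routes users under 13 away from personal-data collection until verifiable parental consent is obtained. The sketch below is a simplified illustration of such a flow; the 13-year threshold comes from COPPA, but the function and path names are assumptions and the sketch is not a complete COPPA compliance program.

from datetime import date

# Illustrative sketch of a neutral age screen: compute age from a self-reported
# birth date and route under-13 users to a parental-consent flow before any
# personal information is collected. Path names are hypothetical.

def age_on(birth_date: date, today: date) -> int:
    years = today.year - birth_date.year
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return years if had_birthday else years - 1

def registration_path(birth_date: date, today: date) -> str:
    if age_on(birth_date, today) < 13:
        return "parental-consent-flow"    # collect no personal data yet
    return "standard-registration"

print(registration_path(date(1998, 3, 14), date(2014, 8, 11)))  # standard-registration
print(registration_path(date(2003, 9, 1), date(2014, 8, 11)))   # parental-consent-flow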
13. DON’T Avoid Monitoring and Addressing
Consumer Complaints.
Companies should have a clearly publicized and effective channel for receiving and addressing complaints
about the product and determining whether such complaints may indicate a systemic or design issue that
needs to be addressed in the product or service.29 Proactive monitoring and resolution of consumer complaints or complaints from other sources can help dispel consumer-related privacy and security concerns before they catch the attention of the FTC or other
consumer protection groups.
14. DO Appreciate That the FTC Will Not
Hesitate to Bring a Case Against a Third Party.
The FTC takes the position that a company can be liable for the acts and practices of a third party if the
company knew or should have known of the challenged
conduct, financially benefited from such conduct and
failed to take appropriate or prompt steps to address
the concerns. In many of the FTC's third-party liability cases, the company overstates the level of oversight or protection it provides over third parties or does not take reasonable steps to confirm that its third-party vendors or business partners can reasonably use and/or protect personal data shared with them.30
27 See, e.g., Complaint for Permanent Injunction and Other Equitable Relief, FTC v. T-Mobile USA, Inc., No. 2:14-cv-00967 (W.D. Wash. Jul. 1, 2014), available at http://www.bloomberglaw.com/public/document/Federal_Trade_Commission_v_TMobile_USA_Inc_Docket_No_214cv00967_W.
28 15 U.S.C. §§ 6501–6506; 16 C.F.R. pt. 312.
29 See, e.g., In re Credit Karma, Inc., FTC File No. 132-3091; In re Fandango, LLC, FTC File No. 132-3089; In re HTC America Inc., FTC File No. 122-3049.
30 See, e.g., In re Credit Karma, Inc., FTC File No. 132-3091; In re Fandango, LLC, FTC File No. 132-3089; In re Apple Inc., FTC File No. 112-3108; FTC v. Amazon.com, Inc., No. 2:14-cv-01038 (W.D. Wash.); FTC v. T-Mobile USA, Inc., No. 2:14-cv-00967 (W.D. Wash.).
15. DON’T Forget to Monitor for New
Developments at the FTC.
Many of the FTC's enforcement actions came soon after the agency held workshops or seminars, issued other educational briefings addressing new areas, or issued staff reports to industry with recommended business guidance. If past is prologue, we can anticipate
future enforcement on facial recognition, mobile device
tracking, alternative scoring products and/or the use of
connected health and fitness devices.
Conclusion

The FTC will continue to make privacy and data security enforcement a priority. As our interconnected world continues to grow, the FTC will closely scrutinize companies' practices with respect to the collection, use, handling and security of consumers' personal information. Staying mindful of FTC enforcement actions and the lessons learned from such cases, as well as working closely with experienced privacy/security counsel, can help companies proactively identify and address privacy and data security issues and hopefully avoid appearing on the FTC's radar.