
Legal Aspects of Data, Privacy, and Identity

  1. Intro

The conflict between personal agency and utility is not new. This tension is expressed in America's very founding documents, although there the subject is not privacy but property. Americans -- and citizens of all jurisdictions and sovereignties -- face a fundamental reorganization of this tension. Instead of the crown confiscating a subject's property without compensation or recourse, data is now collected without knowledge or choice. Privacy is viewed as a substantive right unto itself, yet modern privacy concerns cannot be properly understood or protected without a reimagining of what the right to privacy protects: data.

If privacy is a relatively new right, then data rights are in their infancy. The concept of substantive data rights is the subject of this paper. A central contention of our paper is that data rights must be accompanied by a manageable mechanism through which the average user can exercise informed control over their data. To protect our modern concept of privacy, we must include both data rights and a mechanism by which individuals can control their data. Advances in data science are made every day, increasing the utility of data for private and public purposes while also increasing the incentive to use data in nefarious and inappropriate ways. Even uses we find useful and positive must be subject to more informed and even-handed negotiated transactions.

  2. The Problem
    1. The Basic Data Social Contract is Broken

      1. Data is diffuse and therefore hard for users to track and manage.

Most members of the general public do not know who is collecting what data on their personal activities. In fact, very few experts can fully comprehend all the implications of data collection, storage, and use. There is no doubt that anyone who uses an online service leaves a trail of unintentional data to be later collected, stored, and used by the initial service provider and by third parties receiving data from that provider. Because of the complexity and opacity of the current system, almost everyone is truly powerless to exercise rights over their data in the same manner one would exercise rights over property.

Currently there are many entities that can and do track user behavior. First, Internet Service Providers (“ISPs”) can track users across all of their uses of the internet. Some even employ, much to the chagrin of privacy and consumer advocates, Deep Packet Inspection (“DPI”) as a means of deep data mining.[1] While this form of data collection is relatively all-encompassing, it is extremely subtle and almost always beyond the control of the user.[2] Google and Facebook are far-reaching and grow every day, but nothing comes close to the data that pass through ISPs’ hands. Some ISPs choose not to use DPI in this manner, but the ability is there nonetheless. DPI is also used in other “enterprise databases” run by large institutions, governments, and companies. Even if a user could somehow opt out of their ISP’s use of DPI on their data, the use of DPI by other enterprise databases makes it impossible for one user to effectively combat DPI at any single point.
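
To make the mechanics concrete, here is a toy sketch of the DPI idea. This is only an illustration under stated assumptions, not how ISP-grade middleboxes are built: real systems run at line rate on dedicated hardware, and the signature patterns below are hypothetical. The point is that DPI reads the application payload itself, not just the addressing headers.

```python
# Toy illustration of the DPI concept: matching signatures against a packet's
# application payload rather than its headers. The signatures are hypothetical;
# real DPI systems are vastly more sophisticated.
import re

SIGNATURES = {
    "http_host": re.compile(rb"Host:\s*(\S+)"),                 # destination site
    "search_query": re.compile(rb"GET\s+/search\?q=([^ &]+)"),  # what was searched
}

def inspect_payload(payload: bytes) -> dict:
    """Return every signature that matches the raw application payload."""
    hits = {}
    for name, pattern in SIGNATURES.items():
        match = pattern.search(payload)
        if match:
            hits[name] = match.group(1).decode(errors="replace")
    return hits

# An unencrypted HTTP request exposes both the destination and the query.
sample = b"GET /search?q=knee+pain HTTP/1.1\r\nHost: example.com\r\n\r\n"
print(inspect_payload(sample))  # {'http_host': 'example.com', 'search_query': 'knee+pain'}
```

Because the user's traffic necessarily transits the ISP, no user-side setting prevents this kind of observation of unencrypted traffic, which is why the opt-out problem described above is so intractable.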

Another layer of data collection lies with browsers and how they track user traffic. In 2010 the FTC recommended that Congress pass the introduced Do Not Track legislation, and although those bills have yet to pass, the recommendation triggered a response in the community; many entities, including the digital advertising trade group DAA, have promised to implement and promote Do Not Track (“DNT”) technology.[3] The adoption of DNT is far from universal, and standards for the technology still have not been hammered out. While Mozilla has taken pains to adopt DNT, it is an open-source browser without corporate interests in the traditional sense. Companies like Twitter have promised to establish and adopt DNT policies, but have yet to deliver fully on that promise. The FTC’s 2012 report, Protecting Consumer Privacy in an Era of Rapid Change, states the following: “[t]he Commission agrees that the range of privacy-related harms is more expansive than economic or physical harm or unwarranted intrusions and that any privacy framework should recognize additional harms that might arise from unanticipated uses of data.” In short, we don’t know what we don’t know, and this allows some stakeholders to dictate terms to their benefit, sometimes to the harm of the consumer.
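
Part of DNT's appeal is its simplicity: the browser adds a "DNT: 1" header to each request, and a cooperating server declines to track. Below is a minimal sketch of a server honoring that header; the header itself is real, while the handler and its toy "analytics" log are hypothetical illustrations, not any particular company's implementation.

```python
# Minimal sketch of a server that honors the Do Not Track header.
from http.server import BaseHTTPRequestHandler, HTTPServer

VISIT_LOG = []  # stand-in for a real analytics/tracking pipeline

class DNTAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("DNT") == "1":
            body = b"Hello (not tracked)"    # user opted out: record nothing
        else:
            VISIT_LOG.append(self.path)      # toy behavioral "tracking"
            body = b"Hello (visit recorded)"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), DNTAwareHandler).serve_forever()
```

Nothing in the protocol enforces the opt-out; the header is merely a request, which is exactly why adoption and enforceable standards matter so much.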

Finally comes the Internet of Things. The term Internet of Things (“IoT”) was first coined in an academic paper in 2005; since then there have been several different definitions of what exactly the IoT is. Regardless of the differences, the basic understanding is the merging of physical data and the internet into one cohesive network. The Internet of Things market is expected to grow on average by 13 percent each year through 2020, reaching $3.04 trillion and connecting billions of objects that year, according to researcher IDC. The most prominent example so far is in manufacturing, where sensors monitor large industrial machines for a variety of conditions: heat, timing, and so on. Many companies and governments are wading into this field with high hopes. The buzz around the IoT has reached a fever pitch, and while there is undoubtedly some hype, the promise of the IoT carries serious potential along with serious risk. Initially, users risk their data being used in harmful and unintended ways, such as in credit or employment decisions or insurance policies. Moreover, even if users fully understand how their data will be used and stored, there is no guarantee of the data’s safety.
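
The manufacturing example can be made concrete with a small sketch: the core IoT monitoring pattern is a stream of sensor readings checked against operating limits. The machine names, sensor types, and thresholds below are all hypothetical.

```python
# Toy version of the industrial-monitoring pattern described above:
# sensor readings compared against per-sensor operating limits.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    machine_id: str
    sensor: str     # e.g., "bearing_temp_c"
    value: float

# Hypothetical alert thresholds per sensor type.
THRESHOLDS = {"bearing_temp_c": 85.0, "vibration_mm_s": 7.1}

def check(reading: Reading) -> Optional[str]:
    """Return an alert message if the reading exceeds its threshold."""
    limit = THRESHOLDS.get(reading.sensor)
    if limit is not None and reading.value > limit:
        return f"ALERT {reading.machine_id}: {reading.sensor}={reading.value} exceeds {limit}"
    return None

for r in [Reading("press-7", "bearing_temp_c", 91.2),
          Reading("press-7", "vibration_mm_s", 3.0)]:
    alert = check(r)
    if alert:
        print(alert)
```

Note that every reading, alert or not, is data about the machine's owner and operators; where it flows after this check is precisely the governance question this paper raises.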

Data has been called the new century’s oil. This seems an apt description: oil and data are both inert and must be sought out, they are useful only after some manipulation, they run many of our important networks, and without either, society would seize up altogether. However, data is increasingly overtaking oil as the preeminent commodity, and so we must learn from the mistakes made with the first commodity so that we do not repeat them with the new one.

      2. Terms-of-use for everyday services are impossible to understand.

The primary issue is the initial terms-of-use agreement most companies use to obtain customers’ acquiescence to the use of their data. Almost no one, including the experts, reads these agreements. Most customer terms-of-use agreements are long, written in excruciatingly small font, and in a language more akin to Elizabethan poetry than plain English. Further, the provisions of the agreement are one-sided and generally compose a take-it-or-leave-it transaction: the company gets to do what it wants with your data, while you get to use its service at “no cost.”

The problem with these “no cost,” take-it-or-leave-it agreements is that the user rarely understands the value of the commodity they are giving away. Moreover, the user is giving the data away completely and entirely for the duration of the data’s life, with no means to ensure contextual use, appropriate storage, or proper elimination of the data. Rarely does a consumer have the option to walk away, if only because the service in question has no real alternative; and where an industry competitor exists, the terms of agreement are usually presented at a point in the transaction where it would be incredibly inconvenient to start over with a new service provider (especially if the alternative provider doesn’t offer better terms). What consumers are left with is a Hobson’s choice: use the service and trade away all data and all control, or do not use the service at all.

Even when a company does offer mechanisms to control privacy settings and data use, they are usually difficult to find and almost never preclude the service provider from using the data, only other users from “seeing” the data. For instance, Facebook allows users to connect with third-party sites and apps -- most of the time, to use the third-party site or app you must allow it to access your Facebook or Google account in a take-it-or-leave-it agreement -- yet the mechanism to turn off the app is quite difficult to find, buried within several layers of the user homepage.

      3. Very little trust exists between users and data operators in both the public and private sphere--understandably so.

The complexity of terms-of-use agreements and the one-sided provisions within them erode the public’s trust, not only in private companies but in the entire environment of data collection and use generally. This apprehension toward big-data collection and use is well earned, especially after the recent PRISM (and related programs) scandal, in which private companies acquiesced in the warrantless collection of call record data without the knowledge of customers or citizens.

Recently the U.S. Senate introduced the Cybersecurity Information Sharing Act (“CISA”), which has been criticized for radically expanding the definition of a cybersecurity threat to mean almost anything. This would allow private companies to cooperate with government agencies in a broader array of situations than is currently permitted. More troubling is the provision that would allow the NSA automatic access to data collected and held by private corporations. This bill has been introduced and defeated before, but with the recent hacks at Sony and J.P. Morgan there is new life in the legislation. The bill might be on President Obama’s desk by the end of the month, and most believe that the president has reversed his opposition and will sign the bill into law.

There are undoubtedly unintended consequences that will result from laws like Section 702 of the FISA Amendments Act, Section 215 of the USA PATRIOT Act, and the proposed CISA, or from agency programs like PRISM, XKeyscore, MUSCULAR, and MYSTIC. Most Americans are unable even to identify Edward Snowden or Julian Assange, let alone know anything of the above laws and programs. When the average citizen is presented with the privacy issue as a personal invasion of privacy by the government, they respond strongly. In this instance, ambivalence demonstrates ignorance, an ignorance that even the most savvy citizens find difficult to dispel. The laws and issues are complicated, and only when they are presented as a narrative can people truly understand the implications of the choices they make on the internet and how the results of those actions can be collected and held without their full understanding. The obfuscation and complication breed ignorance and distrust, and the current legal structure and procedures only serve to perpetuate this deep-seated suspicion.

However, the question is not how to make this system more transparent; that would be a fool’s errand. Perhaps the complicated surveillance and data-collection laws, both private and public, can be reasonably pared down, but the best option might simply be preemption of the system: create a mechanism so that citizens and users can control their privacy and data rights as they understand them, on their own ground and on their own terms. Bring the law to them and force it to behave within parameters the user can comprehend.

      4. Companies are incentivized to sell data, creating a basic conflict between the interests of users and commercial actors.

Even when companies are not complicit in large, secret, dragnet-style data collection, they are incentivized by their very business models to provide as much granular data as possible. Facebook, Google, and other “free,” ubiquitous internet services must collect, aggregate, and then “sell” data in order to produce a revenue stream. Google and Facebook make money (and a lot of it) from the fact that they have almost unmitigated access to data from huge swathes of society. More importantly, they can use the data generated by one person in particular and customize marketing to fit that person’s activity on the service. The ability to tell what you like from what you do on the site is a major reason why advertising on Facebook and Google makes so much money.


  1. Tech is advancing too quickly for lawmakers to keep up.
    1. Current legal thinking on privacy protection in the digital arena is out-of-date.
      1. The Supreme Court has exhibited a basic lack of understanding of the technology, which doesn’t bode well for its appropriate regulation under the law.
    2. Current legal structures fail to adequately protect privacy.
      1. The constitutional status of a “right to privacy” remains ambiguous. States have sometimes stepped in to fill the void, but protection remains badly uneven. [a][b][c]
    3. The law lacks a fundamental understanding of basic data rights.
  2. Examples of potential tech solutions that create conflicts between the user and the provider
    1. Google’s Pony Express; Facebook Messenger Payments.
    2. The fact that either
  3. The Legal Solution
    1. Data Bill of Rights

Recently the White House proposed a Consumer Data Bill of Rights. In this document it lists several tenets by which companies should conduct themselves in the context of collecting, storing, and using consumer data. Many of these principles are based on the work of others, such as Sandy Pentland of the MIT Media Lab and the FTC’s Fair Information Practice Principles (“FIPPs”). Among all of these sources there are common themes: choice, notice, data minimization, context, transparency, and accountability. The Data Bill of Rights school of thought reimagines the way data is understood under the law. Rather than rely on a privacy-oriented framework for trying to protect users’ data interests, this idea begins to treat control over data as a property right protected by due process. Individuals can consent to the sharing of their data with private actors through acquiescence to understandable terms of use. Government actors should be limited in their freedom to collect or in any way use or transmit personal data by due process rights. More importantly, the effectiveness of the protection measures must be quantified; no more take-it-or-leave-it, tiny-font terms of use.

In February the White House submitted to Congress proposed Data Bill of Rights legislation. In it the government covers not only private companies but also all levels of government. The bill would empower the FTC to promulgate rules to protect a set of rights that would be codified in the United States Code. Here are the basic principles:

  1. Transparency - Each covered entity shall provide individuals, in concise and easily understandable language, with accurate, clear, timely, and conspicuous notice about the covered entity’s privacy and security practices. Such notice shall be reasonable in light of context. Covered entities shall provide convenient and reasonable access to such notice, and any updates or modifications to such notice, to the individuals about whom they process personal data.

  2. Individual Control -

Each covered entity shall provide individuals with reasonable means to control the processing of personal data about them in proportion to the privacy risk to the individual and consistent with context.

  3. Respect for Context -

If a covered entity processes personal data in a manner that is reasonable in light of context, this section does not apply. Personal data processing that fulfills an individual’s request shall be presumed to be reasonable in light of context.

  4. Focused Collection and Responsible Use -

Each covered entity may only collect, retain, and use personal data in a manner that is reasonable in light of context. A covered entity shall consider ways to minimize privacy risk when determining its personal data collection, retention, and use practices.

  5. Security -

Each covered entity shall:

  1. identify reasonably foreseeable internal and external risks to the privacy and security of personal data that could result in the unauthorized disclosure, misuse, alteration, destruction, or other compromise of such information;

  2. establish, implement, and maintain safeguards reasonably designed to ensure the security of such personal data, including but not limited to protecting against unauthorized loss, misuse, alteration, destruction, access to, or use of such information;

  3. regularly assess the sufficiency of any safeguards in place to control reasonably foreseeable internal and external risks; and

  4. evaluate and adjust such safeguards in light of the assessments described above, any material changes in the operations or business arrangements of the covered entity, or any other circumstances that create a material impact on the privacy or security of personal data under the control of the covered entity.

  6. Access and Accuracy -

    1. Accuracy: Each covered entity shall, in a manner that is reasonable and appropriate for the privacy risks associated with such personal data, establish, implement, and maintain procedures to ensure that the personal data under its control is accurate. In developing such procedures, the covered entity shall consider the costs and benefits of ensuring the accuracy of the personal data.

    2. Access: Each covered entity shall, upon the request of an individual, provide that individual with reasonable access to, or an accurate representation of, personal data that both pertains to such individual and is under the control of such covered entity. The degree and means of any access shall be reasonable and appropriate for the privacy risks associated with the personal data, the risk of adverse action against the individual if the data is inaccurate, and the cost to the covered entity of providing access to the individual.

    3. Correction and Deletion: Each covered entity shall, within a reasonable period of time after receiving a request from an individual, provide the individual with a means to dispute and resolve the accuracy or completeness of the personal data pertaining to that individual that is under the control of such entity. The means of resolving a dispute shall be reasonable and appropriate for the privacy risks and the risk of an adverse action against an individual that are associated with such personal data.

  7. Accountability -

Each covered entity shall take measures appropriate to the privacy risks associated with its personal data practices to ensure compliance with its obligations pursuant to this Act, including but not limited to:

  1. Providing training to employees who access, collect, create, use, process, maintain, or disclose personal data;

  2. Conducting internal or independent evaluation of its privacy and data protections;

  3. Building appropriate consideration for privacy and data protections into the design of its systems and practices; and

  4. Binding any person to whom the covered entity discloses personal data to use such data consistently with the covered entity’s commitments with respect to the personal data and with the requirements set forth in Title I of this Act.

The above provisions fundamentally reimagine privacy protection in an ever-changing landscape of technology and policy. What this legislation and the accompanying FTC regulations would do is establish an unprecedented but much-needed level of expectation, awareness, and value around individual data. As stated above, privacy protection has in the past relied on analogies to contract law and its rights and obligations, but under this paradigm data rights will be viewed as their own substantive right protected by due process. No longer will one-sided, uninformed terms of use pass as adequate protection of the fundamental right to control one’s own data, which should be viewed as tantamount to an individual’s physical person.

What the best legal mechanism is to facilitate this protection and due process is not certain. Currently there are two main schools of thought: use-based and what this paper will call notice-and-choice. It is entirely feasible that both of these mechanisms could, separately or together, serve the purpose of the Data Bill of Rights. The question is which will properly balance the data rights of the user with the legitimate business interests of the companies that wish to use the data.

            Use-Based Model

Many commenters in the industry advocate that the use-based model be adopted by the FTC. Essentially, the use-based model would require lawmakers, regulators, service providers, and consumers to set down what constitutes “permissible” and “impermissible” uses of data. The simplicity of the system is a positive, especially compared to the alternative notice-and-choice system, which many believe would be too unwieldy and cumbersome to be useful for either business interests or data protection.

There are drawbacks to the use-based model. First, the actual limits of a use-based system are not specifically delineated or agreed upon. To fully flesh out a robust and utilitarian use-based system would require a multi-stakeholder process and likely the maturation of a service like the IoT. This leaves a lot of room for negotiation, and thus for control. Many would seek to use the openness of the field to advance their interests, and the use-based system would afford that openness for the foreseeable future while “permissible” and “impermissible” uses are determined. Second, the use-based system may not adequately solve a major flaw of the current system: user understanding. It isn’t difficult to imagine a consumer understanding a certain use to mean something far narrower in scope than the service provider does. This issue could be ameliorated with industry-wide standards, which will be forthcoming regardless of the model, but even so, use-based models cannot fully alleviate the common take-it-or-leave-it practice; for example, Google might offer truncated service if the user refuses to allow Google full use of their data. Most importantly, use-based models completely leave out the storage and collection elements of data protection. Again, this model relies on the current contractual understanding of data rights: if the user wishes to use a service or site like Google, then the user must agree to the terms of use; if the user disagrees with the terms, then the user is unable to access Google. If the use-based system is deployed, the current process is left largely intact, without accommodation for the fundamental shift in how society views data and its relationship to the individual’s physical person.
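
As a thought experiment, the core of a use-based regime can be sketched as a registry of permissible and impermissible uses per data category. The categories and purposes below are hypothetical placeholders; the multi-stakeholder process described above is what would actually fill them in.

```python
# Hedged sketch of a use-based policy check. All categories and purposes are
# hypothetical illustrations, not any proposed regulation's actual terms.
PERMISSIBLE_USES = {
    "location": {"navigation", "emergency_services"},
    "browsing_history": {"service_improvement"},
}
IMPERMISSIBLE_USES = {
    "location": {"insurance_pricing"},
    "browsing_history": {"employment_screening", "credit_scoring"},
}

def use_allowed(category: str, purpose: str) -> bool:
    """Permit only uses explicitly listed as permissible for the category."""
    if purpose in IMPERMISSIBLE_USES.get(category, set()):
        return False
    return purpose in PERMISSIBLE_USES.get(category, set())

assert use_allowed("location", "navigation")
assert not use_allowed("browsing_history", "credit_scoring")
# Anything not yet classified defaults to disallowed -- one possible design choice.
assert not use_allowed("location", "ad_targeting")
```

Even this tiny sketch surfaces the gaps noted above: nothing in it governs how long data is stored or how it was collected in the first place.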

            Notice-and-Choice

N

    2. Better terms of use

The various agreements we all encounter in connection with our use of software and online services are typically a mess of incomprehensible, boilerplate garbage, riddled with the worst kind of legalese. Terms-of-use agreements can and should be accessible, understandable, and potentially interactive in a way that serves user interests. They could be made interactive by, for example, including options for users to establish privacy/data control settings with which they’re comfortable as part of the process of agreeing to the terms.
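
One hedged sketch of what “interactive” could mean in practice: the acceptance record captures a per-setting choice for each user rather than a single blanket “I agree.” All field names, defaults, and the retention window below are hypothetical, not any real service’s schema.

```python
# Sketch of a granular terms-of-use acceptance record.
from dataclasses import dataclass, field

@dataclass
class ConsentChoices:
    """Per-user data-control settings gathered while agreeing to the terms."""
    allow_first_party_analytics: bool = False
    allow_third_party_sharing: bool = False
    allow_behavioral_ads: bool = False
    retention_days: int = 90  # user-chosen retention window

@dataclass
class TermsAcceptance:
    user_id: str
    terms_version: str
    choices: ConsentChoices = field(default_factory=ConsentChoices)

# The service records exactly what the user agreed to, setting by setting.
acceptance = TermsAcceptance(
    user_id="u123",
    terms_version="2015-04",
    choices=ConsentChoices(allow_first_party_analytics=True),
)
print(acceptance)
```

Defaulting every permission to off is itself a policy choice; the point is that the agreement becomes a negotiation over discrete, legible settings rather than a wall of legalese.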

    3. Municipal Utility
  4. The Tech Solution
    1. OpenID Connect
      1. No conflict of interest between the Identity Provider and the User
    2. Uses the trust framework described by Pentland et al.
    3. The municipality provides an identity that relying parties can use (a minimal sketch of the flow’s first leg follows below).
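
For concreteness, here is a minimal sketch of the first leg of an OpenID Connect authorization-code flow. The query parameters are the standard ones from the OIDC Core specification; the municipal identity-provider endpoint and client values are hypothetical.

```python
# Sketch of a relying party building an OpenID Connect authorization request.
from urllib.parse import urlencode
import secrets

AUTHORIZATION_ENDPOINT = "https://id.example-city.gov/authorize"  # hypothetical municipal IdP

def build_auth_request(client_id: str, redirect_uri: str) -> str:
    """Build the URL a relying party redirects the user to for login."""
    params = {
        "response_type": "code",             # authorization-code flow
        "scope": "openid profile",           # the "openid" scope is mandatory in OIDC
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": secrets.token_urlsafe(16),  # CSRF protection, echoed back by the IdP
        "nonce": secrets.token_urlsafe(16),  # binds the eventual ID token to this request
    }
    return f"{AUTHORIZATION_ENDPOINT}?{urlencode(params)}"

print(build_auth_request("relying-party-app", "https://app.example.com/callback"))
```

Because the municipal utility acts only as the identity provider, it can authenticate the user without needing to monetize the resulting data, which is the absence of conflict of interest noted above.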

NOTES AND QUOTES (This is something I like to do with my papers: I cut and paste quotes with their citations so that I have everything in one location.)

FTC Mobile Apps for Kids Report.

The survey found that:

  • Parents are not being provided with information about what data an app collects, who will have access to that data, and how it will be used. Only 20 percent of the apps staff reviewed disclosed any information about the app’s privacy practices.
  • Many apps (nearly 60 percent of the apps surveyed) are transmitting information from a user's device back to the app developer or, more commonly, to an advertising network, analytics company, or other third party.
  • A relatively small number of third parties received information from a large number of apps. This means the third parties that receive information from multiple apps could potentially develop detailed profiles of the children based on their behavior in different apps.
  • Many apps contain interactive features – such as advertising, links to social media, or the ability to purchase goods within an app – without disclosing those features to parents prior to download.
    • Fifty-eight percent of the apps reviewed contained advertising within the app, while only 15 percent disclosed the presence of advertising prior to download.
    • Twenty-two percent of the apps contained links to social networking services, while only nine percent disclosed that fact.
    • Seventeen percent of the apps reviewed allow kids to make purchases for virtual goods within the app, with prices ranging from 99 cents to $29.99.
    • Although both stores provided certain indicators when an app contained in-app purchasing capabilities, these indicators were not always prominent and, even if noticed, could be difficult for many parents to understand.

FTC Privacy Framework Report

  • PRIVACY FRAMEWORK
    • COMPANIES SHOULD COMPLY WITH THE FRAMEWORK UNLESS THEY HANDLE ONLY LIMITED AMOUNTS OF NON-SENSITIVE DATA THAT IS NOT SHARED WITH THIRD PARTIES.
    • THE FRAMEWORK SETS FORTH BEST PRACTICES AND CAN WORK IN TANDEM WITH EXISTING PRIVACY AND SECURITY STATUTES.
    • THE FRAMEWORK APPLIES TO OFFLINE AS WELL AS ONLINE DATA.
    • THE FRAMEWORK APPLIES TO DATA THAT IS REASONABLY LINKABLE TO A SPECIFIC CONSUMER, COMPUTER, OR DEVICE
  • PRIVACY BY DESIGN
    • THE SUBSTANTIVE PRINCIPLES: DATA SECURITY, REASONABLE COLLECTION LIMITS, SOUND RETENTION PRACTICES, AND DATA ACCURACY.
      • Should Additional Substantive Principles Be Identified?
      • Data Security: Companies Must Provide Reasonable Security for Consumer Data.
      • Reasonable Collection Limitation: Companies Should Limit Their Collection of Data.
      • Sound Data Retention: Companies Should Implement Reasonable Data Retention and Disposal Policies.
      • Accuracy: Companies should maintain reasonable accuracy of consumers’ data
    • COMPANIES SHOULD ADOPT PROCEDURAL PROTECTIONS TO IMPLEMENT THE SUBSTANTIVE PRINCIPLES.
  • SIMPLIFIED CONSUMER CHOICE
    • PRACTICES THAT DO NOT REQUIRE CHOICE.
      • General Approach to “Commonly Accepted” Practices.
      • First-Party Marketing Generally Does Not Require Choice, But Certain Practices Raise Special Concerns.
        • Companies Must Provide Consumers With A Choice Whether To Be Tracked Across Other Parties’ Websites.
        • Affiliates Are Third Parties Unless The Affiliate Relationship Is Clear to Consumers.
        • Cross-Channel Marketing Is Generally Consistent with the Context of a Consumer’s Interaction with a Company
        • Companies Should Implement Measures to Improve The Transparency of Data Enhancement.
        • Companies Should Generally Give Consumers a Choice Before Collecting Sensitive Data for First-Party Marketing.
    • FOR PRACTICES INCONSISTENT WITH THE CONTEXT OF THEIR INTERACTION WITH CONSUMERS, COMPANIES SHOULD GIVE CONSUMERS CHOICES.
      • Companies Should Provide Choices At a Time and In a Context in Which the Consumer Is Making a Decision About His or Her Data.
      • Take-it-or-Leave-it Choice for Important Products or Services Raises Concerns When Consumers Have Few Alternatives.
        • “In determining whether take-it-or-leave-it choice is appropriate, these commenters focused on three main factors. First, they noted that there must be adequate competition, so that the consumer has alternative sources to obtain the product or service in question. Second, they stated that the transaction must not involve an essential product or service.242 Third, commenters stated that the company offering take-it-or leave-it choice must clearly and conspicuously disclose the terms of the transaction so that the consumer is able to understand the value exchange.”
        • “Another example is the provision of broadband Internet access. As consumers shift more aspects of their daily lives to the Internet – shopping, interacting through social media, accessing news, entertainment, and information, and obtaining government services – broadband has become a critical service for many American consumers. When consumers have few options for broadband service, the take-it-or-leave-it approach becomes one-sided in favor of the service provider. In these situations, the service provider should not condition the provision of broadband on the customer’s agreeing to, for example, allow the service provider to track all of the customer’s online activity for marketing purposes. Consumers’ privacy interests ought not to be put at risk in such one-sided transactions.”
      • Businesses Should Provide a Do Not Track Mechanism To Give Consumers Control Over the Collection of Their Web Surfing Data.
        • “Like the preliminary staff report, this report advocates the continued implementation of a universal, one stop choice mechanism for online behavioral tracking, often referred to as Do Not Track. Such a mechanism should give consumers the ability to control the tracking of their online activities. Many commenters discussed the progress made by industry in developing such a choice mechanism in response to the recommendations of the preliminary staff report and the 2009 OBA Report, and expressed support for these self-regulatory initiatives.”
        • “First, a Do Not Track system should be implemented universally to cover all parties that would track consumers.
        • Second, the choice mechanism should be easy to find, easy to understand, and easy to use.
        • Third, any choices offered should be persistent and should not be overridden if, for example, consumers clear their cookies or update their browsers.
        • Fourth, a Do Not Track system should be comprehensive, effective, and enforceable. It should opt consumers out of behavioral tracking through any means and not permit technical loopholes.
        • Finally, an effective Do Not Track system should go beyond simply opting consumers out of receiving targeted advertisements;
          • it should opt them out of collection of behavioral data for all purposes other than those that would be consistent with the context of the interaction (e.g., preventing click-fraud or collecting de-identified data for analytics purposes).”
      • Large Platform Providers That Can Comprehensively Collect Data Across the Internet Present Special Concerns.
        • “ISPs are thus in a position to develop highly detailed and comprehensive profiles of their customers – and to do so in a manner that may be completely invisible.”
      • Practices Requiring Affirmative Express Consent.
        • Companies Should Obtain Affirmative Express Consent Before Making Material Retroactive Changes To Privacy Representations.
        • Companies Should Obtain Consumers’ Affirmative Express Consent Before Collecting Sensitive Data.
  • Transparency
    • PRIVACY NOTICES
    • ACCESS
      • Special Access Mechanism for Data Brokers
      • Access to Teen Data
    • CONSUMER EDUCATION

[1] It should be noted that DPI has many legitimate uses, such as network monitoring and security.
[2] “ISPs are thus in a position to develop highly detailed and comprehensive profiles of their customers – and to do so in a manner that may be completely invisible.”
[3] Do Not Track is the internet’s version of the Do Not Call list. The technology is relatively simple and could be universal.
[a] How do you mean?

[b] The Constitution never says "privacy," and the extent to which it's meant to protect privacy beyond security in one's "papers and effects" has been debated. Obviously the birth-control/abortion cases point to some kind of a right to privacy, but I don't think it's a very secure right. Particularly as the data to be protected here involves information that's sort of been released into the public sphere, I don't think anyone's safe assuming courts would protect it. (I have in mind some Supreme Court cases from my CrimPro class about telephone-related privacy--especially something about payphones. I don't remember names and I could be misremembering.)

As for the part about state-level protection, I mostly had in mind what I'd read in this article: http://www.nytimes.com/2015/02/28/business/white-house-proposes-broad-consumer-data-privacy-bill.html?_r=0

The part at the very end speaks to it a bit. As a side note, I posted the bill discussed here to Blackboard--don't know if you saw it.

[c] I think this might be a little too much for us in this context. This debate is legitimate, but I think for our purposes we want to assume the privacy right is established and that we need to rethink how we approach it. Let con-law professors argue over the existence of privacy.
