Chapter 14

Social media and Big Data

Alessandro Mantelero; Giuseppe Vaciago

Abstract

Social media create a distinctive technological landscape in which the predictive ability that characterizes Big Data has a significant impact on the implementation of social surveillance systems. Moreover, modern social surveillance is no longer carried out only by intelligence agencies: it is the result of the interplay between the private and public sectors, based on a collaborative model made possible by mandatory disclosure orders issued by courts or administrative bodies and extended to an ill-defined pool of collaborations by big companies.

The aim of the authors is to suggest possible legal and policy solutions to foster more democratic access to information and to protect individual and collective freedoms.

To this end, the new European proposal on data protection indirectly increases control over the owners of Big Data, in order to limit potential abuses and illegitimate advantages, as well as to prevent and restrict access to European citizens’ data.

Keywords

Big Data

data protection

surveillance

law enforcement

privacy

European Union Law

US Law

Introduction

Social media represent a growing and fundamental part of the online environment created by Web 2.0, in which users are authors of content: they do not passively receive information, but create, reshape and share it. In some cases, the interaction among users of social media has given rise to communities, virtual worlds (e.g., Second Life, World of Warcraft) or crowdsourcing projects (e.g., Wikipedia). Although these outputs differ significantly in nature, two aspects are always present and are relevant to this contribution: large amounts of information and user-generated content.

Social media platforms aggregate huge amounts of data generated by users who are, in many cases, identified or identifiable (Ohm, 2010; United States General Accounting Office, 2011; See also Zang and Bolot, 2011; Golle, 2006; Sweeney, 2000b; Sweeney, 2000a; Tene and Polonetsky, 2013).1 This contributes to a distinctive technological landscape in which the predictive ability that characterizes Big Data (The Aspen Institute, 2010; Boyd and Crawford, 2011; see also Marton et al., 2013)2 has a significant impact not only in terms of competitive advantage in the business world (identifying emerging trends in advance, business intelligence, etc.) but also in terms of the implementation of social surveillance systems by states and groups of power.

From this perspective, the following pages consider the concentration of digital information and the related asymmetries, which place large amounts of data in the hands of a few entities and facilitate attempts at social surveillance by governments and private companies. The aim of this chapter is to suggest possible legal and policy solutions both to foster more democratic access to information and to protect individual and collective freedoms.

Big Data: The Asymmetric Distribution of Control Over Information and Possible Remedies

Big Data is not something new; rather, it is the current stage of a long evolution in the capability to analyze data using computing resources. Big Data represents the convergence of different existing technologies that permit enormous data centers to be built, create high-speed electronic highways and provide ubiquitous, on-demand network access to computing resources (cloud computing). These technologies offer substantially unlimited storage, allow huge amounts of data to be transferred from one place to another, and allow the same data to be spread across different places and re-aggregated in a matter of seconds.

All these resources permit large amounts of information from different sources to be collected, and the petabytes of data generated by social media represent the ideal context in which Big Data analytics can be used. The whole dataset can be continuously monitored by analytics in order to identify emerging trends in the data flows, obtaining real-time or near real-time results in a way that is revolutionary and differs from the traditional sampling method (The Aspen Institute, 2010).
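
To make the contrast with sampling concrete, the following minimal sketch, written in Python, counts term frequencies over a sliding time window across an entire message stream rather than over a sample. It is only an illustration: the message format, window length and "emerging" threshold are assumptions made for the example and do not describe any specific platform's analytics.

from collections import Counter, deque
from datetime import datetime, timedelta

WINDOW = timedelta(hours=1)   # assumed observation window
MIN_COUNT = 2                 # assumed threshold, kept tiny for the toy example

class TrendMonitor:
    """Counts terms over the full stream within a sliding time window."""

    def __init__(self):
        self.events = deque()     # (timestamp, term) pairs inside the window
        self.counts = Counter()

    def add(self, timestamp, text):
        for term in text.lower().split():
            self.events.append((timestamp, term))
            self.counts[term] += 1
        # Drop events that have fallen out of the observation window.
        while self.events and timestamp - self.events[0][0] > WINDOW:
            _, old_term = self.events.popleft()
            self.counts[old_term] -= 1

    def emerging(self, top=10):
        """Most frequent terms in the current window above the threshold."""
        return [(t, c) for t, c in self.counts.most_common(top) if c >= MIN_COUNT]

# Hypothetical usage with a fabricated stream of posts:
monitor = TrendMonitor()
monitor.add(datetime(2014, 1, 31, 12, 0), "protest in the main square")
monitor.add(datetime(2014, 1, 31, 12, 5), "huge protest downtown")
print(monitor.emerging(top=5))   # [('protest', 2)]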

The availability of these new technologies and large datasets gives those who own them a competitive advantage in terms of the capability to predict new economic, social and political trends.

In the social media context, these asymmetries are evident with regard to commercial platforms (e.g., Twitter, Google+, etc.), in which the service providers play a substantial role in terms of control over information. Conversely, when social media are based on open, decentralized and participative architectures, these asymmetries are countered; for this reason, the following paragraphs consider the role played by open architectures and open data in achieving wider access to information and a lower concentration of control over it.

In order to control and limit the information asymmetries related to Big Data and their consequences in terms of economic advantage and social control, the adoption of various remedies seems necessary, as the complexity of the phenomenon requires different approaches.

First of all, it is important to achieve a better allocation of control over information. For this purpose, it is necessary to adopt adequate measures to oversee those who hold this power, in order to limit possible abuses and illegitimate advantages. At the same time, we need to increase access to information and the number of subjects able to create and manage large amounts of data, spreading the informational power currently concentrated in the hands of a few bodies.

The need to control these great aggregations of data is also related to their political and strategic relevance, and should lead to the introduction of mandatory notification of the creation of large and important databases, as happened at the beginning of the computer age, when the high cost of the first mainframes produced a similar concentration of power in the hands of a few subjects (Article 29 Data Protection Working Party, 2005; Article 29 Data Protection Working Party, 1997; Bygrave, 2002),3 as well as to the creation of specific independent international authorities. These authorities would be able to check both the invasive attitude of governmental power with regard to large databases and the power of the owners of Big Data, and could also play an important role in the definition of specific standards for data security.

This will be a long and tortuous journey, as it relies on international cooperation; nevertheless, it is important to start it as soon as possible, using existing international bodies and multilateral dialogues between countries. At the same time, any solution should be graduated appropriately: not every data farm built anywhere in the world should be involved, but only those of remarkable size or of considerable importance because of the data collected (e.g., police or military databases).

Access to data and data sharing are two other central aspects that should be considered in order to limit the power of the owners of Big Data and give society the opportunity to access knowledge. From this perspective, a key role is played by open data (Veenswijk et al., 2012; Executive Office of the President, National Science and Technology Council, 2013)4 and by the above-mentioned policies on transparency in the information society (i.e., notification), which make it possible to know who holds great informational power and to ask these entities to open their archives.

Opening public databases, and potentially private archives (Deloitte, 2012; Enel, 2013; Nike Inc., 2013; ASOS API project, 2013; Canadian Goldcorp Inc., 2013), to citizens and giving them raw data not only reduces the power of the owners of information, in terms of exclusive access to the data, but also limits their advantage in terms of technical and cultural analysis skills (Open Knowledge Foundation, 2004; Cyganiak and Jentzsch, 2011; Kroes, 2011).5

Finally, it is necessary to address the critical issues concerning the geopolitical distribution of informational power, which represents an emerging problem for Europe. Even though big European companies are able to collect and analyze large amounts of data, the main commercial social media are based in the U.S., which puts that nation in a better position to control the world’s information flows generated by the users of these kinds of services.

From a geopolitical perspective, this situation represents a weakness for the E.U. in terms of the loss of control over its citizens’ data, due to the need to entrust the management of strategic information to foreign entities. In order to reduce this risk, European industry is being urged to assume a more important role in the ICT sector (Kroes, 2011) and, at the same time, the E.U. is strengthening the protection of personal data.

Big Data and Social Surveillance: Public and Private Interplay in Social Control

The risks related to the concentration of control over information, in the social media context and in general, are not restricted to democratic access to and distribution of information and knowledge; they also concern the potential systems of social surveillance that can be built using this information.

From this perspective, the recent NSA case (European Parliament, 2013c; Auerbach et al., 2013; European Parliament, 2013a; European Parliament, 2013b)6 is the most evident representation of the potential consequences of monitoring online interaction, although it is just the latest in a series of programs adopted by governmental agencies in various nations to pursue mass social surveillance (European Parliament, 2001; European Parliament 2013a; European Parliament 2013b; DARPA. Total Information Awareness Program (TIA), 2002; National Research Council, 2008; Congressional Research Service. CRS Report for Congress, 2008).7

In Western democratic nations, modern social surveillance is no longer carried out only by intelligence agencies autonomously collecting huge amounts of information through pervasive monitoring systems. Social surveillance is the result of the interaction between the private and public sectors, based on a collaborative model made possible by mandatory disclosure orders issued by courts or administrative bodies and extended to an ill-defined pool of voluntary or proactive collaborations by big companies (Council of Europe, 2008).

In this way, governments obtain information with the indirect “co-operation” of users who would probably not have given the same information to public entities if asked directly. Service providers, for example, collect personal data on the basis of private agreements (privacy policies), with the consent of the user and for their own specific purposes (Reidenberg, 2013), but governments exploit this practice by using mandatory orders to obtain the disclosure of this information. This dual mechanism hides from citizens the risk and the scale of the social control that can be achieved by monitoring social networks or other services and using Big Data analytics.

Another relevant aspect of the control deriving from Big Data is its scale. Analyses focused on profiling make it possible to predict the attitudes and decisions of any single user and even to match similar profiles. Big Data, in contrast, is not used to focus on individuals, but to analyze large groups and populations (e.g., the political sentiment of an entire country).

Although, in many cases, intelligence activities have little to do with general data protection regulations, since they are authorized by specific legislative provisions introducing exceptions to general principles (Cate et al., 2012; Swire, 2012; Bailey, 2012; Wang, 2012; Brown, 2012; Tsuchiya, 2012; Pell, 2012; Cate and Cate, 2012; Svantesson, 2012; Tene, 2012; Schwartz, 2012; Abraham and Hickok, 2012; See also Brown, 2013; European Parliament, 2013b), regulations on data protection and privacy can play a relevant role in reducing the amount of data collected by private entities and, consequently, can have an indirect impact on the information available for public social surveillance purposes.

The interaction between the public and private sectors in social control can be divided into two categories, both of which are significant with regard to data protection. The first concerns the collection of private company data by government and judicial authorities (see Section “Array of Approved eSurveillance Legislation”), whilst the second is the use by government and judicial authorities of instruments and technologies provided by private companies for organizational and investigative purposes (see Section “Use of Private Sector Tools and Resources”).

Array of Approved eSurveillance Legislation

With regard to the first category, and especially when the request is made by governmental agencies, the issue of the possible violation of fundamental rights becomes more delicate. The Echelon Interception System (European Parliament, 2001) and the Total Information Awareness (TIA) Program (European Parliament, 2001; European Parliament 2013a; European Parliament 2013b; DARPA. Total Information Awareness Program (TIA), 2002; National Research Council, 2008; Congressional Research Service. CRS Report for Congress, 2008) are concrete and by no means isolated examples, but the NSA case (European Parliament, 2013c; Auerbach et al., 2013; European Parliament, 2013a; European Parliament, 2013b)8 has undoubtedly shown how invasive surveillance can be in the era of global data flows and Big Data. To better understand the case, it is important to have an overview of the considerable amount of electronic surveillance legislation which, particularly in the wake of 9/11, has been approved in the United States and, to a certain extent, in a number of European countries.

The most important legislation is the Foreign Intelligence Surveillance Act (FISA) of 1978,9 which lays down the procedures for collecting foreign intelligence information through the electronic surveillance of communications for homeland security purposes. Section 702 of the FISA Act, as amended in 2008 (FAA), extended its scope beyond the interception of communications to include any data held in public cloud computing as well. Furthermore, this section clearly indicates that two different regimes of data processing and protection exist for U.S. citizens and residents (USPERs) on the one hand, and non-U.S. citizens and residents (non-USPERs) on the other. More specifically, the Fourth Amendment is applicable only to U.S. citizens, as there is an absence of any cognizable privacy rights for “non-U.S. persons” under FISA (Bowden, 2013).

Thanks to the FISA Act and its 2008 amendment, U.S. authorities can access and process the personal data of E.U. citizens on a large scale via, among other programs, the National Security Agency’s (NSA) warrantless wiretapping of cable-bound internet traffic (UPSTREAM) and direct access to the personal data stored on the servers of U.S.-based private companies such as Microsoft, Yahoo, Google, Apple, Facebook or Skype (PRISM), through cross-database search programs such as X-KEYSCORE. U.S. authorities also have the power to compel the disclosure of cryptographic keys, including the SSL keys used to secure data in transit by major search engines, social networks, webmail portals and cloud services in general (BULLRUN program) (Corradino, 1989; Bowden, 2013). Recently, the United States President’s Review Group on Intelligence and Communications Technologies released a report entitled “Liberty and Security in a Changing World.” This comprehensive report sets forth 46 recommendations designed to protect national security while respecting the longstanding commitment to privacy and civil liberties, with specific reference to non-U.S. citizens (Clarke et al., 2014).

Even if the FISA Act is the most widely applied and best-known legislative tool for conducting intelligence activities, there are other relevant pieces of legislation on electronic surveillance. One need only consider the Communications Assistance for Law Enforcement Act (CALEA) of 1994,10 which authorizes law enforcement and intelligence agencies to conduct electronic surveillance by requiring telecommunications carriers and manufacturers of telecommunications equipment to design and modify their equipment, facilities and services so as to ensure that they have built-in surveillance capabilities. Furthermore, following the Patriot Act of 2001, a plethora of bills has been proposed. The most recent bills (not yet in force) are the Cyber Intelligence Sharing and Protection Act (CISPA) of 2013 (Jaycox and Opsahl, 2013), which would allow Internet traffic information to be shared between the U.S. government and certain technology and manufacturing companies, and the Protecting Children From Internet Pornographers Act of 2011,11 which would extend data retention duties to U.S. Internet Service Providers.

However, surveillance programs are not limited to the United States. In Europe, the Communications Capabilities Development Program has prompted a huge amount of controversy, given its intention to create a ubiquitous mass surveillance scheme for the United Kingdom (Barret, 2014) covering phone calls, text messages and emails, and extending to logging communications on social media. More recently, in June 2013, the so-called TEMPORA program showed that the UK intelligence agency Government Communications Headquarters (GCHQ) has cooperated with the NSA in surveillance and spying activities (Brown, 2013).12 These revelations were followed in September 2013 by reports focusing on the activities of Sweden’s National Defense Radio Establishment (FRA). Similar projects for the large-scale interception of telecommunications data have been conducted by both France’s General Directorate for External Security (DGSE) and Germany’s Federal Intelligence Service (BND) (Bigo et al., 2013).

Even if E.U. and U.S. surveillance programs seem similar, there is one important difference: in the E.U., under data protection law, individuals always retain control over their own personal data, while in the U.S. individuals have more limited control once they have subscribed to the terms and conditions of a service.13

Forced “On Call” Collaboration by Private Entities

Besides government agencies’ monitoring activities, there are cases in which Internet Service Providers collaborate spontaneously or upon a simple request from law enforcement agencies. The exponential increase in Big Data since 2001 has provided a truly unique opportunity. In this respect, a key role has been played by social media. One need only reflect on the fact that Facebook, Twitter, Google+ and Instagram, all of which are based in Silicon Valley, boast around 2 billion users throughout the world,14 many of whom are citizens of the European Union. Facebook’s founder may have intended “to empower the individual,” but there is no doubt that Social Network Services (SNSs) have also empowered law enforcement (Kirkpatrick, 2013).

Data Collection for Crime Prediction and Prevention

Remaining on the topic of information acquisition by law enforcement, there are two interesting cases of the collection of Big Data for crime prevention purposes:

The first is the “PredPol” software, initially used by the Los Angeles police force and now by other police forces in the United States (Palm Beach, Memphis, Chicago, Minneapolis and Dallas). Predictive policing, in essence, cross-checks the data, places and techniques of recent crimes with disparate sources, analyzes them and then uses the results to anticipate, prevent and respond more effectively to future crime. Even if the software house that created PredPol declares that no profiling activities are carried out, it becomes essential to carefully understand the technology used to anonymize the personal data acquired from the law enforcement database. This type of software is bound to have a major impact in the United States on the conception of the protection of rights under the Fourth Amendment, and more specifically on concepts such as “probable cause” and “reasonable suspicion,” which in the future may come to depend on an algorithm rather than on human judgment (Ferguson, 2012).

The second example is the X1 Social Discovery software.15 This software maps a given location, such as a certain city block or even an entire metropolitan area, and searches the entire public Twitter feed to identify any geo-located tweets posted in the past three days (sometimes longer) within that specific area. This application can provide particularly useful data for the purpose of social control: one can imagine obtaining elements (e.g., an IP address) useful for identifying the subjects present in a given area during a serious car accident or a terrorist attack.
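
A minimal sketch of the kind of query such a tool performs is given below: filtering a collection of geo-tagged posts by a bounding box and a three-day time window. The record layout and field names are assumptions made for illustration; the actual X1 Social Discovery product and the Twitter APIs it relies on are not reproduced here.

from datetime import datetime, timedelta

def geo_filter(posts, lat_min, lat_max, lon_min, lon_max, now, days=3):
    """Return posts located inside the bounding box and newer than `days` days.

    posts: iterable of dicts with 'user', 'lat', 'lon' and 'time' keys (assumed schema).
    """
    cutoff = now - timedelta(days=days)
    return [
        p for p in posts
        if lat_min <= p["lat"] <= lat_max
        and lon_min <= p["lon"] <= lon_max
        and p["time"] >= cutoff
    ]

# Hypothetical usage: who posted from a given city block in the last three days?
posts = [
    {"user": "alice", "lat": 40.7505, "lon": -73.9934, "time": datetime(2014, 1, 30, 18, 0)},
    {"user": "bob", "lat": 40.7612, "lon": -73.9776, "time": datetime(2014, 1, 20, 9, 30)},
]
hits = geo_filter(posts, 40.748, 40.752, -73.996, -73.990, now=datetime(2014, 1, 31))
print([p["user"] for p in hits])   # ['alice']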

Legitimacy

From a strictly legal standpoint, these social control tools may be employed to gather information directly from citizens on the basis of the following principle concerning acts performed in public:

“Where someone does an act in public, the observance and recording of that act will ordinarily not give rise to an expectation of privacy”.

(Gillespie, 2009)

In the European Union, whilst this type of data collection frequently takes place, it may conflict with ECHR case law, which, in the Rotaru vs. Romania case,16 ruled that “public information can fall within the scope of private life where it is systematically collected and stored in files held by the authorities.” As O’Floinn observes: “Non-private information can become private information depending on its retention and use. The accumulation of information is likely to result in the obtaining of private information about that person” (O’Floinn and Ormerod, 2001).

In the United States, this subject has been addressed in People vs. Harris,17 currently pending before the Supreme Court. On January 26, 2012, the New York County District Attorney’s Office sent a subpoena to Twitter, Inc. seeking to obtain the Twitter records of a user suspected of having participated in the “Occupy Wall Street” movement. Twitter refused to provide the law enforcement officers with the information requested and sought to quash the subpoena. The Criminal Court of New York upheld the application made by the New York County District Attorney’s Office, rejecting the arguments put forward by Twitter and stating that tweets are, by definition, public, and that a warrant is not required in order to compel Twitter to disclose them. The District Attorney’s Office argued that the “third party disclosure” doctrine, put forward for the first time in United States vs. Miller, was applicable.18

Use of Private Sector Tools and Resources

The second relationship concerns the use by the state of private-sector tools and resources for organizational and investigative purposes. Given the vast oceans of Big Data, U.S. governmental authorities have decided to turn to the private sector, not only for software management but also for the management of the data itself. One example is the Hadoop platform described by CTOlabs (CTO labs, 2012), which is capable of storing data for many law enforcement authorities in the United States. Similarly, a private cloud system has emerged which conveys the latest intelligence information in near real time to U.S. troops stationed in Afghanistan (Conway, 2014). Another example is the facial recognition technology developed by Walt Disney for its parks and sold to the U.S. military (Wolfe, 2012).

Considering the cost savings and massive computing power of a centralized cloud system, it is inevitable that law enforcement, military forces and government agencies will progressively rely on this type of service. This change will entail foreseeable legal issues in terms of jurisdiction, security and privacy in data management. These issues might be addressed through a private cloud located within the State, with exclusive customer control of the keys. However, it is worth considering that, in this way, private entities will gain access to a highly important and ever-expanding information asset. They will therefore be able to develop increasingly sophisticated data mining tools, thanks to the potential of cloud systems. This scenario, which is already a fact in the United States, might also become reality in Europe under the impulse of the Digital Agenda for Europe and its promotion of Public Private Partnership initiatives on the cloud (Commission of the European Communities, 2009; see also The European Cloud Partnership (ECP), 2013). This is why it is important that European cloud services be based on high standards of data protection, security, interoperability and transparency about service levels and government access to information, as recently recognized by the European Commission (European Commission, 2013).

The Role of the E.U. Reform on Data Protection in Limiting the Risks of Social Surveillance

The framework described above shows that modern social control is the result of the interaction between the private and public sectors. This collaborative model is not only based on mandatory disclosure orders issued by courts or administrative bodies, but has also extended to a more indefinite grey area of voluntary and proactive collaboration by big companies. It is difficult to obtain detailed information on this second model of voluntary collaboration; however, the predominance of U.S. companies in the ICT sector, particularly with regard to the Internet and cloud services, increases the influence of the U.S. administration on these national companies and makes specific secret cooperation agreements on social control easier (European Parliament, 2013a; European Parliament, 2012).

Against this background, the political and strategic value of the European rules on data protection emerges. These rules may act as a protective barrier, preventing and limiting access to information about European citizens.19 In this sense, the E.U. Proposal for a General Data Protection Regulation (European Commission, 2012) extends its territorial scope (Article 3 (2), 2013) to “the processing of personal data of data subjects in the Union by a controller or processor not established in the Union, where the processing activities are related to:

(a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or

(b) the monitoring of such data subjects”.20

It should be noted that various commentators consider the privacy risks related to Big Data analytics to be low, pointing to the large amount of data processed by analytics and the de-identified nature of most of this data. This conclusion is wrong. Anonymity through de-identification is a difficult goal to achieve, as a number of studies have demonstrated (see Ohm, 2010; United States General Accounting Office, 2011; See also Zang and Bolot, 2011; Golle, 2006; Sweeney, 2000b; Sweeney, 2000a; Tene and Polonetsky, 2013). The power of Big Data analytics to draw unpredictable inferences from information undermines many strategies based on de-identification (Mayer-Schönberger and Cukier, 2013; Schwartz and Solove, 2011). In many cases a reverse process to identify individuals is possible, even starting from originally anonymous data (see Ohm, 2010; United States General Accounting Office, 2011; See also Zang and Bolot, 2011; Golle, 2006; Sweeney, 2000b; Sweeney, 2000a; Tene and Polonetsky, 2013). Here, it is closer to the truth to say that every piece of data is a piece of personal information than to assert that data can be managed in a de-identified way.
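
The fragility of de-identification can be illustrated with a minimal sketch in the spirit of the studies cited above (e.g., Sweeney’s linking of ZIP code, birth date and gender): an “anonymized” dataset is joined with a public register on shared quasi-identifiers. The records and field names below are fabricated for illustration only.

# De-identified records: direct identifiers removed, quasi-identifiers kept.
deidentified = [
    {"zip": "10014", "birth": "1975-03-02", "sex": "F", "condition": "diabetes"},
    {"zip": "10013", "birth": "1980-11-19", "sex": "M", "condition": "asthma"},
]

# Public register (e.g., a voter roll) containing names and the same attributes.
public_register = [
    {"name": "Jane Roe", "zip": "10014", "birth": "1975-03-02", "sex": "F"},
    {"name": "John Doe", "zip": "10013", "birth": "1980-11-19", "sex": "M"},
]

def reidentify(deid_rows, register, keys=("zip", "birth", "sex")):
    """Join the two datasets on quasi-identifiers; unique matches re-identify people."""
    index = {}
    for person in register:
        index.setdefault(tuple(person[k] for k in keys), []).append(person["name"])
    matches = []
    for row in deid_rows:
        candidates = index.get(tuple(row[k] for k in keys), [])
        if len(candidates) == 1:  # the combination is unique, so the record is linked
            matches.append((candidates[0], row["condition"]))
    return matches

print(reidentify(deidentified, public_register))
# [('Jane Roe', 'diabetes'), ('John Doe', 'asthma')]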

Although the Proposal for a new regulation does not cover data processed by public authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties,21 its impact on social control is significant, since in many cases the databases of private companies are targeted by public authority investigations. For this reason, reducing the amount of data collected by private entities and increasing data subjects' self-determination with regard to their personal information limit the possibility of subsequent social control initiatives by government agencies.

However, the complexity of data processes and the power of modern analytics, along with technological and market lock-in effects, drastically reduce data subjects' awareness, their capability to evaluate the consequences of their choices and the expression of a truly free and informed consent (Brandimarte et al., 2010). This lack of awareness facilitates the creation of wider databases, which are accessible to the authorities in the cases provided for by law, and it is not remedied by giving adequate information to data subjects or by privacy policies, because these notices are read by only a very limited number of users who, in many cases, are unable to understand part of the legal terminology they usually contain (Turow et al., 2007).

These aspects are even more relevant in a Big Data context, which puts the traditional model of data protection in crisis (Cate, 2006; See also Cate and Mayer-Schönberger, 2012; Rubinstein, 2013). The traditional model is based on a general prohibition plus “notice and consent”22 and on the coherence of data collection with the purposes defined at the moment the information is collected. However, nowadays much of the value of personal information is not apparent when notice and consent are normally given (Cate, 2006; See also Cate and Mayer-Schönberger, 2012; Rubinstein, 2013), and the “transformative” (Tene and Polonetsky, 2012) use of Big Data often makes it impossible to describe all its possible uses at the time of initial collection.

The E.U. Proposal, in order to reinforce the protection of individual information, engages with these constraints and shifts the focus of data protection from individual choice toward a privacy-oriented architecture (Mantelero, 2013a).23 This approach, which limits the amount of data collected through “structural” barriers and introduces a preventive data protection assessment (Article 23, 2013), also produces a direct effect on social control by reducing the amount of information available.

With regard to the information collected, the E.U. Proposal reinforces users' self-determination by requiring data portability, which gives the user the right to obtain from the controller a copy of the data undergoing processing “in an electronic and structured format which is commonly used and allows for further use by the data subject”.24 Portability will reduce the risk of technological lock-in caused by the technological standards and data formats adopted by service providers, which limit migration from one service to another. However, in many cases, and mainly in the social media context, the limited number of companies providing the services reduces users' chances of escaping tracking by moving their account from one platform to another and thereby minimizes the positive effects of data portability.
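
As a rough illustration of what such portability implies on the provider side, the sketch below serializes a user's records into a widely used structured format (JSON) that another service could re-import. The record layout and helper name are invented for the example and do not reflect any specific provider's export API.

import json
from datetime import date

def export_user_data(profile, posts, contacts):
    """Bundle a user's data into a structured, commonly used format (JSON)."""
    bundle = {
        "profile": profile,
        "posts": posts,
        "contacts": contacts,
        "exported_on": date.today().isoformat(),
        "format_version": "1.0",
    }
    return json.dumps(bundle, indent=2, ensure_ascii=False)

# Hypothetical usage with fabricated data; another controller could parse the
# result with json.loads() and import it into its own service.
archive = export_user_data(
    profile={"name": "Jane Roe", "joined": "2010-05-01"},
    posts=[{"date": "2013-12-24", "text": "Season's greetings!"}],
    contacts=["john.doe@example.org"],
)
print(archive)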

Finally, the Proposal reinforces the right to obtain the erasure of data processed without the data subject's consent, against his or her objection, without adequate information having been provided, or outside the legal framework (Mantelero, 2013b). An effective implementation of this right can reduce the overall amount of data stored by service providers and may limit the amount of information held in archives without a legitimate reason for processing. In this manner, the authorities' ability to consult the history of individual profiles is also reduced.

All the aspects considered above combine to limit the information available to the entities interested in social control and therefore also affect the disclosure requests addressed by government agencies and courts to private companies. Nevertheless, these powers of search and seizure, and their exercise, represent the fundamental core of social control.

Preserving the E.U. Data Protection Standard in a Globalized World

In order to analyze this aspect within the future European data protection framework, it is necessary to consider both proposals by the European Commission:

- the Proposal for a new General Data Protection Regulation (PGDPR) (see European Commission, 2012) and

- the less debated Proposal for a Directive in the law enforcement sector (PDPI).25

Although the second proposal deals more specifically with governmental and judicial control, the first considers this aspect from the point of view of data flows.

The Proposal for a new General Data Protection Regulation, like the currently in force Directive 95/46/EC, allows trans-border data flows from Europe to other countries only when the third country provides an adequate level of data protection (Mantelero, 2012). When evaluating the adequacy of data protection in a given country, the Commission should also consider the legislation in force in third countries “including concerning public security, defense, national security and criminal law”.26 Consequently, the presence of invasive investigative public bodies and the lack of adequate guarantees for the data subject become relevant to the decision whether to limit trans-border data flows between subsidiaries and holdings or between companies. Once again, this limit does not affect public authorities, but restricts the set of information held by private companies that is available for their scrutiny.

Leaving aside the NSA case, which is still ongoing, an illustrative case of the relationship between trans-border data flows, foreign jurisdiction and the possible effects on citizens and social control is provided by the SWIFT case; the same criticism applies, and has been expressed by commentators, with regard to the U.S. Patriot Act. These two cases differ because in the NSA case non-E.U. authorities requested access to information held by a company based in the E.U., whereas in the SWIFT case the requests were directed to U.S. companies in order to obtain access to the information they received from their E.U. subsidiaries.

In the SWIFT case (Article 29 Data Protection Working Party, 2006b), the Article 29 Data Protection Working Party clarified that a foreign law does not represent a legal basis for the disclosure of personal information to non-E.U. authorities, since only international instruments provide an appropriate legal framework enabling international cooperation (Article 29 Data Protection Working Party, 2006b; see also Article 29 Data Protection Working Party, 2006a). Furthermore, the exception provided by Art. 26 (1) (d) of Directive 95/46/EC27 does not apply when the transfer is not necessary or legally required on important public interest grounds of an E.U. Member State (Article 29 Data Protection Working Party, 2006b).28

In contrast (as emerged in the PATRIOT Act case, and also with reference to the wider, complex and dynamic system of powers enjoyed by the U.S. government in the realm of criminal investigations and national security (van Hoboken et al., 2012)29), U.S. authorities may access data held by the E.U. subsidiaries of U.S. companies.30 However, it is necessary to point out that the potential breach of the protection of European citizens' personal data arises not only with regard to U.S. laws, but also in relation to other foreign regulations, as demonstrated by the recent draft of the Indian Privacy (Protection) Bill31 and by Chinese laws on data protection (Greenleaf and Tian, 2013; The Decision of the Standing Committee of the National People’s Congress on Strengthening Internet Information Protection 2012; Ministry of Industry and Information Technology Department Order, 2011; Greenleaf, 2013).

In order to reduce such intrusions, the draft version of the E.U. Proposal for a General Data Protection Regulation limited disclosure to foreign authorities and provided that

“no judgment of a court or tribunal and no decision of an administrative authority of a third country requiring a controller or processor to disclose personal data shall be recognized or be enforceable in any manner, without prejudice to a mutual assistance treaty or an international agreement in force between the requesting third country and the Union or a Member State”.32

The draft also obliged controllers and processors to notify the national supervisory authority of any such request and to obtain the supervisory authority's prior authorization for the transfer (See also European Parliament, 2013c; European Parliament, 2013a; European Parliament, 2013b).33 These provisions were dropped from the final version of the Commission's Proposal of 25 January 2012, but have now been reintroduced by the European Parliament as a reaction to the NSA case.34

In addition to the Proposal for a General Data Protection Regulation, the above-mentioned Proposal for a Directive on the protection of individuals with regard to the processing of personal data by competent authorities (PDPI) provides some protection against possible violations of E.U. citizens' privacy.

The goal of this Directive is to ensure that, “in a global society characterized by rapid technological change where information exchange knows no borders”, the fundamental right to data protection is consistently protected.35 One of the main issues at E.U. level is the lack of harmonization of data protection law across Member States, and even more so “in the context of all E.U. policies, including law enforcement and crime prevention as well as in our international relations” (European Commission, 2010). Whilst a directive may not have the same harmonizing impact on the national regulations currently in force in the various Member States (for a critical view on this point see Cannataci, 2013; see also Cannataci and Caruana, 2014), it does represent the first piece of legislation with direct effect, compared with the previous attempts made through Council of Europe Recommendation No. R (87)36 and Framework Decision 2008/977/JHA.37

The founding principles of this Directive, which are shared with the previous instruments referred to above, are twofold:

(1) First, there is the need for fair, lawful and adequate data processing during criminal investigations or for crime prevention, on the basis of which all data must be collected for specified, explicit and legitimate purposes and must be erased or rectified without delay (Art. 4, PDPI and Art. 4b, 2013).

(2) Second, there is the obligation to make a clear distinction between the various categories of possible data subjects in criminal proceedings (persons with regard to whom there are serious grounds for believing that they have committed or are about to commit a criminal offence, convicted persons, victims of a criminal offence, and third parties to the criminal offence).

For each of these categories there must be a different and adequate level of data protection, especially for persons who do not fall within any of the categories referred to above.38

These two principles are of considerable importance, although their application on a practical level will be neither easy nor immediate in certain Member States. This is easily demonstrated by the difficulties encountered when either drafting practical rules distinguishing between several categories of potential data subjects within the papers on a court file, or attempting to identify the principle on the basis of which a certain court document is to be erased.

In addition to these two general principles, other provisions of the Directive are interesting and confirm consolidated data protection principles. Suffice it to mention here the prohibition on measures based solely on the automated processing of personal data which significantly affect or produce an adverse legal effect for the data subject,39 as well as the implementation of data protection by design and by default mechanisms to ensure the protection of the data subject’s rights and the processing of only those personal data which are necessary for the purposes pursued.40

Furthermore, the proposal for a Directive in the law enforcement sector entails the obligation to designate a data protection officer in all law enforcement agencies in order to monitor the implementation and application of the policies on the protection of personal data.41

These principles constitute a significant limitation on the possible mining of the personal and sensitive data collected by law enforcement agencies. While it is true that most of these provisions were already present in Council of Europe Recommendation No. R (87) 15 and in Framework Decision 2008/977/JHA, it is also true that promoting data protection by design and by default mechanisms and measures could encourage data anonymization and help to avoid the indiscriminate use of automated processing of personal data.

References

Abraham S, Hickok E. Government access to private-sector data in India. Int. Data Privacy Law. 2012;2(4):302–315.

Article 3 (2), Proposal for a regulation of the European Parliament and of the Council on the protection of individual with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation),(COM(2012)0011 – C7 0025/2012 – 2012/0011(COD)), Compromise amendments on Articles 1-29 (hereinafter abbreviated as PGDPR-LIBE_1-29). http://www.europarl.europa.eu/meetdocs/2009_2014/documents/libe/dv/comp_am_art_01-29/comp_am_art_01-29en.pdf [Dec. 10, 2013]. See also Article 3 (2), PGDPR.

ASOS API project at http://developer.asos.com/page [Sept. 29, 2013].

Article 23, Proposal for a regulation of the European Parliament and of the Council on the protection of individual with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation),(COM(2012)0011 – C7 0025/2012 – 2012/0011(COD)), Compromise amendments on Articles 30-91 (hereinafter abbreviated as PGDPR-LIBE_30-91). http://www.europarl.europa.eu/meetdocs/2009_2014/documents/libe/dv/comp_am_art_30-91/comp_am_art_30-91en.pdf [Dec. 15, 2013].

Art. 4, PDPI and Art. 4b, Proposal for a directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data (COM(2012)0010 – C7-0024/2012 – 2012/0010(COD)) (hereinafter abbreviated as PDPI-LIBE). Available: http://www.europarl.europa.eu/meetdocs/2009_2014/organes/libe/libe_20131021_1830.htm [Nov. 15, 2013].

Article 29 Data Protection Working Party, 2006a. Opinion 1/2006 on the application of the EU data protection rules to internal whistleblowing schemes in the fields of accounting, internal accounting controls, auditing matters, fight against, banking and financial crime.

Article 29 Data Protection Working Party, 2006b. Opinion 10/2006 on the processing of personal data by the Society for Worldwide Interbank Financial Telecommunication (SWIFT).

Article 29 Data Protection Working Party, 2005. Article 29 Working Party report on the obligation to notify the national supervisory authorities, the best use of exceptions and simplification and the role of the data protection officers in the European Union, Bruxelles. http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2005/wp106_en.pdf [10.12.13].

Article 29 Data Protection Working Party, 1997. Working Document: Notification, Bruxelles. http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/1997/wp8_en.pdf [10.12.13].

Auerbach, D., Mayer, J., Eckersley, P., 2013. What We Need to Know About PRISM, June 12. https://www.eff.org [10.12.13].

Bailey J. Systematic government access to private-sector data in Canada. Int. Data Privacy Law. 2012;2(4):207–219.

Barret, D., 2014. Phone and email records to be stored in new spy plan, in The Telegraph. http://www.telegraph.co.uk/technology/internet/9090617/Phone-and-email-records-to-be-stored-in-new-spy-plan.html [31.01.14].

Bigo, D., Carrera, S., Hernanz, N., Jeandesboz, J., Parkin, J., Ragazzi, F., Scherrer, A., 2013. The US surveillance programmes and their impact on EU citizens' fundamental rights, Study for the European Parliament, PE 493.032, Sept. 2013.

Boyd, D., Crawford, K., 2011. “Six Provocations for Big Data,” presented at the “A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society”. Oxford Internet Institute, Oxford, United Kingdom, Available: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431 [10.12.13].

Bowden, C., 2013. The US surveillance programmes and their impact on EU citizens' fundamental rights, Study for the European Parliament, PE 474.405, 15 October 2013.

Brandimarte, L., Acquisti, A., Loewenstein, G., 2010. Misplaced Confidences: Privacy and the Control Paradox. Presented at the 9th Annual Workshop on the Economics of Information Security, Cambridge, MA, USA. http://www.heinz.cmu.edu/~acquisti/papers/acquisti-SPPS.pdf [15.02.13].

Brown I. Government access to private-sector data in the United Kingdom. Int. Data Privacy Law. 2012;2(4):230–238.

Brown, I., 2013. Lawful Interception Capability Requirements. Computers & Law, Aug./Sep. 2013. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2309413

Brown, I., 2013 Expert Witness Statement for Big Brother Watch and Others Re: Large-Scale Internet Surveillance by the UK. Application No: 58170/13 to the European Court of Human Rights. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2336609, Sept. 27, [31.01.14].

Bygrave LA. Data Protection Law: Approaching Its Rationale, Logic and Limits. The Hague, London, New York: Kluwer Law International; 2002.

Cannataci, J.A., Caruana, M. Report: Recommendation R (87) 15—Twenty-five years down the line. http://www.statewatch.org/news/2013/oct/coe-report-data-privacy-in-the-police-sector.pdf [31.01.14].

Cannataci JA. Defying the logic, forgetting the facts: the new European proposal for data protection in the police sector. Eur. J. Law Technol. 2013. ;3(2). Available: http://ejlt.org/article/view/284/390.

Canadian Goldcorp Inc. case at http://www.ideaconnection.com/open-innovation-success/Open-Innovation-Goldcorp-Challenge-00031.html [Sept. 10, 2013].

Cate FH, Cate BE. The Supreme Court and information privacy. Int. Data Privacy Law. 2012;2(4):255–267.

Cate, F.H., Mayer-Schönberger, V., 2012. Notice and Consent in a World of Big Data. Microsoft Global Privacy Summit Summary Report and Outcomes. http://www.microsoft.com/en-au/download/details.aspx?id=35596 [15.09.2013].

Cate FH, Dempsey JX, Rubinstein IS. Systematic government access to private-sector data. Int. Data Privacy Law. 2012;2(4):195–199.

Cate, F.H. The failure of fair information practice principles. In: Winn, J. (Ed.), Consumer Protection in the Age of the Information Economy. Aldershot-Burlington, Ashgate, 2006, pp. 343–345. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972

Clarke, R., Morell, M., Stone, G., Sunstein, C., Swire, P., 2014. Liberty and Security in a Changing World: Report and Recommendations of The President’s Review Group on Intelligence and Communications Technologies. http://www.whitehouse.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf [31.01.14].

Commission of the European Communities, 2009. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, October 28. http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2009:0479:FIN:EN:PDF [31.01.14].

Congressional Research Service. CRS Report for Congress, 2008. Data Mining and Homeland Security: An Overview. www.fas.org/sgp/crs/homesec/RL31798.pdf [10.12.13].

Conway, S., 2014. Big Data Cloud Delivers Military Intelligence to U.S. Army in Afghanistan, in Datanami, 6 February 2012. http://snurl.com/284ak5j [31.01.14].

Corradino E. The fourth amendment overseas: is extraterritorial protection of foreign nationals going too far?. Fordham Law Rev. 1989;57(4):617.

Council of Europe, 2008. Guidelines for the cooperation between law enforcement and internet service providers against cybercrime, Strasbourg, 1–2 April 2008. http://www.coe.int/t/informationsociety/documents/Guidelines_cooplaw_ISP_en.pdf

CTO labs, 2012. White Paper: Big Data Solutions for Law Enforcement. http://ctolabs.com/wp-content/uploads/2012/06/120627HadoopForLawEnforcement.pdf [31.01.14].

Cyganiak, R., Jentzsch, A., 2011. The Linking Open Data cloud diagram. http://lod-cloud.net/, Sept. 19, 2011 [Sept. 4, 2013].

DARPA. Total Information Awareness Program (TIA), 2002. System Description Document (SDD), Version 1.1. http://epic.org/privacy/profiling/tia/tiasystemdescription.pdf [10.12.13].

Deloitte, 2012. Open data. Driving growth, ingenuity and innovation. London, pp. 16–20. Available: http://www.deloitte.com/assets/dcom-unitedkingdom/local%20assets/documents/market%20insights/deloitte%20analytics/uk-insights-deloitte-analytics-open-data-june-2012.pdf [10.12.13].

Enel, 2013. Open data project sharing datasets regarding Enel, an Italian multinational group active in the power and gas sectors. http://data.enel.com/ [Sept. 29, 2013].

European Commission, 2010. Study on the economic benefits of privacy enhancing technologies or the Comparative study on different approaches to new privacy challenges, in particular in the light of technological developments. http://ec.europa.eu/justice/policies/privacy/docs/studies/new_privacy_challenges/final_report_en.pdf [31.01.14].

European Commission, Proposal for a regulation of the European Parliament and the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final, Brussels, 25 January 2012 (hereinafter abbreviated as PGDPR). Available: http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf [Dec. 10, 2013].

European Commission, 2013. “What does the Commission mean by secure Cloud computing services in Europe?”, MEMO/13/898, 15 October. http://europa.eu/rapid/press-release_MEMO-13-898_en.htm

European Parliament, Directorate General for Internal Policies, Policy Department C: Citizens’ Rights and Constitutional Affairs, Civil Liberties, Justice and Home Affairs, 2013a. The US National Security Agency (NSA) surveillance programmes (PRISM) and Foreign Intelligence Surveillance Act (FISA) activities and their impact on EU citizens. http://info.publicintelligence.net/EU-NSA-Surveillance.pdf [10.12.2013], pp.14–16.

European Parliament, Directorate General for Internal Policies, Policy Department C: Citizens’ Rights and Constitutional Affairs, Civil Liberties, Justice and Home Affairs, 2013b. National Programmes for Mass Surveillance of Personal data in EU Member States and Their Compatibility with EU Law. http://www.europarl.europa.eu/committees/it/libe/studiesdownload.html?languageDocument=EN&file=98290 [10.12.13], pp.12–16.

European Parliament, Directorate-General for Internal Policies, Policy Department Citizens’ Right and Constitutional Affairs, 2012. Fighting cyber crime and protecting privacy in the cloud. http://www.europarl.europa.eu/committees/en/studiesdownload.html?languageDocument=EN&file=79050 [31.01.14].

European Parliament, 2001. Report on the existence of a global system for the interception of private and commercial communications (ECHELON interception system). http://www.fas.org [10.12.2013].

European Parliament, 2013c. Resolution of 4 July 2013 on the US National Security Agency surveillance programme, surveillance bodies in various Member States and their impact on EU citizens' privacy. http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P7-TA-2013-0322+0+DOC+XML+V0//EN [10.12.13].

Executive Office of the President, National Science and Technology Council, 2013. Smart Disclosure and Consumer Decision Making: Report of the Task Force on Smart Disclosure. Washington. http://www.whitehouse.gov/sites/default/files/microsites/ostp/report_of_the_task_force:on_smart_disclosure.pdf [10.12.13].

Ferguson, A., 2012. Predictive policing: the future of reasonable suspicion. Emory Law J. 62, 259–325. http://www.law.emory.edu/fileadmin/journals/elj/62/62.2/Ferguson.pdf [31.01.14].

Gillespie A. Regulation of Internet Surveillance. Eur. Human Rights Law Rev. 2009;4:552–565.

Golle P. Revisiting the uniqueness of simple demographics in the US population. In: Proc. 5th ACM Workshop on Privacy in Electronic Society. 2006:77–80.

Greenleaf, G., Tian, G., 2013. China Expands Data Protection through 2013 Guidelines: A ‘Third Line’ for Personal Information Protection (With a Translation of the Guidelines). Privacy Laws Business Int. Rep. 122, 1. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2280037 [25.10.13].

Greenleaf G. China: NPC Standing Committee takes a small leap forward. Privacy Laws Business Int. Rep. 2013;121:1–7.

Jaycox, M.M., Opsahl, K., 2013. CISPA is Back. https://www.eff.org/cybersecurity-bill-faq [31.01.14].

Kirkpatrick D. The Facebook Effect: The Inside Story of the Company That Is Connecting the World. New York: Simon and Schuster; 2013.

Kroes, N., 2011. The Digital Agenda: Europe's key driver of growth and innovation. SPEECH/11/629. Brussels. http://europa.eu/rapid/press-release_SPEECH-11-629_en.htm [10.12.13].

Mantelero A. Cloud computing, trans-border data flows and the European Directive 95/46/EC: applicable law and task distribution. Eur. J. Law Technol. 2012. ;3(2). http://ejlt.org//article/view/96.

Mantelero A. Competitive value of data protection: the impact of data protection regulation on online behaviour. Int. Data Privacy Law. 2013a;3(4):231–238.

Mantelero A. The EU Proposal for a General Data Protection Regulation and the roots of the ‘right to be forgotten’. Computer Law Security Rev. 2013b;29:229–235.

Marton, A., Avital, M., Blegind Jensen, J., 2013. Reframing Open Big Data, presented at ECIS 2013, Utrecht, Netherlands, http://aisel.aisnet.org/ecis2013_cr/146 [10.12.2013].

Mayer-Schönberger V, Cukier K. Big Data: A Revolution That Will Transform How We Live, Work and Think. London: John Murray Publishers; 2013 154–156.

Ministry of Industry and Information Technology Department Order, Several Regulations on Standardizing Market Order for Internet Information Services, published on 29 December 2011. http://www.miit.gov.cn/n11293472/n11293832/n12771663/14417081.html [25.10.13].

National Research Council, 2008. Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment. Washington, D.C., Appendix I and Appendix J.

Nike Inc., http://www.nikeresponsibility.com/report/downloads [Sept. 29, 2013].

O’Floinn M, Ormerod D. Social networking sites RIPA and criminal investigations. Crim. L.R. 2001;24:766–789.

Ohm P. Broken Promises of privacy: responding to the surprising failure of anonymization. UCLA L. Rev. 2010;57:1701–1777.

Open Knowledge Foundation (OKF), http://okfn.org [Sept. 10, 2013].

Pell SK. Systematic government access to private-sector data in the United States. Int. Data Privacy Law. 2012;2(4):245–254.

Reidenberg J. The Data Surveillance State in the US and Europe. Wake Forest Law Rev. 2013. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2349269#!.

Rubinstein IS. Big Data: the end of privacy or a new beginning?. Int. Data Privacy Law. 2013. ;3(2):74–87. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2157659.

Schwartz PM. Systematic government access to private-sector data in Germany. Int. Data Privacy Law. 2012;2(4):289–301.

Schwartz PM, Solove DJ. The PII problem: privacy and a new concept of personally identifiable information. New York University L. Rev. 2011;86:1841–1845.

Svantesson DBJ. Systematic government access to private-sector data in Australia. Int. Data Privacy Law. 2012;2(4):268–276.

Sweeney L. Foundations of Privacy Protection from a Computer Science Perspective. In: Proc. Joint Statistical Meeting. Indianapolis: AAAS; 2000a.

Sweeney L. Simple Demographics Often Identify People Uniquely. Data Privacy Working Paper 3 Pittsburgh: Carnegie Mellon University; 2000b.

Swire P. From real-time intercepts to stored records: why encryption drives the government to seek access to the cloud. Int. Data Privacy Law. 2012;2(4):200–206.

Tene O, Polonetsky J. Big Data for all: privacy and user control in the age of analytics. Nw. J. Tech. Intell. Prop. 2013;11:239–274. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2149364 [10.12.13].

Tene O, Polonetsky J. Privacy in the age of big data: a time for big decisions. Stan. L. Rev. Online. 2012;64:63–69.

Tene O. Systematic government access to private-sector data in Israel. Int. Data Privacy Law. 2012;2(4):277–288.

The Aspen Institute, 2010. The Promise and Peril of Big Data. David Bollier Rapporteur, Washington. http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promise_and_Peril_of_Big_Data.pdf [10.12.13].

Zang H, Bolot J. Anonymization of location data does not work: a large-scale measurement study. In: Proc. MobiCom '11 Proceedings of the 17th Annual International Conference on Mobile Computing and Networking; 2011:145–156.

The Decision of the Standing Committee of the National People’s Congress on Strengthening Internet Information Protection, adopted at the 30th Session of Standing Committee of the 11th National People’s Congress on December 28, 2012. http://ishimarulaw.com/strengthening-network-information-protectionoctober-china-bulletin/ [25.10.13].

The European Cloud Partnership (ECP), https://ec.europa.eu/digital-agenda/node/609 [10.12.13].

Tsuchiya M. Systematic government access to private-sector data in Japan. Int. Data Privacy Law. 2012;2(4):239–244.

Turow, J., Hoofnagle, C., Mulligan, D., Good, N., Grossklags, J., 2007. The Federal Trade Commission and Consumer Privacy in the Coming Decade. ISJLP 3, 723–749. http://scholarship.law.berkeley.edu/facpubs/935

United States General Accounting Office, 2011. Record Linkage and Privacy: Issues in Creating New Federal Research and Statistical Information. http://www.gao.gov/assets/210/201699.pdf [10.12.13].

van Hoboken, J.V.J., Arnbak, A.M., van Eijk, N.A.N.M., 2012. Cloud Computing in Higher Education and Research Institutions and the USA Patriot Act. Institute for Information Law University of Amsterdam. http://www.ivir.nl/publications/vanhoboken/Cloud_Computing_Patriot_Act_2012.pdf [25.10.13].

Veenswijk, M., Koerten, H., Poot, J., 2012. Unravelling Organizational Consequences of PSI Reform—An In-depth Study of the Organizational Impact of the Reuse of Public Sector Data. ETLA, Helsinki, http://www.etla.fi/en/julkaisut/dp1275-en/ [10.12.13].

Wang Z. Systematic government access to private-sector data in China. Int. Data Privacy Law. 2012;2(4):220–229.

Wolfe, N., 2012. The new totalitarianism of surveillance technology. Guardian [On-line]. http://www.theguardian.com/commentisfree/2012/aug/15/new-totalitarianism-surveillance-technology [31.01.14].


1 See below par. 3.

2 The creation of datasets of enormous dimension (Big Data) and new powerful analytics make it possible to draw inferences about unknown facts from statistical occurrence and correlation, with results that are relevant in socio-political, strategic and commercial terms. Despite the weakness of this approach, which focuses more on correlation than on statistical evidence, it is useful for predicting and perceiving the birth and evolution of macro-trends, which can later be analyzed in a more traditional statistical way in order to identify their causes.

3 See Article 8 (a) of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, opened for signature in Strasbourg on 28 January 1981, recital 48 in the preamble to Directive 95/46/EC and Articles 18–21 of the Directive.

4 On the differences and interactions between Big Data and open data see A. Marton, M. Avital and T. Blegind Jensen, 2013, above at fn. 2, who point out that “while Big Data is about distributed computation and infrastructures, open data is about standards on how to make data machine-readable, and hence linkable.” From a European perspective, see the recently approved Directive 2013/37/EU of the European Parliament and of the Council of 26 June 2013, amending Directive 2003/98/EC on the re-use of public sector information. Available: http://eur-lex.europa.eu/JOHtml.do?uri=OJ:L:2013:175:SOM:EN:HTML [Dec. 10, 2013].

5 Access to data does not mean that everyone will immediately have new knowledge and predictive capacity because, as mentioned above, technical equipment is necessary. However, the availability of the data permits citizens to pool their economic and cultural resources, even without a business-oriented action, in order to constitute groups dedicated to the analysis and processing of the raw data; see the projects and activities of the Open Knowledge Foundation (OKF), a non-profit organisation founded in 2004 and dedicated “to promoting open data and open content in all their forms – including government data, publicly funded research and public domain cultural content”. From this perspective, social media offer clear examples of the virtues of open data and open architecture (e.g., Dbpedia, Wikipedia, etc.).

6 See the various articles published by The Guardian. Available: http://www.guardian.co.uk; see also the various documents available at https://www.cdt.org [Dec. 10, 2013].

7 Further sources on TIA are available at http://epic.org/privacy/profiling/tia/.

8 See fn. 6.

9 Foreign Intelligence Surveillance Act (50 U.S.C. §§ 1801–1885c).

10 See Communications Assistance for Law Enforcement Act (18 USC § 2522).

11 Protecting Children From Internet Pornographers Act of 2011.

12 See Letter from John Cunliffe (UK’s Permanent Representative to the EU) to Juan Lopez Aguilar (Chairman of the European Parliament Committee on Civil Liberties, Justice and Home Affairs), 1 October 2013. Available: http://snurl.com/282nwfn [Jan. 31, 2014].

13 See United States v. Miller (425 US 435 [1976]). In this case the United States Supreme Court held that the “bank records of a customer’s accounts are the business records of the banks and that the customer can assert neither ownership nor possession of those records”. The same principle could be applied to an Internet Service Provider.

14 Google+ currently has 400 million users, Instagram 90 million, Facebook 963 million and Twitter 637 million. Retrieved October 28, 2013, from http://bgr.com/2012/09/17/google-plus-stats-2012-400-million-members/; http://www.checkfacebook.com [Jan. 31, 2014]; http://socialfresh.com/1000instagram/ [Jan. 31, 2014]; http://twopcharts.com/twitter500million.php [Jan. 31, 2014].

15 See http://www.x1discovery.com/social_discovery.html [Jan. 31, 2014].

16 See Rotaru v Romania (App. No. 28341/95) (2000) 8 B.H.R.C. at [43].

17 See 2012 NY Slip Op 22175 [36 Misc 3d 868].

18 See United States v. Miller (425 US 435 [1976]).

19 Although only information regarding natural persons falls under the European regulation on data protection, data concerning clients, suppliers, employees, shareholders and managers has a relevant strategic value in competition.

20 See also Recital 21, PGDPR and Recital 21, PGDPR-LIBE_1-29.

21 This area will fall under the new Proposal for a Directive on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data, COM(2012) 10 final, Brussels, 25 January 2012 (hereinafter abbreviated as PDPI). Available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2012:0010:FIN:EN:PDF [Dec. 10, 2013]; see the Explanatory Memorandum of the Proposal.

22 With regard to personal information collected by public entities, Directive 95/46/EC permits data collection without the consent of the data subject in various cases; however, notice to data subjects is necessary also in these cases. See Articles 7, 8 and 10, Directive 95/46/EC.

23 See Article 23, PGDPR-LIBE_1-29 and also Article 23, PGDPR.

24 See Article 15, PGDPR-LIBE_1-29 and also Article 18, PGDPR.

25 See above fn. 21.

26 See Article 41 (2) (a), PGDPR-LIBE_30-91 and also Art. 41 (2) (a), PGDPR.

27 Art. 26 (1) (b) justifies the transfer that is necessary or legally required on important public interest grounds, or for the establishment, exercise or defence of legal claims (Article 26 (1) (d) of the Directive).

28 “Any other interpretation would make it easy for a foreign authority to circumvent the requirement for adequate protection in the recipient country laid down in the Directive”.

29 See above § 2.

30 It is necessary to underline that the guarantees provided by the U.S. Constitution in the event of U.S. government requests for information do not apply to European citizens; likewise, legal protection under specific U.S. laws applies primarily to U.S. citizens and residents.

31 See Privacy (Protection) Bill, 2013, updated third draft. Available: http://cis-india.org/internet-governance/blog/privacy-protection-bill-2013-updated-third-draft [Jan. 31, 2014].

32 See Art. 42 (1), Proposal for a General Data Protection Regulation, draft Version 56, November 29th, 2011.

33 See Art. 42 (2), Proposal for a General Data Protection Regulation, draft Version 56, November 29th, 2011. (“[The European Parliament] Regrets the fact that the Commission has dropped the former Article 42 of the leaked version of the Data Protection Regulation; calls on the Commission to clarify why it decided to do so; calls on the Council to follow Parliament's approach and reinsert such a provision”).

34 See Article 43a, PGDPR-LIBE_30-91. This provision does not clearly define the assignment of competence between the National Supervisory Authority and the Judicial Authority with regard to requests for judicial cooperation.

35 See PDPI, Explanatory Memorandum (SEC(2012) 72 final).

36 Recommendation No. R (87) 15 regulating the use of personal data in the police sector.

37 Framework Decision 2008/977/JHA on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters, (2008), Official Journal L 350, pp. 60–71.

38 Art. 5, PDPI-LIBE.

39 Art. 9a, PDPI-LIBE.

40 Art. 19, PDPI.

41 Art. 30, PDPI.
