Read this section to explore the need for responsible data use, the role of artificial intelligence, and the effects of data on people.
People and Data
Remedies
A vibrant debate is now under way about what public policies could help respond to these failures within and outside the data market, and how regulations might be applied in a sector that has until now been largely unregulated. Appropriate policies – helped by emerging technologies – could lead the data revolution to expand economic opportunities for more people. Part of this could be achieved by making the costs and benefits transparent and redistributing them more fairly across different players in the market.
Specific remedies could help address or minimize the risks and costs to individuals arising from the ways data markets function today. Areas that a personal data policy could address include overcoming the identified market failures – loss of privacy, control, and agency; exclusion from participation in the market; and unfair distribution of the market benefits among data market participants.
But little consensus exists for now on what remedies will work, and some approaches have yet to be tested. Current data policies are also highly fragmented, with diverging global, regional, and national regulatory approaches. Moreover, these remedies do not directly address the unequal power of individual users versus organizations (global platforms or states), an underlying issue in data markets. That imbalance might be addressed only through strong regulation or large-scale user action, but there is equally little agreement on how either might be achieved. Table 4.3 and the rest of this section outline emerging responses.
Table 4.3 Risks and Remedies
Risk | Remedy | Example
Loss of privacy | Legal frameworks to protect personal data from theft and misuse, to require consent for collection and use, to keep personal data accurate and relevant (where data subjects can access and correct their personal data), to define how such data can flow (including across borders), and to specify the mechanisms to assist individuals if violations occur | European General Data Protection Regulation (GDPR); APEC Privacy Framework; OECD Privacy Guidelines
Loss of agency | Informing individuals about when and how data is collected and used, including how their experiences are modified by algorithms based on that and others' data; allowing users to switch off such algorithms or withhold their data; clarity about data sources to minimize the risk of fake data or its derivatives influencing decisions | None, although some companies such as Google now allow users to "turn off" personalized search results
Loss of control | Legal frameworks that limit the collection of personal data and restrict use and disclosure to specific purposes; data subjects are notified about the purpose and disclosure of data collection, can opt out of data sharing between the data collector and other companies, and can choose to be forgotten | Canadian Personal Information Protection and Electronic Documents Act; European GDPR
Loss of trust | Reducing personal data breaches; business codes of conduct where regulation is weak or vague; acting on feedback from user communities | Data Science Code of Professional Conduct
Exclusion | Connecting people to better-quality, affordable internet | Universal technology access programs and digital literacy training
Privacy
Privacy protections have typically been ensured through legal frameworks. A global survey reported by UNCTAD shows that data and privacy protection legislation has been put in place in more than 100 economies, 66 of them developing or transitioning (see map ES.1). More than one-fifth of economies, primarily developing ones, have no such legislation, and few have developed comprehensive data protection laws.
Key attributes of such a legal framework include protection of personal data collected by organizations, such as effective and appropriate security to protect the data from theft and misuse. It is also generally accepted that organizations need to keep personal data accurate, relevant, and up to date, and that data subjects must be able to access and correct their personal data. Widely cited frameworks defining the rules around the privacy of personal data include the European General Data Protection Regulation (GDPR), the APEC Privacy Framework, and the OECD's Privacy Guidelines. The Council of Europe's Convention 108 is a foundational data protection initiative, a treaty that opened for ratification in 1981. The treaty intends to "secure in the territory of each Party for every individual, whatever his nationality or residence, respect for his rights and fundamental freedoms, and in particular his right to privacy, with regard to automatic processing of personal data relating to him ('data protection')".
The GDPR, which came into force in May 2018, enables better control over personal data, entitling individuals to protection of anonymity and pseudonymity and to the right to request erasure of their personal data (the "right to be forgotten"). Another novel feature is data portability, which gives individuals the right to request that their data be transferred to another controller and requires data controllers to use common formats. Cross-border personal data flows are also regulated, with onward transmission generally permitted only if the recipient country has adequate data protection laws. Businesses that do not comply with the regulation face significant fines.
The right of an individual to privacy is often balanced with the need to secure the greater public good. For example, even the Council of Europe's Convention 108 permits restrictions in cases when "overriding interests (e.g. State security, defense, etc.) are at stake". In other cases, privacy rules permit irreversibly anonymized data to be used for research or public interest activities. This balances the interests of individuals in safeguarding their privacy with the benefits of being able to use personal data, as described in the preceding sections.
Beyond legal frameworks, however, new approaches are emerging. These help given institutional capacity limitations, the difficulty of regulating across borders, and the "take-it-or-leave-it" nature of many services. For example, online services that embed privacy into their designs have emerged in messaging and search. A more detailed discussion is found in chapter 6.
Control
To overcome loss of control, collection of personal data should be transparent, and use or disclosure limited to specific purposes. Individuals should be notified about the purpose and disclosure of the data collection. One example is Canada's Personal Information Protection and Electronic Documents Act, passed in 2000 and overseen by the Privacy Commissioner of Canada. Under the act, individuals have the right to access the information held about them, challenge its accuracy, and give consent for personal information to be collected. Organizations are obliged to ensure data security, limit the data they collect, use personal data only for the purposes consented to by the consumer, and not retain the data once the purposes for collection no longer apply. The EU's GDPR also enhances individuals' control over personal data by enabling the "right to be forgotten," permitting them to control what personal data is available online or with data users. The rules also allow users to control how their personal data is used by those organizations.
Agency
Loss of agency can be averted by educating individuals about data collection methods and about how algorithms modify their experiences based on their data. The Data Privacy Project in New York City trains librarians, who in turn provide guidance on protecting personal data to the largely vulnerable patrons who use libraries' internet services. Some applications allow individuals to switch off predictive algorithms. For example, Google allows users to delete their past searches, prevent searches from being saved, or turn off personalized search results that might create an "echo chamber" by limiting their exposure to new sources of information.
Exclusion
Exclusion of individuals from data markets can be overcome in different ways. It is estimated that well over 2 billion people did not use the internet at all in 2016, either because they had no access, could not afford it, or did not know how to use it or did not want to. A significant proportion of these people live in rural areas of developing countries, where levels of internet infrastructure and incomes are often low. Exclusion from the data market can be overcome by introducing information and communication technology, particularly mobile telephony and the internet, among lower-income groups and by connecting more people through inexpensive phones.
Governments equally need to tackle the challenge of people who have the needed infrastructure within reach but do not use the internet because they lack digital literacy. This could be done by raising awareness of data-driven services (such as social networks, public services, and search engines), as the Indian government's Digital India Program of 2015 does. The program helps farmers get access to information about different wholesale markets in their community through digital apps on smartphones and has helped cut out middlemen (see Reuters Market Light 2015). Farmers can use this information to make better choices and avoid being beholden to centuries-old systems (Bergvinson 2017). By the end of 2015 the program had already helped increase farmers' incomes by 5–25 percent.
Trust – and the dominance of digital platforms
During the writing of this report, many episodes underscored the scale of the personal data economy, but also undermined the trust that people have in the organizations that have grown significantly in the data market. These episodes have included massive leaks of personal data, discovery of unapproved access to private data, attempts at manipulation of ostensibly neutral information sources, and sharing of personal information. The scale of these episodes is significant, given the reach and popularity of the organizations and platforms that they involve, such as Experian or Facebook.
Debate about the implications of these episodes is only just beginning, focusing on privacy of personal data, control over who accesses and uses people's data, and the agency of users. By one account, the organizations involved in transgressions may themselves have been unaware of the potential for trouble or unable to prevent it. But such accounts do little to shore up trust in these services. Even so, the scale of these organizations' networks and their importance might lead people to continue using them, even if less willingly.
It might be possible to instill greater trust through actions to remedy some of these other risks. It might also be possible to seek ways to manage data more collaboratively, for instance by adopting a code of conduct (such as the Data Science Code of Professional Conduct of the Data Science Association), and with more transparency in how data is managed and used. As the next section discusses, this may involve moving toward a more balanced personal data market in which users regain control over their data.