Some time ago I posted:
Big Data is leading us to Cultural De-Acceleration.
We are becoming increasingly “digitized.”
When you ask somebody from the industry, “What is Big Data?” they will usually reply that it describes the challenge faced by companies that collect and analyse high volumes of Internet data. Technically, “big data” also refers to the specialized tools required to store and analyse it.
However, this response says very little about the significance of today’s digital revolution.
When the Sloan Digital Sky Survey started in 2000, its telescope collected more data in its first week than had been amassed in the entire history of astronomy.
Walmart in the USA handles more than 1 million customer transactions every hour, feeding its databases with 2.5 petabytes, the equivalent of 167 times the books in America’s Library of Congress.
Facebook hosts tens of billions of photos, and God only knows what Google is up to.
The point is that the world now contains an unimaginably vast amount of digital information, and it is growing ever bigger, ever more rapidly.
In recent years Oracle, IBM, Microsoft, and their like have spent $15 billion buying up software companies specializing in data management and analytics.
Data has become the new raw material of big business.
The trail of clicks is valuable and can be sold and you would indeed be an idiot to think that it is having no effect on your life.
The way that information is managed touches all areas of life.
What is true now is that more of our lives and activities are being stored digitally.
Like any technology, knowledge can be used for social good or to make things worse for people. Digital monopolies will wield considerable power.
There is likely to be a power imbalance if this kind of new capability of “knowing” is not well-handled by society.
There is no reason to think that the changes we are witnessing today will be any less disruptive than the Industrial Revolution.
We’re going to end up reinventing what it means to have a human society.
Who you actually are is now determined by where you spend time and which things you buy.
Big data is increasingly about real behavior, and by analyzing this sort of data, scientists can tell an enormous amount about you. They can tell whether you are the sort of person who will pay back loans. They can tell whether you are likely to get diabetes.
I am not an Edward Snowden.
If we handle Big Data correctly it will bring massive benefits to us all: to our cities, to our environment, to our health, to almost everything.
Yet we also need a system that is flexible and adaptable enough to allow bright ideas and social, business, and research entrepreneurship to build a better future, without getting tangled up in unthinkingly risk-averse bureaucracy and red tape, and without the rich getting richer while the poor live in a desert of ignorance.
We want to ensure that there is a high-trust system for data sharing, one that mitigates many of the risks.
We need to think of solutions that are sound and strong, but not brittle.
What might those principles and solutions look like?
There are many problems to be resolved.
Who owns, controls, or has decision rights over data? Is it the collector of the data? They certainly have a financial interest. Is it the person the data is about? They certainly have an interest too.
In order to reap the benefits of the data revolution, it is clear that existing databases will be re-used and new databases will be created.
But then, who owns the resulting data? The re-user?
Will they be owned by the entity disclosing or collecting the data, or will they be open by default?
What about collective ownership of data, such as iwi (Māori tribal) data?
How are intellectual property rights derived from the data managed?
Who has decision rights over data? The collector? The provider (if different)? The person or entity that the data is about? If there is a data commons, who makes decisions?
Who is the data custodian and what are their obligations?
Who will look after the (newly created) databases?
For instance, who is responsible for the processing and storage of the data?
Where and how will data be stored, and for how long?
Who will provide safeguards for data quality and data accuracy?
Who is accountable when data gets stolen?
Who will have the authority to decide on data access rights?
What happens to data if the custodian gets liquidated or sold off (to another owner)?
Can the liquidator on-sell the data to pay off creditors?
How do we protect digital rights?
We are living in a pluralistic society, spread all over the world, with differences in cultural backgrounds and value perspectives. These cultural differences influence our privacy perceptions and the types of data we are willing to share.
How could we maintain our cultural diversity and remain an inclusive society in which the digital rights of everyone are protected?
What will be the social contract for a data-driven future?
The value of data no longer resides solely in its primary purpose. Value also resides in the re-use of data.
What do you give consent to when you cannot even imagine what possible future value that data may have?
Most data re-uses haven’t been imagined when the data is first shared, which raises the question of how individuals can give informed consent to an unknown.
Do individuals need to opt-in to an open-ended, multi-purpose arrangement?
Or are there perhaps other possible arrangements for informed consent we might be able to create?
Do children have digital rights to consent before a certain age?
What about your, and your family’s, rights when you die? Do we need digital wills?
Do we need the ability as individuals to opt out in the digital age, similar to how we can opt out of targeted marketing campaigns and telemarketing calls?
Do we have a right to revoke our consent to the use of our personal data? How could this be arranged?
Will the digital footprints and breadcrumbs you have left earlier in your digital life, such as the public posting of sensitive pictures, haunt you for the rest of your life or even beyond?
How do we ensure the best outcome in a global environment where digital data crosses borders?
The Internet has, with a few notable exceptions, no borders and the digital world is truly global.
There are major questions, even on a domestic scale about the provenance and ownership of data, but these are amplified when global sharing is considered.
There are times when governments do not want your consent.
This is obvious in cases like policing and protecting children from abuse, where society accepts that the privacy of some individuals must give way.
But there are more challenging cases.
What if we could use personal health data to do research, to save lives?
What about when governments and insurance companies want to use shared data to manage their own interests?
Perhaps there is a life-threatening medical condition that a small number of people have. We want to profile them and compare them to others without the condition. But suppose nobody opted in to share their data, even though the risk to their privacy is small.
When do your interests in privacy outweigh other people’s interests, or the collective interest? Consider tracking a pandemic outbreak: who would give emergency consent to open up all personal data to help stop the spread of a deadly disease?
Big data is big business for the criminal fraternity too, which is adapting well to our digital future. Identity theft is increasingly common.
Like most things in this world, the management of Big Data is beyond control.
Along with science and technology generally, Big Data is outrunning our morality.
There are a host of challenges and tensions for any society that wants to play in this space: the sorts of challenges that we need to consider when people come asking to hold and link up your data.
Challenges to safety from theft, bullying, or persecution; challenges to your autonomy and choice; challenges to freedom from interference from well-meaning (or otherwise) businesses and governments.
What can we do about it? You tell me.