(A five-minute read)
We all know, or at least should be aware, that our world is becoming less and less transparent thanks to what we call Artificial Intelligence.
The challenge is this: the promise of the Internet was that it would connect people from different backgrounds, with different beliefs, and across disparate locations. The trend toward personalization by AI is impeding the fulfillment of that promise.
What is becoming more and more apparent is that while most personalization on the web is algorithmically driven, aren’t we implicitly informing those algorithms through the choices we’ve previously made while interacting with content?
Couldn’t you then, in theory, manipulate the filter so you see what you want to see, or are there too many factors beyond our control?
Even if you’re completely logged out of Google, on a new computer, the company can track 57 signals about you — from what kind of laptop you’re using to what your IP address is to what the font size in your browser is. Already, that gives a lot of important clues about age, income and demographics.
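Those passive signals add up quickly. Here is a minimal Python sketch of the idea; the signal names and values below are invented for illustration (this is not Google's actual list), but they show how a handful of passive browser characteristics can be combined into a single near-unique identifier, no login required:

```python
import hashlib

# Toy illustration: a few passive browser signals (all values invented).
signals = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)",
    "ip_address": "203.0.113.42",
    "font_size": "16px",
    "screen": "2560x1600",
    "timezone": "UTC+1",
}

# Concatenate the signals in a stable order and hash them: the same
# combination of signals always yields the same identifier.
fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(signals.items())).encode()
).hexdigest()

print(fingerprint[:16])  # a stable identifier for this signal combination
```

Real trackers use far more signals than this, of course, which is precisely what makes the combination so identifying.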
It’s ironic — the promise of personalization is that it gives us our own personal view of the world.
But the challenge is that a lot of the time, it’s actually pushing us toward a stereotyped, simplified version of ourselves: “This person is male, so we’ll show him more gadget and car news.”
So let me ask you:
Many of the major social, discovery and media sites on the Internet now implement some type of personalization. Do you feel these sites have a responsibility to educate consumers about how their information is being filtered? Do you think users should be able to opt out of personalization?
I would say yes, on both counts.
In an increasingly complex and vast media landscape, it is crucial that we maintain our private lives.
A lot of the personalization that exists today just serves up information junk food, but a growing portion is being curated by robots — computer algorithms that are filtering content and deciding what we get to see.
It may be delicious, but it doesn’t feed the soul.
Now it’s possible to live in a bubble where that stuff doesn’t ever show up — you’d never know it’s happening.
Take the Facebook “Like” button — the main way that information gets spread on Facebook. “Like” isn’t a neutral word — it’s easy to Like “I just finished a marathon,” and hard to Like “cell phones may cause cancer.”
So some kinds of information get through, and others don’t, and when that’s happening in the Facebook News Feed, where an increasing number of folks get their news, it’s a real problem.
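The asymmetry described above is easy to reproduce. Here is a toy sketch (the stories and like counts are invented) of an engagement-ranked feed, where easily "likeable" items crowd out the important but awkward ones:

```python
def rank_feed(items):
    """Sort feed items by predicted engagement (here, just likes), highest first."""
    return sorted(items, key=lambda item: item["likes"], reverse=True)

feed = [
    {"story": "I just finished a marathon", "likes": 250},
    {"story": "Cute cat compilation", "likes": 900},
    {"story": "Cell phones may cause cancer", "likes": 12},
]

# The hard-to-Like story sinks to the bottom of the feed.
for item in rank_feed(feed):
    print(item["likes"], item["story"])
```

A real news feed weighs many more signals than likes, but any ranking driven by engagement inherits this bias: what is comfortable to endorse rises, what is uncomfortable disappears.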
Most people aren’t aware that their Google search results, Yahoo News links, or Facebook feeds are being tailored in this way.
Filters can provide relevance and combat information overload, but with so much riding on automated decisions, ensuring that algorithms deal with humans fairly is now more important than ever.
I recently read that within five years your smartphone could be reading your mind, via a brain-computer interface.
Personalization couldn’t exist without the massive dossiers of personal data being collected by big companies online these days. And it’s a problem because consumers don’t have much control over that.
The current laws around personal data just don’t contemplate a world in which a click on one website changes what you see on an entirely different one.
Almost all popular websites, from search engines to social networks to media outlets, are now utilizing filters in some way to personalize content for visitors.
When websites show us only what we like, we get cut off from the diverse points of view that can enrich our understanding of the world.
We get Donald Trumps.
Privacy is about controlling what the world is allowed to know about you. This is about controlling what you’re able to see of the world — what your filters let through and what they don’t.
It’s time to wake up.
We can lose sight of our common problems, but they don’t lose sight of us.
It’s only a matter of time before our fidelity and loyalty cards are linked up to our personal data held by banks, e-commerce sites and social media. If not already.
We will then be looking at a citizen character score, which will take credit scores to a whole new level, turning them into life scores by tracking anything and everything we do. The scary bit is what is tracked, and by whom.
I hear you saying that this will never be accepted.
It is already on the cards for people living in China and Singapore. Humans and robot algorithms, living in peaceful harmony. Where you go, what you buy, who you know, how many points are on your driving licence, how your friends rate you.
The scores would serve not just to indicate an individual’s credit risk; they could be used across a vast array of applications and organisations, from government, benefits, hospital operations, visas and education down to every field that makes up society, with prison sentences, landlords, employers and even romantic partners all drawing on them to gauge an individual’s character.
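To make the idea concrete, here is a toy sketch of how such a composite life score might be assembled. The weights, input names and scale are entirely invented; real systems disclose neither their inputs nor their weightings, which is part of the problem:

```python
# Invented weights for invented behavioural inputs, each normalised to 0-1.
WEIGHTS = {
    "credit_history": 0.35,
    "purchase_record": 0.25,
    "driving_points": 0.15,
    "social_circle": 0.25,   # how your friends rate you
}

def citizen_score(inputs):
    """Weighted sum of normalised behavioural inputs, scaled to 0-1000."""
    return round(1000 * sum(WEIGHTS[k] * inputs[k] for k in WEIGHTS))

person = {
    "credit_history": 0.9,
    "purchase_record": 0.7,
    "driving_points": 1.0,
    "social_circle": 0.5,
}
print(citizen_score(person))  # a single number standing in for a whole life
```

The arithmetic is trivial; the power lies in who chooses the inputs and the weights, and in what doors the resulting number opens or closes.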
All stored in the Cloud. Which comes in many different forms.
Ubiquitous network access: self-service, on-demand access to computing capabilities, most often provisioned automatically by the service provider, without the need for human interaction.
Cloud computing is not the easiest of terms to define, or to explain what it all actually means. Run by the likes of Google, Twitter, Gmail and Facebook, the Cloud is as elusive as grabbing a cloud itself.
Perhaps we can blame it all on Leonhard Euler, one of the most prolific mathematicians in history and a great inventor of canonical notation.
(An Euler path is a path that uses every edge of a graph exactly once.)
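Euler's own result gives a simple test: a connected undirected graph has an Euler path exactly when zero or two of its vertices touch an odd number of edges. A short Python sketch of that degree-count check, tried against the seven bridges of Königsberg that started it all:

```python
from collections import defaultdict

def has_euler_path(edges):
    """Return True if a connected undirected graph with these edges has an
    Euler path, i.e. at most two vertices of odd degree (connectivity is
    assumed, not checked)."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    odd = sum(1 for d in degree.values() if d % 2 == 1)
    return odd in (0, 2)

# The seven bridges of Königsberg: four land masses A-D, seven bridges.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]
print(has_euler_path(bridges))  # False: all four land masses have odd degree
```

Which is exactly why the citizens of Königsberg could never walk every bridge exactly once.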
One way or the other, to use a Trumpism: it ain’t going to be great unless we build algorithms that have a sense of civic purpose embedded in them, giving us both entertainment and the information we really need, not just profit.
All comments appreciated. All like clicks chucked in the bin.
Oh, and sorry about the line spacing in this post; I just cannot figure out how to correct it.