
(Seventeen-minute read)

We know that we are living through a climate crisis, a mass extinction and an era of normalised pollution that harms our health, but we are also confronting an age of technology in which algorithms (and the apps built on them) are changing society to benefit the few while exploiting the many.

There are many examples of algorithms making big decisions about our lives, without us necessarily knowing how or when they do it.

Every “like”, watch and click is stored. Extreme content simply does better than nuance on social media. And the algorithms know that.
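As a toy illustration of how stored signals turn into a ranking, here is a minimal sketch. The signal names and weights are invented for this example; no real platform's formula is public or this simple:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    watch_seconds: int
    clicks: int

def engagement_score(post: Post) -> float:
    """Hypothetical weighting: every stored signal feeds the score."""
    return post.likes * 1.0 + post.watch_seconds * 0.1 + post.clicks * 0.5

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed simply surfaces whatever scores highest -- nuance loses
    # to outrage whenever outrage gathers more raw signals.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under any engagement-maximising weights, the post that harvests more signals outranks the more nuanced one by construction; that is the whole mechanism.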

Algorithms are the black boxes of modern life.

We can see them at work in the world. We know they’re shaping outcomes all around us. But most of us have no idea what they are — or how we’re being influenced by them.

Algorithms are making hugely consequential decisions in our society on everything from medicine to transportation to welfare, benefits to criminal justice and beyond. Yet the general public knows almost nothing about them, and even less about the engineers and coders who are creating them behind the scenes.

Algorithms are quietly changing the rules of human life, and whether their benefits ultimately outweigh the costs remains an open question.

Are we making a mistake by handing over so much decision-making authority to these programs?

Will we blindly follow them wherever they lead us?

Algorithms can produce unexpected outcomes, especially machine-learning algorithms that can program themselves.

Since it’s impossible for us to anticipate all of these scenarios, can’t we say that some algorithms are bad, even if they weren’t designed to be?

Every social media platform, every algorithm that becomes part of our lives, is part of this massive unfolding social experiment.

Billions of people around the world are interacting with these technologies, which is why the tiniest changes can have such a gigantic impact on all of humanity.

I think the right attitude is somewhere in the middle:

We shouldn’t blindly trust algorithms, but we also shouldn’t dismiss them altogether. The problem is that algorithms don’t understand context or nuance. They don’t understand emotion and empathy in the way that humans do, and they are eroding our ability to think and decide for ourselves.

This is already happening: the role of humans is being side-lined, and that is a dangerous thing to allow.

Algorithms will eventually combine in ways that blur the distinction between life imitating technology and technology imitating life.

Who knows where the symbiotic relationship will end?

Fortunately, we’re galaxies away from simulating more complex animals, and even further away from replicating humans.

Unfortunately, we’re living in the technological Wild West, where anyone can collect private data on people without their permission and sell it to advertisers. We’re turning people into products, and they don’t even realize it. And people can make any claims they want about what their algorithm can or can’t do, even if it’s absolute nonsense, and no one can really stop them.

There is no one assessing whether or not they are providing a net benefit or cost to society.

There’s nobody doing any of those checks – except your supermarket loyalty card.

These cards reveal consumer patterns previously unseen and answer important questions. How does the average age of customers vary? How many come with families? What mobility patterns influence store visits? How many take public transportation? Should a store open for extended hours on certain days?
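A toy sketch shows how such questions fall straight out of raw loyalty-card records. The transactions and field layout here are invented purely for illustration:

```python
from collections import Counter
from datetime import datetime

# Hypothetical loyalty-card transactions: (card_id, timestamp, basket_size).
transactions = [
    ("c1", datetime(2024, 5, 3, 19, 30), 24),
    ("c2", datetime(2024, 5, 3, 20, 15), 31),
    ("c3", datetime(2024, 5, 4, 9, 5), 8),
    ("c1", datetime(2024, 5, 4, 19, 45), 19),
]

# Count visits per hour of day -- the pattern that answers
# "should the store stay open late on certain days?"
visits_by_hour = Counter(ts.hour for _, ts, _ in transactions)
busiest_hour = visits_by_hour.most_common(1)[0][0]
```

The same few lines of aggregation, run over millions of real transactions, are what turn a plastic card into a surveillance instrument.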

Algorithms are being used to help prevent crimes and help doctors get more accurate cancer diagnoses, and in countless other ways. All of these things are really, really positive steps forward for humanity; we just have to be careful in the way that we employ them.

We can’t do it recklessly. We can’t just move fast, and we can’t break things.


Sites such as YouTube and Facebook have their own rules about what is unacceptable and the way that users are expected to behave towards one another.

The EU introduced the General Data Protection Regulation (GDPR) which set rules on how companies, including social media platforms, store and use people’s data.

Consider how data was collected from a third-party app on Facebook called “thisisyourdigitallife”. Facebook recently confirmed that information relating to up to 87 million people was captured by the app, with approximately 1 million of these people being UK citizens.

It is very important to note that deleting/removing one of these apps, or deleting your Facebook account, does not automatically delete any data held by the app. Specific steps need to be taken within each app to request the deletion of any personal information it may hold.

If illegal content, such as “revenge pornography” or extremist material, is posted on a social media site, it has previously been the person who posted it, rather than the social media companies, who was most at risk of prosecution.

The urgent question is now: 

What do we do about all these unregulated apps?

“There’s an app for that” has become both an offer of help and a joke.

Schoolchildren are writing apps.

A successful app can now be the difference between complete anonymity and global digital fame.

A malicious app could bring down whole networks. 

Google’s Android operating system is coming up on the rails: despite launching nearly two years later, it has more than 400,000 apps, and in December 2011 passed the 10bn downloads mark. 

With the iPod and iPhone came the app boom: 31bn apps were downloaded to mobile devices in 2011, and mobile apps were forecast to generate $52bn of revenue by 2016 – 75% from smartphones and 25% from tablets.

Apps have also been important for streaming TV and film services such as Netflix and Hulu, as well as for the BBC’s iPlayer and BSkyB’s Sky Go – the latter now attracts 1.5 million unique users a month.

Some apps steal data or send pricey text messages.

Entire businesses are evolving around them. 

They are the new frontier in warfare, instructing drones.

No one can fearlessly chase the truth and report it with integrity.

They are shaping our lives in ways never imagined before.

Today there is an app for everything you can think of.

In a short span, Apple and Google have done what nobody ever dreamed about: fucked us.

Thanks to the gigantic rise of mobile app development technology, you can now choose digitally feasible ways of not knowing yourself.

The era of digitally smart and interactive virtual assistants has begun and will not cease.

Machines can control your home, your car, your health, your privacy, your lifestyle, your life – maybe not quite yet your mother. You leave behind gargantuan amounts of data for company owners.

It goes without saying that mobile apps have almost taken over the entire world.

Mobile apps have undoubtedly come a long way, giving us a whole new perspective in life: 

Living digital. 

Yes, there are countries trying to pass laws to place controls on platforms – laws supposed to make the companies protect users from content involving things like violence, terrorism, cyber-bullying and child abuse – but not on profit-seeking apps, trading apps (roughly 70% of Wall Street trading is governed by them), spying apps, or truth-distorting apps destroying what is left of democracy.

A democracy is a form of government that empowers the people to exercise political control, limits the power of the head of state, provides for the separation of powers between governmental entities, and ensures the protection of natural rights and civil liberties.

Democracy means “rule by the people,” but the people no longer apply when solutions to problems are decided by algorithms.

Are algorithms a threat to democracy?

It’s not a simple question to answer – because digitisation has brought benefits, as well as harm, to democracy. 

History has shown that democracy is a particularly fragile institution. In fact, of the 120 new democracies that have emerged around the world since 1960, nearly half have resulted in failed states or have been replaced by other, typically more authoritarian forms of government. It is therefore essential that democracies be designed to respond quickly and appropriately to the internal and external factors that will inevitably threaten them.

How likely is it that a majority of the people will continue to believe that democracy is the best form of government for them?

Digitisation brings all of us together – citizens and politicians – in a continuous conversation.

Our digital public spaces give citizens the chance to get their views across to their leaders, not just at election time, but every day of the year.

Is this true?

With so many voices all speaking at once, the result is a cacophony – a vast amount of information that it is not humanly possible to make sense of. And that, of course, is where the platforms come in.

Algorithms aren’t neutral.

The allure of Dataism and algorithmic decision-making forms the foundation of the now-clichéd Silicon Valley motto of “making the world a better place.”

Dataism is especially appealing because it is so all-encompassing.

With Dataism and algorithmic thinking, knowledge across subjects becomes truly interdisciplinary under the conceptual metaphor of “everything as algorithms”: lessons from one domain could theoretically be applied to another, accelerating scientific and technological advances for the betterment of our world.

These algorithms are the secret of success for these huge platforms. But they can also have serious effects on the health of our democracy, by influencing how we see the world around us.

When choices are made by algorithms, it can be hard to understand how they’ve made their decisions – and to judge whether they’re giving us an accurate picture of the world. It’s easy to assume that they’re doing what they claim to do – finding the most relevant information for us. But in fact, those results might be manipulated by so-called “bot farms”, to make content look more popular than it really is. Or the things that we see might not really be the most useful news stories, but the ones that are likely to get a response – and earn more advertising. 

The lack of shared reality is now a serious challenge for our democracy and algorithmically determined communications are playing a major role in it. In the current moment of democratic upheaval, the role of technology has been gaining increasing space in the democratic debate due to its role both in facilitating political debates, as well as how users’ data is gathered and used.

Democracy is at a turning point.

With the invisible hand of technology increasingly revealing itself, citizenship itself is at a crossroads. Manipulated masterfully by data-driven tactics, citizens find themselves increasingly slotted into the respective sides of an ever-growing and unforgiving ideological divide.


Algorithm see, algorithm do.

Policymaking must move from being reactive to actively future-proofing democracy against the autocratic tendencies and function creep of datafication and algorithmic governance.


Because today, a few big platforms are increasingly important as the place where we go for news and information, the place where we carry on our political debates. They define our public space – and the choices they make affect the way our democracy works. They affect the ideas and arguments we hear – and the political choices we believe we can make. They can undermine our shared understanding of what’s true and what isn’t – which makes it hard to engage in those public debates that are every bit as important, for a healthy democracy, as voting itself.

Digital intelligence and algorithmic assemblages can surveil, disenfranchise or discriminate, not because of objective metrics, but because they have not been subject to the necessary institutional oversight that underpins the realisation of socio-cultural ideals in contemporary democracies. The innovations of the future can foster equity and social justice only if the policies of today shape a mandate for digital systems that centres citizen agency and democratic accountability.

Algorithms Will Rule The World

There is a troubling trend in our increasingly digital, algorithm-driven world: the tendency to treat consumers as mere data entry points to be collected, analysed, and fed back into the marketing machine.

It is a symptom of an algorithm-oriented way of thinking that is quickly spreading throughout all fields of natural and social sciences and percolating into every aspect of our everyday life. And it will have an enormous impact on culture and society’s behaviour, for which we are not prepared.

In a way, the takeover of algorithms can be seen as a natural progression from the quantified self movement that has been infiltrating our culture for over a decade, as more and more wearable devices and digital services become available to log every little thing we do and turn them into data points to be fed to algorithms in exchange for better self-knowledge and, perhaps, an easier path towards self-actualization.

Algorithms are great for calculation, data processing, and automated reasoning, which makes them a super valuable tool in today’s data-driven world. Everything that we do, from eating to sleeping, can now be tracked digitally and generate data, and algorithms are the tools to organize this unstructured data and whip it into shape, preferably that of discernible patterns from which actionable insights can be drawn.
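As a deliberately tiny illustration of whipping raw numbers into a discernible pattern, here is a two-cluster split of a one-dimensional stream of tracked values – say, nightly sleep hours from a wearable. This is a toy sketch of the idea, not any product's actual algorithm:

```python
from statistics import mean

def two_means_1d(values, iterations=10):
    """Minimal 1-D two-cluster k-means: splits a raw stream of numbers
    into two discernible groups. Assumes the data really does contain
    two groups (a toy stand-in for real pattern mining)."""
    lo, hi = min(values), max(values)  # seed the two cluster centres
    for _ in range(iterations):
        # Assign each value to its nearest centre, then recompute centres.
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = mean(a), mean(b)
    return lo, hi

# Hypothetical nightly sleep durations, in hours, from a tracker.
sleep_hours = [6.0, 6.5, 6.2, 8.9, 9.1, 9.0]
weeknight, weekend = two_means_1d(sleep_hours)
```

From six raw numbers the algorithm surfaces an "actionable insight" – two sleep regimes of roughly 6.2 and 9.0 hours – which is exactly the kind of pattern that then gets sold back to us as self-knowledge.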

Without the algorithms, data is just data, and human brains are comparatively ill-equipped to deal with large amounts of it. All of which will have profound impact on our overall quality of life, for better and worse. There is even a religion that treats A.I. as its God and advocates for algorithms to literally rule the world.

This future is inevitable, as AI is beginning to disrupt every conceivable industry whether we like it or not—so we’re better off getting on board now.

As autonomous weapons play a crucial role on the battlefield, so-called ‘killer robots’ loom on the horizon. 

Fully autonomous weapons already exist.

We’re living in a world designed for – and increasingly controlled by – algorithms that are writing code we can’t understand, with implications we can’t control.

It takes you 500,000 microseconds just to click a mouse.

A lie that creates a truth. And when you give yourself over to that deception, it becomes magic.

Algorithm-driven systems typically carry an alluringly utopian promise of delivering objective and optimized results free of human folly and bias. When everything is based on data — and numbers don’t lie, as the proverb goes — everything should come out fair and square. As a result of this takeover of algorithms in all domains of our everyday life, non-conscious but highly intelligent algorithms may soon know us better than we know ourselves, luring us into an algorithmic trap that presents the most common-denominator, homogenized experience as the best option for everyone.

In the internet age, feedback loops move quickly between the digital and the real world.

The rapid spread of algorithmic decision-making across domains has profound real-world consequences on our culture and consumer behaviour, which are exacerbated by the fact that algorithms often work in ways that no one fully understands.

For example, the use of algorithms in financial trading is also called black-box trading for a reason.

Those characteristics of unknowability and, sometimes, intentional opacity also point to a simple yet crucial fact in our increasingly algorithmic world: whoever designs and owns the algorithms controls how data is interpreted and presented, often in self-serving ways.

In reaction to that unknowability, humans often start to behave in rather unpredictable ways, which leads to some unintended consequences. Ultimately, the most profound impact of the spread of Dataism and algorithmic decision-making is also the most obvious one: it is starting to deprive us of our own agency, of the chance to make our own choices and forge our own narratives.

The more trusting we grow of algorithms and their interpretation of the data collected on us, the less likely we are to question the decisions they automate on our behalf.

Lastly, it is crucial to bring a human element back into our decision-making.

We must make sure that platforms are transparent about the way these algorithms work – and hold them accountable for the decisions they make.

However, I believe this is no longer fully feasible, because it is especially difficult when those algorithms rely on artificial intelligence that makes up the rules of its own accord.

The ability to forge a cohesive, meaningful narrative out of chaos is still a distinct part of human creativity that no algorithm today can successfully imitate.

We need to create an AI ecosystem of trust, without undermining the great benefits we get from platforms.


We need to make sure that we, as a society, are in control.

If people from different communities do not, or cannot, integrate with one another they may feel excluded and isolated.

In every society, without exception, there exists what we could call a “behaviour diagram” of collective life, with social control being the means by which society preserves itself from various internal threats. China is a prime example.

Algorithms for profit, surveillance, rewards, power and more are undermining what is left of our values, changing our relationship with authority, and negating hierarchies and the authority of the law.

Hypothetical reasoning forward allows us to reason backwards to solve problems.  Process is all we have control over, not results.

All human comments appreciated. All like clicks and abuse chucked in the bin.

Contact: bobdillon33@gmail.com