
bobdillon33blog

~ Free Thinker.

Tag Archives: Big Data

THE BEADY EYE ASK’S. WILL THE WORLD EVER BE ABLE TO ACT AS ONE?

26 Wednesday Feb 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Climate Change., Dehumanization., Digital age., DIGITAL DICTATORSHIP., Disconnection., Environment, Fourth Industrial Revolution., Google, Human values., Humanity., Life., Our Common Values., Reality., Robot citizenship., Sustaniability, Technology, Technology v Humanity, The cloud., The common good., The essence of our humanity., The Future, The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE ASK’S. WILL THE WORLD EVER BE ABLE TO ACT AS ONE?

Tags

Algorithms., Artificial Intelligence., Big Data, Capitalism vs. the Climate., Climate change, Extinction, Technology, The Future of Mankind, Visions of the future.

 

Twenty-five minute read.

If humanity stopped fighting and competing against one another; if we bound together in a common cause, we could accomplish spectacular things.

Not true.

We would basically become mindless drones with no culture, because it would all be one culture with no distinct forms.

And if this were to become a reality, who would govern it?

China’s premier Wen Jiabao put forward the following equation in a speech: “Internet + Internet of Things = Wisdom of the Earth.”

How wrong he was. Nevertheless, by 2025 there are forecast to be around one trillion networked devices worldwide in the consumer and industrial sectors combined.

He should have said, "Internet + Internet of Things = becoming what we do not think", because people are truly not that intelligent.

In our houses, cars and factories, we're surrounded by tiny, intelligent devices that capture data about how we live and what we do. Now they are beginning to talk to one another. Soon we'll be able to choreograph them to respond to our needs, solve our problems, even save our lives.

Intelligent things all around us, coordinating their activities.

Coffee pots that talk to alarm clocks. Thermostats that talk to motion sensors. Factory machines that talk to the power grid and to boxes of raw material.

We might be seeing the dawn of an era when the most mundane items in our lives can talk wirelessly among themselves, performing tasks on command, giving us data we've never had before. This intelligence, once locked in our devices, will flow into the universe of physical objects.

We are already struggling to name this emerging phenomenon.

Some have called it the Internet of Things or the Internet of Everything or the Industrial Internet—despite the fact that most of these devices aren’t actually on the Internet directly but instead communicate through simple wireless protocols.

Others are calling it the Sensor Revolution.

I call it the Programmable (and Profitable) World: a world of profit-seeking algorithms.

Once we get enough of these objects onto our networks, they are no longer one-off novelties or data sources; they become a coherent system, a vast ensemble that can be choreographed, a body that can dance. But it will dance in the era of the cloud, the app and the walled garden, of Google, Apple and the rest, which means a system in which each node will not be equally empowered.

These connected objects will act more like a swarm of drones, a distributed legion of bots, far-flung and sometimes even hidden from view but nevertheless coordinated as if they were a single giant machine, relying on one another, coordinating their actions to carry out simple tasks without any human intervention.

So the world will act as one. Or will it?

Once we get there, that system will transform the world of everyday objects into a designable environment, a playground for coders and engineers.

It will change the whole way we think about the division between the virtual and the physical, putting intelligence from the cloud into everything we touch.

Call it “smart exploration.” 

The rise of the smartphone has supplied us with a natural way to communicate with these smart objects, which so far include watches, heart rate monitors, and even some new Nike shoes. Smartphones can make payments to merchants wirelessly instead of swiping a card, and some billboards use the same protocols to beam content to passers-by who ask for it. All of it is a way to sell more products and services, particularly Big Data-style analysis, to large corporate customers.

The yoking together of two or more smart objects is the trickiest step, because it represents the vertiginous shift from analysis, the mere harvesting of helpful data, to real automation.

In my view, no matter how thoroughly we might use data to fine-tune our lives and businesses, it is scary to take any decision out of human hands.

It can be hard to imagine the automation you might someday want, or even need, in your daily life. Yet there are all sorts of adjustments you make over the course of any given day that are reducible to simple if-then relationships.
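
As a rough illustration of how small those if-then relationships are once they become code, here is a minimal sketch in Python. The device names, the Rule class and the event format are all hypothetical, invented for this post; real automation platforms each have their own APIs.

```python
# A minimal, hypothetical if-then rule engine for connected devices.
# Nothing here is a real vendor API; it only shows how small these rules are.

from dataclasses import dataclass
from typing import Any, Callable, Dict, List

Event = Dict[str, Any]  # e.g. {"device": "alarm_clock", "type": "alarm_fired"}

@dataclass
class Rule:
    name: str
    condition: Callable[[Event], bool]   # the "if"
    action: Callable[[], None]           # the "then"

def handle(event: Event, rules: List[Rule]) -> None:
    """Run the action of every rule whose condition matches the event."""
    for rule in rules:
        if rule.condition(event):
            rule.action()

# "If the alarm clock fires, start the coffee pot."
rules = [
    Rule(
        name="alarm starts coffee",
        condition=lambda e: e["device"] == "alarm_clock" and e["type"] == "alarm_fired",
        action=lambda: print("coffee_pot: brewing"),
    ),
]

handle({"device": "alarm_clock", "type": "alarm_fired"}, rules)  # -> coffee_pot: brewing
```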

Think of Facebook, which has famously described the underlying data it owns as a social graph: the knowledge of who is connected to whom, and how.

Would you want to automate all of these relationships?

Imagine a world where every one of us carries a sensor: "presence" tags, low-energy radio IDs that sit on our keychains or belt loops, announce our location and verify our identity.

This is the principle behind Square Wallet and a number of other nascent payment systems, including ones from PayPal and Google. (When you walk into a participating store today, Square can let the cashier know you’re there; you pay simply by giving your name.)

A tracking tool that monitors not just your pet’s movements, but your movements.

GPS reliably knows our location to within 100 feet, give or take, and that knowledge has transformed, and is still transforming, our lives immeasurably: turn-by-turn driving directions, local restaurant recommendations, location-based dating apps, and so on.

With presence technology, Google already has the potential to know our location absolutely, down to a foot or even a few inches. That means knowing not merely which bar your friend is at but which couch she's sitting on when you walk through the door.

It means receiving a coupon for a grocery item on the endcap at the moment you walk by.

Think about a liquor cabinet that auto-populated your shopping list based on the levels in the bottles—but also locked automatically if your stock portfolio dropped more than 3 per cent.

Think about a home medical monitoring system that didn't just feed back data from diabetic patients but adjusted the treatment regimen as the data demanded.

Think about how much more intelligent your sprinklers could be if they responded to the weather report as well as to historical patterns of soil moisture and rainfall.

It does not stop there: think about applications built on top of these connected objects.

This means not just tying together the behaviour of two or more objects—like the sprinkler and the moisture sensor—but creating complex interrelationships that also tie in outside data sources and analytics. 

Plugged into that information, your system wouldn't just know how much water is in the soil; it could predict how much there will be, based on whether it is going to rain or the sun will be baking hot that day.
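
As a concrete, hypothetical sketch of that idea, the snippet below combines a soil-moisture reading with a rain forecast before deciding whether to water. The thresholds and the forecast format are invented for illustration, not taken from any real irrigation product.

```python
# Hypothetical sketch: a sprinkler controller that looks at the forecast
# as well as the soil sensor, instead of reacting to the sensor alone.

def watering_minutes(soil_moisture_pct: float, rain_probability: float) -> int:
    """How long to water, given soil moisture (0-100) and the chance of rain (0.0-1.0)."""
    if soil_moisture_pct >= 60:
        return 0                      # soil is already wet enough
    if rain_probability >= 0.7:
        return 0                      # rain will probably do the job for free
    deficit = 60 - soil_moisture_pct  # how far below the target we are
    return min(30, int(deficit))      # cap the run time at 30 minutes

print(watering_minutes(soil_moisture_pct=35, rain_probability=0.1))  # -> 25
print(watering_minutes(soil_moisture_pct=35, rain_probability=0.9))  # -> 0
```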

It means walking through an art museum and having your phone interpret the paintings as you pause in front of them.

This simple link, between a tag on us and a tag in the world, stands to become the culmination of the location revolution, delivering on all the promises it hasn't quite fulfilled yet.

The treasure that it digs up could be considerable.

This is obviously true for retailers:

It’s a future where the intelligence once locked in our devices will now flow into the universe of physical objects. Users and developers can share their simple if-then apps and, in the case of more complex relationships, make money off of apps, just like in the mobile marketplaces.

All of it will be processed in the cloud, in a language unheard of.

On Google Maps, you can now navigate inside certain airports and stores, with Wi-Fi triangulation helping out your GPS. 

And according to a mobile couponing firm called Koupon Media, some 80 per cent of customers who buy gas at one major convenience-store chain never walk inside the store, so presence-based coupons could make a huge impact on the bottom line.

But it’s also true for our everyday lives. Have you ever lost an object in your house and dreamed that you could just type a search for it, as you would for a wayward document on your hard drive? With location stickers, that seemingly impossible desire has become a reality:

A startup called StickNFind Technologies already sells these quarter-sized devices for $25 apiece.
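
Stickers of this kind are, in practice, small Bluetooth Low Energy beacons, and an app typically turns the received signal strength (RSSI) into a rough distance estimate using the common log-distance path-loss model. A minimal sketch follows; the calibration values are illustrative assumptions, not StickNFind's actual figures.

```python
# Rough sketch of how a phone app can guess how far away a BLE location
# sticker is, from its received signal strength (RSSI).
# Calibration constants are illustrative; indoor estimates are very approximate.

def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m_dbm: float = -59.0,
                        path_loss_exponent: float = 2.5) -> float:
    """Estimate distance in metres using the log-distance path-loss model."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(round(estimate_distance_m(-59), 1))  # ~1.0 m
print(round(estimate_distance_m(-75), 1))  # ~4.4 m
```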

Think about a thermostat app pulling in readings from any other device on that platform: motion sensors that might say which room you're in, presence tags that identify individual family members (with different temperature preferences), as well as outside data sources like the weather or variable power prices.

An even more natural category for apps is security. Your house locks itself up, shuts down the lights and thermostat, and activates an alarm system complete with siren, flashing lights, auto-notifications and even an on-call platoon of off-duty cops, all coordinated through a platform like SmartThings.

This, finally, is the Programmable World, the point at which the full power of developers, entrepreneurs, and venture capitalists is brought to bear on the realm of physical objects: improving it, customizing it, and groping toward new business plans for it that we haven't dreamed of yet. Indeed, it will marshal all the forces that made the Internet so transformational and put them to work on virtually everything around us.

However, there are obviously some pitfalls lurking in this future of connected objects.

So, as a sanity check:

Our fears about malicious hackers preying on our email and bank accounts via the cloud might pale in comparison to how we’ll feel about those same miscreants pwning our garage doors and bathroom light fixtures.

The mysterious Stuxnet and Flame exploits have raised the issue of industrial security in the era of connected devices.

Vanity Fair recently detailed nightmare scenarios in which hackers could hit connected objects, from our high tech cars (university researchers have figured out how to exploit an OnStar-type system to cause havoc in a vehicle) to our utility “smart meters” (which collect patterns of energy use that can reveal a great deal about our activities at home) to even our pacemakers.

The idea of animating the inanimate, of compelling the physical world to do our bidding, has been a staple of science fiction for half a century or more.

No, the main existential threat to the Programmable World is the considerably more mundane issue of power. Every sensor still needs a power source, which in most cases right now means a battery; low-energy protocols allow those batteries to last a long time, even a few years, but eventually, they’ll need to be replaced.
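
The arithmetic behind that battery anxiety is simple. As an illustrative back-of-the-envelope figure (a typical coin cell and a typical average draw for a low-energy beacon, not the numbers for any particular product):

$$\text{lifetime} \approx \frac{\text{battery capacity}}{\text{average current draw}} = \frac{220\ \text{mAh}}{0.010\ \text{mA}} = 22{,}000\ \text{hours} \approx 2.5\ \text{years}$$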

Just as with social networking, the privacy concerns of a sensor-connected world will be fast outweighed by the strange pleasures of residing in a hyperconnected world.

A bigger concern, perhaps, is simple privacy. Just because we've finally warmed up to oversharing in the virtual world doesn't mean we'll be comfortable doing the same in the physical world, as all our interactions with objects capture more and more data about where we are and what we're doing.

What’s coming is ubiquitous connectivity that will accelerate how people collaborate, share, learn, gather, do business, and exchange knowledge.

There will one day be universal access to all human knowledge by everyone on the planet.
So, based on our collective knowledge, will we be able to act as one?
How will you use global connectivity to enhance our lives?
We automatically sort people into “like us” or “not like us.”
We are currently in a new era, combating mass species extinction and climate change along with a virus pandemic, all of which bring humans and the natural world together as one.
Humanity as a whole needs to be united if we are to preserve what’s left on Earth.
Roughly one in three people in medieval Europe died in the Black Death, and they had no idea why it was happening.
As a result, they had no responsibility, because they didn’t know.
Our problem is that we do know, and therefore, we have absolute responsibility.
We have only a very small window, and if we don't use that window in the next ten years, not the next thirty or fifty, connectivity will be the least of our worries.
In November this year, the world will descend on Scotland, and states from across the globe will be given a choice between cooperating or continuing as they have until now.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE SAY’S; HERE ARE THE BIG QUESTIONS THAT ARE YET TO COME WHEN IT COMES TO TECHNOLOGY.

19 Wednesday Feb 2020

Posted by bobdillon33@gmail.com in #whatif.com, 2020: The year we need to change., Algorithms., Artificial Intelligence., Big Data., Digital age., DIGITAL DICTATORSHIP., Digital Friendship., Fourth Industrial Revolution., Google, Google it., Google Knowledge., Human values., Humanity., Life., Modern day life., Our Common Values., Reality., Sustaniability, Technology, Technology v Humanity, The common good., The Future, The Obvious., The state of the World., The world to day., Unanswered Questions., War, WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., World Leaders

≈ Comments Off on THE BEADY EYE SAY’S; HERE ARE THE BIG QUESTIONS THAT ARE YET TO COME WHEN IT COMES TO TECHNOLOGY.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Capitalism and Greed, Distribution of wealth, Inequility, Technology, The Future of Mankind, Visions of the future.

 

Thirty-minute read.

Who owns what?  What’s our purpose in life?  What are the values that we believe in? How do we think and make decisions?  What do we mean by work?  Can our work ever have true meaning unless it is to serve others?

What will help us all think deeply about the questions we need to ask and answer?

Climate change or technology.

However, for many of us, the answers to these questions differ in our working lives, compared with our personal lives, with family, friends and neighbours.

What if a ruling elite like Google were to impose a command-and-control, fear-driven culture in which power is abused and the outcome is social and economic misery for the vast majority?

Our reaction, if we are to go by what is now observable, will be: "So what? Now what?"

Making sure companies compete fairly is a tricky business. The firms being regulated know far more about their business than those doing the regulating.

"Artificial intelligence is the future, not only for Russia but for all of humankind," says Putin. "Whatever country comes to dominate this technology will be the ruler of the world."

His rhetoric is entirely appropriate. Automation and digitalization have already had a radical effect on international systems and structures.

Technology can be described as the application of scientific knowledge to the practical problems we are experiencing in the world today.

On the other hand, the industry's core strategy is to gobble up market share with profit-seeking algorithms.

Our environments are so full of technology that most of the time we take it for granted.

So are we all becoming personified idiots?

Technology has a great impact on all the fundamental aspects of all our cultures including laws and how they are enforced, language, art, health care, mobility, education and religion.

The obvious problem with all of this is that countries will not own or be in control of the technologies.

While we all sit back and accept the benefits of technology, it also brings manipulation on a worldwide scale, with our future in the hands of only a handful of corporations and the vast number of people who are okay with that.

It’s hard to argue against innovation. It’s hard to argue against greater choice, more convenience and lower prices.

One way or the other, it is also hard to overstate how fundamentally different the rules are that Google, Amazon, Facebook, Apple and Baidu play by.

They hide behind the forked rhetoric that the data they collect does no harm because it is anonymous.

They do not need to know who you are. It is enough to know what you consume, your habits, your tastes and where you are, through your IP address, the GPS of your mobile, or your Google account. Your name, or your phone number, is not important to sell you things.
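
A small, entirely hypothetical sketch makes the point: a profile keyed only to a device or advertising identifier is perfectly usable for targeting, with no name anywhere in sight. The identifier, events and topics below are invented.

```python
# Hypothetical sketch: how a marketable profile accumulates around a device
# identifier without ever needing a name.

from collections import Counter

events = [
    {"ad_id": "device-7f3a", "kind": "search",   "topic": "running shoes"},
    {"ad_id": "device-7f3a", "kind": "visit",    "topic": "running shoes"},
    {"ad_id": "device-7f3a", "kind": "search",   "topic": "knee pain"},
    {"ad_id": "device-7f3a", "kind": "location", "topic": "gym"},
]

profile = Counter(e["topic"] for e in events)  # interests keyed only to the ad_id
print("device-7f3a ->", profile.most_common())
# -> [('running shoes', 2), ('knee pain', 1), ('gym', 1)]
# Enough to target ads at an "anonymous" ID that is, in practice, you.
```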

Blurring the borders of privacy. Replacing real-life communication.

And on top of that, violent games and videos kill empathy and bring destruction into individual lives. Plagiarism and cheating increase while analysis and critical thinking decline, ending in social isolation.

(We now have a perverse new form of sexual harassment, cyber-flashing, which is not against any law. Why? Because our laws cannot keep up with the speed of change.)

Commercial technology like smartphones, iPads, home Alexa/Echo devices and their like is about creating another consumer touchpoint for a robust ecosystem of e-commerce, services and media, taking advantage of less sophisticated consumers and tricking them into consuming items for short-term satisfaction and long-term pain.

Originally created to serve humanity faithfully, digital devices are revealing their harmful impact on our lives.

We should all be careful what we wish for.

There’s an argument made by big corporations for each country to charge corporations the lowest possible tax rate, to loosen environmental regulations down to zero, and to eliminate employee protections. All so that a country’s commodity producers can be the cheapest ones.

The voice market war has only just begun.

The contenders:

Amazon Echo (Alexa) v Google Home (Google Assistant).

Once they figure out how to improve their recommendations and push more people to make regular household purchases via voice, it will lead to an explosion in voice-based shopping.

Google already has one of the most valuable brands in the world.

Google Maps has virtually no meaningful rival. Gmail… Google basically controls our handheld existence.

Google controls your life, literally, even if you find that hard to believe.

Google trackers have been found on 75% of the top million websites.

When you search on Google, they keep your search history forever.

Google is a company that offers almost all its products for free because the money is earned by selling the data it collects with those products, to advertisers and companies.

Last year Google made over $161 billion in total revenues.

As it is the premier search engine in the U.S., Europe, and many developing countries Google has the tools to control much of the world.

That’s just Google then you have Amazon.

With around 225 million customers around the world, Amazon wants to deliver everything you want to your doorstep, including food, anywhere in the world (roughly 300 items a second). These days half of all product searches start on Amazon.

Our lust for cheap, discounted goods delivered to our doors promptly and efficiently has a price.

Amazon has done a lot of good for consumers by expanding choice, making shopping far more convenient and by delivering extraordinary product value.

Yet, we can’t–and shouldn’t–ignore the profound effect that Amazon is having on just about every corner of the retail world they set their sights on.

Amazon is selling its facial recognition technology, known as Rekognition, to law enforcement agencies.

First and foremost, Amazon isn’t required by its investors to make any real money.

For us, the great unwashed, there's always the opportunity to cut a corner, sacrifice lifestyle quality and suck it up as they race to grab a little more market share.

With their algorithms, they tell you what restaurants you have to eat in, choose your music, label your photos by associating them with each family member or friend who appears in them, pay for your purchases, and suggest the movies you should see and the apps that may interest you.

When in fact the searches we do, the websites we visit, the products we look at, where we are, our medical history, our political beliefs, our associations with others, our employment prospects, everything from the womb to the grave, is collected and analysed.

Before I hear you calling me a hypocrite I also have used Amazon.

If this scenario prevails, would this be really the way information is supposed to be organized?

In short, does the fact that an algorithm is able to provide more relevant information than a human justify this scenario?

These big-brand platforms are more powerful than governments. They're wealthier. If they were countries, they would be pretty large economies. They're multinational, and the global financial situation allows them to ship money all over the world.

Can we do anything to make a difference?

We need to be supporting the development of an efficient circular economy.

Why?

Because sustainability is an unstoppable force.

Let’s not race to the bottom.

A country's population size will become less important for national power, as small countries that develop a significant edge in AI technology will punch far above their weight.

Ultimately, however, winning and losing will not be determined by which country gains the most growth through AI. It will be determined by how the entire global community chooses to leverage AI — as a tool of war or as a tool of progress.

You can eliminate rules protecting clean water, air or consumer safety, but someone will always find a way to be cheaper or more brutal than you.

We all assume that Google, Facebook, Amazon and Apple are spying on our activity. Up to now the advertiser has not been interested in your name; when they are, it will be too late, and the winner will be inequality.

So what does all this mean and what are we all going to do about it when we’ve stopped talking about it?

Once you start to connect all the invisible dots, the impact on society will, in the end, come down to the people who use the technology. They have to be responsible for it, and if they use it irresponsibly they have to be held accountable.

A Footnote:

For me, there is little point in Jeff Bezos setting up an Earth fund when Amazon is one of the biggest promoters of pollution, while pretending to be a do-gooder.

The brown box doesn't begin to address the larger issue: each year in the United States alone, the paper thrown in the trash represents approximately 640 million trees, or roughly 915,000 acres of forest land.

Amazon ships an average of 608 million packages each year, which equates to (an estimated) 1,600,000 packages a day.

Then when we talk about energy consumption, we’re talking about the sources of energy that generate our power: oil, coal, natural gas and alternatives like solar, wind, hydropower and biofuels.

How much electricity they use, and what the bill is, God only knows, so it's no wonder that they have contracts with oil and gas companies.

Now consider that people conduct over 1.6 billion searches per day, and you get a massive energy footprint of roughly 12.5 million watts.

Is e-commerce reducing or increasing our carbon footprint?

Google's operations collectively use about 2.26 million megawatt-hours per year to power its global data centres, which is roughly the energy needed to sustain 200,000 homes.
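
That equivalence is at least internally consistent. As a rough check, assuming a typical household uses on the order of 11 MWh of electricity a year:

$$\frac{2{,}260{,}000\ \text{MWh/year}}{200{,}000\ \text{homes}} \approx 11.3\ \text{MWh per home per year}$$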

In 2018 Google generated $39.12 billion in earnings, out of which it paid $243 million a day for electricity.

This is only an educated guess.

The link between global warming and energy demands is obvious. Surely both of these players should be investing in Green energy.

There’s a deafening silence from pundits and elites and columnists and politicians on our joint self-destruction.

They are simply going on pretending it isn’t happening.

We don’t, as societies or cultures, value learning or knowledge or magnanimity or great and noble things, anymore.

The average person has become a tiny microcosm of the aspirations and norms of elites. We’re the only people on earth who thwart our own social progress, over and over again — and cheer about it.

We are caught in a death spiral now. A vicious cycle from which there is probably no escape. The average person is too poor to fund the very things — the only things — which can offer him a better life:

The result is that a whole society grows poorer and poorer.

Unable to invest in themselves or one another, people’s only real way out is to fight each other for self-preservation, by taking away their neighbour’s rights, privileges, and opportunities — instead of being able to give any new ones to anyone.

Though it's too late for them to escape, let us hope our governments regulate their profit-seeking algorithms.

All human comments appreciated. All like clicks chucked in the bin.


THE BEADY EYE SAY’S: WE CAN NO LONGER BE CERTAIN ABOUT ANYTHING.

06 Thursday Feb 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Digital age., Fourth Industrial Revolution., Human values., Life., Technology, The essence of our humanity., The Obvious., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE SAY’S: WE CAN NO LONGER BE CERTAIN ABOUT ANYTHING.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Technology, The Future of Mankind

 

Twenty-four-minute read.

We have no idea how the world will look in twenty years, never mind fifty, when most of this generation will be in their seventies, but it is now becoming clear beyond any doubt that AI and its algorithms are drastically changing the world we live in, both for good and for bad.

One thing is certain: artificial intelligence is having, and will have, a more profound effect than electricity or fire.

It will not just hack our lives and our brains; it will hack our very existence.

It might well warn us about climate change or the coronavirus, but it will also, as it already does, manipulate our needs, wants and beliefs. It will effectively be controlling people for both commercial and political purposes.

Given the force of this technology, to retain any control we need meaningful regulation; if not, we might as well just surrender to the algorithms, which are becoming so complex that no one will understand them.

The reality is that most of us are giving rivers of free information to Big Data, to the extent that we will soon be unable to think for ourselves.

If we don't get a grip, then by the time you reach your seventies your future and the future of the next generations will be decided at random by unelected platforms.

If this is so, decision-making for all of us becomes a thing of the past.

The outlook for AI is both grim and exciting.

Already we see data collected affecting elections, with our ability to know what is fake and what is true at the mercy of Social Media run by algorithms.  

We all know their faces: Google, Microsoft, Amazon, Facebook, Apple, Baidu in China, Twitter, Alibaba, to name a few who are already transforming the basic structure of life.

Taken together they form a global oligopoly.

These unregulated platforms are competing for dominance all with a conflict of interests. Hence their algorithms.

The chance of them introducing self-regulation that would affect or stop their development is pie in the sky.

It's time we stopped thinking about AI in purely scientific terms.

Why?

Because algorithms are making critical decisions about our lives and the tech-driven approach to governance is growing. Because any particular scenario will be far from what we think is true today and we are running out of time to do anything about it.

If we are to call a spade a spade, there is little or no understanding of how to regulate these emerging technologies. Even if there were, the governments and world institutions that could do so are largely unequipped to create and enforce any meaningful regulation for the public benefit.

The problem is: how does one regulate an algorithm that learns?

You might be happy ceding all authority to algorithms and their owners; if so, you don't have to do anything, they will take care of everything.

If not, remember that algorithms are watching you right now to ensure that you do not read this post, and if you do read it they will use one of the most powerful tools in their big-data arsenal: split and divide, false news and the repetition thereof, for example.

It won't happen overnight, since development cycles often take years, but our collective past will become a less reliable guide and we will have to adapt to the unknown.

Unfortunately, teaching the unknown without mental balance is a disaster in waiting.

It might be easy to laugh at this now, but unless we make our voices heard, rather than just clicking like, it will not be climate change that changes us but the racial bias that is already programmed into the tech world.

We're starting to see many examples where these algorithms are prone to the kinds of biases and limitations that we see in human decision-making, and increasingly we are moving towards algorithms that are learning more and more from data.

 I say, learning from this data almost institutionalizes the biases.

Why?

Because they are trying to personalize the media they curate for us. They’re trying to find for us more and more of the kinds of content that we already consume.  
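
The feedback loop is easy to caricature in a few lines of code. The toy recommender below simply keeps serving whatever the user has already clicked; the topics and the loop are invented for illustration, and real systems are vastly more sophisticated, but the narrowing dynamic is the same.

```python
# Toy sketch of a personalization feedback loop: the recommender keeps
# serving more of whatever the user already clicked.

from collections import Counter

catalogue = ["politics-left", "politics-right", "sport", "science"]
clicks = ["politics-left"]                     # what the user engaged with so far

for _ in range(5):
    interests = Counter(clicks)
    # rank items purely by how often the user already clicked that topic
    next_item = max(catalogue, key=lambda topic: interests[topic])
    clicks.append(next_item)

print(Counter(clicks))
# -> Counter({'politics-left': 6}); sport and science never get a look in.
```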

So what if anything can be done?

Even if we do eventually introduce regulations they will have little effect unless we find a way of sharing the benefits of AI.

The problem is that our institutions and our education models are not able to keep up with the developments in artificial intelligence. We are becoming more and more detached from decision-making, contribution and reward.

So our governments are leaving it to the market, to the big tech companies themselves.

If you are expecting some kind of warning when AI finally gets smarter than us, then think again.

I say our algorithms are hanging out with the wrong data: profit for profit's sake.

In reality, our electronic overlords are just getting started, with smartphones, iPads, Alexa and the like taking control. We have to think about other measures: is there a social contribution, and what is the impact of this algorithm on society?

This requires transparency.

But how do you create transparency in a world that is getting so complex?

Here is my solution.

Pharmaceuticals are considered among the most highly regulated industries worldwide, and every country has its own regulatory authority when it comes to the drug development process.

(The World Health Organization (WHO), Pan American Health Organization (PAHO), World Trade Organization (WTO), International Conference on Harmonization (ICH) and World Intellectual Property Organization (WIPO) are some of the international regulatory agencies and organizations which play an essential role in all aspects of pharmaceutical regulation related to drug product registration, manufacturing, distribution, price control, marketing, research and development, and intellectual property protection.)

Why not put in place a new world governing body to test and control AI algorithms, to act as a guardian of our basic human values?

If this is not done it will remain impossible to truly cooperate with an AI or a corporation until such entities have values in the same sense that we do.

So:

All companies already using algorithms should be legally required to submit the software running those algorithms for audit by an independent team, to ensure that our human values are complied with.

This audit could be done by a United Nations programme that is agreed worldwide.

The audit process (because algorithms are constantly evolving as they gather more data) has to be somewhat continuous, repeated at regular intervals in the way other control techniques are. We might also need an algorithm to monitor the auditing algorithm, to ensure it is not contaminated while it goes through its refresh cycle of the algorithm it is auditing.
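
What might one small piece of such an audit look like in practice? A minimal, hypothetical sketch is below: it compares an audited system's approval rates across demographic groups (a demographic-parity check). The logged decisions, group labels and tolerance threshold are all invented; a real audit would run many such tests, and fairness has many competing definitions.

```python
# Minimal sketch of one check an independent audit team might run:
# comparing approval rates across groups in the audited system's decision log.

from collections import defaultdict

decisions = [  # (group, approved?) as logged by the audited system (invented data)
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, approvals = defaultdict(int), defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    approvals[group] += approved

rates = {g: approvals[g] / totals[g] for g in totals}
print(rates)                                   # {'group_a': 0.75, 'group_b': 0.25}

gap = max(rates.values()) - min(rates.values())
print("flag for review" if gap > 0.2 else "within tolerance")  # gap = 0.5 -> flagged
```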

Then they must be made transparent with a certification of acceptable behaviour.   

Transparency for end users actually is very basic.

It’s not like an end-user wants to know the inner details of every algorithm we use.

But we would actually benefit knowing what’s going on at the high level.

For example, what kinds of data are being used by the algorithms to make decisions?

Recommended transparency measures:

Keeping in mind that these algorithms are being deployed and used by humans, and for humans, anyone impacted by decisions made by algorithms should have the right to a description of the data used to train them, and details as to how that data was collected.
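
In code terms, that "description of the data" could be as unglamorous as a short, machine-readable record published alongside the system. The sketch below is hypothetical, with invented fields, loosely in the spirit of the published "datasheets for datasets" and model-card proposals.

```python
# Hypothetical sketch of a plain training-data description that could
# accompany an algorithmic decision. Fields and values are invented.

training_data_description = {
    "purpose": "prioritise housing applications",
    "data_sources": ["application forms", "caseworker notes"],
    "collection_period": "2015-2019",
    "collected_with_consent": False,      # the fact people most want to know
    "features_used": ["household size", "income band", "time on waiting list"],
    "features_excluded": ["ethnicity", "postcode"],
    "known_gaps": "under-represents applicants without internet access",
}

for field, value in training_data_description.items():
    print(f"{field}: {value}")
```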

The public have little understanding of, or access to, information about how governments are using data, much of it collected quietly, to feed the algorithms that make decisions about everyday life. And, in an uncomfortable twist, the government agencies themselves often do not fully understand how algorithms influence their decisions.

Having more and more data alone will not solve the problems of gender bias and racial bias.

Perhaps the notion of control may only be an illusion.

It won't be long before they are latching on to life forms. For example, there is a type of machine-learning algorithm known as a neural network, modelled loosely on the human brain. These algorithms take in lots of data and learn to make the kinds of decisions humans have made.
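
For readers who have never seen one, here is roughly what "modelled on the human brain" amounts to in practice: layers of weighted sums passed through simple non-linear functions. The weights below are random and untrained, so the output is meaningless; the sketch only shows the structure.

```python
# Minimal sketch of a neural network's forward pass: weighted sums followed
# by simple non-linearities. Untrained random weights, purely illustrative.

import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)      # 3 input features -> 4 hidden units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)      # 4 hidden units -> 1 output score

def forward(x: np.ndarray) -> float:
    hidden = np.maximum(0, x @ W1 + b1)            # ReLU "neurons"
    score = 1 / (1 + np.exp(-(hidden @ W2 + b2)))  # squash to a 0-1 "decision"
    return score.item()

print(forward(np.array([0.2, -1.0, 0.5])))         # some arbitrary value between 0 and 1
```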

I think this field hasn’t yet emerged.

Humans aren’t changing that much. But the algorithms, the way they’re created, the technological side of it, continues to change, continues to evolve. And trying to keep those things in sync seems to be the greatest challenge.

We live in a world where algorithms are deciding who gets what; the questions are how machine decisions are made, and how humans and machines can work together.

We are going to use these systems so much that we have to understand them at a deeper level, and we can't be passive about it anymore, because the consequences are very significant, whether we're talking about democracy and the curation of news stories for citizens, use by doctors in medicine, use in the courtroom, and so on.

As we roll out algorithms in more and more important settings, it is going to be extremely important that we start understanding what drives trust in these machines, and what the socially important outcomes of interest are, so that we measure these algorithms against those outcomes, fairness among them.

Given everything we know about the world, and indeed the universe as a whole, does anyone seriously believe that nationalism and populism will help us with this technological problem?

Let’s talk about what data are collected about us.

It is far too late to be talking about privacy; that is what gets abused.

Let's fight against everything we can control that limits our freedom, whether it's an algorithm, a hungry judge or a greedy state backing the wrong econometric model…

We need to rethink how we do education, we have to rethink how we do regulation, and firms also need to stand up and do a better job of auditing and taking responsibility as well.

Of course, none of this will happen.

Humans are more likely to be divided between those who favour giving algorithms and AI significant authority to make decisions and those opposed to it, with both sides justifying their position while algorithmic logic drives greed and inequality to the point where we lose control of transparency completely.

To stay relevant, as Yuval Noah Harari says in his book 21 Lessons for the 21st Century, we will need to keep asking the questions "how am I, where am I".

Their testing rarely goes beyond technical measures, which is causing society to become more polarized, making it less likely that we can appreciate other viewpoints.

Just knowing that an algorithm made the decision would be a good place to start.

Was an algorithm used?

If so who does it belong to?

What kinds of data did the algorithm use?

Today, algorithms and artificial intelligence are creating a much more frightening world. High-frequency trading algorithms already rule the stock exchanges of the world.

Personally, I would neither overestimate nor underestimate the role and threat of algorithms. Behind every smart web service is some smarter web code.

So we need to make sure their design is not only informed by knowledge of the human users, but that knowledge of their design is also suitably conveyed to those users, so we don't eliminate the human from the loop completely.

If not, they will become a black box, even to the engineer.

All our lives are constrained by limited space and time, limits that give rise to a particular set of problems that are being exploited by profit-seeking algorithms. 

There’s so much data out there to be analyzed. And right now it’s just sitting there not doing anything. So maybe we can come up with a solution that will at least get us started on it.

It is a fascinating topic because there are so many variables involved.

All human comments appreciated. All like clicks and abuse chucked in the bin. 


THE BEADY EYE ASK’S: WHY IS IT THAT WE ARE ALLOWING ALGORITHMS TO RUN OUR LIVES.

28 Tuesday Jan 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Climate Change., Digital age., DIGITAL DICTATORSHIP., Evolution, Fourth Industrial Revolution., HUMAN INTELLIGENCE, Human values., Humanity., Inequality, Innovation., Modern day life., Post - truth politics., Reality., Robot citizenship., Sustaniability, Technology, The common good., The Future, The Internet., The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., WiFi communication.

≈ Comments Off on THE BEADY EYE ASK’S: WHY IS IT THAT WE ARE ALLOWING ALGORITHMS TO RUN OUR LIVES.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Distribution of wealth, Inequility, Technology, The Future of Mankind, Visions of the future.

(Twenty-minute read)

Technology is getting increasingly personal.

With algorithms becoming the masters of social media, are we all just becoming clickbait?

Devices are providing immediate information about our health and about what we see, where we go and where we have been.

Our lives are being shaken to their very core.

With 5G technology, what we experience at the moment will pale in comparison to the vast array of possibilities carried by this new generation of wireless connectivity, which is being built on the foundations of the previous one.

It will allow millions of devices to be connected simultaneously.

All stakeholders – business, government, society and individuals – will have to work together to adjust so these technologies and rapid changes are harnessed for the development of all, not just profit.

Swathes of the globe will be left behind.

Regardless, it is no longer just about repetitive factory jobs; it is about an increase in inequality globally.

Ensuring that such a scenario does not happen is not only a moral imperative; such a scenario would pose a risk to global stability through channels such as global inequality, migration flows, and even geopolitical relations and security.

We already live in a world that has been profoundly altered by the Fourth Industrial Revolution. Yet there is not much debate on the likely size of the impact.

Why?

Because there are such divergent views it is difficult to measure.

But within the next decade, it is expected that more than a trillion sensors will be connected to the internet. By 2024, more than half of home internet traffic will be used by appliances and devices that are connected to internet platforms.

With almost everything connected, it will transform how we live, never mind how we do business.

If there is no trusted institution to regulate it, we can kiss our arses goodbye.

Now is the time to make sure it is changed for the better.

The internet of things will create huge amounts of data, raising concerns over who will own it and how it will be stored. And what about the possibility that your home or car could be hacked?

The internet is great for ideas, but ultimately, the things that will amaze you are not on your computer screen.

Artificial intelligence may well invent new life forms, but if we as humans do not contrive and manage globally acceptable ethical parameters for all its forms (bioengineering, gene editing, nanotechnology, and the algorithms that run them), we are worse than idiots.

As Yuval Noah Harari says in his most recent book (21 Lessons for the 21st Century), "there is no such thing as 'Christian economics', 'Muslim economics' or 'Hindu economics'", but there will be algorithm economics, run by Big Brother.

The digital age has brought us access to so much information in just a few clicks of the mouse or the remote control, everything from the news to TV programmes, with the internet coming to glorify sensationalism rather than give us the truth.

The question is.

Are the technologies that surround us tools that we can identify, grasp and consciously use to improve our lives?

Or are they more than that:

Powerful objects and enablers that influence our perception of the world, change our behaviour and affect what it means to be human?

What can we do?

The Second and Third Industrial Revolutions have led us to this revolution, the Fourth Industrial Revolution, which can be described as the advent of "cyber-physical systems" involving entirely new capabilities for people and machines.

Unlike previous revolutions, it is not the world as a whole that will see its benefits or disadvantages; it is individuals and groups that could win, or lose, a lot.

Unfortunately, expanded connectivity does not necessarily lead to expanded or more diverse worldviews; with our increased reliance on digital markets, it will be the opposite.

At the moment it's just not very evenly distributed, nor will it be.

At best we can moan about it and hope that climate change shifts our reliance on biomass as primary sources of energy.

Back to Clickbait.

The issue with clickbait is that the reader or site visitor is being manipulated into clicking something that is misleading.

Clickbait is not one-dimensional. Each time you run a Google search, scan your passport, make an online purchase or tweet, you are leaving a data trail behind that can be analysed and monetized.

Most clickbait links forward a user to a page that requires payment, registration or a series of pages that help drive views for a specific site.

It can also point to any web content that is aimed at generating online advertising revenue.

We're all guilty of gullibly clicking links online, but clickbait websites are notorious for spreading misinformation and creating controversy in the name of generating hits.

Have you never felt that you're being played for a fool whenever you watch the news or scroll through a media site?

Thanks to supercomputers and algorithms, we can make sense of massive amounts of data in real time. Computers are already making decisions based on this information, and in less than ten years computer processors are expected to reach the processing power of the human brain: a convergence of the digital, physical and biological spheres that challenges our notion of what it means to be human.

Today, 43% of the world’s population is connected to the internet, mostly in developed countries.

Cooperation is “the only thing that will redeem mankind”.

We can use the Fourth Industrial Revolution to lift humanity into a new collective and moral consciousness based on a shared sense of destiny, at least until 6G, or living robots, come along.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASK’S: IS THE ACCELERATING TECHNOLOGY AND THE ONGOING REVOLUTION IN INFORMATION MAKING THE WORLD SO COMPLICATED IT IS NOW BEYOND OUR UNDERSTANDING.

11 Saturday Jan 2020

Posted by bobdillon33@gmail.com in Algorithms., Artificial Intelligence.

≈ Comments Off on THE BEADY EYE ASK’S: IS THE ACCELERATING TECHNOLOGY AND THE ONGOING REVOLUTION IN INFORMATION MAKING THE WORLD SO COMPLICATED IT IS NOW BEYOND OUR UNDERSTANDING.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Community cohesion, Distribution of wealth, Earth, Technology, The Future of Mankind, Visions of the future.

 

 

The plain truth can often be so obvious as to be invisible.

There are so many obstacles to change on the scale we so desperately need.

We are fast reaching a point where no human can, or will be able to, understand the world we live in.

We pass this way just once.

Artificial-intelligence algorithms are taking over.

Yuval Noah Harari in his latest book ( 21 lessons for the 21st Century) puts his finger on the problem.

"In the coming century biotech and infotech will give us the power to manipulate the world inside us and reshape ourselves, but because we don't understand our own minds, the changes we will make might upset our mental system to such an extent that it too might break down."

Surely it's time we stopped being the free fodder that feeds big data. It's much harder to struggle against irrelevance than against exploitation.

What will be the point to education if algorithms make us redundant?

It is difficult to discern, worldwide, whether there is any sincere conversation on AI ethics.

Is it being addressed by any of the big tech companies or are they just giving token nods to what is right or wrong, while taking advantage of all human beings out there?

Or is there just pushback from outside organisations?

What we are witnessing is their profit growth while economic disparity worldwide increases at a startling rate. This certainly rings true if one looks at the state of the world, with people judged by their wealth.

So what is the ethics of creating a sentient life form on a planet that is burning?

Perhaps it will be for the best if we continue not to understand the planet we all live on and leave it to AI to sort us out.

Or can we now start contributing to better governance solutions?

If we don’t grasp the nettle soon there will be no coming back.

To have any relevance now and in the future, we need billions to take to the streets to demand the sustainability of our planet (humans voting with their feet, not on social media) before profit-making goes underground.

When it comes to making the world a better place, corporations are often accused of apathy (the flip-side of blind self-interest). But if consumers are truly committed to social change, they must answer the same challenge.

If we can get consumers to make mindful shopping choices, to support brands that act responsibly and to purchase goods from those that dedicate a portion of the sale proceeds to causes, we are well on our way to re-purposing everyday purchases.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASK’S: ARE PROFIT SEEKING ALGORITHMS BUILDING A DIGITAL POORHOUSE, AUTOMATING INEQUALITY WHILE HURTING THE MOST VULNERABLE.

03 Sunday Nov 2019

Posted by bobdillon33@gmail.com in Algorithms., Artificial Intelligence., Capitalism, Fourth Industrial Revolution., Humanity., Inequality, Modern day life., Our Common Values., Poverty, Reality., Technology, The common good., The Future, The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE ASK’S: ARE PROFIT SEEKING ALGORITHMS BUILDING A DIGITAL POORHOUSE, AUTOMATING INEQUALITY WHILE HURTING THE MOST VULNERABLE.

Tags

Algorithms., Artificial Intelligence., Big Data, Capitalism and Greed, Distribution of wealth, Greed, Inequility, Technology, Visions of the future.

(Twenty-minute read)

Should we worry about the rise of artificial intelligence or celebrate it?

Both is the answer.

We all inhabit this new regime of digital data but we don’t all inhabit it in the same way and the pursuit of rapid growth by way of technology won’t solve the huge challenges we face.

A more honest, humane approach is the answer.

If you believe the hype, technology is going to help us end global poverty. That's easier said than done in a world where most product innovations are geared toward the rich.

The prospect of billions rising up from poverty with nothing more than gadgets is indeed a fanciful notion. This is because poverty is entirely a man-made creation. Capitalism is driven by greed, generating a power structure, which moves wealth disproportionately into the hands of the few.

But why are our societies becoming increasingly unequal, and what can we (or should we) do about it?

Forget where science ends and ideology begins; it is the mechanisms behind the persistence of poverty that count.

Technology cannot solve the problem of economic disparity.

We often believe that our digital decision-making tools, like algorithms or artificial intelligence or integrated databases, are more objective and more neutral than human beings.

Totally false.

We are not just building ill-conceived mathematical models that now micromanage the economy, from advertising to prisons; we are also hiding the profits of multinational companies in the cloud.

We are building: A DIGITAL POORHOUSE.

Even though we live in a hyperconnected world we are watching inequality exploding as we walk past people on the street looking at our smartphones.

The spread of these kinds of systems now goes way beyond the public-service systems they started in. For example, the high-frequency trading algorithms that dominate the world's stock exchanges are not just plundering finite resources; they are jeopardising our peaceful existence.

Feel free to ignore it, but the weight of evidence is now becoming crystal clear, so stark, that the growth of the economy and the survival of the planet are now intertwined. So we have to go into a mode where we first educate people about what is causing this inequality, acknowledging that technology is part of that cost, and then society has to decide how to proceed.

Deep cultural and political changes are needed to think through these technologies and get to better systems.

This should apply to all technology – nanotechnology, biotech.

I also really believe we need to stop using these systems to avoid some of the most pressing moral and political dilemmas of our time, not just poverty but racism.

Unfortunately, we have profit-seeking algorithms that have no moral or ethical basis.

Algorithms — a set of steps computers follow to accomplish a task — are used in our daily digital lives to do everything from making airline reservations to searching the web. They are also increasingly being used in public services, such as systems that decide which homeless person gets housing.
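
To make that concrete, here is a deliberately simplified, hypothetical version of such a prioritisation algorithm: a fixed set of steps that turns people into a ranked list. The fields and weights are invented and are not drawn from any real system; the point is that every weight is a value judgement smuggled into arithmetic.

```python
# A deliberately simplified, hypothetical scoring routine of the kind
# described above. Fields and weights are invented, not from any real system.

applicants = [
    {"name": "A", "months_homeless": 18, "has_children": True,  "health_risk": 2},
    {"name": "B", "months_homeless": 6,  "has_children": False, "health_risk": 5},
    {"name": "C", "months_homeless": 30, "has_children": False, "health_risk": 1},
]

def priority_score(person: dict) -> float:
    # Every number below is a moral choice dressed up as arithmetic.
    return (person["months_homeless"] * 1.0
            + (10 if person["has_children"] else 0)
            + person["health_risk"] * 3.0)

ranked = sorted(applicants, key=priority_score, reverse=True)
print([(p["name"], priority_score(p)) for p in ranked])
# -> [('A', 34.0), ('C', 33.0), ('B', 21.0)]
```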

AI with faceless algorithms is worsening the effects and concentrating the power of the wealthy. They are likely to dramatically increase income disparity, perhaps more so than other technologies that have come about recently.

Digital innovation in the form of profit-seeking algorithms is not just benefiting a small fraction of the world's population and a few large corporations; it is reinforcing, rather than reducing, inequality.

Institutions that have embraced digital technologies are outsourcing decisions to machines to cut costs while avoiding the human costs. They say, "We have this incredible, overwhelming need. We don't have enough resources, so we have to use these systems to make these incredibly difficult decisions."

If all the resources are automated, then who actually controls the automation?

Is it every one or is it a few select people?

My great fear with these systems is that we're actually using them as a kind of empathy override, meaning that we are handing over decisions that are almost impossible for human beings to make.

We’re smuggling moral and political assumptions into them about who should share in prosperity.

There’s already an expectation that people will be forced to trade one of their human rights, like their information or their privacy, for another human right.

The economic prosperity created by AI should be shared broadly, to benefit all of humanity; otherwise it will lead to an even greater disparity between the wealthy and the rest of the world.

If AI takes away people’s jobs and only leaves wealth in the hands of those people owning the robots, then that’s going to exacerbate some trends that are already happening.

People now have real concerns about informed consent around "predictive data": about how their data is being shared, whether it's legal and whether it's morally right.

Why?

Because it is impossible to work out why the algorithm has gone against them, or to find a human caseworker to override the decision.

How can we change the societal mindset that currently discourages a greater sharing of wealth? Or is that even a change we should consider?

We're using these technologies to avoid important political decisions, exacerbating the divides between the developed and developing world, and between the haves and have-nots in our society.

The change will only occur when policymakers and voters understand the true scale of the problem. This is hard when we live in an era that likes to celebrate digitisation — and where the elites are usually shielded from the consequences of those algorithms.

Restoring human dignity to its central place has the potential to set off a profound rethinking of economic priorities and the ways in which societies care for their members, particularly when they are in need. If enough of us want to change the status quo for good, then with our collective creativity, with our hunger to solve really hard problems, we will find technology an incredibly powerful tool in our arsenal.

Technology can move commodities (food, jobs, wealth) from areas of surplus supply to regions with under-served markets.

Technology can only help us if we choose to make the best use of it.

Computing has long been perceived to be a culture-free zone — this needs to change.

Today more people have access to a cell phone than a toilet.

I use that metaphor specifically because I think that these systems, although we talk about them often as disruptors, are really more intensifiers and amplifiers of processes that have been with us for a long time, at least since the 1800s.

At a time of unprecedented global challenges, platforms like Google, Facebook, Twitter and their like must be made to use the power of their platforms to stop the DIGITAL POOR HOUSE instead of hoarding profits with profit-seeking algorithms.

If not because bias has been the historical norm, then because we, the users of these platforms, will develop our own self-defence.

So, the next time you think AI is not affecting you, take out your smartphone.

If Twitter is not your poison of choice, maybe it's Facebook or Instagram, or Snapchat, or any of the myriad social media apps out there; they are all affecting our decisions and lifestyles every day.

They are all tailored to what we are likely to respond to, specifically designed to attract the attention of their members – and so, inevitably, to confirm them in their opinions and prejudices – while leaving us with several extra bills to pay in order to remain a normal citizen.

It's a means, not the ultimate solution.

AI has become so successful at determining our interests and serving us ads that digital advertising has become a global market worth hundreds of billions of dollars a year.


All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASK’S: SHOULD WE BE GIVING AWAY OUR PERSONAL DATA FOR FREE.

15 Tuesday Oct 2019

Posted by bobdillon33@gmail.com in Algorithms., Big Data., DIGITAL DICTATORSHIP., Face Recognition., Fourth Industrial Revolution., Google Knowledge., Modern Day Communication., Modern Day Democracy., Our Common Values., Robot citizenship., Technology, The common good., The essence of our humanity., The Internet., The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., WiFi communication.

≈ Comments Off on THE BEADY EYE ASK’S: SHOULD WE BE GIVING AWAY OUR PERSONAL DATA FOR FREE.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Click World., Internet, Privacy boundaries., SMART PHONE WORLD, Technology, The Future of Mankind, Visions of the future., Wireless information.

 

 

(Six-minute read)

 

Is it time we started to demand that if you use my personal data, it will cost you, because I am worth it?

We all make a trade-off between security and convenience, but there is a crucial difference between security in the old-fashioned physical domain, and security today.

Today security is done digitally, with algorithms analysing and exploiting your very mood.

In this digital lifestyle, it is nearly impossible to take part in the web world without leaving a trail behind.

Personal privacy is dead.


We have no clear sight into this world, and we have few sound intuitions into what is safe and what is flimsy – let alone what is ethical and what is creepy.

We are left operating on blind, ignorant, misplaced trust; meanwhile, all around us, without our even noticing, choices are being made.

With the increasing ownership of mobiles, marketing companies now have unlimited access to our personal data. Every site one opens has an agreement form to be ticked with terms and conditions that are all but unreadable on small screens.

This is not a choice between privacy and innovation; it is the erosion of legally guaranteed fundamental privacy rights every time we interface with an app.

Nuggets of personal information that seem trivial, individually, can now be aggregated, indexed and processed.

When this happens, simple pieces of computer code can produce insights and intrusions that creep us out or even do us harm. But most of us haven’t noticed yet.
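
To make that concrete, here is a minimal sketch, with entirely invented records and an invented rule, of how two "trivial" data sources, a loyalty-card log and a location log, can be joined to produce an inference nobody consented to:

```python
# Two individually harmless datasets, joined on a shared identifier, yield an
# insight neither contains on its own. All records and the rule are invented.
purchases = [
    {"card_id": "C-17", "item": "pregnancy test", "day": 3},
    {"card_id": "C-22", "item": "dog food",       "day": 3},
]
locations = [
    {"card_id": "C-17", "place": "clinic",      "day": 5},
    {"card_id": "C-22", "place": "supermarket", "day": 5},
]

by_card = {}                                  # the "simple piece of computer code"
for row in purchases + locations:
    by_card.setdefault(row["card_id"], []).append(row)

for card, rows in by_card.items():
    items  = {r.get("item")  for r in rows} - {None}
    places = {r.get("place") for r in rows} - {None}
    if "pregnancy test" in items and "clinic" in places:
        print(card, "flagged: likely early pregnancy")
```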

Since there’s no real remedy, giving away our most sensitive and valuable data, for free, to global giants, with completely uncertain future costs, is a decision of dramatic consequence.

iCloud and Google+ hold your intimate photos; transport companies know where your travelcard has been; Yahoo holds every email you've ever written; and yet we trust these companies to respect our privacy.

You only have to be sloppy once, for your privacy to be compromised.

With your Facebook profile linked, I could research your interests before approaching you.

Put in someone's username from Twitter or Flickr, and Creepy will churn through every photo-hosting service it knows, trying to find every picture they've ever posted.

Cameras – especially phone cameras – often store the location where the picture was taken in the picture data. Creepy grabs all this geolocation data and puts pins on a map for you.
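
As a rough, hedged sketch of the kind of lookup such a tool performs, the snippet below reads the GPS tags a phone camera may have embedded in a photo. It assumes the Pillow imaging library is installed, relies on its widely documented but technically private `_getexif()` call, and the filename is a placeholder.

```python
# Read the GPS coordinates, if any, that a camera embedded in a photo's EXIF data.
# Assumes the Pillow library; _getexif() is a widely used but private API, so
# behaviour can vary between Pillow versions. "example.jpg" is a placeholder.
from PIL import Image, ExifTags

def photo_gps(path):
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853, {})             # 34853 is the GPSInfo tag
    gps = {ExifTags.GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
    return (gps.get("GPSLatitude"), gps.get("GPSLatitudeRef"),
            gps.get("GPSLongitude"), gps.get("GPSLongitudeRef"))

print(photo_gps("example.jpg"))               # (None, None, None, None) if the photo was stripped
```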

Then comes an even bigger new horizon.

We are entering an age of wireless information: information you didn't know you were leaking.

Maybe the first time you used a new app.

Every device with Wi-Fi has a unique “MAC address”, which is broadcast constantly as long as wireless networking is switched on.

Many shops and shopping centres, for example, now use multiple Wi-Fi sensors, monitoring the strength of connections, to triangulate your position, and track how you walk around the shop. By matching the signal to the security video, they get to know what you look like. If you give an email address in order to use the free in-store Wi-Fi, they have that too.
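
A simplified, purely illustrative sketch of that tracking idea follows: several sensors hear the same MAC address, convert signal strength into a rough distance, and take a weighted centroid. The sensor positions, radio constants and readings are all invented.

```python
# In-store Wi-Fi tracking, crudely: three sensors hear one phone's MAC address
# and estimate its position from received signal strength (RSSI).
# Sensor coordinates, path-loss constants and readings are invented.
SENSORS = {"till": (0.0, 0.0), "doorway": (12.0, 0.0), "back": (6.0, 15.0)}

def rssi_to_distance(rssi_dbm, tx_power=-40, n=2.5):
    """Log-distance path-loss model: rough distance in metres from one reading."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * n))

def estimate_position(readings):
    """Weighted centroid of the sensors, weighting nearer (stronger) ones more."""
    weights = {s: 1 / max(rssi_to_distance(r), 0.5) for s, r in readings.items()}
    total = sum(weights.values())
    x = sum(SENSORS[s][0] * w for s, w in weights.items()) / total
    y = sum(SENSORS[s][1] * w for s, w in weights.items()) / total
    return round(x, 1), round(y, 1)

readings = {"till": -55, "doorway": -70, "back": -80}   # one phone, one moment
print("aa:bb:cc:dd:ee:ff is near", estimate_position(readings))
```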

Once aggregated, these individual fragments of information can be processed and combined, and the resulting data can give away more about our character than our intuitions are able to spot.

When I realised that I am traced across much wider spaces, from one part of town to another, I asked myself: what is the point in giving your information away when you could franchise it out and get something back in return?

Public debate on the topic remains severely stunted.

Given the current trends in the globalisation of technology in the knowledge society, we have to start asking where the world is moving to.

Biocomputing, medical informatics, anthropocentric computing, high-performance computing, technological diffusion, predictive analysis tools, genetic algorithms and cultural informatics are all new or little-known fields of information technology.

Many organisations create, store or purchase information that links individuals' identities to other data. Those who can access and analyse these personal data profiles can gain deep insights into an individual's life.

A law-abiding citizen might say “I have nothing to conceal.” This is a misconception.

In any debate, negotiation or competitive situation, it is an advantage to know about the other party’s position in order to achieve one’s own desired outcome.

Data brokers buy and combine data from various sources (online and offline) to deliver information on precisely defined target groups to their customers.

“Click-world” merchants know a lot more about their clients’ private and financial habits than the individual knows about the merchant company or its competitors.

You could not, therefore, be blamed for asking: given the increasing bargaining power of merchants, is the consumer still getting a good deal?

It would be interesting to know how good a deal consumers get when they exchange their data for free-of-charge online services.

Data has become an economic good for which the “producer” is usually not remunerated.

Data privacy is a matter of choice and individuals should have the right to decide if a company can collect information on them.

Is there a solution?

Of course, if you Google it, you will get all sorts of advice: avoid cookies, use a VPN, disable location tracking on your devices and use browsers that don't track your activity.

It's tempting to play ostrich and bury our heads in the sand; however, data collection is affecting, and will continue to affect, your life.

This is why we must preserve the right of individuals to know what kind of information is being collected and what is being done with that information.

You could say that the most valuable thing on your computer or network is the data you create. After all, that data is the reason for having the computer and network in the first place.

The first thing to understand is that there is very little that can “prove” that any company (whether an individual, government entity, corporation, etc.) is engaged in safe or adequate data handling processes.

Therefore:

We must retain the right to define our own privacy boundaries and then advocate for those boundaries before invasions in our daily lives become out of control and irreversible.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE SAY’S: ALGORITHMS ARE RUNNING OUR LIVES.

06 Friday Sep 2019

Posted by bobdillon33@gmail.com in Algorithms.

≈ 1 Comment

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Capitalism and Greed, Distribution of wealth, Inequility, Technology, The Future of Mankind, Visions of the future.

 

(Twenty-minute read)

So should we be more wary of their power?

Their lack of accountability and complete opacity is frightening.

Some time ago I advocated a legal requirement that all software programs and algorithms be subject to regulation, with an oversight body that would assess the impact of an algorithm before it goes live.

In other words, a virtual, totally transparent world deposit bank where the original programs are held and vetted for compliance with the core principles of humanity.

That, by itself, is now a tall order that requires impartial experts backtracking through the technology development process to find the models and formulae that originated the algorithms.

Who is prepared to do this? Who has the time, the budget and resources to investigate and recommend useful courses of action?

This is a 21st-century job description – and market niche – in search of real people and companies outside political manipulation. In order to make algorithms more transparent, products and product information circulars might include an outline of algorithmic assumptions, akin to the nutritional sidebar now found on many packaged food products, that would inform users of how algorithms drive intelligence in a given product and a reasonable outline of the implications inherent in those assumptions.
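
As a sketch of what such an "algorithmic nutrition sidebar" could look like in code, the fields below are my own invented suggestion, not an existing standard or any vendor's format:

```python
# An invented, illustrative "algorithmic facts label": the software analogue of
# a nutrition sidebar. Field names and example values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AlgorithmicFactsLabel:
    purpose: str
    data_used: list
    optimised_for: str                      # what the algorithm actually maximises
    known_limitations: list = field(default_factory=list)
    human_override_available: bool = False

label = AlgorithmicFactsLabel(
    purpose="Rank job applicants for interview",
    data_used=["CV text", "years of experience", "postcode"],
    optimised_for="similarity to past hiring decisions (and therefore their biases)",
    known_limitations=["penalises career breaks", "postcode can act as a proxy for ethnicity"],
    human_override_available=True,
)
print(label)
```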

At the moment they perform seemingly miraculous tasks humans cannot, and they will continue to greatly augment human intelligence and help accomplish great things. Our accelerating code-dependency will also continue to drive the spread of algorithms; however, as with all great technological revolutions, this trend has a dark side.

There is no argument that the efficiencies of algorithms will lead to more creativity and self-expression.

However, today's algorithms are primarily written to optimize efficiency and profitability without much thought about the possible societal impacts of the data modelling and analysis.

Humans are considered to be an “input” to the process and they are not seen as real, thinking, feeling, changing beings.

This is creating a flawed, logic-driven society, and as the process evolves – that is, as algorithms begin to write the algorithms – humans may get left out of the loop, letting "the robots decide".

Algorithms will capitalize on convenience and profit, thereby discriminating against certain populations while also eroding the experience of everyone else. The goal of today's algorithms is to fit some of our preferences, but not necessarily all of them: they essentially present a caricature of our tastes and preferences.

The fear is that, unless we tune our algorithms for self-actualization, it will be simply too convenient for people to follow the advice of an algorithm (or, too difficult to go beyond such advice), turning these algorithms into self-fulfilling prophecies, and users into zombies who exclusively consume easy-to-consume items.
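
A toy illustration of that self-fulfilling loop, not any platform's real system, is sketched below: the recommender only resurfaces what was already clicked, so one idle click ends up dominating every subsequent feed.

```python
# A convenience-first recommender as a self-fulfilling prophecy. Invented catalogue.
from collections import Counter

CATALOGUE = ["news", "cats", "politics", "science", "gossip"]
clicks = Counter()                                    # item -> clicks by this user

def recommend(n=3):
    ranked = [item for item, _ in clicks.most_common()]
    filler = [item for item in CATALOGUE if item not in ranked]
    return (ranked + filler)[:n]

clicks["gossip"] += 1                                 # one idle click on day zero
for day in range(4):
    feed = recommend()
    clicks[feed[0]] += 1                              # the user clicks the top slot
    print(f"day {day}: {feed}")                       # gossip tops every feed
```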

It is not possible to capture every data element that represents the vastness of a person and that person’s needs, wants, hopes, desires. When you remove the humanity from a system where people are included, they become victims.

Dehumanization by algorithms has now spread to our police forces, our legal systems, our health care and social services, and our politics – Brexit, Donald Trump.

So let’s ask a few questions.

Who is collecting what data points?

Do the human beings the data points reflect even know, or did they just agree to the terms of service because they had no real choice?

Who is making money from the data?

How is anyone to know how his/her data is being massaged and for what purposes to justify what ends?

Company platforms like Google, Facebook and Twitter seek to maximize profit, not societal good. Worse, they repackage profit-seeking as a societal good.

There is no transparency, and oversight is a farce.

What we already see today is that, in practice, things like "differential pricing" do not help the consumer; they help the company doing the selling.
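
A deliberately crude sketch of differential pricing makes the point; every attribute and multiplier below is invented for illustration, not taken from any real retailer:

```python
# Differential pricing, crudely: the quote depends on what the seller has
# inferred about you, not on the product. Attributes and multipliers invented.
BASE_PRICE = 100.0

def quote(profile):
    price = BASE_PRICE
    if profile.get("device") == "high-end-phone":
        price *= 1.15            # inferred ability to pay
    if profile.get("urgency") == "late-night-search":
        price *= 1.10            # inferred desperation
    if profile.get("compared_rivals"):
        price *= 0.95            # inferred price sensitivity
    return round(price, 2)

print(quote({"device": "high-end-phone", "urgency": "late-night-search"}))  # 126.5
print(quote({"device": "old-laptop", "compared_rivals": True}))             # 95.0
```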

With all of it hidden from view, individual human beings will be herded around like cattle, with predictably destructive results for the rule of law, social justice and economics.

At the moment there is only an incentive to further obfuscate the presence and operation of algorithmic shaping of communication processes. The fact that the internet can, through algorithms, be used to almost read our minds means that those who have access to the algorithms and their databases have a vast opportunity to manipulate large population groups.

Our economies, increasingly dominated by a tiny, very privileged and insulated portion of the population, largely reproduce inequality for that group's benefit. Criticism is belittled and dismissed because of the veneer of digital "logic" over the process.

I will always remain convinced the data will be used to enrich and/or protect others and not the individual. It’s the basic nature of the economic system in which we live.

Algorithms have the capability to shape individuals’ decisions without them even knowing it, giving those who have control of the algorithms an unfair position of power.

The overall impact of ubiquitous algorithms is presently incalculable because the presence of algorithms in everyday processes and transactions is now so great, and is mostly hidden from public view. Our algorithms are now redefining what we think, how we think and what we know.

We need to ask them to think about their thinking – to look out for pitfalls and inherent biases before those are baked in and harder to remove.

Should we be allowing ourselves to become so reliant on them – and who, if anyone, is policing their use?

Will the net overall effect of algorithms be positive for individuals and society or negative for individuals and society?

If every algorithm suddenly stopped working, it would be the end of the world as we know it.

We have already turned our world over to machine learning and algorithms.

The question now is, how to better understand and manage what we have done?

What are the implications of allowing commercial interests and governments to use algorithms to analyse our habits?

The main negative changes come down to a simple but now quite difficult question:

How can we see, and fully understand the implications of, the algorithms programmed into everyday actions and decisions?

The rub is this: Whose intelligence is it, anyway?

Algorithms are aimed at optimizing everything. Our lives will be increasingly affected by their inherent conclusions and the narratives they spawn.

By expanding collection and analysis of data and the resulting application of this information, a layer of intelligence or thinking manipulation is added to processes and objects that previously did not have that layer.

The internet runs on algorithms and all online searching is accomplished through them.

Email knows where to go thanks to algorithms. Smartphone apps are nothing but algorithms. Computer and video games are algorithmic storytelling. Online dating and book-recommendation and travel websites would not function without algorithms. GPS mapping systems get people from point A to point B via algorithms.

Artificial intelligence (AI) is nought but algorithms.

The material people see on social media is brought to them by algorithms.

In fact, everything people see and do on the web is a product of algorithms.

Every time someone sorts a column in a spreadsheet, algorithms are at play, and most financial transactions today are accomplished by algorithms. Algorithms help gadgets respond to voice commands, recognize faces, sort photos and build and drive cars. Hacking, cyberattacks and cryptographic code-breaking exploit algorithms.

In the future algorithms will write many if not most algorithms.

The rise of increasingly complex algorithms calls for critical thought about how to best prevent, deter and compensate for the harms that they cause …. Algorithmic regulation will require world government uniformity, expert judgment, political independence and pre-market review to prevent – without stifling innovation – the introduction of unacceptably dangerous algorithms into the market.

The usage of algorithms and analytics in society is exploding:

From machine learning recommender systems in commerce, to credit scoring methods outside of standard regulatory practice and self-driving cars.

We now spend so much of our time online that we are creating huge data-mining opportunities, with algorithms programmed to look for "indirect, non-obvious" correlations in data that, over time if not already, will create or exacerbate societal divides.

Algorithms are increasingly determining our collective futures: bank approvals, store cards, job matches and more. Google's search algorithm is now a more closely guarded commercial secret than the recipe for Coca-Cola.

The problem is how the rules are set: it’s impossible to do this perfectly.

The questions being raised about algorithms at the moment are not about algorithms per se, but about the way society is structured with regard to data use and data privacy.

Humans see causation when an algorithm has merely identified a correlation in vast swaths of data.

This transformation presents an entirely new menace: penalties based on propensities.

The possibility of using big-data predictions about people to judge and punish them even before they’ve acted. Doing this negates ideas of fairness, justice and free will.

Parole boards in more than half of all US states use predictions founded on data analysis as a factor in deciding whether to release somebody from prison or to keep him incarcerated.
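
A hedged, purely illustrative sketch of "penalties based on propensities" follows; the features, weights and threshold are invented and are not how any real parole instrument works, but they show how a score built from historical proxies can settle the recommendation before the person has done anything:

```python
# A risk score built from proxies decides the recommendation in advance.
# Feature names, weights and the threshold are invented for illustration.
def risk_score(person):
    score = 0.0
    score += 0.4 * person["prior_arrests"]            # encodes past policing patterns
    score += 0.3 * (1 if person["unemployed"] else 0)
    score += 0.2 * (1 if person["neighbourhood_flagged"] else 0)
    return score

def parole_recommendation(person, threshold=0.6):
    return "deny" if risk_score(person) >= threshold else "grant"

applicant = {"prior_arrests": 1, "unemployed": True, "neighbourhood_flagged": True}
print(risk_score(applicant), parole_recommendation(applicant))   # 0.9 deny
```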

We risk falling victim to a dictatorship of data, whereby we fetishise the information, the output of our analyses, and end up misusing it.

They can and will become an instrument of the powerful, who may turn it into a source of repression, either by simply frustrating customers and employees or, worse, by harming citizens.

The idea that the world’s financial markets – and, hence, the wellbeing of our pensions, shareholdings, savings etc – are now largely determined by algorithmic vagaries is unsettling enough for some. In currency trading, an algorithm lasts for about two weeks before it is stopped because it is surpassed by a new one.

We’re already halfway towards a world where algorithms run nearly everything.

As their power intensifies, wealth will concentrate around those who control them.

Advances in quantum computing and the rapid evolution of AI and AI agents embedded in systems and devices in the Internet of Things will lead to hyper-stalking, influencing and shaping of voters, and hyper-personalized ads, and will create new ways to misrepresent reality and perpetuate falsehoods.

Climate change is becoming visible while the profits of the capitalist world are going underground thanks to Algorithms.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASK’S: WHEN WILL IT MAKE SENSE FOR AN AI TO LIE TO A PERSON.

07 Thursday Feb 2019

Posted by bobdillon33@gmail.com in Algorithms., Artificial Intelligence., Big Data., HUMAN INTELLIGENCE, Humanity., Life., Our Common Values., Reality., Technology, The common good., The essence of our humanity., The Future, The Obvious., The world to day., Unanswered Questions., WHAT IS TRUTH

≈ Comments Off on THE BEADY EYE ASK’S: WHEN WILL IT MAKE SENSE FOR AN AI TO LIE TO A PERSON.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Technology, The Future of Mankind, Visions of the future.

(Two-minute read)

We all know that history is plagued with falsehoods and lies, mainly told by the victors; but now we have new liars on the block, so perfect at telling them that you wonder whether anything is true.

They are a powerful amplifier of social, economic and cultural inequalities currently forcing us to confront the kind of society we have created.

Algorithms will force us to recognize how the outcomes of past social and political conflicts have been perpetuated into the present through our use of data.

The question now is whether we will use these revelations to create a more just society.

For 4bn years life on Earth evolved according to the laws of natural selection and organic chemistry. Now science is about to usher in the era of non-organic life evolving by intelligent design, and such life may well eventually leave Earth to spread throughout the galaxy.

Artificial intelligence will probably be the most important agent of change in the 21st century. The choices we make today may have a profound impact on the trajectory of life for countless millennia and far beyond our own planet.

That demand for clarity is making it harder to ignore the structural sources of societal inequities.

The question in the near future will be whether larger groups of people will be able to tell reality from fiction, or whether technological authentication of media will become completely necessary to trust anything online.

So when will it make sense for an AI to lie to a person?

It’s entirely possible that a robot may need to misrepresent some things in order to preserve itself.

As algorithmic decision-making spreads across a broadening range of policy areas, it is beginning to expose social and economic inequities that were long hidden behind “official” data.

In order for AIs to lie effectively, they’re going to have to develop what’s called a “theory of mind.” That means they’ll have to guess what you, the user believes, and also predict how you will react when given any particular set of information (whether that information is true or not).

Disinformation powered by AI is already rampant – the Donald Trump election, Brexit, populism.

So are we OK with lying to an AI and, likewise, OK with being lied to by an AI?

Fake reports and videos. Bots. Algorithmic curation. Targeted behavioural marketing powered by algorithms and machine learning.

I for one would like to live in a society whose systems are built on top of verifiable, rigorous, thorough knowledge, and not on the alchemy of machine learning.

(A machine-learning system is a bundle of algorithms that take in torrents of data at one end and spit out inferences, correlations, recommendations and possibly even decisions at the other end.)
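
A minimal sketch of what that parenthesis describes, data in one end and an unexplained inference out the other, assuming scikit-learn is installed and using a tiny invented "loan" dataset:

```python
# Data in, inference out, no explanation attached. The dataset is invented
# and far too small to mean anything; it only illustrates the shape of the pipeline.
from sklearn.linear_model import LogisticRegression

X = [[20, 2], [35, 1], [60, 0], [80, 1], [25, 3], [90, 0]]   # [income in k, open debts]
y = [0, 0, 1, 1, 0, 1]                                        # 1 = repaid, 0 = defaulted

model = LogisticRegression().fit(X, y)

applicant = [[40, 2]]
print(model.predict(applicant), model.predict_proba(applicant))
```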

Even their creators cannot explain the inner workings of these mathematical models: they lack rigorous theoretical understandings of their own tools and, in that sense, are currently operating in an alchemical rather than a scientific mode.

They encourage hypnotised wonderment and they disable our critical faculties.

If we don’t take some action the future of life on Earth will be decided by small-time politicians spreading fears about terrorist threats, by shareholders worried about quarterly revenues and by marketing experts trying to maximise customer experience.

Hopefully, unchecked flaws in algorithms and even the technology itself should put a brake on the escalating use of big data.

We need such systems themselves to "understand" enough to avoid deception.

There will be no point in a machine-learning life form returning to Earth if we are unable to know whether what it experienced is true.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE SAY’S: BEFORE WE ARE ALL HACKED – WHAT IF.

03 Saturday Nov 2018

Posted by bobdillon33@gmail.com in Algorithms., Artificial Intelligence., Big Data., Capitalism, Evolution, Fake News., Freedom, Google Knowledge., HUMAN INTELLIGENCE, Humanity., Politics., Populism., Reality., Social Media, Sustaniability, The common good., The essence of our humanity., The Internet., The Obvious., The world to day., Unanswered Questions., What Needs to change in the World, Where's the Global Outrage., World Leaders, World Organisations., World Politics

≈ Comments Off on THE BEADY EYE SAY’S: BEFORE WE ARE ALL HACKED – WHAT IF.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Distribution of wealth, Inequility, Internet, Social Media, Technology, The Future of Mankind, Visions of the future.

 

(Two-minute read)

I have posted many articles concerning Algorithms that are plundering our lives and the world for profit.

Although governments and world organisations are only just waking up to the power of these algorithms, given the changes to society we are witnessing, there are little or no concerted efforts to introduce regulations that would limit the damage they are doing.

With every click, power is shifting to the Googles, the Microsofts, the Apples, the Amazons, the eBays and the Netflixes; to machine-learning recommendations; to social media rhetoric; to right-wing politics disguised as populism and nationalism.

ALL CREATING A PLANET IN CRISIS.

So In this post, I am hoping to create an online pressure group to lobby the relevant powers to effect change.

Life is not only trade, consumption and markets.

THE SUGGESTED NAME FOR THE GROUP IS # WHAT IF.COM

SO IF THERE IS ANYONE READING THIS THAT KNOWS HOW TO GO ABOUT SETTING UP SUCH A WEBSITE I AM ALL EARS.

WHY SET UP SUCH A GROUP?

BECAUSE:

Markets are not faceless forces.

All markets have some sort of morality.

Buyers and sellers need to consider the consequences which their actions and decisions may have on the environment and on society itself.

Today a simple one-dollar-one-vote principle dominates the world economy.

International organizations ought to impose sanctions upon countries which condone immoral practices, such as the use of child labour, environmental destruction, the selling of arms or the persecution of trade unionists.

The detrimental effects of international money markets and the crises caused by speculation can be alleviated by international legislation such as levying taxes on international currency exchange.

Free markets do not guarantee adequate conditions of life to all people. Therefore we need states and organisations that protect the weak and defend social justice.

The eradication of poverty presupposes equalization of income. This means, for example, that the strong and well to do must assume a proportionally greater burden of taxes than the weak and the poor.

We need services which citizens themselves initiate and generate, and the new potential they can contribute to the life of our congregations and local communities.

The ultimate responsibility for ensuring that local communities have the resources to guarantee basic security for all their members rests with national governments.

Basic security must, in the future, also include healthcare and adequate living standards, so that all people are reasonably covered regardless of their wealth and position in society.

All contributions and comments appreciated. All like clicks chucked in the bin.
