
bobdillon33blog

~ Free Thinker.


Tag Archives: Algorithms.

THE BEADY EYE SAYS: ARE WE NOW REAPING THE REWARDS OF PROFIT FOR PROFIT’S SAKE?

Monday, 30 March 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., CORONA VIRUS., COVID-19, Evolution, Fourth Industrial Revolution., Human values., Humanity., Inequality, Life., Modern day life., Our Common Values., Reality., Survival., Sustaniability, Technology, Technology v Humanity, The common good., The essence of our humanity., The Obvious., The pursuit of profit., The state of the World., The world to day., Truth, Unanswered Questions., Universal Basic Income ., VALUES, Wealth., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., World Economy.

≈ Comments Off on THE BEADY EYE SAYS: ARE WE NOW REAPING THE REWARDS OF PROFIT FOR PROFIT’S SAKE?

Tags

Algorithms., Artificial Intelligence., Capitalism and Greed, Capitalism vs. the Climate., Corona Pandemic., Coronavirus (COVID-19), Earth, Environment, Greed, Inequility, Technology, Visions of the future.

 

 

(Five-minute read) 

First, let me state the obvious.

Covid-19 doesn’t just call our bluff; it is questioning the way we allow our society to be run.

It is bringing into sharp relief what some of us have always known to be true. Our current way of living must end.

Capitalism, and the culture of hierarchy that props it up, is now extremely screwed up.

The story of Capitalism up to now has been selling your labour so you don’t end up on the streets.

We should not have to exist this way.

We come into this world kicking and screaming for our own needs, while our births, and our eventual departures, have all been turned into products by capitalism to generate profit. We leave in silence.

We live in a world where nearly everything has some kind of cost, and increasing workforce automation suggests that things will keep getting worse.

What is considered valuable by the people of this world is of little or no value when one is confronted by a virus (as some of us are unfortunately witnessing this very minute) that does not discriminate on any grounds.

Money, wealth, riches, gold, property, power and so on are either transitory, fading or can be destroyed in the blink of an eye and are of no value in the long term.

In the past few years, the money markets have fallen in a heap with the global financial crisis, and the value of money has become very shaky. The same can be said of shares, property and other investments. And this is nothing new, for the economic cycle goes through boom and bust every seven to ten years, making fortunes at one time and destroying them at another.

However, man believes that wealth gives him the power to rise above the problems and issues of the world.

How wrong he is.

The coronavirus is not the only virus we have to confront; we also have to confront capitalism and the world that sustains it.

Climate Change was not enough to make the world pause.

The challenge man faces is that we think only of the here and now.

We now have a moment to consider what a rapid response to the climate emergency would look like – how we build a society that completely transforms our social order towards something that is in equilibrium with the biosphere and gives to each according to their needs.  

But will a more sustainable capitalism emerge from Covid-19? Highly unlikely, as the protection of private interest over public interest remains the same.

What the coronavirus has shown, and is still showing, is that our cheapskate governments can provide far more in social programmes than they ever have.

While none of us can predict the future let’s hope that this time the penny drops. 

The risks of Covid-19 are with us now, but the risks of climate change, with the clock ticking, require us to wake up before the alarm goes off.

It’s not science, nor protest, that will save the planet. Science alerted us to global warming, but understanding the nature of the world is crucial to dealing with it.

Everything has a function and our function is to fit into our world and not divorce ourselves from nature.

With the age of technology and its algorithms working themselves relentlessly into everything, enabling profits to disappear far from any trickle-down effect, the coronavirus is revealing heroes and villains across the world.

The markets might be paralysed and numerous industries entering a state of suspended animation, but the environment is getting a recovery period.

Covid-19 is showing us that, on the horizon, capitalism in its current form threatens value itself. It is built on the premise of instant gratification.

Many businesses today are aware of this failing in mankind and play to it to great effect, encouraging us to insure ourselves against the cost of living and dying, but we are now trading for time and for eternity.

The coronavirus is certainly a much greater reward than the fleeting pleasures of this life.

The new work-from-home world that emerges from this will be intriguing – perhaps with a Universal Basic Income.

All human comments appreciated. All like clicks and abuse chucked in the bin.

THE BEADY EYE ASKS: WILL THE WORLD EVER BE ABLE TO ACT AS ONE?

Wednesday, 26 February 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Climate Change., Dehumanization., Digital age., DIGITAL DICTATORSHIP., Disconnection., Environment, Fourth Industrial Revolution., Google, Human values., Humanity., Life., Our Common Values., Reality., Robot citizenship., Sustaniability, Technology, Technology v Humanity, The cloud., The common good., The essence of our humanity., The Future, The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE ASKS: WILL THE WORLD EVER BE ABLE TO ACT AS ONE?

Tags

Algorithms., Artificial Intelligence., Big Data, Capitalism vs. the Climate., Climate change, Extinction, Technology, The Future of Mankind, Visions of the future.

 

Twenty-five minute read.

If humanity stopped fighting and competing against one another, if we banded together in a common cause, we could accomplish spectacular things.

Not true.

We would basically become mindless drones with no culture, because it would all just be one culture with no distinct forms.

If this were to become a reality, how would we govern it?

China’s premier Wen Jiabao put forward the following equation in a speech: “Internet + Internet of Things = Wisdom of the Earth.”

How wrong he was. However, by 2025 there will be a trillion networked devices worldwide in the consumer and industrial sectors combined.

He should have said, “Internet + Internet of Things = becoming what we do not think”, because people are truly not that intelligent.

In our houses, cars and factories, we’re surrounded by tiny, intelligent devices that capture data about how we live and what we do. Now they are beginning to talk to one another. Soon we’ll be able to choreograph them to respond to our needs, solve our problems, even save our lives.

Intelligent things all around us, coordinating their activities.

Coffee pots that talk to alarm clocks. Thermostats that talk to motion sensors. Factory machines that talk to the power grid and to boxes of raw material.

We might be seeing the dawn of an era when the most mundane items in our lives can talk wirelessly among themselves, performing tasks on command, giving us data we’ve never had before. This intelligence, once locked in our devices, will flow into the universe of physical objects.

We are already struggling to name this emerging phenomenon.

Some have called it the Internet of Things or the Internet of Everything or the Industrial Internet—despite the fact that most of these devices aren’t actually on the Internet directly but instead communicate through simple wireless protocols.

Others are calling it the Sensor Revolution.

I call it the Programmable World – a world of profit-seeking algorithms.

The fact is that once we get enough of these objects onto our networks, they are no longer one-off novelties or data sources; they become a coherent system, a vast ensemble that can be choreographed, a body that can dance – in the era of the cloud, apps and the walled gardens of Google, Apple and the rest, a supposedly peer-to-peer system in which each node will not be equally empowered.

These connected objects will act more like a swarm of drones, a distributed legion of bots, far-flung and sometimes even hidden from view but nevertheless coordinated as if they were a single giant machine, relying on one another, coordinating their actions to carry out simple tasks without any human intervention.

So the world will act as one. Or will it?

Once we get there, that system will transform the world of everyday objects into a design­able environment, a playground for coders and engineers.

It will change the whole way we think about the division between the virtual and the physical, putting intelligence from the cloud into everything we touch.

Call it “smart exploration.” 

The rise of the smartphone has supplied us with a natural way to communicate with those smart objects. So far they include watches, heart-rate monitors, and even some new Nike shoes. Smartphones make payments to merchants wirelessly instead of swiping a card, and some billboards use the same protocols to beam content to passersby who ask for it – all as a way to sell more products and services, particularly Big Data–style analysis, to large corporate customers.

The yoking together of two or more smart objects is the trickiest part, because it represents the vertiginous shift from analysis, the mere harvesting of helpful data, to real automation.

In my view no matter how thoroughly we might use data to fine-tune our lives and businesses, it’s scary to take any decisions out of human hands.

It can be hard to imagine the automation you might someday want, or even need, in your daily life. Yet there are all sorts of adjustments you make over the course of any given day that are reducible to simple if-then relationships.
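
To make that concrete, here is a minimal sketch in Python of the kind of if-then rule being described. The device names, readings and thresholds are hypothetical, not any particular platform’s API.

def evaluate_rules(sensors, actuators):
    """Apply simple if-then relationships to one snapshot of sensor readings."""
    # IF the front door is open AND nobody is home THEN sound the alarm.
    if sensors["front_door"] == "open" and not sensors["someone_home"]:
        actuators["alarm"] = "on"
    # IF the living room is colder than 18 degrees THEN turn the heating on.
    if sensors["living_room_temp_c"] < 18:
        actuators["heating"] = "on"
    return actuators

# Example with made-up readings: the door is open, nobody is home, the room is cold.
readings = {"front_door": "open", "someone_home": False, "living_room_temp_c": 17.2}
print(evaluate_rules(readings, {}))  # {'alarm': 'on', 'heating': 'on'}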

Think of Facebook, which has famously described the underlying data it owns as a social graph: the knowledge of who is connected to whom, and how.

Would you want to automate all of these relationships?

Imagine a world where every one of us carries a sensor: “presence” tags, low-energy radio IDs that sit on our keychains or belt loops, announcing our location and verifying our identity.

This is the principle behind Square Wallet and a number of other nascent payment systems, including ones from PayPal and Google. (When you walk into a participating store today, Square can let the cashier know you’re there; you pay simply by giving your name.)

A tracking tool that monitors not just your pet’s movements, but your movements.

GPS reliably knows our location to within 100 feet, give or take, and that knowledge has transformed, and is transforming, our lives immeasurably: turn-by-turn driving directions, local restaurant recommendations, location-based dating apps, and so on.

With presence technology, Google already has the potential to know our location absolutely, down to a foot or even a few inches. That means knowing not merely which bar your friend is at but which couch she’s sitting on when you walk through the door.

It means receiving a coupon for a grocery item on the endcap at the moment you walk by.

Think about a liquor cabinet that auto-populated your shopping list based on the levels in the bottles—but also locked automatically if your stock portfolio dropped more than 3 per cent.

Think about a home medical monitoring system that didn’t just feedback data from diabetic patients but adjusted the treatment regimen as the data demanded.

Think about how much more intelligent your sprinklers could be if they responded to the weather report as well as to historical patterns of soil moisture and rainfall.

It does not stop there – think about applications built on top of these connected objects.

This means not just tying together the behaviour of two or more objects—like the sprinkler and the moisture sensor—but creating complex interrelationships that also tie in outside data sources and analytics. 

Plugged into that information, your system wouldn’t just know how much water is in the soil; it could predict how much there will be, based on whether it’s going to rain or the sun will be baking hot that day.
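
As a rough illustration only, and not any particular product’s logic, a sprinkler rule that combines a soil-moisture sensor with an outside data source such as a rain forecast might look like this in Python. The thresholds and field names are assumptions made up for the example.

def should_water(soil_moisture_pct, rain_probability):
    """Combine a local sensor reading with an outside data source (the forecast)."""
    DRY_THRESHOLD = 30.0   # below this, the soil counts as dry
    LIKELY_RAIN = 0.6      # skip watering if rain is likely anyway
    if soil_moisture_pct >= DRY_THRESHOLD:
        return False       # soil is moist enough already
    if rain_probability >= LIKELY_RAIN:
        return False       # let the weather do the work
    return True

# Dry soil and a clear sky: run the sprinklers.
print(should_water(soil_moisture_pct=22.5, rain_probability=0.1))  # True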

It means walking through an art museum and having your phone interpret the paintings as you pause in front of them.

This simple link – between a tag on us and a tag in the world – stands to become the culmination of the location revolution, delivering on all the promises it hasn’t quite fulfilled yet.

The treasure that it digs up could be considerable.

This is obviously true for retailers:

It’s a future where the intelligence once locked in our devices will now flow into the universe of physical objects. Users and developers can share their simple if-then apps and, in the case of more complex relationships, make money off of apps, just like in the mobile marketplaces.

Processing it all in the cloud in a language unheard of.

On Google Maps, you can now navigate inside certain airports and stores, with Wi-Fi triangulation helping out your GPS. 

And according to a mobile couponing firm called Koupon Media, some 80 per cent of customers who buy gas at one major convenience-store chain never walk inside the store, so presence-based coupons could make a huge impact on the bottom line.

But it’s also true for our everyday lives. Have you ever lost an object in your house and dreamed that you could just type a search for it, as you would for a wayward document on your hard drive? With location stickers, that seemingly impossible desire has become a reality:

A startup called StickNFind Technologies already sells these quarter-sized devices for $25 apiece.

Think about a thermostat app pulling in readings from any other device on that platform—motion sensors that might say which room you’re in, presence tags that identify individual family members (with different temperature preferences)—as well as outside data sources like weather or variable power price.

An even more natural category for apps is security: the house locks itself up, shuts down the lights and thermostat, and activates an alarm system complete with siren, flashing lights, auto-notifications, and even an on-call platoon of off-duty cops, all coordinated through a platform like SmartThings.

This, finally, is the Programmable World, the point at which the full power of developers, entrepreneurs, and venture capitalists is brought to bear on the realm of physical objects – improving it, customizing it, and groping toward new business plans for it that we haven’t dreamed of yet. Indeed, it will marshal all the forces that made the Internet so transformational and put them to work on virtually everything around us.

However, there are obviously some pitfalls lurking in this future of connected objects.

As a sanity check.

Our fears about malicious hackers preying on our email and bank accounts via the cloud might pale in comparison to how we’ll feel about those same miscreants pwning our garage doors and bathroom light fixtures.

The mysterious Stuxnet and Flame exploits have raised the issue of industrial security in the era of connected devices.

Vanity Fair recently detailed nightmare scenarios in which hackers could hit connected objects, from our high tech cars (university researchers have figured out how to exploit an OnStar-type system to cause havoc in a vehicle) to our utility “smart meters” (which collect patterns of energy use that can reveal a great deal about our activities at home) to even our pacemakers.

The idea of animating the inanimate, of compelling the physical world to do our bidding, has been a staple of science fiction for half a century or more.

No, the main existential threat to the Programmable World is the considerably more mundane issue of power. Every sensor still needs a power source, which in most cases right now means a battery; low-energy protocols allow those batteries to last a long time, even a few years, but eventually, they’ll need to be replaced.

Just as with social networking, the privacy concerns of a sensor-­connected world will be fast outweighed by the strange pleasures of residing in a hyperconnected world.

A bigger concern, perhaps, is simple privacy. Just because we’ve finally warmed up to oversharing in the virtual world doesn’t mean we’ll be comfortable doing the same in the physical world, as all our interactions with objects capture more and more data about where we are and what we’re doing.

What’s coming is ubiquitous connectivity that will accelerate how people collaborate, share, learn, gather, do business, and exchange knowledge.

There will one day be universal access to all human knowledge by everyone on the planet.

So, based on our collective knowledge, will we be able to act as one?

How will you use global connectivity to enhance our lives?

We automatically sort people into “like us” or “not like us.”

We are now in a new era, combating mass species extinction, climate change and a virus pandemic, all of which bring humans and the natural world together as one.

Humanity as a whole needs to be united if we are to preserve what’s left on Earth.

One in three of the population of the earth died in the Black Death; they had no idea why it was happening.

As a result, they had no responsibility, because they didn’t know.

Our problem is that we do know, and therefore we have absolute responsibility.

We have only a very small window, and if we don’t use that window in the next ten years – not the next thirty or fifty – connectivity will be the least of our worries.

In November this year, the world will descend on Scotland, and states from across the globe will be given a choice between cooperating or continuing as they have until now.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE SAYS: HERE ARE THE BIG QUESTIONS THAT ARE YET TO COME WHEN IT COMES TO TECHNOLOGY.

Wednesday, 19 February 2020

Posted by bobdillon33@gmail.com in #whatif.com, 2020: The year we need to change., Algorithms., Artificial Intelligence., Big Data., Digital age., DIGITAL DICTATORSHIP., Digital Friendship., Fourth Industrial Revolution., Google, Google it., Google Knowledge., Human values., Humanity., Life., Modern day life., Our Common Values., Reality., Sustaniability, Technology, Technology v Humanity, The common good., The Future, The Obvious., The state of the World., The world to day., Unanswered Questions., War, WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., World Leaders

≈ Comments Off on THE BEADY EYE SAYS: HERE ARE THE BIG QUESTIONS THAT ARE YET TO COME WHEN IT COMES TO TECHNOLOGY.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Capitalism and Greed, Distribution of wealth, Inequility, Technology, The Future of Mankind, Visions of the future.

 

Thirty-minute read.

Who owns what?  What’s our purpose in life?  What are the values that we believe in? How do we think and make decisions?  What do we mean by work?  Can our work ever have true meaning unless it is to serve others?

What will help us all think deeply about the questions we need to ask and answer?

Climate change or technology.

However, for many of us, the answers to these questions differ in our working lives, compared with our personal lives, with family, friends and neighbours.

What if a ruling elite like Google were to impose a command-and-control, fear-driven culture in which power is abused and the outcome is social and economic misery for the vast majority?

Our reaction, if we are to go by what is now observable, would be “So what? Now what?”

Making sure companies compete fairly is a tricky business. The firms being regulated know far more about their business than those doing the regulating.

“Artificial intelligence is the future, not only for Russia but for all of humankind,” says Putin. “Whatever country comes to dominate this technology will be the ruler of the world.”

His rhetoric is entirely appropriate. Automation and digitalization have already had a radical effect on international systems and structures.

Technology can easily be described as the application of scientific knowledge to the practical problems we are experiencing in the world today.

On the other hand, its core strategy is to gobble up market share with profit-seeking algorithms.

Our environments are so full of technology that most of the time we take it for granted.

So are we all becoming personified idiots?

Technology has a great impact on all the fundamental aspects of all our cultures including laws and how they are enforced, language, art, health care, mobility, education and religion.

The obvious problem with all of this is that countries will not own or be in control of the technologies.

While we all sit back and accept the benefits of technology, it also brings manipulation on a worldwide scale, with our future in the hands of only a handful of corporations and the vast number of people who are okay with that.

It’s hard to argue against innovation. It’s hard to argue against greater choice, more convenience and lower prices.

One way or the other, it is also hard to overstate the fundamentally different rules that Google, Amazon, Facebook, Apple and Baidu play by.

They hide behind the forked rhetoric that the data they collect does no harm because it is anonymous.

They do not need to know who you are. It is enough to know what you consume, your habits, your tastes, and where you are, through your IP address, the GPS of your mobile, or your Google account. Your name, or your phone number, is not important when it comes to selling you things.

Blurring the borders of privacy. Replacing real-life communication.

And on top of that, violent games and videos kill empathy and bring destruction into individuals’ lives. Plagiarism and cheating increase while analysis and critical thinking decline, ending in social isolation.

(We now have the new, perverse sexual harassment of cyber-flashing, which is not against any law. Why? Because our laws cannot keep up with the speed of change.)

Commercial technology like smartphones, iPads and home assistants like Alexa/Echo is about creating another consumer touchpoint for a robust ecosystem of e-commerce, services and media, taking advantage of less sophisticated consumers and tricking them into consuming items for short-term satisfaction and long-term pain.

Originally created to serve humanity faithfully, digital devices are revealing their harmful impact on our lives.

We should all be careful what we wish for.

There’s an argument made by big corporations for each country to charge corporations the lowest possible tax rate, to loosen environmental regulations down to zero, and to eliminate employee protections. All so that a country’s commodity producers can be the cheapest ones.

The voice market war has only just begun.

The contenders:

Amazon Echo (Alexa) v Google Home (Assistant).

Once they figure out how to improve their recommendations and push more people to make regular household purchases via voice, it will lead to an explosion in voice-based shopping.

Google already has one of the most valuable brands in the world.

Google Maps has virtually no meaningful rival. Gmail… Google basically controls our handheld existence.

Google controls your life, literally, even if you find it hard to believe.

Google trackers have been found on 75% of the top million websites.

When you search on Google, they keep your search history forever.

Google is a company that offers almost all its products for free because the money is earned by selling the data it collects with those products, to advertisers and companies.

Last year Google made over $161 billion in total revenues.

As it is the premier search engine in the U.S., Europe, and many developing countries, Google has the tools to control much of the world.

That’s just Google then you have Amazon.

With around 225 million customers around the world, Amazon wants to deliver everything you want to your doorstep, including food, anywhere in the world (300 items a second). These days half of all product searches start on Amazon.

Our lust for cheap, discounted goods delivered to our doors promptly and efficiently has a price.

Amazon has done a lot of good for consumers by expanding choice, making shopping far more convenient and by delivering extraordinary product value.

Yet, we can’t–and shouldn’t–ignore the profound effect that Amazon is having on just about every corner of the retail world they set their sights on.

Amazon is selling its facial recognition technology, known as Rekognition, to law enforcement agencies.

First and foremost, Amazon isn’t required by its investors to make any real money.

For us, the great unwashed, there’s always the opportunity to cut a corner, sacrifice lifestyle quality and suck it up as they race to grab a little more market share.

With their algorithms, they tell you what restaurants you have to eat in, choose your music, label your photos associating them with each family member or friend that appears in them, pay for your purchases, suggest the movies you should see, and the apps that may interest you.

When in fact the searches we do, the websites we visit, the products we look at, where we are, our medical history, our political beliefs, our associations with others, our employment prospects – everything from the womb to the grave – is collected and analysed.

Before I hear you calling me a hypocrite I also have used Amazon.

If this scenario prevails, would this be really the way information is supposed to be organized?

In short, does the fact that an algorithm is able to provide more relevant information than a human justify this scenario?

These big brands platforms are more powerful than governments. They’re wealthier. If they were countries, they would be pretty large economies. They’re multinational and the global financial situation allows them to ship money all over the world.

Can we do anything to make a difference?

We need to be supporting the development of an efficient circular economy.

Why?

Because sustainability is an unstoppable force.

Let’s not race to the bottom.

A country’s population size will become less important for national power, as small countries that develop a significant edge in AI technology will punch far above their weight.

Ultimately, however, winning and losing will not be determined by which country gains the most growth through AI. It will be determined by how the entire global community chooses to leverage AI — as a tool of war or as a tool of progress.

They can eliminate rules protecting clean water, air or consumer safety, but someone will always find a way to be cheaper or more brutal than you.

We all assume that Google, Facebook, Amazon and Apple are spying on our activity. Up to now the advertiser has not been interested in your name; when they are, it will be too late, and the winner will be inequality.

So what does all this mean and what are we all going to do about it when we’ve stopped talking about it?

Once you start to connect all the invisible dots, the impact on society will, in the end, come down to the people who use the technology: they have to be responsible for it, and if they use it irresponsibly they have to be held accountable.

A Footnote:

For me, there is little point in Jeff Bezos setting up an Earth fund when Amazon is one of the biggest promoters of pollution. Pretending to be a do-gooder.

The brown box doesn’t begin to address the larger issue: each year, in the United States alone, the paper thrown in the trash represents approximately 640 million trees, or roughly 915,000 acres of forest land.

Amazon ships an average of 608 million packages each year, which equates to (an estimated) 1,600,000 packages a day.

Then when we talk about energy consumption, we’re talking about the sources of energy that generate our power: oil, coal, natural gas and alternatives like solar, wind, hydropower and biofuels.

How much electricity they use, and what the bill is, God only knows, so it’s no wonder that they have contracts with oil and gas companies.

Now consider that people conduct over 1.6 billion searches per day, and you get a massive energy footprint of roughly 12.5 million watts.

Is e-commerce reducing or increasing our carbon footprint?

Google’s operations collectively use about 2.26 million megawatt-hours per year to power its global data centres, which is equivalent to the power necessary to sustain 200,000 homes.

In 2018 Google generated $39.12 billion in earnings, out of which it paid $243 million a day for electricity.

This is only an educated guess.

The link between global warming and energy demands is obvious. Surely both of these players should be investing in Green energy.

There’s a deafening silence from pundits and elites and columnists and politicians on our joint self-destruction.

They are simply going on pretending it isn’t happening.

We don’t, as societies or cultures, value learning or knowledge or magnanimity or great and noble things, anymore.

The average person has become a tiny microcosm of the aspirations and norms of elites. We’re the only people on earth who thwart our own social progress, over and over again — and cheer about it.

We are caught in a death spiral now. A vicious cycle from which there is probably no escape. The average person is too poor to fund the very things — the only things — which can offer him a better life:

The result is that a whole society grows poorer and poorer.

Unable to invest in themselves or one another, people’s only real way out is to fight each other for self-preservation, by taking away their neighbour’s rights, privileges, and opportunities — instead of being able to give any new ones to anyone.

Though it’s too late for them to escape, let us hope our governments regulate their profit-seeking algorithms.

All human comments appreciated. All like clicks chucked in the bin.


THE BEADY EYE SAYS: WE CAN NO LONGER BE CERTAIN ABOUT ANYTHING.

Thursday, 6 February 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Digital age., Fourth Industrial Revolution., Human values., Life., Technology, The essence of our humanity., The Obvious., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE SAYS: WE CAN NO LONGER BE CERTAIN ABOUT ANYTHING.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Technology, The Future of Mankind

 

(Twenty-four minute read)

We have no idea how the world will look in twenty years, never mind fifty, when most of this generation will be in their seventies, but it is now becoming clear beyond any doubt that AI and its algorithms are drastically changing the world we live in, for both good and bad.

One thing is certain: artificial intelligence will have, and is having, a more profound effect than electricity or fire.

It will not just hack our lives and our brains; it will hack our very existence.

It might well warn us about climate change or the coronavirus, but it will, as it already does, manipulate our needs, wants and beliefs. It will effectively be controlling people for both commercial and political purposes.

Given the force of this technology, to have any control left we need meaningful regulation; if not, we might as well just surrender to the algorithms, which are becoming so complex that no one will understand them.

The reality is that most of us are giving rivers of free information to Big Data, to the extent that we will soon be unable to think for ourselves.

If we don’t get a grip, by the time you reach your seventies your future and the future of the next generations will be decided at random by unelected platforms.

If this is so, decision-making by us all will become a thing of the past.

The outlook for AI is both grim and exciting.

Already we see data collected affecting elections, with our ability to know what is fake and what is true at the mercy of Social Media run by algorithms.  

We all know their faces: Google, Microsoft, Amazon, Facebook, Apple, Baidu in China, Twitter, Alibaba, to name a few who are already transforming the basic structure of life.

Taken together they form a global oligopoly.

These unregulated platforms are competing for dominance all with a conflict of interests. Hence their algorithms.

The chance of them introducing self-regulation that will affect or stop their own development is pie in the sky.

It’s time we stopped thinking about AI in purely scientific terms.

Why?

Because algorithms are making critical decisions about our lives and the tech-driven approach to governance is growing. Because any particular scenario will be far from what we think is true today and we are running out of time to do anything about it.

If we are to call a spade a spade, there is little or no understanding of how to regulate these emerging technologies. Even if there were, governments and world institutions are largely unequipped to create and enforce any meaningful regulation for the public benefit.

The problem is: how does one regulate an algorithm that learns?

You might be happy ceding all authority to algorithms and their owners; if so, you don’t have to do anything – they will take care of everything.

If not, consider that algorithms are watching you right now to ensure that you do not read this post, and if you do read it, they will use one of the most powerful tools in their big-data arsenal – split and divide – fake news and the repetition thereof, for example.

It won’t happen overnight, since development cycles often take years, but our collective past will become a less reliable guide and we will have to adapt to the unknown.

Unfortunately, teaching the unknown without mental balance is a disaster in waiting.

It might be easy to laugh at this now, but unless we make our voices heard, rather than just our like-clicks, it will not be climate change that changes us but the race bias already programmed into the tech world.

We’re starting to see many examples where these algorithms are prone to the kinds of biases and limitations that we see in human decision-making, and increasingly we are moving towards algorithms that learn more and more from data.

I say that learning from this data almost institutionalizes the biases.

Why?

Because they are trying to personalize the media they curate for us. They’re trying to find for us more and more of the kinds of content that we already consume.  
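A toy sketch in Python, and emphatically not any platform’s real system, of the feedback loop being described: a recommender that ranks items by what we already clicked keeps serving up more of the same.

from collections import Counter

def recommend(click_history, catalogue, k=2):
    """Rank catalogue items by how often their topic was already clicked."""
    topic_counts = Counter(item["topic"] for item in click_history)
    ranked = sorted(catalogue, key=lambda item: topic_counts[item["topic"]], reverse=True)
    return ranked[:k]

history = [{"topic": "outrage"}, {"topic": "outrage"}, {"topic": "sport"}]
catalogue = [{"id": 1, "topic": "outrage"}, {"id": 2, "topic": "science"}, {"id": 3, "topic": "outrage"}]
print(recommend(history, catalogue))  # more outrage, nothing new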

So what if anything can be done?

Even if we do eventually introduce regulations they will have little effect unless we find a way of sharing the benefits of AI.

The problem is that our institutions and our education models are not able to keep up with the developments in artificial intelligence. We are becoming more and more detached from decision-making, contribution and reward.

So our governments are leaving it to the market, to the big tech companies themselves.

If you are expecting some kind of warning when AI finally gets smarter than us, then think again.

I say our algorithms are hanging out with the wrong data: profit for profit’s sake.

In reality, our electronic overlords are just getting started, with the smartphone, the iPad, Alexa and the rest taking control. We have to think about other measures, such as whether there is a social contribution, and what the impact of a given algorithm is on society.

This requires transparency.

But how do you create transparency in a world that is getting so complex?

Here is my solution.

Pharmaceuticals are considered among the most highly regulated industries worldwide, and every country has its own regulatory authority when it comes to the drug development process.

(The World Health Organization (WHO), Pan American Health Organization (PAHO), World Trade Organization (WTO), International Conference on Harmonization (ICH) and World Intellectual Property Organization (WIPO) are some of the international regulatory agencies and organizations which play an essential role in all aspects of pharmaceutical regulation related to drug product registration, manufacturing, distribution, price control, marketing, research and development, and intellectual property protection.)

Why not put in place a new world governing body to test and control AI algorithms – to act as a guardian of our basic human values?

If this is not done it will remain impossible to truly cooperate with an AI or a corporation until such entities have values in the same sense that we do.

So:

All companies already using algorithms should be legally required to submit the software programs running their algorithms for audit by an independent team, to ensure that our human values are complied with.

This audit could be done under a United Nations programme agreed worldwide.

The audit process (because algorithms are constantly evolving as they gather more data) would have to be somewhat continuous, say every ten years, similar to a control technique. We might also need an algorithm to monitor the auditing algorithm, to ensure it is not contaminated while it goes through its refresh cycle of the algorithm it is auditing.

Then they must be made transparent with a certification of acceptable behaviour.   

Transparency for end users actually is very basic.

It’s not like an end-user wants to know the inner details of every algorithm we use.

But we would actually benefit knowing what’s going on at the high level.

For example, what kinds of data are being used by the algorithms to make decisions?

Recommended transparency measures:

Keeping in mind that these algorithms are being deployed and used by humans, and for humans, anyone impacted by decisions made by algorithms should have the right to a description of the data used to train them, and details as to how that data was collected.
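
As a purely illustrative sketch of what that right could look like in practice, a machine-readable transparency record could be published alongside each deployed algorithm. The field names below are invented for the example and are not an existing standard.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AlgorithmTransparencyRecord:
    operator: str                      # who runs the algorithm
    purpose: str                       # what decisions it makes
    training_data_sources: List[str]   # where the training data came from
    collection_methods: List[str]      # how that data was collected
    last_audit_date: str               # when it was last independently audited
    known_limitations: List[str] = field(default_factory=list)

record = AlgorithmTransparencyRecord(
    operator="ExampleCorp",
    purpose="Ranking news stories for individual users",
    training_data_sources=["click history", "search logs"],
    collection_methods=["passive logging of user activity"],
    last_audit_date="2020-01-15",
    known_limitations=["may reinforce existing consumption patterns"],
)
print(record)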

The public have little understanding of, or access to, information about how governments are using data, much of it collected quietly, to feed the algorithms that make decisions about everyday life. And, in an uncomfortable twist, the government agencies themselves often do not fully understand how algorithms influence their decisions.

Having more and more data alone will not solve the problems of gender bias and race bias.

Perhaps the notion of control may only be an illusion.

It won’t be long before they are latching on to life forms. For example, there is a type of machine-learning algorithm known as a neural network, modelled on the human brain. What happens in these algorithms is that they take lots of data and learn to make decisions the way humans have made them.

I think this field hasn’t yet emerged.

Humans aren’t changing that much. But the algorithms, the way they’re created, the technological side of it, continues to change, continues to evolve. And trying to keep those things in sync seems to be the greatest challenge.

We live in a world where algorithms are deciding who gets what; we need to understand how machine decisions are made, and how humans and machines can work together.

We are going to use these systems so much that we have to understand them at a deeper level, and we can’t be passive about it anymore, because the consequences are very significant – whether we’re talking about democracy, curating news stories for citizens, use by doctors in medicine, use in the courtroom, and so on.

As we roll out algorithms in more and more important settings, it is going to be extremely important that we start understanding what drives trust in these machines, and what the socially important outcomes of interest are, so that we can measure these algorithms against those outcomes – fairness and so on.

Given everything we know about the world, and indeed the universe as a whole, does anyone seriously believe that nationalism and populism will help us with this technological problem?

Let’s talk about what data are collected about us.

It is far too late to be talking about privacy; that is what gets abused.

Let’s fight against everything we can control that limits our freedom, whether it’s an algorithm, a hungry judge, or a greedy state backing the wrong econometric model…

We need to rethink how we do education, we have to rethink how we do regulation, and firms also need to stand up and do a better job of auditing and taking responsibility as well.

Of course, none of this will happen.

Humans are more likely to be divided between those who favour giving algorithms and AI significant authority to make decisions and those opposed to it, with both sides justifying their positions, while algorithmic logic drives greed and inequality to the point where we lose control of transparency completely.

To stay relevant, as Yuval Noah Harari says in his book 21 Lessons for the 21st Century, “we will need to be asking the questions – how am I, where am I.”

Their testing rarely goes beyond technical measures, which is causing society to become more polarized, making it less likely that we can appreciate other viewpoints.

Just knowing that an algorithm made the decision would be a good place to start.

Was an algorithm used?

If so who does it belong to?

What kinds of data did the algorithm use?

Today, algorithms and artificial intelligence are creating a much more frightening world. High-frequency trading algorithms already rule the stock exchanges of the world.

Personally, I would neither overestimate nor underestimate the role and threat of algorithms. Behind every smart web service is some smarter web code.

So we need to make sure their design is not only informed by knowledge of the human users, but the knowledge of their design is also suitably conveyed to human users so we don’t eliminate the human from the loop completely.

If not, they will become a black box, even to the engineer.

All our lives are constrained by limited space and time, limits that give rise to a particular set of problems that are being exploited by profit-seeking algorithms. 

There’s so much data out there to be analyzed. And right now it’s just sitting there not doing anything. So maybe we can come up with a solution that will at least get us started on it.

It is a fascinating topic because there are so many variables involved.

All human comments appreciated. All like clicks and abuse chucked in the bin. 


THE BEADY EYE ASKS: WHY IS IT THAT WE ARE ALLOWING ALGORITHMS TO RUN OUR LIVES?

Tuesday, 28 January 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Climate Change., Digital age., DIGITAL DICTATORSHIP., Evolution, Fourth Industrial Revolution., HUMAN INTELLIGENCE, Human values., Humanity., Inequality, Innovation., Modern day life., Post - truth politics., Reality., Robot citizenship., Sustaniability, Technology, The common good., The Future, The Internet., The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., WiFi communication.

≈ Comments Off on THE BEADY EYE ASKS: WHY IS IT THAT WE ARE ALLOWING ALGORITHMS TO RUN OUR LIVES?

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Distribution of wealth, Inequility, Technology, The Future of Mankind, Visions of the future.

(Twenty-minute read)

Technology is getting increasingly personal.

With algorithms becoming the masters of social media, are we all just becoming clickbait?

Devices are providing immediate information about our health and about what we see, where we go and where we have been.

Our lives are being shaken to their very core.

With 5G technology, what we experience at the moment will pale in comparison to the vast array of possibilities carried by this new generation of wireless connectivity, which is being built on the foundations of the previous one.

It will allow millions of devices to be connected simultaneously.

All stakeholders – business, government, society and individuals – will have to work together to adjust so these technologies and rapid changes are harnessed for the development of all, not just profit.

Swathes of the globe will be left behind.

Regardless, it is no longer just about repetitive factory jobs, but rather about an increase in inequality globally.

It is not only a moral imperative to ensure that such a scenario does not happen; it would also pose a risk to global stability through channels such as global inequality, migration flows, and even geopolitical relations and security.

We already live in a world that has been profoundly altered by the Fourth Industrial Revolution. Yet there is not much debate on the likely size of the impact.

Why?

Because there are such divergent views it is difficult to measure.

But within the next decade, it is expected that more than a trillion sensors will be connected to the internet. By 2024, more than half of home internet traffic will be used by appliances and devices that are connected to internet platforms.

With almost everything connected, it will transform how we live never mind how we do business.

If there is no trusted institution to regulate it, we can kiss our arses goodbye.

Now is the time to make sure it is changed for the better.

The internet of things will create huge amounts of data, raising concerns over who will own it and how it will be stored. And what about the possibility that your home or car could be hacked?

The internet is great for ideas, but ultimately, the things that will amaze you are not on your computer screen.

Artificial intelligence may well invent new life forms, but if we as humans do not contrive and manage globally acceptable ethical parameters for all its forms – bioengineering, gene editing, nanotechnology, and the algorithms that run them – we are more than idiots.

As Yuval Noah Harari says in his most recent book (21 Lessons for the 21st Century), “There is no such thing as ‘Christian economics’, ‘Muslim economics’ or ‘Hindu economics’”, but there will be algorithm economics, run by Big Brother.

The digital age has brought us access to so much information in just a few clicks of the mouse or the remote control – everything from the news to TV programmes – with the internet coming to glorify sensationalism rather than giving us the truth.

The question is.

Are the technologies that surround us tools that we can identify, grasp and consciously use to improve our lives?

Or are they more than that:

Powerful objects and enablers that influence our perception of the world, change our behaviour and affect what it means to be human?

What can we do?

The Second and Third Industrial Revolutions have led us to this one, the Fourth Industrial Revolution, which can be described as the advent of “cyber-physical systems” involving entirely new capabilities for people and machines.

Unlike previous revolutions, it is not the world as a whole that will see its benefits or disadvantages; it is individuals and groups that could win, or lose, a lot.

Unfortunately, expanded connectivity does not necessarily lead to expanded or more diverse worldviews; it will be the opposite, with our increased reliance on digital markets.

At the moment it’s just not very evenly distributed, nor will it be.

At best we can moan about it and hope that climate change shifts our reliance on biomass as primary sources of energy.

Back to Clickbait.

The issue with clickbait is that the reader or site visitor is being manipulated into clicking something that is misleading.

Clickbait is not one-dimensional. Each time you run a Google search, scan your passport, make an online purchase or tweet, you are leaving a data trail behind that can be analysed and monetized.

Most clickbait links forward a user to a page that requires payment, registration or a series of pages that help drive views for a specific site.

It can also point to any web content that is aimed at generating online advertising revenue.

We’re all guilty of gullibly clicking links online, but clickbait websites are notorious for spreading misinformation and creating controversy in the name of generating hits.

Have you not ever felt that you’re being played as dumb individuals whenever you watch the news or scroll through a media site?

Thanks to supercomputers and algorithms, we can make sense of massive amounts of data in real-time. Computers are already making decisions based on this information, and in less than 10 years computer processors are expected to reach the processing power of the human brain. A convergence of the digital, physical and biological spheres challenging our notion of what it means to be human.

Today, 43% of the world’s population is connected to the internet, mostly in developed countries.

Cooperation is “the only thing that will redeem mankind”.

We can use the Fourth Industrial Revolution to lift humanity into a new collective and moral consciousness based on a shared sense of destiny – at least until 6G or living robots come along.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASKS: WHY IS THE OBVIOUS SO DIFFICULT TO RECOGNIZE?

Monday, 20 January 2020

Posted by bobdillon33@gmail.com in #whatif.com, 2020: The year we need to change., Algorithms., Artificial Intelligence., Digital age., Facebook, Fourth Industrial Revolution., Google, HUMAN INTELLIGENCE, Human values., Humanity., Life., Modern day life., Nanotechnology, Our Common Values., Post - truth politics., Reality., Robot citizenship., Social Media, Sustaniability, Technology, The common good., The essence of our humanity., The Future, The Obvious., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage., World Organisations., World Politics

≈ Comments Off on THE BEADY EYE ASKS: WHY IS THE OBVIOUS SO DIFFICULT TO RECOGNIZE?

Tags

Algorithms., “Crises” facing humanity., Common sense., The Obvious.

 

(Twelve-minute read) 

We live in a world where the obvious cannot be addressed.

Each and every aspect of our daily lives, work, relationships are somehow influenced or mediated by technology today, not only as individuals but collectives.

It makes one wonder about the sheer volume of ignorance which not only allows the same problems to persist decade after decade but to even get worse.

It is obvious that our very sustainability is under threat, but we remain “oblivious”.

Why? 

Consider the paradoxical and strategic implications of the fact that people do not perceive things being too small or too big, too far away or too close, too wide or too narrow, too unimportant or too important for us, too slow and gradual or too sudden and fast, always present or usually absent, too often repeated or not often enough to be remarked, too general, complicated and abstract or too simple, too respectable or too unworthy, too familiar or too alien, too similar or too different too few or too many… Imagine the practical implications of such blindness!

Some of the biggest things around us dissolve into background scene, too huge to count and seemingly too big to fail.

To defeat this blindness we must ask: what exactly is obvious? Why? Obvious to whom? To me? To you? To everybody? Everywhere? All the time?

Decisions about technology should not be irreversibly delegated to technocrats, corporations and tech monopolies. 

We think unknowingly with other people’s thoughts.

The conclusion is that our senses and memories cheat us, our common sense is no good and our judgement false.

It is self-evident that basic assumptions are the riverbeds of our thoughts, the compass of our judgment, our choices and our actions; most of them we inherited from trusted people and from authorities; they look inherent and seem to have been there from eternity, as if out of sight, so that we would not question them.

This is now leading to a world of ready-made thinking, run by the algorithms of Facebook, YouTube, Google, smartphones, Twitter and social media – an invisible prison in which it is easier to observe other people’s basic assumptions than your own, particularly when they are dissimilar to yours. People who have not yet grown into your culture may be useful for detecting your unquestioned beliefs – especially very different people coming from somewhere else, or you yourself, visiting somewhere else.

I do not see much good in convincing people not to trust their own mind; we must instead accept and work around this “blindness” without moving our life into monasteries at the feet of gurus or into laboratories at the feet of the experts of the day.

After a while, you don’t notice. They become references.

The Right to an Algorithmic Opt-Out…

How to notice, by ourselves, the obvious turned imperceptible? How to detect it, how to discern it from the merely neutral “obvious” background? How to evaluate the importance and potential of change of something so evident that it escapes your attention?  How to wake up to it? How to seek and get help? How to help other people to do the same? What to do when people cannot or do not want to see the obvious? How to awaken people?

The question is still “How to open my eyes when they are open already?”

The intelligent reason should visit its basic assumptions, regularly; but it doesn’t.

Our worst enemy in discerning the obvious is certainty: being convinced that we know it all and that the obvious is obvious to us.

The obvious is best disguised as itself. One obvious hides another.

How banal to say that the obvious is that which is right in front of us, readily accessible to our observation, to our senses, or credible knowledge we already have!

With commercial profit-seeking algorithms, this selective blindness carries a hidden price, and thus freedom is diminished.

If you repeat slogans endlessly they will become obvious to you (even some false ones), and you will end up believing them.

The most amazing thing for me is to observe how we only apprehend things fit to our size and relative to us. We do not grasp the incommensurable, out of proportion with us, with which we have no common standard of measurement: the trillions of billions.

Because of compression, we have become an incredibly stupid species.

The obvious known comes alive for us to do something about it only when understanding turns it into a personal image, vivid and simple enough to be of our size; otherwise, we stay paralysed and dumb. 

Perhaps it is because our body believes that big things don’t move and unmoving things are harmless.

Perhaps it’s because we are weak, unable to face them, and we allow our judgment to slumber; we do not see what we do not wish to see, hoping that it will go away or solve itself.

Perhaps only when understood does the evidence become an awareness we are able to respond to, so that we would do something because of what it means.

Perhaps figuring out that the elusive 20th-century social contract is gone, is too enormous for us. Therefore we will go on like cattle to the slaughterhouse. 

Why is this becoming true? 

Because, as Ludwig Wittgenstein states in Philosophical Investigations:

“The aspects of things that are most important for us are hidden because of their simplicity and familiarity. (One is unable to notice something because it is always before one’s eyes.) The real foundations of their inquiry do not strike people at all. Unless that fact has at some time struck them. And this means: we fail to be struck by what, once seen, is most striking and most powerful.”

Only by understanding how and when common sense fails can we improve how we plan for the future. 

Then, question and challenge the obvious at the root: “Why exactly must it be so? Why is it impossible? Who says so? Where is it necessary or impossible? Only here or everywhere? Really?! For whom; for you or for all humanity? With what means? At what size? Within what frame of time? Forever? Which pieces in this puzzle would, if changed, make the impossible possible and the necessary less so?” Maybe you or somebody else, somewhere else, with different means, have other self-evidence.

Where will it end?

Either there will be a technological or psychological breakthrough or we will see worldwide degradation like we’ve never seen before.

Old labels often obscure the obvious. 


 

I’d like to state the obvious:

Problem-solving is the only thing in life that holds value. Anything that isn’t a solution to a problem is pure excess.

The truth is that the world is not a democracy. We don’t all decide what is best – only a select few do.

We are egocentric through and through – but creating a lasting, meaningful change feeds our egos like nothing else.

Unfortunately, creating change takes time, patience and perseverance.

It appears that for every one step we take forward as a global community, we end up taking two steps backwards.

Every problem in the world is a function that is processed in an environment, on a platform with certain bounds, certain rules, and certain major players.

As far as I can see, life has little certain purpose. If there is a real reason for it, then we have to accept that we simply don’t know the reason.

However, don’t give up until you have to – until there is a better, more logical option.

Big ideas can change the world, can’t they?

Of course, we don’t know. Nobody does. It is really about what we want to happen and whether we go out there and make it happen.

Will we be able to shift direction to avoid the worst impacts of climate change?

Yes.

We face risks, called existential risks, that threaten to wipe out humanity.

These risks are not just for big disasters, but for the disasters that could end history.

Nuclear war.

Climate Change.

Bioengineered pandemic.

Superintelligence.

Nanotechnology.

Inequality. 

Unknown unknowns.

Any one of them might mean that value itself becomes absent from the universe.

By confronting them we will get the economy back on its feet again and re-orient our financial institutions so that they cannot place the world in a situation like the one we experienced in 2008.

In the daily hubbub of current “crises” facing humanity, we forget about the many generations we hope are yet to come.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASK’S: IS THE ACCELERATING TECHNOLOGY AND THE ONGOING REVOLUTION IN INFORMATION MAKING THE WORLD SO COMPLICATED IT IS NOW BEYOND OUR UNDERSTANDING.

11 Saturday Jan 2020

Posted by bobdillon33@gmail.com in Algorithms., Artificial Intelligence.

≈ Comments Off on THE BEADY EYE ASK’S: IS THE ACCELERATING TECHNOLOGY AND THE ONGOING REVOLUTION IN INFORMATION MAKING THE WORLD SO COMPLICATED IT IS NOW BEYOND OUR UNDERSTANDING.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Community cohesion, Distribution of wealth, Earth, Technology, The Future of Mankind, Visions of the future.

 

 

The plain truth can often be so obvious as to be invisible.

There are so many obstacles to change on the scale we so desperately need.

We are fast reaching a point that no humans can or will be able to understand the world we live in.

We pass this way just once.

Artificial algorithms are taking over.

Yuval Noah Harari in his latest book ( 21 lessons for the 21st Century) puts his finger on the problem.

“In the coming century biotech and infotech will give us the power to manipulate the world inside us and reshape ourselves, but because we don’t understand our own minds, the changes we will make might upset our mental system to such an extent that it too might break down.”

Surely it’s time we stopped being the free fodder that feeds big data. It’s much harder to struggle against irrelevance than against exploitation.

What will be the point to education if algorithms make us redundant?

It is difficult to discern, worldwide, whether there is any sincere conversation on AI ethics.

Is it being addressed by any of the big tech companies or are they just giving token nods to what is right or wrong, while taking advantage of all human beings out there?

Or is there just pushback from outside organisations?

What we are witnessing is their profit growth while economic disparity worldwide increases at a startling rate. This certainly rings true if one looks at the state of the world, with people judged by their wealth.

So what is the ethics of creating a sentient life form on a planet that is burning?

Perhaps it will be for the best if we continue not to understand the planet we all live on and leave it to AI to sort us out.

Or can we now start contributing to better governance solutions?

If we don’t grasp the nettle soon there will be no coming back.

To have any relevance now and in the future, we need billions to take to the streets to demand the sustainability of our planet (humans voting with their feet, not social media) before profit-making goes underground.

When it comes to making the world a better place, corporations are often accused of apathy (the flip-side of blind self-interest). But if consumers are truly committed to social change, they must answer the same challenge.

If we can get consumers to make mindful shopping choices, to support brands that act responsibly and to purchase goods from those that dedicate a portion of the sale proceeds to causes, we are well on our way to re-purposing everyday purchases.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE SAYS; SO YOU ARE NOW 30 BY THE TIME YOU ARE 70 HERE IS WHAT A DAY IN YOUR LIFE WILL LOOK LIKE.

10 Friday Jan 2020

Posted by bobdillon33@gmail.com in 2020: The year we need to change., Algorithms., Artificial Intelligence., Communication., Dehumanization., Digital age., DIGITAL DICTATORSHIP., Digital Friendship., Evolution, Fourth Industrial Revolution., HUMAN INTELLIGENCE, Humanity., Life., Modern day life., Our Common Values., Reality., Robot citizenship., Social Media, Sustaniability, Technology, The common good., The essence of our humanity., The Future, The Obvious., The state of the World., The world to day., Unanswered Questions., WHAT IS TRUTH, What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE SAYS; SO YOU ARE NOW 30 BY THE TIME YOU ARE 70 HERE IS WHAT A DAY IN YOUR LIFE WILL LOOK LIKE.

Tags

0.05% Aid Commission, Age of Uncertainty, AI, AI systems., Algorithms., Artificial Intelligence., Artificial life.

(Twenty-minute read)

The Dead Sea will be almost completely dried up, nearly half of the Amazon rainforest will have been deforested, wildfires will spread like, umm, wildfire, and the polar ice caps will be only 60 per cent the size they are now.

Wars will involve not only land and sea but space. Superhurricanes will become a regular occurrence.

Should you be worried? Of course not; AI and algorithms are here to guide you.

AI-related advancements have grown from strength to strength in the last decade.

Right now there are people coming up with new algorithms by applying evolutionary techniques to the vast amounts of big data via genetic programming to find optimisations and improve your life in different fields.

The amount of data we have available to us now means that we can no longer think in discrete terms. This is what big data forces us to do.

It forces us to take a step back, an abstract step back to find a way to cope with the tidal wave of data flooding our systems. With big data, we are looking for patterns that match the data and algorithms are enabling us to find patterns via clustering, classification, machine learning and any other number of new techniques.

They find the patterns you or I cannot see. They create the code we need to do this and give birth to learner algorithms that can, in turn, be used to create new algorithms.
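To make the idea concrete, here is a minimal sketch, in Python, of the evolutionary approach described above. The data and fitness function are invented purely for illustration; real genetic programming works on far larger populations and far messier data.

```python
import random

# Toy illustration of an evolutionary search: evolve a numeric threshold
# that best separates two made-up groups of data points. The data and the
# fitness function are invented for illustration only.
DATA = [(x, x > 0.6) for x in [random.random() for _ in range(200)]]

def fitness(threshold):
    # Score a candidate rule by how many points it classifies correctly.
    return sum((x > threshold) == label for x, label in DATA)

def evolve(generations=50, pop_size=20, mutation=0.05):
    population = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fittest half, then refill the population with mutated copies.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = [min(1.0, max(0.0, s + random.gauss(0, mutation))) for s in survivors]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(f"best threshold is about {best:.3f}, fitness {fitness(best)}/{len(DATA)}")
```

Nobody hand-writes the final rule; it emerges from variation and selection, which is the sense in which such algorithms “create the code” themselves.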

So do you remember a time, initially, when it was possible to pass on all knowledge through the form of dialogue from generation to generation, parent to child, teacher to student?  Indeed, the character of Socrates in Plato’s “Phaedrus” worried that this technological shift to writing and books was a much poorer medium than dialogue and would diminish our ability to develop true wisdom and knowledge.

Needless to say that I don’t think Socrates would have been a fan of Social Media or TV.

Machine learning algorithms have become like a hammer in the hands of data scientists: everything looks like a nail to be hit.

In due course, the wrong application or overkill of machine learning will cause disenchantment among people when it does not deliver value.

It will be a self-inflicted  ‘AI Winter’.

So here is what your day at 70 might look like.

Welcome to the world of permanent change—a world defined not by heavy industrial machines that are modified infrequently, but by software that is always in flux.

Algorithms are everywhere. They decide what results you see in an internet search, and what adverts appear next to them. They choose which friends you hear from on social networks. They fix prices for air tickets and home loans. They may decide if you’re a valid target for the intelligence services. They may even decide if you have the right to vote.

7.30 am 

Personalised Health Algorithm report.

Sleep pattern good. Anxiety normal, deficient in vitamin C. Sperm count normal.

Results of body scan sent to health network.

7.35 am

House Management Algorithm Report.

Temperature 65°F. House secure. Windows and doors closed, catflap open. Heating off. Green energy usage 2.3 kWh per minute. (Advertisement to change provider.) Shower running; water flow and temperature adjusted; shower head height adjusted. House natural light adjusted. Confirmation that smartphone and iPad are fully charged. Robotic housemaid programmed.

8 am.

Personalised Shopping/Provisions Algorithm report.

Refrigerators will be seamlessly integrated with online supermarkets, so a new tub of peanut butter will be on its way to your door by drone delivery before you even finish the last one.

8.45 am. Appointments Algorithm.

Virtual reality appointment with a local doctor.

Voice mails and emails and the calendar check.

A device in your head might eliminate the need for a computer screen by projecting images (from a Skype meeting, a video game, or whatever) directly into your field of vision from within.

9 am.

Personalised Financial Algorithm.

Balance of credit cards and bank accounts, including citizen credit/loyalty points. Value of shares/pension fund updated.

10 am. Still in your Dressing gown.

11 am. The self-drive car starts. Seats automatically shift and rearrange themselves to provide maximum comfort. The personalised news and weather algorithm gives a report. The car books a parking spot and places an order for coffee. Over coffee, you rent a robot in Dublin and have it do the legwork for your forthcoming visit – scouting hotels.

12 pm.

Hologram of your boss in your living room.

1 pm.

Virtual work meeting to discuss the solitary nature of remote work.

Face-to-face meeting arranged.

 

2 pm. Home. Lunch delivered.

3 pm. Sporting activity with a virtual coach.

5 pm. Home

7 30 pm.

Discuss and view the Dublin robot’s walk-around video and audio report.

Dinner delivered. Six guests. The home management algorithm rearranges the furniture.

8 30 pm

Virtual helmets on for some after-dinner entertainment.

10 pm 

Ask Alixia to shut the house down – but not before you answer Alixia’s question to score points and a chance to win cash, a holiday, dinner for two, a discount on Amazon or eBay, or a spot of online gambling.

                                                       ———

The fourth industrial revolution is not simply an opportunity. It matters what kind of opportunity it is, for whom, and under what terms.

We need to start thinking about algorithms.

The core issue here is, of course, who will own the basic infrastructure of our future, which is going to affect all sectors of society.

They are not just for mathematicians or academics. There are algorithms all around us and you don’t need to know how to code to use them or understand them.

We need to better understand them to better understand, and control, our own futures. To achieve this we need to better understand how these algorithms work and how to tailor them to suit our needs. Otherwise, we will be unable to fully unlock the potential of this abstract transition because machine learning automates automation itself.

The new digital economy, akin to learning to read, has obscured our view of algorithms. Algorithms are increasingly part of our everyday lives, from recommending our films to filtering our news and finding our partners.

We need to build a solid foundation now for AI governance – to use AI responsibly and to consider the broader implications of this transformational technology’s use.

The world population will be over 9 billion, with the majority of people living in cities.

So here are a few questions at 30 you might want to consider.

How does the software we use influence what we express and imagine?

Shall we continue to accept the decisions made for us by algorithms if we don’t know how they operate?

What does it mean to be a citizen of a software society?

These and many other important questions are waiting to be analyzed.

If we reduce each complex system to a one-page description of its algorithm, will we capture enough of software behaviour?

Or will the nuances of particular decisions made by software in every particular case be lost?

You don’t need a therapist; you need an algorithm.

We may never really grasp the alienness of algorithms. But that doesn’t mean we can’t learn to live with them.

Unfortunately, their decisions can run counter to our ideas of fairness. Algorithms don’t see humans the same way other humans do.

What are we doing about confronting any of this? Nothing much.

So it’s no wonder that people start to worry about what’s left for human beings to do.

All human comments appreciated. All like clicks and abuse chucked in the bin.


THE BEADY EYE ASK’S: IS IT TIME TO REGULATE SOCIAL MEDIA PLATFORMS WITH LAWS.

25 Monday Nov 2019

Posted by bobdillon33@gmail.com in 2019: The Year of Disconnection., Algorithms., Democracy, Digital Friendship., Elections/ Voting, Facebook., Fourth Industrial Revolution., Google, Humanity., Modern Day Democracy., Modern day life., Our Common Values., Politics., Reality., Social Media, The common good., The essence of our humanity., The Obvious., The pursuit of profit., The state of the World., The world to day., Twitter, Unanswered Questions., What Needs to change in the World, World Politics

≈ Comments Off on THE BEADY EYE ASK’S: IS IT TIME TO REGULATE SOCIAL MEDIA PLATFORMS WITH LAWS.

Tags

Algorithms., Capitalism and Greed, Facebook and Society., Google/Amazon/Facebook/Twitter, Platforms regulation/laws., Social Media, Social media platforms., The Future of Mankind

 

 

(Ten-minute read) 

The beady eye is far from the first voice to ask this question and it certainly will not be the last.

We might even come to question whether we still have free will.

There is no doubting that the social web has created amazing opportunities to learn, discover and connect, but as it penetrates our daily lives its downside is becoming more and more prevalent in shaping our future lives and the societies we live in.

If public discussion is shifting increasingly to online fora, and those fora are having more and more influence over democracy, it becomes increasingly important to apply principles to them.

Honest political debate is essential for the health of a democracy.  

If discussions of import move into space where they can be readily censored, then we will simply no longer live in a society with a free exchange of ideas, because the playing field will always be tilted.

One only has to look at how social media platforms are amplifying what is wrong with the world.  

While we all reveal a huge amount of personal information online, we are losing the ability to determine the honest facts on which democracy depends.

Basically, companies that run social media platforms are monopolies or near-monopolies in their areas of operation, and the only way we can achieve the desired outcomes is through clear, effective legal regulations. 

We can’t always control how others use their platforms but we can apply the same regulations that govern all other forms of Media.

The public cannot rely on these companies’ self-regulation, because self-regulation raises more questions than it answers.

The fact is that the formation of a platform takes place in a vacuum, whereas the formation of any competitor does not, so they cannot be considered parallels in any way.

If we take companies like Facebook and Google, they both derive most of their revenue from advertising. They essentially constitute a duopoly because they have access to the best data about individuals. Every memory, picture, emoji, song, video, link, gripe, fear, hope, want, dream and bad political opinion posted is mined and monetized as data.

As a result of their algorithms, they are creating and reinforcing divided and insular online communities that do not interact with people or information with which they disagree.

At the end of the day, how Facebook and Google conduct their businesses undermines privacy and raises questions about ethical behaviour in the uses of our information and their role in society.

The Internet is a “utility” like water or electricity. It is essential to modern life, not an optional subscription service.

Determining how to regulate Facebook or any other platform may first require some kind of definition of what it is.

Facebook brags about connecting us to our family and friends – but it is also about directly influencing the outcomes of elections across the globe.

It sits on top of industries including journalism, where it, together with Google, essentially controls the distribution channels for online news and, in effect, the way people discover information about politics, government and society.

They (Google, Facebook, Twitter, etc.) have figured out how to take advantage of this dynamic to distribute false information about political candidates and hot-button political issues in order to drive up traffic and advertising revenue.

Protecting our community is more important than maximizing their profits.

They are given protections so that no one can sue them for any reason – that is, neither Google nor Facebook is responsible for the fake news that appears on their sites.

They are completely shielded from any responsibility for the content that appears on their service.

Changes to legal protection (which has been interpreted by judges to provide a safe harbour for online platforms even when they pay to distribute others’ content and decline the option to impose editorial oversight) would likely be devastating to online platforms like Google and Facebook and would transform the way people interact across the entire internet.

However, without that legal protection, sites like these could be held responsible for libellous comments posted by readers, Google could lose lawsuits over potentially false or defamatory information surfacing in search results, and Facebook could be sued for any potentially libellous comment made by anyone on its platform against any other person.

The legal bills to defend against libel and defamation claims would be enormous.

We all need protection and the ability to request platforms to provide us with control over online information by making it accessible and removable at an individual’s request.

The government, on the other hand, has a regulatory intent to protect citizens from content that is obscene or violent.

Should Facebook and their like be regulated?

A question that is never going to end. 

However, until we recognize that there is no fool-proof safeguard to keep horrific content away from the eyes of children, we rely on huge fines to the detriment of us all.

Until then, with all internet platforms deflecting criticism, social media will be more psychologically damaging than anyone expected.

We need a radical shift in the balance of power between the platforms and the people.

It is beyond comprehension that we tolerate the present position.

Or is it, when you see what follows below?

Would you ever be prepared to use a nuclear weapon?

This question is increasingly put to politicians as some kind of virility test.

The subtext is that to be a credible political leader, you must be willing to use an indiscriminate weapon of mass destruction.

We should be baulking at the casual way in which political discourse on this topic has developed, which is politically unacceptable and morally despicable.

If a mainstream politician unblinkingly said that they would use chemical weapons against civilians there would be uproar. If a self-proclaimed candidate for prime minister boasted that they would commit war crimes, it would be a national scandal. Nuclear weapons should be seen no differently. 

It’s time that nuclear advocates spelt out the reality of what their position means.

The human race is so good at speaking, it’s lost the art of listening.

It might be easy to brush away the febrile atmosphere online as a nasty byproduct of free expression. I don’t want Facebook having everyone’s verified identities. I do want their platform and other platforms to be held legally responsible for content that is false, racist, hateful, right-wing fascist propaganda.

I do know that if the big platforms, as they already do in part, required some verifiable information to back up use, we could tame this wild west with legal requirements.

I’ll give up on the consensus-building when I can open a platform knowing who to hold legally responsible.  


All human comments appreciated. All like clicks and abuse chucked in the bin. 


THE BEADY EYE ASK’S: WILL ARTIFICIAL INTELLIGENCE SEE THE END OF DETERMINING MEANING.

23 Saturday Nov 2019

Posted by bobdillon33@gmail.com in Algorithms.

≈ Comments Off on THE BEADY EYE ASK’S: WILL ARTIFICIAL INTELLIGENCE SEE THE END OF DETERMINING MEANING.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Technology, The Future of Mankind, Visions of the future.

 

(Five-minute read)

We all know that new data-driven technology is transforming our society.

Our digital society is creating new and profound challenges and carries significant ethical risks for us all.

To be human means that you are persuadable in every single moment.

If you need any evidence of this you don’t have to look any further than the current impeachment hearing in the USA and England with its general election.

The troubling influence algorithms have on how we make decisions is now threatening the foundations of our societies.

Algorithms are now usually a component in a broader decision-making process involving human decision-makers. Far from being neutral and all-knowing decision tools, complex algorithms are shaped by humans, who are, for all intents and purposes, imperfect.

Determining the meaning of sensory input is one of the most constantly ongoing, and important, functions of any brain. By refining our awareness, we transform existence into beauty and thinking into philosophy.

Our increasing reliance on artificial intelligence is destroying the meaning of life.

Few of us understand them or the implications they are having or will have.

They are taking the creativity away from the decision-making process for instance.

They could turn us humans into mush-minded creatures who can’t be bothered to make our own choices.

What if the new algorithm rated my friend and another woman as a 90% match, would that mean he would simply trust the algorithm and go straight for a proposal of marriage?

Important parts of our lives are being run by AI without sufficient scrutiny.

Self-learning systems are not autonomous systems; however, they could lead to a Master Algorithm that could match Einstein’s theory of relativity in its world-transformative power.

Algorithms are changing our relationship with each other, with doctors, police, politics, work, health care.

They are at work wherever you look. When you do a web search, machine learning chooses the results you get – iterative processing allows the software to learn automatically from patterns. Algorithms are not only used to make a profit but to make life-changing decisions, from your credit rating to how, and for whom, you will vote.

With machine learning, you are programming a computer to learn by itself.

Amazon uses them to recommend products; Netflix uses them to recommend movies; Facebook and Twitter use them to choose which posts to show you.

Pretty much everything that happens online involves machine learning.
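As a rough sketch of how such a recommendation might be computed (the ratings below are invented, and real systems such as Netflix’s are vastly more elaborate), an unseen item can be scored by its similarity to items a person has already rated highly:

```python
from math import sqrt

# Toy user-item ratings (hypothetical). 0 means "not rated".
ratings = {
    "alice": {"film_a": 5, "film_b": 3, "film_c": 0},
    "bob":   {"film_a": 4, "film_b": 0, "film_c": 4},
    "carol": {"film_a": 1, "film_b": 5, "film_c": 2},
}

def cosine(u, v):
    # Similarity between two rating vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(user, items=("film_a", "film_b", "film_c")):
    # Score each unrated item by its similarity to items the user already rated highly.
    scores = {}
    for candidate in items:
        if ratings[user][candidate]:
            continue  # already rated
        col_c = [ratings[u][candidate] for u in ratings]
        score = 0.0
        for other in items:
            if other != candidate and ratings[user][other]:
                col_o = [ratings[u][other] for u in ratings]
                score += cosine(col_c, col_o) * ratings[user][other]
        scores[candidate] = score
    return max(scores, key=scores.get) if scores else None

print(recommend("alice"))  # suggests the unseen film most similar to what Alice liked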

So are we putting too much trust in smart systems that learn from data?

These systems are only as good as the data they learn from.

Their goal is to provide software that can reason on input and explain on output by becoming classifiers and predictors.

Alexa and Google are examples. Like most machine-learning algorithms, Google not only analyses our behaviour: it shapes it. This goes round and round until one viewpoint dominates people’s thinking. It controls the information its algorithm pays attention to, and the secretive nature of algorithms means people cannot scrutinise the decisions they make.

Since much of the data fed into AIs is imperfect and biased, the decision processes built on top of them need to be made open to scrutiny.

Why?

Because algorithms learn differently than us. They look at things differently, as the toy example below illustrates.
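A toy illustration of that difference, and of the point above about imperfect data: a naive model trained on a skewed sample simply reproduces the skew. The figures are invented; no real lender or dataset is being described.

```python
from collections import defaultdict

# Hypothetical historical loan decisions, deliberately skewed against group "B".
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 30 + [("B", 0)] * 70

# A naive "model": predict whatever outcome was most common for each group.
counts = defaultdict(lambda: [0, 0])
for group, approved in history:
    counts[group][approved] += 1

model = {group: int(c[1] > c[0]) for group, c in counts.items()}
print(model)  # {'A': 1, 'B': 0} - the skew in the data becomes the rule
```

The algorithm has not been told to discriminate; it simply reflects whatever pattern the data contains, which is exactly why scrutiny of the data matters.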

It might enhance the speed, precision and effectiveness of human efforts but in the long run, it will replace our decision making.

They are also moving into areas where the benefits to those applying them may not be matched by the benefits to those subject to their ‘decisions’. We have to demand to know what kind of influence these algorithms have over us.

Google is an algorithm that we are all familiar with, but it is far from being the only algorithmic decision-making tool influencing our daily lives.

What can be done to combat their growing influence?

Governments should play their part in the algorithm revolution in two ways:

Governments should produce, publish, and maintain a list of where algorithms with significant impacts are being used.
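What an entry in such a register might contain is sketched below; the field names are purely hypothetical, chosen only to show the kind of information the public would need.

```python
from dataclasses import dataclass, field

# Hypothetical schema for one entry in a public register of
# significant algorithmic systems (field names invented for illustration).
@dataclass
class AlgorithmRegisterEntry:
    name: str                      # e.g. "Benefit fraud risk score"
    operator: str                  # department or company responsible
    purpose: str                   # what decision the algorithm informs
    data_sources: list = field(default_factory=list)
    affected_groups: list = field(default_factory=list)
    human_review: bool = True      # is there a human in the loop?
    audit_contact: str = ""        # who to contact to challenge a decision

entry = AlgorithmRegisterEntry(
    name="Example scoring system",
    operator="Example Department",
    purpose="Prioritise case reviews",
    data_sources=["case history", "demographic records"],
    affected_groups=["benefit claimants"],
    human_review=True,
    audit_contact="oversight@example.gov",
)
print(entry)
```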

The index to the internet should be a public instrument, owned and controlled by the public. It should be a public utility – an index, pure and simple, not a tracking device or a mechanism of manipulation – putting the control of algorithms back into the hands of the people affected by them.

Governments should also provide oversight of such algorithms where they are used by the public sector, and co-ordinate departments’ approaches to the development and deployment of algorithms and to partnerships with the private sector.

Governments should offer significant rewards for societies that can find the right combination of market-driven innovation and regulation to maximise the benefits of data-driven technology and minimise the harms.

We must subsequently make decisions that require value judgements and trade-offs between competing values.

New functions and actors, such as third party auditors, may also be required to independently verify claims made by organisations about how their algorithms operate.

Many of the most consequential algorithms currently being used in the public and private domains are complex and opaque, making it hard to attribute accountability to their actions.

Humans are often trusted to make these trade-offs without having to explicitly state how much weight they have put on different considerations. Algorithms are different. They are programmed to make trade-offs according to unambiguous rules.

The ethical questions in relation to bias in algorithmic decision-making vary depending on the context.

For example, high-frequency trading is an algorithm-fuelled method of buying and selling stocks – among other things.
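Real high-frequency trading reacts to market data in microseconds, but the basic shape of an algorithmic trading rule can be sketched in a few lines. The prices below are invented and this is a toy moving-average rule, not an actual HFT strategy:

```python
# Toy moving-average crossover rule on invented prices.
prices = [100, 101, 103, 102, 105, 107, 106, 104, 103, 101, 99, 98]

def moving_average(series, window):
    return sum(series[-window:]) / window

position = "flat"
for day in range(5, len(prices) + 1):
    seen = prices[:day]
    fast, slow = moving_average(seen, 3), moving_average(seen, 5)
    # Buy when the short-term average rises above the long-term one, sell when it falls below.
    if fast > slow and position != "long":
        position = "long"
        print(f"day {day}: buy at {seen[-1]}")
    elif fast < slow and position == "long":
        position = "flat"
        print(f"day {day}: sell at {seen[-1]}")
```

The rule trades without ever asking whether its trades are fair or useful to anyone else, which is the ethical gap the surrounding paragraphs point to.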

The fact is that while the problematic implications of many algorithms have been exposed, we may have only just begun to skim the surface.

Improving transparency, however, is no easy task.

Companies with algorithmic products would lose their competitive edge if they were forced to make their algorithms public.

Transparency is not enough, in fact, because algorithms are quite complicated.

This is not a simple matter.

What is required is a means of certification as to whether an algorithm is safe or fair to use.

Who knows, an algorithmic slider could, one day, form part of our daily lexicon. But, in the meantime, algorithms need to be managed; ensuring those with the power to shape our lives do so with some code of conduct.

In the end, all technology revolutions are propelled not just by discovery, but also by business and societal need. We pursue these new possibilities not because we can, but because we must.

As I have already said, “To be human means that you are persuadable in every single moment.”

This morning, without any action on my part, I received through the post an Amazon Prime Video card.

All human comments appreciated. All like clicks and abuse chucked in the bin.

