
bobdillon33blog

~ Free Thinker.


Tag Archives: Google

THE BEADY EYE SAYS: IT’S NOW OR NEVER FOR HUMAN RIGHTS.

11 Thursday Oct 2018

Posted by bobdillon33@gmail.com in Algorithms., Artificial Intelligence., Democracy, Evolution, Facebook, Google, Humanity., Modern day life., Privatization, Sustainability, Technology, The cloud., The Obvious., The world today., Unanswered Questions., What Needs to change in the World

≈ Comments Off on THE BEADY EYE SAYS: IT’S NOW OR NEVER FOR HUMAN RIGHTS.

Tags

Algorithms trade., Algorithms., Artificial Intelligence., Big Data, Distribution of wealth, Google, Technology, The Future of Mankind, Visions of the future.

 

(Fifteen-minute read)

This is not a spectator sport:

It can be nearly impossible to credibly predict all the positive possibilities and negative implications for how we live, work, govern, and organize arising from the deployment of AI.

The fact that we are deeply uncertain about how technologies will evolve in the years and decades ahead makes human rights due diligence of AI very challenging.

The fact that the rapid development of AI makes securing access to remedy especially difficult, since humans often cannot cognitively understand how an AI system reaches a decision.

The fact that Artificial Intelligence (AI) technology is increasingly being used by businesses, governments, and other institutions to augment many fields of human endeavor.

The fact that methodologies for implementing respect for human rights may need to integrate strategic tools such as strategic foresight, futures thinking, and scenario planning.

The fact that the global “platformization” of content (i.e., the rise of Video on Demand and online streaming platforms) shows us that guarantees of people’s access to the culture of their selection may now become beholden to digital intermediaries.

Proponents believe that the further development of AI creates new opportunities in health, education, and transportation, will generate wealth and strengthen economies and can be used to solve pressing social issues. However, the rapid growth of AI raises important questions about whether our current policies, legal systems, business due diligence practices, and methods to protect rights are fit for purpose.

The significant expansion of data collected and analyzed may also result in increasing the power of companies with ownership over this data and threaten our right to privacy.

HERE ARE A FEW OF THE PROBLEMS THAT ARE NOT DISCUSSED.

As the digital paradigm evolves, the pathway for human rights is likely to become more complicated, making appropriate regulation more important than ever. The realization of ESCR and the right to development centers on data democracies that are accountable.

Global policy discourses and frameworks around data have skewed the digital innovation tide in favor of developed countries.

Data and technological arrangements in the global South and North worryingly point to a wholesale private capture and consolidation of critical data regimes in the developing world: trade, agriculture, health, and education.

Not only does this leave citizens in developing countries vulnerable to acute privacy violations, but it also bears decisively on their economic, social and cultural rights (ESCRs). For example, in India, Wal-Mart’s acquisition of the homegrown e-commerce unicorn Flipkart poses very serious risks for the livelihoods of small producers and traders.

Algorithm-based decision-making by companies could also perpetuate human bias and result in discriminatory outcomes, as they already have in some cases.

Although an algorithm is meant to complement, not necessarily displace, human discernment, it is not hard to imagine a future where humanitarian assistance to refugees becomes predicated on their (technology-proven) ability to viably assimilate and contribute to their host economies.

Could the trade-off for a smoother resettlement process be the exclusion of those that the algorithm will one day write off as “inadmissible” and “unsolvable”?
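The worry about algorithms perpetuating bias can be made concrete with a toy sketch. Everything below (group names, numbers, the scoring rule) is invented for illustration: a "model" that learns only from past human approval decisions will reproduce whatever prejudice those decisions contained.

```python
from collections import defaultdict

# Invented historical decisions: group "B" was approved far less often,
# for reasons that had nothing to do with merit.
history = [
    ("A", "approve"), ("A", "approve"), ("A", "approve"), ("A", "reject"),
    ("B", "reject"), ("B", "reject"), ("B", "reject"), ("B", "approve"),
]

def train(records):
    """Learn each group's historical approval rate."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
    for group, outcome in records:
        counts[group][1] += 1
        if outcome == "approve":
            counts[group][0] += 1
    return {g: approvals / total for g, (approvals, total) in counts.items()}

def predict(model, group):
    """Approve whenever the group's learned rate exceeds 50%."""
    return "approve" if model[group] > 0.5 else "reject"

model = train(history)
print(predict(model, "A"))  # approve: the old bias has become the rule
print(predict(model, "B"))  # reject
```

Nothing in this code mentions prejudice, yet group membership alone now decides the outcome. That is the sense in which algorithm-based decision-making can perpetuate human bias, and why such models need the institutional audits discussed below.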

Technology-based decision-making also raises important questions on how the right to development will be realized.

Artificial intelligence is undermining society by promising unimaginable benefits while developing without any intervention from governments or other world organizations.

As always, we humans react to crises when it’s too late.

Governments must create policies for effective data sharing between governments and the private sector for sectors that are of critical social importance.

More importantly, governments must create a data commons with independent oversight.

For example:

The municipality of Curitiba in Brazil, for instance, has taken the lead in passing local legislation that mandates anonymized data sharing between the local government and the ride aggregator Uber. The intention is to tap into Uber’s large and rich data sets towards better city planning and traffic management outcomes.
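A minimal sketch of what "anonymized data sharing" of ride data might look like; the field names, salt, and rounding thresholds are invented for illustration, and real anonymisation schemes need far more care than this (salted hashes and coarse locations alone can still be re-identified).

```python
import hashlib

def anonymise(trip, salt="city-secret"):
    """Replace the rider's identity with a salted hash and coarsen GPS
    coordinates so individual homes cannot be pinpointed, while keeping
    the fields a city planner needs (rough location, time of day)."""
    return {
        "rider": hashlib.sha256((salt + trip["rider"]).encode()).hexdigest()[:12],
        "lat": round(trip["lat"], 2),   # roughly a 1 km grid
        "lon": round(trip["lon"], 2),
        "hour": trip["hour"],           # keep time-of-day for traffic models
    }

# Invented example record for a pickup in Curitiba:
trip = {"rider": "alice@example.com", "lat": -25.42894, "lon": -49.26714, "hour": 8}
shared = anonymise(trip)
print(shared["lat"], shared["lon"])  # -25.43 -49.27
```

Because the hash is deterministic, the city can still count repeat trips and join records across datasets without ever learning who the rider is, which is the trade-off such mandates aim at.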

Governments must invest in the idea of “data as a public good” so it can work to enhance human rights.

Although nascent, experiments with models for managing big data repositories are increasing.  Such repositories can encourage domestically led innovation, with local start-ups and public agencies taking the lead in developing appropriate AI-based solutions for social problems.

These are pressing policy challenges, and such prediction models need to be closely and continuously tracked for possible social distortions and subjected to institutional audit. The biases in AI are often the biases of humans. People will not rely on technology they do not trust.

Society needs to come together to consider these questions, explore solutions, and deploy AI that puts people first, protects human rights, and deserves the public’s trust.

Breakthroughs in technology—including artificial intelligence—can help fulfill the right to development, but digital technologies are not magic bullets; there is a strong role for governance.

The security of digital bits cannot be left to the cloud or to the internet of things promoted by Amazon, Facebook, and Google, with home hubs that can be hacked.

Civil society groups, governments, and others are rightly asking questions regarding the risks to human rights. In this age of global corporate presence and influence, we need to ensure that ordinary people and communities are able to stand up for their rights.

But the danger with artificial intelligence is greater than just our rights.

Everything that makes who we are comes from our brains. Without brain power, we would not have gone from flint arrowheads to the space station.

DESTROY THE BRAIN WITH AI AND WE DESTROY CREATIVITY.

All human comments appreciated. All like clicks chucked in the bin.




THE BEADY EYE SAYS: GOOGLE IS MAKING OUR KNOWLEDGE VALUELESS.

13 Monday Mar 2017

Posted by bobdillon33@gmail.com in Artificial Intelligence., Google, Google it., Google Knowledge., HUMAN INTELLIGENCE, Humanity., Modern day life., Our Common Values., Technology, The Internet., The world today., Twitter, Unanswered Questions., What Needs to change in the World, Where's the Global Outrage.

≈ Comments Off on THE BEADY EYE SAYS: GOOGLE IS MAKING OUR KNOWLEDGE VALUELESS.

Tags

Artificial Intelligence., Google, Google ambitions, Google knowledge., Google/Amazon/Facebook/Twitter

(A five-minute read if you don’t want to be Googled)

Artificial intelligence is changing the world we live in, but are we all going to end up scratching our behinds wishing we were dead? Turned into “pancake people”, spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.

Our thoughts and actions scripted as if they’re following the steps of an algorithm.


As we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

The perfect coordination and optimization of our day-to-day lives, controlled by Google Monopoly Inc.

Google is draining our “inner repertory of dense cultural inheritance”.


Why? Because we will be in a state of constant Google observation, with the entire world connected to the world they wish to present.

At the moment Google controls over 65% of all searches (AND NO ONE KNOWS HOW IT WORKS).

Google is not required by Law to serve everyone nor for that matter is Amazon, Apple, Facebook, Snapchat, or Twitter.

Nearly every smartphone that is not an iPhone operates on its Android operating system.

WE ARE ESSENTIALLY SENTENCED TO A GOOGLE DIGITAL DEATH.

They supply the stuff of thought, but they also shape the process of thought.

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind.

The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.

The result is to scatter our attention and diffuse our concentration.

Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure.

Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism.

Taylor created a set of precise instructions—an “algorithm,” we might say today—for how each worker should work.

Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well.

Google is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does.

Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it.
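The experiment loop described above, many small tests whose winners get shipped, can be sketched in miniature. The variant names and numbers below are invented; real systems weigh far more signals than a single click-through rate.

```python
def click_through_rate(clicks, impressions):
    """Fraction of impressions that led to a click."""
    return clicks / impressions if impressions else 0.0

def choose_variant(results):
    """Pick the variant with the best click-through rate.
    results maps variant name -> (clicks, impressions)."""
    return max(results, key=lambda name: click_through_rate(*results[name]))

# Invented outcome of one day's experiment comparing two ranking tweaks:
results = {
    "ranking_a": (310, 10_000),  # 3.10% CTR
    "ranking_b": (352, 10_000),  # 3.52% CTR
}
print(choose_variant(results))  # ranking_b wins and ships; repeat tomorrow
```

Run thousands of times a day, a loop like this is how behavioral data steadily reshapes the algorithms that decide what we find.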

What Taylor did for the work of the hand, Google is doing for the work of the mind.

The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.”

It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.”

In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.

Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling.

It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.

And because we would be able to “receive a quantity of information without proper instruction,” we would “be thought very knowledgeable when we are for the most part quite ignorant.” We would be “filled with the conceit of wisdom instead of real wisdom.” This is not good, as the world is in need of wisdom more than ever.

I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the “instantly available.”

If we lose quiet spaces, or fill them up with “content,” we will sacrifice something important not only in ourselves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake:

As Richard Foreman so beautifully describes it, we’ve been pounded into instantly-available pancakes, becoming the unpredictable but statistically critical synapses in the whole Gödel-to-Google net. Does the resulting mind (as Foreman would have it) belong to us? Or does it belong to something else?

Will this produce a new kind of enlightenment or “super-consciousness”? Sometimes I am seduced by those proclaiming so—and sometimes I shrink back in horror at a world that seems to have lost the thick and multi-textured density of deeply evolved personality.

Reading is not an instinctive skill for human beings. It’s not etched into our genes the way speech is.

The media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains.

Circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

As we use the tools that extend our mental rather than our physical capacities, we inevitably begin to take on the qualities of those technologies.

Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today.

Where does it end?

Mr Page of Google said in a speech a few years back: “For us, working on search is a way to work on artificial intelligence.”

The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements.

The last thing these companies want is to encourage leisurely reading or slow, concentrated thought.  It’s in their economic interest to drive us to distraction.

There’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine.

If people use Google as a substitute for the knowledge they used to carry inside their heads, they will, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.”

I know that Google will argue the toss, and indeed, were it not becoming a monopolizing influence, I would have great praise for it.

All comments appreciated. All like clicks chucked in the Bin.


THE BEADY EYE SAYS: DON’T BRING YOUR IPAD TO BED.

07 Wednesday Dec 2016

Posted by bobdillon33@gmail.com in Artificial Intelligence., Big Data., Google Knowledge., HUMAN INTELLIGENCE, Humanity., Life., Modern Day Communication., Modern day life., Social Media., Technology, The Future, The world today., Unanswered Questions., What Needs to change in the World

≈ Comments Off on THE BEADY EYE SAYS: DON’T BRING YOUR IPAD TO BED.

Tags

Artificial Intelligence., Big Data, Creative Thinking., Google, Internet, SMART PHONE WORLD

 

(This is a short follow-up read)  Re the post:

The Beady Eye Asks: Where does it end? Google.

More and more people are taking their tablets to bed with them to surf the web, check Facebook or email before switching off the light.

Few of us need to live our lives accessible to others at all times of the day.

Text alerts, Facebook notifications, Twitter mentions, and emails are often nothing more than distractions that keep us from the world right in front of us.

They clutter our mind with nonessential information. Technology ought to serve us, not the other way around.

However, technology has altered human physiology. It makes us think differently, feel differently, even dream differently. It affects our memory, attention spans and sleep cycles.

We are now hard-wired to assume our phones are ringing, even when they’re not.

In a Google-happy world, when virtually any scrap of information is instantly at our fingertips, we don’t bother retaining facts.

Some cognition experts have praised the effects of tech on the brain, lauding its ability to organize our lives and free our minds for deeper thinking. Others fear tech has crippled our attention spans and made us uncreative and impatient when it comes to anything analog.

If there is one area of our lives where technology is doing more harm than good, it’s bed. But the idea of a technology-free bedroom is a counter-cultural thought.

However, the benefits of a technology-free bedroom should not be overlooked and dismissed so quickly. The most important, intimate conversations take place in your bedroom. Couples who keep a TV or iPads in the bedroom have sex half as often as those who don’t. Besides, most of our excuses can be overcome with some creative thinking. People who spend time on social media tend to experience higher levels of envy, loneliness, frustration, and anger.

Social media interaction holds some benefit. But if we can intentionally remove these unhealthy emotions from our bedroom, it allows space for our minds to separate from the day’s activities.

Keeping your bedroom as a notification-free zone results in a more peaceful, engaged, calming environment.

Checking Facebook/Twitter before putting your feet on the floor is not living.

If you don’t want to feel like a zombie during the day, the findings are clear:

Read an actual, printed book if you must stimulate your mind before bed.

So if you’re having trouble sleeping, consider actually putting all those pesky electronics away and give your brain a chance to fully shut itself down when you’re looking for some shuteye.

To understand what critical and creative thinking is, an individual first must understand what thinking is.

Thinking is any mental activity that helps formulate or solve a problem, make a decision, or fulfill a desire to understand. It is searching for answers, a reaching for meaning that includes numerous mental activities throughout the process.
or
Thinking. The capacity to reflect, reason, and draw conclusions based on our experiences, knowledge, and insights. It’s what makes us human and has enabled us to communicate, create, build, advance, and become civilized.
or
Thinking encompasses so many aspects of who our children are and what they do, from observing, learning, remembering, questioning, and judging to innovating, arguing, deciding, and acting.
Thinking is critical to a person’s everyday life.
People often fear the worst and manage their lives around news or information they hear; therefore, it is very important to use critical thinking when analyzing issues, solving problems, and making everyday decisions.
Today’s technology can target and customize ads with unparalleled precision. In fact, advertising is getting more personal, more engaging, more interesting and more thought-provoking than ever. It will result in your children having their brains wired in ways that may make them less, not more, prepared to thrive in this crazy new world of technology.
On the other hand:
Given the ease with which information can be found these days, it only stands to reason that knowing where to look is becoming more important for children than actually knowing something. Not having to retain information in our brain may allow it to engage in more “higher-order” processing such as contemplation, critical thinking, and problem solving.
This may be so;
Truth is so about something, the reality of the matter, as distinguished from what people wish were so, believe to be so, or assert to be so.
Visual intelligence has been rising globally for 50 years. More than 85 percent of video games contain violence.
The history of human thought would make it seem that there is difficulty in thinking of an idea even when all the facts are on the table. Making the cross-connection requires a certain daring.

There are no hard and fast rules concerning the source of creativity.

Morning people have more insights in the evening. Night owls have their breakthroughs in the morning.

Your Best Creative Time Is Not When You Think.

Dreams aren’t supposed to make any sense.

They’re just what happens when you put your head down for the night and your brain decides to bullshit you for eight hours about getting chased by Bigfoot while your teeth fall out.

With that said, dreams have been responsible for some major creative and scientific discoveries in the course of human history. A surprising number of society’s innovations have come from dreams, proving that sometimes there is method to your brain’s madness.

For example …

The tune for “Yesterday” came to Paul McCartney in a dream.

Larry Page got the idea for “downloading the entire web onto computers” in a dream one night when he was 23 years old.

Mary Wollstonecraft Godwin (Mary Shelley) was inspired to write Frankenstein by a dream.

Otto Loewi (1873-1961) won the Nobel Prize for medicine in 1936 for his work on the chemical transmission of nerve impulses; the experiment that proved it came to him in a dream.

Edison took short trips into the subconscious mind. There, he accessed ideas. Or perhaps he bypassed the conscious mind and all its barriers to creativity.

Elias Howe, who invented the sewing machine in 1845, dreamt its crucial needle design.

Srinivasa Ramanujan (1887-1920) was one of India’s greatest mathematical geniuses. He made substantial contributions to the analytical theory of numbers and worked on elliptic functions, continued fractions, and infinite series. According to Ramanujan, inspiration and insight for his work many times came to him in his dreams.

The history of science is full of stories of scientists claiming a “flash of inspiration” which motivated them. One of the best known is from the chemist August Kekulé (1829-1896), who proposed that the structure of molecules followed particular rules. Kekulé recounted that the structure of benzene came to him in a dream, in which rows of atoms wound like serpents before him; one of the serpents seized its own tail: “the form whirled mockingly before my eyes. I came awake like a flash of lightning.”

Hannibal, who many described as a military genius, based his battle plans against the Romans on his dreams.

The Periodic Table:
Nineteenth-century chemist Dimitri Mendeleyev fell asleep while chamber music was being played in the next room. He understood in a dream that the basic chemical elements are all related to each other in a manner similar to the themes and phrases in music.

A young Albert Einstein conceived the theory of relativity in a dream.

Modern Robotics:
Dennis Hong, genius innovator at Virginia Tech, uses the interface of sleep and waking to access ideas.

Jack Nicklaus’s golf swing came to him in a dream.

Insulin came to Frederick Banting in a dream.

As technology has played a bigger role in our lives, our skills in critical thinking and analysis have declined to such an extent that the world is now in dire need of the reader’s intellect: imagination, induction, reflection and critical thinking.

Social media may well promote a culture of sharing, but there is little point in sharing trivia. So share this post. Your brain will thank you. 

Just in case you get the impression that I am totally against Technology. I believe technology can actually increase your intelligence.

The best way to make technology work for you instead of against you is to be smart about it: utilize it to allow yourself the time and mental energy to engage in higher-level cognitive activities, not as a crutch because you don’t feel like activating your neurons.

But don’t ask your device how to make that happen—figure that one out for yourself.


THE BEADY EYE ASKS: WHERE DOES IT END? – GOOGLE.

04 Sunday Dec 2016

Posted by bobdillon33@gmail.com in Artificial Intelligence., Big Data., Communication., Facebook, Google it., Google Knowledge., HUMAN INTELLIGENCE, Modern Day Communication., Modern day life., Social Media., Technology, The Future, The Internet., The world today., Unanswered Questions., What Needs to change in the World, WiFi communication.

≈ Comments Off on THE BEADY EYE ASKS: WHERE DOES IT END? – GOOGLE.

Tags

Artificial Intelligence., Big Data, Google, SMART PHONE WORLD, The Future of Mankind, The Internet.

 

(A seven-minute read)

Our worldviews are formed by who is shouting louder and more persistently into our ears.

While our technological vision is to create more intuitive, human-like interactions between man and machine, the last thing Google, Facebook, Twitter, and the Internet of Everything want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction. Ambiguity is not an opening for insight but a bug to be fixed; the human brain is just an outdated computer that needs a faster processor and a bigger hard drive.

There has been little consideration of how, exactly, the Internet and these companies are reprogramming us.

Having said that, I think the internet and new media can actually be effective in fighting such brainwashing.

However, most of the Internet and social media now present just superficial information we won’t even remember tomorrow. It is the illusion of knowledge by information.

Just as we are coming to rely on computers to mediate our understanding of the world, it is our own intelligence that is being flattened into artificial intelligence.

True reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like. Quantum mechanics is telling us that we have to question the very notions of ‘physical things’ sitting in ‘space’.

If you have got this far, you might be wondering where I am going with this post.

Just as there’s a tendency to glorify technological progress, there’s a counter tendency to expect the worst of every new tool or machine.

The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well.

The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements.

Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.”

Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.

Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling.

Is it real knowledge, or a HAL-like machine that might be connected directly to our brains? “The ultimate search engine is something as smart as people—or smarter.”

Thanks to the growing power that computer engineers and software coders wield over our intellectual lives, “algorithms” are beginning to govern the realm of the mind.

The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.”

Google is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does.

It carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it.

The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.”

In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.

Taylor believed his system would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.”

An “algorithm world.”

Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today.

Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.

In the midst of a sea change in the way we read and think, the Internet promises to have particularly far-reaching effects on cognition.

The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.

A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration.

The Net’s influence doesn’t end at the edges of a computer screen, either.

As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. “Shortcuts” give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles.

When we adopt intellectual technologies, the tools that extend our mental rather than our physical capacities, we inevitably begin to take on the qualities of those technologies.

Clocks, for instance, dissociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.

The conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.”

Online reading is a skimming activity: hopping from one source to another and rarely returning to any source already visited.

We are becoming “power browsers.”

Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice.

But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self, weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace.

We are becoming “mere decoders of information.”

Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.

Reading, explains the developmental psychologist Maryanne Wolf, is not an instinctive skill for human beings.

It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains.

The circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

The human brain is almost infinitely malleable.

People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. Nerve cells routinely break old connections and form new ones. “The brain,” according to the neuroscientist James Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.

I think I know what’s going on.

For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded.

As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles.

Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I have to fight to stay focused on long pieces of writing. Even a blog post of more than three or four paragraphs is too much to absorb.

Having a computer for a brain has its perks, but it has its drawbacks as well. Language is a tough concept for robots, as words can convey the abstract as well as the concrete and robots have trouble knowing the difference (and grasping the abstract).

That makes human-machine interaction less than intuitive for humans and confusing to ‘bots. Thoughts and actions feel scripted, as if they’re following the steps of an algorithm.

As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”

Every day of the week new apps replace thinking and jobs. Humanoid robots can now speak in different languages, with voice recognition powered by the cloud. Robots can also ask one another where they have just come from, and in which direction that place lies from where they currently are.

If one finds itself in an unfamiliar place, it will make up a word for it from randomly generated syllables. It communicates that word to other robots it meets there, establishing the name of the locale within the community. From this, a spatial and verbal framework is established for naming places on the map, creating a shared language between the robots.
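The naming game described above can be sketched in a few lines. This is only a toy model: the syllable list, class names, and adopt-the-first-word rule are my own assumptions, not the actual robots' protocol.

```python
import random

SYLLABLES = ["ka", "lo", "mi", "re", "zu", "fo", "ba", "ni"]

class Robot:
    """Toy model of the place-naming game: coin words for new places,
    adopt words heard from other robots."""

    def __init__(self, name):
        self.name = name
        self.lexicon = {}  # place coordinates -> invented word

    def word_for(self, place):
        # If the place is unfamiliar, coin a word from random syllables.
        if place not in self.lexicon:
            self.lexicon[place] = "".join(random.choice(SYLLABLES) for _ in range(3))
        return self.lexicon[place]

    def hear(self, place, word):
        # Adopt another robot's word, unless we already have one of our own.
        self.lexicon.setdefault(place, word)

# Two robots meet at the same spot: the first names it, the second adopts
# the name, and the word becomes shared vocabulary for that locale.
a, b = Robot("a"), Robot("b")
spot = (4, 7)
name = a.word_for(spot)
b.hear(spot, name)
assert b.word_for(spot) == name  # a shared word for the locale
```

Repeated over many meetings, this is how a common map vocabulary can emerge without any central dictionary.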

If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in ourselves but in our culture.

I find myself caught between understanding the necessity of the shift into a world of technology and mourning the loss of social interaction and deep thinking.

Don’t stop reading books altogether.


THE BEADY EYE ASK’S ARE WE ALL GOING TO END UP GOOGLEFIED.

12 Tuesday Jan 2016

Posted by bobdillon33@gmail.com in Google it., Google Knowledge.

≈ Comments Off on THE BEADY EYE ASK’S ARE WE ALL GOING TO END UP GOOGLEFIED.

Tags

Globalization, Goog, Google, Google ambitions, The Future of Mankind, The Internet.

GOOGLE is currently the world’s most visited website but is it destroying the gymnasium of our collective minds? 

Essentially, Google has become our collective mental crutch.

Google is a publicly traded company owned by a group of shareholders.

Founders of Google, Larry Page and Sergey Brin, own most of the shares of the company.

It’s almost impossible to live without interacting with a Google product in today’s world. Google owns an incredible number of companies and, at times, was even buying a company a week!

As of 2015, Google had 75% market share in searches. People use Google to search nearly 13 billion times per month, which averages to 26 searches per person per year.

There are very few products in the world with this ubiquity and dominance.

Using complete data from the 2014 fiscal year, Google raked in revenues exceeding $66 billion.

With $64.4 billion in cash and having spent almost $5 billion on acquisitions in 2014, Google doesn’t seem to be in a hurry to slow down.

This means Google is richer than pretty much everyone, and everyone includes the majority of the world’s nations such as Iceland, the Bahamas, Guatemala, Bulgaria and Sierra Leone.

This figure does not take into account Google’s expenses for 2014, which bring the company’s total net profit down to a measly $14.44 billion. However, since the gross domestic product, or GDP, of a nation does not incorporate its debt, the revenue figure is the most accurate number to use when comparing the income of corporations to the wealth of nations.

In 2015, Google Incorporated is worth $370 billion.

(Google is not even the richest company. In fact, based on revenue alone, Google trails pretty far down the list. Wal-Mart tops the chart with revenues exceeding $485 billion, and other corporate giants such as BP, Apple (AAPL) and Bank of America (BAC) rank somewhere in between.)

Google wields astonishing power in the United States and around the world.

Google Incorporated is the third-largest company in the United States in terms of market capitalization; its market cap is $373.79 billion, edged out only by Microsoft Corporation and Apple Incorporated.

With most businesses being directly or indirectly controlled by a relatively small number of global mega-companies, almost everything a consumer buys or interacts with is connected in some way to companies such as Google.

Google makes money from searches by selling promoted advertising based on search keywords.

The ads are more powerful than traditional advertising because they can be targeted by interest and geography. Advertisers like the program because they can get real-time feedback on the effectiveness and engagement of their ads. This continues to be the backbone of Google’s business and its major source of revenue.
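The mechanics described here, keyword-matched ads filtered by geography and sold to the highest bidder, can be roughly sketched as follows. Real ad auctions are far more elaborate (quality scores, second-price rules, and much else), and every name and number below is hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ad:
    advertiser: str
    keyword: str
    bid: float                 # amount offered per click
    geo: Optional[str] = None  # None means no geographic restriction

def select_ad(ads, query_terms, user_geo):
    """Highest-bidding ad whose keyword appears in the query and whose
    geographic restriction (if any) matches the searcher's location."""
    eligible = [
        ad for ad in ads
        if ad.keyword in query_terms and ad.geo in (None, user_geo)
    ]
    return max(eligible, key=lambda ad: ad.bid, default=None)

ads = [
    Ad("FlowerShop", "roses", 0.50, geo="Dublin"),
    Ad("GardenCo", "roses", 0.30),
    Ad("TyreCo", "tyres", 0.80),
]
# A Dublin user searching "red roses" sees the geo-targeted flower shop,
# which outbids the untargeted garden company for that keyword.
winner = select_ad(ads, {"red", "roses"}, user_geo="Dublin")
```

The geographic filter is what makes the ads "more powerful than traditional advertising": only searchers the advertiser can actually serve are ever charged for.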

Google Gmail today has over 425 million MONTHLY ACTIVE USERS.

THE QUESTION IS:

If Google were a nation and declared sovereignty, issued a currency and joined the United Nations tomorrow, where would it rank on a list of the wealthiest countries?

It turns out Google’s $66 billion revenue plants it squarely at number 70 for the 2014 fiscal year.

Only 69 of the nations of the world outrank the Internet-technology giant.

While economic superpowers such as the United States and China far outstrip Google, for now, the number of countries with GDPs dwarfed by Google’s massive wealth is staggering.

The phrase “to Google” is so popular that the company is actually worried about losing trademark rights if the term becomes generic, like “escalator” and “zipper,” which were once trademarked.

It has changed our brains. Even if we aren’t conscious of it, our brains are primed to think about the Internet as soon as we start trying to recall the answer to a tough trivia question.

It has taken over our cell phones. Since the first Android phone was sold in 2008, Google’s mobile operating system has bulldozed the competition. Today it claims nearly 85% of market share, nearly doubling its hold over the last three years.

It has transformed the way we use e-mail. Gmail was invented a decade ago, before bottomless inboxes were a sine qua non. It’s hard even to remember those dark ages when storage space was sacred and deleting emails was as tedious-but-necessary as flossing. Today our accounts serve as mausoleums, housing long-forgotten files, links, and even whole relationships. Google itself has touted alternative uses for Gmail, such as setting up a virtual time capsule for your newborn, though in practice accounts can’t be owned by anyone under 13. But even that last point is about to change.

It’s changed how we collaborate. Back in 2006, Google acquired the company behind an online word processor named Writely. With that bet, Google created a world where it’s taken for granted that people can collaborate on virtually any type of document, whether for work, play, or (literally) revolution.

It has allowed us to travel the globe from our desks. Yes, MapQuest was popular first. But Google Maps (and Earth) has become much more than a tool for measuring travel routes and times. Since Google Street View came onto the scene in 2007, it’s been possible to “visit” distant destinations, give friends a virtual tour of your hometown, plan trips ahead, and waste even more time on the Internet.

Of course, the more popular a tool, the more useful it is to those who’d like to spy on us.

It has influenced the news we read. Ranking high in Google search results is serious business and can have a profound effect on the success of companies, media outlets, and even politicians. When I just Googled “how SEO affects journalism,” this link was at the top of my search results. How is that significant? Well, for one, that story itself has been so successfully search engine optimized that it still tops the list despite being four years old.

It has turned users into commodities. We all love free stuff, but it’s easy to forget that services offered by companies like Google and Facebook aren’t truly “free,” as data expert Bruce Schneier has pointed out. Remember that all of your data (across ALL of the services you use, and that includes Calendar, Maps, and so on) is a valuable good that Google is packaging and selling to its real customers—advertisers.

It’s changed how everyone else sees YOU. Unlike your Facebook profile, the links that turn up when potential employers (or love interests) Google you can be near-impossible to erase. Perhaps unsurprisingly, Google uses the fear of embarrassing search results to encourage people to manage their image through Google+ profiles.

Next stop: self-driving cars, along with a Google computer so artificially intelligent that it can program itself, and Google Glass, which will surely Googlefy its owners, leaving little room, if any, for self-consciousness.

Recent research has confirmed suspicions that 24/7 access to (near) limitless information is not only bad for human discourse— it’s also making us worse at remembering things, regardless of whether we try.

Thanks to Google we now have, for lack of a better word, the “wisdom of the crowd” or “social proof” (which you can “buy”), which sums up how superficial and shallow our society has really become.

Social Proofing is a phrase that applies particularly well to the large social environment created on the internet and the power of a group to come together and make a decision collectively.

(e.g., “customers who viewed this item also bought” displays, review summaries on Booking.com, and the recommendation engines on shopping sites like Amazon.com, which rely on other people’s feedback to drive sales and refer people to products they will like, based on the buying behaviours of people like them.)

It doesn’t necessarily always lead to the “wisest” decision, because it can be a blind choice, made because other people made it, not necessarily based on sound rationalisation looking at the facts.
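The “people like you also bought” engines mentioned above are often built on simple item co-occurrence counting. A minimal sketch follows; the data and function names are illustrative, not any real site's API.

```python
from collections import Counter
from itertools import combinations

# Each order is the set of items one customer bought together.
orders = [
    {"camera", "sd-card"},
    {"camera", "sd-card", "tripod"},
    {"camera", "tripod"},
    {"kettle", "teapot"},
]

# Count how often each pair of items appears in the same order.
co_bought = Counter()
for order in orders:
    for x, y in combinations(sorted(order), 2):
        co_bought[(x, y)] += 1
        co_bought[(y, x)] += 1

def also_bought(item, k=2):
    """Items most often bought alongside `item`: the 'social proof' signal."""
    scores = Counter({b: n for (a, b), n in co_bought.items() if a == item})
    return [b for b, _ in scores.most_common(k)]

also_bought("camera")  # e.g. ['sd-card', 'tripod']
```

Note that the engine never asks whether the pairing is wise; it only reports that other people made it, which is exactly the blind-choice risk described above.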

Social proof is everywhere.

Comments are indicators that enough people are paying attention to what you are writing to reply. The same can be said for things like Facebook “like” and Twitter “tweet” buttons.

We are pack animals, no matter how independent we think we are; unless you live in a cave, you are conditioned by the people around you.

Today the very nature of the Internet, being such a social environment, has resulted in social proof becoming by far the greatest force when it comes to buying decisions.

Google has recently reorganized itself into multiple companies, separating its core Internet business from several of its most ambitious projects while continuing to run all of these operations under a new umbrella company called Alphabet.

With the European Union recently beginning an investigation for monopolistic business practices, diversification might be in Google’s best interest.

Using Google to navigate the web remains the preferred method by which most people find information online. However, Google is far from a monopoly in terms of the entire gamut of Internet services. The perception of Google being a monopoly is derived from the fact it happens to have dominance in the most lucrative area of the Internet.

On the other hand.

When Google was just a start-up business in Palo Alto, Calif., it did not have enough money to pay its employees the high wages of today’s Google, so they offered them stock in place of a massive salary. Those original employees, including the head of the culinary staff, now either still own a good chunk of shares or have cashed in and enjoy a life of extreme wealth and prosperity because of Google’s explosive growth.

Google Incorporated now offers some of the best employee benefits and even death benefits. If a Google employee dies, the deceased’s spouse or partner receives half of the deceased employee’s salary for 10 years. Children of the deceased employee also receive $1,000 per month until age 19, or 23 if the children are full-time students.

It has contributed quite a bit of its income to various charities. In 2012, Google reported charitable donations exceeding $144.6 million. In addition, it gave away approximately $1 billion in free products.

Why have any concerns?

Google launched its Google Print division, now known as Google Books, which scans books into its application and website. Google intends to scan all existing books before 2020. To date, Google has scanned over 20 million books.

They are not doing this for the love of books or reading.

pirate-piracy-malware-ss-1920

Project Loon proposes to provide internet connectivity from balloons floating at a height of 20 km above the earth’s surface on a pilot basis. The idea is to connect remote areas of the country using LTE or 4G technology through the balloons, each of which can cover an area roughly 40 km in diameter.

If you believe that this is all they want to achieve with their Balloons you can pull my other leg.

Should we ban the wearing of Google Glass in public places?

  • As a practical act which creates areas free of surveillance or highly intrusive surveillance.
  • As a symbolic act showing concern for privacy.
  • As a way of exerting social pressure to establish norms around usage.
  • As a way of exerting market pressure to discourage people from buying and companies developing these systems.
  • Questions over machine-monitored surveillance have existed for decades.

Fears over AI overlords may be groundless, but the use of machine learning to mine personal information is a worrying development, artificial intelligence experts have warned.

There is a broad consensus that technology has the potential to improve education and make it more personalized, but that will never come to pass if we don’t set higher standards for student and data security.

A digital textbook is a textbook that lives on a desktop, laptop or mobile devices and is easily editable to provide educational content that is as timely and relevant as possible.

It won’t challenge them to think about the knowledge, skills and abilities they’ll need to solve that problem.

But when does personalized learning get too personal?

Google researchers claim they’re working on a supercomputer that harnesses the power of quantum physics to calculate in one second problems that would take a regular machine 10,000 years to solve.

The change unlocks more computing power, allowing quantum computers to consider untold variables compared to conventional machines.

If they’re right — some people have raised questions about their claims — then the world could be at the dawn of a new age of über-powerful computers.

High-frequency trading helped cause the so-called “flash crash” of 2010. To the extent that trades can be sped up even by a microsecond, quantum computers will make that problem much worse.

On February 26th, 2015 Google’s Webmaster Central Blog announced a mobile algorithm update for April 21st, 2015. This is the first time Google has ever given an exact date for an algorithm update, so the digital world was expecting big things, dubbing the event ‘Mobilegeddon’.

Google is in such a dominant market position on mobile, that their decision to ban legitimate apps from Android amounts to Internet censorship.

The message from Google is clear, if your website is not optimised for mobile, it’s likely that your search ranking will suffer. Google cares most of all that people keep coming back to use their search engine.

The choices Google makes about what apps are banned and allowed appear in many cases to be dangerously anti-user.

The update will effectively penalise websites that Google does not deem mobile-friendly, and the impact is expected to be widespread. Google is not one to joke around, so it is a definite “watch this space”: the full impact of this algorithm is yet to be seen.

Ethics concerns in conjunction with porn viewing and the internet cloud are presently in the news.

THINK WITH GOOGLE. 

New and expecting parents are 2.7x more likely than non-parents to use a smartphone as their primary device, so it’s not surprising that mobile searches related to babies and parenting have grown 25% since 2013.

Searches about baby development were 72% mobile in Q1 of this year.

In fact, views of parenting videos on YouTube were up 329% on mobile this year.

Is Google Making Us Stupid?

Thoughts and actions feel scripted, as if they’re following the steps of an algorithm.

As we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

If we fill our BRAINS up with “content,” we will sacrifice something important not only in ourselves but in our culture.


FALLING OUT OF LOVE WITH GOOGLE.

Understand what they’re doing: taking away time we could be using to shape our content in interesting ways.

Google’s increasingly invasive technology WILL BE ITS DOWNFALL.

Apple’s policy is to collect no personal data (apart, of course, from the iTunes account details that allow you to make purchases), a surprise from a huge company that isn’t exactly squeaky clean in other areas, like the working conditions of its overseas workers. And they don’t do search, or do they? Installed on almost all of the billion Apple devices worldwide is search with no ads and no data collection.

So those of us with Apple devices can sidestep the issues with Google and still not have to add Apple to the To Sort Out list!

The fast changing pace of tech companies shows no sign of letting up and can both drive change and perhaps cause their downfall as consumers move with the tides as well.

Who would foresee Google tumbling from their great height? But perhaps this is what we are witnessing as this world of rapid changes throws up other surprises.

We are using these devices every day, so we care about and influence the outcome of these battles.

So what can we do to get Google to recommit to its mission of doing no evil?

We are letting the opportunity of a lifetime—of our lifetime and theirs—pass us by.

DuckDuckGo offers search with no data collection.

It is an open-source company that decided early in its life to offer search with no personal tracking and no collection or sharing of your personal information. DuckDuckGo is a built-in search-page option in Safari and Firefox, but not, of course, in Google Chrome!

Does any of this matter?

I suppose in the long run nothing will matter, but in the meantime Google, which lives off our daily lives, has a responsibility far beyond its billions.

As humanity we will no doubt be reduced to a chip in an AI robot if we are to have any chance of escaping the planet before its demise (providing we last that long, which, looking at the current state of affairs, is highly unlikely).

So perhaps it is for the best that we are all becoming Googlefied.

Google it and see.


THE BEADY EYE LOOKS INTO THE COMING DIGITAL BLACK HOLE OF HISTORY.

17 Monday Aug 2015

Posted by bobdillon33@gmail.com in Education, Google Knowledge., History., Privatization, Sustaniability, Technology, The Future, The Internet.

≈ Comments Off on THE BEADY EYE LOOKS INTO THE COMING DIGITAL BLACK HOLE OF HISTORY.

Tags

Bit rot., Digital, DNA, Google, HISTORICAL Intelligence., Technology, The Ethereum., Wikipedia.

Time spent looking at a cellphone is time spent oblivious to the world.

Humanity’s first steps into the digital world could be lost to future historians.

We face a “forgotten generation, or even a forgotten century” through “bit rot”: old computer files becoming useless junk.

There is a sense of powerlessness and fatalism about TECHNOLOGY.

If consciousness or HISTORICAL intelligence are lost, it might mean that value itself becomes absent from the universe.

We are nonchalantly throwing all of our data into what could become an information black hole without realising it.

Ancient civilizations suffered no such problems, because histories written in cuneiform on baked clay tablets, or rolled papyrus scrolls, needed only eyes to read them.

Today, “intelligence” is related to statistical and economic notions of rationality: colloquially, the ability to make good decisions, plans, or inferences. To study today’s culture, future scholars will be faced with PDFs, Word documents, and hundreds of other file types that can be interpreted only with dedicated software, and sometimes dedicated hardware too.


Most of the images we take today are uploaded straight from a digital camera or a phone, with the picture never actually existing as a physical artifact.

The significance of documents and correspondence is often not fully appreciated until hundreds of years later.

We’ve learned from objects that have been preserved purely by happenstance, objects that give us insights into earlier civilisations.

We need history to embrace new values and institutions in pursuit of a just, fulfilling, and sustainable civilization not a “digital black hole”.

In fact, due to the intricate interconnectedness of production and economies around the world today, our technological civilization is perhaps more prone to a sudden collapse than other societies through history.

When you think about the quantity of documentation from our daily lives that is captured in digital form (our interactions by email, people’s tweets, and all of the world-wide web), the more important it becomes to create legal permission to copy and store software before it dies, so that the digital objects we create today can still be rendered far into the future.

Deciding on the best format to preserve them for the next hundred years relies on anticipating what technology is likely to still be available in the future.

Computer hard disks can hold vast amounts of digitised information, but everything is lost if it fails or is wiped.
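One standard defence in digital preservation is a periodic fixity check: record a cryptographic checksum for every file, then re-hash later and compare, so that silent corruption is at least detected before the last good copy is gone. A minimal sketch using only the Python standard library (the function names are my own):

```python
import hashlib
import os

def fingerprint(root):
    """Map each file under `root` to the SHA-256 digest of its contents."""
    digests = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digests[path] = hashlib.sha256(f.read()).hexdigest()
    return digests

def detect_rot(baseline, current):
    """Files whose contents changed, or which vanished, since the baseline."""
    return sorted(
        path for path, digest in baseline.items()
        if current.get(path) != digest
    )
```

Run `fingerprint` once and store the result somewhere safe; re-run it on a schedule, and any path reported by `detect_rot` has silently changed or disappeared and should be restored from a second copy.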

How do we preserve our interaction on Facebook, Twitter, comment threads and citizen journalism across the web?

Plenty of once-great civilisations have collapsed, and our current industrialised society is by no means invulnerable.

Who will decide what is worth keeping, and where will we preserve a core kernel of human knowledge?


Even though Wikipedia represents a vast repository of information, it is not structured in a way that would guide a post-catastrophe society through stages of recovery.

Google certainly is not.

It has already changed the world by altering the way we interact with technology and there can be no questions its long-term ambitions.

Its mission is to collect information, which you will have to buy with a Google wallet.

“We envision a marketplace for payment instruments, commerce and loyalty services”

It’s not hard to envision a fully intact ecosystem of Google offerings with location-based mobile ads driving tracked incremental revenue via etail integrated mobile commerce, or via sales that are picked up in-store, via mobile payment.

“Now toss in Google Offers, NFC and QR codes for trigger point marketing, and the fact that Google already has the accounts open and the pot gets even richer.”

Personally I have little time for banks, but I would rather have a bank provide mobile-wallet products than technology companies that can disappear into the cloud.

If Google were to make a move into supporting bank-branded wallets, we would all become Googlefied.

Google has far more on its plate than just financial services.

It’s a major player in telecommunications with its Android smart phone platform. It’s made forays into thermostats, home security and satellite imaging.

So it’s not just words and images that we risk losing forever; the “grey literature” of official reports, briefings and policy statements that are published only online also risks being lost to the future.


Bit rot means a digital dark age is on the horizon unless we store information in a durable medium such as DNA.
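The core idea of DNA storage is simple: with four bases, each base can carry two bits, so one byte maps to four bases. Real schemes add error correction and avoid long runs of the same base; this toy sketch shows only the basic encoding, and both function names are my own.

```python
BASES = "ACGT"

def to_dna(data: bytes) -> str:
    """Encode each byte as four DNA bases (2 bits per base, high bits first)."""
    return "".join(
        BASES[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def from_dna(strand: str) -> bytes:
    """Reverse the encoding: every four bases reassemble into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

# The round trip is lossless: the strand reads back as the original bytes.
assert from_dna(to_dna(b"history")) == b"history"
```

At two bits per base, a single gram of DNA could in principle hold an enormous archive, which is why it keeps being proposed as a hedge against bit rot.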

“It is very possible that … one machine would suffice to solve all the problems … of the whole [world]” – Sir Charles Darwin, 1946.

“Technology gives us the facilities that lessen the barriers of time and distance – the telegraph and cable, the telephone, radio, and the rest.” – Emily Greene Balch.

Perhaps ETHEREUM IS THE ANSWER.

Importantly, because there is not a company or indeed any entity in charge of or controlling Ethereum, the cost of running the infrastructure doesn’t have to include any profit margin.

It might allow us to push the boundary on what the digital realm can cover.

But this is a what if for the history books.


The Beady Eye looks at Google Knowledge.

12 Wednesday Aug 2015

Posted by bobdillon33@gmail.com in Google it., Google Knowledge., Social Media., The Internet.

≈ Comments Off on The Beady Eye looks at Google Knowledge.

Tags

Google, Google ambitions, Google knowledge.

Out of the seven billion people in the world, how many really understand quantum mechanics, cell biology, or macroeconomics?

Knowledge is power.

The real test of knowledge is not whether it is true, but whether it empowers us. Consequently, these days truth is a poor test for knowledge. The test seems to be utility. A theory that enables us to do new things constitutes knowledge.

Knowledge is at the root of many (dare I say most) challenges we face in a given day, and I have to admit I could do with a large refresher course.

Once you get past basic survival we’re confronted with knowledge issues on almost every front.

These days most of us are becoming reliant on “Google it.”

But when you get an answer, is that answer universal knowledge or is it Google codswallop?

It’s not possible to completely shed all the lenses that color our view of things, and so it’s not possible to be certain that we’re getting at some truth “out there.” If all beliefs are seen through a lens, like Google, how do we know the postmodernists’ beliefs are “correct”?

In order to have certainty, postmodernists claim, we would need to be able to “stand outside” our own beliefs and look at our beliefs and the world without any mental lenses or perspective.

If we do not fully understand what it is, will we not fully understand ourselves either?

But then again, can knowledge ever be fully understood?

The nature of knowledge is answerable to intuitions. This means that what may count as knowledge for you may not count as knowledge for me. In other words, what you know may not be something I know, even though we have the same evidence and arguments in front of us.

The bottom line is that “universal knowledge” (something everybody knows) may be very hard to come by.

I think, therefore I am.

Truth, if it exists, isn’t like this.

Truth is universal. It’s our access to it that may differ widely.

Okay, a definition is tough to come by.

But philosophers have been attempting to construct one for centuries. Over the years, a trend has developed in the philosophical literature and a definition has emerged that has such wide agreement it has come to be known as the “standard definition.”

As with most things in philosophy, the definition is controversial and there are plenty who disagree with it. But as these things go, it serves as at least the starting point for studying knowledge.

  • The person believes the statement to be true.
  • The statement is in fact true.
  • The person is justified in believing the statement to be true.
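The standard definition is a simple conjunction of three conditions, which can be made concrete in a toy sketch (purely illustrative; the function name and arguments are my own):

```python
def knows(believes: bool, is_true: bool, justified: bool) -> bool:
    """The 'standard definition': knowledge as justified true belief.

    All three conditions must hold; drop any one and the claim
    to knowledge fails.
    """
    return believes and is_true and justified

# A lucky guess: a belief that happens to be true but lacks justification
# does not count as knowledge under this definition.
lucky_guess = knows(believes=True, is_true=True, justified=False)  # False
```

Of the eight possible combinations of the three conditions, only one (all true) counts as knowledge, which is why justification gets its own section below.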

Belief:

They’re in your head and generally are viewed as just the way you hold the world (or some aspect of the world) to be.

It implies that what you think could be wrong. In other words, it implies that what you think about the world may not match up with the way the world really is and so there is a distinction between belief and the next item in our list.

People will generally act according to what they really believe rather than what they say they believe.

Truth:

Truth is not in your head but is “out there.”

When you believe something, you hold or accept that a statement or proposition is true. It could be false – that’s why your belief may not “match up” with the way the world really is.

Justification:

If the seed of knowledge is belief, what turns belief into knowledge?

This is where justification comes in (some philosophers use the term “warrant” to refer to this element). A person knows something if they’re justified in believing it to be true (and, of course, it actually is true).

Justification is hard to pin down because beliefs come in all shapes and sizes and it’s hard to find a single theory that can account for everything we would want to claim to know. Even so, justification is a critical element in any theory of knowledge.
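The three conditions can be made concrete as a toy predicate. This is only an illustrative sketch of the “standard definition” sketched above; the `Claim` type and its field names are my own invention, not a standard formalization:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    statement: str
    believed: bool   # the person holds the statement to be true
    true: bool       # the statement is in fact true ("out there")
    justified: bool  # the person has adequate grounds for the belief

def is_knowledge(c: Claim) -> bool:
    # The "standard definition": knowledge = justified true belief.
    return c.believed and c.true and c.justified

# A belief that happens to be true but lacks justification is not knowledge:
lucky_guess = Claim("It will rain tomorrow",
                    believed=True, true=True, justified=False)
```

The point of the sketch is that dropping any one of the three flags disqualifies a claim as knowledge, which is exactly how the definition is applied in the discussion above.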

So.

  • Everyone comes to belief with a cognitive structure that cannot be set aside.
  • Our cognitive structure serves as a lens through which we view the world. Because of this, knowledge is said to be perspectival or a product of our perspective.
  • Since the evaluation of our beliefs is based on our cognitive lens, it’s not possible to be certain about any belief we have. This should make us tentative about truth claims and more open to the idea that all of our beliefs could be wrong.
  • Truth emerges in the context of (or relative to) community agreement. For example, if the majority of scientists agree that the earth is warming and that humans are the cause, then that’s true. Notice that the criterion for “truth” is that scientists agree.

Are you now any more knowledgeable? Google it and see.

There is one thing without doubt:

The fact that you are a thinking thing.

In order to doubt you have to think (it’s not possible to doubt something without thinking about the fact that you’re doubting it). If you are thinking, then you must be a thinking thing, and so it is impossible to doubt that you are a thinking being.

If you know it all leave a comment, otherwise press the like button. Ignorance is bliss.

Some further reading and viewing>

Knowing how to Google something is not enough. 2014/09/02

Google is a business. 2015/03/02

The Imparting and Acquiring of knowledge. 2015/03/03

What will money look like in the future.

06 Saturday Dec 2014

Posted by bobdillon33@gmail.com in Uncategorized

≈ Comments Off on What will money look like in the future.

Tags

alternative monetary system, Apple, Banking, Bitcoin, CASH, Cryptocurrencies, Electronic payments., Facebook, Google, Money, Money of the future, Twitter

                        

At the moment there is a lot of hysterical futuristic crap being written about the money of the future.

“We will be making payments with the blink of an eye” and the like.

“The cash register is coming to an end.” Not so far-fetched.

When it comes to money, there’s a lot of change going on.

Over 5000 complementary monetary experiments are already under way around the world…

Everybody knows that electronic currencies are changing the form of our conventional national currencies (smart cards, e-purses, etc.).

It seems more than likely that our current model will be replaced by electronic payments.

However, the question remains whether today’s governments will allow money to disappear, or whether we will have to wait for it to fade out – or be thrown out – of existence. One way or the other, there is no doubt that in a few years or several decades the hardened walls of the banking and government institutions running our economies are in for a shock, when a new monetary system arises that is entirely private and not run by states.

Its shape and features will ultimately be decided by the market.  Free market monetary systems, in which the supply of money is outside political control, are likely to be systems in which money proper is a commodity of limited and fairly inelastic supply.

But it also seems improbable that a completely free market would grant any private entity the right to produce (paper or electronic) money at will and without limit. The present system is unusual in this respect and it is evidently not a free market solution. Neither is it sustainable.

Future economic historians will pity us for having worked under a strange and inefficient global patchwork of local paper currencies – and for having naively believed that this represented the pinnacle of modern capitalism.

Today, every government wants to have its own local paper money and its own local central bank, and run its own monetary policy (of course, on the basis of perfectly elastic local fiat money). This is naturally a great impediment to international trade and the free flow of capital. They are parastatal dinosaurs, joined at the hip with the bureaucracy and politics, bloated and dependent on cheap money and state subsidy for survival. They are ripe for the taking.

The world is ready for an alternative monetary system, and when the present system collapses under the weight of its own inconsistencies, there will be something there to take its place.

Money, however, is one of society’s most embedded, ancient institutions, anchored in trust – and the race to win that trust is on.

So there are many questions to be answered.

Who will own the money? How much freedom will governments allow, when freedom is so attainable? How will they treat banks when, with merely a download, anyone can be their own bank? What will they have to say about currencies which compete with their own?

What exactly is money?

Economic Textbooks define money by what it does, not by what it is – e.g. Functions of Standard of Value, Medium of Exchange, Store of Value, etc…. Money is an Agreement, within a Community, to use something as a Medium of Exchange.

Take the dollar.

It has lost over 92% of its value since its initial issuance in 1913. After the revaluation in 1934, the dollar dropped another 41%. The sheer volume of dollars in the world has convinced many people that the currency is worthless and doomed to lose its status as a global reserve currency – turned into toilet paper by letting the printing presses run wild.
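The 92% figure implies a surprisingly modest average annual loss, as a quick back-of-the-envelope calculation shows. The 8% residual value and the 1913 date come from the text; the steady-compounding assumption and the 2014 end date (when this post was written) are mine:

```python
# If the dollar retains ~8% of its 1913 purchasing power by 2014, the
# average annual rate of value loss r satisfies (1 - r)**years == 0.08.
value_remaining = 0.08
years = 2014 - 1913
annual_loss = 1 - value_remaining ** (1 / years)
print(f"average annual loss: {annual_loss:.1%}")  # roughly 2.5% per year
```

In other words, a century of ordinary-looking inflation is enough to erase nine-tenths of a currency’s purchasing power.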

“Short-termism” is programmed by the interest feature of our conventional money.

$1.3 trillion of it is traded in foreign exchange markets every day – 100 times more than the trading volume of all the stock markets of the world combined.

Only 3% of these foreign exchange transactions relate to real goods and services; 97% are purely speculative.

What has happened is that ‘speculative’ trading (i.e. trading whose sole purpose is to make a profit from changes in the value of the currencies themselves) has all but taken over the foreign exchange markets. The currency market has become the biggest single market in the world. Foreign exchange transactions (purchases and sales of currencies) today dwarf the trading volume of all other asset classes, even of the entire global economy.

Two-thirds of all human beings who ever reached the age of 65 are alive today, and they are looking at unfunded pension liabilities now totalling $3.5 trillion in the OECD countries alone – three times the GDP of the USA.

Not to worry: as long as all major corporate decisions are made with a short-term horizon, long-term sustainability is going to be an illusion.

85% of all insurance payments worldwide compensate for natural disasters. Four times more people die in natural disasters than in all wars and civil disturbances combined.

69% of professional biologists say we’re in the ‘sixth extinction’ – we are in the process of losing 30%–70% of the planet’s biodiversity by 2030 due to the actions of humanity!

Back to Money:

The latest smart phone technology is revolutionizing the payment process with the death of the wallet not far off.

Remittances are the gateway through which Twitter, Facebook, and Google hope to achieve financial inclusion in the future.

Facebook is readying itself to provide financial services in the form of remittances and electronic money. It wants to become a utility in the developing world.

If Ireland’s central bank authorizes it as an “e-money” institution, Facebook will be allowed to issue units of stored monetary value that represent a claim against the company.

Obtaining an e-money authorization in Ireland would require Facebook to hold capital of €350,000 and segregate funds equivalent to the amount of money it has issued. Facebook is already authorized for some forms of money transfer in the United States, allowing it to process payments for developers who charge users for in-app purchases.

Facebook takes a fee of up to 30 percent for such payments, and these fees account for about 10 per cent of its revenues. It recently reiterated its commitment to expanding its mobile payments and wallet products, which have yet to be widely adopted by consumers.

In 2013, the company facilitated $2.1 billion worth of transactions, almost exclusively from games.

I personally am not surprised that it is viewed with skepticism as a payment vehicle; when you look at all the crap one sees on Facebook, would you trust Facebook to handle your money?

Google is registered in the UK to issue electronic money, in a process similar to the authorization which Facebook is seeking in Ireland.

Google and its NFC-driven Wallet, and PayPal are well on the way to providing digital payment that can move between two people as they pass each other on the street, or between two people on opposite sides of the Earth – with no difference between the character of the two payments.

Vodafone has acquired an e-money licence for the phone company to operate financial services in Europe.

Twitter, Square, PayPal, Apple are also in the race to replace Money.

The question is ‘what’s next?’ Depending on how governments answer it, the future might be very exciting or very frightening.

The importance of digital – potential changes in payments, branch banking, financial advice and the use of social media will accelerate change in the industry, most likely to the benefit of fast-moving incumbents.

New technologies are threatening to disrupt existing models in retail financial services; the pressure of increased operating and capital costs is reducing capacity in wholesale banking; and insurers, struggling for growth and profitability, name waning customer loyalty as their biggest challenge.

The banking system fundamentally makes money by keeping customers confused, making the lion’s share of profits from fees and charges, not from banking. Banks will have to stop thinking of themselves as mere providers of financial products and services and enablers of transactions. They will need to be solution providers that play a greater role not just at the moment of a transaction, but before and afterward as well.

The global e-payments value reached $256 billion in 2012 and is expected to grow roughly three-fold by 2014 to a total of $796 billion. An average person touches his or her smart phone 150 times a day.

So it’s no wonder that the single biggest area of investment is mobile apps for tablets and smartphones, with the ultimate aim of consolidating everything you carry into your phone, until any financial transaction that involves buying something is paid for simply by saying your name.

And before I sign off, we have cryptocurrencies.

Bitcoin is a peer-to-peer currency with no centralized authority.

Bitcoin is regulated by code, which determines how quickly new Bitcoins are generated without the intervention of humans. Bitcoins are stored in a wallet that resides on your computer – or a hosted wallet service off in the cloud, if that’s your preference – and transactions are nearly instantaneous. It’s the prototype for whatever improved implementation overtakes traditional currency in the future.
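The phrase “regulated by code” can be illustrated with a toy proof-of-work loop of the kind Bitcoin miners run. This is a deliberately simplified sketch: real Bitcoin hashes a structured block header twice with SHA-256 and adjusts the difficulty dynamically, none of which is modelled here.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Search for a nonce whose SHA-256 digest of data+nonce starts with
    `difficulty` zero hex digits. The rule is enforced by mathematics,
    not by any central authority."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("Alice pays Bob 1 BTC")
```

Anyone can verify the result with a single hash, but finding the nonce takes brute-force work – that asymmetry is what lets the network agree on new coins without a human in the loop.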

In the meantime, the debasement of paper money continues.

In the end, it’s great news that non-banks are challenging the traditional banking monopoly.

I leave you with a few quotes:

“Maybe money is unreal for most of us, easier to give away than things we want.” Lillian Hellman.

“Money is the only substance which keeps a cold world from nicknaming a citizen ‘Hey You’.” Wilson Mizner.

“Money is the poor people’s credit card.” Marshall McLuhan.

“Money is what you’d get on beautifully without if only other people weren’t so crazy about it.” Margaret Case Harriman.

“Wealth is nothing in itself; it is not useful but when it departs from us; its value is found only in that which it can purchase. As to corporal enjoyment, money can neither open new avenues of pleasure, nor block up the passages of anguish. Disease and infirmity still continue to torture and enfeeble, perhaps exasperated by luxury, or promoted by softness. With respect to the mind, it has rarely been observed that wealth contributes much to quicken the discernment or elevate the imagination, but may, by hiring flattery, or laying diligence asleep, confirm error and harden stupidity.” Samuel Johnson.

 


“Remember money has no sign of human worth.” Robert de Mayo Dillon.


Perhaps in the next century we will all be in a permanent identity crisis, constantly asking ourselves what humans are for.

03 Wednesday Dec 2014

Posted by bobdillon33@gmail.com in Uncategorized

≈ Comments Off on Perhaps in the next century we will all be in a permanent identity crisis, constantly asking ourselves what humans are for.

Tags

Artificial Intelligence., Google, Stephen Hawking

You no doubt heard Stephen Hawking’s recent prediction that the human race as we know it will come to an end with the creation of a super artificial intelligence.

 

He could be right.

Over the past 60 years, we have seen mechanical processes replicate behaviors and talents we thought were unique to humans.

We are not just redefining what we mean by AI—we are redefining what it means to be human. ( Greed v Poverty.  Muslim v Christian. Wealth v Benefits. Black v White. War v Peace. Health v Consumerism. Life v Death. Gratification v Pain. Space v Sustainability, Climate Change v Profit. To mention just a few)

It will not be long before we have to change our minds about what sets us apart. (ISIS v The Rest.)

Every day we are being forced to surrender more of what is supposedly unique about humans and we will spend the next decades doing so.

Indeed, the grandest irony of all, the greatest benefit of an everyday, utilitarian AI will not be increased productivity or an economics of abundance or a new way of doing science—although all those will happen.

The greatest benefit of the arrival of artificial intelligence is that AIs will help define humanity. We need AIs to tell us who we are.

In order to appreciate this we are going to need a large dose of artificial smartness. Smartness is focused, measurable, specific.

If you think about it, much of the technology humans interact with is about putting you in a particular bucket. This is exactly what Online marketing is about. Making finer and finer distinctions as to which bucket you belong in.

The long-term scientific goal of artificial intelligence is human-level intelligence, just as artificial immortality is the long-term goal of modern medicine.

AI has attracted more than $17 billion in investments since 2009. Last year alone more than $2 billion was invested in 322 companies with AI-like technology.

Every time you type a query, click on a search-generated link, or create a link on the web, you are training the Google AI.

Each of the 12.1 billion queries that Google’s 1.2 billion searchers conduct each day tutor the deep-learning AI over and over again. With another 10 years of steady improvements to its AI algorithms, plus a thousand-fold more data and 100 times more computing resources, Google will have an unrivaled AI.

At first glance, you might think that Google is beefing up its AI portfolio to improve its search capabilities, since search contributes 80 percent of its revenue. But I think that’s backward. Rather than use AI to make its search better, Google is using search to make its AI better.

My prediction: by 2024, Google’s main product will not be search but AI. (See my Flipboard, “Silent Witness to Google’s Ambitions.”)

Where are we at the moment?

Parallel computation, bigger data, and deeper algorithms generated the 60-years-in-the-making overnight success of AI. As these technological trends continue—and there’s no reason to think they won’t—AI will keep improving. The smarter it gets, the more people use it. The more people that use it, the smarter it gets.

Our AI future is likely to be ruled by an oligarchy of two or three large, general-purpose cloud-based commercial intelligences, offering as much IQ as you want but no more than you need.

It is transforming the Internet, the global economy, and civilization. It is enlivening inert objects, much as electricity did more than a century ago.

There is almost nothing we can think of that cannot be made new, different, or interesting by infusing it with some extra IQ.

Machines built on non-inheritable neural architectures will increasingly replace knowledge workers in the near future.

Many knowledge workers today get paid to do things that computers will soon be able to do. AI will increasingly move up the skill ladder to replace the middle-class workers.

There’s no way a human can keep on top of all possible financial instruments in the world, nor is it possible for doctors and nurses to stay on top of all medical innovations.

Every success in AI redefines it.

A child born today will rarely need to see a doctor to get a diagnosis by the time they are an adult. Computerized medical diagnostics are a potential “game changer.”

At this very moment it is possible to have your DNA sequenced by 23andMe (for a fee).

Curious? Go on, try it.

Find out whether you are a suitable partner and what you are most likely to die from. Whether you are going to win the lotto – sorry, not true.

We are still a ways off from equaling the processing power of the human brain.

An AI program may be able to write code to manipulate the stock exchange, but not the lotto draw. It is still not able to solve the problem of common sense – endowing a computer with the knowledge that every 5-year-old has is still a few years off.

Representations are the fruits of perception. Recognizing the centrality of perceptual processes makes artificial intelligence more difficult, but it also makes it more interesting, for the two types of process are inextricably intertwined.

As AIs develop, we might have to engineer ways to prevent consciousness in them—and our most premium AI services will likely be advertised as consciousness-free.

In fact, the business plans of the next 10,000 start-ups are easy to forecast: take X and add AI.

So is he right?

New utilitarian AI will also augment us individually as people (deepening our memory, speeding our recognition) and collectively as a species.

Will it replace us?

Rather you than I to come back in a few hundred years and find out.

Food will definitely be different; manufacturing materials will definitely be different; clothes, money, and every branch of science and art will have changed beyond recognition.

Concepts without percepts are empty. The essence of human perception lies in the ability of the mind to hew order from chaos. Perception goes on at many levels.

AI perception will be rigid, inflexible, and unable to adapt to the problems provided by many different contexts.

Cognition is infused with perception. If there is none, neither human nor non-human will know which is which. Therefore there will be no interaction between humans and artificially intelligent beings other than war.

So he is right.

Knowing how to Google something is not enough.

26 Friday Sep 2014

Posted by bobdillon33@gmail.com in Uncategorized

≈ Comments Off on Knowing how to Google something is not enough.

Tags

Collaboration, Education, Global integration, Google, Internet, Moral philosophy, Peace, Technology

Like me, I am sure you look at our world and wonder how it has got itself into such a mess.

If anything, the amount of knowledge one needs to know to make sense of today’s issues is staggering.

Putting inequality and greed aside, the lack of togetherness has to be one of the main problems. Despite our interconnected world, we have little if any appreciation or understanding of the other cultures in the world.

So has technology plunged the world into its present day state?

“You can look anything up on your iPad, so you don’t have to know anything” (factual mastery is becoming less and less important).

Real knowledge and understanding are going out the door. In order to have a conversation, or to contemplate or invent something, you must have the information in your head, so that you can combine it with other information that is also in your head. There’s no time to “look it up”; you can only work with what’s in your head. That’s why computers don’t just have hard disks – they also have “working memory”.

Is the Internet making us more skilled at asking questions, or merely better at parroting answers to someone else’s questions?

I can’t help wondering what will happen if we rely mainly on electronic devices to provide information… and then there’s a blackout.

Then who has the floor?

The person who actually KNOWS something and can articulate it. The guy looking it up on the iPhone plays second fiddle. Who do you think would win – the guy with the knowledge, and therefore quick retorts, in his head, or the one fumbling with the iPad?

Unfortunately, this is the road we are going down. I don’t see how present-day technologists are creating knowledge to improve the world. That could only be true if a microchip could be implanted in my head so that my brain could draw on all the knowledge it needs to reason and think creatively. We can’t fix a human problem with a purely technological solution. I am also afraid to tell you that technology is no replacement for caring.

Education is a dance. A theatrical moving-about. It requires more than a dump of information and a hologram or computer or Smart Phone to be Intelligent.

You might think I am talking unadulterated crap, but the best education most of us received was with a pencil, some paper, an adult teacher, and some chalk in a classroom. Miraculously, such people got a man on the moon. The very idea of computers was invented by such people (Bill Gates and the late Steve Jobs are in their middle 50s).

Humans have always worked in collaboration for important achievements. Unfortunately these days factual mastery is becoming less and less important.

We are into a hodgepodge of moment-driven quantitative analyses.

We now have teenagers who have a great knowledge of math and science but no capability to understand what the value of it is beyond the paycheck they might earn.

Why learn a foreign language when there is Google Translate?  We persist in using skin color as a way of defining individuals.

The capability of the human mind to keep knowledge does not increase at the same rate as the expansion of knowledge. Learning is never done and there is no arbitrary point where we are ready for the real world.

At the least, becoming proficient in another language is key to global integration, not the reverse.

In this computer-driven age, more and more exams in many subjects are multiple-choice guessing contests graded by computer – and even math exams require one to enter the problem’s solution in an exact and strictly timed format, making it possible to get all the right answers and still fail for lack of the dexterity and speed to perform the data entry.

You hear these days, with the current terrorism in the world, that young people are being radicalized. Why?

  1. Because the wheels of commerce are supercharged by the electricity of a million hypocritical handshakes. USA–Iran, Syria, Iraq, Afghanistan, Israel, Palestine.
  2. Because the alternative for the ignorant is to rely on some talking head.
  3. Because New technologies are profoundly altering the way knowledge is conveyed.
  4. Because the pursuit of knowledge and skills is too expensive.
  5. Because we don’t educate our young to gain a full range of basic intellectual skills. Value, Curiosity, Tolerance, Respect, Creativity.
  6. Because ethical thinking, which is crucial for individuals and societies to better address public policy issues, is forgotten as a moral philosophy, leaving a void that invites radicalization away from the materialistic world.
  7. Because we have forgotten that education should be more about how to process and use information and less about imparting it. It has degenerated over the last two generations from a reflective science that gave value to life, through the deep thinking of individuals and the careful review of their peers, into the I’m-all-right-Jack syndrome.
  8. Because we rely almost entirely on passive learning – following the over-trodden paths of conventionalism. If the goal is to turn students into robot-like performers, we’ll certainly accomplish it, but how many of the real world’s problems can you expect to be solved by robots?
  9. Because a big part of stopping radicalization and having students thrive is to give them people to emulate, and people who support their dreams. At the rate and direction we are headed I wonder if students will be able to answer what is 13 squared without a calculator or recall who fought in WWI or II.
  10. Because we should pay for students to spend a semester or more abroad – an educational experience worth every penny of its extra cost. Mastering a language would be far more beneficial than rethinking how to more effectively prepare.
  11. Because we need to do away with the basic notion that you have to learn something before you can move on to the higher-level skills of thinking and analysis, so that learning goes down more easily with students who were raised on the internet.
  12. Because the current system of grouping pupils by age across all subjects by force is archaic.
  13. Because the idea that data and collaboration are “coming” is absurd. We need an education system that is flexible enough to be custom-tailored to each student without the necessity of changing any part of the system itself. The capability of the human mind to keep knowledge does not increase at the same rate as the expansion of knowledge. It’s a common error to believe language exists in a vacuum – that by simply sharing a common lexicon the world’s social, economic and political barriers will fall like leaves in an autumn wind.
  14. Because Specialization is for insects.
  15. Because teaching Mandarin to the next generation wouldn’t be a bad idea, as a one-way communication of English will otherwise lead to global misunderstanding.
  16. Because naked greed, willful ignorance, and incompetence are as common as daylight. Let’s also deliver thoughtfulness.
  17. Because a good example is the concept of our living in a heliocentric system rather than an earth-centric one.
  18. Because, when thinking seriously about the future, many things in life take longer to happen than you think they will, and then happen faster than you thought they could.
  19. Because there is a huge difference between sheer memorization of a tremendous amount of data and actually knowing how to use that data.

Do people leave college with an accurate robust knowledge of probability?

No – it is only partially attained.

Most things are as untrue as they are true.

“A human being should be able to change a nappy, plan an invasion, butcher a pig, steer a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly.” Robert A. Heinlein.

The purpose of public education is to teach students to think, give students real skills, and give students experiences that require use of both.

University is a preserver of the past in terms of values and outlooks and shared “facts.”

Those who do not know history, and have not thought about it, repeat its mistakes – whether or not they can use Google, a library, or other externalized knowledge stores.

The initial development of revolutionary ideas comes from individuals.

Isn’t this the way we should be handing the reins of government, over and over, to the “best and brightest”?

That one expert will, one hopes, teach the controversies about the war, rather than simply teaching their own opinions as fact, but there’s still a real danger of creating an intellectual monoculture that might, like genetically modified crops, have a shared vulnerability.

We’ve already got this (think Tea Party and climate change or evolution).

Educators have a huge problem on their hands: how to empower students to be creators, rather than mere consumers.

Da Vinci alone could not build a jet airplane today. And the people who designed the microchips, wrote the operating system, made the special glass, and so on – weren’t they specialists?

Let me tell you: the first time, as a young man, I went to see the Eiffel Tower, it was surrounded by pigeons; now it’s surrounded by machine guns.

It’s time we taught values – not just our values, but values that we all share and need in order to live on this planet in peace.
← Older posts

All comments and contributions much appreciated

  • THE BEADY EYE ASK’S : ARE OUR LIVES GOING TO BE RULED BY ALGORITHMS. May 20, 2023
  • THE BEADY EYE ASK’S: IS THIS A NIVE QUESTION. IS IT IN NATO INTEREST TO ALLOW THE UK TO SUPPLY CRUISE MISSILES TO THE UKRAIN. May 12, 2023
  • THE BEADY EYE ASK’S: WHAT IS A CORNATION? HAS IT ANY RELEVANCE IN TODAY’S WORLD WITHOUT HMS BRITIANNIA? May 9, 2023
  • THE BEADY EYE SAYS. WHEN IT COMES TO TECHNOLOGY THE JACK IS OUT OF THE BOX AND IT’S MAKING A PIGS MICKEY OUT OF THE WORLD WE LIVE IN. May 5, 2023
  • THE BEADY EYE ASK’S: IS OUR BIOLOGICAL REASONING BEING REPLACED BY DIGITAL REASONING. May 3, 2023

Contact
Moulin de Labarde, 46300 Gourdon, Lot, France
Tel: 0565416842 (7/7, before 6 pm)
