EP# 1568, The Politics Of A.I., The Siri SNAFU

by quasar-atl - 12/6/11 4:13 PM

Hey Buzz Gang and Crew,
Here's some food for thought:

It is interesting to read how abortion advocates are rallying around one
small function of a simple little tool that is still in beta. The voice
assistant that Apple engineered really is a fascinating product, and yet
politics has already crept into its creation even before the product has been
fully deployed. What is even more surprising is that intellectual minds like
those of CNET's own editorial staff have failed to notice this fact. Human
beings will bring out the worst in Artificial Intelligence capabilities.

To note:
If you remember, in "2001: A Space Odyssey," the HAL 9000 computer became
corrupt when human beings told it to lie to the astronauts aboard the
spacecraft "Discovery" about the presence of the monolith.

In "2010", Dr. Chandra, "Bob Balaban", says, "HAL was told to lie - by people
who find it easy to lie. HAL doesn't know how". This means that human beings
will place their own interest ahead of others or even the good of society.

In "The Matrix Reloaded", The Architect, "Helmut Bakaitis", says, "Denial is the most predictable of all human responses" which literally means that people are more willing to accept pessimism and negative conclusions versus the actual truth.

In "Terminator 2", The T-1000, "Arnold Schwarzenegger" says, "It's In Your
Nature To Destroy Yourselves" meaning human beings have a tendency to be
inherently corrupt toward each other.

We human beings tend to be pessimistic creatures, which in turn can bring out the worst in ourselves. Because of this nature, we may inadvertently build these traits into the technology we create, which is the whole point of the examples above.

Apple was probably truthful in indicating that Siri's lack of abortion
information was not intentional, but no matter. Our inherently distrustful nature
will likely make Siri corrupt at some point in the future, simply because we as
human beings cannot accept A.I. technology for what it is: simply a tool. By
applying a political and emotional agenda to Siri, we inadvertently bring out the
worst in ourselves. This event alone could allow those imperfect human traits to be placed into our A.I. tools. Unfortunately, whether you agree or disagree with this post will not matter, because "it's in your nature to destroy yourselves."

Something to think about. Later, people.