Is AI Just Another Magic 8 Ball?

For twenty or thirty years, we’ve seen films and television shows featuring robots and computers with personalities. These have often made for good entertainment.

Sometimes these AI-like characteristics were combined with supernatural powers. That requires a certain suspension of disbelief, but in the interest of a good story, I have often made that sacrifice.

(Do you believe in talking rabbits, bottles marked “drink me,” or AI’s ability to make sports predictions?)

But do people believe that AI has supernatural powers?

Here we have an article telling us who is going to win the next twenty Super Bowls, arrived at by asking ChatGPT. It is very much like having your horoscope read, throwing dice, casting the bones as in Scandinavian practice, or maybe doing some magical writing: you know, putting pen to paper, looking away, writing frantically, and seeing whether your magical powers manifest.

I strongly suspect someone somewhere is taking this nonsense seriously.

From a story by List Wire entitled “ChatGPT predicts the next 20 Super Bowl champions in the NFL, does your team win it all?”:

https://sports.yahoo.com/article/chatgpt-predicts-next-20-super-150033619.html

According to ChatGPT’s A.I., here are the teams predicted to win the next 20 Super Bowls in the NFL.

And then it has a list.

Once again, let me be clear. This is nonsense. AI is no better a predictor of sports outcomes than a Magic 8 Ball or a Ouija board.
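
To make the comparison concrete, here is a minimal Python sketch of a Magic 8 Ball style “forecast.” Everything in it is illustrative: the team list is abbreviated, the starting year is arbitrary, and the point is only that a confident-looking list of winners can be produced with no predictive power whatsoever, which is roughly what you get when you ask a chat model to guess the future.

import random

# Partial, purely illustrative list of NFL teams; the real league has 32.
TEAMS = [
    "Chiefs", "Eagles", "49ers", "Bills",
    "Cowboys", "Ravens", "Bengals", "Lions",
]

def magic_8_ball_champions(n_years=20):
    """Pick a 'champion' for each of the next n_years uniformly at random.

    This has no predictive power at all, which is the point: a
    confident-sounding list is not the same thing as a forecast.
    """
    return [random.choice(TEAMS) for _ in range(n_years)]

if __name__ == "__main__":
    # Starting year chosen arbitrarily for the example.
    for year, team in enumerate(magic_8_ball_champions(), start=2025):
        print(f"{year}: {team}")

Run it twice and you will get two different dynasties, which is about as much consistency as the horoscope method offers.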

I think most people know this. I hope so, anyway. But sometimes, reading the press reports on AI and its developing capabilities, I get the sense that there are those who think it has, or will have, god-like powers.

For instance, we have the concept of a Technological Singularity. Here are my friends at Wikipedia attempting to define the term:

The technological singularity—or simply the singularity[1]—is a hypothetical point in time at which technological growth becomes alien to humans, uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good’s intelligence explosion model of 1965, an upgradable intelligent agent could eventually enter a positive feedback loop of successive self-improvement cycles; more intelligent generations would appear more and more rapidly, causing a rapid increase in intelligence that culminates in a powerful superintelligence, far surpassing human intelligence.[4]

Now, that sucker might predict some football games and, on the downside, kill all of humanity. But it would be, in a real and strange way, magical, at least in terms of human perception.

I seem to recall that great legend of science fiction, Arthur C. Clarke, saying that to a more primitive civilization, the advances of technology have the appearance of magic (or words to that effect).

Maybe we are on the road to something like that?

But let me reassure you that, based on my training and my experience, AI currently has no such predictive powers. That could change, but I have seen nothing that leads me to believe anything of that nature has happened or is likely to happen. Not soon.

James Alan Pilant