The Human Element – Part 1

By Hank Hair, Young Online

Remember the Matthew Broderick movie “WarGames”? The United States government had created the “WOPR,” or War Operation Plan Response, computer. Here is a link to the scene where the WOPR is described: https://youtu.be/iRsycWRQrc8.

This computer was developed because the top military brass were concerned that when the time eventually came for our men and women in the missile silos to turn the keys and launch a nuclear attack or response, they couldn’t do it! They knew that the moral and ethical responsibility for killing thousands, possibly millions, of people with the simple turn of a key would weigh heavily on their minds and keep them from completing the task.

So, using historical, empirical data provided by military intelligence, WOPR played war games 24 hours a day, 365 days a year, learning how to defeat the enemies of the United States. Its job was specifically to learn how to win at all costs – even at the risk of the total annihilation of the entire human race!

Matthew Broderick’s character, a hacker and gamer by today’s standards, accidentally hacks into the NORAD WOPR computer, believing it to be the computer of a gaming company (Protovision) whose new games he wants to play before they are released to the public. Unbeknownst to him, he triggers WOPR into thinking the United States is under attack and at war with Russia, nearly starting World War III.

This scenario is very scary and truly extreme, but it is a warning to us never to leave the human element out of technology. With all the new AI technologies being developed, giving computers more permission to learn and operate on their own, that warning is more relevant today than ever before. Had anyone thought through what the benefit of having this computer was, and what the real “end of the game” would be?

Did they consider that if they cut WOPR’s power source, it would conclude the enemy had cut its power and, in an attempt to win “the game,” automatically trigger a full-scale launch of nuclear weapons?

Did they consider that their computer would learn to lock them out, so they would not be able to stop a launch once it was initiated?

Had they made a mistake in totally trusting the technology they had created?

Should they have made provisions so that when the technology could not solve an issue, humans could step in? It is food for thought.

Technological advances since this motion picture was made have increased a thousand-fold; just look at the size of the computers in that clip. The movie came out 41 years ago, yet people were thinking about AI and machine learning even then. All that hardware could probably fit into something the size of, let us say, your cell phone. But technology doesn’t have to be scary or intimidating, and it doesn’t have to be made difficult. All it takes is an open mind and the willingness to try something new.

With all the new technology coming down the pike, the opportunities to integrate the best of it into the claims industry are endless. The first question we must ask ourselves is: what is the benefit of using all of this new technology?

How is this new technology going to make my life and job easier? Is it going to save us time? Is it going to save us money? These questions are limitless, and they are the inspiration behind innovation and the development of new technologies. Problems arise every day, and solutions to those problems are being found through newly developed technology. Necessity is the mother of invention.

Look for Part 2 in the June newsletter for thought-provoking answers!

If you would like more information about this article, please contact Hank Hair with Young Online at Hank.hair@youngonline.com or 404-863-8447.


This is a publication of Southern Loss Association, Inc., P.O. Box 421564, Atlanta, GA 30342. The articles published on this website are in a general format and are not intended to be legal advice applicable to any specific circumstances. Legal opinions may vary when based on subtle factual differences. All rights reserved.