General Aviation News

Because flying is cool


Pilot pens new thriller

By General Aviation News Staff · December 27, 2013 ·

WICHITA FALLS, Texas – According to the FAA, close to 80% of aviation accidents are caused by human error.

Norman Harrison, an aeronautics instructor with 26 years of experience as a U.S. naval aviator, has a special interest in this issue. Combining the present-day realities of aviation with artificial intelligence gone wrong, Harrison has written a new techno-thriller, “The Entity.”

“Many people believe that artificial intelligence going rogue will eventually become a reality,” Harrison says. “Unfortunately, if safeguards are not in place, my story may well become a reality.”

“The Entity” begins with the Air Force attempting to transfer the results of a top-secret, completed mission to the Space Systems Command. An artificial intelligence system (the Entity) begins to replicate itself thousands of times, creating a worldwide matrix, all while no one is aware it exists.

Built on the premise of the unpredictability of artificial intelligence, “The Entity” combines reality with fantasy in a narrative that involves top-secret military projects and international intrigue, according to the publisher.

“Today, we have drones that can do an entire mission with the controller located thousands of miles from the aircraft,” Harrison said. “It is not beyond one’s imagination that someday artificial intelligence might grow beyond the intelligence of its creators.”

“The Entity” is available in softcover ($13.49) or as an e-book ($2.99).

It is available at amazon.com, barnesandnoble.com, and bookstore.authorhouse.com.


Comments

  1. Bill Leavens says

    December 30, 2013 at 5:46 pm

    Isaac Asimov wrestled with this more than 70 years ago and penned the Three Laws of Robotics:

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    I hope somebody besides me was paying attention in class that day.
