Friday, September 21, 2018

AI weapons will transform war

The most terrifying film of the year didn’t come from Hollywood.

It came from a think tank looking to save us all from killer machines.

In the movie’s near future dystopia, palm-sized drones loaded up with explosives use facial recognition to hunt down and slaughter people with pinpoint precision. A series of devastating attacks sweep the countryside. Swarms of the micro murderers tear through Congress, massacring Senators based on ideology. Terrorists unleash a horde of the flying monsters into schools to take out the kids of parents who dare to speak out against the threat.

What makes it all so horrifying is how close we are to making it a reality.

I’m no AI alarmist.

Machine learning will bring us incredible new powers, from skin cancer detection on your cell phone to exotic materials design to accelerated drug discovery.

Although it’s possible AI will have a devastating effect on jobs, it’s just as possible that AI brings us into a booming golden age of brand new jobs we can only begin to imagine.

While Terminators and superintelligent AIs taking over the world are mostly a fantasy problem that won’t matter for a hundred years, if ever, these machines are right around the corner.

Facial recognition powered by convolutional neural nets works incredibly well.

You can buy dozens of tiny drones from Amazon today. A number of open-source robot operating systems already live on GitHub. Even worse, a trip through the dark web will likely net you the formula for countless explosives.

Put it all together and you’ve got yourself a smart bullet ready to strike fear into the heart of the world.

But that’s just the beginning.

The biggest chip makers on the planet, from Nvidia to Intel, along with several well-funded stealth startups, are hard at work building chips to supercharge a Cambrian explosion of intelligent apps.

They’re crafting new architectures and shrinking it all down to power the self-driving cars, trucks, surgery bots, and house cleaning machines of the near future.

And make no mistake: those very same chips will power autonomous weapons too.

As these chips get smarter and smarter, as well as smaller and smaller, they’ll give these micro-killers all kinds of new capabilities, from evasion and countermeasures to coordinated drone-battalion assaults.

Those weapons are already here

The US Department of Defense released a video just last year showing micro-fighter jets dropped from a plane over the Navy’s cutting-edge research facility in China Lake, California.

Watch as the weapons roll and swoop to attack and defend warplanes with ease, their metallic screeches echoing from your nightmares. They use algorithms developed from studying insects and wolf pack predator behavior.

It’s not hard to imagine top guns of tomorrow dropping legions of these bots to storm enemy aircraft. They’ll smash into wings and engines from all sides with explosive fury and send warbirds spiraling from the sky.

And they built all that with the underpowered chips of today.

What can they do with the specialist AI chips that will flood the market in the next few years?

It won’t take long for something once considered sci-fi to become a dark reality.

Even worse, I just met with one of the world’s top AI researchers, Vian Chinner. I told him about the movie and how close it seemed to reality if we could just shrink those chips down far enough. He said something that chilled me to the bone.

“We don’t even need them. With wireless you could centralize the intelligence and control them all remotely.”

That means they’re possible today, not tomorrow.

Game Theory, Black Budgets, and Terror

As a futurist, I love unsolvable problems. Whenever I spot a problem that has no answer, I can’t resist thinking about it.

How can we stop something like this from happening?

To start with, you can get involved.

The group that created the movie is a non-profit dedicated to dismantling smart weapons before they even get off the ground. They timed the release to coincide with a UN convention on autonomous weapons. Like Elon Musk and a hundred other tech luminaries, they favor a wholesale ban on AI attack systems.

I’m 100% in favor of the ban. I also think it has almost zero chance of working.

Let’s take a look at why and why we shouldn’t let that stop us.

The first reason the ban will fail is that these weapons will prove far too tempting to world leaders and militaries.

Russian President Putin already said that whoever leads AI will “rule the world.” Military masterminds will see these slaughterbots as a way to revolutionize warfare. Believe it or not, they’ll see them as more humane.

And if you think about it, they’re not wrong.

Dropping massive bombs on houses does devastating collateral damage.

They kill the wrong people, mow down innocent women and children, smash the local infrastructure to oblivion, create overwhelming humanitarian refugee crises that threaten to boil over into fresh conflicts, and turn the populace against you as mothers and fathers lose children in a whirling storm of steel.

Precision AI weapons could sweep all of that away. No more buildings coming down and innocent kids with their heads splattered across the asphalt on the nightly news.

Instead, these dark predators simply swoop in under cover of night and rip the faces off the bad guys before they even know what hit them.

It’s a lot cheaper than SEAL Team Six and missiles.

It costs around $350,000 to train a single Navy SEAL and another $1 million a year to keep them in the field. A Patriot missile costs $3 million a pop. A drone like this might cost a few thousand dollars. Maybe a super-advanced one will cost fifty grand.

There’s an old joke that military intelligence is an oxymoron, but it doesn’t take a genius to do the ROI on those numbers.
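The ROI is easy to sketch from the figures quoted above. A minimal back-of-the-envelope calculation, assuming a ten-year field deployment for the SEAL comparison (that window is my own illustrative assumption, not a sourced number):

```python
# Rough cost comparison using the figures quoted in the text.
SEAL_TRAINING = 350_000        # train one Navy SEAL
SEAL_ANNUAL = 1_000_000        # keep one SEAL in the field, per year
PATRIOT_MISSILE = 3_000_000    # one Patriot missile
DRONE_LOW, DRONE_HIGH = 2_000, 50_000  # cheap drone vs. "super-advanced" drone

years_deployed = 10  # assumed service window for the comparison
seal_total = SEAL_TRAINING + SEAL_ANNUAL * years_deployed

print(f"One SEAL over {years_deployed} years: ${seal_total:,}")
print(f"Drones for that budget: {seal_total // DRONE_HIGH:,} to {seal_total // DRONE_LOW:,}")
print(f"Drones per Patriot missile: {PATRIOT_MISSILE // DRONE_HIGH:,} to {PATRIOT_MISSILE // DRONE_LOW:,}")
```

Even at the high end of the drone price range, one SEAL-sized budget buys hundreds of them, and one Patriot missile buys dozens.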

There’s not a single military planner in the world who won’t want to place an order for ten million of these micro monsters as soon as they go up for sale.

Another likely reason for the ban’s failure is that not every country in the world will agree to it.

If even one nation is working on these things, game theory tells us everyone else will too. Like nuclear weapons, you can’t risk someone having them if you don’t because they can roll over you in any coming conflict.

It’s an existential threat.

Even if every country agrees to the ban, it really doesn’t matter. The biggest countries in the world will just go ahead and build them anyway with secret black budgets.

The real problem though is that these weapons are well within reach of just about anyone. 

If we manage to ban them at the nation-state level, they’ll prove too easy to build with off-the-shelf components, 3D printers, open-source software, and a little willpower. That means rogue actors and terrorists will get their hands on them sooner rather than later.

The next 9/11 might not be a building coming down but a precision-guided attack on perceived infidels.

Knowing all of that, I still think we should try to keep them off the battlefield.