A computer learned to beat the classic strategy game Civilization, Ars Technica reports, with a manual to guide it. Apparently the system knew to look for certain phrases, but it had to learn what they meant by trial and error. Researchers say the computer went from winning 17% of its games to winning 54% once it got to “read” the manual.
That’s cool enough. But what’ll really blow your brain is what happened when it read the Wall Street Journal.
To test the system’s robustness, the researchers also threw in phrases from the WSJ. The system usually knew to throw out the Journal — it hasn’t trusted that paper ever since the Murdoch buyout — and use the relevant manual.
But! When it did use the Journal, the system still did better: 40% victorious, compared to 17% with nothing and 54% with the manual. Basically, the software stumbled on enough useful patterns while trying to work some sense out of the Journal that it covered more than half the gap between knowing nothing and having the truth.
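That "more than half the gap" claim is easy to sanity-check with the three win rates quoted above (a back-of-the-envelope sketch, nothing more):

```python
# Win rates reported in the story
no_text = 0.17  # playing blind, no text at all
manual = 0.54   # with the actual game manual
wsj = 0.40      # with Wall Street Journal text instead

# What fraction of the blind-to-manual gap does the WSJ alone cover?
gap_covered = (wsj - no_text) / (manual - no_text)
print(f"{gap_covered:.0%}")  # prints "62%"
```

So the irrelevant newspaper gets the system about 62% of the way from clueless to fully briefed.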
Tada! Bible! Give a computer enough rules about price-earnings ratios or mildewing your unleavened concubines on the Sabbath, and it’ll find a way to turn them into something useful. But it’ll never do as well as it would with the real rules.
So that’s lovely. We’re learning how to make software do what we do: divine patterns in irrelevant bullshit (no offense to the Journal — I’m sure the Times is a bad game manual too) and apply them to our lives.
We’ve just got one step left: Get the computers to cling to their fake manuals even when they’re demonstrably wrong, and go to war with King Darius I every damn game.
Illustration by Caribou on Fark