(Apologies for blogging so infrequently this month. I'm currently up to my elbows in The Labyrinth Index, with a tight deadline to hit if the book's going to be published next July. Blogging will continue to be infrequent, but hopefully as provocative as usual.)
Remember Orwell's 1984 and his description of the world ahead—"if you want a vision of the future, imagine a boot stamping on a human face, forever"?
This is the 21st century, and we can do better.
George got the telescreens and cameras and the stench of omnipresent surveillance right, but he was writing in the age of microfilm and 3x5 index cards. Data storage was prodigiously expensive and mass communication networks were centralized and costly to run — it wasn't practical for amateurs to set up a decentralized, end-to-end encrypted shadow network tunnelling over the public phone system, or to run private anonymous blogs in the classified columns of newspapers. He was also writing in the age of mass-mobilization of labour and intercontinental warfare. Limned in the backdrop to 1984 is a world where atom bombs have been used in warfare and are no longer used by the great powers, by tacit agreement. Instead, we see soldiers and machine-guns and refugees and the presentation of inevitable border wars and genocides between the three giant power blocs.
Been there, done that.
What we have today is a vision of 1984 disrupted by a torrent of data storage. Circa 1972-73, total US manufacturing volume of online computer storage — hard drives and RAM and core memory, but not tape — amounted to some 100GB/year. Today, my cellphone has about double that capacity. I'm guessing that my desk probably supports the entire planetary installed digital media volume of 1980. (I'm looking at about 10TB of disks ...) There's a good chance that anything that happens in front of a camera, and anything that transits the internet, will be preserved digitally into the indefinite future, for however long some major state or corporate institution considers it of interest. And when I'm talking about large-scale data retention, just to clue you in, Amazon already offers a commercial data transfer and storage service, AWS Snowmobile, whereby a gigantic trailer full of storage will drive up to the loading bay of your data center and download everything. It's currently good for up to 100PB per Snowmobile load. (1PB is a million gigabytes; 1EB is a billion gigabytes; ten Snowmobile loads is 1EB, or about ten million times 1973's worth of global hard drive manufacturing capacity.) Folks, Amazon wouldn't be offering this product if there wasn't a market for it.
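Those unit conversions are easy to sanity-check. A throwaway back-of-envelope script, using only the round figures quoted above, confirms the ratios:

```python
# Back-of-envelope check of the storage comparison above.
# All figures are the round numbers from the paragraph, not precise data.
GB = 1
PB = 1_000_000 * GB   # 1 PB = a million gigabytes
EB = 1_000 * PB       # 1 EB = a billion gigabytes

drives_1973 = 100 * GB   # approx. total online-storage manufacturing, 1972-73
snowmobile = 100 * PB    # capacity of one AWS Snowmobile load

loads_per_exabyte = EB / snowmobile        # 10 trailer loads per exabyte
years_1973_per_exabyte = EB / drives_1973  # ten million 1973s per exabyte

print(loads_per_exabyte, years_1973_per_exabyte)
```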
These heaps and drifts of retained data (and metadata) can be subjected to analytical processes not yet invented — historic data is still useful. And some of the potential applications of neural network driven deep learning and machine vision are really hair-raising. We've all seen video of mass demonstrations over the past year. A paper to be presented at the IEEE International Conference on Computer Vision Workshops (ICCVW) introduces a deep-learning algorithm that can identify an individual even when part of their face is obscured. The system was able to correctly identify a person concealed by a scarf 67 percent of the time against a "complex" background. Police already routinely record demonstrations: now they'll be able to apply offline analytics to work out who was there and track protestors' activities in the long term ... and coordinate with public CCTV and face recognition networks to arrest them long afterwards, if they're so inclined.
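The back end of such a system boils down to comparing face "embeddings" — vectors a neural network produces from an image — against a gallery of known identities. A minimal sketch of that matching step (toy vectors stand in for a real model's output; the names, dimensions, and threshold are illustrative, not taken from the paper):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Return the best-matching identity from the gallery, or None if no match
    clears the similarity threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Toy gallery: in a real system these would be high-dimensional vectors
# extracted from photos by a trained network.
gallery = {
    "alice": np.array([1.0, 0.0, 0.0]),
    "bob":   np.array([0.0, 1.0, 0.0]),
}
probe = np.array([0.9, 0.1, 0.0])  # a partially obscured face still embeds near "alice"
print(identify(probe, gallery))     # prints "alice"
```

A partially obscured face shifts the embedding but, as the 67-percent figure suggests, often not far enough to fall below the match threshold — which is exactly why a scarf is weaker protection than it looks.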
It turns out that facial recognition neural networks can be trained to accurately recognize pain! The researchers were doubtless thinking of clinical medical applications — doctors are bad at objectively evaluating patients' expressions of pain and patients often don't self-evaluate effectively — but just think how much use this technology might be to a regime bent on using torture as a tool of social repression (like, oh, Egypt or Syria today). They also appear to be better than human beings at evaluating the sexual orientation of a subject, which might be of interest in President Pence's Republic of Gilead, or Chechnya, or Iran. (There's still a terrible false positive rate, but hey, you can't build an algorithmic dictatorship without breaking heads.)
(Footnote: it also turns out that neural networks and data mining in general are really good at reinforcing the prejudices of their programmers, and embedding them in hardware. Here's a racist hand dryer — its proximity sensor simply doesn't work on dark skin! Engineers with untested assumptions about the human subjects of their machines can wreak havoc.)
All of this is pretty horrific — so far, so 2017 — but I'd like to throw two more web pages in your face. Firstly, the Gerasimov Doctrine, which appears to shape Russian infowar practices against the west. We've seen glaring evidence of Russian tampering in the recent US presidential election, including bulk buying of micro-targeted Facebook ads, not focussing on particular candidates but on party-affiliated hot-button issues such as race, gay rights, gun control, and immigration. (I'm not touching the allegations about bribery and Trump with a barge pole — that way lies the gibbering spectre of Louise Mensch — but the evidence for the use of borderline-illegal advertising to energize voters and prod them in a particular direction looks overwhelming.) Here's a translation of Gerasimov's paper, titled The Value of Science Is in the Foresight: New Challenges Demand Rethinking the Forms and Methods of Carrying out Combat Operations. As he's the Russian army Chief of General Staff, what he says can be taken as gospel, and he's saying things like, "the focus of applied methods of conflict has altered in the direction of the broad use of political, economic, informational, humanitarian, and other nonmilitary [my emphasis] measures — applied in coordination with the protest potential of the population". This isn't your grandpa's ministry of propaganda. Our social media have inadvertently created a swamp of "false news" in which superficially attractive memes outcompete the truth because humans are lousy at distinguishing between lies which reinforce their existing prejudices and an objective assessment of the situation. And this has created a battlefield where indirect stealth attacks on elections have become routine to the point where savvy campaigns pre-emptively place bait for hackers.
There are a couple of rays of hope, however. The United Nations Development Program recently released a report, Journey to extremism in Africa: drivers, incentives and the tipping point for recruitment, that pointed out the deficiencies in the Emperor's wardrobe with respect to security services. Religion and ideology are post-hoc excuses for recruitment into extremist groups: the truth is somewhat different. "The research specifically set out to discover what pushed a handful of individuals to join violent extremist groups, when many others facing similar sets of circumstances did not. This specific moment or factor is referred to as the 'tipping point'. The idea of a transformative trigger that pushes individuals decisively from the 'at-risk' category to actually taking the step of joining is substantiated by the Journey to Extremism data. A striking 71 percent pointed to 'government action', including 'killing of a family member or friend' or 'arrest of a family member or friend', as the incident that prompted them to join. These findings throw into stark relief the question of how counter-terrorism and wider security functions of governments in at-risk environments conduct themselves with regard to human rights and due process. State security-actor conduct is revealed as a prominent accelerator of recruitment, rather than the reverse." In fact, the best defenses against generating recruits for extremist organizations seemed to be things like reduced social and economic exclusion (poverty), improved education, having a family (peer pressure), and not being on the receiving end of violent repression. Because violence breeds more violence — who knew? (Not the CIA and USAF with their typical "oops" response whenever a drone blows up a wedding party they've mistaken for Al Qaida Central.)
So, let me put some stuff together.
We're living in a period where everything we do in public can be observed, recorded, and will in future provide the grist for deductive mills deployed by the authorities. (Hideous tools of data-driven repression are emerging almost daily without much notice, whether through malice or because they have socially useful applications and the developers are blind to the potential for abuse.) Foreign state-level actors and non-state groupings (such as the new fascist international and its hive of internet-connected insurgents) are now able to use data mining techniques to target individuals with opinions likely to appeal to their prejudices and inflame them into activism. Democracy is directly threatened by these techniques and may not survive in its current form, although there are suggestions that what technology broke, technology might help fix (TLDR: blockchain-enabled e-voting, from the European Parliament Think Tank). And there are some signs that our existing transnational frameworks are beginning to recognize that repressive policing is one of the worst possible shields against terrorism.
Social solidarity. Tolerance. Openness. Transparency that runs up as well as down the personal-institutional scale. And, possibly, better tools for authenticating public statements such as votes, tweets, and blog essays like this one. These are what we need to cleave to if we're not going to live out our lives in a shiny algorithmic big data hellscape.