Inference is the core technique for determining what happened when you have little or no data. Lewis Mumford was dissatisfied with the stone tools that had been found all over the world, dating back hundreds of thousands of years. Not because he did not consider them telling of the state of technology employed by Paleolithic humans, but because he thought they fed an over-emphasis on early humans as tool makers.
Indeed, a recent find of a seven-pound rock flaked into a “tool” by a species ancestral to us some 3 million years ago would have reinforced what he inferred from such finds: the evolution of human civilization took place in the mind much more than in the flaking of stones into sharp edges. The tool that humans perfected first was their own bodies, their minds, their power to communicate and pass on history to one another. Nice that they could flake flint or obsidian. Mumford dismissed the over-reliance on the artifacts that happen to persist as a failure of reasoning on the part of the people examining them. And whether you call it inference or deduction, there are things that must have existed, though they left no concrete trace, in order for the artifacts to have come into being.
I want to follow Mumford’s lead. Consider his idea that technological advances are rarely made at first with utilitarian purposes firmly understood or even wished for. If that was true in the Paleolithic, why should we think it is less true today?
Maybe it is somewhat less true, of course. There’s too much money to be made for us to just let technology evolve slowly to a point of being useful. But behind the hard-nosed business cases of all the big data start-ups out there, there is still something worth noting that is not strictly speaking a matter of revenue.
Credit scoring is at least 40 years old. Actuarial analysis of large amounts of data goes back at least that far. In the time from then to now, there have been large data warehouse projects, advances in Business Intelligence tools, and even more recent database technologies that go beyond SQL. Analytic techniques like Six Sigma have demonstrated that if “it” is not data driven, “it” is out of control, almost regardless of what “it” is. For this reason, we tend to see the development of Big Data as a matter of advances in database technology and analytics.
We can even point to “folk” databases such as the logs that diabetics have kept, writing their blood sugar levels multiple times a day into notebooks. Or immigration record books at Ellis Island. Or the banal evil of the record books from places like Buchenwald (see Friedlander and Milton, The Holocaust: Ideology, Bureaucracy, and Genocide. The San Jose Papers. Millwood, NY: Kraus International Publications, 1980).
And the Social Sciences give us examples of analytics going back to the 1930s as well, notably the beginnings of interpreting survey data.
Mumford’s point was always that the invention of a new application of a technology (for example, flaking stones to make them sharp) pales in comparison to the social underpinnings that make the technology acceptable and useful. How can this change our thinking about Big Data and, perhaps, about the Internet of Things?
Perhaps the technological advances mean nothing without a cultural shift. And that cultural shift is the elevation of the clerical (in its secular meaning): the belief that keeping a record is as important as the transaction it records. This shift is more to be inferred than proven, since without it, the technologies we see emerging would not be nearly as useful.