If you look back not that far in history, the word “computer” was a term applied to a person working in a particular role. According to Webster’s 1828 American Dictionary of the English Language, a computer was defined as “[o]ne who computes or reckons; one who estimates or considers the force and effect of causes, with a view to form a correct estimate of the effects”.
Going back a bit further, to Samuel Johnson’s magnum opus, we see a “computer” is defined more concisely as a “reckoner” or “accountant”.
On a disambiguation page, Wikipedia identifies Computer_(job_description), quoting Turing’s Computing Machinery and Intelligence paper in Mind (Volume LIX, Issue 236, October 1950, Pages 433–460):
The human computer is supposed to be following fixed rules; he has no authority to deviate from them in any detail. We may suppose that these rules are supplied in a book, which is altered whenever he is put on to a new job.
Skimming through a paper that appeared in my feeds today — CHARTDIALOGS: Plotting from Natural Language Instructions [ACL 2020; code repo] — the following jumped out at me:
Humans. As computers. Again.
Originally, the computer was a person doing a mechanical task.
Now, a computer is a digital device.
Now a computer aspires to be AI, artificial (human) intelligence.
Now AI is, in many cases, behind the Wizard of Oz curtain, inside von Kempelen’s “The Turk” automaton (not…), a human.
Human Inside.
A couple of other things that jumped out at me, relating to instrumentation and comparison between machines:
Just as you might compare the performance of different implementations of an algorithm in code, we also compare the performance of their instantiation in digital or human computers.
At the moment, for “intelligence” tasks (and it’s maybe worth noting that Mechanical Turk has work packages defined as HITs, “Human Intelligence Tasks”), humans are regarded as providing the benchmark gold standard, imperfect as it is.
Dehumanising?
See also: Robot Workers?