So many days spent "writing" go by in blurry fits and starts. Those quotes are there because, while I typed a lot for many hours, and while the Word document got longer and took up slightly more disk space, I didn’t really write all that much—i.e., start piecing together arguments, or catching glimpses of what they might be. I did manage to find a really fun "sidetrack" in trying to situate this wild dude named Paget, whom Burke liked, an English physicist-phoneticist who spent lots of his time fashioning models of the human larynx out of paraffin, plasticine, and rubber bands. He would give little public demonstrations of these models in various locations around London (this is circa 1926), and have the models say things like "Hullo!" and "London, are you there?" This apparently tore up the audience, inasmuch as a 1926 London audience could be torn up.
My sidetrack/diversion wasn’t so much these laryngeal demonstrations, which I’ve known about for some time, but some sifting through broader scholarship having to do with this strange tendency to want to produce “working” models of human parts, a tendency that Jessica Riskin, a history of science professor at Stanford, traces to at least the 18th century. She compares similar kinds of rudimentary models with contemporary efforts to create artificial intelligence and also with the notion of “wetware,” a very new term used by computer scientists and science fiction writers to refer to the brain and nervous system. Riskin has an awesome article in Critical Inquiry that talks about one wacky French guy by the name of Jacques Vaucanson, who in 1738 fashioned a mechanical duck with see-through feathers that would, also in front of an audience, eat grain. As if that weren't enough, while the audience looked on, apparently slack-mouthed, the mechanical duck would poop.
I think one Christmas I got a doll that peed, but the novelty, believe me, wore off far more quickly than my niece and nephew's Q-20 game, whereby the computer, by asking a series of questions, can frequently guess the word in your head. It's almost uncanny that the word my niece and nephew both most frequently have in their respective heads is, you guessed it: poop. Or one of its many synonyms. Q-20 can guess this one, by the way, in about six questions.
So what is it that fellows like Paget and Vaucanson and presumably their audiences learned from these simulated models? If Riskin is right, and I think she is, these zany little experiments served to test the boundaries between human or animal and machine, to determine which bodily “mechanisms” work in a machine-like fashion, and which ones don’t. The interesting thing, as she points out in her wetware article, is that such boundaries tend to be tested on actions that are deemed prime indicators of life—like defecating or talking.
So while my diversion took me away from Burke, who frankly didn’t think much of machines and in fact thought they were The Big Problem, I’m still pretty sure today’s sifting wasn’t diversive at all—maybe a fit, but hopefully a start.