Overemotionalizing Technology

Johnny 5 — alive!
The Washington Post ran a very interesting piece the other day documenting the affinity that our military’s finest tend to develop with the robots placed in their care. The article starts off with, quite literally, a bang:

The most effective way to find and destroy a land mine is to step on it.
This has bad results, of course, if you’re a human. But not so much if you’re a robot and have as many legs as a centipede sticking out from your body. That’s why Mark Tilden, a robotics physicist at the Los Alamos National Laboratory, built something like that. At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.
Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.
The human in command of the exercise, however — an Army colonel — blew a fuse.
The colonel ordered the test stopped.
Why? asked Tilden. What’s wrong?
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.

The rest of the article goes on to note that humans, as a group, tend to ascribe human characteristics to their technological tools, often even when the tool in no way resembles a human or even an animal that might be considered a pet. In my years as a systems admin, I've noticed this applies to users' computers quite frequently. Users personalize their machines, customize their desktops, and so on, and begin to think of them as extensions of themselves, for better or worse.

It's a sentiment I can sympathize with, but I have to admit: it's just plain weird to think of a lump of metal and silicon as having "feelings", or even possessing the capability to be "spiteful" or "uncooperative". Do we ever think of hammers as being "petulant", or of staplers as having it in for us?

I guess we do tend to think of our cars as having personality, though. Does that suggest there is some minimum threshold of moving parts an object needs before humans will anthropomorphize it?

Doug