The line between man and machine

Saw this on Jon Udell's blog via the #RDFIG chump feed, from Sergey Brin: "I'd rather make progress by having computers understand what humans write, than by forcing humans to write in ways computers can understand."

Well, sandro on #rdfig writes "Why am I arguing with a sound-bite?" Why not? :) Here's a counter-sound-bite: Use Newton handwriting recognition, then try Palm's Graffiti, and come back and tell me which seemed more worthwhile.

The way I look at it, people have muscle memory; they form habitual patterns and adapt around interfaces until those interfaces become transparent and second nature. That is, as long as the interface doesn't stray too far from usability. I think Graffiti was a good compromise between machine and human understanding. Let the machine focus with its autistic intensity on the task at hand, and let the human fill in the gaps. This is why I fully expect to see Intelligence Amplification arrive many, many moons before Artificial Intelligence, if the latter arrives at all.

I doubt that machines will ever come up far enough to meet man, but man and machine can meet halfway and still have an astonishing relationship. So, we can spend enormous resources trying to make computers understand people (who barely understand themselves), or we can make understandable interfaces and mostly intelligible systems and fudge the rest.


Archived Comments

  • For my money, the Newton 2x00's handwriting recognition kicks Graffiti all the way up the street and then back down again. I'd say it's at least 90% accurate for my handwriting. And of the errors, the vast majority can be fixed with a double-tap on the incorrect word and a pick from the alternatives list. And that was in 1998. I've yet to see anyone do it better since. Your mileage may vary. :)
  • Yeah, mileage varies, and that's why sound bites suck :) As for myself, I could never get a Newton to reliably understand my writing, which consists of printing with tall and short capital letters. Whereas I was up and speeding along with Graffiti in the first day. I've also heard many similar stories, and there were jokes about it, so I figured it'd be a good sound bite.
  • Having done Newton development, I can offer some commentary here. One thing many users didn't appreciate was how various decisions were being made by the software. Granted, they shouldn't have to care in most situations. But once you understood what the recognizer was trying to do, you could do a select few things to 'help' it avoid screwing things up. For example, taking just a little extra care when starting to write a word would greatly increase its accuracy. But for various reasons this little step wasn't shared with users. So while the HWR had its own defects, leaving out how to avoid the most fundamental of them meant it was a downhill slide, and the users had no idea why. One other staggering problem was, and is, shortsightedness for the sake of being stingy. There was a wealth of rich metadata that could have been collected as users created data. But since there were serious memory/CPU/power limitations, it wasn't practical to 'keep everything'. Couple this with the fact that programmers don't generally work in 'real world' conditions, and you end up with arbitrary decisions made early that harm long-term versatility later. What ends up happening is that the machine can't make the sort of insightful guesses the user wants, because the earlier programmers didn't think it was necessary to let the earliest parts of the system collect the data needed. As a result, later efforts try to synthetically guess what was intended, and those guesses are often wrong (or terribly complicated to implement). For some reason developers vehemently fight metadata as if it were something horrible. They continue to do so, and the users suffer. Sound bites only make it worse.