Hacker News

Why would I do that? If I want to know what Siri can do, I can just say "Siri, (do the thing)" and find out. That's -way- easier than scrolling through a list (and more accurate, to boot; just because she can do it, if I can't figure out an acceptable magical incantation, she can't, for all intents and purposes).

Just looking over the list still doesn't address the OP's point. Maybe if I memorized it; in the same way that, if I memorized an exhaustive list of everything a 4-year-old child can do, I'd be able to 'intuit' what they could do. But why would I do that?

That's just it; the whole 'theory of mind' is basically the idea that we can intuit what someone else can do, think, etc., without such a list. I'm able to limit my own mind to have the same limits as someone else. That is, I can imagine what someone else is likely thinking given a subset (or even theoretical superset! I.e., "They know if there is money in this account. If there is, they are likely to do X. If there isn't, they are likely to do Y") of information that I have. I can determine what a child will be able to do based on exposure to other things they can do.

None of that applies to Siri. I can't infer what she has access to (both in terms of data and functionality). I can't use capabilities of one thing to infer capabilities in another. She can order me an Uber; can she order me a pizza? Can she order me a highly detailed expandable oak table from a boutique vendor? I can infer what a real PA is likely to be able to do (even if I don't know the specifics of how), but I can't do that with Siri. It's a black box. Giving me an exhaustive list of all the things doesn't change that; it's now a black box with a manual. Yes, okay, maybe if I memorize the manual I can determine what she can do, but the point the OP is making is that for a real PA I can infer what capabilities they have without memorizing a manual. Until Siri can offer the same, she's not a replacement, feels gimmicky, and has real barriers to adoption to overcome.



>Why would I do that? If I want to know what Siri can do, I can just say "Siri, (do the thing)" and find out. That's -way- easier than scrolling through a list (and more accurate, to boot; just because she can do it, if I can't figure out an acceptable magical incantation, she can't, for all intents and purposes).

Funny, I'm the opposite. I like the ability to flip through and see "Oh, I didn't know it could do that!" Then I mentally file it away as a thing that exists.

I would never have learned that Siri (via Wolfram Alpha) can tell me what planes are overhead just by trying it, because I would never have thought to ask that. But since I read a list of interesting things Siri knows, I now know it has that information.

Just trying to guess what capabilities are available is like trying to learn how a unix command works with no man page. "Just run it with every possible flag and see what happens!" It'd be great if Siri could do everything, but she can't, and the search space of possible actions with natural language is far too large to find everything I might use by guesswork. A black box with a manual is better than a black box without one.
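To put a rough number on that search space: even a hypothetical command with nothing but 26 boolean single-letter flags (no arguments, no flag values) admits 2^26 distinct invocations. A quick back-of-the-envelope sketch:

```shell
# Hypothetical command with 26 on/off single-letter flags (a-z).
# Each flag is independently present or absent, so the number of
# distinct invocations is 2^26 -- already far too many to brute-force.
flags=26
combos=$((2 ** flags))
echo "$combos"
```

That prints 67108864, and natural-language phrasings are vastly more numerous than flag combinations, which is the point: you can't discover capabilities by exhaustive guessing.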


In the example you gave, you learned about a specific thing Siri can do: tell you what planes are overhead. Now say you know from experience that Wolfram Alpha also provides the current altitude and speed of those planes. If you had a strong ToM for Siri, you would have a very good intuition about whether you could rely on Siri for that info as well. As it is, I have no idea. Do you? Don't you think it'd be a much better experience if we did know what to expect?
