Watch surprise input method: Nose
I have noticed the following unexpected things about interaction with the watch:
- some actions start hands-free (voice activated) but then require a tap to complete.
- these same situations are also not eyes-free. This didn't matter as much on the phone, but it's really noticeable on the watch. Example: using Siri to send a message requires voice or a tap to start, then eyes and a tap to finish.
- when hands-free and eyes-free works, it's amazing. Example: once navigation is live, you get audible and haptic feedback.
- I sometimes use my nose to wake the watch, swipe, and even select buttons. Example: the other hand isn't available and you need to tap the screen.
Hands-free is very helpful. Hands-free plus eyes-free is incredible when it works and noticeably missing when it's absent.
If they added Siri audible prompts and voice responses (the way it works on the phone) and designed for eyes-free use on the watch, the watch experience would be even more liberating and exhilarating.
Note: the accessibility options seem to require eyes and hands on. To me this isn't the right approach at all, and it's a really awkward solution for those with vision impairment.
What we all need is hands- and eyes-free interaction. Siri would be absolutely amazing if it had those goals in mind. The phone Siri gets close. Somehow the watch Siri is behind in this regard.
The watch could get us closer to the Star Trek computer voice interaction we all dream of. The watch is right there on your wrist, ready to be that interface. It's so close, and not there yet. Remarkably, these are not technical hurdles to be cleared but interface choices made not to focus on hands- and eyes-free use.