When Apple’s new iPad was unveiled last week, one of the features users had hoped for didn’t come with it. Siri, the voice-controlled personal assistant that’s been such a hit on the iPhone 4S, wasn’t among the tablet’s new features. (Apple did add a dictation feature, but it has none of Siri’s interactivity; it’s strictly one-way.)
Despite the disappointment of Siri users, this is actually not a bad move on Apple’s part. Siri is still in beta and could use a little polishing before being rolled out to the iPad. Even though I found in my first month of use that it is good enough to change users’ habits, Apple clearly wants to make damn sure Siri works as billed. Even in beta, Siri’s easy interaction, fast results and sometimes quirky responses produce an emotional reaction that has encouraged people to use it — a lot.
It also sent rivals scrambling to come up with a similar service. Since Siri’s debut in October, several Android and iPhone apps have tried, without much luck, to recreate the basic experience, and companies are reinvesting resources in their own virtual assistants. Not surprisingly, Google is hoping to create its own version of Siri.
Meanwhile, Siri’s prominence continues to grow: A quarter of Wolfram Alpha requests are now attributed to Siri, and there was even an episode of The Big Bang Theory featuring a subplot dedicated to Siri. Apparently, she is the only female whom Raj can actually speak to while sober.
There are even spoof videos on YouTube showing how Siri wants to kill you. That may not be the message Apple wants, but it’s a sure sign that the technology has entered the zeitgeist.
I now find myself relying on Siri about a dozen times a day, mostly during my daily commute. But what’s good enough for me (for now) isn’t good enough for Apple. With that in mind, there are some needed enhancements I’d like to see before Siri is in the hands of millions more users.
The most important feature Siri needs is reliability. Not surprisingly, Siri users want a number of new features — more commands, access to more databases and, for those not in the US, more everything. (Localised support for Siri continues to expand; iOS 5.1 offers Japanese support.) But for me, reliability is the most important thing Apple should be working on.
In its current form, Siri still has some weak points: It needs an active internet connection, and Apple’s servers must be reachable to process Siri commands. There haven’t been widespread outages similar to those that occurred last fall, but there are still moments when it is unable to connect to a network. Needless to say, Siri is no longer a timesaver if you’ve spent the last 20 seconds waiting to find out that it can’t connect to the network. (Issues with the technology even prompted a lawsuit against Apple over Siri’s shortcomings.)
Timing adds to the frustration, as there’s nothing like realising that your long-winded “note to self” has disappeared into the ether. Sure, the annoyance is softened by cute responses, such as: “My mind is going, Mike, I can feel it. I can feel it.” But the inability to operate is still a failure. With enough of those failures — including the long wait for Siri to time out — users will begin to question the logic of using voice commands instead of just doing something themselves.
When it comes to new technology, users are often quick to judge and slow to change their minds. Once a technology gets a bad rap, whether deserved or not, it’s in trouble.
Local functions on the phone?
The solution is easy to pinpoint, but apparently harder to implement: Siri commands for local phone functions should be processed on the phone itself. As it stands, all of Siri’s processing happens on the back end, on Apple’s servers. The good news: Any upgrades to Siri’s servers mean everyone benefits at the same time. The bad news: You still must be able to connect to Siri’s servers. Removing the need for an active network connection would make Siri more reliable for local functions.
Beyond reliability, there are a few no-brainer changes needed. Siri should be able to manipulate software settings, such as toggling Bluetooth or Wi-Fi on and off; changing the screen brightness; launching apps; and integrating with third-party apps. But I’m more interested in expanding Siri’s conversation range so that anything that can be done via touch can be done via voice.
I’d like to see Apple expand Siri’s Hands-Free operation, perhaps by making it more chatty. Currently, when Siri is asked a question, it either displays the answer as a graphic or, sometimes, replies out loud. Siri gives more information verbally, such as reading back a dictated text message, when Hands-Free mode is detected. The way I figure it, if I’m asking Siri out loud instead of manually manipulating the screen to get the answer, I’m probably in a situation where I can’t easily look at the screen. Give me the answer out loud, not as a display on the screen.
A good example is when I use traffic in Maps. Downtown Orlando isn’t always the easiest stretch to drive through during peak hours and there are days when I want to know what the local traffic is like. I’d feel better about asking Siri to tell me local traffic if I didn’t have to divert my eyes from the road to see an answer. In fact, tying the voice of Siri into the Maps app so that, when asked, Siri literally guides you to your destination would be a logical step. With the current system, where Siri displays a map graphic and a few routes, the process seems incomplete.
Another example of where more interactivity would help involves answers to search results. Mac OS X has a built-in Summary service that can summarise paragraphs effectively enough. Why not bring that tech to Siri?
Then the conversation, such as it is, would go like this:
Me: “Siri, who is Ben Franklin?”
Siri: “Ben Franklin was one of the founding fathers of the United States. Do you want to read more about him?”
Me: “No, just give me highlights.”
Since I spend a lot of time in my car, I’d like access to reading material stored in Reading List or Instapaper. Currently, you can activate text-to-speech under Settings > General > Accessibility > Speak Selection, then tap Select All on an article’s text. After tapping the Speak popup option, the computer voice reads the article. But integration with my text reader would be really welcome.
Beyond a more interactive Siri, I’d love to see a more proactive Siri. If I receive alerts when Siri is in Hands-Free mode, it would be nice if Siri actually volunteered more information. A proactive Siri could tell me who an email is from and actually read out loud the subject line, perhaps asking if I’d like the rest of the text read out loud. This proactive approach can be applied to text messages, calendar invites and really any kind of notification.
For example, when a calendar notification arrives, Siri could say, “Your boss just sent a meeting invitation for next Wednesday. You have nothing scheduled for that time. Would you like me to create an appointment for you?”
Or, in another case, the conversation could go like this:
Siri: “Mike, you have an email from Jack about the recent Mac you fixed for him.”
Me: “Thanks. Remind me to call him when I get home.”
If Apple were to take that route, it should include the ability to turn off chatty Siri for those who don’t want to be bothered with extra information. That way, I could respond: “Not now, Siri; don’t interrupt my music for this trip.”
With a more proactive Siri in Hands-Free mode, she could follow up requests with, “Can I help you with anything else?” That would be an opportunity to continue with another command. Or Siri could be dismissed with a “No, thank you.”
When I work from home, I have the luxury of being in a quiet environment. That’s when having an active listening mode for Siri would be appreciated. If the iPhone could respond when Siri’s name is called, that would be helpful in many situations, especially if you’re like me and people only try to communicate with you after you jump in the shower.
Apple would be smart to roll out a universal translator, like one shown on The Verge; it should add the ability to search the App Store (“Siri, what are the top 10 apps for iPhone?”; “Siri, tell me about the best adventure game for iPad”); and it could offer personality packs (unlikely, given that consistency is part of Apple’s brand, but the Iron Man JARVIS personality or Star Trek computer voice packs would surely be a hit).
Where to, now?
Don’t get me wrong: I’m still very positive about Siri; time spent in my car is no longer wasted, as I can still be productive using the Hands-Free system while dictating notes, reminders and messages. The countless taps I can now avoid with voice controls ensure that I use Siri all the time. There are some areas where Siri could use a next step or two, but I can’t recall an interaction method as pleasing or engaging as when Siri is working properly.
Whenever Apple focuses on a specific technology, rivals offer up similar efforts to stay competitive. The result is a technological tide that raises all boats and benefits everyone. Obviously, Siri’s capabilities will continue to grow; in a few years, I expect we’ll look back at its early limitations and marvel at how far it has come.
I also think it’s inevitable that as Siri becomes more reliable, it will find its way across the whole of Apple’s product lines (especially given its clear usefulness as an aid to accessibility). That means Siri will eventually turn up on Apple’s computers and, of course, the iPad.
For now, though, we’ll have to wait for the technology to mature.