Is Siri really Apple’s future?

Siri is a promise. A promise of a new computing environment, enormously empowering to the ordinary user, a new paradigm in our evolving relationship with machines. Siri could change Apple’s fortunes like iTunes and the App Store…or end up like the useful-but-inessential FaceTime, the essential-but-difficult Maps or the desirable-but-dead Ping. After spending hundreds of millions on acquiring and improving it, what does Apple expect to gain from Siri, at once the butt of late-night TV jokes and the wonder of teary-eyed TV commercials?

Everyone expects different things from Siri. Some think the top five wishes for Siri should include the ability to change iPhone settings. The impatient already think Siri should have become the omniscient Knowledge Navigator by now. And of course, the favorite pastime of Siri commentators is comparing her query output to Google Search results while giggling.

Siri isn’t a sexy librarian

The Google comparison, while expected and fun, is misplaced. It’d be very hard for Siri (or Bing or Facebook, for that matter) to beat Google at conventional Command Line Interface search, given Google’s decade of intense and admirable algorithmic tuning and enormous infrastructure buildup. Fortunately for competitors, though, Google Search has an Achilles heel: you have to tell Google your intent and essentially instruct the CLI to construct and carry out the search. If you wanted to find a vegetarian restaurant in Quincy, Massachusetts within a price range of $25-$85 and you were a Google Search ninja, you could manually enter a very specific keyword sequence: “restaurant vegetarian quincy ma $25…$85” and still get “about 147,000 results (0.44 seconds)” to parse through. [All examples hereon are grossly simplified.]

[Diagram: linear, keyword-at-a-time filtering of the universal set]

This is a directed navigation system around The Universal Set — the entirety of the Internet. The user has to essentially tell Google his intent one. word. at. a. time, and the search engine progressively filters the universal set with each keyword, from billions of “pages” down to a much smaller set of documents from which the user selects the final answer.
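To see what “linear” means here, consider a toy model of keyword-at-a-time filtering (the corpus, keywords and matching rule below are invented for illustration; a real engine ranks billions of pages with far more sophistication):

```python
# Toy model of keyword-at-a-time filtering over a document set.
# The "universal set" here is a four-document corpus; a real engine ranks billions of pages.
corpus = {
    "doc1": "vegetarian restaurant quincy ma fine dining",
    "doc2": "steakhouse boston ma reservations",
    "doc3": "vegetarian cafe quincy ma cheap eats",
    "doc4": "asian restaurant cambridge ma",
}

def filter_by_keywords(documents, keywords):
    """Each keyword progressively narrows the remaining set of documents."""
    remaining = dict(documents)
    for kw in keywords:
        remaining = {doc_id: text for doc_id, text in remaining.items() if kw in text}
    return remaining

print(filter_by_keywords(corpus, ["restaurant", "vegetarian", "quincy", "ma"]))
# -> {'doc1': 'vegetarian restaurant quincy ma fine dining'}
```

The user still has to pick the final answer out of whatever survives the filter; the engine never knew the intent, only the keywords.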

Passive intelligence

Our computing devices, however, are far more “self-aware” circa 2012. A mobile device, for instance, is considerably more capable of passive intelligence thanks to its GPS, cameras, microphone, radios, gyroscope, myriad other in-device sensors, and dozens of dedicated apps, from finance to games, that know enough about the user to dramatically reduce the number of unknowns…if only all this input and sensing data could somehow be integrated.

Siri’s opportunity to win the hearts and minds of users is to change the rules of the game: away from relatively rigid, linear and largely decontextualized CLI search, towards a much more humane approach where the user declares his intent but doesn’t have to tell Siri how to do it every step of the way. The user starts a spoken conversation with Siri, and Siri puts an impressive array of services together in the background (a minimal sketch of such a pipeline follows the list):

  • precise location, time and task awareness derived from the (mobile) device,
  • speech-to-text, text-to-speech, text-to-intent and dialog flow processing,
  • semantic data, services APIs, task and domain models, and
  • personal and social network data integration.
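What might such a pipeline look like strung together? A minimal sketch follows; every class, function and data value in it is invented for illustration and is not Apple’s actual architecture:

```python
# Hypothetical sketch of the assistant pipeline described above.
# All names and values here are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class DeviceContext:
    location: str = "Quincy, MA"          # from GPS
    local_time: str = "2012-11-05 18:30"  # from the device clock
    calendar_free: bool = True            # from the Calendar app

@dataclass
class Intent:
    action: str                           # e.g. "remind", "reserve", "email"
    entities: dict = field(default_factory=dict)

def speech_to_text(audio: bytes) -> str:
    # Stand-in for the speech recognizer.
    return "remind me to call mom when I get to the office"

def text_to_intent(utterance: str, ctx: DeviceContext) -> Intent:
    # Stand-in for semantic parsing against task and domain models.
    return Intent(action="remind", entities={"who": "mom", "trigger": "arrive:office"})

def dispatch(intent: Intent, ctx: DeviceContext) -> str:
    # Route the parsed intent to device services or partner APIs.
    handlers = {
        "remind": lambda i: f"Reminder set: call {i.entities['who']} on {i.entities['trigger']}"
    }
    return handlers[intent.action](intent)

ctx = DeviceContext()
print(dispatch(text_to_intent(speech_to_text(b"..."), ctx), ctx))
```

The point is not the code but the shape: recognition, intent extraction and dispatch each lean on a different layer of the list above.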

Let’s look at the contrast more closely. Suppose you tell Siri:

“Remind me when I get to the office to make reservations at a restaurant for mom’s birthday and email me the best way to get to her house.”

Siri already knows enough to integrate the Contacts, Calendar, GPS, geo-fencing, Maps, traffic, Mail, Yelp and OpenTable apps and services to complete the overall task. A CLI search engine like Google’s could complete only some of these, and only with a lot of keyword and coordination help from the user.
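One way to picture that fan-out is below; the task list and app bindings are invented, since Apple has not published Siri’s orchestration:

```python
# Hypothetical decomposition of the compound request above into app/service calls.
# The bindings are illustrative only.
request = ("Remind me when I get to the office to make reservations at a restaurant "
           "for mom's birthday and email me the best way to get to her house.")

subtasks = [
    {"service": "Contacts",  "call": "lookup('mom')"},                    # who "mom" is
    {"service": "Calendar",  "call": "lookup_birthday('mom')"},           # when the dinner should be
    {"service": "GeoFence",  "call": "on_arrival('office')"},             # the reminder trigger
    {"service": "Yelp",      "call": "find_restaurants(near='office')"},  # candidate restaurants
    {"service": "OpenTable", "call": "reserve(restaurant, party=2)"},     # the reservation itself
    {"service": "Maps",      "call": "route_to(contact='mom', traffic=True)"},
    {"service": "Mail",      "call": "send(to='me', body='best route to mom')"},
]

for task in subtasks:
    print(f"{task['service']:>10}: {task['call']}")
```

Now let’s change “a restaurant” above to “a nice Asian restaurant”: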

“Remind me when I get to the office to make reservations at a nice Asian restaurant for mom’s birthday and email me the best way to get to her house.”

“Asian” is easy, as any restaurant-related service would make at least a rough attempt to classify eateries by cuisine. But what about “nice”? What does “nice” mean in this context?

A conventional search engine like Google’s would execute a fairly straightforward search for the presence of “nice” in the text of the restaurant reviews available to it (that’s why Google bought Zagat), and perhaps go the extra step of running a “nice AND (romantic OR birthday OR celebration)” compound search to pull in potentially related words. Since search terms can’t be hand-tuned for an infinite number of domains, this kind of curation is reserved for heavily searched categories like finance, travel, electronics, automobiles, etc. In other words, if you’re searching for airline tickets or hotel rooms, the universe of relevant terms is finite, small and well understood. Goat shearing or olive-seed spitting contests, on the other hand, may not benefit as much from such careful human taxonomic curation.
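A toy version of that keyword-expansion approach might look like this (the synonym list and reviews are invented; real engines use curated or learned term sets per domain):

```python
# Toy illustration of expanding "nice" into related review terms and matching.
related = {"nice": {"nice", "romantic", "cozy", "celebration", "birthday"}}

reviews = {
    "Golden Lotus": "cozy spot, great for a birthday dinner",
    "Wok Express":  "fast, cheap, fluorescent lighting",
    "Jade Garden":  "romantic atmosphere, attentive service",
}

def matches(query_word, review_text):
    terms = related.get(query_word, {query_word})
    return any(term in review_text for term in terms)

print([name for name, text in reviews.items() if matches("nice", text)])
# -> ['Golden Lotus', 'Jade Garden']
```

Notice that the expansion still says nothing about you: whether “nice” means $40 or $400 a head is invisible to the review text.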

Context is everything

And yet even when a conventional search engine can correlate “nice” with “romantic” or “cozy” to better filter Asian restaurants, it won’t matter to you if you cannot afford it. Google doesn’t have access to your current bank account, budget or spending habits. So for the restaurant recommendation to be truly useful, it would make sense for it to start at least in a range you could afford, say $$-$$$, but not $$$$ and up.

Herein lies the web browser vs. apps unholy war. A conventional search engine like Google has to maintain an unpalatable level of click-stream snooping to track your financial transactions and build your purchasing profile. That’s not easy (and likely illegal on several continents), especially if you’re not constantly using Google Play or Google Wallet, for example. While your credit card history and your bank account are opaque to Google, your Amex or Chase app has all that info. If you allow Siri to securely link to such apps on your iPhone (a highly selective request, and you trust Siri/Apple), your app and/or Siri can actually interpret what “nice” means within your budget: up to $85 this month, certainly not in the $150-$250 range, and not a $25 hole-in-the-wall Chinese restaurant either, because it’s your mother’s birthday.
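A sketch of that budget arbitration is below; the price tiers, the $85 figure and the idea of a linked card app all come from the scenario above, while the code and its rule of thumb are mine:

```python
# Hypothetical budget-aware interpretation of "nice" for a birthday dinner.
# The $85 monthly dining budget is assumed to come from a linked card/bank app.
PRICE_TIERS = {"$": (0, 25), "$$": (25, 50), "$$$": (50, 100), "$$$$": (100, 10_000)}

def affordable_tiers(dining_budget, occasion="birthday"):
    """Keep tiers within budget, but drop the bottom tier for a special occasion."""
    tiers = [tier for tier, (low, high) in PRICE_TIERS.items() if low < dining_budget]
    if occasion == "birthday" and len(tiers) > 1:
        tiers = tiers[1:]   # no hole-in-the-wall for mom's birthday
    return tiers

print(affordable_tiers(dining_budget=85))   # -> ['$$', '$$$']
```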

Speaking of your mother, her entry in your Contacts app has a custom field next to “Birthday” called “Food” which lists: “Asian,” “Steak,” and “Rishi Organic White Tea”. Google has no idea about any of this, but your Yelp app has 37 restaurants bookmarked by you, and every single one is vegetarian. Your mother may not care, but you need a vegetarian restaurant. Siri can do a proper mapping of the two sets of “likes” and find a mutually agreeable choice at their intersection.
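That “mutually agreeable choice” is, in the simplest reading, a set intersection once both preference lists are machine-readable (all names and data below are invented):

```python
# The "mutually agreeable choice" as a set intersection (all data invented).
moms_food_field = {"asian", "steak"}        # from the Contacts custom field
my_yelp_bookmarks = {                       # every bookmark happens to be vegetarian
    "Golden Lotus": {"asian", "vegetarian"},
    "Sprout House": {"american", "vegetarian"},
}

candidates = {name for name, tags in my_yelp_bookmarks.items()
              if tags & moms_food_field and "vegetarian" in tags}
print(candidates)   # -> {'Golden Lotus'}
```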

So a simple search went from “a restaurant” to “a nice Asian vegetarian restaurant I can afford” because Siri already knew (as in, she can find out on demand) about your cuisine preferences, your mother’s, and your ability to pay:

[Diagram: the restaurant decision chain]

Mind you, this whole series of data lookups and rule arbitrations among multiple apps happens in milliseconds. Quite a bit of your personal info is cached on Apple’s servers, and the vast majority of data lookups in third party apps are highly structured and available in a format Siri has learned (by commercial agreement between companies) to consume directly. Still, the degree of coordination underneath Siri’s reassuring voice is utterly nontrivial. And given the clever “personality” Siri comes with, it sounds like pure magic to ordinary users.

The transactional chain

In theory, Siri’s execution chains can be arbitrarily long. Let’s consider a generic Siri request:

“Check the weather at, and daily traffic conditions to, an event at a specific location, only if my calendar and my wife’s shared calendar are open and tickets are available for under $50 for tomorrow evening.”

Siri would parse it semantically as:

[Diagram: semantic parse of the request]

and translate into an execution chain by apps and services:

[Diagram: the execution chain across apps and services]
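Since the diagrams don’t reproduce here, a rough sketch of such a parse-then-execute chain is below; every condition check and service stub is a placeholder of my own, not Siri’s actual plumbing:

```python
# Rough, invented sketch of the check-then-act chain for the request above.
def calendars_open(evening):   return True     # my calendar + wife's shared calendar
def cheapest_ticket(evening):  return 42.00    # stub ticket-price lookup
def weather_at(venue):         return "clear, 48F"
def traffic_to(venue):         return "22 min via I-93"

def run_chain(venue, evening="tomorrow evening", price_cap=50):
    if not calendars_open(evening):
        return "Skipped: calendar conflict"
    price = cheapest_ticket(evening)
    if price is None or price >= price_cap:
        return "Skipped: no tickets under the price cap"
    return (f"Tickets at ${price:.0f}; weather: {weather_at(venue)}; "
            f"traffic: {traffic_to(venue)}")

print(run_chain(venue="the arena"))
```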

Further, being an integral part of iOS and having programmatic access to third party applications on demand, Siri is fully capable of executing a fictional request like:

“Transfer money to purchase two tickets, move receipt to Passbook, alert in own calendar, email wife, and update shared calendar, then text baby sitter to book her, and remind me later.”

by translating it into a transactional chain, with bundled and 3rd party apps and services acting upon verbs and nouns:

[Diagram: the transactional chain across bundled and third party apps and services]

By parsing a “natural language” request into its structural subject-predicate-object parts and interpreting them semantically, Siri can not only find documents and facts (like Google) but also execute stated or implied actions with granted authority. The ability to form deep semantic lookups, integrate information from multiple sources, devices and 3rd party apps, perform rules arbitration and execute transactions on behalf of the user elevates Siri from a schoolmarmish librarian (à la Google Search) into an indispensable butler, with privileges.
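As a crude illustration of “acting upon verbs and nouns,” the request above reduces to a list of verb-object pairs, each bound to an app (the mapping is mine and purely illustrative):

```python
# Crude, invented mapping of the request's verbs and nouns onto acting apps.
transactional_chain = [
    ("transfer", "money for two tickets",  "payment/bank app"),
    ("move",     "receipt",                "Passbook"),
    ("alert",    "own calendar",           "Calendar"),
    ("email",    "wife",                   "Mail"),
    ("update",   "shared calendar",        "Calendar (shared)"),
    ("text",     "baby sitter",            "Messages"),
    ("remind",   "me, later",              "Reminders"),
]

for verb, obj, app in transactional_chain:
    print(f"{verb:<9} {obj:<24} -> {app}")
```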

The future is Siri and Google knows it

After indexing 40 billion pages and their PageRank, legacy search has largely run its course. That’s why you see Google, for example, buying the world’s largest airline search company, ITA, and the restaurant rating service Zagat, and cloning Yelp/Foursquare with Google Places, Amazon with Google Shopping, iTunes and the App Store with Google Play, Groupon with Google Offers, Hotels.com with Google Hotel Finder…and, ultimately, Siri with Google Now. Google has to accumulate domain-specific data, knowledge and expertise to better disambiguate users’ intent in search. Terms, phrases, names, lemmas, derivations, synonyms, conventions, places, concepts, user reviews and comments…all within a given domain help enormously to resolve issues of context, scope and intent.

Whether surfaced in Search results or in Now, Google is indeed furiously building a semantic engine underneath many of its key services. “Normal search results” at Google are now almost an afterthought once you get past the various Google and third party (overt and covert) promoted services. Google has been giving Siri-like answers directly instead of providing interminable links: if you searched for “Yankees” in the middle of the MLB playoffs, you got real-time scores by inning, first and foremost, not the history of the club, the new stadium, etc.

Siri, a high-maintenance lady?

Google has spent enormous amounts of money on an army of PhDs, algorithm design, servers, data centers and constant refinements to create a global search platform. The ROI on search, in terms of advertising revenue, has been unparalleled in internet history. Apple’s investment in Siri has a much shorter history and a far smaller visible footprint. While it’d be suicidal for Apple to attack Google Search in the realm of finding things, can Apple nevertheless grow Siri to fruition sustainably? Few projects at Apple survive if they can’t at least provide for their own upkeep. Given Apple’s tenuous relationship with direct advertising, is there another business model for Siri?

By 2014, Apple will likely have about 500 million users with access to Siri. If Apple could get half of that user base to generate just a dozen Siri-originated transactions per month (say, worth on average $1 each, with a 30% cut), that would be roughly a $1 billion business. Optimistically, the average transaction could be much more than $1, the number of Siri transactions much higher than 12/month/user, or Siri usage more than 50% of iOS users, especially if Siri were opened to 3rd party apps. While these assumptions are obviously imaginary, even under the most conservative conditions transactional revenue could be considerable. Let’s recall that, even within its media-only coverage, iTunes has now become an $8 billion business.
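Reading those illustrative figures literally, the back-of-envelope arithmetic is simply:

```python
# Back-of-envelope arithmetic using only the illustrative figures quoted above.
users_with_siri = 500e6        # projected iOS users with Siri access by 2014
active_share = 0.5             # half of them transacting via Siri
transactions_per_user = 12     # a dozen Siri-originated transactions per period
avg_transaction_value = 1.00   # dollars
apple_cut = 0.30               # 30% commission

apple_revenue = (users_with_siri * active_share * transactions_per_user *
                 avg_transaction_value * apple_cut)
print(f"${apple_revenue / 1e9:.1f}B")   # -> $0.9B per period of a dozen transactions
```

At the stated rates, Apple’s cut works out to roughly $0.9 billion for each period in which the dozen transactions accrue; whether that period is read as a month or a year is the swing factor in the estimate.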

As Siri moves up the value chain, from its CLI-centric simplicity prior to the Apple acquisition, to its current speech recognition, dictation and search capabilities, to a more conversational interface focused on transactional task completion, she becomes far more interesting and accessible to hundreds of millions of non-computer-savvy users.

Siri as a transaction machine

A transactional Siri carries the seeds of shaking up the $500 billion global advertising industry. For a consumer with intent to purchase, the ideal input comes close to “pure” information, as opposed to an ephemeral ad impression or a series of search results that the user still has to parse. Siri, well-oiled by the very rich contextual awareness of a personal mobile device, could deliver “pure” information with unmatched relevance at the time it’s most needed. Eliminating all intermediaries, Siri could “deliver” a customer directly to a vendor, ready for a transaction Apple doesn’t have to get involved in. Siri simply matches intent and offer more accurately, voluntarily and accountably than any other method at scale that we’ve ever seen.

Another advantage of Siri transactions over display and textual advertising is that what’s transacted doesn’t have to be money. It could be discounts, Passbook coupons, frequent-flyer miles, virtual goods, leader-board rankings, check-in credits, credit card points, iTunes gifts, school course credits and so on. Further, Siri doesn’t even need an interactive screen to communicate and complete tasks. With Eyes Free, Apple is bringing Siri to voice-controlled systems, first in cars, then perhaps to other embedded environments that don’t need a visual UI. With the largest and most lucrative app and content ecosystem on the planet, and half a billion users with as many credit card accounts on file, Apple could make Siri “transactions” an entirely different value proposition for both users and commercial entities.

Siri, too early, too late or merely in progress?

And yet with all that promise, Siri’s future is not a certain one. A few potential barriers stand out:

  • Performance — Siri works mostly in the cloud, so any latency or network disruption renders it useless. It’s hard to overcome this limitation since domain knowledge must be aggregated from millions of users and coordinated with partners’ servers in the cloud.
  • Context — Siri’s promise is not only lexical, but also contextual across countless domains. Eventually, Siri has to understand many languages in over 100 countries where Apple sells iOS devices and navigate the extremely tricky maze of cultural differences and local data/service providers.
  • Partners — Choosing data providers, especially overseas, and maintaining quality control is nontrivial. Apple should also expect bidding wars for partner data, from Google and other competitors.
  • Scope — As Siri becomes more prominent, so grow expectations over its accuracy. Apple is carefully and slowly adding popular domains to Siri coverage, but the “Why can’t Siri answer my question in my {esoteric field}?” refrain is sure to erupt.
  • Operations — As Siri operations grow, Apple will have to seriously increase its staffing levels, not only for engineers from the very small semantic search and AI worlds, but also in the data acquisition, entry and correction processes, as well as business development and sales departments.
  • Leadership — Post-acquisition, two co-founders of Siri have left Apple, although a third, Tom Gruber, remains. Apple recently hired William Stasior, CEO of Amazon’s A9 search engine, to lead Siri. However, Siri needs as much engineering attention as data-partnership building, and Stasior’s A9 is an older, conventional search engine quite different from Siri’s semantic platform.
  • API — Clearly, third party developers want and expect Apple someday to provide an API to Siri. Third party access to Siri is both a gold mine and a minefield for Apple. Since the same or similar data can be supplied via many third parties, access arbitrage could easily become an operational, technical and even legal quagmire.
  • Regulation — A notably successful Siri would mean a bevy of competitors likely to petition DoJ, FTC, FCC here and counterparts in Europe to intervene and slow down Apple with bundling/access accusations until they can catch up.

Obviously, no new platform as far-reaching as Siri comes without issues and risks. It also doesn’t help that the two commercial online successes Apple has had, iTunes and the App Store, were built in another era of technology and still bear the vestiges of many operational shortcomings. More recent efforts such as MobileMe, Ping, Game Center, iCloud, iTunes Match, Passbook, etc., have been less than stellar. Regardless, Siri stands as a monumental opportunity, both for Apple as a transactional money machine and for its users as a new paradigm of discovery and task completion more approachable than any we’ve seen to date. In the end, Siri is Apple’s game to lose.

55 thoughts on “Is Siri really Apple’s future?”

  1. If Siri is the future of Apple I recommend selling every share you own and buying Google instead. Their voice search app is already vastly superior to Apple’s and they have the same power to integrate it in Android. Apple should let them do the same on iOS and stick to their strengths. If voice control turns out to be the make-or-break feature of mobile computing and Apple doesn’t compromise (which they won’t), they’re going to get destroyed by Google like they were by Microsoft 20 years ago.

  2. Google has refined voice recognition with the recent update.
    (The only thing is they haven’t added the speech markup to speak back…that might be happening soon too.)

    This, with their current search (simply feeding the correctly recognized tokens to existing search), gives back great results.

    What Apple should recognize is that Siri is fundamentally broken in two major areas (both require ML approaches).

    Here is one way to interpret those:

    A. Recognize speech into correct language tokens
    (all the while building a custom dictionary of speech phonemes).

    B. Use these tokens properly:
    1. Use any good search engine (it need not be Apple’s, but the user’s default as selected on the iPhone/iPad).
    2. Get results and speak back.
    If the token-content percentage in the top 3-4 results varies greatly, initiate a directed dialog and ask the user the right question again.

  3. Read this and got so excited about Siri, then I used it and got sad. Whilst I do actually like Siri (for me at least it consistently bests Google voice search for speed and word recognition whilst it obviously has an advantage regarding system hooks on iOS) it is still _mostly_ just potential right now. Here’s hoping Apple’s plans are as grand as those you’ve outlined…

  4. Great article, and great comments by Dexter also. Siri is the ‘next big thing’ – a voice-activated platform. It will take a long time.

    Siri + Maps + Passbook can add $100B to Apple’s market cap

  5. You left out Facebook’s Open Graph, which further builds up the semantics of people and things and intents and relationships. Would love to see the terms of the agreement between Apple and Facebook for the OS integration. Couple that with Facebook’s newfound focus on mobile apps instead of HTML5 and it gets even more interesting.

  6. This future was fully imagined by sf great Frederik Pohl in his 1965 novel The Age of the Pussyfoot. In the novel, citizens created an “interests profile” that was augmented by their lifelong verbal interaction with a nationwide computer network using a “joymaker” (http://goo.gl/JGv1H like a super smartphone). The network knew you so well it could predict exactly what activities, jobs, purchases and restaurants you would like the best:

    “…Have you filled out an interests profile?”

    “I don’t think so.”

    “Oh, do! Then it will tell you what programs are on, what parties you will be welcomed at, who you would wish to know. It’s terrible to go on impulse, Charles,” she said earnestly. “Let the joymaker help you.”

    “I don’t understand,” he said. “You mean I should let the joymaker decide what I’m going to do for fun?”

    “Of course. There’s so much. How could you know what you would like?”

    (http://goo.gl/p6ky9)

  7. Siri has several specific things that work for Apple.

    Location is obviously one. The other is hands-free use, as in sending a quick text without typing while driving.

    On the transaction side, Apple iPhone/iPad users are more affluent and use the web far more than their counterparts. Hence, Siri plus an i-device spells income for 3rd party concepts in finding and ordering.

  8. Another vector here: security. The Android platform is the wild, wild west. Though not perfect, Apple does at least QC the app submission process. Trust becomes a huge issue down the line.

  9. Google Now is the (real) competitor to Siri;
    Technically, Google are years ahead because it is a natural evolution of their technology;
    Apple would have to invest heavily in this area as it is not part of their DNA;
    Voice translation per Siri and Google Voice is not the key user interface, because a personal assistant should work with and without voice, e.g. exhibit autonomous behavior;
    Apple does have the data to build a viable personal assistant for their users but doesn’t have the technology;
    AI-based personal assistants should not be the preserve of Apple, Google and other vendors, e.g. healthcare and education organizations should be able to deliver their own.

  10. Having played with Google Now, I believe it’s actually firing off a new search with each word as it is recognised, pretty much instantly, so I think the view that Siri is slower because it has to wait to discover context is incorrect. I suspect Siri is slower simply because the infrastructure behind it is less mature.
    But ask GN “What can you do?” and you get a standard Google search result for “What can you do?”.
    I think the decider here will be whether it’s easier for Apple to drastically increase Siri’s responsiveness or for Google to add contextual awareness of associated services to Google Now.
    Ultimately though, whoever ‘wins’, this type of human-language interface is going to be massive.

    • Google Now is not what they call the predictive search in the task bar. You don’t search for anything with Google Now, it presents results when you’re in a context where you might be interested in them and learns based on your response to them. It’s only available in the newest version of Android, so not a lot of people have played with it yet, but it’s remarkable. It noticed I was getting directions to my home address a few times and asked if it should be marked as home – now that it is, I get traffic updates and commute times in late afternoon. It presents cards for restaurants and events in the area. It gives me detailed flight information. It tracks packages.

      Yes, there’s a search bar when you open the Google app – but if you enter terms there and search, it opens a browser to load them, not a card. If you have JB and haven’t seen cards yet, launch the “google” application once and look more things up. It’ll get started pretty quickly from there.

  11. You are aware, I hope, that iOS 6.1 allows Siri to tender movie ticket reservations (via Fandango, I believe) – which is probably the usual early Apple try at creating a revenue-generating stream for Siri (you can bet that Fandango will pay a percentage to Apple).

    Also, a Siri API has already been released, allowing Siri queries to query third-party apps. It’s rudimentary, but something developers can build on.

  12. Siri is just (SHRDLU + 40 years). Well, not really, it’s maybe 20+ with some voice recog. I use it daily, but need something that works reliably. A nice toy thing though!

  13. Jacqui at Ars Technica had a post today on how Siri and Google voice search handled various queries. I think you’re misrepresenting the difference between the two services. Google performs very well at figuring out ambiguous queries with no explicit context. No, Google cannot do things that Apple doesn’t supply API access for, like setting appointments. And Google, by deliberate policy, has chosen not to be chatty with users (as in “OK, I’m on it.”). That is a matter of style and not of the quality of the answer.

    Here’s what I think: Google has been concentrating on this stuff in house for a long time, hiring all kinds of Ph.D.s. Apple acquired a company that developed Siri, some of whose personnel are still there and some of whom have cashed out. Apple now needs to hire and keep and manage, forever and consistently, a top-flight research and execution group. This is not Apple’s forte. Is it Eddy Cue’s forte? He has about ten hours a week to devote to this after Maps and iCloud and so on. I think Apple’s in trouble. They have a lot of foundation building to do within their company if they want to do stuff like this.

  14. The idea that only Siri can do a conversational interface or has some fundamental technical lead in this area is hopelessly naive. Perhaps you should look up SHRDLU, released in, I kid you not, 1968, which had an even more impressive conversational interface than Siri, and had a built-in planner that understood the state of its virtual world, how to answer questions about it, and how to execute specific multistep actions in it.

    Google could nullify Siri’s supposed non-CLI advantage in a heartbeat; the basic technology is decades old. Siri’s ‘context’ stuff isn’t even as impressive as old AI projects in this regard.

    Remember, Google has some of the industry’s best machine learning and artificial intelligence scientists, they’ve been working in this area at scale for a long time. In fact, they recently published a paper where they trained the largest neural network ever by a factor of 20 and achieved better than human-visual-recognition performance which is used to read street signs in Google Street View or recognize cats on YouTube.

    The other half of the post makes assumptions about advantages Siri may have about knowing about you, but let’s look at the reality of what Google has:
    1) Your Search History
    2) Gmail and Google Talk, your contacts, who you talk to
    3) Files and content you’ve created and people you’ve collaborated with
    4) YouTube views
    5) Stuff bought on Google Play, through Google Checkout, or Google Wallet
    6) Plus everything your Android device learns about you.

    I’d say at this point, Google probably knows far more about you than Siri.

    The only area Siri has an inarguable advantage in is the ability to integrate with iOS, so Google can never ship an application that does what Siri does on iOS, but this is more a Microsoftian strategy of denying competitors access through APIs, or just having an OS that is not as modular and extensible as it should be (e.g. no Intent system like Android)

    • “I’d say at this point, Google probably knows far more about you than Siri.”

      It did for me, but no longer. Microsoft was the high-tech thief of the 20th century; Google is that of the 21st century, and they make MS look like amateurs at it. Wouldn’t touch them with a barge pole.

      Google: Don’t do evil – it’s our job.

  15. Plus – it’s not even about English, or voice recognition. The data sources are also important. In Europe, we don’t use Yelp or Zagat, and we don’t know what the hell MLB is. Each country here has its own portals, in their own languages. Siri would have to support all of them.

    • It’d really help if you actually read the article first:

      “Partners — Choosing data providers, especially overseas, and maintaining quality control is nontrivial. Apple should also expect bidding wars for partner data, from Google and other competitors.”

  16. Articles like this always forget that Siri works well only in English. And voice recognition as such has only worked for English like since forever. The USA is not the only place on earth, and if Apple wants to be a global player they would have to teach Siri at least 500 languages. OK, maybe start with 100.

    • Did you miss this?

      “Context — Siri’s promise is not only lexical, but also contextual across countless domains. Eventually, Siri has to understand many languages in over 100 countries where Apple sells iOS devices and navigate the extremely tricky maze of cultural differences and local data/service providers.”

  17. The thing that clangs in my mind is the conflict between Siri being a $1 billion business while also being a privileged butler. Can it be both?

  18. Using Google voice search right now. Voice recognition is almost exactly as slow as Siri. Or as fast….

    Google is everything Siri is not. And vice versa. Siri can tap into my iPhone’s data while Google has a much better voice and more comprehensive search. I find myself using both: Siri to help me schedule, make lists, send texts and organize my life, and Google to fill in the gaps (and they’re significant gaps).

  19. I tried some of the queries in the article and most of them work reliably with Siri:

    1. Remind me to call my mom when I get home tomorrow – works perfectly.

    2. I want to make reservations at an Italian restaurant. Works and filters appropriately.

    Most queries related to stocks, weather, sports, movies, restaurants, contacts, calendars and directions work.

    It would be nice if it could make reservations for flights.

    • Except that Google Now is only really useful in the US – like many other Google services.
      Historically, Google has left the rest of the world out to dry – new features take years (yeah, years) to appear, and if they do (because they don’t always do), they don’t really work as expected, or not at all. I won’t even mention things like Google Voice – hell, not even transit directions work at all!
      I’ve been on an internship in France these past months, and Google’s public transportation directions send me walking (yes, walking) across the street. So much for Google being awesome… I won’t even go into mapping and wrong/nonexistent streets… Google Now sounds great, it really does – the rest of the world simply doesn’t care, though – it doesn’t work there.

      Also, historically, Apple has been much better at worrying about and providing support for other countries (its track record is not stellar, but it is much faster and more reliable than Google’s). I mean, they already have local restaurant and movie finding in Portugal integrated with Siri – and Portugal is way down most companies’ list of support priorities. I only expect it to get better with Apple. I expect no support from Google whatsoever. I really hope Apple takes advantage of this opportunity and improves on Siri like a rabid dog.

    • I am sure Google Now is available in Australia and many other places. But are you sure it’s exactly the same as in the US? Google Maps is also available in France and Portugal. Is it as good there? Definitely not. We have no transit directions (unless you count walking as such). In Portugal, streets are commonly wrong. There are no services available other than searching for places. There are far fewer POIs.

      Maybe my point didn’t come across completely. Google’s services are very often not available in the same way they are in the US. I obviously can’t speak for every country, but I can speak for where I live, and the places I visit. Google is very, very slow to provide an even reasonably similar level of service outside the US, and sometimes a couple of other outliers. It has been like this for years.

      There are no restaurants, flights, hotels, sports, traffic information, transportation information, package tracking, movies, concerts…I have no clue whatever else they have in other places. Here it’s barren. Hell, Siri has more than Google, and it’s a lot younger. Sure, bigger countries tend to get these features earlier, but even then, they usually take a lot longer.

      I’m glad you have all of those features on Google Now (do you?) in your home country. Here? I have maps with directions. Still cool, but I honestly don’t care about Google Now – it is worthless here – it does not work.

  20. Great article. I think some of the commenters here have missed your point entirely (‘tl;dr’ is an immediate indicator of a moron. Nothing personal to that commenter, but he should know that’s what intelligent people think when they see it).

    This article wasn’t about what Siri is now; it was about what Siri has the potential to become. It wasn’t an Apple vs Google article; it was merely stating that Google’s current setup – which is perfectly valid in 2012 and is mentioned because it is the undisputed market leader in search and advertising – won’t cut it in 2015/2020/2025 if Apple, Microsoft or even Google themselves can make good on the promise of a technology like Siri.

    It’s a food for thought article; eat it and think.

  21. Google knows as much about me as Apple does. I don’t think the distinction that you’re trying to make is valid.

  22. Apple has always sold over-priced products. Its two big wins in twenty years were an aberration. It will have to introduce something stellar, like foldable tablets that fit in a shirt pocket, to continue such impressive success. The competitive rush has already led to phones that are a far better value than the iPhone (Galaxy S3 sales far surpass iPhone numbers)… because VALUE will win… the same reason over 90% of home computers run Microsoft, and Apple played the sore bullied loser until they had their 5 years of fame.

    • The iPod’s and iPhone’s successes were not accidents. The iPod worked because of seamless iTunes integration and simple syncing. The first iPhone was a radical new paradigm for a mobile phone. It’s hard to look back and remember that the market was still monochrome Nokias, and even smartphones were trying to pass off itty-bitty keyboards and compromised mobile web experiences.

      The iPad could have been made by any company and it would have been successful. No file systems, infinitely nested folders, etc. Just a clean matrix of easily identifiable applications. Microsoft et al squandered years trying to push watered-down versions of their clunky Windows interface. Android suffers the same with its weird menagerie of space-whoring widgets (does a search box really need half the screen?), apps hidden in drawers (why?!), etc. The iPhone interface is far from perfect. But it has used the opportunity to challenge (for the better) some fundamental assumptions we have about the required complexity of computer interfaces.

      There are two sides to the “value” discussion: what you pay and what you get. Yes, Apple products have healthy margins (just try and buy a cheap Mac with a decent graphics card!) but they also retain incredible resale prices. I genuinely love using my Apple products in a way that I never, ever experienced with the products of any other company (aside from Nintendo). Every time my iPad responds to touch input without latency or I appreciate the gloriously rendered typography in Mac OS X (versus the tacky blocky fonts in Windows) I literally feel better about the world. How much “value” do those bargain-basement $99 HP Palm OS tablets really represent two years on?

      User base is also only half the story of doing business. Profit and an active and loyal customer base are key. Apple products consistently top their categories in consumer satisfaction, usage statistics and consumers’ willingness to invest in the application ecosystem. Apple consistently makes a profit in excess of their market share. This is a GOOD thing for the consumer. It means they are investing in an ecosystem with a future. The worst outcome would be that the apps you invest in become worthless because the manufacturer went bankrupt. No manufacturer really wants a lock on the whiny-cheapskate market. That market spawned netbooks, and industry profits (and future support) have been negligible ever since.

    • How about 3 wins in 10 years, plus the iMac all-in-one and the MacBook Air? They reinvented three different markets. How many wins does MS have in 10 years?

      The GS3 did finally pass the iPhone 4S. Congrats to them on outshipping a 10-month-old phone in a single month.

  23. Sorry, but nobody needs Siri. Had this with an early Windows (CE) phone already and never ever used it. I will never talk to my phone; too silly.

  24. tl;dr you like Siri and built your conclusion around the premise. In reality, however, nothing in this post is factual. There’s a flawed hypothesis early on: “you have to tell Google your intent and essentially instruct the CLI to construct and carry out the search”

    Why is this true? Hint: trick question, it’s not true. Why can Siri have this magical “passive self-awareness circa 2012” but Google can’t? Location-aware search is just an implementation detail. An implementation detail which, by the way, Google has already implemented.

    Browsers already have location awareness. This isn’t circa 2012, this is circa way earlier than 2012 (the earliest incarnations of geo-localization date back to at least 2008, if not earlier). Google also has text-to-speech and speech-to-text thanks to HTML5 support. And, with Google+, it also has a social network to mine extra context from… and finally, with Android’s inherent ability for third party apps to integrate with each other, the better social sites like Facebook/Twitter could also lend a hand.

    In short, Google already has everything Siri does. I can already speak “cheap vegetarian restaurant” into my phone and get results in my current physical area at the price range I want.

    The rest of your post is mooted by this fact.

    • I detest your dreary world of “facts.” Facts, facts, facts should NEVER trump dreams, dreams, dreams. You’d have a few fewer things to criticize if Steve Jobs wasn’t able to make his dreams come true with Apple. But then again, you’d probably find plenty more things to fill that gap.

    • Vangarde, the fact that you’re an Apple customer and actually said what you just did is making me want to switch to Android. To be associated, even if only in my own mind, with such vapidity is painful.

    • Talk about butt hurt, you didn’t read the article but felt a need to post a reply. Let me guess, even the fat chicks find you repulsive.

  25. Great article and projection.

    I’ve already found myself using Siri on a regular basis for the convenient things like alarms and timers, reminders and calls, in spite of its shortcomings.

    I use Google as well, but its rapid search is a one-dimensional equivalent of what I already achieve with Siri, which also entertains and impresses me as well as my friends while doing so.

    The possibilities are intriguing.

  26. After trying Google Voice Search on my iPhone, I have concluded that Siri is years behind Google in the voice recognition part of these services. Google voice search is instantaneous, whereas Siri is extremely slow by comparison most of the time. In fact, all too frequently Siri gets bogged down and doesn’t even respond. Siri’s only advantage is that it alone has direct access to the iPhone, so it can play a song or open an app. As for Internet searches, don’t even bother with Siri as it will take more than twice as long to get results.

    • The latency you experience is not related to voice recognition. Apple’s implementation of “the cloud” is in fact years behind Google’s. Apple’s data centers are newer and smaller in capacity, and therefore not as responsive. This will change as Apple brings more centers online, but for now it is definitely slower.

    • Like the article states, Google Now is performing a CLI search and CAN filter results as you speak words, just as it does when you type words in a search field. (Example: open iTunes and enter something in the search field; results are instantaneous because it’s filtering as you’re typing the CLI search query.)

      Siri cannot do this, as it has to “know” your entire intention and the context in which to perform it. Siri is not a “search” engine; it’s an assistant for performing tasks. One of those tasks may be to search the Internet, but it doesn’t know that until after you’re done telling it.

      Of course Google Now is going to perform internet searches much faster; that’s what it is designed to do.

  27. Tim Cook said iCloud was the platform for the next ten years.
    Siri is the equivalent of iTunes in the “computer is your hub” analogy.
    Apple would have to have greater knowledge of the user than Google in order for Siri to go to the next level. But I think it is more of an incremental upgrade path, with home automation, an automobile assistant, an office assistant, etc. Sadly, AI is just a distant dream; at least Apple doesn’t try to sell Siri as such.

    Then again, you didn’t say anything about Dragon being a non-Apple technology that Apple depends on for the Siri voice.
    Which is another red flag.

    • Really, Jean-Louis Gassée? In one of the interactions above, Kontra writes “… and email me the best way to get to her house.” That’s so archaic. As you must know, Google Now would have shown a time-to-leave card with directions and traffic conditions at the right time. And that’s just one example. I’m afraid he’s skating on thin ice when it comes to talking about areas that aren’t Apple’s core strengths — design, UX, supply chain, etc.

  28. Siri could also stand some intonation to make her responses easier to understand.

    Her curt “I live to serve” is merely less funny, but her monotonic “Take exit 123 for 56th street” is almost incomprehensible.
