Apple has reportedly put together a team of voice parsing specialists in Boston, including former Nuance employees, which has led to some speculation that Apple is looking to roll their own engine for Siri. Google did this from the beginning with Google Now, and the power and flexibility that gave them is often cited as one of the reasons Google Now is better, faster, and more localized than Apple's Nuance relationship allows. According to Xconomy:
Fittingly, the current Boston-area Apple speech team all once worked at VoiceSignal Technologies, a speech software company that was purchased for $293 million by Burlington, MA-based Nuance in 2007.
That includes Gunnar Evermann, Larry Gillick, and Don McAllaster, and notably, Apple is apparently letting them all stay in Boston rather than relocating them to headquarters in Cupertino.
Architecturally, Apple could certainly replace Nuance with their own voice parsing engine. Practically, Nuance owns so many patents in the area, and is so insanely, obliteratingly litigious about them, that it could be annoying and expensive for Apple to roll their own. In other words, Nuance's practice of buying out competitors, suing them out of the space, or both, makes either acquiring or replacing them painful. Of course, Apple has so much money that they can absorb a lot of pain if they choose to.
The alternative is that this team, based in Boston where Nuance is headquartered, is working on making Siri work better with the existing Nuance engine. Or perhaps it's both.
Either way, Google is far ahead in this area, and with natural language and voice being a critical part of the future of human interfaces, it's something Apple has to pay attention to and put considerable resources behind.
Looks like they're doing that. Hopefully we'll see the results in device-side processing, and better, faster, more reliable querying. I mean, how's Siri working for you lately?
Source: http://feedproxy.google.com/~r/TheIphoneBlog/~3/bAkdBzJDGe4/story01.htm