
  #11 · Senior Member · Join Date: Nov 2005
    > Haven't you read up on web 3.0?
    Don't get me started on Web 3.0, but yes. Semantics require the same kind of AI that we're talking about -- and we ain't got that.

    Notice the "stuff" surrounding Web 2.0 -- where people differ on what the term means? ...With everybody getting all hot and bothered because the vast majority haven't a clue?!

    That is semantics. And if you thought the grammar nazis were goofy, you'll love the semantics nazis (trust me, I've gotten involved in the ABBR-versus-ACRONYM debate).

    > Google spits back results. That's pretty huge.

    You're probably talking about SayWhere. Getting a result from a keyword parser isn't voice recognition, and Google spitting out a result is not huge.

    "Ahem. Google?"

    "Please show me the route from [point A] to [point B] that avoids the traffic jam."

    That's huge. Recognizing that the shortest route isn't necessarily the fastest is huge. That's Web 3.0. And that ain't happenin'.

    Spitting out a result is programmer-centric -- it neatly removes the human context. You give input, the result is output -- what more do you want? Humans don't want a SERP as output; they want what they're looking for.

    Guess what. The killer app for search is FIND. Not 23,654 SERPs.

    Here is what a Web 3.0 level smart phone does...

    SenSay is a context-aware mobile phone that adapts to dynamically changing environmental and physiological states. In addition to manipulating ringer volume, vibration, and phone alerts, SenSay can provide remote callers with the ability to communicate the urgency of their calls, make call suggestions to users when they are idle, and provide the caller with feedback on the current status of the SenSay user. A number of sensors including accelerometers, light, and microphones are mounted at various points on the body to provide data about the user's context. A decision module uses a set of rules to analyze the sensor data and manage a state machine composed of uninterruptible, idle, active and normal states. Results from our threshold analyses show a clear delineation can be made among several user states by examining sensor data trends. SenSay augments its contextual knowledge by tapping into applications such as electronic calendars, address books, and task lists. The phone alleviates cognitive load on users by various methods including detecting when the user is uninterruptible and automatically turning the ringer off.
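    The decision-module idea above -- threshold rules over sensor data driving a four-state machine -- can be sketched in a few lines. This is purely an illustrative mock-up under made-up thresholds and field names, not the actual SenSay implementation:

    ```python
    # Hypothetical sketch of a SenSay-style decision module. All thresholds,
    # field names, and actions are illustrative assumptions, not the real system.
    from dataclasses import dataclass
    from enum import Enum, auto

    class UserState(Enum):
        UNINTERRUPTIBLE = auto()
        IDLE = auto()
        ACTIVE = auto()
        NORMAL = auto()

    @dataclass
    class SensorData:
        accel_variance: float   # movement level from body-mounted accelerometers
        light_level: float      # ambient light, 0.0 (dark) to 1.0 (bright)
        ambient_noise: float    # microphone loudness, 0.0 to 1.0
        in_meeting: bool        # pulled from the electronic calendar

    def decide_state(data: SensorData) -> UserState:
        """Apply simple threshold rules to classify the user's state."""
        # Calendar says meeting, or loud conversation in a dark room:
        if data.in_meeting or (data.ambient_noise > 0.7 and data.light_level < 0.2):
            return UserState.UNINTERRUPTIBLE
        # Almost no movement and low noise: user is idle, good time to suggest calls.
        if data.accel_variance < 0.05 and data.ambient_noise < 0.3:
            return UserState.IDLE
        # High movement: user is physically active (walking, running).
        if data.accel_variance > 0.5:
            return UserState.ACTIVE
        return UserState.NORMAL

    def ringer_action(state: UserState) -> str:
        """Map each state to a phone behavior, as the quoted description suggests."""
        return {
            UserState.UNINTERRUPTIBLE: "ringer off, vibrate only",
            UserState.IDLE: "normal ring, suggest returning calls",
            UserState.ACTIVE: "louder ring and vibration",
            UserState.NORMAL: "normal ring",
        }[state]
    ```

    The point of the sketch is the architecture, not the numbers: sensors feed a rule set, the rule set picks a state, and the state -- not the raw input -- decides what the phone does. That's the context layer missing from "input in, SERP out."
    
    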

