Web 2.0

I was talking to a friend of mine the other day, and he pointed out that Web 2.0 doesn’t really exist.

I mean, yes, we can point to applications on the web and say these are clearly Web 2.0 applications, but no one (that I know of) has laid down a concrete standard for where the dividing line between 2.0 and 1.0 falls.

For that matter, Web 1.0 didn’t exist either. We had several diverging standards for how HTML was to be rendered – to the point that in some cases web developers were forced to write separate versions of HTML for separate browsers. Web 1.0 exists only in retrospect.

What I’m sort of wondering, at this point, is where Web 3.0 is going to take us. I would share what’s in my crystal ball, but I’d probably be (very) wrong. One of the interesting trends is the perpetual game of pushing CPU load off to the client – or back to the server. It seems possible that Google will release a desktop OS that turns all the computers in the world into one massively parallel machine, where you’ll never know whether your spreadsheet is stored on your local hard drive or somewhere in another country. People who use Skype already accept that they’re going to be relaying traffic through their neighbors’ machines, and vice versa. It may be that Web 3.0 will be the end of the client-server mentality and we’ll all be using one monster peer-to-peer system.

Or it might not. I can’t imagine which direction Web 3.0 will go in, because I don’t know what radically new developments are just over the horizon. Most of the technology we’ve seen in the last ten years has been a logical extension of Moore’s law – but it seems like there are a lot of concepts that are completely unexplored, and there are a lot more people out there to explore them.

On an unrelated note, I still wonder when, and if, we will see hybrid analog-digital computers. I thought this would start with each computer having several registers of random noise, generated using some sort of very high quality white noise generator. I also keep thinking that certain functions that are very expensive in CPU cycles are very easy with op-amps. Of course, it’s possible this is already being done inside GPUs – does anyone know if GPUs have analog computers as part of them?
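
As a rough sketch of the noise-register idea: modern operating systems already get partway there, maintaining an entropy pool fed by hardware noise sources (interrupt timing, and on newer CPUs dedicated instructions) and exposing it to programs. That’s about as close as commodity machines come today. A minimal Python illustration – the byte count here is arbitrary:

    import os
    import secrets

    # Read 8 bytes from the kernel's entropy pool -- roughly one
    # 64-bit "register of random noise", refilled on demand.
    noise_register = os.urandom(8)
    print(noise_register.hex())

    # The secrets module draws on the same OS source for
    # security-sensitive randomness.
    print(secrets.token_hex(8))

On Linux this is the same pool /dev/urandom reads from, and dedicated hardware white-noise generators, where present, feed into it through their drivers.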

I also wonder if the science of analog pattern recognition, or analog recognition assistance, shouldn’t be bumped up a notch or three. We’ve gotten a little too obsessed with digital of late – not saying digital isn’t great, wonderful, the dog’s bark and the cat’s meow, but a hybrid digital/analog computer might be able to achieve things that neither an analog nor a digital system could do alone.

3 Responses to “Web 2.0”

  1. ClintJCL Says:

    Very true. After all, isn’t HDMI just digital data in the form of analog transmissions?

    But to answer your questions, I suggest reading these pages. I have before:

    http://en.wikipedia.org/wiki/Web_2.0

    http://en.wikipedia.org/wiki/Web_3.0

    One such answer from that page:

    “But if I were to guess what Web 3.0 is, I would tell you that it’s a different way of building applications… My prediction would be that Web 3.0 will ultimately be seen as applications which are pieced together. There are a number of characteristics: the applications are relatively small, the data is in the cloud, the applications can run on any device, PC or mobile phone, the applications are very fast and they’re very customizable. Furthermore, the applications are distributed virally: literally by social networks, by email. You won’t go to the store and purchase them… That’s a very different application model than we’ve ever seen in computing.”

  2. sheer_panic Says:

    Good articles… I wouldn’t have thought to look for such data in Wikipedia – but I should have 😉

  3. Cygnostik Says:

    “If you build it, they will cum.”

    oh I can’t believe I just wrote that… oh well.

    *click*
