These are the old pages from the weblog as they were published at Cornell. Visit for up-to-date entries.

December 17, 2004

Feynman & REST

At times I re-read Richard Feynman's lectures. Not just for their marvelous insights, from which I learn time and time again, but also to enjoy Feynman's role as an educator. He is among that group of academics who are born to educate, to explain the complex world in a manner that makes it all seem coherent. One of Feynman's strengths is that he not only gives you insight into the subject matter itself, but also into the minds and reasoning of the scientists who developed the theories.

This week I was reading a bit in his "QED - The Strange Theory of Light and Matter", which is a transcript of a UCLA Mautner Memorial lecture he gave on Quantum Electrodynamics. I think this lecture is one of his best, as it is purely focused on making people understand the bigger picture instead of just techniques and theories.

One of the points he makes early in the book is that Physics and the laws of Physics are just models. Nature doesn't use these laws to regulate itself; Nature just does its thing, and we use physics to derive an "after-the-fact" model such that we can reason about Nature. His point is that the model is far from perfect, because we don't really know what rules Nature, and we never will. But through experiments we try to validate the model, and with some probability we can then say whether or not it seems to match. The better we get at experimenting, at observing, and at reasoning about it, the more we evolve existing theories, or even discard them for ones that appear to match Nature better.

All of this has no influence on Nature itself; it continues to do its thing. It only has influence on the tools we build that rely on the model. Given that the model is only reliable in a probabilistic sense, and might be replaced by a better-fitting model, there is a certain risk associated with taking a model as the absolute truth.

This got me thinking about whether there are analogies with the world of computing. I think that REST falls into this category; it is a model derived "after-the-fact" that appears reasonable to many. It gives us good tools to reason about the phenomena we call "the web" or "the internet".

However, we need to continue to take care that we do not consider The Model to be The Truth. The web-based internet is a massive organic process that is similar to Nature, and we can develop models to observe its phenomena. We can use these models to build our tools on, but we have to keep in mind that we cannot use the model to force the organic process to behave the way we want it to.

Whether we use the REST model, or another model yet to be developed that appears to match it more closely or from a different perspective, "the web" and other large-scale distributed systems will continue to do "their thing", whatever model we put on them. The distributed, decentralized, bottom-up, autonomous nature of the web exhibits complex organic interactions that are not driven by models or laws, just as Nature is not driven by the laws of Physics.

We must learn from Physics that models are imperfect and only an approximation of something that is much larger, and more complex, than we can imagine. Models can be improved or replaced by others, and competing models can exist at the same time.

But in the end they are just models. They help us understand Nature, but they are not Nature itself.

Posted by Werner Vogels at December 17, 2004 01:00 PM

The Map is Not the Territory
Excerpt: Werner makes an excellent point; [W]e need to continue to take care that we do not consider The Model to be The Truth. The web based internet is a...
Weblog: mnot’s Web log
Tracked: February 7, 2005 05:45 PM