If you mean from a client point of view, knowing which sites are involved in loading a page can help trace the boundaries of a web application, and possibly detect cross-site scripting attacks.
The behavioral analysis of HTTP requests can be carried out at both ends: server side and client side. Many metrics can be derived from such analysis, e.g. average end-to-end delay and average throughput. At the client side, for example, one might be interested in the average end-to-end delay to a particular web server and the throughput of the connection. At the server side, one might be interested in which parts of the world a particular web page is accessed or downloaded from. Several tools exist for carrying out this kind of analysis of HTTP requests.
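The client-side measurements mentioned above (average end-to-end delay and throughput) can be sketched with the Python standard library alone; the URL and number of runs below are illustrative assumptions, not part of any particular tool:

```python
import time
import urllib.request

def measure_request(url, runs=3):
    """Fetch `url` several times and report the average end-to-end
    delay (seconds) and the average throughput (bytes per second)."""
    delays, sizes = [], []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            body = resp.read()  # read full body so the timing covers the transfer
        delays.append(time.perf_counter() - start)
        sizes.append(len(body))
    avg_delay = sum(delays) / runs
    avg_throughput = sum(sizes) / sum(delays)
    return avg_delay, avg_throughput

# Example (requires network access):
# delay, tput = measure_request("http://example.com/")
# print(f"avg delay {delay:.3f}s, throughput {tput:.0f} B/s")
```

Note that this measures delay at the application layer, so it includes DNS resolution and TCP connection setup on the first request; dedicated tools separate those components.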
HTTP request analysis can also serve as a benchmark for security-related metrics, such as assessing the CIA triad (confidentiality, integrity, and availability) of your data over network connections. Which metrics matter depends on your application; for web pages, security issues are among the most important things to evaluate on the basis of HTTP behavior.
There are other aspects besides computing power and response time. The logs also record the words used in search services, and you can analyze the individual entries to reconstruct the track a visitor follows through your site and the time spent on each page. I once used a fairly complex Perl tool that analyzed HTTP logs for these purposes; I will try to recall its name.
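As a minimal sketch of the log analysis described above (I don't recall the Perl tool, so this is a from-scratch illustration in Python), one can parse Common Log Format lines, group hits by client host, and approximate time-on-page from the gap between consecutive requests; the sample log lines are invented:

```python
import re
from collections import defaultdict
from datetime import datetime

# Common Log Format: host ident authuser [date] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d+) \S+'
)

def visitor_tracks(lines):
    """Group log lines by client host and return, for each visitor,
    the sequence of (path, seconds_spent) pairs. Time spent on the
    last page cannot be inferred, so it is reported as None."""
    hits = defaultdict(list)
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines
        ts = datetime.strptime(m.group("time"), "%d/%b/%Y:%H:%M:%S %z")
        hits[m.group("host")].append((ts, m.group("path")))
    tracks = {}
    for host, visits in hits.items():
        visits.sort()
        track = [(p1, (t2 - t1).total_seconds())
                 for (t1, p1), (t2, _) in zip(visits, visits[1:])]
        track.append((visits[-1][1], None))
        tracks[host] = track
    return tracks

sample = [
    '1.2.3.4 - - [10/Oct/2020:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 1024',
    '1.2.3.4 - - [10/Oct/2020:13:56:06 +0000] "GET /about.html HTTP/1.1" 200 512',
]
print(visitor_tracks(sample))
# {'1.2.3.4': [('/index.html', 30.0), ('/about.html', None)]}
```

A real tool would additionally sessionize by idle timeout and filter out requests for embedded resources (images, CSS), since only page views are meaningful for time-on-page estimates.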