CaltechTHESIS
  A Caltech Library Service

Hard vs. soft bounds in probabilistic robustness analysis and generalized source coding and optimal web layout design

Citation

Zhu, Xiaoyun (2000) Hard vs. soft bounds in probabilistic robustness analysis and generalized source coding and optimal web layout design. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/1f3r-va82. https://resolver.caltech.edu/CaltechETD:etd-05042006-131410

Abstract

Part I: The relationship between hard vs. soft bounds and probabilistic vs. worst-case problem formulations for robustness analysis has been a source of some apparent confusion in the control community, and this thesis attempts to clarify some of these issues. Essentially, worst-case analysis involves computing the maximum of a function which measures performance over some set of uncertainty, while probabilistic analysis assumes some distribution on the uncertainty and computes the resulting probability measure on performance. Exact computation in each case is intractable in general. In the past, most research focused on computing hard bounds on worst-case performance. This thesis explores the use of both hard and soft bounds in probabilistic robustness analysis and investigates the computational complexity of the problems through extensive numerical experimentation. We focus on the simplest possible problem formulations that we believe reveal the difficulties associated with more general probabilistic analysis. By extending the standard structured singular value (μ) framework to allow for probabilistic descriptions of uncertainty, probabilistic μ is defined, which characterizes the probability distribution of some performance function. The computation of probabilistic μ involves approximating the level surface of the function in the parameter space, which is even more complex than the worst-case μ computation, a well-known NP-hard problem. In particular, providing sufficiently tight bounds in the tail of the distribution is extremely difficult. This thesis proposes three different methods for computing a hard upper bound on probabilistic μ, whose tightness can be tested by comparison with the soft bound provided by Monte Carlo simulations. At the same time, the efficiency of the soft bounds can be significantly improved using information from the hard bound computation. Among the three algorithms proposed, the LC-BNB algorithm is shown by numerical experiments to provide the best average performance on random examples. A particular example is presented at the end to demonstrate the effectiveness of the method.

Part II: The design of robust and reliable networks and network services has become an increasingly challenging task in today's Internet world. To achieve this goal, understanding the characteristics of Internet traffic plays an increasingly critical role. Empirical studies of measured traffic traces have led to wide recognition of self-similarity in network traffic, and a direct link has been established between the self-similar nature of measured aggregate network traffic and the underlying heavy-tailed distributions of Web traffic at the source level. This thesis provides a natural and plausible explanation for the origin of heavy tails in Web traffic by introducing a series of simplified models for optimal Web layout design with varying levels of realism and analytic tractability. The basic approach is to view the minimization of the average file download time as a generalization of standard source coding for data compression, but with the Web layout, rather than the codewords, as the object of design. The results, however, are quite different from standard source coding: under all of the assumptions considered, the resulting distributions follow power laws for a wide variety of user behavior models.
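Two small illustrations, under stated assumptions, may help make these ideas concrete. For Part I, the "soft" bound referred to above is, in its simplest form, a Monte Carlo estimate of the probability that a performance measure exceeds a given level. The sketch below is not the thesis's LC-BNB machinery: the performance function, the uniform uncertainty distribution, and the sample size are illustrative assumptions, and the fact that the confidence half-width becomes large relative to small tail probabilities is exactly why complementary hard bounds are of interest.

    # Minimal sketch (illustrative assumptions, not the thesis's algorithms):
    # a soft Monte Carlo bound on P(performance > gamma).
    import math
    import random

    random.seed(0)

    def performance(delta):
        """Hypothetical scalar performance measure of the uncertain parameters."""
        return sum(d * d for d in delta) ** 0.5

    def soft_bound(n_samples=100_000, dim=4, gamma=1.5):
        """Estimate P(performance > gamma) for delta uniform on [-1, 1]^dim,
        with a crude 95% normal-approximation confidence half-width."""
        hits = 0
        for _ in range(n_samples):
            delta = [random.uniform(-1.0, 1.0) for _ in range(dim)]
            if performance(delta) > gamma:
                hits += 1
        p_hat = hits / n_samples
        half_width = 1.96 * math.sqrt(p_hat * (1.0 - p_hat) / n_samples)
        return p_hat, half_width

    p_hat, hw = soft_bound()
    print(f"P(performance > gamma) is roughly {p_hat:.4f} +/- {hw:.4f}")

For Part II, one way to see how the optimization point of view can generate power laws is a short Lagrangian calculation; the particular resource constraint used here is an illustrative assumption, not necessarily the formulation adopted in the thesis.

    Minimize the average download time \sum_i p_i l_i over file sizes l_i,
    subject to a layout resource constraint \sum_i l_i^{-1} \le C.
    Stationarity of the Lagrangian \sum_i p_i l_i + \lambda \sum_i l_i^{-1}
    gives p_i = \lambda l_i^{-2}, i.e. l_i \propto p_i^{-1/2}. With Zipf-like
    access probabilities p_i \propto i^{-\alpha} (\alpha > 1), the induced
    size distribution has the power-law tail
    P(L > x) \propto x^{-2(\alpha - 1)/\alpha}, in contrast with the
    logarithmic lengths l_i = -\log_2 p_i produced by the Kraft constraint
    \sum_i 2^{-l_i} \le 1 in standard source coding.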
In addition, a simulation model of more complex Web site layouts is proposed, with more detailed hyperlink structure and user behavior. The throughput of a Web site can be maximized by taking advantage of information on user access patterns and rearranging (splitting or merging) files on the Web site accordingly, subject to a constraint on available resources. A heuristic optimization on random graphs is formulated, with user navigation modeled as Markov chains. Simulations on different classes of graphs, as well as on more realistic models with simple geometries for individual Web pages, all produce power law tails in the resulting size distributions of the files transferred from the Web sites. This again supports our conjecture that heavy-tailed distributions arise naturally from the tradeoff between the design objective and limited resources, and it suggests a methodology for aiding the design of high-throughput Web sites.
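The navigation side of such a simulation model can be illustrated with a small random-walk sketch. The graph construction, file sizes, and per-click stopping probability below are illustrative assumptions, and the splitting/merging heuristic itself is not implemented here; only the Markov-chain user model is shown.

    # Minimal sketch (illustrative assumptions, not the thesis's simulation
    # model): user sessions as a random walk on a random Web-site graph.
    import random

    random.seed(0)

    N = 200                                                      # number of pages
    sizes = [random.lognormvariate(8.0, 1.0) for _ in range(N)]  # file sizes in bytes
    links = {i: random.sample(range(N), k=5) for i in range(N)}  # random out-links

    def session(start=0, p_stop=0.2):
        """Follow hyperlinks uniformly at random until the user leaves the site."""
        downloaded = []
        node = start
        while True:
            downloaded.append(sizes[node])
            if random.random() < p_stop:
                return downloaded
            node = random.choice(links[node])

    # Sizes of all files transferred over many independent user sessions.
    transfers = sorted(s for _ in range(10_000) for s in session())
    for q in (0.50, 0.90, 0.99):
        print(f"{q:.2f}-quantile transfer size: {transfers[int(q * len(transfers))]:.0f}")

Plotting the empirical tail P(S > x) of such transfer sizes on log-log axes is the usual way to check for the power-law behavior described in the abstract; in the thesis the file sizes are additionally reshaped by the splitting and merging heuristic rather than fixed in advance.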

Item Type: Thesis (Dissertation (Ph.D.))
Degree Grantor: California Institute of Technology
Division: Engineering and Applied Science
Major Option: Electrical Engineering
Thesis Availability: Public (worldwide access)
Research Advisor(s):
  • Doyle, John Comstock
Thesis Committee:
  • Doyle, John Comstock (chair)
  • Chandy, K. Mani
  • Effros, Michelle
  • Murray, Richard M.
  • Hou, Thomas Y.
Defense Date: 16 May 2000
Record Number: CaltechETD:etd-05042006-131410
Persistent URL: https://resolver.caltech.edu/CaltechETD:etd-05042006-131410
DOI: 10.7907/1f3r-va82
Default Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 1607
Collection: CaltechTHESIS
Deposited By: Imported from ETD-db
Deposited On: 19 May 2006
Last Modified: 19 Apr 2021 22:37

Thesis Files

PDF (Zhu_x_2000.pdf) - Final Version, 7MB. See Usage Policy.