***********************************************************************
EE385B - Wed, Feb 28, 12:15pm, Gates Room 100

An Optimizing On-Chip Cache Delay Model
Grant McFarland
***********************************************************************

In order to realistically model the effect of different on-chip caches
on processor performance, the access time of the cache must be modeled.
Performing realistic circuit simulations of a large number of possible
organizations is difficult and time-consuming. Analytical models of
cache delay provide quick estimates of access time that can be used in
making architectural tradeoffs. However, these models often require the
user to supply a large number of circuit-dependent fitting parameters,
and they ignore the common tradeoff of noise margin for delay. I will
present a new on-chip cache delay model that automatically performs
circuit optimizations, greatly reducing the number of fitting parameters
required, and that models the effect of noise margin on delay.
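As a rough illustration of the kind of quick estimate such analytical
models produce (this is not the speaker's model), the sketch below
computes a toy cache access time as the sum of decoder, wordline,
bitline, and sense-amplifier delays using first-order Elmore RC
approximations. Every constant, the delay breakdown, and the function
name are hypothetical stand-ins, not fitted to any real process.

    # Toy analytical cache access-time estimate (illustrative sketch only;
    # not the model presented in the talk). All technology constants below
    # are hypothetical placeholders.

    def access_time_ns(size_bytes, block_bytes, assoc,
                       r_word_ohm=50.0, c_cell_ff=2.0,
                       r_bit_ohm=40.0, c_bit_ff=3.0,
                       t_decode_ns=0.30, t_sense_ns=0.25):
        """First-order estimate: decoder + wordline + bitline + sense amp.

        size_bytes / block_bytes / assoc define the array organization;
        the remaining arguments are hypothetical per-cell RC constants.
        """
        rows = size_bytes // (block_bytes * assoc)   # one set per row
        cols = 8 * block_bytes * assoc               # data bitline pairs per row

        # Wordline: distributed RC across all columns in a row
        # (Elmore delay ~ 0.5 * R_total * C_total); ohm*fF = 1e-6 ns.
        t_word = 0.5 * (r_word_ohm * cols) * (c_cell_ff * cols) * 1e-6

        # Bitline: distributed RC down all rows in a column.
        t_bit = 0.5 * (r_bit_ohm * rows) * (c_bit_ff * rows) * 1e-6

        return t_decode_ns + t_word + t_bit + t_sense_ns

    if __name__ == "__main__":
        # Compare a few organizations of a hypothetical 8 KB cache.
        for assoc in (1, 2, 4):
            t = access_time_ns(8 * 1024, block_bytes=32, assoc=assoc)
            print(f"8KB, 32B blocks, {assoc}-way: ~{t:.2f} ns (toy estimate)")

Note how even this toy version exposes the abstract's motivation: the
RC constants play the role of circuit-dependent fitting parameters, and
a model that sized drivers and partitioned the array automatically
would need far fewer of them supplied by the user.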