Accession Number:

ADA295490

Title:

Informed Prefetching and Caching

Descriptive Note:

Corporate Author:

CARNEGIE-MELLON UNIV PITTSBURGH PA SCHOOL OF COMPUTER SCIENCE

Report Date:

1995-05-11

Pagination or Media Count:

26

Abstract:

The underutilization of disk parallelism and file cache buffers by traditional file systems induces I/O stall time that degrades the performance of modern microprocessor-based systems. In this paper, we present aggressive mechanisms that tailor file system resource management to the needs of I/O-intensive applications. In particular, we show how to use application-disclosed access patterns (hints) to expose and exploit I/O parallelism and to dynamically allocate file buffers among three competing demands: prefetching hinted blocks, caching hinted blocks for reuse, and caching recently used data for unhinted accesses. Our approach estimates the impact of alternative buffer allocations on application execution time and applies a cost-benefit analysis to allocate buffers where they will have the greatest impact. We implemented informed prefetching and caching in DEC's OSF/1 operating system and measured its performance on a 150 MHz Alpha equipped with 15 disks running a range of applications including text search, 3D scientific visualization, relational database queries, speech recognition, and computational chemistry. Informed prefetching reduces the execution time of the first four of these applications by 20% to 87%. Informed caching reduces the execution time of the fifth application by up to 30%.
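
The cost-benefit allocation described in the abstract can be illustrated with a minimal sketch. The code below only illustrates the idea: it compares the estimated reduction in I/O stall time from devoting one cache buffer to each of the three competing demands (prefetching a hinted block, caching a hinted block for reuse, and caching recently used data for unhinted accesses) and awards the buffer to the highest-value use. All function names, constants, and estimator formulas here are assumptions made for illustration, not the paper's actual estimators or kernel interfaces.

/*
 * Hypothetical sketch of cost-benefit buffer allocation.
 * Not the paper's implementation; names and formulas are illustrative.
 */
#include <stdio.h>

/* Candidate uses for one file cache buffer. */
enum buffer_use { PREFETCH_HINTED, CACHE_HINTED, CACHE_LRU };

/*
 * Assumed estimate of the reduction in application I/O stall time
 * (microseconds) if one buffer is devoted to the given use. In the
 * paper's framework such estimates would come from models of disk
 * latency, hinted access distance, and LRU hit-rate history.
 */
static double estimated_benefit(enum buffer_use use,
                                double disk_latency_us,
                                int accesses_until_reuse)
{
    switch (use) {
    case PREFETCH_HINTED:
        /* Prefetching a hinted block can hide up to one disk access. */
        return disk_latency_us;
    case CACHE_HINTED:
    case CACHE_LRU:
        /* Caching saves a disk access only if the block is reused;
         * discount by how far away the expected reuse is. */
        return disk_latency_us / (double)(accesses_until_reuse + 1);
    }
    return 0.0;
}

int main(void)
{
    const double disk_latency_us = 15000.0; /* assumed average disk access */

    /* Compare giving one free buffer to each competing demand. */
    struct { enum buffer_use use; int reuse_distance; } candidates[] = {
        { PREFETCH_HINTED, 0 },   /* next hinted block, not yet resident   */
        { CACHE_HINTED,    40 },  /* hinted block reused 40 accesses later */
        { CACHE_LRU,       100 }, /* assumed LRU reuse distance            */
    };

    double best = -1.0;
    enum buffer_use best_use = CACHE_LRU;
    for (int i = 0; i < 3; i++) {
        double b = estimated_benefit(candidates[i].use, disk_latency_us,
                                     candidates[i].reuse_distance);
        if (b > best) { best = b; best_use = candidates[i].use; }
    }

    printf("allocate buffer to demand %d (estimated benefit %.0f us)\n",
           (int)best_use, best);
    return 0;
}

With these assumed numbers the allocator prefetches the hinted block, since hiding a full disk access outweighs the discounted benefit of either caching choice; with a much shorter hinted reuse distance the balance would tip toward caching, which is the trade-off the paper's cost-benefit analysis makes dynamically.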

Subject Categories:

  • Computer Programming and Software
  • Computer Hardware

Distribution Statement:

APPROVED FOR PUBLIC RELEASE