Block-oriented compression techniques for large statistical databases

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, Vol. 9, No. 2, pp. 314-328
Main Authors: Wee-Keong Ng; C. V. Ravishankar
Format: Journal Article
Language: English
Published: New York, NY: IEEE; IEEE Computer Society, 01.03.1997

Summary: Disk I/O has long been a performance bottleneck for very large databases. Database compression can be used to reduce disk I/O bandwidth requirements for large data transfers. The authors explore the compression of large statistical databases and propose techniques for organizing the compressed data so that standard database operations such as retrievals, inserts, deletes, and modifications are supported. They examine the applicability and performance of three methods. Two of these are adaptations of existing methods; the third, called tuple differential coding (TDC), is a new method that allows conventional access mechanisms to be used with the compressed data to provide efficient access. They demonstrate how the performance of queries that involve large data transfers can be improved with these database compression techniques.
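
The summary names tuple differential coding (TDC) but does not describe its mechanics. As a rough illustration of the general idea behind differential coding of tuples (mapping each tuple to an integer ordinal over its attribute domains and storing small differences between consecutive ordinals), here is a minimal sketch in Python. The domain sizes, function names, and block layout below are illustrative assumptions, not the specific scheme from the paper.

    # Illustrative sketch only: delta-coding sorted tuples after mapping each
    # tuple to a single integer via a mixed-radix encoding over its attribute
    # domains. Domain sizes and names are assumptions, not taken from the paper.
    from itertools import accumulate
    from typing import List, Tuple

    DOMAIN_SIZES = [100, 50, 365]  # assumed cardinalities of three attributes

    def tuple_to_ordinal(t: Tuple[int, ...], domains: List[int]) -> int:
        """Map a tuple of attribute codes to one integer (mixed-radix encoding)."""
        n = 0
        for value, size in zip(t, domains):
            n = n * size + value
        return n

    def ordinal_to_tuple(n: int, domains: List[int]) -> Tuple[int, ...]:
        """Invert tuple_to_ordinal."""
        values = []
        for size in reversed(domains):
            values.append(n % size)
            n //= size
        return tuple(reversed(values))

    def encode_block(tuples: List[Tuple[int, ...]], domains: List[int]) -> List[int]:
        """Store the first ordinal plus the differences between neighbours."""
        ordinals = sorted(tuple_to_ordinal(t, domains) for t in tuples)
        return [ordinals[0]] + [b - a for a, b in zip(ordinals, ordinals[1:])]

    def decode_block(deltas: List[int], domains: List[int]) -> List[Tuple[int, ...]]:
        """Rebuild the original tuples from the delta stream."""
        return [ordinal_to_tuple(n, domains) for n in accumulate(deltas)]

    if __name__ == "__main__":
        rows = [(3, 7, 120), (3, 7, 121), (3, 8, 0), (4, 0, 45)]
        deltas = encode_block(rows, DOMAIN_SIZES)
        assert decode_block(deltas, DOMAIN_SIZES) == sorted(rows)
        print(deltas)  # after the first value, only small integers remain

Because the deltas are small integers, they can be stored compactly per block while the block itself remains directly decodable, which is in the spirit of the abstract's claim that conventional access mechanisms can still operate on the compressed data.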
ISSN: 1041-4347
EISSN: 1558-2191
DOI: 10.1109/69.591455