The Art of Computer Programming / Edition 2

Hardcover (Print)
Used and New from Other Sellers
from $9.82
Usually ships in 1-2 business days
(Save 87%)
Other sellers (Hardcover)
  • All (23) from $9.82   
  • New (13) from $34.83   
  • Used (10) from $9.82   

Overview

The bible of all fundamental algorithms and the work that taught many of today's software developers most of what they know about computer programming.

Byte, September 1995

I can't begin to tell you how many pleasurable hours of study and recreation they have afforded me! I have pored over them in cars, restaurants, at work, at home... and even at a Little League game when my son wasn't in the line-up.

—Charles Long

If you think you're a really good programmer... read [Knuth's] Art of Computer Programming... You should definitely send me a resume if you can read the whole thing.

—Bill Gates

It's always a pleasure when a problem is hard enough that you have to get the Knuths off the shelf. I find that merely opening one has a very useful terrorizing effect on computers.

—Jonathan Laventhol

The first revision of this third volume is the most comprehensive survey of classical computer techniques for sorting and searching. It extends the treatment of data structures in Volume 1 to consider both large and small databases and internal and external memories. The book contains a selection of carefully checked computer methods, with a quantitative analysis of their efficiency. Outstanding features of the second edition include a revised section on optimum sorting and new discussions of the theory of permutations and of universal hashing.


Editorial Reviews

Booknews
Knuth is eminent for his pioneering work on algorithms and programming techniques, for his invention of TEX and METAFONT systems for computer typesetting, and for his prolific and influential writing. This volume, revised from its 1980 version and the second of what will be a seven-volume work, offers a complete introduction to the field of seminumerical algorithms, with separate chapters on random numbers and arithmetic. Summarizing the major paradigms and basic theory of such algorithms, it provides a comprehensive interface between computer programming and numerical analysis. Noteworthy in this edition is Knuth's new treatment of random number generators, and his discussion of calculations with formal power series. Annotation c. by Book News, Inc., Portland, Or.
Booknews
This last volume in a set of three surveys classical computer techniques for sorting and searching, considering both large and small databases and internal and external memories. Recently updated to reflect new developments in the field, the volume also contains several hundred new exercises. This new edition of Volume Three is described as matching Volume One (3rd edition) covering fundamental algorithms and Volume Two (3rd edition) treating seminumerical algorithms. Annotation c. by Book News, Inc., Portland, Or.

Product Details

  • ISBN-13: 9780201896855
  • Publisher: Addison-Wesley
  • Publication date: 4/17/1998
  • Edition description: REV
  • Edition number: 2
  • Pages: 780
  • Sales rank: 962748
  • Product dimensions: 6.84 (w) x 9.59 (h) x 1.73 (d)

Meet the Author

Donald E. Knuth is known throughout the world for his pioneering work on algorithms and programming techniques, for his invention of the TeX and METAFONT systems for computer typesetting, and for his prolific and influential writing. Professor Emeritus of The Art of Computer Programming at Stanford University, he currently devotes full time to the completion of these fascicles and the seven volumes to which they belong.


Read an Excerpt

Chapter 5: Sorting


...Overcoming latency time. Let us consider first the problem of minimizing the delays caused by the fact that the disks aren't always positioned properly when we want to start an I/O command. We can't make the disk spin faster, but we can still apply some tricks that reduce or even eliminate all of the latency time. The addition of more access arms would obviously help, but that would be an expensive hardware modification. Here are some software ideas:

  • If we read or write several tracks of a cylinder at a time, we avoid the latency time (and the seek time) on all tracks but the first. In general it is often possible to synchronize the computing time with the disk movement in such a way that a sequence of input/output instructions can be carried out without latency delays.
  • Consider the problem of reading half a track of data (Fig. 90): If the read command begins when the heads are at axis A, there is no latency delay, and the total time for reading is just the transmission time, 1/2 × 25 ms. If the command begins with the heads at B, we need 1/4 of a revolution for latency and 1/2 for transmission, totaling 3/4 × 25 ms. The most interesting case occurs when the heads are initially at C: With proper hardware and software we need not waste 3/4 of a revolution for latency delay. Reading can begin immediately, into the second half of the input buffer; then after a 1/2 × 25 ms pause, reading can resume into the first half of the buffer, so that the instruction is completed when axis C is reached again. In a similar manner, we can ensure that the total latency plus transmission time will never exceed the time for one revolution, regardless of the initial position of the disk. The average amount of latency delay is reduced by this scheme from half a revolution to (1/2)(1 − x²) of a revolution, if we are reading or writing a given fraction x of a track, for 0 < x ≤ 1. When an entire track is being read or written (x = 1), this technique eliminates all the latency time. (A short derivation of this average appears after the discussion of drums below.)
Drums: The no-seek case. Some external memory units, traditionally called drum memories, eliminate the seek time by having one read/write head for every track. If the technique of Fig. 90 is employed on such devices, both seek time and latency time reduce to zero, provided that we always read or write a track at a time; this is the ideal situation in which transmission time is the only limiting factor.
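
The (1/2)(1 − x²) average quoted above can be checked with a short calculation. The sketch below is not part of the excerpt; it simply assumes that the heads begin at a position uniformly distributed around the track and that the data occupies a contiguous fraction x of one revolution.

    % Latency with the split-buffer trick; angles measured as fractions of a revolution.
    % The data occupies [0, x]; the heads start at a uniformly distributed angle \theta.
    %
    % If \theta \in (x, 1]: reading cannot begin until the heads return to 0,
    %                       so the latency is 1 - \theta.
    % If \theta \in [0, x]: reading begins at once (into the later part of the buffer),
    %                       and the only idle time is the empty gap of length 1 - x.
    \[
      \text{average latency}
        = \int_{0}^{x} (1 - x)\,d\theta + \int_{x}^{1} (1 - \theta)\,d\theta
        = x(1 - x) + \tfrac{1}{2}(1 - x)^{2}
        = \tfrac{1}{2}\bigl(1 - x^{2}\bigr),
    \]
    % which tends to half a revolution as x -> 0 and vanishes at x = 1,
    % in agreement with the full-track case described in the excerpt.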

Let us consider again the example application of Section 5.4.6, sorting 100,000 records of 100 characters each, with a 100,000-character internal memory. The total amount of data to be sorted fills half of a MIXTEC disk. It is usually impossible to read and write simultaneously on a single disk unit; we shall assume that two disks are available, so that reading and writing can overlap each other. For the moment we shall assume, in fact, that the disks are actually drums, containing 4000 tracks of 5000 characters each, with no seek time required.
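
As a quick check on these figures (a back-of-the-envelope calculation using only the numbers stated in the excerpt):

    % Volume of data versus capacity of one assumed drum:
    \[
      100{,}000 \times 100 = 10{,}000{,}000 \ \text{characters to be sorted}, \qquad
      4000 \times 5000 = 20{,}000{,}000 \ \text{characters per drum},
    \]
    % so the file occupies half of one such drum, and each 100,000-character
    % internal-memory load corresponds to 20 full tracks of 5000 characters.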

What sorting algorithm should be used? The method of merging is a fairly natural choice; other methods of internal sorting do not lend themselves so well to a disk implementation, except for the radix techniques of Section 5.2.5. The considerations of Section 5.4.7 show that radix sorting is usually inferior to merging for general-purpose applications, because the duality theorem of that section applies to disks as well as to tapes. Radix sorting does have a strong advantage, however, when the keys are uniformly distributed and many disks can be used in parallel, because an initial distribution by the most significant digits of the keys will divide the work up into independent subproblems that need no further communication. (See, for example, R. C. Agarwal, SIGMOD Record 25, 2 (June 1996), 240-246.)...
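
To make the merging idea concrete, here is a minimal Python sketch (not Knuth's MIX program) of k-way merging of sorted runs with a heap, the core operation of the external merge sorts discussed above; the tiny in-memory runs stand in for memory-sized blocks that would really live on disk or drum.

    import heapq

    def kway_merge(runs):
        """Merge several already-sorted runs into one sorted list.

        Each run stands in for a sorted block formed by internal sorting
        and written out to external storage.
        """
        heap = []                        # entries: (key, run index, position within run)
        for i, run in enumerate(runs):
            if run:
                heapq.heappush(heap, (run[0], i, 0))
        merged = []
        while heap:
            key, i, pos = heapq.heappop(heap)
            merged.append(key)
            nxt = pos + 1
            if nxt < len(runs[i]):       # refill the heap from the run the key came from
                heapq.heappush(heap, (runs[i][nxt], i, nxt))
        return merged

    # Three sorted runs, such as internal sorting of memory-sized chunks might produce.
    print(kway_merge([[3, 9, 27], [1, 4, 8, 16], [2, 5, 7]]))
    # -> [1, 2, 3, 4, 5, 7, 8, 9, 16, 27]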


Table of Contents

5. Sorting.

Combinatorial Properties of Permutations.

Inversions.

Permutations of a Multiset.

Runs.

Tableaux and Involutions.

Internal sorting.

Sorting by Insertion.

Sorting by Exchanging.

Sorting by Selection.

Sorting by Merging.

Sorting by Distribution.

Optimum Sorting.

Minimum-Comparison Sorting.

Minimum-Comparison Merging.

Minimum-Comparison Selection.

Networks for Sorting.

External Sorting.

Multiway Merging and Replacement Selection.

The Polyphase Merge.

The Cascade Merge.

Reading Tape Backwards.

The Oscillating Sort.

Practical Considerations for Tape Merging.

External Radix Sorting.

Two-Tape Sorting.

Disks and Drums.

Summary, History, and Bibliography.

6. Searching.

Sequential Searching.

Searching by Comparison of Keys.

Searching an Ordered Table.

Binary Tree Searching.

Balanced Trees.

Multiway Trees.

Digital Searching.

Hashing.

Retrieval on Secondary Keys.

Answers to Exercises.

Appendix A: Tables of Numerical Quantities.

Fundamental Constants (decimal).

Fundamental Constants (octal).

Harmonic Numbers, Bernoulli Numbers, Fibonacci Numbers.

Appendix B: Index to Notations.

Index and Glossary.


Preface

Cookery is become an art,
a noble science;
cooks are gentlemen.

TITUS LIVIUS, Ab Urbe Condita XXXIX.vi
(Robert Burton, Anatomy of Melancholy 1.2.2.2)

This book forms a natural sequel to the material on information structures in Chapter 2 of Volume 1, because it adds the concept of linearly ordered data to the other basic structural ideas.

The title "Sorting and Searching" may sound as if this book is only for those systems programmers who are concerned with the preparation of general-purpose sorting routines or applications to information retrieval. But in fact the area of sorting and searching provides an ideal framework for discussing a wide variety of important general issues:

  • How are good algorithms discovered?
  • How can given algorithms and programs be improved?
  • How can the efficiency of algorithms be analyzed mathematically?
  • How can a person choose rationally between different algorithms for the same task?
  • In what senses can algorithms be proved "best possible"?
  • How does the theory of computing interact with practical considerations?
  • How can external memories like tapes, drums, or disks be used efficiently with large databases?

Indeed, I believe that virtually every important aspect of programming arises somewhere in the context of sorting or searching!

This volume comprises Chapters 5 and 6 of the complete series. Chapter 5 is concerned with sorting into order; this is a large subject that has been divided chiefly into two parts, internal sorting and external sorting. There also are supplementary sections, which develop auxiliary theories about permutations (Section 5.1) and about optimum techniques for sorting (Section 5.3). Chapter 6 deals with the problem of searching for specified items in tables or files; this is subdivided into methods that search sequentially, or by comparison of keys, or by digital properties, or by hashing, and then the more difficult problem of secondary key retrieval is considered. There is a surprising amount of interplay between both chapters, with strong analogies tying the topics together. Two important varieties of information structures are also discussed, in addition to those considered in Chapter 2, namely priority queues (Section 5.2.3) and linear lists represented as balanced trees (Section 6.2.3).

Like Volumes 1 and 2, this book includes a lot of material that does not appear in other publications. Many people have kindly written to me about their ideas, or spoken to me about them, and I hope that I have not distorted the material too badly when I have presented it in my own words.

I have not had time to search the patent literature systematically; indeed, I decry the current tendency to seek patents on algorithms (see Section 5.4.5). If somebody sends me a copy of a relevant patent not presently cited in this book, I will dutifully refer to it in future editions. However, I want to encourage people to continue the centuries-old mathematical tradition of putting newly discovered algorithms into the public domain. There are better ways to earn a living than to prevent other people from making use of one's contributions to computer science.

Before I retired from teaching, I used this book as a text for a student's second course in data structures, at the junior-to-graduate level, omitting most of the mathematical material. I also used the mathematical portions of this book as the basis for graduate-level courses in the analysis of algorithms, emphasizing especially Sections 5.1, 5.2.2, 6.3, and 6.4. A graduate-level course on concrete computational complexity could also be based on Sections 5.3 and 5.4.4, together with Sections 4.3.3, 4.6.3, and 4.6.4 of Volume 2.

For the most part this book is self-contained, except for occasional discussions relating to the MIX computer explained in Volume 1. Appendix B contains a summary of the mathematical notations used, some of which are a little different from those found in traditional mathematics books.

Preface to the Second Edition

This new edition matches the third editions of Volumes 1 and 2, in which I have been able to celebrate the completion of TeX and METAFONT by applying those systems to the publications they were designed for.

The conversion to electronic format has given me the opportunity to go over every word of the text and every punctuation mark. I've tried to retain the youthful exuberance of my original sentences while perhaps adding some more mature judgment. Dozens of new exercises have been added; dozens of old exercises have been given new and improved answers. Changes appear everywhere, but most significantly in Sections 5.1.4 (about permutations and tableaux), 5.3 (about optimum sorting), 5.4.9 (about disk sorting), 6.2.2 (about entropy), 6.4 (about universal hashing), and 6.5 (about multidimensional trees and tries).

The Art of Computer Programming is, however, still a work in progress. Research on sorting and searching continues to grow at a phenomenal rate. Therefore some parts of this book are headed by an "under construction" icon, to apologize for the fact that the material is not up-to-date. For example, if I were teaching an undergraduate class on data structures today, I would surely discuss randomized structures such as treaps at some length; but at present, I am only able to cite the principal papers on the subject, and to announce plans for a future Section 6.2.5. My files are bursting with important material that I plan to include in the final, glorious, third edition of Volume 3, perhaps 17 years from now. But I must finish Volumes 4 and 5 first, and I do not want to delay their publication any more than absolutely necessary.

I am enormously grateful to the many hundreds of people who have helped me to gather and refine this material during the past 35 years. Most of the hard work of preparing the new edition was accomplished by Phyllis Winkler (who put the text of the first edition into TeX form), by Silvio Levy (who edited it extensively and helped to prepare several dozen illustrations), and by Jeffrey Oldham (who converted more than 250 of the original illustrations to METAPOST format). The production staff at Addison-Wesley has also been extremely helpful, as usual.

D. E. K.
Stanford, California
February 1998
There are certain common Privileges of a Writer,
the Benefit whereof, I hope, there will be no Reason to doubt;
Particularly, that where I am not understood, it shall be concluded,
that something very useful and profound is coucht underneath.

JONATHAN SWIFT, Tale of a Tub, Preface (1704)

Customer Reviews

Average Rating: 5 (1 rating)

Rating Distribution

  • 5 Star: 1
  • 4 Star: 0
  • 3 Star: 0
  • 2 Star: 0
  • 1 Star: 0
