  • Print publication year: 2009
  • Online publication date: June 2012

12 - Decision trees

from PART TWO - LOWER BOUNDS FOR CONCRETE COMPUTATIONAL MODELS

Summary

Let no one say that taking action is hard … the hardest thing in the world is making a decision.

– Franz Grillparzer (1791–1872)

Currently, resolving many of the basic questions on the power of Turing machines seems out of reach. Thus it makes sense to study simpler, more limited computing devices as a way to get some insight into the elusive notion of efficient computation. Moreover, such limited computational models often arise naturally in a variety of applications, even outside computer science, and hence studying their properties is inherently worthwhile.

Perhaps the simplest such model is that of decision trees. Here the “complexity” measure for a Boolean function f is the number of bits we need to examine in an input x in order to compute f(x). This chapter surveys the basic results and open questions regarding decision trees. Section 12.1 defines decision trees and decision tree complexity. We also define nondeterministic and probabilistic versions of decision trees just as we did for Turing machines; these are described in Sections 12.2 and 12.3, respectively. Section 12.4 contains some techniques for proving lower bounds on decision trees. We also present Yao's Min-Max Lemma (see Note 12.8), which is useful for proving lower bounds for randomized decision tree complexity and, more generally, lower bounds for randomized complexity in other computational models.
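To make the complexity measure concrete, here is a minimal sketch (not from the chapter) of a decision tree evaluated on an input while counting how many bits it examines. The tree encoding and the `evaluate` helper are illustrative assumptions; the example tree computes the 3-bit majority function, whose worst-case query count is 3.

```python
# Illustrative sketch: a decision tree as nested tuples, with a counter
# for the number of input bits examined (the "cost" on that input).
# The encoding and names here are assumptions, not the book's notation.

from typing import Tuple, Union

# A tree is either a leaf (0 or 1) or a triple (i, t0, t1):
# query bit i of the input, then recurse into t0 if x[i] == '0',
# or into t1 if x[i] == '1'.
Tree = Union[int, Tuple[int, "Tree", "Tree"]]

def evaluate(tree: Tree, x: str) -> Tuple[int, int]:
    """Return (f(x), number of bits of x examined)."""
    queries = 0
    while not isinstance(tree, int):
        i, t0, t1 = tree
        queries += 1
        tree = t1 if x[i] == "1" else t0
    return tree, queries

# Example: MAJ on 3 bits. Query x[0], then x[1]; the third bit is
# needed only when the first two disagree.
maj3: Tree = (0,
              (1, 0, (2, 0, 1)),   # x[0] = 0
              (1, (2, 0, 1), 1))   # x[0] = 1

print(evaluate(maj3, "110"))  # → (1, 2): two agreeing bits suffice
print(evaluate(maj3, "010"))  # → (0, 3): disagreement forces a 3rd query
```

The decision tree complexity of f is the depth of the best tree, i.e., the worst-case query count over all inputs; for this tree that maximum is 3, matching the fact that majority on 3 bits cannot be computed without, in the worst case, reading every bit.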