Information-Based Decisions: Uncertainty, Risk, and Valuation
This lesson presents different approaches for determining the value of information, including a way to assess the value of "perfect" information. It introduces the concepts of uncertainty and risk and the influence they have on the value of information, shows how to use decision trees to calculate the expected value of information to aid decision making, and introduces Bayesian analysis of risk.
Objectives
Upon completion of this lesson, you will be able to
• describe the value of information
• distinguish between uncertainty and risk
• build decision trees to quantify risk
• use probability to describe risk
• calculate probabilities using Bayesian analysis
• determine the value of "perfect information"
• construct decision models in spreadsheet programs
Required Lessons

Lesson 1

Lesson 2

Lesson 3

Lesson 4

Lesson 5
Probability & Probabilistic Information

This lesson introduces the very basic concepts in probability for the purpose of building decision trees.
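The central probabilistic concept for decision trees is expected value: the probability-weighted average of an outcome. A minimal sketch, using hypothetical payoffs and probabilities:

```python
# Expected value of a chance node: the probability-weighted average of its
# outcomes. Hypothetical example: a venture pays $120 with probability 0.3,
# $40 with probability 0.5, and $0 otherwise.
outcomes = [(0.3, 120.0), (0.5, 40.0), (0.2, 0.0)]

# Probabilities over all branches of a chance node must sum to 1.
assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9

expected_value = sum(p * payoff for p, payoff in outcomes)
print(expected_value)  # 56.0
```

Every chance node in a decision tree is evaluated this way, so this one-line sum is the basic building block for everything that follows.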

Rational Decision Making Using Decision Trees

Decision trees are a common method for visualizing a decision path that incorporates chance. In this context, decision trees are employed for rational decision making rather than for classifying an object; the latter is a common application of decision trees in machine learning, in which an object's type is determined through a series of decisions based on the object's properties. The decision-making process presumes a rational decision maker who bases their decisions on some aspect of utility (economic utility, money, time, happiness, etc.) and that this utility can be quantified.
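Rolling back a decision tree reduces to two operations: averaging over chance nodes and maximizing over decision nodes. A minimal sketch with hypothetical payoffs and probabilities (the product-launch scenario below is invented for illustration):

```python
# Rolling back a small decision tree by expected monetary value (EMV).
# Hypothetical scenario: launch a product (market is strong with p = 0.6,
# paying 100; weak with p = 0.4, losing 30) or do nothing (payoff 0).

def chance(branches):
    """Value of a chance node: probability-weighted average of its branches."""
    return sum(p * v for p, v in branches)

def decision(alternatives):
    """A rational decision maker picks the alternative with the highest value."""
    return max(alternatives, key=lambda alt: alt[1])

launch_emv = chance([(0.6, 100.0), (0.4, -30.0)])  # 0.6*100 + 0.4*(-30) = 48.0
best = decision([("launch", launch_emv), ("do nothing", 0.0)])
print(best)  # ('launch', 48.0)
```

Larger trees are evaluated the same way, applying `chance` and `decision` recursively from the leaves back to the root.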

Bayesian Inference

Bayesian inference is a decision-making method in which one seeks to determine the likely cause of some outcome. It is a hypothesis-testing approach in which there are several potential causes for an observed outcome and a decision maker wants to know the probability of each underlying cause so they can choose the most likely one, i.e., the cause with the highest probability. Bayesian inference is an alternative to the Fisherian approach to hypothesis testing, which is based on rejecting the null hypothesis at some level of probability. Hypothesis tests such as the t-test or the z-test are examples of Fisherian hypothesis tests. Bayesian hypothesis testing is often used in medicine for disease screening and in machine learning for spam filtering.
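The disease-screening case mentioned above can be sketched directly with Bayes' theorem. The numbers below (prevalence, sensitivity, false-positive rate) are hypothetical:

```python
# Bayes' theorem applied to disease screening, with hypothetical numbers:
# prevalence 1%, test sensitivity 95%, false-positive rate 5%.
# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
prior = 0.01           # P(disease)
sensitivity = 0.95     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Total probability of a positive test, over both underlying causes:
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Posterior probability that the disease is the cause of the positive result:
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # 0.161
```

Even with an accurate test, a positive result here implies only about a 16% chance of disease, because the condition is rare; this is exactly the kind of counterintuitive result Bayesian analysis makes explicit.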

Bayesian Analysis in Practice: Naive Bayes Decision Making

This example shows an application of Bayesian analysis in a machine learning context using empirically derived probabilities to make a decision.
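A naive Bayes decision of this kind can be sketched in a few lines. The word counts below are invented training data, and Laplace (add-one) smoothing is one common way to handle words unseen in a class:

```python
# A tiny naive Bayes spam decision using empirically derived probabilities.
# The training counts are hypothetical; log probabilities avoid underflow.
from math import log

spam_counts = {"free": 20, "winner": 15, "meeting": 1}   # word counts in spam
ham_counts = {"free": 2, "winner": 1, "meeting": 30}     # word counts in ham
n_spam, n_ham, n_total = 40, 60, 100                     # training messages
vocab = set(spam_counts) | set(ham_counts)

def log_posterior(words, counts, n_class):
    """Log of class prior times the naive (independent) word likelihoods,
    with Laplace smoothing."""
    total = sum(counts.values())
    score = log(n_class / n_total)
    for w in words:
        score += log((counts.get(w, 0) + 1) / (total + len(vocab)))
    return score

message = ["free", "winner"]
spam_score = log_posterior(message, spam_counts, n_spam)
ham_score = log_posterior(message, ham_counts, n_ham)
print("spam" if spam_score > ham_score else "ham")  # spam
```

The "naive" assumption is that words occur independently given the class; it is rarely true, but the decision (compare the two scores, pick the larger) often remains good anyway.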

Making Decisions: Perfect Information

Often a decision maker has to decide whether to seek more information in order to choose a path in a decision process. However, seeking additional information often requires effort, which takes time and may have an economic cost, i.e., you have to pay for it. The question (or decision) then becomes: do I, as the decision maker, pay for additional information, and how much should I pay? The most common approach is to calculate what the decision payoff would be if the new information were "perfect" and allowed the decision maker to make the best possible decision (given all other uncertainties). The difference between the payoff without and with the information is the limiting case and thus the value of that "perfect information". Since no information is perfect, the decision maker should never pay more for "imperfect information" than the value of "perfect information".
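The expected value of perfect information (EVPI) described above can be computed from a payoff table. A minimal sketch with hypothetical states, probabilities, and payoffs:

```python
# Expected value of perfect information (EVPI) from a hypothetical payoff
# table. States of nature: strong market (p = 0.6) or weak market (p = 0.4).
payoffs = {
    "launch":     {"strong": 100.0, "weak": -30.0},
    "do nothing": {"strong": 0.0,   "weak": 0.0},
}
probs = {"strong": 0.6, "weak": 0.4}

# Without extra information: pick the alternative with the best expected payoff.
emv_without = max(
    sum(probs[s] * v for s, v in row.items()) for row in payoffs.values()
)  # launch: 0.6*100 + 0.4*(-30) = 48.0

# With perfect information: learn the state first, then pick the best
# alternative for that state, and average over states.
emv_with = sum(
    probs[s] * max(row[s] for row in payoffs.values()) for s in probs
)  # 0.6*100 + 0.4*0 = 60.0

evpi = emv_with - emv_without
print(evpi)  # 12.0
```

Here no decision maker should pay more than 12 for any market study, however good, since even a perfect forecast is worth only that much.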

Suggested Readings
Data Sets