Business and Personal Finance Dictionary
- MARTINGALE MEASURE
Any probability measure (q.v.) under which a stochastic variable is a martingale (q.v.), i.e., a variable whose expected change is zero. Example: consider the probability measure that assigns probability 1/2 to heads and 1/2 to tails, with successive coin tosses independent. Let X(n) be the random variable that starts at zero, increases by one with each "heads" outcome, and decreases by one with each "tails" outcome. Then E[X(n) - X(n-1) | X(n-1)] = (1/2)(1) + (1/2)(-1) = 0, so X(n) is a martingale under this measure.
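The coin-toss example can be checked numerically. Below is a minimal Python sketch (not part of the original entry; the function name `expected_increment` is chosen here for illustration) that simulates the fair-coin walk and estimates the average one-step change, which should be close to zero under the fair-coin measure.

```python
import random

def expected_increment(num_tosses=100_000, seed=0):
    """Estimate E[X(n) - X(n-1)] for the fair coin-toss walk X(n)."""
    rng = random.Random(seed)
    increments = []
    for _ in range(num_tosses):
        # +1 on heads, -1 on tails, each with probability 1/2
        step = 1 if rng.random() < 0.5 else -1
        increments.append(step)
    return sum(increments) / len(increments)

if __name__ == "__main__":
    # The sample average of the increments is close to zero, consistent
    # with X(n) being a martingale under the fair-coin measure.
    print(f"average increment: {expected_increment():+.4f}")
```

Changing the head probability away from 1/2 would give a nonzero average increment, i.e., X(n) would no longer be a martingale under that altered measure.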