202405201744
Status: #idea
Tags: Probability
Probability Measure (According to Kolmogorov)
A Probability measure is a special type of measure which maps elements from a σ-algebra $\mathcal{F}$ on a sample space $\Omega$ to the interval $[0, 1]$, with the normalization $P(\Omega) = 1$.
It has the same properties as you would expect for a standard measure:
- $P(A) \geq 0$ for every $A \in \mathcal{F}$
- If $A_1, A_2, A_3, \dots \in \mathcal{F}$ are disjoint then
$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \sum_{n=1}^{\infty} P(A_n)$$
This is the standard definition of probability. Note that there is a different but congruent definition of probability which can be built not from Measure Theory axioms, but from the concept of Expected Value directly. This is Probability Measure (Based on Expected Value).
Since they are congruent and have the same properties, one can go from one to the other with impunity, but the distinction is important to note.
Properties of Probability Measures
Formula for Complement
But from that we can derive everything we know about probability.
For example, since we know that $P(\Omega) = 1$, and that $A$ and $A^c$ are disjoint with $A \cup A^c = \Omega$, additivity gives $P(A) + P(A^c) = 1$, and therefore
$$P(A^c) = 1 - P(A)$$
You can also prove easily that $P(\emptyset) = 0$, since $\emptyset = \Omega^c$.
Using Measure Theory as the foundation of Probability Theory makes all the other derivations similarly beautiful.
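As a toy illustration (my own sketch, not part of the original note; the coin and its weights are hypothetical), here is a tiny finite probability measure with the complement formula checked on it:

```python
from fractions import Fraction

# A hypothetical finite probability measure: weights over a sample space,
# normalized so that P(omega) = 1.
weights = {"heads": Fraction(1, 3), "tails": Fraction(2, 3)}  # a biased coin

def P(event):
    # measure of an event = sum of the weights of its outcomes
    return sum(weights[outcome] for outcome in event)

omega = set(weights)
A = {"heads"}
complement = omega - A
print(P(A) + P(complement))  # prints 1, i.e. P(A^c) = 1 - P(A)
```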
Formula for Unions: The Inclusion-Exclusion Principle
Let's first start by proving the following:
$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$
This is my derivation of it. Split both sets into disjoint pieces:
$$A \cup B = A \cup (B \setminus A), \qquad B = (A \cap B) \cup (B \setminus A)$$
Both unions on the right are disjoint, so additivity gives
$$P(A \cup B) = P(A) + P(B \setminus A), \qquad P(B \setminus A) = P(B) - P(A \cap B)$$
Substituting the second equation into the first:
$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$
Boom!
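A quick sanity check of the two-set identity $P(A \cup B) = P(A) + P(B) - P(A \cap B)$ on a finite uniform space (my own sketch, not a proof; `prob` is a helper defined here):

```python
from fractions import Fraction

def prob(event, omega):
    # uniform probability measure on a finite sample space:
    # every outcome has probability 1/|omega|
    return Fraction(len(event & omega), len(omega))

omega = set(range(1, 13))              # outcomes 1..12, e.g. a 12-sided die
A = {n for n in omega if n % 2 == 0}   # even outcomes
B = {n for n in omega if n % 3 == 0}   # multiples of 3

lhs = prob(A | B, omega)
rhs = prob(A, omega) + prob(B, omega) - prob(A & B, omega)
print(lhs == rhs)  # True: the identity holds exactly
```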
This gives us the rigorous proof for why this equality holds. Now, what is the general form of the formula? What if instead of two sets we take a union of $n$ sets?
What is the formula for $P\left(\bigcup_{i=1}^{n} A_i\right)$?
$$P\left(\bigcup_{i=1}^{n} A_i\right) = \sum_{i=1}^{n} P(A_i) - \sum_{1 \leq i < j \leq n} P(A_i \cap A_j) + \sum_{1 \leq i < j < k \leq n} P(A_i \cap A_j \cap A_k) - \cdots + (-1)^{n+1} P(A_1 \cap \cdots \cap A_n)$$
(Formula from Wikipedia.)
How to Prove it?
- Induction
- Indicator Variables and Expectation
I did the latter for an assignment; as for the former... well, good luck with that.
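As a numerical sanity check of the general formula (my own sketch, not a proof; `prob` and `inclusion_exclusion` are helpers defined here), one can verify it by brute force on a finite uniform space:

```python
from fractions import Fraction
from itertools import combinations

def prob(event, omega):
    # uniform probability measure on a finite sample space
    return Fraction(len(event & omega), len(omega))

def inclusion_exclusion(sets, omega):
    # sum over all non-empty subcollections of size k, with sign (-1)^(k+1)
    total = Fraction(0)
    for k in range(1, len(sets) + 1):
        for combo in combinations(sets, k):
            total += (-1) ** (k + 1) * prob(set.intersection(*combo), omega)
    return total

omega = set(range(60))
A = [{n for n in omega if n % d == 0} for d in (2, 3, 5)]  # three overlapping events
print(inclusion_exclusion(A, omega) == prob(set.union(*A), omega))  # True
```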
Continuity of Probability Measures
If $A_1, A_2, A_3, \dots$ is any sequence of events, then
$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \lim_{N \to \infty} P\left(\bigcup_{n=1}^{N} A_n\right)$$
This looks rather obvious (I mean, it looks really similar to how we define infinite summations), but there's actually more to this statement than meets the eye. This is a really important theorem that is used all the time in Probability Theory. Also, among other things, we are NOT taking the limit inside the brackets; an actual rigorous proof is required to show the equivalence.
Also, even though
$$\bigcup_{n=1}^{\infty} A_n = \lim_{N \to \infty} \bigcup_{n=1}^{N} A_n$$
(this limit refers to increasing inclusions of sets), it is not wise to move the limits in or out of the measure.
Indeed, one should be careful when introducing limits in a Measure Theory context, especially when it comes to bringing a limit in and out of a measure. Unless there's a specific argument to support it, or a convergence theorem that says we can, one should not assume it is correct to do so.
A proof of this theorem can be found on YouTube; the second link in the references covers it during the lecture.
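As a small numerical illustration of the theorem (my own, not from the lecture): take the increasing events $A_N = \{X \leq N\}$ for a geometric random variable $X$ with success probability $p$, so that $P(A_N) = 1 - (1-p)^N$ increases toward $P\left(\bigcup_N A_N\right) = 1$:

```python
# Increasing events A_N = {X <= N}, X geometric with success probability p.
# P(A_N) = 1 - (1 - p)**N, which increases to P(union of all A_N) = 1.
p = 0.3
probs = [1 - (1 - p) ** N for N in (1, 5, 10, 50, 100)]
print(probs[-1])  # already extremely close to the limit 1
```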
Corollaries
1.
If $A_1 \subseteq A_2 \subseteq A_3 \subseteq \cdots$ is a non-decreasing sequence of events, then $\bigcup_{n=1}^{N} A_n = A_N$ for every $N$.
Then by the previous result:
$$P\left(\bigcup_{n=1}^{\infty} A_n\right) = \lim_{N \to \infty} P(A_N)$$
2.
If $A_1 \supseteq A_2 \supseteq A_3 \supseteq \cdots$ is a non-increasing sequence of events,
Then by the previous result and DeMorgan's (applied to the complements $A_n^c$, which are non-decreasing):
$$P\left(\bigcap_{n=1}^{\infty} A_n\right) = \lim_{N \to \infty} P(A_N)$$
This is not a typo; in fact it makes perfect sense. If I keep taking intersections of a sequence of sets which are non-increasing (in the sense that $A_{n+1} \subseteq A_n$), the sets shrink toward their intersection, so the probabilities shrink toward its probability.
Observation: if the sequence is non-increasing or non-decreasing, the limit of the infinite sequence of sets will simply be $\bigcap_{n=1}^{\infty} A_n$ or $\bigcup_{n=1}^{\infty} A_n$, respectively.
Union-Bound Property
Let $A_1, A_2, A_3, \dots$ be any sequence of events, not necessarily disjoint. Then
$$P\left(\bigcup_{n=1}^{\infty} A_n\right) \leq \sum_{n=1}^{\infty} P(A_n)$$
Intuitively, if all the $A_n$ were disjoint we would get equality (that's just countable additivity); any overlap between the sets only makes the left-hand side smaller.
This can be proven pretty neatly using Indicator Variables, but it can also be shown directly from the measure axioms.
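Here is a sketch of one standard direct argument via disjointification (my reconstruction; the argument intended in the note may differ):

```latex
% Disjointify: B_1 = A_1, and B_n = A_n with everything earlier removed.
\begin{align*}
  B_n &= A_n \setminus \bigcup_{k=1}^{n-1} A_k,
  \qquad \text{so the } B_n \text{ are disjoint,} \quad
  B_n \subseteq A_n, \quad \bigcup_n B_n = \bigcup_n A_n. \\
  P\!\left(\bigcup_{n=1}^{\infty} A_n\right)
  &= P\!\left(\bigcup_{n=1}^{\infty} B_n\right)
  = \sum_{n=1}^{\infty} P(B_n)
  \leq \sum_{n=1}^{\infty} P(A_n)
  \qquad \text{by countable additivity and monotonicity.}
\end{align*}
```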
References
Probability Spaces
Probability Measure (Based on Expected Value)
Probability Measures Lecture