Statistical Inference/Probability Theory

Set Theory

Sample Space [Definition 1.1.1]

The set, S, of all possible outcomes of a particular experiment is called the sample space for the experiment.

Event [Definition 1.1.2]

An event is any collection of possible outcomes of an experiment, that is, any subset of S (including S itself).

Let A be an event, a subset of S. We say the event A occurs if the outcome of the experiment is in the set A. When speaking of probabilities, we generally speak of the probability of an event rather than of a set, but the two terms may be used interchangeably.
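
For example, if the experiment consists of tossing a coin twice and recording the faces in order, the sample space is

<amsmath>S = \{HH, HT, TH, TT\}</amsmath>

and the event "at least one head is observed" is the subset

<amsmath>A = \{HH, HT, TH\}</amsmath>

which occurs for every outcome except TT.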

Base Relationships

We first formally define the following two relationships, which allow us to order and equate sets:

Containment:

<amsmath>A \subset B \Leftrightarrow (x \in A \Rightarrow x \in B)</amsmath>

Equality:

<amsmath>A = B \Leftrightarrow A \subset B \land B \subset A</amsmath>
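
For finite sets these two relationships can be checked mechanically. Below is a minimal Python sketch using the built-in set type; the particular sets A, B, and C are illustrative choices, not taken from the text.

```python
# Containment and equality illustrated with Python's built-in set type.
A = {1, 2}
B = {1, 2, 3}

# Containment: A ⊂ B iff every element of A is also an element of B.
print(A.issubset(B))             # True
print(all(x in B for x in A))    # element-wise restatement of the definition: True

# Equality: A = B iff A ⊂ B and B ⊂ A.
C = {2, 1}                       # same elements as A, listed in a different order
print(A == C)                            # True
print(A.issubset(C) and C.issubset(A))   # mutual containment gives the same answer
```

Note that the containment defined above allows A = B, so it corresponds to Python's issubset rather than to proper (strict) containment.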

Base Operations

Given any two events (or sets) A and B, we have the following elementary set operations:

Union: The union of A and B, written <amsmath>A \cup B</amsmath>, is the set of elements that belong to either A or B or both:

<amsmath>A \cup B = \{ x : x \in A \lor x \in B \}</amsmath>

Intersection: The intersection of A and B, written <amsmath>A \cap B</amsmath>, is the set of elements that belong to both A and B:

<amsmath>A \cap B = \{ x : x \in A \land x \in B \}</amsmath>

Complementation: The complement of A, written <amsmath>A^c</amsmath>, is the set of all elements that are not in A:

<amsmath>A^c = \{ x : x \notin A \}</amsmath>
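
For a finite sample space these operations can be computed directly. Below is a minimal Python sketch; the die-roll sample space and the events A and B are illustrative choices, not taken from the text.

```python
# Union, intersection, and complement of events in a small finite sample space.
S = {1, 2, 3, 4, 5, 6}   # sample space: one roll of a six-sided die (illustrative)
A = {1, 2, 3}            # event "the roll is at most 3"
B = {2, 4, 6}            # event "the roll is even"

print(A | B)   # union A ∪ B:                  {1, 2, 3, 4, 6}
print(A & B)   # intersection A ∩ B:           {2}
print(S - A)   # complement A^c, relative to S: {4, 5, 6}
```

Because Python sets carry no ambient universe, the complement is computed as the set difference S - A.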

Event Operations

The elementary set operations can be combined: for any three events, A, B, and C, defined on a sample space S, the following relationships hold [Theorem 1.1.4].

Commutativity

  • <amsmath>A \cup B = B \cup A</amsmath>
  • <amsmath>A \cap B = B \cap A</amsmath>

Associativity

  • <amsmath>A \cup (B \cup C) = (A \cup B) \cup C</amsmath>
  • <amsmath>A \cap (B \cap C) = (A \cap B) \cap C</amsmath>

Distributive Laws

  • <amsmath>A \cap (B \cup C) = (A \cap B) \cup (A \cap C)</amsmath>
  • <amsmath>A \cup (B \cap C) = (A \cup B) \cap (A \cup C)</amsmath>

DeMorgan's Laws

  • <amsmath>(A \cup B)^c = A^c \cap B^c</amsmath>
  • <amsmath>(A \cap B)^c = A^c \cup B^c</amsmath>
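
These identities can be spot-checked on small finite sets. The following minimal Python sketch verifies each part of Theorem 1.1.4 for one illustrative choice of A, B, and C, with complements taken relative to a finite sample space S.

```python
# Spot-check of the Theorem 1.1.4 identities on one finite example.
S = set(range(10))
A, B, C = {0, 1, 2, 3}, {2, 3, 4, 5}, {3, 5, 7, 9}

def comp(X):
    """Complement of X relative to the sample space S."""
    return S - X

# Commutativity
assert A | B == B | A
assert A & B == B & A

# Associativity
assert A | (B | C) == (A | B) | C
assert A & (B & C) == (A & B) & C

# Distributive laws
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)

# DeMorgan's laws
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

print("all identities hold for this choice of A, B, C")
```

A passing run only confirms the identities for this particular example; the theorem itself asserts them for arbitrary events on any sample space.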

Basics of Probability Theory

Conditional Probability and Independence

Random Variables

Distribution Functions

Density and Mass Functions