Posts
Logical Quantifier

Convolutions of RVs — Part 2: A linear combination of uniform RVs

1. Introduction In the last post, we defined the convolution of the PDFs fX and fY of two independent, continuous random variables X and Y, and showed that it computed the ...

Convolutions of random variables — Part 1: Introduction

1. Introduction Suppose that X and Y are two continuous random variables. Given their PDFs fX and fY, how do you compute the PDF fX+Y of the sum X+Y? In th...

Order statistics — Part 3: Proof of expectation formula for uniform RVs

1. Introduction In the last post, we proved the following general formulae, after conducting some numerical experiments to gain intuition: \begin{align} F_{X_{(k)}}(x) &= \sum_{j=k}^{n} \bi...

Order statistics — Part 2: General formulae for PDFs and expectations

1. Introduction In the last post, we defined the order statistics of a collection of iid random variables X1,…,Xn, to try and answer questions like “What’s the expected value of...

Order statistics — Part 1: Introduction and definitions

1. Introduction On average, how tall is the tallest 8th grader in a class of 30 students, in the US? What is the value of the average winning bid in a Vickrey auction, where the highest bidder w...

Cross entropy — Part 2: Convexity of the objective function

1. Introduction In the previous post, I recalled the definition of the cross entropy H(p,q) of two discrete PDFs p and q over the same support S={x1,…,xn}. It...

Cross entropy — Part 1: Introduction and definitions

1. Introduction In this post, I discuss the cross entropy H(p,q) of two discrete PDFs p and q. It is a loose measure of similarity of p and q, and so is used in machine lear...