Posts

Convolutions of RVs — Part 2: A linear combination of uniform RVs

1. Introduction In the last post, we defined the convolution of the PDFs \(f_{X}\) and \(f_{Y}\) of two independent, continuous random variables \(X\) and \(Y\), and showed that it computed the ...
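
As a concrete taste of the computation in this post (a standard worked example, stated here for reference, not quoted from the post): for independent \(X, Y \sim \mathrm{Uniform}(0, 1)\), the convolution integral produces the triangular density

\begin{align}
f_{X+Y}(z) = \int_{-\infty}^{\infty} f_{X}(x) \, f_{Y}(z - x) \, dx =
\begin{cases}
z & 0 \le z \le 1, \\
2 - z & 1 < z \le 2, \\
0 & \text{otherwise.}
\end{cases}
\end{align}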

Convolutions of random variables — Part 1: Introduction

1. Introduction Suppose that \(X\) and \(Y\) are two continuous random variables. Given their PDFs \(f_{X}\) and \(f_{Y}\), how do you compute the PDF \(f_{X + Y}\) of the sum \(X+Y\)? In th...
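
The answer previewed here is the standard convolution identity for independent continuous random variables (stated for reference; the post derives it):

\begin{align}
f_{X+Y}(z) = (f_{X} * f_{Y})(z) = \int_{-\infty}^{\infty} f_{X}(x) \, f_{Y}(z - x) \, dx.
\end{align}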

Order statistics — Part 3: Proof of expectation formula for uniform RVs

1. Introduction In the last post, we proved the following general formulae, after conducting some numerical experiments to gain intuition: \begin{align} F_{X_{(k)}}(x) &= \sum_{j=k}^{n} \bi...
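
Written out in full, the truncated display above is the standard CDF formula for the \(k\)-th order statistic, and the uniform expectation result the post proves is the usual one (both stated here for reference):

\begin{align}
F_{X_{(k)}}(x) &= \sum_{j=k}^{n} \binom{n}{j} F(x)^{j} \left( 1 - F(x) \right)^{n-j}, \\
E[X_{(k)}] &= \frac{k}{n+1} \quad \text{for iid } X_{1}, \ldots, X_{n} \sim \mathrm{Uniform}(0, 1).
\end{align}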

Order statistics — Part 2: General formulae for PDFs and expectations

1. Introduction In the last post, we defined the order statistics of a collection of iid random variables \(X_{1}, \ldots, X_{n}\), to try to answer questions like “What’s the expected value of...
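
For reference, the standard general formula for the PDF of the \(k\)-th order statistic of \(n\) iid variables with CDF \(F\) and PDF \(f\) (presumably the form the post derives) is

\begin{align}
f_{X_{(k)}}(x) = \frac{n!}{(k-1)! \, (n-k)!} \, F(x)^{k-1} \left( 1 - F(x) \right)^{n-k} f(x).
\end{align}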

Order statistics — Part 1: Introduction and definitions

1. Introduction On average, how tall is the tallest 8th grader in a class of 30 students in the US? What is the value of the average winning bid in a Vickrey auction, where the highest bidder w...

Cross entropy — Part 2: Convexity of the objective function

1. Introduction In the previous post, I recalled the definition of the cross entropy \(H(p, q)\) of two discrete PDFs \(p\) and \(q\) over the same support \(S = \{ x_{1}, \ldots, x_{n} \}\). It...
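
A one-line version of the convexity fact (standard, stated independently of the post's own proof): with \(p\) held fixed, the map

\begin{align}
q \mapsto H(p, q) = -\sum_{i=1}^{n} p(x_{i}) \log q(x_{i})
\end{align}

is convex, since each summand is a nonnegative multiple of the convex function \(-\log\), and nonnegative combinations of convex functions are convex.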

Cross entropy — Part 1: Introduction and definitions

1. Introduction In this post, I discuss the cross entropy \(H(p, q)\) of two discrete PDFs \(p\) and \(q\). It is a loose measure of similarity of \(p\) and \(q\), and so is used in machine lear...
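
For reference, the definition in its standard form:

\begin{align}
H(p, q) = -\sum_{i=1}^{n} p(x_{i}) \log q(x_{i}).
\end{align}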