
[Image: symmetric matrix]

Bivariate Normal Distribution

Bivariate normal distribution – derivation by linear transformation of a random vector of two independent Gaussians

In another post on properties of a Bivariate Normal Distribution [BVD] I have motivated the form of its probability density function [pdf] by symmetry arguments and the underlying probability density functions of its marginals, namely 1-dimensional Gaussians. In this post we will derive the probability density function by following the line of argumentation for a general Multivariate Normal Distribution… Read More »Bivariate normal distribution – derivation by linear transformation of a random vector of two independent Gaussians
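The construction named in the title can be sketched numerically: a minimal example (using NumPy, with an arbitrarily chosen transformation matrix A as an assumption) draws a vector Z of two independent standard Gaussians, applies X = A·Z, and checks that the sample covariance of X approaches Σ = A·Aᵀ.

```python
import numpy as np

# Sketch: bivariate normal sample built by linearly transforming a
# vector Z of two independent standard normals, X = A @ Z.
rng = np.random.default_rng(0)
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])             # any invertible 2x2 matrix (assumed)
Z = rng.standard_normal((2, 100_000))  # independent N(0, 1) components
X = A @ Z                              # linearly transformed sample

# The covariance of X should approach Sigma = A @ A.T
Sigma = A @ A.T
sample_cov = np.cov(X)
print(np.round(Sigma, 2))
print(np.round(sample_cov, 2))
```

The sample covariance converges to Σ at the usual Monte Carlo rate, so with 100,000 draws the agreement is already close.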

[Image: Linear transformed 3-dim Z-distribution]

Multivariate Normal Distributions – II – Linear transformation of a random vector with independent standardized normal components

In Machine Learning we typically deal with huge but finite vector distributions defined in ℝⁿ. At least in certain regions of ℝⁿ these distributions may approximate an underlying continuous distribution. In the first post of this series we worked with a special type of continuous vector distribution based on independent 1-dimensional standardized normal distributions for the vector components.… Read More »Multivariate Normal Distributions – II – Linear transformation of a random vector with independent standardized normal components
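The change-of-variables argument behind such a linear transformation can be verified at a single point: for X = A·Z with Z ~ N(0, I₃), the density is p_X(x) = p_Z(A⁻¹x) / |det A|, which is exactly the density of N(0, A·Aᵀ). A small sketch (matrix A and test point x are arbitrary assumptions; SciPy is used only for the cross-check):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Sketch: for X = A @ Z with Z ~ N(0, I_3), the change-of-variables
# formula gives p_X(x) = p_Z(A^{-1} x) / |det A|, i.e. X ~ N(0, A A^T).
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 2.0, 0.3],
              [0.0, 0.0, 1.5]])     # invertible 3x3 transformation (assumed)
x = np.array([0.4, -1.0, 0.7])      # arbitrary test point (assumed)

# left side: change of variables applied to the standard normal density p_Z
z = np.linalg.solve(A, x)
p_z = np.exp(-0.5 * z @ z) / (2 * np.pi) ** 1.5
lhs = p_z / abs(np.linalg.det(A))

# right side: density of N(0, A A^T) evaluated directly
rhs = multivariate_normal(mean=np.zeros(3), cov=A @ A.T).pdf(x)
print(lhs, rhs)
```

Both numbers agree to machine precision, which is the content of the derivation the post works through.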

Surfaces of n-dimensional ellipsoids – I – quadratic form and matrix equation

Multidimensional ellipsoids are mathematically interesting figures per se. But there is a reason why they sometimes also appear in the context of Machine Learning experiments. One reason is that Multivariate Normal Distributions [MND] often describe the statistical distributions of properties that characterize the natural objects we investigate with ML methods. And the locations of constant probability density of MNDs are surfaces of… Read More »Surfaces of n-dimensional ellipsoids – I – quadratic form and matrix equation
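The quadratic form named in the title can be illustrated in a few lines: points x satisfying xᵀ·A·x = 1 for a symmetric positive definite A lie on an ellipsoid, and any direction can be scaled onto that surface. A sketch (the matrix A is an arbitrary positive definite example, not one from the post):

```python
import numpy as np

# Sketch: points x with x^T A x = 1 (A symmetric positive definite)
# lie on an ellipsoid; each unit direction u can be rescaled onto it.
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])     # symmetric positive definite (assumed)

rng = np.random.default_rng(1)
u = rng.standard_normal((3, 5))
u /= np.linalg.norm(u, axis=0)      # random directions on the unit sphere
# scale each direction u so that x = s*u satisfies x^T A x = 1
scale = 1.0 / np.sqrt(np.einsum('ij,ik,kj->j', u, A, u))
X = u * scale

q = np.einsum('ij,ik,kj->j', X, A, X)   # quadratic form for each point
print(np.round(q, 6))
```

The semi-axes of this ellipsoid are 1/√λᵢ for the eigenvalues λᵢ of A, which connects the matrix equation to the geometric picture the series develops.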