Functions, categories, and information. How mathematical objects relate to each other — morphisms, entropy, signal processing.
Mapping (Part VII) · Chapters: 22, 23, 24, 25 · Plane Position: (0.2, 0.4), radius 0.4 · Primitives: 42
Key Concepts: Category, Probability Axioms, Functor, Natural Transformation, Shannon Entropy
Category (definition): A category C consists of a collection of objects ob(C), a collection of morphisms hom(C) between objects, an identity morphism id_A for each object A, and a composition operation that is associative and respects identities.
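As a concrete sketch (not from the source), the category whose objects are Python types and whose morphisms are functions illustrates the definition: composition is associative and the identity function is a unit. The helpers `compose` and `identity` are illustrative names.

```python
# Sketch: functions under composition form a category.
def identity(x):
    return x

def compose(g, f):
    # Composition g . f: apply f first, then g.
    return lambda x: g(f(x))

f = lambda x: x + 1        # f: int -> int
g = lambda x: 2 * x        # g: int -> int
h = lambda x: x - 3        # h: int -> int

# Associativity: h . (g . f) == (h . g) . f, checked pointwise at x = 5.
assert compose(h, compose(g, f))(5) == compose(compose(h, g), f)(5)

# Identity laws: id . f == f == f . id, checked pointwise.
assert compose(identity, f)(5) == f(5) == compose(f, identity)(5)
```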
Probability Axioms (axiom): Kolmogorov's axioms: For a sample space Omega with sigma-algebra F, a probability measure P satisfies: (1) P(A) >= 0 for all A in F, (2) P(Omega) = 1, (3) countable additivity: P(union A_i) = sum P(A_i) for any countable collection of pairwise disjoint events A_i in F.
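The axioms can be verified mechanically on a finite sample space, where the sigma-algebra is the full power set. A minimal sketch, assuming a fair coin for the point masses:

```python
from itertools import chain, combinations
from fractions import Fraction

# Finite sample space; sigma-algebra = power set; P(A) = sum of point masses.
omega = {"H", "T"}
p = {"H": Fraction(1, 2), "T": Fraction(1, 2)}  # assumed fair coin

def P(event):
    return sum(p[w] for w in event)

# Enumerate every event in the power set of omega.
events = [set(s) for s in chain.from_iterable(
    combinations(sorted(omega), r) for r in range(len(omega) + 1))]

assert all(P(A) >= 0 for A in events)           # axiom 1: non-negativity
assert P(omega) == 1                            # axiom 2: normalization
assert P({"H"} | {"T"}) == P({"H"}) + P({"T"})  # axiom 3 (finite case): additivity
```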
Functor (definition): A functor F: C -> D maps objects of C to objects of D and morphisms of C to morphisms of D, preserving identity morphisms F(id_A) = id_{F(A)} and composition F(g . f) = F(g) . F(f).
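The list construction is a standard example: it sends a type A to lists of A, and a function f to "apply f elementwise". A sketch (the name `fmap` is borrowed from functional-programming convention) checking both functor laws:

```python
# Sketch: the list functor. Objects A -> list[A]; morphisms f -> fmap(f).
def fmap(f):
    return lambda xs: [f(x) for x in xs]

def compose(g, f):
    return lambda x: g(f(x))

f = lambda x: x + 1
g = lambda x: x * 2
xs = [1, 2, 3]

# Preserves identities: fmap(id) acts as the identity on lists.
assert fmap(lambda x: x)(xs) == xs
# Preserves composition: fmap(g . f) == fmap(g) . fmap(f).
assert fmap(compose(g, f))(xs) == fmap(g)(fmap(f)(xs))
```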
Natural Transformation (definition): A natural transformation eta: F => G between functors F, G: C -> D is a family of morphisms eta_A: F(A) -> G(A) for each object A in C, such that for every morphism f: A -> B in C, the diagram commutes: G(f) . eta_A = eta_B . F(f).
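List reversal is a natural transformation from the list functor to itself: the same formula works at every type, and the commuting square G(f) . eta_A = eta_B . F(f) can be checked directly. A sketch under those identifications:

```python
# Sketch: reversal as a natural transformation eta: List => List.
def fmap(f):
    return lambda xs: [f(x) for x in xs]

def eta(xs):
    # eta_A: the same reversal formula at every type A (naturality's essence).
    return list(reversed(xs))

f = lambda x: x * x   # a morphism f: A -> B
xs = [1, 2, 3]

# Naturality square with F = G = List: fmap(f) . eta == eta . fmap(f).
assert fmap(f)(eta(xs)) == eta(fmap(f)(xs))
```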
Shannon Entropy (definition): For a discrete random variable X with probability mass function P(x), the Shannon entropy is H(X) = -sum_x P(x) log_2 P(x), measuring the average information content in bits per symbol.
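The formula translates directly into code; by convention, terms with P(x) = 0 contribute nothing (x log x -> 0 as x -> 0). A sketch with a fair and a biased coin:

```python
import math

# Sketch: H(X) = -sum_x P(x) log2 P(x), in bits; zero-probability terms skipped.
def shannon_entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

fair_coin = {"H": 0.5, "T": 0.5}
assert shannon_entropy(fair_coin) == 1.0   # exactly one bit per fair flip

biased_coin = {"H": 0.9, "T": 0.1}
assert shannon_entropy(biased_coin) < 1.0  # less uncertainty, lower entropy
```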
Fourier Transform (definition): The Fourier transform of an integrable function f: R -> C is F{f}(xi) = integral_{-inf}^{inf} f(t) e^{-2pi i xi t} dt, mapping the function from the time domain to the frequency domain.
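The finite analogue of this integral is the discrete Fourier transform, X[k] = sum_n x[n] e^{-2 pi i k n / N}. A naive O(N^2) sketch (not an FFT) showing that a pure cosine concentrates its energy in two frequency bins:

```python
import cmath
import math

# Sketch: naive DFT, X[k] = sum_n x[n] exp(-2 pi i k n / N).
def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

N = 8
# A cosine completing 2 cycles over the window of N samples.
signal = [math.cos(2 * math.pi * 2 * n / N) for n in range(N)]
spectrum = [abs(c) for c in dft(signal)]

# Energy sits in bins k = 2 and k = N - 2 (the conjugate-symmetric partner).
assert max(range(N), key=lambda k: spectrum[k]) in (2, N - 2)
```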
Random Variable (definition): A random variable X: Omega -> R is a measurable function from the sample space to the real numbers, assigning a numerical value to each outcome. Its distribution is characterized by the CDF F_X(x) = P(X <= x).
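For a finite sample space the definition is easy to make concrete. A sketch using a fair six-sided die, where X is the identity map and the CDF is built from uniform point masses:

```python
from fractions import Fraction

# Sketch: a random variable on Omega = {1..6} with uniform point masses.
omega = range(1, 7)
X = lambda w: w                     # identity here; any measurable map works
p = {w: Fraction(1, 6) for w in omega}

def cdf(x):
    # F_X(x) = P(X <= x): sum the masses of outcomes mapped at or below x.
    return sum(p[w] for w in omega if X(w) <= x)

assert cdf(0) == 0                  # below the support
assert cdf(3) == Fraction(1, 2)     # P(X <= 3) = 3/6
assert cdf(6) == 1                  # the whole sample space
```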
Variance (definition): The variance of a random variable X is Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2, measuring the expected squared deviation from the mean. Standard deviation sigma = sqrt(Var(X)).
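Both forms of the variance formula can be checked against each other exactly with rational arithmetic. A sketch for a fair die (Var(X) = 35/12):

```python
from fractions import Fraction

# Sketch: variance of a fair die via both equivalent formulas.
outcomes = range(1, 7)
p = Fraction(1, 6)

E_X  = sum(p * x for x in outcomes)        # E[X]   = 7/2
E_X2 = sum(p * x * x for x in outcomes)    # E[X^2] = 91/6

var_definition = sum(p * (x - E_X) ** 2 for x in outcomes)  # E[(X - E[X])^2]
var_shortcut   = E_X2 - E_X ** 2                            # E[X^2] - (E[X])^2

assert var_definition == var_shortcut == Fraction(35, 12)
```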
Probability Distributions (definition): A probability distribution assigns probabilities to events. Key distributions: Binomial B(n,p) with PMF C(n,k)p^k(1-p)^{n-k}; Poisson Poi(lambda) with PMF e^{-lambda}lambda^k/k!; Normal N(mu,sigma^2) with PDF (1/(sigma*sqrt(2pi)))exp(-(x-mu)^2/(2sigma^2)).
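The three formulas above translate directly into code; the parameter values in the spot-checks below are illustrative, not from the source.

```python
import math

# Sketch: the binomial and Poisson PMFs and the normal PDF, as defined above.
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# The binomial PMF sums to 1 over its support {0, ..., n}.
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1.0) < 1e-12
# The Poisson PMF at k = 0 reduces to e^{-lambda}.
assert abs(poisson_pmf(0, 2.0) - math.exp(-2.0)) < 1e-12
# The standard normal density peaks at 1/sqrt(2 pi) at x = mu.
assert abs(normal_pdf(0.0, 0.0, 1.0) - 1 / math.sqrt(2 * math.pi)) < 1e-12
```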
Joint Probability Distribution (definition): The joint distribution of random variables X, Y is P(X in A, Y in B) for all measurable sets A, B. For continuous variables, the joint density f_{X,Y}(x,y) satisfies P(X in A, Y in B) = integral_A integral_B f(x,y) dy dx.
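In the discrete case the joint distribution is a table P(X = x, Y = y), and marginals are recovered by summing out the other variable. A sketch with an assumed uniform joint table for two binary variables, which also happens to factor into its marginals (independence):

```python
from fractions import Fraction

# Sketch: a discrete joint PMF and its marginals.
joint = {                                       # assumed P(X = x, Y = y)
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

def marginal_X(x):
    # P(X = x) = sum over y of P(X = x, Y = y).
    return sum(pr for (xv, _), pr in joint.items() if xv == x)

def marginal_Y(y):
    return sum(pr for (_, yv), pr in joint.items() if yv == y)

assert sum(joint.values()) == 1                 # a valid joint PMF
assert marginal_X(0) == Fraction(1, 2)
# This joint factors into its marginals, so X and Y are independent.
assert all(joint[(x, y)] == marginal_X(x) * marginal_Y(y)
           for x in (0, 1) for y in (0, 1))
```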