Information Theory and Coding



Time and place:

  • Tue 12:15-13:45, Room H6
  • Every two weeks: Wed 14:15-15:45, Room R4.15

Degree programs / specializations

  • WF EEI-BA 56
  • PF EEI-MA-INT 1234
  • PF CE-BA-TA-IT 5
  • WF CE-MA-TA-IT 1
  • PF EEI-BA-INT 56
  • WPF IuK-BA 56
  • PF IuK-MA-ÜTMK-EEI 1234
  • WPF IuK-MA-ES-EEI 1234
  • WPF IuK-MA-KN-EEI 1234
  • WPF IuK-MA-MMS-EEI 1234
  • WPF IuK-MA-REA-EEI 1234
  • WPF IuK-MA-ES 1234
  • PF IuK-MA-KOMÜ 1234
  • WPF IuK-MA-MMS 1234
  • WPF WING-MA 1234
  • PF CME-MA 1
  • PF ASC-MA 1
  • WPF MT-MA-BDV 123456789ABCDEF


Content:

  • Introduction to coding and information theory (binomial distribution, (7,4) Hamming code, parity-check matrix, generator matrix)
  • Probability, entropy, and inference (entropy, conditional probability, Bayes’ law, likelihood, Jensen’s inequality)
  • Inference (inverse probability, statistical inference)
  • Source coding theorem (information content, typical sequences, Chebyshev inequality, law of large numbers)
  • Symbol codes (unique decodability, expected codeword length, prefix-free codes, Kraft inequality, Huffman coding)
  • Stream codes (arithmetic coding, Lempel-Ziv coding, Burrows-Wheeler transform)
  • Dependent random variables (mutual information, data processing lemma)
  • Communication over a noisy channel (discrete memoryless channel, channel coding theorem, channel capacity)
  • Noisy-channel coding theorem (jointly typical sequences, proof of the channel coding theorem, proof of the converse, symmetric channels)
  • Gaussian channel (AWGN channel, multivariate Gaussian pdf, capacity of the AWGN channel)
  • Binary codes (minimum distance, perfect codes, why perfect codes are bad, why distance isn’t everything)
  • Message passing (distributed counting, path counting, low-cost path, min-sum (= Viterbi) algorithm)
  • Marginalization in graphs (factor graphs, sum-product algorithm)
  • Low-density parity-check codes (density evolution, check node degree, regular vs. irregular codes, girth)
  • Lossy source coding (transform coding and JPEG compression)
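As a small taste of the first topic, the sketch below encodes four data bits with a (7,4) Hamming code and corrects a single bit error via the syndrome. The generator matrix G and parity-check matrix H follow the terminology of the course outline; the concrete systematic matrices chosen here are one common textbook convention, not necessarily the ones used in the lecture.

```python
# (7,4) Hamming code sketch: generator matrix G in systematic form [I_4 | P],
# parity-check matrix H = [P^T | I_3], all arithmetic over GF(2).
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(data):
    """Multiply the 4-bit data vector by G over GF(2): c = d * G mod 2."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

def correct(received):
    """Compute the syndrome s = H * r^T mod 2; a nonzero syndrome equals
    the column of H at the error position, so flip that bit."""
    syndrome = [sum(h, 0) % 2 for h in
                ([a * b for a, b in zip(row, received)] for row in H)]
    if any(syndrome):
        for i, col in enumerate(zip(*H)):
            if list(col) == syndrome:
                received[i] ^= 1  # correct the single bit error
                break
    return received

codeword = encode([1, 0, 1, 1])
noisy = codeword[:]
noisy[2] ^= 1                      # introduce one bit error
assert correct(noisy) == codeword  # the error is found and corrected
```

Since the minimum distance of this code is 3, any single bit error produces a syndrome that uniquely identifies the flipped position; two or more errors exceed its correction capability.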