Introduction to Information Theory

Entropy, relative entropy. The second law of thermodynamics. The asymptotic equipartition property. Entropy and stochastic processes. Data compression. Optimal codes, Huffman codes, Shannon-Fano-Elias codes. Kolmogorov (algorithmic) complexity. Channel capacity. Shannon's fundamental theorem. Differential entropy. The Gaussian channel. Information theory and advanced statistical topics. Maximum entropy. Source coding. Approximation by Markov processes. Galois fields. The rate-distortion function. Signals and noise. Error-correcting codes: Hamming codes, Reed-Muller codes. Applications of information theory to investment theory.
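As an illustrative sketch of two of the topics above (entropy and Huffman coding), and not part of the official course material, the following Python snippet computes the Shannon entropy of a small example distribution, builds a binary Huffman code for it, and checks that the average codeword length lies between H(X) and H(X) + 1, as the source coding theorem guarantees for an optimal code. The distribution used is an arbitrary example.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Build a binary Huffman code; returns {symbol_index: codeword}."""
    # Heap items: (probability, unique tie-breaker, {symbol: codeword so far}).
    heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Merge them, prefixing their codewords with 0 and 1 respectively.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    p = [0.4, 0.3, 0.2, 0.1]              # example source distribution
    code = huffman_code(p)
    avg_len = sum(p[s] * len(w) for s, w in code.items())
    H = entropy(p)
    print("codewords:", code)
    print(f"H(X) = {H:.3f} bits, average length = {avg_len:.2f} bits")
    # Source coding theorem: H(X) <= average length < H(X) + 1.
    assert H <= avg_len < H + 1
```

For the example distribution this prints an entropy of about 1.85 bits and an average Huffman codeword length of 1.9 bits, illustrating how close an optimal prefix code comes to the entropy bound.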
Code: ΗΥ030
Hours: 4
Type: Elective
eClass: e-Class
Semester: 3

Bibliography:

  • "Εισαγωγή στη θεωρία της πληροφορίας" (Introduction to Information Theory), Αφράτη Φώτω, Εκδόσεις Συμμετρία, 1994.
  • "Εισαγωγή στη θεωρία Πληροφοριών, Κωδίκων και Κρυπτογραφίας" (Introduction to the Theory of Information, Codes and Cryptography), Ν. Αλεξανδρής, Β. Χρυσικόπουλος, Κ. Πατσάκης, Εκδόσεις Βαρβαρήγου, ISBN 978-960-7996-39-8, 2008.