Victoria Kostina

About Me

I am a Professor of Electrical Engineering and Computing and Mathematical Sciences at Caltech.

Prior to joining Caltech in the fall of 2014, I received a Bachelor's degree from the Moscow Institute of Physics and Technology, where I was affiliated with the Institute for Information Transmission Problems of the Russian Academy of Sciences, a Master's degree from the University of Ottawa, and a PhD from Princeton University. I spent the spring of 2015 as a Research Fellow at the Simons Institute for the Theory of Computing.

I received the 2013 Princeton Electrical Engineering Best Dissertation Award and the 2017 NSF CAREER award.

My research interests lie in information theory, theory of random processes, coding, wireless communications, learning, and control. I am particularly interested in fundamental limits of delay-sensitive communications.



I am looking for strong students and postdocs to join my research group.

Prospective students: I apologize that I am unable to respond to all inquiries; please apply online and mention my name as a possible research advisor. I supervise students in the Computing and Mathematical Sciences (CMS), Control and Dynamical Systems (CDS), and Electrical Engineering (EE) PhD programs.

Prospective postdocs: please apply through the Center for the Mathematics of Information (CMI) postdoctoral fellowship program and mention my name in the application.



Journal Articles

Ph.D. Dissertation

Conference Papers


Patents

  • M. Effros, V. Kostina, R. C. Yavas, Systems and methods for random access communication, US Patent No. 10,951,292, Mar. 16, 2021.

Matlab Toolbox


  • EE/CS/IDS 160: Fundamentals of Information Transmission and Storage Winter 2020, Winter 2021
    Basics of information theory: entropy, mutual information, source and channel coding theorems. Basics of coding theory: error-correcting codes for information transmission and storage, block codes, algebraic codes, sparse graph codes. Basics of digital communications: sampling, quantization, digital modulation, matched filters, equalization.
  • EE/CS/IDS 167: Introduction to Data Compression and Storage Spring 2017, Spring 2019
    Prerequisites: Ma 3 or ACM/EE/IDS 116. The course will introduce the students to the basic principles and techniques of codes for data compression and storage. The students will master the basic algorithms used for lossless and lossy compression of digital and analog data and the major ideas behind coding for flash memories. Topics include the Huffman code, the arithmetic code, Lempel-Ziv dictionary techniques, scalar and vector quantizers, transform coding, and codes for constrained storage systems.
  • EE/Ma/CS/IDS 127: Error-Correcting Codes Winter 2016, Winter 2017, Fall 2017, Winter 2019
    This course develops from first principles the theory and practical implementation of the most important techniques for combating errors in digital transmission or storage systems. Topics include algebraic block codes, e.g., Hamming, BCH, Reed-Solomon (including a self-contained introduction to the theory of finite fields); and the modern theory of sparse graph codes with iterative decoding, e.g., LDPC codes, turbo codes. The students will become acquainted with encoding and decoding algorithms, design principles, and performance evaluation of codes.
  • EE/Ma/CS/IDS 136: Topics in Information Theory Spring 2016, Spring 2018
    Prerequisites: Ma 3 or ACM/EE/IDS 116. This class introduces information measures such as entropy, information divergence, mutual information, information density from a probabilistic point of view, and discusses the relations of those quantities to problems in data compression and transmission, statistical inference, language modeling, game theory, and control. Topics include information projection, data processing inequalities, sufficient statistics, hypothesis testing, single-shot approach in information theory, large deviations.
  • EE 150: Nonasymptotic Information Theory Fall 2014
    Prerequisites: EE/Ma 126. Delay-constrained theory of information: single-shot results, information spectrum methods. Information-theoretic limits for sources and channels with memory and/or general alphabets. Advantages of variable-length, feedback, and joint source-channel coding in the nonasymptotic regime. Error exponents; source and channel dispersion.
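As a small illustration of the information measures that recur throughout these courses, here is a minimal sketch (not course material, just an example) computing the Shannon entropy of a discrete distribution in bits:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A fair coin carries 1 bit of uncertainty; a deterministic source carries 0.
print(entropy([0.5, 0.5]))        # 1.0
print(entropy([1.0]))             # 0.0
print(entropy([0.25] * 4))        # 2.0 (two fair coin flips)
```

Entropy is the fundamental limit for lossless compression: no code can achieve an expected length below the source entropy, which is the starting point for the source coding theorems above.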


In the News


Office: Moore 162A
Mailing address: 1200 E California Blvd
MC 136-93
Pasadena CA 91125
Phone number: (626) 395-1320
Admin Assistant: Gabrielle Weise
Moore 162B
(626) 395-4715

Last updated September 12, 2022.