INFORMATION-THEORETIC METRICS, THERMODYNAMIC METRICS, AND THEIR APPLICATIONS
In this thesis, we present information-theoretic metrics, thermodynamic metrics, and their applications in computer science.

First, we show that the information rate of the language accepted by a reversal-bounded deterministic counter machine is computable; for the nondeterministic case, we provide computable upper bounds. The information rate is likewise computable for the class of languages accepted by multi-tape deterministic finite automata.

Second, we study the Shannon information rate of accepting runs of various forms of automata, which serves as a complexity indicator for the executions of these automata. We investigate accepting runs of finite automata and reversal-bounded nondeterministic counter machines, as well as their restrictions and variations, and show that in many cases the execution rate is computable.

Third, we analyze the bit rate of programs and use it to conduct dynamic program analysis. We view a program as a device that generates discrete-time signals, where each signal is an execution. The Shannon information rate, or bit rate, of a signal need not be uniformly distributed across the signal, and testing strategies can benefit from this non-uniformity. When the program is specified by a finite-state transition system, we provide algorithms for identifying its information-rich components, on which test cases should focus. For a black-box program with only a partial specification, or none at all, we study a bit-rate signal and its spectrum, obtained using data compression and the Fourier transform. The signal provides bit-rate coverage for testing the black box, while its spectrum gives a visual representation of an execution's information characteristics.

Fourth, we present a free energy model to measure similarity between automata and formal languages.

Last, we propose two heuristic algorithms for the graph isomorphism problem.
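The abstract leaves the notion of information rate implicit; the standard definition in this line of work (due to Chomsky and Miller) counts the words of each length and measures their exponential growth. Assuming that is the intended notion:

```latex
\lambda_L \;=\; \limsup_{n \to \infty} \frac{\log_2 S_n(L)}{n},
\qquad\text{where } S_n(L) = \bigl|\{\, w \in L : |w| = n \,\}\bigr|
```

Here \(\lambda_L\) is measured in bits per symbol: a language containing all words over a binary alphabet has rate 1, while a finite language has rate 0.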
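For regular languages the information rate is classically computable: the number of length-n words accepted by a trimmed DFA grows like the spectral radius of the automaton's transition-count matrix raised to the n, so the rate is the base-2 logarithm of that radius. The sketch below estimates it by power iteration; it is illustrative only and is not the thesis's algorithm for counter machines or multi-tape automata.

```python
import math

def growth_rate(adj, iters=500):
    """Estimate log2 of the spectral radius of a nonnegative matrix
    by power iteration.  For a trimmed DFA, adj[i][j] is the number of
    transitions from state i to state j, and the result approximates
    the information rate (bits per symbol) of the accepted language."""
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        # one multiplication by the transition-count matrix
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)          # infinity-norm estimate of the dominant eigenvalue
        v = [x / lam for x in w]
    return math.log2(lam)
```

For example, the language of binary words with no two consecutive b's has a two-state DFA with transition counts [[1, 1], [1, 0]]; its word counts are Fibonacci numbers, so the rate is log2 of the golden ratio, roughly 0.694 bits per symbol.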
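The compression-based bit-rate signal for black-box executions can be sketched as follows: slide a window over the execution trace and take the compressed size of each window as a local bit-rate estimate. The helper below is a hypothetical illustration using zlib, not the thesis's exact construction; window and step sizes are arbitrary choices.

```python
import zlib

def bitrate_signal(trace, window=64, step=16):
    """Sliding-window compression estimate of the bit rate of an
    execution trace (a string of event symbols).  Returns one value
    per window position, in bits per symbol; high values mark
    information-rich regions of the execution."""
    rates = []
    for start in range(0, len(trace) - window + 1, step):
        chunk = trace[start:start + window].encode()
        compressed = len(zlib.compress(chunk, 9))
        rates.append(8.0 * compressed / window)  # bits per symbol
    return rates
```

A highly repetitive trace compresses well and yields a low signal, while an irregular trace yields a high one; applying a discrete Fourier transform to this signal would then give the spectrum the abstract mentions.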