Abstract
The notion of δ-mutual information between non-stochastic uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are established and then applied in a communication setting to show that the largest δ-mutual information between received and transmitted codewords over ϵ-noise channels equals the (ϵ,δ)-capacity. This notion of capacity generalizes the Kolmogorov ϵ-capacity to packing sets with overlap at most δ, and is a variation of a previous definition proposed by one of the authors. The results are then extended to more general noise models, including non-stochastic, memoryless, and stationary channels. The presented theory admits the possibility of decoding errors, as in classical information theory, while retaining the worst-case, non-stochastic character of Kolmogorov's approach.