Abstract: A scientifically adequate theory of semiotic processes must ultimately be founded on a theory of information that can unify the physical, biological, cognitive, and computational uses of the concept. Unfortunately, no such unification exists, and more importantly, the causal status of informational content remains ambiguous as a result. Lacking this grounding, semiotic theories have tended to be predominantly phenomenological taxonomies rather than dynamical explanations of the representational processes of natural systems. This paper argues that the problem of information that prevents the development of a scientific semiotic theory is the necessity of analyzing it as a negative relationship: defined with respect to absence. This is cryptically implicit in concepts of design and function in biology, acknowledged in psychological and philosophical accounts of intentionality and content, and explicitly formulated in the mathematical theory of communication (also known as “information theory”). Beginning from the base established by Claude Shannon, which otherwise ignores issues of content, reference, and evaluation, this two-part essay explores its relationship to two other higher-order theories that are also explicitly based on an analysis of absence: Boltzmann’s theory of thermodynamic entropy (in Part 1) and Darwin’s theory of natural selection (in Part 2). This comparison demonstrates that these theories are both formally homologous and hierarchically interdependent. Their synthesis into a general theory of entropy and information provides the necessary grounding for theories of function and semiosis.