Marr's Levels of Analysis
Marr's Levels of Analysis are a framework for analyzing systems at three levels of abstraction.[1][2] These levels specify (1) the system's input/output behavior, (2) the system's computational operations abstracted as algorithmic steps, and (3) the system's physical implementation. Marr's levels can be used to classify theories of consciousness.
Example: Selection Sort
Our running example for illustrating Marr's Levels will be a process implementing selection sort, arguably the simplest sorting algorithm. Note that selection sort is defined as an algorithm, so whether a physical process counts as an instance of selection sort depends on its description at the algorithmic level.
Computational Level
The computational level is the most abstract level of the three. You can think of it as treating the process as a black box, where the internal mechanisms are hidden. The computational description consists of:
- the input (which here is an unsorted list)
- the output (which here is a sorted version of the same list)
Additionally, one may consider the runtime (which here depends on the size of the list but will generally be proportional to n²) to be part of the input/output description, since it can be observed from the outside, even though Marr's original formulation did not include it. Unlike the time complexities typically given in complexity theory, this time would be specified absolutely (i.e., as a specific time in seconds) rather than asymptotically.
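As a rough illustration, the input/output description of our running example can be written down as a black-box check that only constrains what comes out for a given input, not how the result is produced (the function name below is ours, not part of Marr's framework):

```python
def satisfies_io_spec(input_list, output_list):
    """Check whether an observed input/output pair matches the
    computational-level description of our sorting process:
    the output must contain the input's elements in ascending order."""
    return output_list == sorted(input_list)

# Any process whose observed behavior passes this check fits the
# computational-level description, regardless of its inner workings.
assert satisfies_io_spec([3, 1, 2], [1, 2, 3])
assert not satisfies_io_spec([3, 1, 2], [3, 1, 2])
```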
Note that the name "computational level" is standard but arguably misleading since "computation" doesn't generally imply omitting all algorithmic details. A more precise name would be "input/output level".
Algorithmic Level
The algorithmic level specifies the computational steps of a process. In our case, a simplified algorithmic description would be "search for the smallest element of the list, put it at the beginning, then repeat the process for the remaining list until it is sorted". A full algorithmic description is generally provided as code, such as the implementation in C given by Wikipedia.
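For concreteness, here is one possible algorithmic-level description written as a short Python sketch (not the C implementation referred to above); any faithful rendering of the steps just described would do equally well:

```python
def selection_sort(items):
    """Return a sorted copy of `items`, built by repeatedly selecting
    the smallest remaining element and swapping it to the front of
    the unsorted portion."""
    items = list(items)  # work on a copy
    for i in range(len(items)):
        # index of the smallest element among items[i:]
        smallest = min(range(i, len(items)), key=lambda j: items[j])
        # move it to position i
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```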
Given an algorithmic description, one can infer a complete description on the input/output level. In general, every description entails the more abstract ones but not vice versa, because abstraction is synonymous with a systematic omission of information. Consequently, any two adjacent levels stand in a one-to-many relationship: each description at the more abstract level is compatible with many descriptions at the less abstract level. In this case, selection sort is just one possible algorithmic description that fits the input/output behavior; most sorting algorithms would be equally valid, assuming their runtime can be adjusted. Similarly, many different descriptions at the implementation level correspond to selection sort at the algorithmic level.
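This many-to-one relationship can be made concrete with a small illustrative check: two quite different algorithmic descriptions, the selection sort sketched above and Python's built-in Timsort, are indistinguishable at the input/output level (runtime aside).

```python
import random

# selection_sort refers to the sketch above; sorted is Python's built-in
# Timsort, a very different algorithmic-level description.
algorithms = [selection_sort, sorted]

for _ in range(100):
    data = [random.randint(0, 99) for _ in range(random.randint(0, 20))]
    # Both algorithms yield the same output for every input, i.e., they
    # cannot be told apart at the input/output level.
    outputs = {tuple(algorithm(data)) for algorithm in algorithms}
    assert len(outputs) == 1
```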
Implementation Level
The implementation level includes a full physical description of a system. Such a description cannot be fully articulated, even for simple cases. In fact, since the description is not abstracted at all, the most efficient "description" on the implementation level would be the physical system itself.
For selection sort, such a system could be a physical computer on which selection sort is running, the brain of a person who is sorting a list (although people are naturally more likely to do something similar to insertion sort), or even a mechanical device that implements selection sort, which would be an example of nonstandard computation.
Limitations
The definition of the algorithmic description in Marr's Levels relies on a rhetorical sleight of hand. Since computation is frame-dependent, there is no fact of the matter as to what the "computational steps" of a process are. This problem is obscured by the fact that many systems, such as digital computers, are built around a fixed set of elementary computational mechanisms (e.g., the set of operations a processor can execute) and thus offer a natural algorithmic description.
In general, if one defines an algorithmic description as one that approximately determines the implementation-level description while being substantially simpler, algorithmic descriptions do not exist for most systems. This follows immediately from the fact that arbitrary messages cannot be compressed, an elementary result of information theory. Conversely, any physical description that can be (approximately) compressed must contain systematic redundancies that can be abstracted away. One such example is the abstraction of physical wire segments in digital computers (which are deliberately designed to take on one of two states) into logical bits. Even for programs running on such computers, however, the concept of an "algorithmic description" is not precise, as it can refer to programs in different languages (e.g., assembly code and Python).
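For reference, the counting argument behind this incompressibility claim is short; stated here for binary strings, though any finite description alphabet works the same way:

```latex
% Strings of length < n are strictly fewer than strings of length n:
\sum_{k=0}^{n-1} 2^k \;=\; 2^n - 1 \;<\; 2^n
```

Hence no lossless encoding can map every length-n description to a strictly shorter one, and a typical (incompressible) implementation-level description admits no substantially simpler algorithmic summary.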
Similarly, the input/output description is frame-dependent as well, since there is no ground truth as to what constitutes the input or output of a system. As before, this fact is obscured by the dominance of computer programs that have well-defined inputs and outputs by design. In general, the state of any part of a system at any point in time can be considered an output if information about that state is utilized by another system.
Application to Consciousness
Marr's Levels can be used to classify different views on consciousness. In particular, functionalism holds that consciousness lives on the algorithmic level. (The precise version of this statement is that the algorithmic level is the lowest level that suffices to determine the consciousness of a system in every case; the physical description is necessarily sufficient as well, since it entails the algorithmic description.) By contrast, QRI holds that consciousness lives on the implementation level. This view implies that consciousness cannot be coherently discussed in the language of algorithms or programming.
Due to the problems discussed in the previous section, the claim that consciousness lives on the algorithmic level is not generally well-defined. However, this ambiguity is inherent in the concept of functionalism – e.g., the Stanford Encyclopedia of Philosophy defines functionalism as "the doctrine that what makes something a mental state of a particular type does not depend on its internal constitution, but rather on the way it functions, or the role it plays, in the system of which it is a part".[3] Thus, defining functionalism in terms of Marr's Levels may be the conceptually cleanest option despite its limitations.
A concrete example of a theory that puts consciousness on the algorithmic level is Integrated Information Theory (IIT), which addresses the problem of different possible "scales" for an algorithmic description explicitly. IIT asserts that its formulas are applicable on every scale, but due to a rule called the Exclusion Postulate, only the scale for which integrated information is largest ultimately counts.[4] In extreme cases, this could mean that the individual components modeled as computational units are atoms or even quarks or electrons.
It is also possible to hold that consciousness lives on the input/output level. Under this view, a system consisting of only a lookup table can exhibit consciousness.
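To connect this back to the running example, the following sketch (our illustration, not a claim from the literature) shows a pure lookup-table system that matches selection sort at the input/output level on a tiny domain while performing no sorting steps at all.

```python
from itertools import permutations

# Precompute the sorted version of every ordering of a small fixed domain.
DOMAIN = (1, 2, 3, 4)
TABLE = {perm: tuple(sorted(perm)) for perm in permutations(DOMAIN)}

def lookup_sort(items):
    """Return the sorted version of `items` by table lookup alone."""
    return list(TABLE[tuple(items)])

# Indistinguishable from selection sort at the input/output level
# (on this domain), yet no sorting algorithm is ever executed.
print(lookup_sort([3, 1, 4, 2]))  # [1, 2, 3, 4]
```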
References
- ↑ Marr, D. (1982). Vision: A computational investigation into the human representation and processing of visual information. Henry Holt.
- ↑ Marr, D., & Poggio, T. (1976). From understanding computation to understanding neural circuitry (A.I. Memo No. AIM-357). Artificial Intelligence Laboratory, Massachusetts Institute of Technology. https://hdl.handle.net/1721.1/5782
- ↑ Levin, J. (2023). Functionalism. In E. N. Zalta & U. Nodelman (Eds.), The Stanford Encyclopedia of Philosophy (Summer 2023 Edition). https://plato.stanford.edu/archives/sum2023/entries/functionalism/
- ↑ Albantakis, L., Barbosa, L., Findlay, G., Grasso, M., Haun, A. M., et al. (2023). Integrated information theory (IIT) 4.0: Formulating the properties of phenomenal existence in physical terms. PLOS Computational Biology, 19(10), Article e1011465. pp. 9, 18-19. https://doi.org/10.1371/journal.pcbi.1011465