I'll be giving a talk for the upcoming 2023 online conference (November-December) of the International Society for the Philosophy of the Sciences of the Mind and am looking for constructive feedback in advance that will help improve the presentation. Pointers to similar approaches in the literature most appreciated!
The talk is based on this paper of the same title: https://psyarxiv.com/79xg8
- Tom Clark, firstname.lastname@example.org
Talk abstract: I start this talk with a widely accepted characterization of minds: as natural kinds, they are in the business of modeling the world outside the minded system, as well as the system’s internal states, in order to control behavior in service of its goals (Clark 2013, Williams 2017). Minds thus stand in a representational relation to reality. It’s also widely, though not universally, accepted (Dennett 2016, Frankish 2016) that some minds, including our own, are phenomenally conscious. We undergo experiences that have two primary features: 1) they have qualitative character, and 2) they are subjective, that is, they exist only for the conscious system, not as publicly observable events. In this talk I’ll suggest that phenomenal consciousness, understood as a species of representational content, can help explain both subjectivity and qualitativeness.
Regarding subjectivity, I will argue that as a general rule, representational content – e.g., concepts, propositions, numbers, and, crucially for this argument, sub-personal behavior-guiding content – is not available to intersubjective inspection in the way its physically instantiated representational vehicles are (Clark 2019, Pautz 2020); rather, it is inferred in explanations of behavior (Shea 2018, Piccinini 2022). Content, although real, is not produced by its representational vehicles as a physically observable phenomenon. This can help explain the subjectivity – the categorical privacy – of phenomenal consciousness: experience, construed as a species of representational content, is available only to the conscious system; my feeling of pain isn’t locatable in spacetime as a public object. It constitutes an untranscendable concrete representational reality for the conscious system: the terms in which the world, a represented reality, appears to it.
Regarding qualitativeness, I will suggest that content might necessarily assume a qualitative aspect for complex world-modeling systems like ourselves, since the system must block what would otherwise be a disabling representational regress. Metzinger calls this “autoepistemic closure” (Metzinger 2003, pp. 175-9). A basic experiential quality (e.g., red, pain, sweetness) is just that which admits of no further subjective decomposition into specifiable components, and thus presents itself as a cognitively impenetrable, unstructured building block of the system’s conscious world model. For systems such as ourselves to behave effectively in real time with limited metabolic resources, representational processing must operate over what I call epistemic primitives: content elements which are not themselves objects of further representation. Basic qualities can be construed as entailments of such primitives as manifested in phenomenal consciousness: the subjective aspect of representational content which functions as the regress-blocking epistemic ground floor of a self-in-the-world reality model. Epistemic primitives are not themselves objects of knowledge, but rather counters in the game of sensory representation that can’t themselves be made objects of play. This can help explain the subjective irreducibility and ineffability of the basic phenomenal qualities which characterize conscious sensory modalities.
Progress in understanding representational content from a naturalistic perspective – as, for example, in biologically grounded accounts of content (Shea 2018), the predictive processing paradigm (Williams 2017, Clark et al. 2019, Hohwy & Seth 2020), and higher-order theories of consciousness (Lau & Rosenthal 2011, Lau 2022) – will help flesh out these proposals. A mature science of representation may show why, as a matter of representational necessity rather than physical causation or emergence, systems like ourselves host conscious experience. Representational content as naturalized by cognitive science could thus be the key to explaining consciousness as both qualitative and subjective.
Keywords: phenomenal consciousness, content, representationalism, subjectivity, experiential qualities
Clark, A. (2013). Whatever next? Predictive brains, situated agents, and the future of cognitive science. Behavioral and Brain Sciences, 36(3), 1-24.
Clark, A., Friston, K., & Wilkinson, S. (2019). Bayesing qualia: Consciousness as inference, not raw datum. Journal of Consciousness Studies, 26(9-10), 19-33.
Clark, T. W. (2019). Locating consciousness: Why experience can’t be objectified. Journal of Consciousness Studies, 26(11-12), 60-85.
Dennett, D. (2016). Illusionism as the obvious default theory of consciousness. Journal of Consciousness Studies, 23(11-12), 65-72.
Frankish, K. (2016). Illusionism as a theory of consciousness. Journal of Consciousness Studies, 23(11-12), 11-39.
Hohwy, J., & Seth, A. (2020). Predictive processing as a systematic basis for identifying the neural correlates of consciousness. Philosophy and the Mind Sciences, 1(II), 3. https://doi.org/10.33735/phimisci.2020.II.64
Lau, H., & Rosenthal, D. (2011). Empirical support for higher-order theories of conscious awareness. Trends in Cognitive Sciences, 15(8). doi:10.1016/j.tics.2011.05.009
Lau, H. (2022). In Consciousness We Trust. Oxford: Oxford University Press.
Metzinger, T. (2003). Being No One: The Self-Model Theory of Subjectivity. Cambridge, MA: MIT Press.
Pautz, A. (2020). Representationalism about consciousness. In U. Kriegel (Ed.), The Oxford Handbook of the Philosophy of Consciousness. Oxford: Oxford University Press.
Piccinini, G. (2022). Situated neural representations: Solving the problems of content. Frontiers in Neurorobotics, 16:846979. doi:10.3389/fnbot.2022.846979
Shea, N. (2018). Representation in Cognitive Science. Oxford: Oxford University Press.
Williams, D. (2017). Predictive processing and the representation wars. Minds and Machines. doi:10.1007/s11023-017-9441-6