Tuomas E. Tahko
University Lecturer & Academy of Finland Research Fellow
University of Helsinki
Why do we need a ‘science of being’, and how is such a science
possible? Why cannot each special science, be it empirical or a priori, address its own
ontological questions on its own behalf, without recourse to any overarching ‘science
of being’? The short answer to this question is that reality is one and truth indivisible.E.J. Lowe, The Four-Category Ontology (2006, OUP), p. 4
Panel Discussion on the Freedom and Limits of Science
The Science Forum, University of Oulu, September 30, 2017.
Where Do You Get Your Protein? Or: Biochemical Realization
3rd Society for the Metaphysics of Science Conference, Fordham University, October 6, 2017.
Workshop on Metaphysical and Mathematical Explanation, University of Pavia, December 14-16, 2017.
Go to PhilPapers for a complete list.
Can the neo-Aristotelian uphold a pluralist substance ontology while taking seriously the recent arguments in favour of monism based on quantum holism and other arguments from quantum mechanics? In this article, Jonathan Schaffer’s priority monism will be the main target. It will be argued that the case from quantum mechanics in favour of priority monism does face some challenges. Moreover, if the neo-Aristotelian is willing to consider alternative ways to understand ‘substance’, there may yet be hope for a pluralist substance ontology. A speculative case for such an ontology will be constructed on the basis of primitive incompatibility.
In this chapter, a generic definition of fundamentality as an ontological minimality thesis is sought and its applicability examined. Most discussions of fundamentality are focused on a mereological understanding of the hierarchical structure of reality, which may be combined with an atomistic, object-oriented metaphysics. But recent work in structuralism, for instance, calls for an alternative understanding and it is not immediately clear that the conception of fundamentality at work in structuralism is commensurable with the mereological conception. However, it is proposed that once we understand fundamentality as an ontological minimality thesis, these two as well as further conceptions of fundamentality can all be treated on a par, including metaphysical infinitism of the ‘boring’ type, where the same structure repeats infinitely.
The epistemology of essence is a topic that has received relatively little attention, although there are signs that this is changing. The lack of literature engaging directly with the topic is probably partly due to the mystery surrounding the notion of essence itself, and partly due to the sheer difficulty of developing a plausible epistemology. The need for such an account is clear especially for those, like E.J. Lowe, who are committed to a broadly Aristotelian conception of essence, whereby essence plays an important theoretical role. In this chapter, our epistemic access to essence is examined in terms of the a posteriori vs. a priori distinction. The two main accounts to be contrasted are those of David S. Oderberg and E.J. Lowe.
The present paper discusses different approaches to metaphysics and defends a specific, non-deflationary approach that nevertheless qualifies as scientifically-grounded and, consequently, as acceptable from the naturalistic viewpoint. By critically assessing some recent work on science and metaphysics, we argue that such a sophisticated form of naturalism, which preserves the autonomy of metaphysics as an a priori enterprise yet pays due attention to the indications coming from our best science, is not only workable but recommended.
In this chapter, it is suggested that our epistemic access to metaphysical modality generally involves rationalist, a priori elements. However, these a priori elements are much more subtle than ‘traditional’ modal rationalism assumes. In fact, some might even question the ‘apriority’ of these elements, but I should stress that I consider a priori and a posteriori elements especially in our modal inquiry to be so deeply intertwined that it is not easy to tell them apart. Supposed metaphysically necessary identity statements involving natural kind terms are a good example: the fact that empirical input is crucial in establishing their necessity has clouded the role and content of the a priori input, as I have previously argued. For instance, the supposed metaphysically necessary identity statement involving water and its microstructure can only be established with the help of a controversial a priori principle concerning the determination of chemical properties by microstructure. The Kripke-Putnam framework of modal epistemology fails precisely because it is unclear whether the required a priori element is present. My positive proposal builds on E. J. Lowe’s work. Lowe holds that our knowledge of metaphysical modality is based on our knowledge of essence. Lowe’s account strives to offer a uniform picture of modal epistemology: essence is the basis of all our modal knowledge. This is the basis of Lowe’s modal rationalism. I believe that Lowe’s proposal is on the right lines in the case of abstract objects, but I doubt that it can be successfully applied to the case of natural kinds. Accordingly, the case of natural kinds will be my main focus and I will suggest that modal rationalism, at least as it is traditionally understood, falls short of explaining modal knowledge concerning natural kinds. Yet, I think that Lowe has identified something of crucial importance for modal epistemology, namely the essentialist, a priori elements present in our modal inquiry. The upshot is that rather than moving all the way from modal rationalism to modal empiricism, a type of hybrid approach, ‘empirically-informed modal rationalism’, can be developed.
A minimal truthmaker for a given proposition is the smallest portion of reality which makes this proposition true. Minimal truthmakers are frequently mentioned in the literature, but there has been no systematic account of what they are or of their importance. In this article we shall clarify the notion of a minimal truthmaker and argue that there is reason to think that at least some propositions have minimal truthmakers. We shall then argue that the notion can play a useful role in truthmaker theory, by helping to explain the truth of certain propositions as precisely as possible.
The title of this paper reflects the fact that truthmaking is quite frequently considered to be expressive of realism. What this means, exactly, will become clearer in the course of our discussion, but since we are interested in Armstrong’s work on truthmaking in particular, it is natural to start from a brief discussion of how truthmaking and realism appear to be associated in his work. In this paper, special attention is given to the supposed link between truthmaking and realism, but it is argued that this link should not be taken too seriously, as truthmaking turns out to be, to a large extent, ontologically neutral. Some consequences of this are studied.
Recent work on Natural Kind Essentialism has taken a deflationary turn. The assumptions about the grounds of essentialist truths concerning natural kinds familiar from the Kripke-Putnam framework are now considered questionable. The source of the problem, however, has not been sufficiently explicated. The paper focuses on the Twin Earth scenario, and it will be demonstrated that the essentialist principle at its core (which I call IDENT)—that necessarily, a sample of a chemical substance, A, is of the same kind as another sample, B, if and only if A and B have the same microstructure—must be re-evaluated. The Twin Earth scenario also assumes the falsity of another essentialist principle (which I call INST): necessarily, there is a 1:1 correlation between (all of) the chemical properties of a chemical substance and the microstructure of that substance. This assumption will be questioned, and it will be argued that, in fact, the best strategy for defending IDENT is to establish INST. The prospects for Natural Kind Essentialism and microstructural essentialism regarding chemical substances will be assessed with reference to recent work in the philosophy of chemistry. Finally, a weakened form of INST will be presented.
Three popular views regarding the modal status of the laws of nature are discussed: Humean Supervenience, nomic necessitation, and scientific/dispositional essentialism. These views are examined especially with regard to their take on the apparent modal force of laws and their ability to explain that modal force. It will be suggested that none of the three views, at least in their strongest form, can be maintained if some laws are metaphysically necessary, but others are metaphysically contingent. Some reasons for thinking that such variation in the modal status of laws exists will be presented with reference to physics. This drives us towards a fourth, hybrid view, according to which there are both necessary and contingent laws. The prospects for such a view are studied.
Ontological dependence is a relation—or, more accurately, a family of relations—between entities or beings. For there are various ways in which one being may be said to depend upon one or more other beings, in a sense of “depend” that is distinctly metaphysical in character and that may be contrasted, thus, with various causal senses of this word. More specifically, a being may be said to depend, in such a sense, upon one or more other beings for its existence or for its identity. Some varieties of ontological dependence may be analyzed in modal terms—that is, in terms of distinctly metaphysical notions of possibility and necessity—while others seem to demand an analysis in terms of the notion of essence. The latter varieties of ontological dependence may accordingly be called species of essential dependence. Notions of ontological dependence are frequently called upon by metaphysicians in their proposed analyses of other metaphysically important notions, such as the notion of substance.
In formal ontology, infinite regresses are generally considered a bad sign. One debate where such regresses come into play is the debate about fundamentality. Arguments in favour of some type of fundamentalism are many, but they generally share the idea that infinite chains of ontological dependence must be ruled out. Some motivations for this view are assessed in this article, with the conclusion that such infinite chains may not always be vicious. Indeed, there may even be room for a type of fundamentalism combined with infinite descent as long as this descent is “boring,” that is, the same structure repeats ad infinitum. A start is made in the article towards a systematic account of this type of infinite descent. The philosophical prospects and scientific tenability of the account are briefly evaluated using an example from physics.
The starting point of this paper concerns the apparent difference between what we might call absolute truth and truth in a model, following Donald Davidson. The notion of absolute truth is the one familiar from Tarski’s T-schema: ‘Snow is white’ is true if and only if snow is white. Instead of being a property of sentences as absolute truth appears to be, truth in a model, that is relative truth, is evaluated in terms of the relation between sentences and models. I wish to examine the apparent dual nature of logical truth (without dwelling on Davidson), and suggest that we are dealing with a distinction between a metaphysical and a linguistic interpretation of truth. I take my cue from John Etchemendy, who suggests that absolute truth could be considered as being equivalent to truth in the ‘right model’, i.e., the model that corresponds with the world. However, the notion of ‘model’ is not entirely appropriate here as it is closely associated with relative truth. Instead, I propose that the metaphysical interpretation of truth may be illustrated in modal terms, by metaphysical modality in particular. One of the tasks that I will undertake in this paper is to develop this modal interpretation, partly building on my previous work on the metaphysical interpretation of the law of non-contradiction (Tahko 2009). After an explication of the metaphysical interpretation of logical truth, a brief study of how this interpretation connects with some recent important themes in philosophical logic follows. In particular, I discuss logical pluralism and propose an understanding of pluralism from the point of view of the metaphysical interpretation.
Aristotle talks about ‘the first philosophy’ throughout the Metaphysics – and it is metaphysics that Aristotle considers to be the first philosophy – but he never makes it entirely clear what first philosophy consists of. What he does make clear is that the first philosophy is not to be understood as a collection of topics that should be studied in advance of any other topics. In fact, Aristotle seems to have thought that the topics of Metaphysics are to be studied after those in Physics. In what sense could metaphysics be the first philosophy in the context of contemporary metaphysics? This is the question examined in this essay. Contemporary topics such as fundamentality, grounding, and ontological dependence are considered as possible ways to understand the idea of first philosophy, but I will argue that the best way to understand it is in terms of essence.
One type of deflationism about metaphysical modality suggests that it can be analysed strictly in terms of linguistic or conceptual content and that there is nothing particularly metaphysical about modality. Scott Soames is explicitly opposed to this trend. However, a detailed study of Soames’s own account of modality reveals that it has striking similarities with the deflationary account. In this paper I will compare Soames’s account of a posteriori necessities concerning natural kinds with the deflationary one, specifically Alan Sidelle’s account, and suggest that Soames’s account is vulnerable to the deflationist’s critique. Furthermore, I conjecture that both the deflationary account and Soames’s account fail to fully explicate the metaphysical content of a posteriori necessities. Although I will focus on Soames, my argument may have more general implications towards the prospects of providing a meaning-based account of metaphysical modality.
It is argued that if we take grounding to be univocal, then there is a serious tension between truth-grounding and one commonly assumed structural principle for grounding, namely transitivity. The primary claim of the article is that truth-grounding cannot be transitive. Accordingly, it is either the case that grounding is not transitive or that truth-grounding is not grounding, or both.
This paper defends the idea that there must be some joints in reality, some correct way to classify or categorize it. This may seem obvious, but we will see that there are at least three conventionalist arguments against this idea, as well as philosophers who have found them convincing. The thrust of these arguments is that the manner in which we structure, divide or carve up the world is not grounded in any natural, genuine boundaries in the world. Ultimately they are supposed to pose a serious threat to realism. The first argument that will be examined concerns the claim that there are no natural boundaries in reality, the second one focuses on the basis of our classificatory schemes, which the conventionalist claims to be merely psychological, and the third considers the significance of our particular features in carving up the world, such as physical size and perceptual capabilities. The aim of this paper is to demonstrate that none of these objections succeed in undermining the existence of genuine joints in reality.
What is our epistemic access to metaphysical modality? Timothy Williamson suggests that the epistemology of counterfactuals will provide the answer. This paper challenges Williamson’s account and argues that certain elements of the epistemology of counterfactuals that he discusses, namely so-called background knowledge and constitutive facts, are already saturated with modal content which his account fails to explain. Williamson’s account will first be outlined and the role of background knowledge and constitutive facts analysed. Their key role is to restrict our imagination to rule out irrelevant counterfactual suppositions. However, background knowledge turns out to be problematic in cases where we are dealing with metaphysically possible counterfactual suppositions that violate the actual laws of physics. As we will see, unless Williamson assumes that background knowledge corresponds with the actual, true laws of physics and that these laws are metaphysically necessary, it will be difficult to address this problem. Furthermore, Williamson’s account fails to accommodate the distinction between conceivable yet metaphysically impossible scenarios, and conceivable and metaphysically possible scenarios. This is because background knowledge and constitutive facts are based strictly on our knowledge of the actual world. Williamson does attempt to address this concern with regard to metaphysical necessities – as they hold across all possible worlds – but we will see that even in this case the explanation is questionable. These problems, it will be suggested, cannot be addressed in a counterfactual account of the epistemology of modality. The paper finishes with an analysis of Williamson’s possible rejoinders and some discussion about the prospects of an alternative account of modal epistemology.
When I say that my conception of metaphysics is Aristotelian, or neo-Aristotelian, this may have more to do with Aristotle’s philosophical methodology than his metaphysics, but, as I see it, the core of this Aristotelian conception of metaphysics is the idea that metaphysics is the first philosophy. In what follows I will attempt to clarify what this conception of metaphysics amounts to in the context of some recent discussion on the methodology of metaphysics (e.g. Chalmers et al. (2009), Ladyman and Ross (2007)). There is a lot of hostility towards the Aristotelian conception of metaphysics in this literature: for instance, the majority of the contributors to the Metametaphysics volume assume a rather more deflationary, Quinean approach towards metaphysics. In the process of replying to the criticisms of Aristotelian metaphysics put forward in the recent literature, I will also identify some methodological points which deserve more attention and ought to be addressed in future research.
The priority monist holds that the cosmos is the only fundamental object, of which every other concrete object is a dependent part. One major argument against monism goes back to Russell, who claimed that pluralism is favoured by common sense. However, Jonathan Schaffer turns this argument on its head and uses it to defend priority monism. He suggests that common sense holds that the cosmos is a whole, of which ordinary physical objects are arbitrary portions, and that arbitrary portions depend for their existence on the existence of the whole. In this paper, we challenge Schaffer’s claim that the parts of the cosmos are all arbitrary portions. We suggest that there is a way of carving up the universe such that at least some of its parts are not arbitrary. We offer two arguments in support of this claim. First, we shall outline semantic reasons in its favour: in order to accept that empirical judgements are made true or false by the way the world is, one must accept that the cosmos includes parts whose existence is not arbitrary. Second, we offer an ontological argument: in order for macro-physical phenomena to exist, there must be some micro-physical order which they depend upon, and this order must itself be non-arbitrary. We conclude that Schaffer’s common sense argument for monism cannot be made to work.
The distinction between a priori and a posteriori knowledge has been the subject of an enormous amount of discussion, but the literature is biased against recognizing the intimate relationship between these forms of knowledge. For instance, it seems to be almost impossible to find a sample of pure a priori or a posteriori knowledge. In this paper, it will be suggested that distinguishing between a priori and a posteriori is more problematic than is often suggested, and that a priori and a posteriori resources are in fact used in parallel. We will define this relationship between a priori and a posteriori knowledge as the bootstrapping relationship. As we will see, this relationship gives us reasons to seek an altogether novel definition of a priori and a posteriori knowledge. Specifically, we will have to analyse the relationship between a priori knowledge and a priori reasoning, and it will be suggested that the latter serves as a more promising starting point for the analysis of apriority. We will also analyse a number of examples from the natural sciences and consider the role of a priori reasoning in these examples. The focus of this paper is the analysis of the concepts of a priori and a posteriori knowledge rather than the epistemic domain of a posteriori and a priori justification.
The goals of this paper are two-fold: I wish to clarify the Aristotelian conception of the law of non-contradiction as a metaphysical rather than a semantic or logical principle, and to defend the truth of the principle in this sense. First I will explain what it means for the law of non-contradiction to be a metaphysical principle. The core idea is that the law of non-contradiction is a general principle derived from how things are in the world. For example, there are certain constraints as to what kind of properties an object can have, and especially: some of these properties are mutually exclusive. Given this characterisation, I will advance to examine what kind of challenges the law of non-contradiction faces; the main opponent here is Graham Priest. I will consider these challenges and conclude that they do not threaten the truth of the law of non-contradiction understood as a metaphysical principle.
In this paper I offer a counterexample to the so-called vagueness argument against restricted composition. This will be done along the lines of a recent suggestion by Trenton Merricks, namely by challenging the claim that there cannot be a sharp cut-off point in a composition sequence. It will be suggested that causal powers which emerge when composition occurs can serve as an indicator of such sharp cut-off points. The main example will be the case of a heap. It seems that heaps might provide a very plausible counterexample to the vagueness argument if we accept the idea that four grains of sand is the least number required to compose a heap—the case has been supported by W. D. Hart. My purpose here is not to put forward a new theory of composition; I only wish to refute the vagueness argument and point out that we should be wary of arguments of its form.
This paper challenges the Kripkean interpretation of a posteriori necessities. It will be demonstrated, by an analysis of classic examples, that the modal content of supposed a posteriori necessities is more complicated than the Kripkean line suggests. We will see that further research is needed concerning the a priori principles underlying all a posteriori necessities. In the course of this analysis it will emerge that the modal content of a posteriori necessities can be best described in terms of a Finean conception of modality – by giving essences priority over modality. The upshot of this is that we might be able to establish the necessity of certain supposed a posteriori necessities by a priori means.
In this paper I will offer a novel understanding of a priori knowledge. My claim is that the sharp distinction that is usually made between a priori and a posteriori knowledge is groundless. It will be argued that a plausible understanding of a priori and a posteriori knowledge has to acknowledge that they are in a constant bootstrapping relationship. It is also crucial that we distinguish between a priori propositions that hold in the actual world and merely possible, non-actual a priori propositions, as we will see when considering cases like Euclidean geometry. Furthermore, contrary to what Kripke seems to suggest, a priori knowledge is intimately connected with metaphysical modality, indeed, grounded in it. The task of a priori reasoning, according to this account, is to delimit the space of metaphysically possible worlds in order for us to be able to determine what is actual.
Books & Volumes
- An Introduction to Metametaphysics. 2015, CUP.
- Contemporary Aristotelian Metaphysics. 2012, CUP.
- Aristotelian Metaphysics: Essence and Ground. Riin Sirkel & Tuomas E. Tahko (eds.) 2014. Studia Philosophica Estonica, Vol 7.2.
- Mahdollisuus (“Possibility”, in Finnish). Ilkka Niiniluoto, Tuomas Tahko & Teemu Toppinen (eds.). 2016. Helsinki: Philosophical Society of Finland.
Reviews & Commentaries
Book review of ‘More Kinds of Being: A Further Study of Individuation, Identity, and the Logic of Sortal Terms’. By E. J. Lowe.
Book review of ‘Tropes: Properties, Objects, and Mental Causation’ (2011, OUP). By Douglas Ehring.
Book review of ‘The Universe As We Find It’ (2012, OUP). By John Heil.
This is a critical commentary on Kathrin Koslicki’s book The Structure of Objects (OUP, 2008).