6

I'm sure my terminology is poor here (background in math more than philosophy), but are there any philosophers who have advanced a distinctly non-relativist epistemology without ultimately coming out foundationalist? I'm not talking about coherentism or something like that; rather I'm wondering if any philosophers have argued that we may be able (incidentally, as it were) to know some things absolutely without claiming that any particular beliefs are axiomatically known to everyone?

For example, let's say that person A has a belief or set of beliefs which, when understood in their entirety, are self-evident (perhaps "I think therefore I am"). Rather than being merely coherent with person A's other beliefs, this conclusion is taken to be true in an absolute sense. However, a person B (with respect to whose framework A's belief must also be considered true, since it is an absolute) might be unable to rationally conclude that A's belief or set of beliefs is true, not only with respect to A ("A thinks, therefore A is") but even with respect to herself ("B thinks, therefore B is"). In fact, given the right circumstances, it might even be inherently impossible for B to reach this conclusion. Similarly, B might be able to correctly reach absolute conclusions which A is unable to justify (even in regard to herself). Over the course of life, the absolute claims which are and are not justifiable may even change for A and B respectively. And no beliefs of any kind would be considered exempt from this possibility. Thus there exists Reality, an understanding of which is sometimes attainable, but there is no guarantee that any individual will be able to lay claim to a given part of it.

Is there any philosopher who would claim that this could be the case, and advances an argument in support of it? I hope I've explained well enough what I mean. Perhaps this could be called "incidental absolutism". Or is there a better phrase to describe a position like this?

eMansipater

2 Answers

3

I think the better phrase to describe a position like this would be "confused."

You say that person A's beliefs are "self-evident", and therefore can be "taken as true in an absolute sense."

Whatever epistemological warrant allows you to make the determination of what is or is not "self-evident" would then become foundational, so the resulting epistemology would in fact come out foundationalist.

Michael Dorfman
  • I don't think I explained myself well. What I mean is, what if there existed no general algorithm for determining something to be self-evident or not? So in this case, person A's beliefs happen to be self-evident (because they *do* think, and *are*), but in general it can't be guaranteed that the same thing will be self-evident for anyone else. This type of "self-evident" doesn't feel very foundationalist to me, since there's no guarantee of applicability for other frameworks or thinkers. And it could also change quickly (say, as dementia eliminated A's understanding of their own beliefs). – eMansipater Aug 03 '11 at 18:42
  • or their ability to spell. – eMansipater Aug 03 '11 at 19:09
  • You're still striving for something that doesn't exist. If there is a way-- general algorithm, intuition, or otherwise-- that we can designate some beliefs as "true in an absolute sense", that "absolute" leaks over to the epistemological warrant and makes it foundational. That's what foundationalism is. "Absolute truths", to be considered absolute, have to be founded on *something*. If they're not, they're not *absolute.* – Michael Dorfman Aug 03 '11 at 19:19
  • Ahh, I think you're confusing my description of the position with the position itself. In order to avoid a complicated reflexive phrasing, I described A and B's situation from an imaginary outside perspective. If this position were true, of course, you and I are in the same position as A and B. So we may or may not be able to designate the referred-to beliefs as absolute, depending on our knowledge and capabilities. If we *were* able to, then we could be certain of them. But for any specific belief we might lose that ability at any time. – eMansipater Aug 03 '11 at 20:13
  • So for any given person at any given time who happened to complete their understanding of a self-evident concept, they could be considered to have a foundationalist epistemology with their understanding of the self-evident concept as a basic belief. However, which basic beliefs any given person has, and whether they have any at all, might vary drastically according to circumstances beyond their control. And all of this is a description of a hypothetical situation that we might empirically find ourselves in, not a proof that it is the case. – eMansipater Aug 03 '11 at 20:20
  • And what, pray tell, would be the distinguishing characteristic between a relativist account and one which is "absolute only for a given person at a given time, and not absolute to anyone else"? – Michael Dorfman Aug 04 '11 at 08:00
  • You can at least lose the condescending tone until you actually understand my question. I've said nothing at all about certain things being absolute for a given person at a given time, but not absolute to anyone else. All absolutes are by definition absolute to everyone, but they might also be unprovable (a.k.a. unable to be recognised as absolutes) within another reference frame. Gödel showed us that there are infinitely many examples of this within any framework of sufficient complexity. Also, please don't use quotes for something I haven't actually said. – eMansipater Aug 04 '11 at 17:08
  • I was using scare quotes, to delimit the concept-- it wasn't my intention to put words in your mouth. And I certainly don't mean to be condescending, so let's start anew: what is it you'd like to achieve here? Personally, I'm more than happy with coherentism, but you seem to be aiming for some kind of "absolute" while remaining non-foundationalist. As I've already indicated, I don't think that's possible-- but even if it were, why is it desirable? – Michael Dorfman Aug 04 '11 at 19:51
  • Fair enough. For me, coherentism doesn't seem to reflect the weight of reality. The fact that I have the experiences that I have is beyond undeniable. The fact that within that experience I am utterly unable to do certain things like snap my fingers and be transported to the moon, or produce significant self-change without effort, has an undeniable gravity to it that feels as absolute as anything could be. Like Neo in the Matrix, I might be uncertain of the ultimate nature of my reality, but that I can be absolutely certain of some things seems trivially obvious, especially as a mathematician. – eMansipater Aug 05 '11 at 16:55
  • I'm also well-versed in intercultural communication, so I'm highly familiar with the fact that most people are blissfully unaware of their own mental frameworks; and that certain ways of approaching the world are genuinely mutually exclusive, yet internally coherent. So it seems equally obvious to me that choosing any one particular place as foundational is at best ethnocentric nonsense. – eMansipater Aug 05 '11 at 16:59
  • These two facts don't seem at odds to me though, and I've developed a personal philosophy that embraces them which I can articulate quite well to myself. I'm just searching for the academic vocabulary so that I will be able to both relate my views and their extensive lemmas to a broader audience, and subject them to a wider range of criticism than I've personally come up with. – eMansipater Aug 05 '11 at 17:02
  • As you've discovered, there is a broad area between foundationalism and relativism-- I think you'll find that most anti-foundationalists reject relativism. However, this usually also means giving up a claim to "absolute" knowledge-- I think you'll find that, as a mathematician, those things that you think are trivially obvious you probably aren't "absolutely certain" of in any epistemologically rigorous sense. (Personally, I'd suggest fictionalism in this regard.) – Michael Dorfman Aug 05 '11 at 19:30
  • Epistemologically rigorous enough that friends of mine who have their master's degrees in philosophy believed it worth pursuing--hence my search for related academic work. My basic approach is highly-detailed, fully defined, specifically qualified information-theoretic claims that ultimately obtain on demonstration (since the provability of a claim is taken to change dynamically). For example, the claim "You are reading this sentence" obtains when you actually *do* read the sentence, and not otherwise. This "passes off" the epistemic load to the cause of your experience, whatever it is. – eMansipater Aug 10 '11 at 15:32
  • Interesting. So how do I know if I am actually reading the sentence? What is the epistemic warrant? I imagine that you are not going to say that all perception is veridical-- the canonical counter-example in Indian philosophy is a coiled rope taken for a snake. – Michael Dorfman Aug 10 '11 at 19:56
  • The phrase "reading this sentence" is taken as a gloss for a much more complex particular thought, and for each person they must identify it specifically in order to obtain epistemic warrant for an absolute belief. Like the complexity of code required to make a simple image appear on your screen, this "fully specified thought" (with respect to the person's own understanding) may be incredibly complicated. But, my belief is that through technology we may be able to put simple user interfaces on top of this necessary complexity while keeping it precise. More at http://pastebin.com/cTWTTbSG :) – eMansipater Aug 11 '11 at 16:36
  • I wish you luck, but I don't think that information theory is going to help you with epistemology. Your statements in the pastebin still harbor all kinds of question-begging presuppositions, and I don't think any degree of specification is going to be able to eliminate those presuppositions. At the end of the day, you need a warrant, and either that warrant is going to be absolute (in which case you have a foundationalist epistemology) or not (in which case you have a non-foundationalist epistemology, and give up any hope of absolute truths). No degree of specification is going to help. – Michael Dorfman Aug 11 '11 at 20:21
  • I can hardly blame you if my hand-wavy explanation is unconvincing--it's more of a "point in that direction" than a case for it. The gist is that the information theory reduces claims to math, justification to proof, questions of certainty/absoluteness to probability analyses; and then replaces foundational claims/axioms with demonstration of capabilities. So the "warrant" ends up a lot like Turing Completeness--you can build it on top of pretty much anything, and when the right level of complexity is reached, the warrant just dynamically appears as a consequence of the claim--at "runtime". – eMansipater Aug 12 '11 at 20:35
  • But the point is not to argue the position, just to describe it vaguely and hopefully identify which philosophers take a similar approach. – eMansipater Aug 12 '11 at 20:37
  • Let me also try to use math as an analogy, in that case. If you want to build up from axioms, you're foundationalist-- those axioms are your foundation. If you want to avoid axioms, and say that statements are true if they don't contradict other statements known to be true, you're coherentist. Naturally, there are plenty of folks who take each approach. What you aren't going to find is someone who claims absolute knowledge without some foundation lurking behind the "absolute"; and that seems to be what you are looking for. – Michael Dorfman Aug 13 '11 at 12:33
  • I can see you're still not understanding it. If I were you I'd be quite tired by now, but since you keep replying I'll assume you're not bored to tears. – eMansipater Aug 18 '11 at 20:24
  • It's the _act of making or comprehending a claim itself_ that becomes the (very complex) axiom for that claim's absolute conclusion. It's only a flicker in time, because the conclusion only obtains as the ability of the thinker to understand the claim is demonstrated--not before, and not after. Just as in "You are reading this sentence". So different claims have different axioms. There (may be) no master schema tying all the claims together--the fact that the claim can be made is not "proved into existence". Rather the making of the claim, if it can be made, is merely an incidental capability. – eMansipater Aug 18 '11 at 20:27
  • Who can say why we might be able to make claims? Any justification for such a theory is well beyond _my_ capability. One would have to give a justification for why the universe exists, which seems quite epistemically hard. So instead we just have these islands of absolute that obtain at runtime. Within each claim you can be considered a foundationalist, but your foundations keep dynamically changing all the time. Doesn't seem to collapse into either camp very nicely as far as I can tell. – eMansipater Aug 18 '11 at 20:30
  • I'm not bored to tears, at all. I'm still finding your epistemology to reduce to a variant of coherentism, and not absolutist at all (despite your continuing to use that term.) Perhaps another way to view this would be through the lens of the Münchhausen Trilemma-- at the end of the day, every epistemology is going to be based either on a) axioms, b) circular reasoning, or c) an infinite regress. Which are you aiming for? – Michael Dorfman Aug 19 '11 at 07:00
  • Are you familiar with the style Gödel takes in proving his famous incompleteness theorems? He begins by devising a way to represent the statements of a formal system as natural numbers, then constructs statements about properties of those natural numbers which turn out to describe themselves under the equivalence thus set up. So in essence, the simple claims about numbers "climb all the way up" and out of the natural numbers in order to offer conclusions about any formal system capable of representing them. This broader implication can itself be proven as a simple property of natural numbers. – eMansipater Aug 22 '11 at 16:21
  • A simpler mental metaphor for the different "levels" going on here is that of virtual machine software in the computing world. We can think of Gödel as constructing a kind of "virtual machine" inside the natural numbers, and then using it to make an assertion about the capabilities of the ultimate "hardware" on which it's being run. Even though no virtual machine could ever know whether it is being run directly on hardware or simply under yet another virtual machine, we know that the hardware itself must be capable of at least what the virtual machine is, at whatever level it "bottoms out". – eMansipater Aug 22 '11 at 16:36
  • Similarly, I take the approach of constructing claims within a thinker's mind that "climb all the way up" and out of the mind to hold implications about whichever thing it is that underpins that mind's existence. Whereas many epistemological systems are thought of in terms of relationships between abstract concepts like axioms, propositions, and deductions, this approach allows us to split the flow of logical implication from the chain of causality. The former obtains self-referentially at a certain point, but in a certain way that "climbs all the way up" the latter, wherever it might lead. – eMansipater Aug 22 '11 at 16:44
  • So in terms of our dear Münchhausen, we might say that both axioms and an infinite regress are involved. The self-referentially obtained absolutes may be internally treated as axiomatic, but only because they obtain in a way that passes their epistemic warrant up the chain of virtual machines to the hardware, wherever it may be. Or perhaps there is only an infinite chain of virtual machines. Who knows? You don't have to figure this out in order to successfully make certain claims about that larger system, a.k.a. "reality", and be fully warranted in being absolutely certain of them. – eMansipater Aug 22 '11 at 16:48
  • Like Gödel's second incompleteness theorem, the proposed epistemology itself can be examined "within the virtual machine" and in fact must be because of its self-referential implications. When we do so we realise that this larger epistemological system has no grounds to claim absolute status. It can only be offered as a system that, like a scientific theory, might be a useful framework within which to reach conclusions. However, my empirical postulate is that such a system may suffice to address those questions of certainty which we actually have. If so, it would be quite useful. – eMansipater Aug 22 '11 at 16:54
  • An empirical reason that this might be the case is that our current scientific models of the physical universe seem to describe a countably-complex universe, and neuroscience is starting to offer quite firm support for the idea that physics suffices to describe the mind. Taking these two facts together, it seems quite likely that an information theoretic model of the type I propose could be sufficient to apply to all human thought. In so doing, it would not _end_ most of our important questions, but it would provide a framework within which different perspectives on them could be examined. – eMansipater Aug 22 '11 at 16:59
  • It still looks like coherentism to me; and it still looks like there is no epistemological reason to consider any findings to be "absolute." Personally, that suits me fine-- I don't have any problem with coherentism, and have no need for "absolute knowledge"-- but it seems to be important to you to avoid that problematic, which is unfortunate, in my opinion. "Self-referentially obtained absolutes that may be treated as axiomatic" certainly appears to be a variety of coherentism, as we are measuring the propositions against each other (and not against any external standard.) – Michael Dorfman Aug 22 '11 at 19:32
  • @eMansipater Please consider taking more extended discussions to chat -- comments are really supposed to be just for clarifying the contributed answer or question. Since none of this discussion was apparently clarifying enough to require updating the answer, I'm currently considering these 30 comments candidates for purging. (If it was indispensable, please consider contributing an edit that updates the answer to reflect the needed clarification.) – Joseph Weissman Sep 26 '11 at 20:02
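The Gödel-numbering construction discussed in the comments above can be made concrete with a toy sketch. This is the standard textbook simplification, not Gödel's exact coding, and the symbol codes are purely illustrative: a sequence of positive codes is packed into a single natural number as a product of prime powers, and recovered by factoring.

```python
def primes(n):
    """Return the first n primes by simple trial division (fine for tiny n)."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p != 0 for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_encode(codes):
    """Encode a sequence of positive symbol codes as 2^c1 * 3^c2 * 5^c3 * ..."""
    n = 1
    for p, c in zip(primes(len(codes)), codes):
        n *= p ** c
    return n

def godel_decode(n):
    """Recover the code sequence by dividing out each prime in turn.

    Assumes n was produced by godel_encode with all codes >= 1.
    """
    codes = []
    p = 2
    while n > 1:
        exp = 0
        while n % p == 0:
            n //= p
            exp += 1
        codes.append(exp)
        p += 1
        while any(p % q == 0 for q in range(2, int(p ** 0.5) + 1)):
            p += 1  # advance to the next prime
    return codes

# e.g. godel_encode([3, 1, 2]) == 2**3 * 3**1 * 5**2 == 600
```

Uniqueness of prime factorization makes the encoding reversible, which is the property Gödel exploits: claims about sequences of symbols become claims about natural numbers, which is what lets statements "climb all the way up" out of arithmetic in the sense described above.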
3

I'm not exactly sure how to classify Paul Churchland's views, but they seem to be something like what you're looking for.

There's a strategy which one can take to end up as a non-foundational non-relativist of a sort, though you might quibble with one (or both!) of the non-'s. You adopt a basically coherentist point of view, except you note that, conveniently enough, people all seem to end up bound by physical laws (i.e. that is what is coherent), and so you're not really relativist in any meaningful sense. Even though you formally allowed the possibility, in practice people agree that, for example, snow is white, and anyone who says otherwise (when looking at white snow) is wrong, not merely operating with a different set of mostly-self-consistent propositions. In my opinion, this is the approach that the neurophilosopher crowd takes (Dennett, Churchland, etc.), although I don't recall any of them spelling it out in exactly these terms.

(One could argue that they are foundational because they take as axioms something like the scientific method to learn about the world and do not question those; one could argue that they are relativists because if it so happened that there were ten different societies with radically different but equally predictive interpretations of the physical world then their approach would force them to accept all of them as "true".)

Rex Kerr
  • any idea of a source where he discusses this? I seem to be finding a lot of stuff on his eliminative materialism, but not much in the way of epistemology. – eMansipater Aug 04 '11 at 19:18
  • @eMansipater - No, sorry; I'm just inferring from the style of arguments he uses when talking about e.g. eliminative materialism. You could try reading the section on epistemology in Matter and Consciousness to infer what his views are w.r.t. your questions (I don't think he addresses it directly). – Rex Kerr Aug 04 '11 at 19:49
  • alright, I'll give it a shot. – eMansipater Aug 05 '11 at 17:04