Why don't we speak of negative numerals?
"Rationalism without empiricism is empty; empiricism without rationalism is blind."
Topics on this page ...
- Whole and Negative Numbers
- Children are encouraged, and like, to use their imaginations
- Equivalent-word definitions
- "Empiricism" and "Rationalism"
- For "What philosophy is?"
- Metaphors and Philosophical clarity
- Aeneas and the golden bough
- Unburied Souls
- What do we call academic 'scholarship'?
- Knowledge by Any Other Name
Background: the following are more, or less, rough draft logic of language remarks: what is the relationship between logic and meaning? Is meaning a matter of form or of use in the language? How is sense distinguished from linguistic nonsense in philosophy?
Whole and Negative Numbers
Note: words marked "Query" are from the access logs of this Web site.
Query: how many whole numbers are there?
This is not a closed class (as is e.g. the number of even numbers between 7 and 23): instead there is a rule that tells you how to go on indefinitely [n + 1, and there is too our complete confidence that it will always be clear to us how to follow that rule]. In saying this we have rejected the question (the grammatical model that it tries to impose in this case), or the assumption implicit in the question (that the class is closed), and we have given a meaning to the question. It is not the only possible meaning, but it does describe how we use the expression 'whole numbers'.
Worshiping negative numbers
Query: what are negative numbers?
The whole educational problem in four words. But don't ask what things (e.g. "negative numbers") are; ask for an explanation of the use of language: how do we use the combination of words 'negative numbers'? The origin of our philosophical muddles: "Words are the names of things"; and if that is so, then "What is the thing the name 'negative numbers' stands for, because it must stand for something if it isn't to be nonsense?"
As soon as you prescind [withdraw your attention] from the way we are taught in mathematics classes to actually use the expression 'negative number', then "what a negative number is" may present itself to you as something mysterious, as a shadow existence or "some such thing" (because, of course, you have no idea what).
Query: negative number as a philosophical illusion.
Is the query about whether or not negative numbers "really" exist? Is that why we don't speak of "negative numerals"? Someone finds it useful to indicate a deficit ("I am two dollars short of the amount I need if I am to buy bread") by placing a minus-sign in front of a number-sign (as e.g. "-2") -- and then someone else imagines that a new class of objects has been discovered.
[Worshipping what we have ourselves hypostatized. (Grammar -- grammatical analogies -- induced imaginings, like opium-induced dreams.) Numbers and geometric points and what can be reified can thence be deified. If the words 'negative number' are thought to be the name of a class of spirits, then those spirits can be worshipped as eternal truths. (The Pythagorean priesthood, and Pythian Apollo.)]
Query: do negative numbers exist?
Maybe Frege's "geometric heaven" (spirit realm of Platonic-like mathematical forms) casts a shadow, and in that shadow negative numbers live their super-ghostly (for there are naught but ghosts in Geometry's Heaven, which is itself a ghost) existence.
Query: spiritual significance of minus numbers.
Spiritual significance? Significance to what -- astrology? If so, I would think there would be temples dedicated to negative numbers, and they would be what landscape architects call "follies". But there would be nothing to "praise" (pace Erasmus). There would be deities made out of the negative numbers. For what man creates man may worship, even if it is a conceptual confusion created by his own mind. (How created? These are not necessarily the only possibilities: (1) by failing to understand the "logic of language"; (2) by failing to imagine or conceive e.g. an alternative way of looking at things, which is a failure of the grammatical (The word 'grammar' in Wittgenstein's jargon) imagination; (3) by failing to put inherited dogmas to the touch of dialectic; and so on.)
Query: games, activities for undefined terms.
Now that is cult behavior. Ritual worship by a culture that revels in the authority of tradition and rote learning, a culture that once asked me questions such as "What is a point in geometry?" but which, not receiving the answer handed down across the generations, turned away from asking me: cult takes the place of thought. (There are many word games that can be played with 'geometric point', many grammatical jokes that can be made.)
Query: why don't we use negative numbers in daily life?
But we do use negative numbers in daily life, e.g. when we make entries in our checkbook: Check #521 -47.00.
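That daily use can be put in code. The sketch below is only an illustration -- the entries and starting balance are hypothetical -- of how negative numbers record debits in a running check-register balance:

```python
# A toy check register: positive entries record credits (deposits),
# negative entries record debits. The entries are hypothetical.
entries = [
    ("Opening balance", 500.00),
    ("Check #521", -47.00),
    ("Deposit", 120.00),
]

balance = 0.0
for description, amount in entries:
    balance += amount
    print(f"{description:<16} {amount:>8.2f}   balance: {balance:>8.2f}")
```

Nothing mysterious happens here: the minus sign simply marks an amount to be taken away from the balance rather than added to it.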
Query: why negative numbers are not whole numbers?
Is that an empirical question, for that is what the word 'why' suggests -- a justification? But what is the justification for a classification scheme? Surely its usefulness for a particular purpose. But whose purpose? We ask for a definition -- a rule for using the combination of words 'whole number'; and someone had to make that definition; it was not given by nature. Or do we imagine that 'number' is the name of an object about which there are theories -- e.g. is negative-2 really a whole number or not?
Query: from rhyme to reason in mathematics.
If I know what I am talking about, and I may not, then: if the sum of the digits that together "name" a number is evenly divisible by 3, then the number named is itself evenly divisible by 3; e.g. the digits that name 492, namely 4, 9 and 2, sum to 4 + 9 + 2 = 15, and 15 is evenly divisible by 3, and "therefore" 492 is evenly divisible by three (= 164). That seems to me an example of "rhyme" rather than reason in maths, as there is no such pattern with for example 7 (9 + 8 = 17, and although 98 is evenly divisible by 7, 17 is not). [Actually, to me it seems that the case of evenly-divisible-by-3 is without rhyme or reason both, if 'reason' = 'planned'. If I know what I am talking about ...]
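The digit-sum "rhyme" is easy to check mechanically. A minimal sketch (the helper name `digit_sum` is mine, invented for this example):

```python
def digit_sum(n: int) -> int:
    """Sum the decimal digits that together "name" the number n."""
    return sum(int(d) for d in str(abs(n)))

# 4 + 9 + 2 = 15, and 15 is divisible by 3, so 492 should be too.
assert digit_sum(492) == 15 and 492 % 3 == 0

# The pattern holds throughout a range of numbers ...
assert all((n % 3 == 0) == (digit_sum(n) % 3 == 0) for n in range(1, 10_000))

# ... but there is no such pattern for 7: 98 is divisible by 7,
# while its digit sum, 9 + 8 = 17, is not.
assert 98 % 7 == 0 and digit_sum(98) % 7 != 0
```

That the check passes for every number in the range is, of course, only more "rhyme"; the "reason" would be a proof from the decimal notation (each power of 10 leaves remainder 1 when divided by 3, so a number and its digit sum leave the same remainder).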
But what is an example of reason? Maybe pointing out that zero is merely a place-holder; it is not a number (quantity), and that negative numbers are not "negative quantities" -- i.e. maybe conceptual clarification is "reason" in maths? Certainly the opposite of those claims -- as e.g. saying that a negative number is a "negative quantity", or that zero is a "quantity without quantity" -- are "rhyme", or more exactly "nonsense verse". [More examples of mathematical-grammatical myths.]
"You mean you ca'n't take less," said the Hatter: "it's very easy to take more than nothing." (Alice in Wonderland vii; in normal usage: if you add something to none, you are not adding more to none.)
Think too of this -- "you said I have no Flesh before I came away; therefore I cannot have less at Easter, for there cannot be less than none, except mathematically." (Russell, Arthur Stanton: a memoir (London, 1917), ch. i; Stanton wrote to his mother from Oxford in 1862)
The "cannot be" here is logical, not real, possibility. Or invent a description -- a use in the language -- for there being "less than none" that is not mathematical. (Such might be invented of course, as it is with "square circle", but that would only be an invention, not a "real" discovery.)
Negative Numerals, Words and "Word-als"
Note: the following continues the main discussion: Philosophy of Mathematics - What are Numbers?
"... so suggestible are we." 4 - 5 = -1 = "Four minus five equals negative one". And now a whole new realm of existents [non-existents] has been discovered: "negative numbers" -- a whole new realm of imaginary objects, of existents, of "abstractions". Human beings are most remarkable creatures.
"Negative numbers". -- This case is similar to "The meaning is the method of verification" (which has applications, although not universal application): if you do not know the calculus in which negative numbers have a role, you do not know the meaning of the word-combination 'negative number'. You must ask: what are negative numbers when they are at home? And that is answered by showing how they are derived, how we indicate a shortfall e.g. How were you taught the meaning [this technique, of using language]? how would you explain [teach] this technique to someone else? Nothing here involves pointing to otherworldly objects, shadows, deceased numbers.
Is it strange that one is less inclined to treat 'negative number' as the name of an object than one is to treat 'whole number' or 'geometric point' that way? "In the former case, only a child might be deceived." (Well I don't know about that.)
Query: are there negative numerals?
Maybe, e.g. '-6 ' is a negative numeral, as is ' - VI '. Yet another is 'negative six' [or, 'minus six']. By 'numeral' we simply mean a particular way [or, notation] of writing a number, and ' -6 ' or ' - VI ' or 'negative six' are all different ways of writing the number -6 (or, negative VI, or negative 6). On the other hand, see the next query.
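The point that a numeral is only a notation can be put in code: three different numerals, one and the same number. The two little parsers below are minimal sketches written for this example alone (they handle only what the example needs, e.g. only the Roman letters I, V, X and the word 'six'):

```python
ROMAN = {"I": 1, "V": 5, "X": 10}

def parse_roman(s: str) -> int:
    """Read a (possibly negative) Roman numeral, e.g. '-VI'."""
    sign = -1 if s.startswith("-") else 1
    total, prev = 0, 0
    for ch in reversed(s.lstrip("-")):
        value = ROMAN[ch]
        total += value if value >= prev else -value  # subtractive notation, e.g. IV
        prev = max(prev, value)
    return sign * total

WORDS = {"six": 6}

def parse_words(s: str) -> int:
    """Read a number-word numeral, e.g. 'negative six' or 'minus six'."""
    sign = -1 if s.startswith(("negative ", "minus ")) else 1
    return sign * WORDS[s.split()[-1]]

# Three notations -- three numerals -- all writing the same number:
assert int("-6") == parse_roman("-VI") == parse_words("negative six") == -6
```

The parsers differ; what they return does not. That is one way of picturing the grammatical point that '-6', '-VI' and 'negative six' are different ways of writing one number.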
Query: can numerals be negative?
What kind of possibility is being asked about here? Surely logical possibility, for either the combination of words 'negative numeral' is defined in mathematics or it is not; and there are no "real possibilities" in maths, for it is not an empirical science. Is " -9 " a negative number or a negative numeral? And then is " -IX " a negative number or a negative numeral? And "minus ['minus' = 'negative'] nine"? You could say: a 'numeral' is simply a notation for a number, and therefore although a number can be negative, a notation cannot be (i.e. the combinations of words 'negative notation' and 'positive notation' are undefined language in mathematics). What is the difference between saying that " -1 " is a numeral and that " -1 " is a negative numeral? There is none, unless you simply mean that in our notation ' - ' is used to indicate a negative, rather than a positive number (But that is only a rule for using the ' - ' sign in maths, which is not the philosophical [philosophy of mathematics] problem we are talking -- or seem to be talking -- about here, because our confusion is not about maths but about the grammar of our language).
If we did not speak of negative numerals, why would that be? Is it because 'number' is the name of something real [an object that really exists] whereas numerals are mere sounds or marks on paper? Which form of expression is more nearly correct -- 'negative numbers' or 'negative numerals'? (Word magic.)
Query: are negative numbers numerals?
The query may, I think, be asking if there really are negative numbers, or if negative numbers are really only numerals, as if to say: the number -2 is really only the numeral -2. What we might say is that -2 is no more than a notation for indicating a mathematical operation (rather like a roadway traffic sign), as for example 'take away 2' or 'minus two' or 'subtract 2'. That is, "-2 of something" does not indicate a positive existence (nor yet a negative existence either!) as "2 of something" does (or may do).
What can be said is that the query makes a category mistake: Asking if a number is a numeral is like asking if a verb is a noun. In any case, the reply to the query is that: We use the words 'number' and 'numeral' differently; those words are not interchangeable; and so: No, a negative number is not a numeral, not as we normally use the words 'number' and 'numeral' (Note that a positive number also is not a numeral; I say "positive" because in the notation of a system that uses negative numbers, all non-negative numbers might be written as, for example, +2).
Query: meaning of "abstract" in maths.
If you want to picture numbers as abstract or intangible (i.e. imperceptible) objects, you can, because that picture belongs to the Philosophy of Maths, which is the view from outside maths (it does not interfere in the calculus). Does it matter whether we talk about abstract objects or about rules of grammar in this context? Can we really talk about "seeing things aright" here? (Aporia)
What are numbers? What are words?
The question "What are numbers?" is the same as the question "What are words?" (The relation between grammar and sense and nonsense is shown by this: that a sign -- i.e. a sound, an ink mark -- as such is without meaning. What gives the sign meaning?)
But, someone says, look at the things that can be done with numbers! Look at the things that can be done with words: e.g. telling children a fairy tale full of things that never existed and events that never happened. Does that seem any less remarkable to you than mathematics? All signs are "mere words" (ink marks, sounds, the purely physical part of language): the question is: what gives signs their meaning? Not their being queer entities whose properties we discover, but the rules that have been invented for using them.
"What remarkable things numbers are!" Why are numbers any more remarkable than any other words? "Mere words" [combinations of signs] can be used to ruin a man's reputation (gossip, false testimony) or send a man to prison for twenty-five years -- or to the gallows. We see a hooded corpse twisting in the wind, suspended by a rope and noose, following a court's judgment: "Look at what can be done with this language-game!" Is that any less remarkable than the mathematical calculations behind the construction of a suspension bridge?
Query: philosophy, numbers existing before humans.
Did species of flora and fauna (plants and animals) exist before human beings did? According to our scientific pictures, they did. But did numbers exist before human beings invented mathematics (if that is what they did), and will they cease to exist when human beings cease to exist? That question might, of course, have many different senses. "Was counting possible before human beings counted anything?" It is indeed our view of things that it was. "Was there a creature who counted things before human beings did?" We have no reason to believe that there was. But, "Do numbers essentially belong to the human form of life, not being possible without it?" -- That is a philosophical question, if philosophy is a "speculative science".
Words and "wordals"
We do have the word 'nominals' already, but it does not have the effect of 'numerals' -- why? Perhaps because the English language is not taught in schools the way mathematics is. We seldom use the word 'nominal' there (and when we do it tends to be in the dismissive sense of 'trivial', 'unimportant', 'barely worth mentioning': 'in name only').
What is the difference between the case of words [nominals] and the case of numbers [numerals]?
In the case of numbers, there is nothing to point to, whereas in the case of names there is -- often -- something to point to. Or rather: although there is something in mathematics, e.g. 2 chalk marks on a blackboard, we say that "That is not really the number 2" (as Plato might have said that "That is not really a cow"). From which it follows by analogy that if there is nothing visible to point to, there must be something invisible to point to. However, in the case of words ... No, there is nothing like the word 'wordals' [Since changed for clarity to 'word-als'] that I invented (which, although it isn't Latin like 'numeral', is effective in its bluntness) many years ago.
Is there some other reason for our self-mystification about numbers and numerals but not also about words and word-als? But maybe there is self-mystification in the latter case, under other names: 'concrete' and 'abstract'.
Rather than "wordals", I could have used the word 'signals', from 'sign' ("sign-als").
Imagination and the Persistence of Textbook-Grammar
Note: this continues the discussion The False Grammatical Account.
We are encouraged to imagine souls in the objects that we do see, and we are taught to imagine objects that we never see.
Why do we encourage children to use their imaginations? We do not encourage adults to do this ("Don't be fanciful"). And it is an important very general fact of nature that children like to use their imaginations.
Human credulity: maybe it comes naturally to us, but it is also taught. -- Children are told: use your imagination; but there never comes a day when children are told to evaluate the truth or falsity of the things they have imagined, e.g. "historical" narratives (stories in historical-narrative form). Most adults are not sure whether the stories about King Arthur and the Round Table are legend (myth, folklore, fable) or history. We have the same blank uncertainty [We don't know what to say if someone asks us or if we ask ourselves] about stories from the Old Testament.
It might be very revealing for logic were we to imagine [invent a description of] a tribe of human beings that did not use counter-factual statements and who could not be taught to use them, a people who had no imagination.
Note: this continues the discussion Three Types of Definition.
It is a remarkable feature of our language that there are equivalent-word definitions, not only of natural language, but also of mathematics: that words can be defined using other words.
Find it remarkable that we can explain the meaning of words by using other words [synonyms]! of sounds by using other sounds! "Remarkable" -- i.e. do not take this very general fact of nature for granted, because it might not have been this way, because we can imagine a language in which explanations of meaning were only ever given by giving examples [i.e. by means of ostensive and "play-acted" definitions]. Why shouldn't definition-by-example be the only type of definition we had?
We are the inheritors of the ancient Greek logic of language, a logic which according to Aristotle was invented by Socrates. Was this logic imposed on our thinking as an unnatural requirement made on natural language? Wittgenstein certainly believed so, although he did not believe this imposition to be either the original or the only source of philosophical confusion.
We can describe a people who only gave explanations of meaning by giving examples (sample applications) and to whom the very idea of an "essential definition" was unknown.
Would this people still be "acquainted with the problems of philosophy"? The answer to this will depend on how broadly we define the word 'philosophy'. But maybe philosophy would be done in a different way from the way of Bertrand Russell and C.D. Broad: perhaps the people would very much more often ask the question: "What [exactly] do you mean [What examples do you have in mind] by saying [asking] that?" They would not delude themselves -- or be deluded -- by piling words on top of other words.
"Piling words on top of words on top of words." -- That is how philosophy looks to me. Perhaps the people would be more aggressive in their reading, in their listening, immediately halting every bit of nonsense -- i.e. undefined language. "It's a metaphor." -- "A metaphor for what? Say it in prose." Maybe they would be impatient, intolerant, even violent toward babblers.
Definition in Mathematics
"It is a remarkable feature of our language that there are equivalent-word definitions, not only of natural language, but also of mathematics." -- But is it a remarkable -- or an essential -- feature of mathematics?
Can we imagine mathematics without equality? Is equals (=) essential to what we mean by 'a calculus'? One can e.g. count without equality, but without it no "add and take away" is possible. It would seem disingenuous not to say here that equality = identity = definition. [Which is not to say that all equalities are definitions.]
But are equalities the only thing we call 'definitions' in mathematics? No, no more in arithmetic than in geometry. But here we must remember the "account books of mathematics": what is part of the calculus and what is external to it; a definition, an explanation of meaning (e.g. as given to a schoolchild), might be either. Look and see! Give examples, many examples. Don't try to guess (PI § 340), which is what philosophers by and large do.
"Empiricism" and "Rationalism"
Note: this begins the discussion of Wittgenstein's remarks about Empiricism versus Rationalism.
I believe, although I don't know, that what is called "Empiricism" is the view that all that can be known is what is known "through the senses, through sense perception", or, as I would say, through [i.e. by means of] observation. One problem with this idea [this conception] is that it is based on the notion that there are "theories of knowledge" of which "Empiricism" is but one.
But there is no theory here; instead there is a definition: This is how we define the word 'evidence' -- i.e. The way we learn to use the word 'evidence' is precisely by making observations: seeing things, hearing things. These are our grammatical models of knowledge -- i.e. the paradigms we follow to apply the word 'knowledge', 'to know' -- that we are taught as children. What we are taught is not a theory of knowledge [whatever that is when it's at home]; we are taught a definition -- i.e. a concept not a theory (Z § 223). Although these grammatical models are indeed part of what we call 'common sense' (sound judgment), they are not as G.E. Moore thought "common sense beliefs" [or "common sense knowledge" as in "A Defense of Common Sense"].
What we call 'knowledge' is "simply the human language-game" (OC § 554); it is simply the use of a tool [Think of the words 'I know' as a tool, an instrument that has some role in our life]: it is not a claim to possess something that goes beyond the limits of every imaginable ground for skepticism [even those posed by "Questions without Answers"]. This is what we call 'to know' -- that is all it is: a move in a [language-]game. A game that does not itself have a foundation.
Knowledge is a fluid concept. Some people are unwilling to apply the word 'know' to anything outside their own [personal/direct] experience. To anything else -- e.g. the testimony of scientists, historians, journalists -- they will only apply the word 'believe'. This is a definition (a limiting of the application/meaning) of a word, not a theory about some nebulous something or other called 'knowledge'. Some people's attitude toward what they are told is more skeptical than other people's, and this attitude is shown by [reflected in] their application of the word 'know'. And in various ways by how they act/live: the use of one word is interconnected to the uses of countless other words: the definition of individual words is far more complicated than the things we say in logic suggest.
"Empiricism" is conceptual confusion [as in confusing verbal and real definitions], the usual philosophical false step: making theories, positing ill-defined categories ["-isms" that create the illusion of knowing/understanding something]. But if we take away the grammatical models of seeing and hearing (and our memories of what we have seen and heard as well -- i.e. experience) -- saying skeptically that "This is not real knowledge" -- then we deprive the word 'knowledge' of any meaning. [Such skepticism reflects our perception of just how limited we are, e.g. by our five senses and the frequency with which we make mistakes and delude ourselves. This skepticism displays an attitude toward life, although it may be one of humility or of cynicism.] It is like Eddington saying: "The table we see isn't really solid" (But now what is the word 'solid' to mean?).
The two halves of knowledge seeking
On the other hand, we might regard "Empiricism" as a method of scientific investigation, one that contrasts for example with the "Rationalism" of Descartes. Descartes in his physics claimed to "reason to the truth" that There can be no extension that is extension of nothing. But then when Pascal's "Empiricism" demonstrates that there are vacuums in space, Descartes' "Rational" method may be regarded as less useful to physics. But Descartes' "Rationalism", according to the Empiricist view, has the same relation to reality as axiomatic geometry has: its application to what is observed to be the case has to be demonstrated (verified e.g. through experiments). This does not make axiomatic geometry useless as a method; instead it shows the limits of its usefulness.
To speak here of a "theory of knowledge" makes nothing clearer. It would be clearer to say that Rationalism is the method we use in inventing hypotheses to be tested, and Empiricism is the method we use to test our hypotheses. The question is simply how we actually use the word 'knowledge' in our life.
In other words, empiricism and rationalism are not competing "theories of knowledge". Instead, they are complementary methods, methods employed in the search for knowledge. (The rest is conceptual -- i.e. philosophical -- muddle.)
If we wanted we could adapt Kant's saying here: "Concepts [rationalism] without percepts [empiricism] are empty, and percepts [empiricism] without concepts [rationalism] are blind." Rationalism without empiricism is empty, and empiricism without rationalism is blind. (By 'concepts' we mean simply language, and by 'percepts' sense-impressions.)
[Philosophical ideas do not emerge from the stone like the "captives" of Michelangelo. Because the shape that emerges is not a shape that we foresaw when we began our work. Nor is it the "final" shape, because there is no final shape (See Preface). If I said that my ideas are always evolving, that would only mean that I'm never sure where they are going to end up [come to rest for the moment]. Revision [of ideas] is the condition of philosophy; how could it be otherwise?]
Francis Bacon (1561-1626)
Philosophical works of pure rationalism Francis Bacon called "Idols of the Theater" -- i.e. he likened them to the productions of a playwright who invents a world out of his own imagination rather than describing the actual world (An 'idol' is a false god [a false truth], a delusion). But he also criticized pure empiricism (i.e. the simple collection of facts about the actual world):
Those who have handled sciences have been either men of experiment [i.e. experience] or men of dogmas [i.e. rationalists]. The men of experiment [i.e. experience] are like the ant, they only collect and use; the reasoners [i.e. rationalists] resemble spiders, who make cobwebs out of their own substance ["who spin webs out of themselves", tr. Urbach, Gibson]. But the bee takes a middle course: it gathers its material from the flowers of the garden and of the field, but transforms and digests it by a power of its own.
Not unlike this is the true business of philosophy; for it neither relies solely or chiefly on the powers of the mind, nor does it take the matter which it gathers from natural history and mechanical experiments and lay it up in the memory whole, as it finds it, but lays it up in the understanding altered and digested. Therefore from a closer and purer league between these two faculties, the experimental [i.e. experience] and the rational (such as has never yet been made), much may be hoped. (New Organon, Aphorisms - Book One, XCV, tr. Spedding, Ellis, Heath)
That is what Drury says a scientific theory is: facts (data) plus imagination. Without facts it is Rationalism, but with facts only -- i.e. without imagination -- it is not science.
What Bacon describes is not ancient philosophy ("Learning") in its fullness, but only one of its branches: natural philosophy (what the Greeks had called "physics"). Bacon's world-view, like Isaac Newton's (1643-1727), was simply Christian; the rest of human knowledge he ascribed to divine revelation -- i.e. he rejected both natural theology (Aristotelian metaphysics) and philosophical ethics: mankind can know nothing about God except what God himself tells us, and right and wrong are also dictated by God. (Bacon saw the justification for natural philosophy in God's having granted dominion to Adam, a dominion lost that needed to be recovered.) As to the third branch of philosophy (logic or dialectic), Bacon rejected the Aristotelian syllogistic logic as a method for obtaining knowledge about the world; indeed, Bacon regarded the dominance of the ideas and methods of Aristotle and Plato over his contemporaries as the principal obstacles to the advancement of knowledge; he rejected "contemplative knowledge": philosophy should be practical knowledge (productive, serviceable to mankind). [This paragraph is mostly based on Fulton H. Anderson's Introduction (1960) to The New Organon.]
But if philosophy is neither rationalism nor empiricism nor a natural science, then what is it -- what is its subject matter? For Wittgenstein there remains only Logic ("logic of language" in my jargon), but others include Ethics, and others, like Schweitzer, include the search for a "true and serviceable" world-view, which may or may not include Metaphysics.
For "What philosophy is?"
Note: the following was written as comments to the topic Socratic ignorance.
It is not necessary to have read lots of philosophy to have read more than I have. Because in philosophy I have read only what appeared useful to my own interests (those being the logic of language [meaning, sense and nonsense], life and death, and truth and falsity and appearance [seeming, plausibility, opinion]).
What do you yourself want from philosophy? That is a question one must ask oneself. And do you only want someone else's views, or do you want to think about things for yourself? [There is a distinction between the history of philosophy and philosophizing.]
Why think about philosophy? What do I want from philosophy? I want to clear up the vagueness and confusion [and the only-apparent metaphors] by which I feel surrounded. In other words, I want to make things clear to myself [Wittgenstein's "Philosophy is the logical clarification of thought", but not in the TLP sense [4.112]; instead 'thought' = 'language' ["operating with signs" (BB p. 6)] and 'logical' = 'grammatical': concerning rules of sense and nonsense, i.e. linguistic conventions, definitions], even if only by discovering in just what way they are unclear. That is what I believe philosophy was for Socrates -- reason directed toward distinguishing what I know [and of course I cannot know nonsense] from what I only think I know -- and that is what it has always been for me.
Metaphors and Philosophical clarity
Always I have tried to avoid using metaphors. First, because metaphors say what something is like, not what it is [The logic of comparison]. And second, because metaphors very often cover up [are a veil for] ignorance: it is often impossible to translate this language into prose, and if it cannot be translated into prose then it is, by Wittgenstein's definition (i.e. grammatical remark), not a metaphor. There are many only-apparent metaphors [i.e. propositions that, although they appear to be metaphors, in reality are not] in common utterance, nonsense in masquerade (we utter a lot of nonsense) that philosophy needs to make patent (PI § 464).
The meaning of a metaphor [metaphorical language] is not context insensitive: like all other language, its meaning has to be deciphered in each particular instance of its use.
At school we are not taught always to ask: "What does that mean? What's that when it's at home?" Those questions, if we have criteria -- i.e. a logic of language -- by which to answer them, could unmask metaphors that are either nonsense or so ill-defined as to amount to nonsense.
Even God cannot understand nonsense ("mere sound without sense") -- i.e. where there is nothing to understand, nothing can be understood. Meaning can be given after the fact to any combination of words or sounds, but those words or sounds do not have meaning in themselves. Philosophers are like children who scribble some marks on paper and then ask the adult: What does this mean? (cf. CV MS 112 114: 27.10.1931). Do you think it must have a meaning (as if it were some inscrutable natural phenomenon: Let's see, what is this really?)? Logic is, in Wittgenstein's philosophy, only concerned with the conventional meaning of language.
"Do you think it must have a meaning?" The Italian equivalent of "What does it mean?" is an equally misleading form of expression, namely Cosa vuol dire? ("What does it want to say?") In itself language can't "mean" or "want to say" anything -- but only with language someone can "want to say" something, so that the question is "What do you mean by ...?" or "What does so-and-so mean by ...?"
No comparison has to be made. And from "A resembles B in some way or another", nothing follows with logical necessity. The adoption of a metaphor is entirely discretionary; it is entirely a matter of being persuaded to look at things in a particular way. Frazer's Golden Bough does not explain modern religious rituals by comparing them to ancient rituals and "finding" their origins there; in fact he finds nothing: he simply draws our attention to similarities, which may or may not persuade us to adopt his point of view; but his point of view is not a proof that B is the origin of A. Family resemblances are easily found among the faces of unrelated human beings. The uncritical are easily persuaded to belief-in; but anthropology must be something more than comparison-making, if it is to be a branch of knowledge rather than simply the substitution of one point of view for another.
Aeneas and the golden bough
Query: why Frazer named his book Golden Bough.
The first chapter begins with the words: "Who does not know Turner's picture of the Golden Bough?... In antiquity this sylvan landscape was the scene of a strange and recurring tragedy." Preface: "The primary aim of this book is to explain the remarkable rule which regulated the succession to the priesthood of Diana at Aricia" [southeast of Rome, on the banks of Lago di Nemi, which was called "the mirror of Diana" (Seyffert)] .... "a priest who bore the title of King of the Wood, and one of whose titles to office was the plucking of a bough -- the Golden Bough -- from a tree in the sacred grove.... According to the public opinion of the ancients the fateful branch was that Golden Bough which, at the Sibyl's bidding, Aeneas plucked before he essayed the perilous journey to the world of the dead." (Chapter One)
It was Frazer's view that, rather than tree worship, behind this practice was fear of the dead, which he believed to be a principal source of ancient religion. Of course there is no truth or falsity to these views, only plausibility. (James George Frazer, who was born in 1854, was a Fellow of Trinity College, Cambridge, 1879-1941.)
Comparisons that rob, or seek to rob, things of their uniqueness, I think, cheapen our life.
The souls of the unburied might not [i.e. were not allowed, for Charon would not ferry them] pass the river that encircles the kingdom of death, but must wander in desolation, with no abiding-place, no rest ever for their weariness [When Aeneas descends into the lower world, carrying the golden bough, he sees many of these souls (iv, 4, 2, p. 330 and 333)]. To bury the dead was a most sacred duty, not only to bury one's own, but any stranger one might come upon. (Hamilton, Mythology (1942), v, 2, p. 386)
But in the case of Antigone's rebellious brother, Polyneices, the Theban ruler Creon proclaimed it a crime to bury his body and said that anyone who did would be put to death. And so Creon condemns Antigone with the words, "You knew my law, and yet you broke it."
"Your law, but not the law of Justice who dwells with gods," Antigone said. "The unwritten laws of heaven are not of today or yesterday, but for all time." (ibid. 387)
The girl's story is taken from Sophocles' play, and it asks the eternal question of whether good and evil exist or are only man's laws "of today or yesterday", mere customs as the Sophists claimed, not qualities of reality.
What do we call academic 'scholarship'?
Note: this supplements the discussion of C.D. Broad and Wittgenstein.
"Review of the literature"
About John Maynard Keynes' A Treatise on Probability, which began as a prize fellowship dissertation for King's College:
Unlike Keynes's other work, it is notable ... for its extensive discussion of the literature of the subject. (John Maynard Keynes by D.E. Moggridge (1976), p. 15)
Instead of contenting myself with simply providing a solution, I took it upon myself to investigate and write the history of the problem. That I thrice attempted to pursue this laborious detour is the fault of Aristotle. How often have I cursed the hour when I first read the section of his Metaphysics [Book 1 (980a21-993a), according to A.E. Taylor] where he explores the problem of philosophy through a criticism of earlier philosophizing!... Since then I have experienced over and over again that compulsion to grasp the nature of the problem not only as it now stands but by tracing its evolution through history. (Albert Schweitzer, Out of My Life and Thought, tr. Lemke (1990), Chapter 12, p. 118-119)
... the description [in Civilization and Ethics] of European philosophy's tragic struggle to arrive at an ethical basis for acceptance of the world [as opposed e.g. to resignation], was forced on me. I felt an inner need to explore the historical development of the problem and offer my solution as a synthesis of all previous solutions. I have never regretted having succumbed to this temptation. In my attempt to understand the thought of others my own thought became clearer. (ibid. Chapter 14, p. 159-160)
That is what is called 'scholarship', I believe, and it is what is demanded at school rather than simply original-to-oneself thinking. And possibly for this reason: the intention of schooling is to give the student a background/foundation ("the received view") in the subject-matter. And so the student is commanded to write "reviews of the literature". Of course Wittgenstein never did this [It would be quite difficult, I think, to imagine Wittgenstein writing a "review of the literature", especially given that he regarded himself as the equal of his predecessors, for whom, at least at first reading, he had little regard (with the exception of Russell and Frege; see the Preface to the TLP)]: a good book by his lights was one that provoked you to close it and think about the question for yourself, which is what he hoped his own books would do ("stimulate others to thoughts of their own").
I think, however, that if you agree with your teacher of philosophy, then your teacher is no good for you. Because if you agree with what your teacher says, this only means that you have learned (more or less) to follow a particular line of thought; it does not mean that you have learned to thoroughly question that line of thought through rebellion against it (or, by trying to refute it). Your teacher is no good because his views have not forced you to think for yourself, to find another point of view (your own, for instance).
And this is why I wrote in the Preface that my Synopsis is not a work of scholarship: there is no "review of the literature". I hope this was simply because I found the secondary literature useless, much less clear than Wittgenstein's own writings -- i.e. either not written in "readable sentences" (Malcolm's expression) at all, or written in the old style of philosophizing in generalities with either no or very few examples.
The secondary literature [most of it the work of the professors Wittgenstein called "philosophical journalists"] was criticism of Wittgenstein's writing that failed to understand the nature of Wittgenstein's logic of language -- that it is a logic in definitions and metaphors (not e.g. philosophy as Russell conceived it, a collection of speculative theories allied to the sciences). This was demonstrated by that criticism's way of speaking e.g. of "Wittgenstein's theory of the language-game" and its wanting principally to determine (classify) which type of "-ist" Wittgenstein was; or the criticism was merely careless readings of Wittgenstein -- e.g. "reading the old man to sleep" as an example of 'use of language' = 'meaning of language' [Wittgenstein: the use of a word is often its meaning in the language (and vice versa)]. It was not what I wanted at all.
As presented by Moggridge, Keynes was a philosopher, meaning that he worked in the spirit of philosophy -- (constantly questioning and revising) -- not an ideologue. Indeed, ideologues attack him for "a tendency to change his mind", something which anyone who continues to grow in understanding must do; economics, according to Keynes, is as much an art as a science. He did change his mind to adjust his ideas to the facts of the economic situation, not the facts to his ideas. This was just as the philosopher Myson advised, circa 600 B.C.
[Aside. A student once wrote to me that he had found a quotation from Wittgenstein in a book he had read, and now he wanted to use that quotation in a paper he was writing. But the author of that book had not indicated the source of the quotation, and "obviously I would like to cite the original text". But that is not scholarly. If you find a quotation that is useful to you in someone else's work, then you attribute the quotation this way: "Wittgenstein, as quoted by so-and-so in such-and-such." That is not an endorsement of the other author's way of thinking; it is simply giving credit where credit is due, for that author was the one who found the quotation, not you -- and you should not pretend otherwise. That is scholarship; it is also intellectual honesty.]
Knowledge by Any Other Name
Note: this continues the discussion Induction and the Philosophy of Science.
Frank Ramsey's criticism of Keynes' probability: "we could accept the premises but reject the conclusion without contradicting ourselves"; in the case of induction the relation between premise and conclusion is not one of logical necessity. On the one hand, deduction, being tautology, is not knowledge; and on the other hand, induction, not being deduction, is not knowledge. So, then, is any conclusion knowledge? Obviously not. Only premises (facts, evidence) are known; induced conclusions are only hypotheses (theories or predictions based on a theory [generalization]). Now, is that the way we use the word 'knowledge'?
We have given definitions of 'knowledge', but nowhere stated the essential meaning of that word; it does not have one. (cf. Language of fairy tales)
"Yes, and no." We have described some of the ways that word is used. However, we have also done what Wittgenstein did in the case of the word 'meaning': we have also chosen the meaning of the word 'knowledge' that we want to use; we have limited the application of that word to a particular type of case. We know the evidence; we invent hypotheses (predictions), but we do not know them. That is the distinction we wish [have chosen] to make.
Concepts ... are the expression of our interest, and direct our interest. (PI § 570)
There is in some philosophers (and students of philosophy, me e.g.) a hostility toward knowledge -- i.e. to the claim that anything is known or even can be known. The claim that anyone knows anything -- No, we do not want that to be true: our minds are only at rest/peace after we have proved/demonstrated that no one knows -- because no one can know -- anything. This is a curious/strange attitude. In many cases, this attitude is provoked by the foolish/ignorant claims that scientists (or popularizers of science) and religious believers make. But it is more than that: "Scientists claim to know that? But I don't want anyone to know that!" However, this attitude is not the justification for our limiting our application of the word 'knowledge'; that justification goes back to Socrates: it is our task in philosophy to say no more than we know.
Back to Ramsey and Keynes. A peculiarity of the 1918 influenza pandemic:
... whereas influenza normally was a mild disease that killed only the very young and the very old, this influenza was most dangerous to people 21 to 29 years of age. This influenza took the strong and spared the weak.
Now, that would be knowledge: here are the statistics: total infected, age range, mortality: i.e. here is the evidence, the facts. And then you could say: as you see there is a very strong positive correlation between age group 21-29 and mortality. And that correlation can -- and would most naturally be -- called 'knowledge'; however, seeing a correlation, a pattern, is an act of deduction [i.e. of applying a definition], not of induction [i.e. collecting new evidence]; it is a tautology [The tautology is not that the correlation is positive (It might after all have been negative, which had historically been the case), but lies in the choice to look at (i.e. to select and arrange) the data in a particular way]. And it is not yet a theory: it does not say: And based on this correlation, we can predict that such-and-such will happen. And here we must choose whether we want to call this prediction 'knowledge' or not. Some scientists would be comfortable calling it so; but most philosophers would say, No, a prediction is based on evidence; it is not itself evidence -- and, therefore, it is not knowledge. This is a matter, as we see, of definition, not of facts (Both sides already know the facts). But, as I believe Wittgenstein showed, definitions are important to our life:
"What after all is so important about the sign 'knowledge', which is after all only sounds, ink marks?" The importance we attach to the word 'knowledge' is the importance we attach to the word 'truth'. (cf. "The Distinction between 'Evidence' and an 'Hypothesis'")
With respect to the type of arguments Keynes constructs in his account of probability, Ramsey wrote: "we could accept the premises and utterly reject the conclusions without any sort of inconsistency or contradiction." But unless Ramsey means the same thing by 'inconsistency' as by 'contradiction', could we "reject the conclusions without any sort of inconsistency"?
Contradiction is not the only god in logic, and furthermore it isn't even a god in logic, except in "formal/symbolic/mathematical logic" (something which has little to do with philosophy). Taking the example above of the mortality rates for people aged 21 to 29: is not a rejection of the conclusion that health officials should be prepared for a high proportion of influenza deaths to be in that age group inconsistent with the facts (the premises)? If the pattern is well-established, then would it not be inconsistent, although not a contradiction, to reject the conclusion (prediction) that the pattern will continue (everything else being constant, obviously)?
Is this not like "I know it [to be the case], but I don't believe it [to be the case]"? This is nonsense, not because it is a contradiction, but because the two statements conjoined with 'but' are inconsistent with one another. 'It is of the highest probability that this pattern will continue, but I do not believe that this pattern will continue.' -- This is not a contradiction, but is it self-consistent? 'It's probable, but I don't believe it.'
What interests philosophy here (or, what interests me here) is the logic (meaning, use) of our natural [ordinary, everyday] language [of our common grammar, definitions], not the rules of some invented calculus (mathematical logic e.g.). -- What do we mean by 'inconsistent' when we talk about probability, the likeliness that one thing will happen rather than another? It is not a formal contradiction to say that although there has been a constant pattern and -- although nothing has changed -- the pattern will not continue. However, is it consistent with the evidence to say that the pattern will not continue? Ramsey was concerned with mathematical implication, but I am not. What do we mean by 'consistent'?
If anyone said that information about the past could not convince him that something would happen in the future, I would not understand him.... If these are not grounds, then what are grounds?... For the question here is not one of an approximation to logical inference. (PI § 481)
This sort of statement about the past is simply what we call a ground for assuming this will happen in the future. (ibid. § 480)
What I want to do in this case is to "look on the language-game as the primary thing" (ibid. § 656) -- i.e. I only want to describe what we do, not what we in some sense "ought" to do. And I think that, as we ordinarily use the word 'inconsistency' when talking about probability, Ramsey was incorrect (again, that is, if he was not simply using that word as a synonym for 'formal contradiction').
"But an assumption is not knowledge." No, a prediction is made on the basis of knowledge, but a prediction is not itself knowledge. Or is it? As I wrote just above, here we must choose whether we want to call this prediction 'knowledge' or not. Which family resemblances do we wish to emphasize, or which differences do we wish to stress?
Obviously we are only saying that if this pattern holds, then ... It doesn't matter whether or not there's a formal contradiction. This is like, "I know it, but I don't believe it." I do not see why this should not be judged to be/classified as knowledge; it is not as if 'knowledge' had an essential meaning.
Recall that we can give any combination of words a meaning if we wish to use that combination of words -- e.g. contradictions in form, which we are normally disinclined to use. The proposition 'I know it, but I don't believe it' can be used to mean, in the context of "virtue is knowledge", that: I say I know, but I believe I know something better, which has the form 'p and not-p'. And that is why I do not do what I say I know to be right: because I believe that there is something, as it were, righter. This happens often in ethics; it is the consequence of not thinking things all the way through. So this is a use for the combination of words 'I know it, but I don't believe it'. Remember that in [Wittgenstein's] logic of language, meaning is a question of use rather than of form.
"This is what grounds look like here; in this particular case, this is what is meant by 'knowing' something." No, we do not know that it will happen; what we know is that if the pattern continues, then it will happen. But that is a tautology. And we do not call tautologies knowledge. "Let the proposition's truth conditions show you what is known, what it means 'to know' here."
If it is now asked: But how can previous experience be a ground for assuming that such-and-such will occur later on? -- the answer is: What general concept have we of grounds for this kind of assumption? This sort of statement about the past is simply what we call a ground ... (PI § 480)
I have not seen Frank Ramsey's words in the context in which he wrote them. I have, in any case, only used them in this spirit:
... reading philosophical books. If we took a book seriously, he would say, it ought to puzzle us so much that we would throw it across the room and think about the problem for ourselves.
That is what Karl Britton actually wrote, as alluded to above (the allusion that immediately follows it is to the Preface to the Philosophical Investigations), in "Portrait of a Philosopher" in The Listener (10 June 1955).
The most important point is this: if you are going to simply call induction "a useful mental habit", as Ramsey does, then the standards you are setting for logic are too distant from actual life (practice, i.e. the place/role of induction in our life) to be useful -- or even correct (in the sense of a true description of what we do). A very different account needs to be given.
And Ramsey may have gone on to give such an account; his criticism of Keynes's account was here only from the point of view of mathematical logic, I think.
Maybe the question to ask is, Which use of the word 'logic' is the most useful to us? But I have not thought any of this through.
Site copyright © September 1998. Please send corrections and criticism of this page to Robert [Wesley] Angelo. Last revised: 1 July 2013.