Mon Sep 10 13:46:45 PDT 2012



    Help with Proofs and Demonstrations

      Introduction

      My MATHS notation is designed to make the validation and proof of software a more practical process. The idea is that the proofs should be machine-checkable. However, the real work of constructing proofs is informal and happens before you start to write it out properly. This page is about the informal planning stage and offers some resources that I have found useful over the years, plus my own thoughts. Carrying out your plan involves more rigorous techniques and [ logic_25_Proofs.html ] proposes a way to use the ASCII code to encode rigorous algebraic and natural deduction proof methods.

      For Internet pointers to automated proof systems, see [ Automation ] below.

      There has been a lot of research that makes use of formal logic of one sort or another. To generate a quick list of pointers to citations see LOGIC

      For a discussion of some differences between the way people think and formal logic, see [ Three Kinds of Inference ] below.

      For more on how formal methods and creativity complement each other see [ Further Reading ] below.

      But Why Prove anything Anyway?

      I figure there are several main reasons:
      1. To convince unbelievers. Many proofs are needed to convince the readers of a paper or thesis that the writer knows what they are talking about. In other cases it is the writer that doesn't quite believe a result and constructs a proof to convince themselves.
      2. To provide an explanation for why a result is true. Proofs are constructed using assumptions and rules. These form an explanation of the result. You can then think about and challenge the assumptions. For example, to make a planned system safer, construct proofs of its safety, and then challenge the assumptions you used.
      3. To see where the result fits into the other facts. Proofs provide a structure to a collection of statements. The power of Euclid is the structure that he provided to geometry reasoning from postulates to theorems and constructions. The structure in turn provides a mnemonic framework to help you learn and recall the results.
      4. Working out a proof helps you memorize the result. This is one reason for all those exercises in Math, Physics, and Computer Science classes.
      5. Training and exercises. When proofs are assigned as class work or homework, they prepare the student for future presentations and publications.

      My proofs are too long -- am I stupid or something?

      No. The proofs that are published are the result of sometimes years of refinement, pruning, editing, and review. See [ 230 ] (Abstruse Goose) and [ 429.html ] (SpikedMath)

      Hints

        Analyze before you prove

        Follow Polya's structure:
        1. understand the problem;
          • What are you given?
          • Are you trying to prove something or to find something?
          • What is the goal?
          • How do the givens relate to the goal?
        2. Define a notation.
        3. Make a plan;
        4. Carry out the plan;
        5. Review the result.
        6. Write up result...

        [ ../samples/polya.html ]

        Proving both_and

        Typically you need to prove the parts first and then combine them using [ logic_25_Proofs.html#Valid Deductive Steps ]

        Combine what you know already

        Often you can combine two things that you know already into a more useful result. These steps are based on what are called Syllogisms by ancient logicians. Here is the classic example:
        1. All men are mortal.
        2. Socrates is a man.
        3. :. Socrates is mortal

        In formal logic:


        1. For all x, if man(x) then mortal(x).
        2. man(socrates)
        3. (-2)|-mortal(socrates)

        or in set theory
        1. Man ==> Mortal, socrates isa Man,
        2. (-2)|-socrates isa Mortal.

        By medieval times we had a gigantic list of these [ ../samples/syllogisms.html ] and algebra added more:


        1. x < y,
        2. y < z,
        3. :. x < z.
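        The set-theoretic version of the syllogism can be sketched in Python; the sets and membership below are invented for illustration:

```python
# Illustrative encoding of "Man ==> Mortal, socrates isa Man"
# using Python sets.  The populations are made up for the example.
Man = {"socrates", "plato"}
Mortal = Man | {"fido"}          # every man is mortal: Man is a subset of Mortal

assert Man <= Mortal             # premise 1: Man ==> Mortal
assert "socrates" in Man         # premise 2: socrates isa Man
assert "socrates" in Mortal      # conclusion: socrates isa Mortal
```

        The conclusion holds automatically: membership in a subset forces membership in the superset, which is the whole content of the syllogism.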

        But what to combine...?

        Proving universals

        If you want to show something that looks like this
      1. For all whosis, if something then whatever,
        introduce a new variable whosis, assume that something is true, and try to show whatever as a result.
        Let
        1. whosis, something
        2. <A miracle occurs>
        3. whatever

        (Close Let )
        See [ logic_25_Proofs.html#Block Structure ] for a way to control the assumptions and results.

        Proving equality and if_and_only_if

        Often results of the form this = that or this if and only if that use [ logic_25_Proofs.html#Algebra and Calculations ] to figure from this to that.

        Proving either_or

        Sometimes, when working with statements with special cases (P or Q), it's easiest to use [ logic_25_Proofs.html#Proof by Cases ] to try out each possibility in turn.

        Proving something exists

        Sometimes you can find a suitable example. More often you need to construct an example out of existing objects using the construction rules that are accepted in your discipline. For example, in Euclid, you used compasses and a straight edge. These days, we can use algebra -- or even a calculator. Indeed, in some disciplines, it is enough to show that an infinite series of approximations to a desired object gets as close as we would like to what we want. So proving something exists boils down to [ Finding something ]

        Sometimes we are forced to go a different route -- assume that an object does not exist and derive a contradiction. Notice that a whole branch of mathematics, intuitionism, rejects this approach and demands that if you want to claim that something exists, then you must give a method for constructing an example.

        Finding something

        One way of finding something is to show how to construct it from other given things. The terminology goes back to Euclid.

        In MATHS we might express a Euclidean construction like this

      2. For all x:T1, where P1, find a y such that P2.
        Let
        1. x :T1=given,
        2. (given)|-P1.
        3. y :T2=goal,
        4. (goal): P2.
        5. z:=....,
        6. z2:=....,
        7. ...
        8. y:=...
          ...|-(desired property): P2.

          Proof of desired property.

        9. ...

        (Close Let )

        Example 1 Euclid Book I Proposition 1

          The following is extracted, out of context, from a hypothetical translation of Euclid into MATHS. As a result some of the definitions and axioms are missing.

          Problem -- To draw an equilateral triangle on a given finite straight line
          Let

          1. AB:FiniteLine=given.
          2. ABC:Triangle=goal, equilateral(ABC).
          3. A:=start(AB).
          4. B:=end(AB).
          5. BCD:=circle(A, AB).
          6. ACE:=circle(B, AB).
          7. (ACE, BCD)|-some intersection(ACE, BCD),
          8. C:intersection(ACE, BCD).
          9. ABC:=triangle(A, B, C).
          10. (above)|- (p1): equilateral(ABC).

            Proof of p1


          11. (BCD, circle)|- (p11): length(AC)=length(AB).
          12. (ACE, circle)|- (p12): length(BC)=length(AB).
          13. (p11, p12)|- (p13): length(AC)=length(BC),
          14. (ABC, p11, p12, p13, equilateral)|-equilateral(ABC).

          (Close Let )

          For the real Euclid see [ propI1.html ]

        . . . . . . . . . ( end of section Example 1) <<Contents | End>>

        Algebra (using symbols for unknowns, and expressions to symbolize the construction of other things, ...) is a powerful tool for finding something that fits a property.

        Algebra picks up Pappus's idea of starting with what you want -- expressed as an unknown -- and then expressing the constraints on the unknown... in the hope of transforming these into a known object.

        Example 2

          For example consider:
        1. The product of four consecutive numbers is one less than a square.

          I did a lot of exploration which, when pruned of errors and dead ends, became:
          Net

          1. Let n be a number, then we are interested in this formula:
          2. n(n+1)(n+2)(n+3) + 1
          3. The obvious approach is to multiply this out and then recognise that the formula is the square of some obvious expression. But when I tried this, the obvious escaped me.

          4. If it was multiplied out, it would be a polynomial in n with leading and last coefficients equal to one:
          5. n^4 + ... + 1.
          6. So, if this is a square, it must have form:
          7. (n^2 + a*n + 1)^2, for some a.
          8. When n=1:
          9. (1^2 + a*1 + 1)^2 = 1*2*3*4 + 1,
          10. (a+2)^2 = 25,
          11. (a+2) = 5,
          12. a = 3.
          13. So we have found an a and so can now go to the rigorous proof of the simpler result
          14. n(n+1)(n+2)(n+3) + 1 = (n^2+3*n+1)^2.
          15. ...

          (End of Net)

          Notice that the algebra in the middle of the above is not a rigorous proof. It just discovers a formula that can then be proved using algebraic techniques -- ones where we can check every step by the rules.

          The proof goes like this... but you can fill in the blank:
          Let

          1. n:Int.
          2. n(n+1)(n+2)(n+3) + 1 = ... = ... = ...
          3. (n^2+3*n+1)^2 = ... = ... = ...
          4. (above)|-n(n+1)(n+2)(n+3) + 1 = (n^2+3*n+1)^2,
          5. (-1, m:=n^2+3*n+1)|-for some m, n(n+1)(n+2)(n+3) + 1 = m^2.

          (Close Let )
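          The identity found above can also be spot-checked mechanically before investing in the rigorous algebra. A brute-force check over a finite range is evidence, not a proof, but it catches slips cheaply:

```python
# Spot-check the identity n(n+1)(n+2)(n+3) + 1 = (n^2 + 3n + 1)^2
# discovered in Example 2.  Checking a finite range is not a proof;
# the algebraic proof above is still needed.
for n in range(-100, 101):
    lhs = n * (n + 1) * (n + 2) * (n + 3) + 1
    rhs = (n * n + 3 * n + 1) ** 2
    assert lhs == rhs, n
print("identity holds for n in -100..100")
```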

        . . . . . . . . . ( end of section Example 2) <<Contents | End>>

        Example 3 Construction as a series of definitions

          The following is out of context of the algebra of real numbers. It assumes links to the laws of algebra (which don't exist in this context).

          Notice that I discovered a given property (dnz) in the process of drafting the construction.
          Let

          1. a,b,c:Real=given.
          2. (given)|- (anz): a<>0.
          3. x:Real=goal, a x^2 + b x + c=0.
          4. d:=0.5/a.
          5. e:=b d.
          6. Δ:=e^2- 2 d c.
          7. (given)|- (dnz): Δ>=0.
          8. (dnz)|-
          9. s:=sqrt(Δ).
          10. x:=(-e + s).
          11. (anz, d, e, Δ, dnz, s, x)|- (qe): a x^2 + b x + c=0.

            Proof of qe

          12. a x^2 + b x + c
          13. (x)|-
          14. =a (-e + s)^2+ b (-e + s) +c
          15. (square)|-
          16. =a (e^2 - 2 e s + s^2)+ b (-e + s)+c
          17. (s)|-
          18. =a (e^2 - 2 e s + Δ)+ b (-e + s)+c
          19. (Δ)|-
          20. =a (2 e^2 - 2 e s - 2 d c)+b (-e + s)+c
          21. (distributive)|-
          22. =a 2 e^2 - 2 a e s - 2 a d c + b (-e + s)+c
          23. (distributive)|-
          24. =2 a e^2 - 2 a e s - 2 a d c -b e + b s+c
          25. (e, d)|-
          26. =2 a b^2 d^2 - b s - c -b b d + b s + c
          27. (commutative, zero)|-
          28. =2 a b^2 d^2 - b b d
          29. (power)|-
          30. =2 a b^2 d d - b^2 d
          31. (factor)|-
          32. =(2 a d -1) b^2 d
          33. (d)|-
          34. =0 b^2 d
          35. (simplify)|-
          36. =0.

          (Close Let )
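          As a sanity check, the construction can be traced numerically; the variable names follow the text (d, e, Δ as delta, s, x), and the coefficients a, b, c are arbitrary test values chosen to satisfy the givens anz and dnz:

```python
import math

# Numeric trace of the construction in Example 3.
a, b, c = 2.0, 5.0, 1.0     # test values with a <> 0 and delta >= 0
d = 0.5 / a                 # so 2*a*d = 1
e = b * d
delta = e * e - 2 * d * c   # the given 'dnz' requires delta >= 0
s = math.sqrt(delta)
x = -e + s                  # the constructed root

# the goal (qe): a x^2 + b x + c = 0, up to floating-point error
assert abs(a * x * x + b * x + c) < 1e-9
```

          Unlike the proof of qe, which is checked step by step by the algebraic laws, this only confirms the construction for one choice of givens.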

        Proving something doesn't exist

        About the only way to tackle this is to assume that the thing in question exists and show that if it did then impossible things would happen. Sometimes the end point is a simple contradiction. This is called reductio ad absurdum in Latin. [ logic_25_Proofs.html#Reducto ad Absurdum ]

        In other cases you show that if an A exists then some other object B can be constructed from A, and this object B we already know doesn't exist:
        Let

        1. Some A exists.
        2. Construct B from A.
        3. (-1)|-Some B exists.
        4. (previous)|-B does not exist.

        (Close Let )

      3. (-1)|-A can not exist.

        This kind of proof is popular in computer theory texts -- it is called reduction.

        Stuck?

        A weird but valid trick (when stuck) is to start by denying the truth that you want to establish and show that the result is absurd: [ logic_25_Proofs.html#Reducto ad Absurdum ] -- reductio ad absurdum is the Latin term for this...

        If you are really stuck for ideas, then try relaxing and playing with various possibilities: what if..., suppose..., perhaps... [ logic_25_Proofs.html#Trying Ideas Out ] This is a traditional technique for mathematicians with a problem to solve. Often, when you've worked on a problem long enough, the key idea will come to you "out of the blue" while getting onto a bus or shopping.

        Sometimes it helps to loosen up and do something silly. An example is DST [ The Dilbert Salary Theorem ] which was recently proved after years of diligent research.

        Another strategy is to browse the ready-made [ logic_25_Proofs.html#Proof Templates ] out there waiting to be used. Every field has its own proof methods. It is, therefore, well worth studying these in preparation for publishing or presenting a proof.

        Practice

        It pays to practice the techniques on lots of unimportant problems just to train the brain. Hence the many exercises in good math classes.

        I have found some lovely light-hearted examples in the Internet's Futility Closet, like this example [ http://www.futilitycloset.com/2012/06/27/readership/ ] of a simple syllogism and a classic informal but MATHS-style proof.

        Read

        Finally: There are many excellent books that can improve your proving and problem solving skills: [ Further Reading ] below.

      . . . . . . . . . ( end of section Hints) <<Contents | End>>

      Further Reading

        Polya -- How to Solve it


        (polya):

        Source: George Polya, How to Solve It: A New Aspect of Mathematical Method, Princeton University Press, Paperback 0-691-02356-5 (USA).

        This is worth reading just for the entry under the heading "Mathematics Professor, Traditional". The book lays out a program for solving problems and proving results. It has been known to increase grades a whole point. It balances rigor and creativity in an effective procedure. Polya suggests using four phases to solve problems:

        1. Understand the problem
        2. Work out a plan for a solution
        3. Carry out your plan
        4. Review the result

        In the first step you get a very clear idea of what you have been given and what your goal is, and invent some notation. In the planning phase, use plausible, analogical, and even lateral (debono) thinking. Carrying out the plan is made up of rigorous and formal steps. But he thinks we should aim for steps that are both intuitive and valid. In the last phase you look for ways to reuse the results and plans as well as improvements and variations to the solution.

        For more details on Polya's framework for problem solving see [ ../samples/polya.html ] which is my reduction/expansion/editing of his scheme.

        In the MATHS language the Net construct can be used for setting up a collection of givens and a goal and then working out a plan:

         		.Net
         		 name::type=given, meaning.
         		...
         		 (given)|- property.
         		...
         		 (goal): property.
         		...
         		.Close.Net

        Conceptual Block Busting

        Source: Wayne Wyckelgreen, Conceptual BlockBusting,??

        How and why we get stuck, and how to get unstuck. Full of practical ways of thinking around problems.

        Lateral Thinking


        (debono):

        Source: Edward De Bono. Just about everything Dr. De Bono has written is easy to understand. It de-mystifies how your brain works and how to use it. He shows how ideas can be created, modified, and put to use.

        See [ http://www.debono.ws ] as a starting point.

        Three Kinds of Inference

        Sadly the following book [SebeokEco83] is not easy to understand and sometimes tends to mystify you... but it does define the way we tend to think quite well while writing about Sherlock Holmes, Charles Peirce, and C. Auguste Dupin.

        According to [SebeokEco83] we can distinguish three kinds of inference that people, detectives, doctors, scientists, philosophers, etc. tend to make. Only the first of these is valid. But they are all commonly used. We start by labeling three statements:
        (Rule): All the beans from this bag are white.
        (Case): These beans are from this bag.
        (Result): These beans are white.

        Here are three types of inference, written in the MATHS format

        	(premises)|- conclusion.


        1. (Deduction): (Rule, Case)|-Result.
        2. (Induction): (Case, Result)|-Rule.
        3. (Abduction): (Rule, Result)|-Case.

        Which do you think is "common sense"? Which would you trust most? Which one do you find Sherlock Holmes using in just about every case?

        For a full discussion of Sherlock Holmes's logic please see [ http://philsci-archive.pitt.edu/5306/ ] Soshichi Uchii's paper "Sherlock Holmes on Reasoning" at [ http://philsci-archive.pitt.edu/ ] the University of Pittsburgh's PhilSci-Archive, or [ http://www1.kcn.ne.jp/~h-uchii/Holmes/ ] (1991) at Kyoto University.

        Abductive reasoning is proposed as a Sign of Progress (page 181) or Heuristic Syllogism (pages 186-190) by Polya:


        1. If we are approaching land, we often see birds.
        2. Now we see birds.
        3. Therefore, probably, we are approaching land.

        (Page 181).

        These arguments can be made rigorous by using the Bayesian model of probability. Here a value from 0..1 indicates the degree of belief we have in a proposition, and there are formulae for recalculating our beliefs as new data are collected. For more see [ math_81_Probabillity.html#Bayes ] my notes on how this works out using Bayes's Formula.
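        As a sketch of how the Bayesian reading works for Polya's birds-and-land example, here is a small calculation. The probability values below are invented for illustration; only the use of Bayes's formula comes from the text.

```python
# Bayesian reading of Polya's heuristic syllogism.
# The numbers are made up for the example.
p_land = 0.1            # prior degree of belief that we are approaching land
p_birds_if_land = 0.9   # "If we are approaching land, we often see birds"
p_birds_if_sea = 0.2    # birds are sometimes seen far from land too

# Bayes's formula: P(land | birds) = P(birds | land) P(land) / P(birds)
p_birds = p_birds_if_land * p_land + p_birds_if_sea * (1 - p_land)
p_land_given_birds = p_birds_if_land * p_land / p_birds

assert p_land_given_birds > p_land   # seeing birds raises our belief in land
print(round(p_land_given_birds, 4))  # prints 0.3333
```

        Seeing birds does not prove we are approaching land, but under these assumptions it raises the degree of belief from 0.1 to about 0.33, which is exactly the "probably" in Polya's conclusion.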

        Proofs and Refutations


        (Lakatos): Imre Lakatos (edited by John Worral and Elie Zahar), Proofs and Refutations, Cambridge University Press 1976

        This book is full of ideas. The author replays an advanced seminar in mathematics where the students duplicate the key points in the history of a particularly elegant theorem in the theory of polyhedra. He makes it clear that proofs are often refuted. Mathematics grows by a process of proof, refutation, and reproof. Indeed, the exception improves the rule more often than not.

        The author wanted to decrease the dogmatism of mathematics, and in particular the dogma that mathematical knowledge grows by deducing truth from known truths. He shows that actual mathematics develops in an illogical fashion because it grows by discovering the underlying assumptions as well as by working from these to prove conclusions.

        Proofs come after the thinking and experimentation

        p.88 of [Hogben37]
          There is always a danger of carrying this so far that we get lost in a maze of symbols. The mathematician then does what the sensible editor does. He brightens up the discussion by bringing in a cartoon. He goes back to his hieroglyphs, to geometry, whenever he can. To go back is not quite the right expression, because new kinds of hieroglyphs such as "graphs" have been devised (like business advertisements) to give a vivid picture of operations which could not be conveyed by the earlier kinds.[...]

          This tendency of number language to come back to the firm earth of figures and models, when doubts arise, has played a great part in building up the rules of what mathematicians call proof. In one sense proof has much the same relation to discovery as art criticism has to authorship. Mathematical truths are usually discovered by experimenting with individual cases first and "proved" afterwards. The proof is then an obituary notice on a praiseworthy statement. It exposes how it is connected with all the other mathematical rules we have learned to use. As we are always learning other rules, the methods first used to justify a rule in mathematics are nearly always held to be lacking in "rigour" by mathematicians who come later. This is how one of our most abstract mathematicians, Professor G. H. Hardy, speaks of one of the most abstract branches of mathematics: "The theory of numbers, more than any other branch of mathematics, began by being an experimental science. Its most famous theorems have all been conjectured, sometimes a hundred years or more before they are proved; and they have been suggested by the evidence of a mass of computations"


        Problems with Word Problems

        starting with p.304 of [Hogben37]
          What is called solving an equation is putting it in a form in which its meaning is obvious. The rules of algebra tell us how to do this. The really difficult step consists in translating our problem from the language of everyday life into the language of algebra. At this point the mathematician himself may go astray, because the mathematician may have less opportunity for understanding the problem as stated in the language of everyday life than the plain man who is not a mathematician. Once a problem has been translated into a mathematical sentence (or equation), we can safely trust the mathematician to do the rest. What remains is a problem in the peculiar grammar of mathematics. The danger lies in trusting the mathematician to do the translation.

        Proofs Without Words

        Mathematicians have been collecting these and the USA Mathematical Talent Search site [ http://www.usamts.org/ ] has a gallery [ G_Gallery.php ] of animated proofs.

        Somebody [click here [socket symbol] Visual Euclid if you can fill this hole] published a book of proofs from Euclid expressed using colored pictures. Some of these have been animated as Java Applets.

        The Logic Book

        BergmanMoorNelson90

        1. Merrie Bergman & James Moor & Jack Nelson
        2. The Logic Book (2nd edn)
        3. McGrawHill (1990) ISBN 0-07-909524-0 LC# BC135.B435
        4. =TEXT FORMAL LOGIC NATURAL DEDUCTION TREES SEMANTIC TABLEAUX

      . . . . . . . . . ( end of section Further Reading) <<Contents | End>>

      Motivation and Theory

        Arguments

        Arguments move from assumptions to conclusions. If an argument is valid (logical) then the truth of the assumptions forces the truth of the conclusions. However, most so-called arguments leave out many assumptions. Searching these out can help discover one that is not true and so lead to a discovery or a creative leap. This is helpful when you get stuck. Formalizing the arguments that got you stuck will often reveal an assumption that you don't want to make.

        Even more intriguing: if a valid argument leads to a known truth, it does not prove the truth of the assumptions. It can strengthen one's belief in the assumptions -- but this is only a matter of probability, not proof. An argument never proves anything positive about its assumptions. It can, however, show something positive about a conclusion.

        When a set of assumptions leads to a false conclusion then one of the assumptions (at least) is false. Thus arguments help us question our assumptions. Hence they lead us to revise them until we have a set that does not contradict reality. The revision process, by the way, is not a logical process, but a dialectical one -- the steps are error prone. Indeed we may need a way to record the counter-arguments or rebuttals that develop in a real argument.

        Quick Example

          As a quick example: if we assume (informally) the first three statements below then the final statement follows from the assumptions:
          Let
          1. All birds can fly,
          2. All penguins are birds,
          3. No penguin can fly.


          4. (above)|-No penguins.

          (Close Let )

          So when we meet a real penguin and it does not fly, we need to revise our assumptions. Perhaps penguins fly under water? Perhaps some birds are flightless? Perhaps we have mis-classified penguins as birds? Choosing the right one to change is not a logical process but you can probably do it intuitively... Or if we are scientific we spend more time studying birds: sparrows, ostriches, crows, kiwis, canaries, cassowaries, and so on. We may end up with birds divided into normal_birds and flightless_birds.
          Let

          1. All birds are normal_birds or flightless_birds,
          2. All normal_birds can fly,
          3. No flightless_birds can fly,
          4. (above)|-No normal_birds and flightless_birds,

          5. All penguins are birds,
          6. No penguin can fly.
          7. (above)|-All penguins are flightless_birds.

          (Close Let )
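          The revised assumptions can be modelled with sets, much as in set-theoretic syllogisms; the bird populations below are invented for illustration:

```python
# Set model of the revised bird assumptions (populations are made up).
birds = {"sparrow", "crow", "penguin", "ostrich"}
normal_birds = {"sparrow", "crow"}
flightless_birds = {"penguin", "ostrich"}
penguins = {"penguin"}
can_fly = {"sparrow", "crow"}

assert birds == normal_birds | flightless_birds   # every bird is one or the other
assert not (normal_birds & flightless_birds)      # no normal_birds are flightless_birds
assert normal_birds <= can_fly                    # all normal_birds can fly
assert not (flightless_birds & can_fly)           # no flightless_bird can fly
assert penguins <= birds and not (penguins & can_fly)
assert penguins <= flightless_birds               # the derived conclusion
```

          Unlike the original assumptions, this model has no contradiction to expose: the real penguin fits.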

          In everyday reasoning we are used to arguments that can be defeated by other arguments. Many of these are like the above and can be resolved by refining our assumptions. There has been research into the structures of these defeasible, dialectic, or plausible arguments: see [ChesnevarMaguitmanLoui00] for a history. Handling contradictory assumptions is also a key part of requirements analysis in developing software and there have been attempts to develop logics that help in this task. See [HunterNuseibeh97] [Nuseibehetal94] [Finkelsteinetal94] etc. for example. You can get a quick bibliography of recent work on consistency by using my search engine: [Cc]onsisten

          In MATHS we can document rebuttals simply:
          But

          1. Penguins can't fly.

          (Close But )

          Rebuttals should be put after an argument that is not a full proof, or after a statement. Several rebuttals may be put after a statement and they all attack that proof/statement.
          Let

          1. It is raining.
          2. If it rains I will get wet.
          3. |-I will get wet.

          (Close Let )

          But
          1. I carry an umbrella.
          2. |-I will not get wet.

          (Close But )

          To rebut a rebuttal, place it inside the first rebuttal.
          Let

          1. It is raining.
          2. If it rains I will get wet.
          3. |-I will get wet.

          (Close Let )

          But
          1. I carry an umbrella.
          2. |-I will not get wet.
            But
            1. The wind is blowing.
            2. |-I will get wet.

            (Close But )

          (Close But )

          In a sense the penguin problem arises because of the rarity of penguins and other flightless birds. If more birds did not fly we wouldn't have assumed that "all" fly. There have been attempts to develop reasoning based on statements like "All birds normally fly".

          In mathematics, something analogous happens. You will see statements like:

          • "In all but a finite number of cases,...."
          • "Everywhere, except on a set of measure zero, ..."
          • "For large enough values, ...."
          • "In the normal case,...."
          MATHS does define some quantifiers to handle these. [ Advanced%20Quantifiers in logic_11_Equality_etc ]

          Probability and statistics are the branches of mathematics that are designed to handle these problems. Statements like "90% of all birds can fly" allow us to reason about uncertain situations. But they are essentially more complicated than the propositions on this page. However, we can make a theory of probability using the simple logic discussed here. See [ math_81_Probabillity.html ]

        . . . . . . . . . ( end of section Quick Example) <<Contents | End>>

        Absurdity and Contradictions

        A logistic system can only be trusted if it does not lead to contradictory or absurd propositions. If the proof rules of MATHS are followed this should not occur. Natural deduction systems work by dividing up the current situation into special cases and showing that the majority of these (even all of them) are contradictory. If all are contradictory then the assumptions can be rejected. If some are open then the assumptions are taken to imply one or more of these conclusions.

        By Murphy's Law, it will happen that MATHS users will construct documentation that is inconsistent. The symptom of this is when a proposition and its negation are derived within a piece of documentation.

        There has been a lot of work done in the 1990s on inconsistent requirements for software. When talking to people about what they want software to do it is common to find large numbers of small and large contradictions. It is wise to do something about these when they are discovered.

        In any case a contradiction that occurs in MATHS is anything but a disaster! It is an opportunity to form a conclusion. It proves something. The correct thing to do is to close off the conclusion and the assumptions and declarations leading to the contradiction into a "set of documentation" that proves a result. Thus by simple editing the whole is protected from inconsistency. The result is a new theorem!

        A historical example of a project that failed through not doing this is that of the Pythagoreans, who failed as mathematicians when they killed the man who proved that the assumption that sqrt(2) was a ratio led to a contradiction. They could have proceeded by making this into a demonstration that there must exist numbers that are not ratios. Instead they preferred to support a mystical theory which they knew to be unsound. There are no Pythagoreans left today.

        Imre Lakatos studied the detailed behavior of mathematicians when faced with a proved theorem and a number of exceptions to it in the 19th century. From this he worked out an interesting heuristic (a guide for the inexpert). A valid proof is (by definition) one that, given true 'givens', guarantees the truth of the conclusion. Now most proofs are somewhat incomplete, but it is possible (by hard work) to construct such iron chains leading from assumptions to conclusions. In any case, Lakatos pointed out that this means that if the conclusion contradicts some other fact, you should examine what you have taken as given, and what you have assumed in the proof, to see which of these are also contradicted. The assumptions that conflict with the new data can now either be improved, or they may have to be incorporated into the statement of the theorem. In summary, a proof plus a refutation improves the theory. See [ Further Reading ]

        Absurdity happens! You will find your cherished and documented assumptions leading to conclusions that do not fit with observation. Again, this is an opportunity, not a problem! Science takes a leap forward whenever reality destroys a theory. First the data has to be checked, and then the theory may have to be modified.

        For example the first evidence for a hole in the ozone layer was rejected by NASA because the values couldn't happen - it had to be "experimental error". Astronomy is full of examples of absurd ideas being rejected that were accepted later. There is an early map of the stars that someone drew which shows that one star had been erased and put nearby - because a later observation showed it in a new position. The star was one of the then-unknown outer planets. Kepler wrote and crossed out "These points might as well fit an ellipse", and delayed his discovery of "Kepler's Laws" by several years. Because "action at a distance" was repugnant to Descartes' philosophy, his followers later rejected Newton's law of gravitation. For more on the history of astronomy see:

        Source: [Koestler59] Arthur Koestler, "The Sleepwalkers". Acknowledgment:

          I'd like to thank m.dolbear@lineone.net for advice on redrafting the above paragraph in September 2001.

        An engineer is in trouble when reality shows up a misunderstanding - because the system often becomes unsafe or ceases to work. The wise engineer takes time to scientifically examine and model the situation prior to searching out the problems and solving them. No work is started until a mathematical analysis proves that the system is safe (within a generous "safety factor"). The wise engineer also builds experimental prototype versions first. As an example, in the UK there were once two rival groups designing an airship. One was designed on an elegant mathematical model and known to be safe. The other was "hacked" together - despite evidence that it would fall to pieces. The properly engineered system flew; the other one was the R101, which crashed in flames - breaking up as predicted.

        Source: Nevil Shute, "Slide Rule".

        However never totally discard a theory - document the context in which it did/does work and file it under - "It seemed like a good idea at the time...".

      . . . . . . . . . ( end of section Motivation and Theory) <<Contents | End>>

      Automation

        Most interesting logics have no automatic decision procedure that can prove or disprove an arbitrary statement. However, there are many smaller logics for which automatic tools exist.

        For example, formulas in the propositional calculus can be proved by several algorithms:
        Table
        Truth Tables (Emil Post)
        Proof_by_cases+Indirect (Jensen)
        Tree structured analysis [ logic_27_Tableaux.html ]
        Boolean Algebra (Boole)

        (Close Table)
        Notice, however, that except in the simplest cases the problem is intractable. If we have n propositional variables, the time to prove or disprove a statement grows at least exponentially in n. The problem gets harder when we have predicates and equality to worry about: indeed, validity in the first-order predicate calculus has been shown to be undecidable! For practical evidence of the difficulty of proving some theorems, consider that some (Fermat's Last Theorem, for example) took over 350 years to prove. Sometimes a proof can only be discovered by somebody who is not searching for it, because the proof lies in a quite different field of study from the one in which the theorem first appeared!
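        As an illustrative sketch of the truth-table method, and of its exponential cost, here is a small brute-force tautology checker in Python. The function name and the encoding of a formula (a Python function over an assignment dictionary) are my own choices, not part of MATHS:

```python
from itertools import product

def is_tautology(formula, variables):
    """Check a propositional formula by brute force.

    `formula` maps a dict of truth values to a bool.
    The loop visits all 2**len(variables) assignments,
    which is exactly the exponential cost noted above.
    """
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if not formula(env):
            return False          # found a falsifying assignment
    return True                   # true under every assignment

# Modus ponens, (p and (p -> q)) -> q, is a tautology,
# writing (p -> q) as (not p or q):
mp = lambda e: (not (e["p"] and ((not e["p"]) or e["q"]))) or e["q"]
# ...whereas (p or q) is satisfiable but not a tautology.
```

        Ten variables already mean 1024 rows to check, and forty mean about a trillion, which is why this method does not scale.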

        Tools and notations can clear the logical underbrush and allow the necessary creative steps to be made.

        There are tools on the WWW. For example there is an automatic theorem prover for many non-classical propositional logics: [ LWBinfo.html ]

        I have developed a very simple Prolog program to evaluate, test and disprove boolean expressions: [ bool.plg ] and in some courses students are asked to write a program to generate semantic tableaux: [ project2.html ]
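        To give the flavour of the tableau method (without doing the project for you), here is a toy satisfiability checker in Python. The nested-tuple encoding of formulas, and all the names, are conventions I invented for this sketch:

```python
def satisfiable(branch):
    """Tiny semantic-tableau sketch.  Formulas are nested tuples:
    ("and", a, b), ("or", a, b), ("not", a), or a variable name (str).
    `branch` is a list of formulas that must all hold.
    Returns True iff the tableau has an open branch."""
    for i, f in enumerate(branch):
        if isinstance(f, str):
            continue                      # positive literal: nothing to expand
        rest = branch[:i] + branch[i + 1:]
        op = f[0]
        if op == "and":                   # alpha rule: extend the branch
            return satisfiable(rest + [f[1], f[2]])
        if op == "or":                    # beta rule: split into two branches
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if op == "not":
            g = f[1]
            if isinstance(g, str):
                continue                  # negated literal: nothing to expand
            if g[0] == "not":             # double negation
                return satisfiable(rest + [g[1]])
            if g[0] == "and":             # de Morgan: not(a and b)
                return satisfiable(rest + [("or", ("not", g[1]), ("not", g[2]))])
            if g[0] == "or":              # de Morgan: not(a or b)
                return satisfiable(rest + [("not", g[1]), ("not", g[2])])
    # Only literals remain: the branch is open unless it contains p and not p.
    pos = {f for f in branch if isinstance(f, str)}
    neg = {f[1] for f in branch if not isinstance(f, str)}
    return not (pos & neg)
```

        A formula F is then a theorem exactly when satisfiable([("not", F)]) is False - the tableau for its negation closes on every branch.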

        Full blown theorem provers tend to need human intervention and guidance. The record for humans trying to prove a theorem is roughly 350 years (Fermat's Last Theorem again).

        One popular tool for computer people is the Prototype Verification System:

      1. PVS::= See http://pvs.csl.sri.com/, at SRI.

        [click here [socket symbol] Links to your tools can be put here. if you can fill this hole]

      . . . . . . . . . ( end of section Automation) <<Contents | End>>

      The Dilbert Salary Theorem


      1. (physics, proverbs)|- (DST): Engineers and scientists can never earn as much as administrators and sales people.

        The following proof was supplied to Tom and Ray [ http://cartalk.cars.com/ ] by Kate Fowler of Denver, Colorado, as an informal proof [ 04.29.html ] in April 2000.

        Proof of DST


        Net

        1. (physics)|- (dst1): Power = Work / Time.
        2. (proverbs)|- (dst2): Knowledge = Power.
        3. (proverbs)|- (dst3): Time = Money.


        4. (dst1, dst2, Power => Knowledge)|-
        5. Knowledge = Work/Time,
        6. (dst3, Time => Money)|-
        7. iff Knowledge = Work/Money,
        8. ( * Money, / Knowledge )|-
        9. iff Money = Work/Knowledge.


        10. (-1)|- as Knowledge approaches zero, Money approaches infinity, as long as some Work is done.


        11. (-1)|- (Conclusion): the less you know, the more you make.

        (End of Net)
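        A numeric sketch of the last two steps, holding Work fixed (the variable names simply mirror the Net above):

```python
# With Work fixed, Money = Work / Knowledge: as Knowledge
# shrinks toward zero, Money grows without bound.
work = 100.0
for knowledge in (10.0, 1.0, 0.1, 0.001):
    money = work / knowledge
    print(f"knowledge={knowledge:g} -> money={money:g}")
```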

        Exercise on the DST

        So, why is the above argument invalid? And why is the conclusion still OK?

      . . . . . . . . . ( end of section The Dilbert Salary Theorem) <<Contents | End>>

      Glossary

      1. above::reason="I'm too lazy to work out which of the above statements I need here".
      2. given::reason="I've been told that...", used to describe the given part of a problem or a logical system.
      3. goal::theorem="The result I'm trying to prove right now".
      4. let::reason="For the sake of argument let...", many times a proof starts by assuming something called the hypothesis, [ logic_25_Proofs.html#Block Structure ]
      5. hyp::reason="I assumed this in the last Let".
      6. RAA::reason="Reductio Ad Absurdum".
      7. reason::="Something that often appears as the evidence input to a step in an argument."

      . . . . . . . . . ( end of section Glossary) <<Contents | End>>

      Comments and Reviews

        Open for business! Please tell me what you think of this page. Send Email to dick at Csci CSUSB edu.

      . . . . . . . . . ( end of section Comments and Reviews) <<Contents | End>>

      Glossary

    1. argument::=A sequence of steps that go from a set of assumptions(given) to a conclusion(goal).
    2. valid::=Something that can not produce a false conclusion from true givens. There are valid arguments, valid proofs, and valid steps.

    . . . . . . . . . ( end of section Help with Proofs and Demonstrations) <<Contents | End>>

    Notes on MATHS Notation

    Special characters are defined in [ intro_characters.html ] that also outlines the syntax of expressions and a document.

    Proofs follow a natural deduction style that starts with assumptions ("Let") and continues to a consequence ("Close Let"), then discards the assumptions and deduces a conclusion. Look here [ Block Structure in logic_25_Proofs ] for more on the structure and rules.

    The notation also allows you to create a new network of variables and constraints. A "Net" has a number of variables (including none) and a number of properties (including none) that connect variables. You can give them a name and then reuse them. The schema, formal system, or an elementary piece of documentation starts with "Net" and finishes "End of Net". For more, see [ notn_13_Docn_Syntax.html ] for these ways of defining and reusing pieces of logic and algebra in your documents. A quick example: a circle might be described by Net{radius:Positive Real, center:Point, area:=π*radius^2, ...}.
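    As a rough analogue in Python (my own rendering - a MATHS Net is a declarative network of constraints, so a class with a checked invariant and a derived attribute is only an approximation), the circle Net above might look like:

```python
import math
from dataclasses import dataclass

@dataclass
class Circle:
    """Approximates Net{radius:Positive Real, center:Point, area:=pi*radius^2}.

    `center` is simplified to an (x, y) tuple; the constraint radius > 0
    is checked at construction time, and `area` is derived, not stored.
    """
    radius: float
    center: tuple

    def __post_init__(self):
        if self.radius <= 0:
            raise ValueError("radius must be positive")

    @property
    def area(self) -> float:
        return math.pi * self.radius ** 2

c = Circle(radius=2.0, center=(0.0, 0.0))
```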

    For a complete listing of pages in this part of my site by topic see [ home.html ]

    Notes on the Underlying Logic of MATHS

    The notation used here is a formal language with syntax and a semantics described using traditional formal logic [ logic_0_Intro.html ] plus sets, functions, relations, and other mathematical extensions.

    For a more rigorous description of the standard notations see

  1. STANDARD::= See http://www.csci.csusb.edu/dick/maths/math_11_STANDARD.html

    Glossary

  2. above::reason="I'm too lazy to work out which of the above statements I need here", often the last 3 or 4 statements. The previous and previous-but-one statements are shown as (-1) and (-2).
  3. given::reason="I've been told that...", used to describe a problem.
  4. given::variable="I'll be given a value or object like this...", used to describe a problem.
  5. goal::theorem="The result I'm trying to prove right now".
  6. goal::variable="The value or object I'm trying to find or construct".
  7. let::reason="For the sake of argument let...", introduces a temporary hypothesis that survives until the end of the surrounding "Let...Close.Let" block or Case.
  8. hyp::reason="I assumed this in my last Let/Case/Po/...".
  9. QED::conclusion="Quite Easily Done" or "Quod Erat Demonstrandum", indicates that you have proved what you wanted to prove.
  10. QEF::conclusion="Quite Easily Faked", -- indicates that you have proved that the object you constructed fits the goal you were given.
  11. RAA::conclusion="Reductio Ad Absurdum". This allows you to discard the last assumption (let) that you introduced.

End