It is true that in ancient times mathematics was done through argumentation, discourse, and rhetoric. The thirteen books of Euclid's Elements contain no symbols at all for describing the behavior of quantities, apart from labels for the geometric objects. The familiar symbols of arithmetic (=, +, −, ×, ÷) were created only in the 15th through 17th centuries, far more recently than most people would believe. The signs + and − first appeared in print in 1489, the equality sign = in 1557, the multiplication sign × in 1631, and the division sign ÷ in 1659.
It is because of this lack of symbols that mathematics did not develop as fast as it has since symbols were introduced, making representation, the writing of expressions, and algebraic manipulation handy, enjoyable, and easy.
These developments cleared the way for mathematics to grow into the galaxy it is today. What is your take on this issue, and what is your expertise on the chronology of symbol creation and the advances mathematics made because of it?
http://www.theguardian.com/science/alexs-adventures-in-numberland/2014/may/21/notation-history-mathematical-symbols-joseph-mazur
I do not really know whether the creation of mathematical symbols has expedited the growth of mathematics. But the growth of mathematics has certainly been accompanied by a continuous introduction of new symbols. This is the story both in mathematics and in logic.
The creation of mathematical notation has been constant from the start of the modern age to the present. There was, indeed, a long period when that did not happen. Not by chance, that was also the time when mathematics sat somewhere in the background of human culture.
In this respect, mathematicians are truly like poets: they introduce and create new concepts, new words, new symbols, out of the need to explain new discoveries that they could not explain before the creation of the new notation and concepts.
This is a fantastic story of creativity.
@Dejenie, fine thread! Here is a working link to the article you mentioned in your question:
@Carlos, you have made a very nice response. Yes, "This is a fantastic story of creativity."
http://www.theguardian.com/science/alexs-adventures-in-numberland/2014/may/21/notation-history-mathematical-symbols-joseph-mazur
In AD 976 the Persian encyclopedist Muhammad ibn Ahmad al-Khwarizmi suggested use of a circle to represent sifr or zero. So I think that mathematical symbols are older than you say. I think you probably mean symbols for mathematical FUNCTIONS, correct?
I had not known about this book! I am going to get it; it is a valuable resource!
http://press.princeton.edu/titles/10204.html
Yes dear @Nelson, Muhammad ibn Ahmad al-Khwarizmi is mentioned in WIKI!
http://en.wikipedia.org/wiki/0_(number)
There is also this fine book, a classic one:
Cajori, A History of Mathematical Notations:
http://img1.imagesbn.com/p/9780486677668_p0_v1_s260x420.jpg
It goes without saying that symbols and notation play a vital role in the development of mathematics. Sometimes a good notation is half the solution of a problem.
A standard source on mathematical notations is: Florian Cajori (1929) A History of Mathematical Notations, 2 vols. Dover reprint in 1 vol., 1993. ISBN 0-486-67766-4.
See also: History of mathematical notation
http://en.wikipedia.org/wiki/History_of_mathematical_notation
and
Table of mathematical symbols by introduction date
http://en.wikipedia.org/wiki/Table_of_mathematical_symbols_by_introduction_date
Dear Dejenie, this is a very fundamental issue in mathematics and science. The creation of mathematical symbols surely did expedite the growth of mathematics. In the following video, Terry Moore answers the question of the origin of one of the most familiar symbols used in mathematics, the x:
https://www.ted.com/talks/terry_moore_why_is_x_the_unknown
We should note that although symbols play an important role in mathematical research, in mathematical education they create a severe problem. Once we get used to using symbols, we forget that symbols do not mean anything to students. We should explain what each symbol means. There are some books dealing with such problems:
- David Pimm, Symbols and Meanings in School Mathematics
- Paul Cobb et al., Symbolizing and Communicating in Mathematics Classrooms
Dear @Costas, besides the fact that "in mathematical education, math symbols create a severe problem," they also play a role in developing critical thinking.
However, I use this occasion to warn PhD students and younger researchers not to go overboard introducing new notation and new terminology in their work. It is quite a popular tendency to overuse new notation and terminology, but this makes the work difficult to read and to understand. Please trust the existing notation and terminology and explain what is to be understood using them. Finding the best explanation (the best definition) is much better intellectual work than introducing new terminology, and it helps both author and reader, to say nothing of the overall quality of the work.
This discussion is very old in mathematics and much has been written about it. Since I haven't seen Leibniz mentioned yet, let me point out he was one of the "master builders" of notation. Some additional really interesting articles on this topic:
F Cajori (1925) Leibniz: the master-builder of mathematical notations, ISIS 7(3), 482-514.
Lewis and Langford (1956) History of symbolic logic, In "The World of Mathematics" Ed. JR Newman, vol 3, 1859-1877.
E Nagel (1956) Symbolic notation, Haddock's eyes and the dog-walking ordinance, In "The World of Mathematics" Ed. JR Newman, vol 3, 1878-1900.
To add on Leibniz: his calculus notation D^n(f) is generally more common than Newton's over-dots, and it led to a ready interpretation of fractional or negative orders of differentiation. This directly contributed to the emergence and development of abstract algebra.
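To illustrate that "ready interpretation" (a sketch of my own, not part of the post above): writing repeated differentiation with an exponent invites asking what a fractional exponent could mean. Applying the power rule with factorials promoted to Gamma functions gives the classical Riemann–Liouville answer:

```latex
% Repeated differentiation of a power: D^n x^k = k!/(k-n)! * x^{k-n}.
% Replacing factorials by Gamma functions lets the order n be fractional:
\[
  D^{n} x^{k} \;=\; \frac{\Gamma(k+1)}{\Gamma(k-n+1)}\, x^{\,k-n},
  \qquad\text{so, e.g.,}\qquad
  D^{1/2}\, x \;=\; \frac{\Gamma(2)}{\Gamma(3/2)}\, x^{1/2}
             \;=\; 2\sqrt{\frac{x}{\pi}} .
\]
```

Nothing in the dot notation suggests such a question; the exponent-like notation practically asks it.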
It is nice to see how both the syntax and the semantics of symbols change through the centuries. Very often the syntax (the form of the symbol) has a hieroglyphic evolution: images of hands and fingers became, over some hundreds of years, the Arabic digits. The same happened with semantics. The letter S, meant at the beginning to symbolize a sum, modified a little and converged to the integral sign. Later this symbol came to mean various other things, from antiderivatives to distributions. The inverted T means "orthogonal" in geometry and "falsum" in logic, and those meanings are not completely contradictory. Even complete formulae, like E = mc^2, came to have parallel interpretations and meanings, or even meta-senses, depending on the cultural and social environment where they are used.
Dear @Michael, it is a good example that without proper notation, equations become chaotic and unreadable.
I think higher mathematicians seem to assume that the symbols are the math.
I am (among other things) a math tutor to kids 6-18. Symbols confuse them. Symbols make math seem like a foreign language. So much of my job is showing them that math isn't so much a foreign language as a shorthand for writing English, one that lets us work with the numbers better than we can with full English sentences.
"Sam has three cookies more than Jim has."
Simply writing ( s = j + 3 ) doesn't enlighten these kids. But putting more words and fewer symbols in does.
sam's cookies (are the same number as) (jim's cookies) plus (three more cookies)
THEN, and only then, comes the algebra, with single-letter variables.
s = j + 3
If we're then told "Sam has 4 cookies" we can write "sam's cookies = 4" and sub into the first "equation" or if my student realizes on his own that we have written "sam's cookies" as "s," we can plug into our equation with variables.
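A quick sketch (my own illustration, not from the thread) of that same substitution done symbolically with the SymPy library; the variable names s and j mirror the cookie example:

```python
from sympy import Eq, solve, symbols

# "sam's cookies" and "jim's cookies" as symbolic variables.
s, j = symbols("s j")

# The relation from the word problem: Sam has three more cookies than Jim.
relation = Eq(s, j + 3)

# Later we learn that Sam has 4 cookies; add that fact and solve for both.
solution = solve([relation, Eq(s, 4)], [s, j])
print(solution)  # {s: 4, j: 1}
```

The teaching point stands either way: the machine, like the student, needs the relation stated before any substitution can happen.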
The symbols are not intuitive. They are extremely useful, and probably necessary, as math gets more complex, but if math teachers go too quickly to the symbols, or if mathematicians try to write for the general public without evaluating how useful the symbols are to their audience, then the gulf between people who "get math" and the people convinced they will never "get math" widens unnecessarily.
Yes, @Janine, it is not an easy task to teach math; one has to be brave even to contemplate studying mathematics. Thank you for the example.
Dear Mihai,
Notations are created because of necessity either in mathematics or in other sciences.
If someone thinks some quantity or new property needs a new notation, that will be welcomed. For instance, in calculus we have a notation for the derivative of a function ϕ at a number x = a, namely ϕ′(a); it describes everything involved. But we do not have a well-designed notation for the average rate of change of a function ϕ on a closed interval [a, b], other than simply writing Δy/Δx or spelling it out in words, neither of which says much about the interval or the name of the function. I propose, and in fact use in my calculus class, the notation ϕ̄′[a,b] for the average rate of change of the function ϕ on the closed interval [a, b]. (I am afraid the bar over ϕ may not be visible here; it stands for "average," the prime stands for the rate, and including the interval [a, b] describes what we want to compute, with all the essential components incorporated.) Therefore, do not be afraid of creating symbols; if they are redundant, the community will simply abandon them.
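For readers whose browsers drop the bar, the proposed notation can be typeset (my rendering of the description above):

```latex
% Bar for "average", prime for "rate", interval made explicit:
\[
  \overline{\phi}\,'_{[a,b]} \;:=\; \frac{\phi(b)-\phi(a)}{b-a},
  \qquad\text{so that}\qquad
  \lim_{b \to a} \overline{\phi}\,'_{[a,b]} \;=\; \phi'(a).
\]
```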
@Janine: Please allow me to respond to several claims you made. I write from the perspective of one who has tutored girls in math from the secondary to the college level for many years, as well as having taught such subjects at the undergraduate and graduate levels at a top-tier research university for almost two decades.
* "Simply writing ( s = j + 3 ) doesn't enlighten these kids. But putting more
words and fewer symbols in does."
As the old saying goes, mathematics is not a spectator sport. Mathematics *is*
difficult and most curricula (especially at the secondary level) do not devote
enough time to it, especially given its increasing importance to more and more
areas in modern society. Of course, children start best with arithmetic, but one
must not dwell on this year after year (as I've seen some curricula do), otherwise
there is an obvious "dumbing-down" effect with the companion tragedy that
children "opt out" whenever faced with any higher level of math. (You can always
identify any adult who's had this experience when they say, with some level of
pride, that they were never good at math). If children are not starting to deal with
symbols by 7th or 8th grade, the risk of opting out becomes very high.
* "if mathematicians try to write for the general public without evaluating how
useful the symbols are to their audience, then the gulf between people who
'get math' and the people convinced they will never 'get math' widens
unnecessarily"
I don't know if you are referring to specific works, but I've found in general
that math-oriented books for the general market are quite good. All of John
Paulos' books are worth reading (e.g. "A mathematician reads the newspaper"
and "Innumeracy: Mathematical Illiteracy and its Consequences") --- in fact,
they *should* be required reading in high school --- as are Danica
McKellar's books for younger readers. (Danica played Winnie Cooper
on "The Wonder Years" and has an undergraduate degree in math.) My
impression is that such authors take great pains to make sure concepts
are accessible to the general reader and their books actually narrow the
gulf between those who do and do not 'get math' rather than widening it.
* "I think higher mathematicians seem to assume that the symbols are the math."
Symbols are representative, but "the math" is so much more. Again according
to the principle that math is not a spectator sport, this can only really be
appreciated by diving into it. It reminds me of one of my favorite quotes from
Richard Feynman:
"The burden of the lecture is just to emphasize the fact that it is
impossible to explain honestly the beauty of the laws of nature
in a way that people can feel, without their having some deep
understanding of mathematics. I am sorry, but this seems to be
the case."
@Michael Wendl: Having symbols in mathematics does not exclude words. In fact I believe, and I follow this practice, that before we introduce a unit we should make a conceptual introduction, followed by a technical development where symbols and definitions may be used to summarise the conceptual reality.
Popularisation is another issue.
@Costas: Of course symbols do not exclude words. My point was that the further one goes into mathematics, the more formal (symbolic) the treatment must necessarily become. That is, one cannot let words exclude symbols, essentially the inverse of what you're saying. Many elementary school students in the US graduate without ever having done much math beyond very basic arithmetic. In her comment above, Janine Wonnacott advocates deferring any introduction of symbols, but this is precisely what is needed (after mastering arithmetic, of course) for students to be able to begin thinking more substantively about mathematics, beyond the typical "making change for $5.40" type problems that 2nd and 3rd graders do. I agree that popularization is another issue, but my point was again in response to something Janine raised about the implications of mathematicians writing for the general reader. Best!
Notations are abbreviations for physical phenomena, e.g. addition, subtraction, multiplication, and division. Take out the physical insight, and mathematics turns insipid. Maxwell wrote his famous equations using quaternions; they look much more complicated compared to the vector calculus notations, viz. grad, divergence, and curl.
I'm delighted that my article in the Guardian, and from there my book, has generated so much discussion on this surprising topic. The Guardian article was just a teaser to get you to read my book Enlightening Symbols. It's a long and complex tale that tries to uncover much of the history of mathematics writing. I'm not in full agreement with the last two paragraphs of Dejenie's piece above. We really cannot pass our own judgement on how people processed rhetorical math (easier or harder) as opposed to symbolic math. Janine is right. As adults we think that symbolic mathematical writing is more natural for processing. But symbols are little packages of information that have to be unpacked in the process. See more in my book.
@ Michael Wendl
I write from a different perspective than you. As a tutor, I work one-on-one with a student who is struggling. Sometimes the problem is simply not (for example) memorizing the times tables, but often the student is frozen by the idea of a letter in an equation. They have trouble with the concept of a variable, and a constant, and with the fact that letter variable names can be arbitrarily chosen.
(This is from the Simpsons:)
Nelson: She can do the kind of math that has letters. Watch! What's x, Lisa?
Lisa: Well, it depends.
Nelson: Sorry. She did it yesterday.
I am working with the outliers, the students who don't "get it" during class, and who often manage to get one or two grade levels ahead of their understanding before they start failing tests. At that point, math has been painful for years, and "doing math" is just a matter of memorization and guessing rather than understanding.
"I don't know if you are referring to specific works, but I've found in general that math-oriented books for the general market are quite good."
John Paulos is wonderful. Danica McKellar is wonderful also, though my male students resist reading her. Keith Devlin does a great job explaining, for example, how we can do complicated calculus intuitively so we can catch a baseball, yet go bug-eyed seeing the calculus written on paper.
Richard Feynman, in his books written for the general educated public, lets me understand some of the beauty and the patterns by glossing over the math. Part of his genius is being able to explain (some) physics without the symbols. Higher-level math is difficult because it's difficult, not because it's written in symbols. I understand it isn't the symbols that keep me from understanding his work; it's that I haven't spent years learning the basics of the material. I accept that.
"Janine Wonnacott advocates deferring any introduction of symbols, but this is precisely what is needed (after mastering arithmetic, of course) for students to be able to begin thinking more substantively about mathematics"
I believe some students need to be eased into symbols more carefully and deliberately than others. I believe the introduction of symbols is one point where we can lose students who otherwise would do fine with math (at least through a high school level). It isn't always an easy conceptual leap.
All my best.
Dear Janine,
This is the reason I suggest in this thread that we must first have a historical and conceptual introduction, and, if needed, then proceed to technical matters and symbols.
There is a dictum attributed to Samuel Eilenberg, "The right notation is worth ten theorems." His point was that the right notation for mathematical concepts suggests the correct theorems.
As a very recent example, the use of "string diagrams" for doing algebraic calculations in the context of a braided monoidal category is vastly superior to trying to force the calculation into strings of symbols denoting the braiding, associator, and various objects and maps, because the string-diagram notation encodes most of the structure by way of the Joyal-Street (or, in the presence of two-sided dual objects, Shum) coherence theorem. The calculations in many of Yuri Bespalov's papers on Hopf algebras in braided monoidal categories would be at best unreadable and at worst intractable in any adaptation of Sweedler notation that forced them into symbol-strings.
@Janine Wonnacott: As I said above, I too have tutored many students one-on-one and I have come away with precisely the opposite experience as you describe. I don't propose to explain why, but I can tell you that I believe much of being able to "jump the gap" from arithmetic into more abstract topics like algebra for troubled students relies on teaching them the "algorithmic approach". For example, in "word problems" one reads the whole thing (perhaps several times) and mentally identifies the thing that is not known --- e.g. in "If Alice can do a job in 2 hours and Bob can do it in 3, how long does it take if they work together?" the unknown is the time --- then we immediately assign a symbol, whose value we also don't know, to represent that thing. This is almost always the hardest part of word problems. One then has a "handle" to mathematically dissect the problem to arrive at an equation, which we then solve mechanically using various rules.
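As a worked version of that example (my arithmetic, not part of the original post): assign t to the unknown time in hours; Alice completes 1/2 of the job per hour and Bob 1/3, and together they finish 1 whole job:

```latex
\[
  \frac{t}{2} + \frac{t}{3} = 1
  \;\Longrightarrow\;
  \frac{5t}{6} = 1
  \;\Longrightarrow\;
  t = \frac{6}{5}\ \text{hours} = 72\ \text{minutes}.
\]
```

Once the unknown has a symbol, the rest is indeed mechanical.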
I don't think most students have an inherent difficulty with the concept of symbols themselves, simply because we are so used to dealing with them in everyday life --- one can again often use examples to explain this, like "red light" represents the command to stop, a "star" represents the city of Chicago on a map, or the "needles" represent how much electricity you've used. You might use examples such as these when you run into the problems you describe, especially with students' lack of understanding that symbols are arbitrary. For example, we could just as well have chosen a "blue light" for stop, or a "circle" for Chicago, but those clearly do not change the nature of the "thing" that these symbols represent. Good teaching usually involves generalizing concepts by using and extending what is already in the student's frame of reference.
Again, I think there is a great danger in spending too much time on arithmetic (as many US curricula do) because it becomes increasingly more difficult to "jump the gap" and children then risk being left behind. Most US schools (especially at the high school level) simply don't spend enough credit-hours on the subject, especially given that mathematics will be increasingly important in the future. This, in my opinion, is tragic.
I tried to understand Maxwell's equations from the original text by Maxwell. It was a disaster. When I got the Feynman Lectures Vol. II, I could understand Maxwell's equations in one day, including the vector calculus. Of course, I had prior knowledge of Gauss's law, Faraday's law of electromagnetism, and Ampère's law.
@Pradosh: Your point is a good one, i.e. the "evolution" of notation/symbols rather than their creation. Most people have never actually looked at the original version of perhaps the greatest scientific document ever written, Newton's Principia. Aside from the fact that it is in Latin, the mathematics would not be readily recognized by most readers.
Dear Michael,
An example of the need for symbols, at least for our era, is Archimedes. Please try to read a proof by Archimedes: everything is written in words. For Archimedes' era this might have been easy, but for us it is difficult.
Dear Joe,
We are also happy that your book on symbols is one of the sources that inspired our discussion here. But I do not pass judgement on anybody about how they did mathematics; what I said, rather, was how the creation of symbols simplified the lives and mathematical work of mathematicians, compared with doing mathematics rhetorically, in words.
As Pradosh and Costas indicated, we can see how difficult it would have been for many contemporary mathematicians to become what we are now had mathematics remained as it was in the time of Archimedes, Plato and Pythagoras. In fact, when we look at the spread of mathematics and mathematical learning in the societies of those times, we see that it was very limited, confined to circles of a few highly motivated and passionate individuals who valued mathematics, along with philosophy, as a form and means of correct argument and reasoning that validates truth.
Dear Dejenie,
We shouldn't judge the rhetorical way of doing mathematics by the number of individuals doing mathematics, nor by how limited the mathematics was. Mathematics at the time of symbol proliferation burgeoned along with a growing leisure class that had the time and money to study. That burgeoning coincides with the growth of the universities that were beginning to appear all over Europe.
The matter is complex. I'm not saying that symbol creation didn't simplify working with mathematics. All I'm saying is that we really don't know enough about how people felt, since no one at that time (as far as I know) wrote about it. We also do not have mathematicians' scratch papers, the ones they threw away after they published their work. Some of that would be nice to find.
But all this is great. I'm so glad you brought up the issue. Keep it going...
@Costas: You're right about Archimedes. I might add the example of Euclid and the 13 books of his "Elements" (circa 300 BC). Reading these is extremely difficult because of the limited nature of the notation.
Symbols are meant, at the same time, to formalize a mathematical concept and to make an abstraction of it; thereby we may leave the definitions aside and use their symbolization in their stead. For example, take Gottfried Leibniz (17th century), who made the integral symbol an elongated "s" because he thought of the integral as an infinite sum of infinitesimal summands. This was strictly formalized by Bernhard Riemann in 1854, and this definition made clear its theoretical limitations, leading to the definition of the Riemann–Stieltjes integral (1894), and later to the even more sophisticated Lebesgue integral (1904). Initially the integral was meant to calculate areas under a curve, areas in space, volumes and even hyper-volumes. But under the integral sign we have also thrown:
* Line integrals, which calculate, for instance, the length of a parametrized curve (integrals over trajectories).
* Path integrals (quantum mechanics), which are, rather, functional integrals over a space of possible paths.
* Integrals of differential forms, by which the fundamental theorem of calculus is generalized into Stokes' theorem. Similarly, in other dimensions we arrive at Green's theorem and the divergence theorem, meaning that differential forms give us a natural, unifying view of integration; by going to a higher level of abstraction in notation (using the exterior calculus developed by Élie Cartan), we may now use the powerful language of p-forms.
* Integrals of motion, or first integrals, which are functions on a given phase space that remain constant along certain paths. Examples of such functions are the angular momentum vector or a Hamiltonian independent of time.
* Integrable systems, in the sense of complete (analytic) solvability, as opposed to non-integrable systems; or in the Frobenius sense, where a system is integrable if and only if it generates an ideal that is closed under exterior differentiation. Here, under the "integral" we have thrown concepts like invariant submanifolds and constants of motion, which may encompass only the energy (non-integrable systems) or include other constants of motion (integrable systems). Frobenius's exploration of integrability conditions was published in connection with the Pfaffian problem, in a paper of 1877 (Frobenius, G., "Über das Pfaffsche Problem", Journal für die reine und angewandte Mathematik, 82 (1877), 230-315).
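For reference, the unifying statement alluded to above (integration of differential forms) is the generalized Stokes' theorem, which compresses Green's theorem, the divergence theorem, and the classical Stokes theorem into one line of notation:

```latex
% For an oriented n-manifold M with boundary and an (n-1)-form omega:
\[
  \int_{M} d\omega \;=\; \int_{\partial M} \omega .
\]
% With omega a 1-form on a plane region this is Green's theorem;
% with omega a 2-form on a solid region it is the divergence theorem.
```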
As long as you are generalizing to Stokes and beyond, throw in the boundary operator of classical simplicial homology. It comes from the partial derivative symbol, and yet it is thought of as a geometric object. It then passes back to being a differential operator in other homologies that SEEM to have nothing to do with the geometry of the boundary of any region. It generalizes Stokes' theorem even further, to Gauss–Bonnet, and even further than that, to an étale cohomology that can even link to Galois theory. That's the power of such a symbol. On the other hand, I'm not sure I would go so far as to say that symbols "are meant to... formalize and make abstraction[s] of a mathematical concept." They might help in the abstraction process, but they are meant to package information that would otherwise take long papers to explain.
@Joe Mazur, I agree with you; in fact, the idea I wanted to convey is precisely what you point out, to "package information that would otherwise take long papers to explain", but I was carried away by a desire to express it with wording that would sound more "mathematical".
And what about notations for the derivative in calculus? Of the many that cropped up by the 18th century, it was the 'fractional' notation of Leibniz that best matched physical intuition. Consider the fractional statement of the chain rule, dz/dx = (dz/dy)(dy/dx), versus the Lagrange form, (g∘f)'(x) = g'(f(x))·f'(x). While the Lagrange version is more precise, the Leibnizian one is more mnemonic. (The downside of the fractional notation, however, is that it makes the chain rule look trivial, belying its subtlety.)
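A concrete instance (my example) of why the Leibniz form reads mnemonically, as if the dy terms simply cancel:

```latex
% Take z = sin(y) with y = x^2. Then
\[
  \frac{dz}{dx} \;=\; \frac{dz}{dy}\cdot\frac{dy}{dx}
                \;=\; \cos(y)\cdot 2x
                \;=\; 2x\cos\!\left(x^{2}\right),
\]
% which in Lagrange notation reads (g o f)'(x) = g'(f(x)) f'(x)
% with g = sin and f(x) = x^2.
```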
Leibniz was the master of symbol creation! He created symbols that packaged meaning, helped cognition, stimulated generalization, and eased manipulation. He thought about them with care before committing to their use. William Oughtred invented hundreds of new symbols, but hardly any of them are still in use. Goes to show that willy-nilly made symbols don't have a good survival rate, for good reasons.
Dear Joe,
Although both Newton and Leibniz, the icons of calculus, are credited with creating calculus, Newton was mostly a creator of concepts, while Leibniz did both, developing concepts and creating symbols. Leibniz believed in the transcendent power of symbols for representing and understanding mathematical concepts.
Symbols make mathematical expressions language-independent. For this reason, mathematics becomes accessible to anyone familiar with the symbols. Also, unlike language, which can be cumbersome in formulating complex expressions, symbols give mathematical expressions a crisp, easy-to-grasp character.
And one more thing about the value of symbols in nurturing the growth of mathematics: their kinship to music. We listen to music independent of any knowledge of the intricacies of musical script. And we read mathematical expressions independent of natural language.
Hi James, I don't quite get your analogy. I would say that musical script is divorced from natural language, in that its semantic component is pure reference to physical vibration frequencies, with modulation and timing thrown in. I do agree that you can appreciate the music with no knowledge of the notation, however. Where I disagree with you is in your last sentence. Isn't mathematical language a highly formalized fragment of natural language? For me, mathematical notation, like an everyday word (e.g. 'connected') given a precise technical meaning, is a matter of stylistic choice. A creative coiner of mathematical jargon/symbolism will cleverly insert mnemonic content that reflects her/his own intuition. The mathematical language has not only a precise denotation, but suggestive connotations that--hopefully--aid understanding. Without natural language, mathematical notation and terminology is just marks on paper. (Just reread your post: do you mean by 'mathematical expressions independent of natural language' the idea that mathematical notation is not tied to any one natural language? I do agree with you there, except that math jargon is often language/culture dependent for its connotative aspects.)
Hi, Paul,
I agree with your interpretation of mathematical notation. And, yes, what would theorems (and their proofs) be without a mixture of natural language and symbols. In fact, natural language is commonly used to explain the use of notation in axioms, definitions, theorems and proofs. So, mea culpa.
Notation clearly makes a big difference at the practical level: consider the algorithm for multiplication, or even addition, in Arabic numerals compared with Roman numerals. Of course, from a formal perspective, arithmetic has no real connection to the world outside of practices like counting and measuring. But we would, I'm sure, never have arrived at arithmetic as a formal mathematical theory if we hadn't started by counting and measuring things, and been interested in the relations between counts of disjoint collections, counts of what was left after pairing members of one collection with members of another, etc. Leibniz's notation for the calculus was also clearer and more flexible than Newton's, and came to be the standard (though my father was trained to use Newton's 'dots' when studying engineering in the '50s and early '60s).
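To illustrate that practical difference (a toy sketch of my own, not from the post): Roman numerals carry no place value, so there is no column-by-column algorithm; even addition effectively means converting to counts, operating, and converting back.

```python
# Value table for converting counts back to Roman numerals,
# largest first, with the subtractive pairs (CM, XL, ...) included.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def roman_to_int(numeral):
    # Scan left to right; a smaller value before a larger one is subtractive.
    values = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for ch, nxt in zip(numeral, numeral[1:] + " "):
        v = values[ch]
        total += -v if values.get(nxt, 0) > v else v
    return total

def int_to_roman(n):
    # Greedily emit the largest numeral that still fits.
    out = []
    for value, numeral in ROMAN:
        while n >= value:
            out.append(numeral)
            n -= value
    return "".join(out)

# "Roman addition" has no positional algorithm; we pass through integers.
print(int_to_roman(roman_to_int("XLVII") + roman_to_int("XIV")))  # LXI (47 + 14 = 61)
```

With Arabic numerals the familiar carrying algorithm works directly on the written symbols; that is precisely the practical advantage of positional notation.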
I think a major significance of notation is that it encodes the development of concepts --- using a notation SO(3) implicitly makes connection with concepts of group (and, perhaps, manifold, etc.) as well as suggesting the existence of SO(157). This is quite closely related to what the cognitive psychologists call "chunking".
Further, the notation is capable of "overloading" (somewhat in the sense of object oriented programming languages) so using a symbol x for "an arbitrary (unspecified) number" leaves open to context whether this is to mean a positive integer (counting number) or a possibly infinite cardinal (also a counting number) or a complex number or a quaternion or...
Dear Rafael,
That was the reason mathematics was not developing fast then. From the list of symbols and their creators, it is surprising that Isaac Newton did not create notations that survived, except the dot notation he used for the derivative, which is now used specifically for temporal derivatives.
This is a fascinating thread. I especially like the remark that arithmetic algorithms are far easier to implement in Arabic/Indian place-holder notation than they are in Roman. I would hesitate, however, to quickly ridicule the equational notation of Diophantus, especially because we are all too familiar with the modern alternative. All you English speakers, just try reading Beowulf in the original Old English. It looked right fine to the Venerable Bede...
This is indeed a fascinating thread. I don't know if the contribution of François Viète has been mentioned yet.
http://www.math.rutgers.edu/courses/436/Honors02/vieta.html
http://math.unipa.it/~grim/mathnotation.pdf
Dear Lehtihet,
You have posted links to good sources on the history of symbols and on how Viète introduced letters to represent unknowns (whether variables or parameters), although he is not widely known in the contemporary mathematics community.
Jan Cizmar (second link) in his article indicated a plausible reason why Leibniz's notations and symbols survived while Newton's did not: the editorial offices of mathematical journals in Europe at the time were filled with people who knew and favored Leibniz over Newton, so they strictly required mathematicians to use Leibniz's symbols in their publications.
But I have a different take on that: Leibniz's notations were indeed more concise, appealing, and simplified, and they actually describe what they were intended to describe, which is why we still use them today. Take Newton's dot notation, for instance: for higher-order derivatives of a variable quantity it is not good-looking at all, whereas Leibniz's notations for derivatives of any order, the symbol of partial differentiation, and the integral sign are as fitting now as they were then.
Therefore the editorial offices of the time may have had mathematical reasons rather than favoritism for choosing Leibniz's notations for their publications. It is also noted, by the way, that journals of that time played a major role in crystallizing the usage and acceptance of symbols in the mathematical universe.
Dear Dejenie,
Thank you for your feedback.
I do share your view regarding Leibniz' notations.
The purpose of my last post was mainly to mention the important contribution of Viète. I added the paper by Jan Cizmar because it seemed to me interesting in the context of this thread.
Another important contribution, which is perhaps less known, is that of Thomas Harriot (1560-1621) (see first link below).
As to the main question of this thread, the second link below provides an interesting point of view regarding "the role of symbolic language on the transformation of mathematics".
https://www.maa.org/sites/default/files/pdf/upload_library/46/Biddix_Harriot_ed.pdf
http://logica.ugent.be/philosophica/fulltexts/87-5.pdf
Especially in connection with the theme "the role of symbolic language on the transformation of mathematics," I would emphasize the role of chunking; note Miller's article (and see the Wikipedia comments on it) "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information" The use of symbols which can be interpreted as things in themselves corresponds to a concept for which we have a clear (?!) and precise meaning -- although preferably one which links to related concepts.
Dear Abedallah,
Sorry, I did not notice your remark. As to your assertion that "they also play a role in raising critical thinking": I think it is not the symbols but the interpretation and the substance behind the symbols that raise critical thinking. Symbols are good for ease of communication between people who know the dialect.
New symbols, in particular new numerals allowing one to express new numbers, give a lot to mathematics. Think of the introduction of zero and the positional numeral system. With respect to working with infinity, have a look at the page http://theinfinitycomputer.com
The book and numerous papers of researchers from different countries show how new symbols allow us to increase the accuracy of working with infinity.
Dear Yaroslav,
Your extensive research on grossone theory and your introduction of new symbols and concepts, such as the grossone, add to the language of abstraction and thereby broaden the galaxy of mathematics. The new infinite numeral you called the grossone, and the safe arithmetic that can be done with it, including powers and the absence of the indeterminate forms that arise with the usual infinity, are remarkable properties.
Dear Dejenie,
I think any new mathematical symbols come out of new mathematical ideas and cognition results, and such symbols surely expedite the growth of mathematics when the new ideas and cognition results are scientific enough; many of our pioneering ancestors set us good examples. We enjoy and appreciate not only their new mathematical ideas and cognition results but also the scientific symbols they created.
Best Regards
But unscientific mathematical ideas and cognition results, with their unscientific symbols (old or new, it doesn't matter), may cause a "disease" revealing fundamental defects in our mathematics, and surely do not expedite its growth; an example is the "recurring illness" of Zeno's Paradox relating to the present unscientific notion of "infinitude".
Mark Colyvan (University of Sydney) has a very interesting conference paper that might be relevant to this question. The paper is titled "Mathematical and Musical Notation as Models", and he presented it at the 2014 conference of the Australasian Association of Philosophy. I don't know whether there is a working paper version in circulation, but his webpage is at: http://www.colyvan.com
In addition to the excellent examples mentioned by @Zé Carlos Tiago, @Christian List and many others, here are a few other symbols to consider:
[0, 1] : the closed interval from 0 to 1
We should divide symbols in mathematics into two categories:
1. Text-mode symbols, like the ones in mathematical logic. After the algebraization of mathematics, and the domination of classical logic and set theory, we have the "language turn" in mathematics as well.
2. Graphics-mode symbols, or diagrammatic reasoning. The model for this is Euclid's diagrammatic reasoning in his geometry. Today's computer science, and especially artificial intelligence, require logics other than the classical one, mainly a diagrammatic type of logic. We should notice that category theory is diagrammatic and an alternative to set theory. The air, however, is full of thinking about diagrammatic reasoning. Let me list some books:
Interesting thread. Thank you all.
Regarding Newton v Leibniz on calculus, it was my previous 'understanding' that Newton's inferior notation had actually been favored, because he had more political clout. At any rate, it is nice that the better notation survived. Notation is both a curse and a blessing. It is so necessary, and can make concepts clear, as discussed in this thread, but as also noted, needs to be learned as a language. Once learned, you can look at a paper in another 'natural' language you cannot read, but you might still determine the topic from the "mathematical symbols." But the downside is that symbol usage is often not consistent from one book or paper to another, even written in the same period. The same symbol in two different modern sources can have two different, and sometimes very highly related meanings.
So while symbols are very necessary, and like pictures and graphs, can efficiently and more easily convey complex meaning, it is not a good idea to perpetuate unnecessary new or redundant symbols. However, if the redundant symbols are better and introduced in some logical fashion and take the place of the old symbols, or the new symbols are actually not unnecessary, filling a need not previously covered, then introducing them can be very beneficial.
Unfortunately, symbol development and usage, like any progress in science and invention, has been, of necessity, somewhat haphazard. It is easy in retrospect, to say that 'this should have come before that.' But perhaps it would be a good idea to have a series of international conferences in mathematical symbology to try to guide this in the future! Various international conferences in areas such as establishment surveys, and ... well ... fill in your favorite topic here _____ , exist in all areas of science, and any other topics, so why not have a conference on mathematical symbology every 10 years, or perhaps as a part of another international conference series?
To try to "organize" mathematical symbology can have an upside and a downside, like any formal 'organization.' I think that the "up" side is obvious, but in any organization there is a tendency for a few people to rise to the leadership, often for the wrong reasons, and then monopolize and dismiss good ideas. But at a conference, various views might be presented, and perhaps this will give the natural development of and edification on mathematical symbology a nudge in more useful (and hopefully aesthetically pleasing) directions. There could be some sessions on teaching at various levels, and some sessions on new development, and perhaps a few sessions on interesting history, which can also provide insight.
No mathematics without symbols: we have symbols for numbers, symbols for geometric objects, symbols of arithmetic, symbols of logic, and so on. But numbers, as mathematical symbols, have the longest history of all; we have many different kinds of number forms in our mathematics to cognize all kinds of things in the universe.
From the number spectrum (the expansion of number family), we witness a truth: the creation of mathematical symbols expedites the growth of mathematics.
How big is the number family, and how many number forms do we have in our mathematics so far?
We witness many mathematical symbols created in our history; but is there a law for us to operate, when and how can people create new mathematical symbols?
Creating mathematical symbols is done out of sheer necessity. Individuals working in a particular field reach a point where a new property or behavior is obtained and they want to represent it. That is when symbol creation arises, however awkward the result may be. But its acceptance may later become a point of debate if someone comes up with a better symbol for that new property or behavior, one which itself receives acceptance from the broader community.
That was what happened in choosing between Newton's and Leibniz's symbols for integral and differential calculus. The victor in this case was Leibniz, and I will say for two reasons. First, his symbols were better suited to the subject and have a contemporary appeal, even today, that Newton's lack. The second suspected cause for the selection and adoption of Leibniz's symbols in the mathematical literature and journals of the time was that he had many supporters at the editorial offices of mathematical journals; I am not in favor of this explanation.
Regarding when to create symbols: as long as humans continue to search and study nature, and as long as mathematics is the tool for that search, the galaxy of mathematics will keep expanding, and the creation of new symbols and notations is therefore a necessity.
For example, the recent notations adapted from the integer-part function [x] are ⌊x⌋, the floor function, and ⌈x⌉, the ceiling function, which are reminders of how symbols are created to fill gaps and meet the needs of purposeful symbolism in abstract discourse.
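As a side note, the floor and ceiling notations mentioned above map directly onto standard library functions; a minimal illustration:

```python
# The floor ⌊x⌋ and ceiling ⌈x⌉ notations correspond to math.floor and
# math.ceil. Note that floor rounds toward minus infinity, not toward zero.
import math

print(math.floor(2.7))    # ⌊2.7⌋  = 2
print(math.ceil(2.7))     # ⌈2.7⌉  = 3
print(math.floor(-2.7))   # ⌊-2.7⌋ = -3
print(math.ceil(-2.7))    # ⌈-2.7⌉ = -2
print(math.floor(5.0))    # for integers, ⌊x⌋ = ⌈x⌉ = x, so 5
```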
How do the creations of different number forms as mathematical symbols expedite the growth of mathematics? Why?
Dear Geng,
Would you please be specific when you say number forms? Are you asking about the ways numbers are represented in our brain, or about the different number systems developed in mathematics, such as ℕ→ℚ→ℝ→ℂ→Cl_{n}?
Thank you dear Mr. Dejenie A. Lakew and Zé Carlos Tiago,
I am sorry. I mean the different kinds of numbers in our mathematics used to cognize different things in the universe, such as quaternion numbers, fuzzy numbers, finite numbers, infinite numbers, and so on.
Sincerely yours,
Geng
Dear Ze Carlos,
The case of Iverson is one example of how necessity is the real force behind creating a symbol or expression to simplify work in mathematical discourse. Because he could not use the existing formal expressions in mathematics to write the algorithmic expressions he wanted, he restructured the existing mathematical expressions to serve his purpose, and this later developed into a programming language.
Dear Geng,
The creations of different number systems are themselves the culminations of developments of mathematics toward larger-dimensional systems and larger cardinalities, giving a wider working space and more algebraic and analytic capabilities. They are results of the development of mathematics itself, and they in turn enrich and empower mathematics and mathematical theories.
Thank you dear Dejenie,
If we understand the foundation and meaning of certain mathematical symbols, we can use them well and develop them into a system, as with the invention of "0"; but if we do not understand the foundation and meaning of certain mathematical symbols, we may meet troubles: people have been troubled by the "non-number infinitesimal (variables)" and the related "tyranny of epsilons and deltas formal language" in "standard analysis" ever since. It is difficult to say that the creation of the "epsilon and delta" symbols expedited the growth of mathematics.
Now "non-standard analysis" is on the way. A very important acknowledged feature of "non-standard analysis" is its deep structural equivalence with "standard analysis", and this is precisely because the nonstandard structure is constructed within the confines of the present classical infinity-related theory system: "standard analysis" can do anything "non-standard analysis" can. This means that we do not understand the foundation and meaning of the newly invented "being-number infinitesimal" either. So, will the new "being-number infinitesimal" and its related "tyranny of infinitesimal formal language" in "non-standard analysis" trouble us from now on?
I also think that the role of notation (the 'organization' of symbols) is very important. Maybe in some cases it is as important as the symbols themselves.
An interesting example of the growth of mathematical knowledge (in a broad sense) is the introduction of (Polish) prefix notation. It posed some new logical problems (e.g. those described by J. Woleński in: Szkoła lwowsko-warszawska [Lwów-Warsaw School]). It also helped computer science in developing some early machines and algorithms, so this notation shows many advantages, both theoretical and practical.
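A minimal sketch of why prefix notation suited early machines and algorithms (my own illustration, not from the post): an expression can be evaluated in a single scan with a stack and needs no parentheses at all.

```python
# Evaluating (Polish) prefix notation: one right-to-left scan, one stack,
# no parentheses and no precedence rules needed.
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_prefix(tokens):
    """Evaluate a prefix expression given as a list of tokens."""
    stack = []
    for tok in reversed(tokens):
        if tok in OPS:
            a, b = stack.pop(), stack.pop()  # operands are already evaluated
            stack.append(OPS[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# "+ * 2 3 4" means (2 * 3) + 4; infix would need parentheses or precedence.
print(eval_prefix("+ * 2 3 4".split()))  # 10.0
```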
I think mathematical symbols are a kind of notation; they belong to the notation family.
There are different reasons for the introduction of symbols or their modification. For example, Einstein introduced his summation rule for recurring indexes to facilitate the work of the printer :-)
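What the summation convention compresses can be sketched in a few lines (an illustration of mine, not from the post): in c_i = A_ij b_j the repeated index j implies a sum over j, so the sigma sign is dropped entirely.

```python
# Einstein's convention: a repeated index is summed over. The expression
# c_i = A_ij b_j is shorthand for c_i = sum_j A_ij b_j.

def contract(A, b):
    """c_i = A_ij b_j, with the sum over the repeated index j implied."""
    return [sum(A[i][j] * b[j] for j in range(len(b)))
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
b = [5, 6]
print(contract(A, b))  # [1*5 + 2*6, 3*5 + 4*6] = [17, 39]
```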
I agree that sometimes some practical reasons caused development of a new notation. If this new notation - for some reasons - is more convenient for researchers, it may be a source of new interesting scientific problems/ideas/analogies. I think that new notation can play a role of scaffolding for the imagination of researchers.
My previous example of prefix notation is one of this kind. I heard that Łukasiewicz introduced this notation because of problems with typewriting. Fortunately, his invention showed many theoretical virtues to other researchers.
I think very impressive examples of the analysis of this role of language and notation are Wittgenstein's Notebooks (1914-1916) and, of course, the Tractatus. I think we can find a lot of interesting ideas there, but sometimes it is not simple to understand the connections between Wittgenstein's problems and the role of notation. An interesting presentation of these analyses can be found on the web page "Wittgenstein's chronology": http://www.wittgensteinchronology.com/7.html
Are symbols the abstractions of notations and new notations producing new symbols?
Another work relating to number is challenging us.
When we study "the meaning of zero" and the location of zero in the "number spectrum" in our mathematics, an unbalanced defect is easily discovered: "zero" appears on one side of the "number spectrum" as a kind of mathematical language telling people of a situation of "nothing, not-being, ..."; but on the other side of the "number spectrum" we lack another kind of mathematical language telling people of the opposite situation to "zero": "something, being, ...".
We need a new symbol ("yan") with the opposite meaning to zero, located on the opposite side of zero in the "number spectrum", to make up for the structural incompleteness of the "number spectrum" and to complete the existence of "zero".
He was a remarkable person, trained in law but a self-taught mathematician by night, who produced many results in mathematics. Among them are the relationships that exist between the roots of polynomials, which are called Vieta's formulas.
He lived between 1540 and 1603; imagine how cumbersome and space-consuming doing mathematics was before that.
Creation of new symbols can be useful. Sometimes, however, if the symbols are not related to standard ones, they can unfortunately lead to confusion instead. Sometimes it is better to use the same symbol for different things, if it is standard and if in context it is clear what it refers to. In analytic number theory, for example, $\psi(x)$ is used for the sum of von Mangoldt's function, for the logarithmic derivative of the gamma function, and for $x - [x] - 1/2$, and it is almost always clear what the symbol refers to, as all three notations are standard.
Dear Aleksandar,
Very true. Standards and consistency are essential components of utilizing symbols.
In the present traditional finite-infinite theory system, people have been creating many new "understandings" of "infinite", "potential infinite" and "actual infinite" since Zeno's time 2500 years ago. But it is difficult to solve the infinite-related problems produced by the fundamental defects disclosed by the infinite-related paradoxes since Zeno's time, because within the present traditional finite-infinite theory system "the infinite-related problems" are strongly interlocked with the foundation. So, though trying very hard to solve "some infinite-related problems" with new "understandings" within the present traditional finite-infinite theory system, people finally discovered that nothing can be done, because "everything is perfect" in the present traditional finite-infinite theory system.
I know few people agree with me, but this is true.
Dear Geng,
I think Zeno's paradox comes not from the very concept of infinity per se but from a misunderstanding or misconception of the motion and distance of a moving being versus the motion of a dimensionless point. If a point wants to go from the origin to the next point a unit away, it is impossible to do so by visiting every location in between, as there are infinitely many, indeed uncountably many, locations to visit, and that is impractical. But if a person wants to cover a mile, he does not have to move like a point; rather, he is a physical body that moves a nonzero distance in a few minutes and will certainly cover the one mile he wanted to run.
Dear Dejenie,
You are right, Mr. Dejenie A. Lakew; many people (especially those with a physical point of view) have argued like that for more than 2500 years. But it is also true that Zeno's Paradox of the Achilles-Turtle Race is generally accepted as a perfect "defects discloser" within the present traditional finite-infinite theory system, with its foundation of "potential infinite" and "actual infinite". We can forget Zeno's Paradox, but we cannot forget the Harmonic Series Paradox: the "strictly mathematically proven" modern version of Zeno's Paradox of the Achilles-Turtle Race.
Sincerely yours, Geng
There's no doubt symbols make a huge difference; consider just as two familiar examples Arabic numerals and Leibniz's notation for calculus, which was clearly better than Newton's (even though engineers in Canada were still taught the 'dot' notation in the '60s, and may yet be for all I know). But I'm concerned about Dejenie's account of the Zeno puzzle, which seems to mix the paradox of plurality with the dichotomy. The dichotomy is answered simply by showing how the concept of limit allows us to calculate sums for some infinite series. The plurality/paradox of division needs more, including non-denumerable infinities, since a countable infinity of points always has measure 0.
Martin,
I have some points on your statements. I think there is a distinction between a puzzle, a perfectly non-contradictory game, mathematical or otherwise, that tests people's mindfulness to solve, and a paradox, a self-contradictory or correct-looking argument that leads to absurdity. Zeno's paradox is called a paradox not for dividing a distance into an infinite number of divisions, which is mathematically possible (as you have indicated, summability in mathematics is the property of a series of infinitely many terms adding to a finite number), but for associating that possibility of infinitely many divisions with a physically moving body (the relation of time, motion and distance). You are correct about the uncountable infiniteness of divisibility, which leads, for instance, to the concept of definite integrals, as compared to countably infinite sums.
The property you mentioned, the zero measure of countably infinite sets, is primarily used in measure and integration theory, in which such sets have no effect at all, the very reason they are called null sets in the theory.
The puzzle of how to think of motion in the context of infinitely divisible space and time has a pretty standard answer (though not everyone out there buys it): the 'at-at' theory of motion (first named by Bertrand Russell, I think), according to which motion just is being at different places at different times (and therefore is correctly described using a function from time to position). There's a substantial philosophical literature on the Zenonian paradoxes & puzzles that goes back to Grünbaum and Salmon (and before them to Reichenbach, Russell and others).
Prior to the introduction of a strict definition of 'limit' (Bolzano and later Weierstrass), I think Zeno's puzzle really was a paradox, since there was no consistent account of how we could add an infinity of non-zero amounts and arrive at a unique, finite sum...
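That limit-based answer to the dichotomy can be illustrated numerically (a quick sketch of my own): the partial sums of 1/2 + 1/4 + 1/8 + ... come within any epsilon > 0 of the limit 1 once enough terms are taken, which is exactly what the strict definition of 'limit' asserts.

```python
# The dichotomy, numerically: partial sums of sum_{k>=1} 1/2^k approach 1.
def geometric_partial(n):
    """Partial sum of the first n terms of 1/2 + 1/4 + 1/8 + ..."""
    return sum(1 / 2**k for k in range(1, n + 1))

for n in (1, 5, 10, 20):
    print(n, geometric_partial(n))
# The sums 0.5, 0.96875, ... get within any epsilon > 0 of 1
# once n is large enough: that is the limit definition at work.
```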
I agree with you, Martin, symbols are really important, but they are only forms carrying our ideas. If the idea is not “scientific enough”, symbols help nothing.
Dear Mr. Martin Bryson Brown, would you please give your frank opinion on the following example: how does modern limit theory with its "epsilon and delta" symbols work?
The divergence proof of the Harmonic Series, very elementary and important, can be found in many current higher mathematics books written in all kinds of languages:
1 + 1/2 + 1/3 + 1/4 + ... + 1/n + ...                      (1)
= 1 + 1/2 + (1/3 + 1/4) + (1/5 + 1/6 + 1/7 + 1/8) + ...    (2)
> 1 + 1/2 + (1/4 + 1/4) + (1/8 + 1/8 + 1/8 + 1/8) + ...    (3)
= 1 + 1/2 + 1/2 + 1/2 + 1/2 + ... ---> infinity            (4)
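The grouping argument can also be checked numerically; the sketch below (my own, not part of the standard proof) shows that each doubling of n adds roughly another constant amount (about ln 2 ≈ 0.693, safely more than the 1/2 the brackets guarantee) to the partial sum, so it grows without bound even though the terms shrink to zero.

```python
# The harmonic partial sums H_n grow without bound: each doubling of n
# adds at least 1/2 (the bracket bound), in fact about ln 2.
def harmonic(n):
    """Partial sum H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1 / k for k in range(1, n + 1))

for n in (1, 2, 4, 8, 16, 1024):
    print(n, harmonic(n))  # sums keep climbing as n doubles
```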
We teach our students that, with the "brackets-placing rule" of modern limit theory, we can produce infinitely many numbers each bigger than 1/2 (or 1, or 100, or 100000, or 10000000000, or ...) from the infinitely many terms Un ---> 0 of the Harmonic Series, and thus change an infinitely decreasing Harmonic Series with the property Un ---> 0 into an infinite constant series with the property Un ---> constant, or into an infinitely increasing series with the property Un ---> infinity.
It is miserable that sometimes we have to feed our students such mysterious things; the more we try to explain, the more doubts are aroused, and the more helpless we feel.
Sincerely yours, Geng
Dear Mr. Lakew,
the point about notations in "higher" mathematics that has impressed me most over the period that I've now been doing research in the history of early-modern mathematics is the huge difference Leonhard Euler's work made: The original texts of, e.g., Viète, Wallis, Descartes, Fermat, Huygens, Leibniz, Newton or Jacob Bernoulli are virtually inaccessible to the non-specialist 21st-century mathematician (even in translation), whereas those of Euler, Lagrange, Legendre or Gauss are so much closer to us in style and notation that they can be understood with much less effort.
I think this is partly due to Euler's didactic impetus (shared only by Descartes among the other authors I mentioned); but the introduction of a - more or less - consistent notation using standardized symbols easy to memorize and handle must also have played an important part in more widely diffusing advanced research and motivate others to contribute.
In my paper on Euler as "the first modern mathematician"(https://www.researchgate.net/publication/276445400), I have tried to elaborate this point and illustrate it by some of the many examples documented in Cajori's classic work (A History of Mathematical Notations, esp. vol.II, Chicago 1929).
To address your main question, I'd say that the most important way in which new signs affect mathematical progress is not (as outsiders sometimes tend to think) any kind of "automatism" in which the formalism generates new results by itself, but the easier understanding of what has been done before and the resulting encouragement to newcomers starting on applications and on their own research.
Wishing you all the best for your further research,
Martin Mattmüller (Bernoulli-Euler-Zentrum Basel, Switzerland)
Conference Paper The first modern mathematician? Euler's Influence on the Dev...
Dear Martin Mattmuller,
You have good points, and I can tell your affection for Euler from your choice of title, "the first modern mathematician". Indeed, Euler was one of the greatest mathematicians who ever lived, with unparalleled work in almost every branch of mathematics he encountered, producing a huge amount of mathematics probably until his death. He was nicknamed Cyclops because of the loss of one eye, but the name fits him as a superhero version of the mythical figure from Greek mythology.