This always amazed me:
"A Jewish tradition that dates from at least the 19th century is to write plus using a symbol like an inverted T. This practice was adopted into Israeli schools (this practice goes back to at least the 1940s)[11] and is still commonplace today in elementary schools (including secular schools) but in fewer secondary schools.[12] It is also used occasionally in books by religious authors, but most books for adults use the international symbol "+". The usual explanation for this practice is that it avoids the writing of a symbol "+" that looks like a Christian cross.[12]" [1]
I have an uncomfortable memory of my 1st grade teacher showing this to us and me refusing to go along with what I insisted was an irrational superstition. I didn't understand the context or what her old age meant.
Similarly, I was shocked as a kid when I visited the US and saw apartment buildings skip the 13th floor numbering.
What fascinates me most about this story is how it illustrates what a young race we humans are intellectually.
If we ever want to be anything more than warring tribes, we must learn to accumulate knowledge together. Universal symbols for addition and subtraction are the first small step on that very long road, yet it's only been a few hundred years since we took that step.
On the scale of the universe, we are like a baby who just minutes ago learned to grab onto objects and already is hell-bent on pouring a hot pot of tea on herself using this new capability.
And we're still using '*' for multiplication in our programming languages instead of '×' because using anything other than ASCII for programming is just too hard, or too weird, or whatever.
To be fair, I prefer dots over crosses for multiplication of scalars when I write it myself. I tend to assume a cross means cross product rather than scalar multiplication. (But, oddly, I do not assume that a dot means dot product.)
More likely, it's because we don't have them on our keyboards. Programming languages are usually about reducing friction for programmers, but using the proper multiplication/division symbols would increase that for most people.
My best guess would be that we use * for multiplication because x, being alphanumeric, is a valid variable name.
Regarding keyboard presence, one of the things that irks me about Mac keyboards is the absence of the # character. I write little programs and scripts on my Mac infrequently enough that I have to look up how to type it every blasted time. That, and how to take screenshots. The "Print Scrn" key on PCs spoils me.
What language are you referring to? I'm not quite sure I understand your examples.
--------
Related:
In Haskell, any 2-argument function can be used (or defined) infix by surrounding it with backticks, eg:
add 2 3
is equivalent to:
2 `add` 3
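A minimal, self-contained sketch of the backtick syntax (the `add` function here is made up for illustration):

```haskell
-- Any two-argument function can be applied infix by surrounding
-- its name with backticks.
add :: Int -> Int -> Int
add x y = x + y

main :: IO ()
main = do
  print (add 2 3)    -- prefix application
  print (2 `add` 3)  -- infix application, same result
```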
----
And you can use this to cleanly write curried functions (often predicates) with their arguments flipped, eg:
vowelCount = length . filter (`elem` "aeiou")
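Spelled out as a runnable sketch: the backticked section (`elem` "aeiou") is shorthand for \c -> c `elem` "aeiou", i.e. elem with its second argument pre-supplied.

```haskell
-- (`elem` "aeiou") is a right section of the infix form of elem:
-- it fixes the second argument, leaving a Char -> Bool predicate.
vowelCount :: String -> Int
vowelCount = length . filter (`elem` "aeiou")

main :: IO ()
main = print (vowelCount "haskell")  -- 'a' and 'e' match: 2
```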
----
Note that functions named with punctuation characters are infix by default, and one can be used/defined prefix by surrounding it with brackets, eg:
(<$>) = fmap
toUpper <$> "aeiou" -- => "AEIOU"
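The same idea as a compilable sketch; since Prelude already exports <$>, a hypothetical operator <$$> stands in for it here to avoid a name clash:

```haskell
import Data.Char (toUpper)

-- Punctuation-named functions are infix by default; wrapping one in
-- parentheses lets us define or apply it prefix. <$$> is a made-up
-- synonym for fmap, used so we don't shadow Prelude's <$>.
(<$$>) :: Functor f => (a -> b) -> f a -> f b
(<$$>) = fmap

main :: IO ()
main = do
  putStrLn (toUpper <$$> "aeiou")   -- infix use
  putStrLn ((<$$>) toUpper "aeiou") -- prefix use, same result
```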
----
It's also convenient for tackling a subset of one of the harder problems in computer science: naming things. For a 2-argument function, use it infix and read it as a phrase in English. Which is clearer:
".jpg" `isSuffix` url
or:
".jpg" `isSuffixOf` url
You can also define functions' precedence and associativity (left/right) when used as an infix operator.
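A small sketch of a fixity declaration, using a made-up operator .^. that mirrors the standard right-associative infixr 8 ^:

```haskell
-- Declare .^. right-associative with precedence 8, like Haskell's ^.
infixr 8 .^.
(.^.) :: Int -> Int -> Int
(.^.) = (^)

main :: IO ()
main = print (2 .^. 3 .^. 2)  -- right-associative: 2 ^ (3 ^ 2) = 512
```

Had we declared `infixl 8 .^.` instead, the same expression would parse as (2 ^ 3) ^ 2 = 64.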
> And you can use this to cleanly write curried functions (often predicates) with their arguments flipped
But unicode in code is terser. Instead of `elem`, we could use ∈ [U+2208].
To flip parameters, we could use the bidi-mirrored glyph, i.e. ∋ [U+220B, the bidi-mirror of U+2208]. In fact we could even make it a feature of the language grammar to automatically detect whether a mirrored glyph is being used, and perform the transformation.
We could even automatically detect and transform canonically equivalent graphemes using the non-spacing version of `not`, e.g. ∉ [U+2209, canonically equivalent to U+2208,U+0338], and ∌ [U+220C, mirror of U+2209 and equiv to U+220B,U+0338].
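GHC already accepts Unicode symbol characters in operator names, so the mirrored-glyph idea can be approximated by hand today (the automatic mirror/combining-character detection proposed above remains hypothetical):

```haskell
-- Define ∈ as elem, and ∋ as its argument-flipped mirror, by hand.
(∈) :: Eq a => a -> [a] -> Bool
(∈) = elem

(∋) :: Eq a => [a] -> a -> Bool
(∋) = flip (∈)

main :: IO ()
main = do
  print ('a' ∈ "aeiou")   -- True
  print ("aeiou" ∋ 'a')   -- True, arguments flipped
```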
Editors should help the user remap their keyboard keys. We as users should move the symbols we need so that they are easier to type. For programming I've moved all of \{}[]() to the home row (with AltGr). It's just one option of many, but it demonstrates the concept.
We already tried going full-APL but it only proved that it wasn't a good idea. I mean this is impressive https://www.youtube.com/watch?v=a9xAKttWgP4 but I still think the notation is too obscure.
Full APL isn't just unicode, though, it's also a total commitment to single-character operators as the primary language construct. Take a look at some Agda or Unicode-enabled Haskell code to see it really improving readability... though hampering type-ability.
> We already tried going full-APL but it only proved that it wasn't a good idea.
When you say full-APL do you mean in one leap? Then I agree it's not a good idea, but how about going to APL in stages? First introduce just some operators (e.g. × ∧ ∨ ∈ ≥ ≤ ≠ → ⇒ only), then introduce more gradually until all APL operators are in use? Then keep on introducing other symbols from Unicode not in APL.
Well, I used "we" continuing ruther's sense of "programmers in general". I didn't really do APL myself. I think it was Alan Kay who said: we needed to make APL to know what going too far looked like, so that we could stop wondering if we could do better by going further.
On the temporal / spatial scale of the universe we are virtually non-existent. On a scale that ranks everything in the universe according to physical / informational complexity & organization, we appear so far to be quite significant.
Our yardstick is far too short to make any universe-scale ranking on this subject. However, what we lack in knowledge, we thoroughly compensate for with our sense of ego and self-importance as a species :)
Our yardstick is pretty good. We have characterized a vast range of the objects we share our universe with in immense structural detail, from lifeforms here on earth to astronomical objects billions of light years away. The human brain sticks out.
I wrote my comment precisely because it has become a cliché to say what you're saying – how small, ignorant and insignificant we are in relation to the universe, how we compensate by being arrogant. Taking a different view, that human brains are really quite extraordinary compared to everything we can see out there, is not necessarily egotistic. The more I feel a sense of wonder at the preciousness and privilege of human life, the more I want to spend mine being helpful, nurturing, productive.
We know a lot about a tiny area, and a tiny amount about a large area. But, suggesting we know anything about life / AI outside our solar system is pure hubris.
The universe presents overwhelmingly as a vast void. Of the total space in the universe, what fraction at most do you think is occupied by self-aware minds? What is the maximum percentage of all the matter in the universe that is organized into conscious beings? Regardless of whether life is common or rare on other planets, it is certainly vanishingly scarce overall.
> Of the total space in the universe, what fraction at most do you think is occupied by self-aware minds?
Somewhere between very close to 0% and very close to 100%.
> What is the maximum percentage of all the matter in the universe that is organized into conscious beings?
Somewhere between very close to 0% and very close to 100%.
> Regardless of whether life is common or rare on other planets, it is certainly vanishingly scarce overall.
That makes some rather large assumptions; life may or may not consist only of organic life on planets. For example, since our view of the universe is hobbled by the speed of light, most of the planets in existence may have been transformed into Dyson spheres by AI (as one possibility) and we would have no way of knowing. Or, to bring up a classic idea, the observable universe may be an infinitesimally small part of a larger organism.
PS: Now, I suspect you're probably more or less correct. However, just because something seems to be the most likely possibility does not make it the only option.
After reading the article, I got fixated on why you were dividing time by space... finishing the sentence, I realized that wasn't what you were trying to say at all. :-)
I always felt the origin of the equals sign was wonderful: "I will sette as I doe often in woorke use, a paire of parralles, or Gemowe lines of one lengthe, thus : ==, bicause noe 2, thynges, can be moare equalle."
It's what permits those two lines to be essentially irreducibly perfect in describing "the same" that fascinates me. Any 'samer' and they would converge and simply be -.
The opposite of something. It's only when we can separate nothing, that we can begin to describe something.
(And is the universe the mathematical manifestation of everything that isn't nothing?) /stoner
The Hindus, like the Greeks, usually had no mark for addition, except that “yu” was used in the Bakhshali manuscript Arithmetic (which probably dates to the third or fourth century).
That's interesting. The Silk Road (http://en.wikipedia.org/wiki/Silk_Road) was in full swing at that time. This is pure guesswork, but multiple classical Chinese characters now pronounced yu (予,与,餘,逾,與) meant something close to addition. I wonder if there's a link?
There were a great many languages in operation there, and the Chinese were primarily interested in trade. It wouldn't surprise me at all if there were a regional patois language of business utilizing Chinese idiom that the subcontinent and others were able to borrow from.
Likewise, the borrowing may have gone in the other direction.
Note that some of the above symbols are not so dissimilar from the + symbol, allowing some minor scope for simplification.
Out of further interest, I had a quick check of Basque/Estonian/Finnish online dictionaries, those being the major outlying European languages. Basque alone seems to use the recognizable root ek... (one would suppose from a common root with the Sanskrit for one: eka). Estonian and Finnish seem far closer to Latin languages in most cases; I checked words like and, plus, sum, total. Lots of readily distinguishable slight variations, but no yu to be found. What about English union, unit and unity? Same root, I suppose. Dictionary.com claims union is from Latin ūn(us) (one) via Middle French, but that unit is only attested to 1642. It seems the Romans had some of that yu going, too.
I guess that's as far as exploration will go with western sources: apparently the Romans weren't big on composing etymologies. Perhaps Indian sources are useful though: Nirukta (Sanskrit: निरुक्त, IPA: [n̪irukt̪ə], explanation, etymological interpretation) is supposed to be one of the six Vedānga disciplines of Hinduism. One of the primary texts there is available at the Internet Archive: http://archive.org/details/nighantuniruktao00yaskuoft
Potentially related tidbits I found there: "Yosa (a woman) is derived from (the root) yu (to join).", "Yutham (herd) is derived from (the root) yu (to connect).", "Dasyu (demon) is derived from (the root) das, meaning to lay waste: in him the juices are wasted, or he causes works to be laid waste." It also has some good discussions of soma (a psychedelic draft).
OK, curiosity sated. Obviously there was some yu going on circa Central Asia at the dawn of history. Anything more specific seems like guesswork only. My take is it was to soma takers what PLUR is to early ravers. Party on man!
For Finnish, you may have missed the now deprecated word for summation, "ynnä", meaning "as one". It's not from Latin or Chinese, though - at least not recognizably so.
Interesting. It would probably be seen as a Proto Indo-European root rather than anything else. They were the folks tottering about Central Asia at the dawn of history. There's an amusing bit of video about them at the start of the BBC's History of India series, featuring a crazed Russian archaeologist who has been career-excavating early sites in Uzbekistan, and the neverending human quest for intoxication!
> The asterisk was used by Johann Rahn (1622-1676) in 1659 in Teutsche Algebra
Interesting. I've always thought that the use of an asterisk to denote multiplication was some sort of compromise owing to the lack of a "×" key on a typical computer keyboard. But it seems that there are much older precedents for it.
Meanwhile, using a slash to denote division is explained in the Fractions page.
> The diagonal fraction bar (also called a solidus or virgule) was introduced because the horizontal fraction bar was difficult typographically, requiring three terraces of type. An early handwritten document with forward slashes in lieu of fraction bars is Thomas Twining's Ledger of 1718 ...
Interesting. The close-parenthesis form of division from 1540 looks like the basis of the long division notation, i.e. 2)6 with a bar or vinculum extending from the top of the parenthesis over the 6. The long division form finally takes its modern shape in 1888. The transformation of that symbol seems somewhat obvious and natural.
I wonder if we are likely to see further changes, as modern typesetting should resist the variations that handwriting would introduce. I believe any change in the future will have to demonstrate improved readability before it will be accepted.
Also interesting: the form for long division that is used in the 19th century in the US: divisor ) dividend ( quotient, is very similar to the long division form that is still used in the Netherlands today: divisor / dividend \ quotient. Looking at Wikipedia, it seems all other countries have gone on to different notations, though.
We discussed this article in the office and among other things we found that even today the notation for division (when calculating it by hand) varies wildly:
i.e. here | acts as a low precedence division operator, ÷ as an ordinary precedence division operator like infix / and the 'closefix' rational notation 3/4 has the highest precedence in my programming language
n.b. in case you were wondering P v Q is used for Boolean Or.
Kind of weird that no consensus has emerged for multiplication (·, ×, *, or horizontal juxtaposition) or division (÷, /, or vertical juxtaposition with a horizontal line between) yet.
Clearly, the idea to put those symbols on calculator buttons didn't come from nowhere. I think our collective point is that there exists a separate tradition for using × and ÷.
÷ is the symbol for division since it symbolises a fraction, with the two dots taking the place of the numbers. I'd say this is why, as some people have noted, : can also be used for division in some contexts: it represents a ratio, which is again just a fraction.
The way I was taught it, though, is that ratios are rather different from fractions. You might say 2:1 of tea and honey, but ⅔ tea and ⅓ honey.
However, when describing only one quantity, some people will say 1:2 of honey while others might say 1:3 of honey.
What then happens when you have to split a long line of math?
What then happens when you need to multiply 2 by 2?
The answer to the first question is that the following line is begun with a multiplication sign.
The answer to the second question is that bold roman type is used for vectors (and often for other kinds of tensor, such as matrices), which overloads the semantics of dot and cross: they denote scalar products only when both of their arguments are in italic, and dot or cross products when an argument is in bold.
The / is actually the - from the other symbol and the .'s are placeholders for the numbers to the left and the right of the / so those are in fact pretty close.
I always thought the plus sign had to do with two entities meeting or being placed on top of each other, like a road crossing or a pair of sticks, and that the minus sign was just a spin-off of that sign, which also makes sense given the way Descartes used "..." according to the article.
The next logical question is, "Where did × and ÷ originate?" From there, I've always wondered how multiplication migrated from "×" to "∗". And recently, I've wondered whether we will migrate "-" to something else. (This last derives from having to annotate a range of numbers including positive and negative: when "-" is used for both range and negation, it causes confusion.)
p.s. I tried to post this to the original poster's blog, but was told "...your comment seems a bit spammy." Really? I'd say their Blogger spam settings are more than a bit off.
I always thought the * was used for multiplication to avoid confusing × with the letter X. We can usually distinguish them while writing equations on paper (I write x in cursive, for example).
* was used for multiplication because there weren't enough bits to fit every symbol everyone would have liked into ASCII, so compromises were made, including preference for symbols that could be 'overloaded' with multiple meanings in different contexts.
> Overall, what is perhaps most impressive in this story is the fact that symbols which first appeared in print only about five hundred years ago have become part of what is perhaps the most universal “language.”
They appeared just at the right time to be carried around the world by conquista and colonialism. Same goes for time notation, etc.
One of the interesting things is why are there 24 hours in a day, 60 minutes per hour, etc?
Mathematically, those values are useful because they can be factored in many ways. But who picked them?
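The "factored in many ways" claim is easy to check; a quick sketch listing divisors shows 24 and 60 allow an unusual number of even splits for their size:

```haskell
-- List every divisor of n by trial division.
divisors :: Int -> [Int]
divisors n = [d | d <- [1 .. n], n `mod` d == 0]

main :: IO ()
main = do
  print (divisors 24)  -- [1,2,3,4,6,8,12,24]
  print (divisors 60)  -- [1,2,3,4,5,6,10,12,15,20,30,60]
```

So a 60-minute hour splits evenly into halves, thirds, quarters, fifths, sixths, tenths and twelfths, which 100 (divisors 1,2,4,5,10,20,25,50,100) cannot match.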
I was at the British Museum one time, and they had an horological exhibit (clocks throughout the ages), so I asked one of the curators. He said it was the Babylonians that chose those values.
But since this thread is about notation, I should be asking who picked the colon (:) as the time separator that is generally used. And are there other conventions for this?
Medieval manuscripts are full of strange squiggles used to abbreviate words or word endings. It's almost like reading shorthand. Here is a short list for Middle English [1] but for Latin there were dozens if not hundreds. So I'm not surprised that + and - originate from that time.
There's an interesting note near the bottom about the ÷ symbol. Even today in the Netherlands, a symbol like -/- (like a percent sign, but with minus signs on both sides of the slash) is used as a kind of emphasized minus sign, especially in money calculations in a vertical layout where otherwise addition would be assumed. I don't know if this is used elsewhere, but it seems probable to me that it derives from the ÷ symbol mentioned in the text.
Interesting that there is no mention of ancient China, where money and accounting were invented.
The Chinese for buy and sell are 买 and 卖 respectively, the first indicating spending money and the second indicating receiving money.
Granted I don't know what I'm talking about, however I do know Chinese, and I believe these characters far predate the 15th century as do most Chinese characters.
Perhaps someone can elaborate on whether this is a sound theory.
Fun fact: in the one on the left, everything but the top part is a pictograph of a shell, which was used as a kind of status symbol, or possibly something roughly equating to our modern money, in many early societies. When Marco Polo arrived here in southwest China in the Yuan Dynasty (ie. just post Mongol invasion), shells were still in widespread use. In fact, Chinese literary records document special requests to the emperor to exempt the region from the national law requiring use of imperial money, made by the provincial governor they placed in the region, an Uzbek and son of the then-king of Bokhara who had ridden with the Mongol horde to take China. One of his father's later successors is the subject of one of the Prokudin-Gorskii images (the first colour images in the region), over here: http://pratyeka.org/prokudin-gorskii/the-emir-of-bukhara-191...
Chinese write it as 加; if you start making the strokes you get a + in the upper left. Considering this is over 6000 years old, it is probably the origin. My theory.
Edit (more evidence): "The plus symbol as an abbreviation for the Latin et, though appearing with the downward stroke not quite vertical, was found in a manuscript dated 1417 (Cajori)."
There is an assumption here that it's an abbreviation for the Latin et. But the downward stroke can be seen in the Chinese word.
[1] http://en.wikipedia.org/wiki/Plus_and_minus_signs#Alternativ...