gumby 3 months ago

This article is really just a gloss on Kaplan's book (which I have read so it stands out) with a bit of gratuitous randomness thrown in.

Much as I have a rather low-level atavistic desire to credit India with the zero/nil, there was so much exchange between Mesopotamia and the early Indus regions (just look at the idea of the alphabet going one way and then digits going the other) that the Sumerian "origins" are quite likely. More importantly, the Vedic tradition didn't give rise to the formalisms developed by the later Greeks and, centuries later, their Islamic students. Thus for a long time, scholarly dissertations from the subcontinent on mathematics, philosophy etc tended to be essays and explorations of conjectures, which makes pinning responsibility hard to do the way you can say "Wiles proved Fermat's Last Theorem".

Personally I find invention requires so much history and intertwined communication that the idea of "inventor" is kind of bogus anyway.

BTW in case this sounds like I'm dissing ancient Indian scholars: you see this in the early days of any scientific field: neuroscience in the early 20th century, cognitive science in the mid-50s through the 70s (at least), etc. In fact most of contemporary ML just has a light layer of formalism painted on too. It feels like fields need names, but only really get them when they have attained some early level of abstraction and emerging rigor.

Sorry, that moved on beyond zero!

  • jacobolus 3 months ago

    The real problem is there are too many people who treat mathematical/scientific history as some kind of olympic games, where the goal is to rack up medals for your preferred team (country, region, religious group, ...), with the result that any discussion becomes counterproductively politicized.

    Trying to argue about whether one tribe's or another tribe's 50-generations-ago ancestor was the first one to do this or that thing seems to me like completely missing the point, when all of these steps were part of a long and gradual historical process, building ideas and tools up over centuries. (Similarly, it's annoying how many debates center on various ancient figures' ethnicity or religious affiliation, usually without much evidence.)

    To anyone who tries researching ancient (or more recent) mathematics, it's clear that there usually isn't a single aha moment changing everything, but a broader culture that gradually evolves. We can see different flavors/aspects of a concept like "zero" which were developed at different times and places (China, Mesopotamia, India, Greece, North America), none of which really draws any obvious line in the sand.

    With regard to Indian innovations, however, it seems pretty clear that written arithmetic per se (performed on a "sand board") was developed there, as credited by all of the oldest extant texts on the subject from writers in Arabic (which call it something like "Indian arithmetic" or "Indian numbers"). Written arithmetic was then substantially elaborated in the Islamic world with a switch to using pen and paper, before making its way to Europe where it eventually kicked off the development of modern mathematical notation. The earlier Mesopotamian/Egyptian/Greek/European traditions, as well as the Chinese tradition, were generally based on using finger counting or some form of counting board, with written numerals used as a serialization format rather than a calculation tool. Arguably the invention and spread of physical materials like cheap good quality paper, writing implements, ink, and eventually printing presses were as important as the theoretical developments.

    • mkoubaa 3 months ago

      What adds to the absurdity is that an ancient spartan or dalmatian would have seen folks like Socrates as foreigners, and so people who descend from ancient Spartans would have no reason to feel proud of the achievements of Greek philosophy were it not for the 20th century myth of the nation state. The situation is even worse in India.

      The intellectually honest account is that great people came at random from accommodating societies and cultures with the requisite technology and opportunity for those geniuses. There are countless stories of kings beheading inventors because their inventions risked offending the social order, the gods, their pride, or some such. Just as there were in all likelihood millions of geniuses all over the world who were determined at birth to be subsistence farmers or slaves.

  • dyauspitr 3 months ago

    Isn’t India’s contribution the base 10 number system and using zero for the first time to represent numbers like we do today? I don’t think having a concept of nothing means much otherwise.

    • jacobolus 3 months ago

      Base ten number systems were invented repeatedly. The oldest positional number system we know about is the (base 60) system from Mesopotamia. Other ancient people also used positional base 10 systems for calculation, including in China, Egypt, and Greece, but their written numerals were not positional in the same way. India is the first place where we have evidence of a base-ten number-writing system with 10 symbolic digits whose contribution to a number depends on the place.

briankelly 3 months ago

Interestingly enough, the oldest recorded use of the symbol zero in Mesoamerica is older. From https://baas.aas.org/pub/2021n1i336p03/release/2 - “The oldest representation of the Mesoamerican zero, dating from the year 31 BCE, is found in Stela C in the ancestral Olmec site of Tres Zapotes in Veracruz, Mexico.”

  • adrian_b 3 months ago

    Older than the use of zero in India, but hundreds of years later than the use of zero in Mesopotamia.

    In any case, this must have been an independent invention of zero.

    • kragen 3 months ago

      unless it was brought over by the flying saucers

      (now where did i put my aluminum foil)

adrian_b 3 months ago

Speaking about the "invention of zero" is not a good choice of words.

All languages have had words for "zero", for at least a few thousand years.

For instance English has inherited "null" from Latin, where its literal meaning was initially "not even a small one" but later became "zero". Latin also used the word "non-null" very frequently, with the meaning "one or more", and had a few other words that could express the quantity "zero". The same was true of Ancient Greek and of the languages for which even older records exist.

What the parent article intends to discuss is the invention of a purely positional system for writing numbers. Before the invention of such a system for writing numbers, the words meaning "zero" were used in all languages only for the unique quantity "zero". They were not used as components of the numerals used to name bigger numbers.

The need to write very big numbers for accounting or computational purposes made it desirable to invent a system less cumbersome for such numbers than the one used for the spoken numerals.

That was the positional number system, where a small set of symbols is sufficient to write even very big numbers. Any purely positional number system needs some kind of symbol for zero, which must be usable in any position.

When the sexagesimal Mesopotamian number system began to be used positionally, a symbol for "zero" had to be added. Nobody set out to invent a "zero" symbol directly; they just wanted to reuse the same symbols in all positions, and sooner or later someone understood that adding a symbol for "zero" was the solution to this problem.

  • kragen 3 months ago

    > When the sexagesimal Mesopotamian number system began to be used positionally, a symbol for "zero" had to be added

    the sexagesimal mesopotamian number system was used positionally from the beginning (that's what was sexagesimal about it) and, as the article explains, did not have a symbol for zero in common use, or evidently at all for millennia. the early zero symbol mentioned in the article is from a tablet from 0700 bce or later, which is only 2700 years ago. at that point positional sexagesimal mesopotamian numerals were already about 1000–2000 years old. what popova doesn't explain is that they often used an empty space for zero instead

    • srean 3 months ago

      Having a symbol (for zero) is half the solution. It takes another leap to treat zero as a number that participates in the usual arithmetic operations.

      From what I understand, there were a variety of non-standardized symbols or place holders used by the Mesopotamians (space being one). Zero was not treated as a number, but as a special case in their evaluator.

    • jacobolus 3 months ago

      > sexagesimal mesopotamian number system was used positionally from the beginning

      This is oversimplified. The sexagesimal place-value system developed gradually over centuries+ from various inconsistent unit systems for length, weight, fluid volume, counting, ..., which were eventually (partially) standardized. This long development took place in tandem with the development of cuneiform writing, both ultimately originating in a record-keeping system involving clay tokens sealed in clay envelopes. For a detailed version, Eleanor Robson's book Mathematics in Ancient Iraq is excellent.

      • kragen 3 months ago

        that's true, but to the extent that at any given time it was purely sexagesimal, it was also positional, wasn't it?

        the book recommendation is greatly appreciated

        • jacobolus 3 months ago

          I think it's fuzzier than that, both as a historical process (stretching across more than a millennium of development), and as a tool at any particular point in time. Here's some quotation from Robson:

          The very earliest literate accounts – from late fourth-millennium Uruk – used commodity-specific metrologies with a variety of different numerical relationships between the units. Those original metrologies continued in use throughout the third millennium and beyond, whether essentially unchanged (for instance areas), or undergoing periodical reform (for instance capacities). They continued to be written with compound signs, which bundled both quantity and unit into a single grapheme, just as the preliterate accounting tokens must have done. Gradually, over the course of the later third millennium, scribes began to write those compound metrological numerals with a cuneiform stylus, rather than impressing a round stylus into the clay in imitation of accounting tokens. Throughout the Sargonic period, and even into the early Ur III period, impressed and incised number notations appear side by side on the same tablets – a phenomenon that has not yet been systematically documented or explained.

          [...] But not every new metrological unit was sexagesimally structured. The smaller length measures, first attested in the Early Dynastic IIIb period, divide the rod into 2 reeds or 12 cubits, and the cubit into 30 fingers. None of these newly invented units of measure was recorded with compound metrological numerals, but always written as numbers recorded according to the discrete notation system followed by a separate sign for the metrological unit. This has implications for our understanding of the material culture of early Mesopotamian calculation, as well as for the shifting conceptualisation of number.

          [...] At some point early in the Ur III period, the generalised sexagesimal fraction and the generalised unit fraction were productively combined to create a new cognitive tool: the sexagesimal place value system. This calculating device took quantities expressed in traditional metrologies and reconfigured them as sexagesimal multiples or fractions of a base unit, often at convenient meeting point between metrological systems.

          [...] The SPVS temporarily changed the status of numbers from properties of real-world objects to independent entities that could be manipulated without regard to absolute value or metrological system. Calculations could thus transform numbers from lengths into areas, or from capacity units of grain into discretely counted recipients of rations, without concern for the objects to which they pertained. Once the calculation was done, the result was expressed in the most appropriate metrological units and thus re-entered the natural world as a concrete quantity.

          • kragen 3 months ago

            this makes it sound like robson agrees that there's no purely-sexagesimal non-positional transitional phase like you seemed to be positing?

            • jacobolus 3 months ago

              As far as I understand there wasn't ever a "purely sexagesimal" phase. Some of the units were sexagesimal or sexagesimal-ish and others weren't, all the way along (but gradually trending toward more that were, as various unit systems were reformed). The sexagesimal system (per se) arose after literally centuries of prior experience with some of the unit systems being more or less sexagesimal, accompanying a transition in writing from using marks that were a concrete representation of clay tokens of various shapes toward using cuneiform. Even once there were people using sexagesimal calculations as an intermediate representation, they had to use tables for converting all of the non-sexagesimal units to sexagesimal quantities in terms of a specific unit of each type, and vice versa.

              The use of sexagesimal calculation probably had something to do with the use of a type of counting board about which we unfortunately know few details (it was called the "hand" and IIRC could generally support values with 5 sexagesimal places, but anything we know about it comes from scattered textual references and some inferences drawn from calculation mistakes; Robson's book doesn't talk about this subject much but there are some nice papers about it elsewhere).

              • kragen 3 months ago

                aha, i see! thank you very much!

  • vishnugupta 3 months ago

    > That was the positional number system, where a small set of symbols is sufficient to write even very big numbers.

    Just the other day I was thinking about it and realised that the positional number system enables us to represent infinitely many numbers with a finite set of symbols. Whoever (person or community) invented it was a genius, as positional numbering unlocks so many further inventions down the line.

    • kragen 3 months ago

      this is also true of clauses in grammar. the use of clauses makes it possible to write an infinite number of sentences using a finite number of words, each of which is drawn from a finite vocabulary, but whose assemblages are more infinite than the stars in the sky which so drew the attention of archimedes when he approximated the finite number of grains of sand that would be required to fill up the observable universe, a calculation which unfortunately he got low by several orders of magnitude due to an unfortunate lack of evidence for the hypothesis that the stars were other suns, not to mention the idea that some of the nebulae in the sky were entire galaxies of other stars, a fact which wasn't known until only a century ago

      the previous sentence contains 125 words, and it should be evident that it could be extended indefinitely in any known human language (except possibly pirahã) without doing any violence to the rules of grammar, though perhaps great violence to the canons of courtesy to readers. if we use shannon's early estimate of 11.82 bits of entropy per english word, in english there are about 2¹⁴⁷⁸ ≈ 10⁴⁴⁵ perfectly unremarkable sentences of that precise length, a number which (it should be evident) grows exponentially with the sentence length
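      a quick sanity check of that arithmetic (taking shannon's 11.82 bits/word estimate at face value; variable names are mine):

```python
import math

# 125 words at ~11.82 bits of entropy each (Shannon's early estimate)
words = 125
bits_per_word = 11.82
total_bits = words * bits_per_word  # 1477.5 bits of entropy

# convert 2^total_bits into a power of ten
exponent_of_ten = total_bits * math.log10(2)

print(f"2^{round(total_bits)} ≈ 10^{round(exponent_of_ten)} sentences")
```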

      so, while i agree this concept is genius, it is part of the invention of language as we know it, and no isolated human tribe without language has ever been discovered. it probably dates to so-called behavioral modernity, at least 50000 years ago—probably longer than that

    • Chinjut 3 months ago

      You can already represent infinitely many numbers with a finite set of symbols without positional notation, in unary. You can write "1 + 1 + 1 + 1" or "This number of stars: ****" or whatever. This idea (make N marks to represent the natural number N) has surely been understood about as long as counting has been understood.

      The advantage of positional notation is greater efficiency at representing large numbers.

      • vishnugupta 3 months ago

        Agreed.

        I forgot to mention that positional numbering’s space complexity is logarithmic.

    • feoren 3 months ago

      No, Log10(infinity) = infinity.

      It allows us to represent numbers using log symbols. Symbols required = O(log10(N)). Or put another way, the max number we can represent with S symbols is exponential in S. N = O(exp(S)).
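      A quick sketch of that growth rate (plain Python; the helper name is just for illustration):

```python
import math

# In positional base-10 notation, writing n takes floor(log10(n)) + 1 digits,
# while a unary tally needs n marks -- exponentially more compact.
def digits_needed(n):
    return 1 if n == 0 else math.floor(math.log10(n)) + 1

for n in (7, 99, 1_000_000):
    # sanity check against the decimal string representation
    assert digits_needed(n) == len(str(n))
    print(f"{n}: {digits_needed(n)} digits instead of {n} tally marks")
```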

      • fwip 3 months ago

        They are speaking of a finite set of symbols (the digits 0 through 9), not the number of glyphs in the representation of an arbitrary number.

        • feoren 3 months ago

          You can represent any arbitrarily large number with a single symbol without positional notation, though. Let's use *. The number 3 is ***. The number 9 is *********. That interpretation of GP's comment (which, on re-read, is probably correct) makes it even more trivially false. Another comment by the same poster clarifies that I was actually responding to what he meant.

  • mensetmanusman 3 months ago

    Null is closer to ‘empty set’ than to zero, and the two concepts are similar but different enough to be distinct.

    • adrian_b 3 months ago

      In Latin, "null" was not a noun (i.e. substantive), but an adjective applied to nouns.

      It was used in precisely the same word contexts as the words for "one", "two", "three" etc. and in those contexts you could substitute any cardinal numeral. Like any cardinal numeral, it could be used to answer questions about how many things are in a certain place.

      So it was really the number "zero".

      For the empty set, the most appropriate Latin word was the noun "nihil" ("nothing"), sometimes contracted to "nil" (with long "i") hence the NIL of LISP for the empty list.

      So the concepts of "zero" and "empty set" were distinguished in Latin and also in the other known ancient languages.

      Some modern programming languages use "null" in a wrong way, when they should have stuck to the NIL of LISP. A null integer or floating-point number denotes the quantity "zero", but a null pointer is not a quantity. A null pointer points to nothing, so it denotes the object NIL.
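      The distinction is easy to show in a language that keeps the two apart; a small sketch in Python, whose None plays the role of LISP's NIL:

```python
# 0 is the quantity "zero"; None (like LISP's NIL) denotes no object at all.
count = 0    # a number: we counted the eggs and found zero of them
box = None   # not a number: there is no box at all

assert count == 0 and count is not None
assert box is None and box != 0

# The empty collection is yet a third thing: a box with zero items in it.
empty_box = []
assert empty_box is not None and len(empty_box) == 0
```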

    • dayjaby 3 months ago

      In German, it's the same. I hope you are not talking about null in coding? Because I'm pretty sure humans 2000 years ago did not know any programming code, or even empty sets.

      • rdtsc 3 months ago

        Empty set idea doesn’t seem so abstract. A bag of stuff can have nothing in it, so it’s empty. Combining and splitting up containers of items would have been an everyday occurrence.

    • kazinator 3 months ago

      The word "null" in "null set" refers its Lebesgue measure being zero.

      • badkitty99 3 months ago

        Null sets in and of themselves have nothing to do with Lebesgue integrals. Null sets are definitely not defined by Lebesgue integrals; you might be thinking of a set of measure zero, which has to do with Lebesgue measure.

        • kazinator 3 months ago

          > has to do with Lebesgue measure

          As in, the only thing Lebesgue that my comment mentions.

    • kreetx 3 months ago

      I'm just nitpicking, but it's unlikely that the inventors had both the empty set and zero formalized to a degree that would make them distinct.

      • adrian_b 3 months ago

        While the empty set and the number "zero" are distinct concepts, the relationship between them is much closer than between any other sets and numbers.

        The cardinal numbers are equivalence classes of sets. For any cardinal number other than "zero", the equivalence class of that number contains a huge number of sets (in fact infinitely many).

        For "zero", the equivalence class contains only a unique set, the empty set. Because of this one-to-one correspondence between the empty set and "zero", they may be interchanged in many contexts without causing any ambiguities.

        • kreetx 3 months ago

          Right. What I was aiming at was that when someone invents one of these, they are probably thinking about the other, too, without the set theoretic rigor you recite.

  • Someone 3 months ago

    > All languages have had words for "zero", for at least a few thousand years

    > What the parent article intends to discuss is the invention of a purely positional system for writing numbers

    I don’t see how positional is required there. Even if you just tally things, computations may produce zero, and you’ll want to write something to indicate that you made the calculation, and didn’t abandon it halfway through.

    • adrian_b 3 months ago

      That matches in writing the meaning of the word "zero", "null" or whatever word is used in the spoken language.

      So it is not a new invention.

      When the positional system of writing numbers was invented, the symbol for "zero" had to be used in a new way.

      In the spoken language, one says "one hundred and five", without using any "zero" word.

      In a non-positional writing system, one would mimic the spoken language, writing e.g. "CV". Symbols for "0" are not used inside any other number, but only for the unique quantity "zero".

      In a positional system, one writes "one hundred and five" as "1" "0" "5", using a "0" in the appropriate positions, which allows the reuse of the symbol "1", instead of having to invent special symbols for hundreds.

      For very big numbers, the economy of symbols brought by the positional system is great, so its invention has been very important.

      The positional system did not need to invent a new word or symbol for "zero", but it had to invent a new way to use "zero", inside the strings used for writing big numbers.
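      The economy is easy to see mechanically: repeated division by the base yields the digits, and "0" fills any position that the spoken numeral skips. A minimal sketch (the function name is my own, for illustration only):

```python
# Convert n to its positional digits in a given base.
# "one hundred and five" skips the tens, so a 0 must hold the tens place.
def to_digits(n, base=10):
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    return digits[::-1]  # most significant position first

print(to_digits(105))      # -> [1, 0, 5]: the placeholder zero in the tens place
print(to_digits(105, 60))  # -> [1, 45]: the same idea in Mesopotamian base 60
```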

      • Someone 3 months ago

        I disagree. The Romans had the word “nulla” to (loosely) mean “nothing”, but for them the smallest whole number was 1 (https://en.wikipedia.org/wiki/Integer#History: “Historically the term was used for a number that was a multiple of 1, or to the whole part of a mixed number. Only positive integers were considered, making the term synonymous with the natural numbers”).

        And yes, to introduce a positional number system, you need some way to indicate “there is a position here, but there’s nothing there”, but you don’t need a positional number system to extend the integers by adding the novel concept of zero (and then you need a way, preferably compact, to write them).

        • empath75 3 months ago

          > “Historically the term was used for a number that was a multiple of 1, or to the whole part of a mixed number. Only positive integers were considered, making the term synonymous with the natural numbers”

          In fact, they didn't even consider 1 to be a number, let alone zero -- at least Aristotle didn't, and his definition was the accepted one in Europe for many centuries.

          Though I think you need to sort of separate out what philosophers and mathematicians thought about numbers from what regular people did. I think people obviously had an intuitive understanding of zero and one as quantities, even if it wasn't formally defined that way by philosophers for thousands of years.

        • adrian_b 3 months ago

          As I have said, "null" does not mean "nothing". The latter is a substantive noun, while the former was an adjective in Latin, where such words were not interchangeable. "Nihil", i.e. "nothing", could be the answer to the question "What is in that box?". On the other hand, the question "How many eggs are in that box?" could be answered with "nullum ovum" or "nulla ova" ("no eggs"). In English, "nothing" and "no", when the latter means "zero", are also not interchangeable.

          You have linked to a discussion of the word "integer", which has never been used by the Romans. "Integer" started to be used for numbers only in the late Medieval Latin.

          As I have explained, the word "null" (i.e. nullus/nulla/nullum) had a grammatical distribution identical to that of any other number, with no difference from 1, 2, 3 etc.

          Following the Greek tradition, the Roman grammarians did not classify their words based on their meanings or actual grammatical roles, but based on their kinds of morphological flexion (i.e. declension or conjugation).

          For historical reasons, the Indo-European numerals from one to ten had a special declension, which was different from the declension of any other words.

          Because of that, the Greeks and the Romans classified the numerals from one to ten and any other words derived from them into a special subclass of the nouns, the numerals.

          The words used for "zero", both in Greek and in Latin, are much more recent than the words for "1" to "10", so when they have been coined they have received the regular adjectival declension, instead of the archaic numeral declension. This difference in declension prevented the classification of "null" as a numeral, even if it belonged to a group of words about which various ancient grammarians have expressed doubts about how they should be properly classified.

          The ideas of the ancient grammarians about the relevance of declension for word classification do not matter for the classification of "null" from the point of view of modern grammar and mathematics. Even in antiquity, there were people like Aristotle who laid the basis for classifying words by meaning and grammatical role rather than by flexion (in the so-called "Categories"), even if, at least in the surviving works, this was not applied to a detailed analysis of a language like Ancient Greek.

          As the well-known quotation about ducks goes: if one reads the surviving Latin texts, there is no difference in usage between "nullus/nulla/nullum" and any other cardinal number, i.e. all the cardinal numbers including "nullus/nulla/nullum" appear in the same word contexts, where they are interchangeable; therefore "nullus/nulla/nullum" is a cardinal number, based on how the Romans were using it, regardless of whether Priscianus would have agreed.

    • vishnugupta 3 months ago

      > Even if you just tally things, computations may produce zero

      It may produce zero but not necessarily.

      Tallying things is more likely to produce a notion of "alright, no one owes anyone anything", and there's no need to track this "nothing". Only when things don't tally is it required to keep track.

      Because there's no use persisting this "nothing"-ness over space and time, it's unlikely to have been abstracted further to zero.

  • rokisen157 3 months ago

    Null means nothing, but not zero. In the Tamil language it is called Paazh; zero is called Suzhiyam.

osigurdson 3 months ago

I don't understand how the concept of zero could have taken so long for humans to discover: "I had two apples, I ate one and gave one to a friend". They must have had a word for the concept of running out of things as soon as any level of cognitive ability emerged.

  • xigoi 3 months ago

    What took long to find out is that zero is a number like any other. That having no apples is just a special case of having some quantity of apples, not something completely separate.

  • mkoubaa 3 months ago

    Some living languages of Amazonian tribes don't have a word for left or right. Their directions are downriver, upriver, towards the river, and away from the river.

    The language we speak and the concepts we are taught literally form structures in our brain, and we can't understand what it would be like not to have them, because those structures are part of the organ of understanding.

    • osigurdson 3 months ago

      I don't think the analogy holds. River-based directions can be mapped to NESW. It doesn't follow that the concept of "hey, I ran out of stuff" would not exist in their minds. It is impossible that they didn't understand this concept.

      • avz 3 months ago

        Well, they may have understood the concept of "running out of stuff" and yet not realized it naturally belongs with numbers (of which they would have known other examples, like one and two). I can imagine ancient folks grouping the notion of "running out of stuff" with ideas such as "hungry", "empty", "dead" etc. Formation of the more abstract concept of zero-the-number is certainly a few cognitive leaps further down the line.

      • jacobolus 3 months ago

        There are cultures where the basic directions are north, east, south, west, which they use pervasively in preference to forward/back/left/right. People from such a culture are very good at keeping track of their absolute orientation and are uncomfortable with body-relative directions.

      • mkoubaa 3 months ago

        "Hey I ran out of arrows" is a different construct than "there is such a thing as a number that is independent of particulars like arrows and zero is one of those numbers"

  • avz 3 months ago

    The perspective of modern computer science (with its zero-based indexing etc) or modern algebra (with its need for a neutral element for addition) certainly makes it clear that zero is just another integer. However, without those perspectives I doubt it is so obvious.

    For example, people normally count things by starting from one, so, at least following this usual procedure, counting to zero is technically speaking impossible. Also, we can't distinguish between zero apples and zero oranges, but we can tell two apples and two oranges apart.

    In fact, even with the perspective of modern algebra zero remains special. For example, it is the only element of a field without a multiplicative inverse.

    I'd be surprised if zero didn't take any extra effort to discover. It's clearly different than other integers.

  • dahart 3 months ago

    That’s right; the concept of nothing was already there, both as an idea, as well as in written language. It’s the use of a written numeral zero for arithmetic, and the implications of using zeroes in algebra that this article is discussing.

plasticeagle 3 months ago

Zero is important because it is the first abstraction. If your notion of numbers includes the idea of zero as a number, then you have broken through the first intellectual barrier of mathematics. Without this, it remains tempting to still consider numbers as existing for the purpose of counting _things_.

But with zero, counts of different things converge on the same idea. No matter what things you were counting, if you have zero of them, you have the same notion. And so you take a step towards the idea of a number being a concept in its own right, rather than existing purely for the purpose of counting or measurement.

It is the same sort of conceptual freedom that allows you to do things like add a number to a square. To deal with an equation like x + x ^ 2 = 0. If you're stuck with numbers "meaning" something beyond themselves, then you'll never add x to x^2. One is a length, the other an area. They are different objects.

This intellectual leap is one that must be made by all students of mathematics - and many young people do not.

  • dahart 3 months ago

    > Zero is important because it is the first abstraction

    I’m not sure what this even means, but Sumerians had abstract mathematics, in addition to art and literature which are abstract by their nature. They were using numbers in the abstract sense before the number zero was named, and so while it seems like a logical and tempting narrative that naming zero is what abstracted numbers, history doesn’t seem to support this particular post-facto rationalization. Naming zero is very important in the history of math, it just isn’t the first abstraction.

    https://en.wikipedia.org/wiki/Sumer#Mathematics

    • plasticeagle 3 months ago

      You may very well be entirely right.

      And I think you know exactly what I meant, because you immediately countered with some historical evidence around those first abstractions.

      So, thank you, and I will read much more about Sumerian mathematics with great interest.

  • kragen 3 months ago

    regardless of whether it makes sense to say that treating zero as a number was 'the first abstraction' or not†, ancient mesopotamians did not treat zero as a number. even noble fibonacci didn't consider it as a number or even a digit:

    > The nine Indian figures are: 9 8 7 6 5 4 3 2 1. With these nine figures, and with the sign 0 (...) any number may be written.

    i'm not sure when it became conventional to consider it a number rather than the absence of one; it might not have been until the early modern era

    https://news.ycombinator.com/item?id=40917674 suggests that european mathematicians still hadn't agreed that 1 was a number until the early modern era

    ______

    † it doesn't

  • roywiggins 3 months ago

    I am pretty sure early European mathematicians did treat polynomials as areas or volumes: x can be just a rectangle x units long in one dimension and 1 in the other, x^2 is a square, etc. This meant they had to go through contortions to avoid negative coefficients, since those made no geometric sense. If a coefficient would otherwise be negative, it had to move to the other side of the equality and be solved using a different method. Instead of a single quadratic formula they needed several different cases depending on the exact form of the polynomial.
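    As a concrete sketch of that case split (my reconstruction in the style of al-Khwarizmi's case analysis, not a quotation from any particular text): with only positive quantities b, c admitted, the single modern equation x^2 + bx + c = 0 with signed coefficients fragments into separate forms, each with its own solution recipe:

    ```latex
    % With b, c > 0 and only positive quantities admitted, one modern
    % signed-coefficient equation fragments into separate cases:
    \begin{align*}
      x^2 + bx &= c      &  x &= \sqrt{(b/2)^2 + c} - b/2, \\
      x^2 + c  &= bx     &  x &= b/2 \pm \sqrt{(b/2)^2 - c}, \\
      x^2      &= bx + c &  x &= b/2 + \sqrt{(b/2)^2 + c}.
    \end{align*}
    ```

    Allowing zero (and negative) coefficients is exactly what collapses these into the one formula taught today.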

  • adolph 3 months ago

    > Zero is important because it is the first abstraction

    And thus was born the everlasting confusion between cardinal and ordinal numbers.

  • meroes 3 months ago

    Why is that the first abstraction? I would think at least one of these three earlier developments would be:

    1) deductive geometry of Thales, where we could now prove things independent of the physical world (abstracting away from the physical)

    2) Plato’s remarks on incommensurability (proto irrational numbers) being something real but not physical, because no physical process could prove to the mathematician that two lengths really have no common unit measure. Here the abstraction is again away from physical means.

    3) infinity of numbers. Abstracting away from large but finite collections. We can only ever survey finite collections physically, again is an abstraction.

  • njrc 3 months ago

    > Zero is important because it is the first abstraction.

    Aren't numbers themselves an abstraction?

    • plasticeagle 3 months ago

      Initially, no. They are a mechanism for counting, or a way of recording measurements. Many young people, when trying to learn mathematics, do not take the necessary step beyond these ideas.

      They are not helped by educators insisting on presenting "word problems" when teaching maths. To get to the next level, you need to break the connection between numbers and the "real world". I've always felt that the number zero represented the first step that humanity took in this journey - and it's a step that every human also needs to individually take if they want to learn maths.

  • traject_ 3 months ago

    Yeah, a lot of these articles conflate the use of zero as a placeholder, as a numeral, and as a number. But the really critical conceptual step is the last one: using zero like any other number (mostly) in arithmetic.
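    A rough illustration of why even the placeholder step alone matters (a minimal sketch, not anything from the article): in a positional system with no symbol for an empty place, distinct numbers collapse to the same digit sequence.

    ```python
    def to_base(n, base):
        """Digits of n in the given base, most significant first."""
        if n == 0:
            return [0]
        digits = []
        while n:
            digits.append(n % base)
            n //= base
        return digits[::-1]

    # In base 60, the placeholder 0 is what keeps 3601 distinct from 61:
    print(to_base(3601, 60))  # [1, 0, 1] -> 1*3600 + 0*60 + 1
    print(to_base(61, 60))    # [1, 1]    -> 1*60 + 1
    # Drop the zeros, as in early cuneiform practice, and both read "1, 1":
    print([d for d in to_base(3601, 60) if d != 0])  # [1, 1]
    ```

    Treating 0 as a full number in its own right (something you can add, multiply, and divide by, or fail to) is a further step on top of this.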

constantcrying 3 months ago

I am very unconvinced by this "history of zero". Definitely the Greek geometers were aware of that concept, they just expressed it geometrically not numerically.

Putting the concept at some specific geographic location seems very strange to me. To me there is no doubt that each mathematical culture had some notion of it, just woven into the conceptions of that particular culture.

Of course then there is zero as a symbol, and positional number systems. The first seems very uninteresting, the latter definitely more so; but the question of how to express numbers is more interesting and broad than the history of just one component of it.

  • empath75 3 months ago

    The question is when they used it or understood it _as a number_. The Greeks didn't even consider 1 to be a number, since "number" implied a "multitude". It was controversial whether 1 was a number until the 1400s or 1500s, let alone zero.

    • taeric 3 months ago

      This feels wrong, at face value. Even in modern English, if I say I have a number of things, I almost certainly don't have just one. This does /not/ mean that one is not a number. It /does/ mean that the word "number" can have several uses.

      • wcarey 3 months ago

        Euclid distinguishes the unit (VII, Def.1) from number -- a multitude composed of units -- (VII, Def.2). His definitions of prime and composite numbers (VII, Def.11 and Def.13) clearly exclude one from the group of numbers; otherwise every number would be composite.

        • taeric 3 months ago

          I think this is good to argue that my assertion is likely too strong. I fear this is close to arguing that early programmers were not familiar with map/flatMap. They did not discuss it as a first class thing, sure. Was it completely alien to all practitioners? I find that harder to swallow and it is likely that we are debating methods versus functions completely removed from the context in which the words were largely used.

      • empath75 3 months ago
        • taeric 3 months ago

          But this doesn't contend with my point? A sibling comment brought up confusion of cardinal and ordinal numbers. In modern English, I challenge you to find a good understanding of the difference outside of advanced practitioners.

          I think it is fair to say that my statement was too strong, not that it was wrong. My assertion would be that it is more complicated, and that almost certainly a lot has been lost in translation over the years.

    • kragen 3 months ago

      thank you! i was wondering about that

  • meroes 3 months ago

    Wasn’t 2 the smallest actual number to the Greeks? I think it’s clear from the history that the Greeks did a TON of math but missed this number system and abstraction. And if you say they still understood zero but didn’t have a number system to go along with it (they didn’t have Hindu-Arabic numerals), well, a better number system provides so much more math that the argument doesn’t have much oomph. There’s no doubt the Hindu-Arabic numeral system, of which zero was a part, was the mathematical development of its time, bar none. Did anyone really understand zero before it could do all these useful things in equations? Doubtful. And that’s the most charitable take for the Greeks regarding zero.

  • kouru225 3 months ago

    I’m in the process of reading this article, but I recently read Zero: The Biography of a Dangerous Idea by Charles Seife (which is a great book) so I feel like I can explain:

    The Greeks were ideologically opposed to the number zero. Aristotle outright refused to acknowledge the existence of zero and of infinity. The Greeks were aware of the idea of zero, and they even used it when they calculated with the Babylonian number system (which used zero as a placeholder), but they always converted the numbers back into their own system and stubbornly refused to acknowledge its existence.

    The fact that the Greeks saw geometry and math as interchangeable was their weakness here. There’s no way to represent the number 0 geometrically, but the Greeks weren’t gonna give up their belief in Geometry because it provided them with social and political power.

    Pythagoras and Aristotle believed in a religious philosophy with logos at the center. Logos can be translated as “thought” or “word” (as it is in the Bible), or it can be translated as “ratio”; they saw all of these as one (the Latin translation of the Greek “logos” is “ratio”). The ratio of numbers was thought to be the underlying mechanism that proved the order of the universe (which naturally cast the nobility as orderly and the peasantry as chaotic). This was a profoundly powerful sociopolitical tool that ended up spreading all across the world, because Aristotle’s student just happened to be the greatest conqueror of the era: Alexander the Great.

    Anything that threatened the philosophy of logos was suppressed, violently. Hippasus and Zeno were both murdered for the crime of talking about irrational numbers and infinity. Zero was one of these threats: 1:0 = infinity, 10:0 = infinity, anything:0 = infinity. This was not logos, and therefore it was suppressed.

    This philosophy extended beyond mathematics into the realm of astronomy and, weirdly enough, music (at the time, Pythagoras was actually most famous for his discovery of the golden ratio using an instrument called the monochord, a legend that seems to be false but nonetheless made him very famous). This astronomical belief system was later attributed to Ptolemy. The philosophy was then transplanted into Christian theology, and as a result it took centuries for the monks to accept the existence of zero and infinity. We even have cases of religious figures persecuting mathematicians over zero and infinity as late as the 1800s.

    • jacobolus 3 months ago

      > The Greeks were ideologically opposed to the number zero.

      I don't think this claim is supportable, certainly not as such a broad generalization. Do you have a specific statement clearly attributable to an ancient Greek author to that effect?

      > Greeks saw geometry and math as interchangeable

      You're going to have to define these terms more explicitly. I don't think this statement is right. Ancient Greeks spent quite a lot of effort studying areas of what we now call mathematics but which were not geometrical per se.

      > Greeks weren’t gonna give up their belief in Geometry because it provided them with social and political power

      Any claim that geometers in general had social or political power owing to their mathematical work seems exaggerated. While a couple of geometers happened to incidentally also be local political leaders (e.g. Eudoxus), geometers explicitly lamented how little their contemporaries cared about their work and how few colleagues they could find to share it with.

      > Aristotle outright refuses to acknowledge the existence of zero and of infinity.

      This seems like a reductive summary. Aristotle had a pretty sophisticated idea about this, one that finds echoes in modern mathematics. Here is Aristotle:

      > Now there is no ratio in which the void is exceeded by body, as there is no ratio of 0 (οὐδέν) to a number. For if 4 exceeds 3 by 1, and 2 by more than 1, and 1 by still more than it exceeds 2, still there is no ratio by which it exceeds 0; for that which exceeds must be divisible into the excess + that which is exceeded, so that 4 will be what it exceeds 0 by + 0. For this reason, too, a line does not exceed a point-unless it is composed of points!

      Alternate translation:

      > But the nonexistent substantiality of vacuity cannot bear any ratio whatever to the substantiality of any material substance, any more than zero can bear a ratio to a number. For if we divide a constant quantity c (that which exceeds) into two variable parts, a (the excess) and b (the exceeded), then, as a increases, b will decrease and the ratio a : b will increase; but when the whole of c is in section a there will be none of c for section b; and it is absurd to speak of 'none of c' as 'a part of c.' So the ratio a : b will cease to exist, because b has ceased to exist and only a is left, and there is no proportion between something and nothing. (And in the same way there is no such thing as the proportion between a line and a point, because, since a point is no part of a line, taking a point is not taking any of the line.)

      Later, about motion in a vacuum:

      > But if a thing moves through the thickest medium such and such a distance in such and such a time, it moves through the void with a speed beyond any ratio.

      (The physical premise here is wrong, but the concept of division by zero is clear.)

      See https://www.jstor.org/stable/pdf/2304187.pdf

      Aside: anyone telling you what Pythagoras thought about numbers is pulling your leg. We have no idea about this whatsoever, only what various people claimed like 5+ centuries later, most of which is nonsense.

  • rusticpenn 3 months ago

    Mesopotamians were at least a millennium ahead of Greeks though..

    • constantcrying 3 months ago

      No. They weren't even in the same category; the comparison doesn't even make sense. In Mesopotamia, mathematics seems to have been a tool, a method for business and construction planners to derive certain quantities.

      Mathematics to the Greeks was "mathematics" in the sense we understand it today. From a system of axioms they derived a complex system of theorems, which allowed for an abstract description of reality. It was mathematics as a system of truth, which could then be used for other purposes, e.g. by Archimedes, who discovered integration but was also a prolific engineer.

      The Greeks were so far ahead of anything the Mesopotamians did, that even the comparison is unfair.

      • derstander 3 months ago

        > No. They weren't even in the same category, even the comparison doesn't make sense.

        The comparison does make sense if you interpret the parent’s post simply to state that Babylonian mathematics (in Mesopotamia) was developed (and seemingly stagnated) before Greek mathematics began in earnest, which seems pretty uncontroversial. There are extant clay tablets from 1800 to 1600 BC that would indeed predate the Greek geometers by a millennium, and that’s if you’re counting Thales as the beginning.

        I.e., the parent post is using “ahead of” to mean “temporally before”, not “more advanced”.

        • constantcrying 3 months ago

          [flagged]

          • kragen 3 months ago

            > from India to Mesopotamia to Greece

            surely nobody thinks that zero spread from aryabhata 1500 years ago to the mesopotamians† 2700 years ago, and classical and hellenistic greece didn't even have a numeral for zero, much less the idea that zero was a number. (obviously the greeks have been using a numeral for zero for several hundred years now)

            the hypothetico-deductive method that defines what we call 'mathematics' today does seem to have originated in the hellenistic period, although probably in alexandria rather than in greece proper. but menon and the timaios show that the pythagoreans already had the rudiments of it in classical greece, and there's no evidence that anything similar existed during the thousand-plus preceding years of mesopotamian mathematics. i think that's what you're saying

            on the other hand, the majority of greek mathematics during the classical and hellenistic period was still mostly 'a tool, a method for business and construction [planners] to [derive] certain quantities', even if the calculations were being done on sand-tables instead of counting-boards. systems of axioms don't make an appearance in the surviving written record until roughly euclid; aristotle, plato, and pythagoras evidently weren't yet familiar with what we call 'mathematics' and in particular 'proof', although the later mathematicians, especially in the hellenistic period, did build on their work

            so i think it's a mistake to talk about 'the greeks' as if they were a single person, like in a game of civ; all the greeks alive in plato's time were already dead by euclid's time, euclid didn't even live in greece, all the greeks alive in euclid's time were already dead by ptolemy's time, there are still greeks today, and in any time period, different greek people knew different things

            ______

            https://www.youtube.com/watch?v=jAMRTGv82Zo

          • rusticpenn 3 months ago

            I am of the opinion that there is something to learn from everyone. However, I come to HN for civil discussion. That was not a gotcha, nor are we trying to win something. The Mesopotamians needed complex mathematics for taxes long before even the Mycenaean Greeks came into the picture. You are talking about classical Greeks. Naturally there were advancements in mathematics during the Greek period, but that does not undermine the achievements of the Mesopotamians.

            • constantcrying 3 months ago

              Your response was a single sentence, pointing out something which I obviously knew. That information was totally irrelevant to what I said, but you thought you had discovered something which I didn't know about. You also obviously didn't understand what I said, else you wouldn't have posted a single statement with no new information. Obviously this was a gotcha. You made no attempt to even try to understand me or post anything interesting or relevant, instead you took a cheap shot at a misreading of my post.

              If this is what "high quality discussion" is to you, what does mid-level discussion look like to you? Just a slur as a response?

              >Mesopotamians needed complex mathematics for taxes long before even Mycenian Greeks came into picture. You are talking about classical Greeks. Naturally there were advancements in mathematics during the Greek period, but that does not undermine the achievements of Mesopotamians.

              I am stunned by your knowledge. You really know that? That is breathtaking information. Genuinely thank you for posting this novel information that I was totally unaware of. Bringing that to my attention was extremely relevant and has radically changed my position.

              Seriously fuck off, you still haven't even bothered to read my OP.

              • kragen 3 months ago

                maybe try to have a little more patience with other people; they may not find it as easy to understand you as you think it is, and it doesn't always mean they aren't trying. sometimes they're just dumb, sometimes what you wrote is actually pretty unclear (even though it's perfectly clear to you because you already know what you were trying to express), and sometimes they're actually aware of things you're not which introduce hidden contradictions into your narrative

                the worst case, from my point of view, is when they're actually just dumb and you yell at them for their intellectual limitations. that's sad

              • rusticpenn 3 months ago

                I am sorry anonymous person on the internet. I have no idea what you know or do not know.

                I am perplexed by the notion that time does not play a role for you when talking about the "history" of something.

                Just as the Phoenician alphabet was modified by the Greeks, there would definitely have been a transfer of knowledge.

                • constantcrying 3 months ago

                  You are just impossible to talk to. Why are you still posting random gotchas at me?

                  • hifromwork 3 months ago

                    I have no intention of arguing with either of you, but from an outside perspective, it's you who are very hard to discuss things with.

                    I don't understand what's wrong with rusticpenn's original post, and I don't think they are arguing in bad faith. Maybe you are right, but because of all the personal attacks you make, it's hard to see your point, and you also come off as very rude. I don't write this to criticize you - I'm just saying that if you change the way you argue, you will probably win many more people over to your ideas. Constructive discussion is, in general, a very useful skill in life.

          • derstander 3 months ago

            > No. You are still ignoring that the mathematics of the Greeks and the mathematics of the Mesopotamians were radically different.

            This doesn't matter. We're talking about 0. Whether mathematics is conceptualized as a tool or as an academic discipline in its own right wasn't the thrust of the article. It wasn't even part of your original post.

            > Did you even read my original post?

            It might surprise you, but yes -- yes, I did!

            > My position is that a linear history of the concept of 0, which spreads from India to Mesopotamia to Greece is nonsense. That the Mesopotamians did some mathematics long before the Greeks did is irrelevant to my argument. It totally does not matter. What makes it even more ridiculous is that my OP obviously assumes that the Mesopotamians had mathematics before the Greeks, so pointing it out as some "gotcha" is just really stupid.

            This is a better statement of your position than you originally posted, when you said "I am very unconvinced by this "history of zero". Definitely the Greek geometers were aware of that concept, they just expressed it geometrically not numerically." Here's why I think that initial statement is a weak refutation of a linear history of the concept of zero: Babylonian mathematics temporally happened first, and the general opinion (when I was in school, at least) was that Babylonian mathematics likely influenced Greek mathematics.

            > It genuinely makes me mad to have this low quality discussion, where someone barges in and gives you a trivial "gotcha" as if I wasn't completely aware of that fact. And when you try to point out why the person didn't understand what you were saying you are getting another person debating the stupid gotcha, as if it even mattered.

            I'm sorry you're mad, but I mostly disagree with your assessment of the quality of the discussion. Your top-level comment on the article about Mesopotamia creating nought was essentially that Greek geometers knew about it, and that tying concepts to geographical locations seems odd.

            I don't disagree with your second point, but we may be in the minority: lots of people are very interested in knowing who had what ideas first, and where. Your first point doesn't stand well alone without the further elaboration you've made through the rest of this discussion. With those elaborations up front, the rest of this thread might not have happened.

            • constantcrying 3 months ago

              I wanted to give a serious reply, but clearly there is no point.

              I still don't get why you and the other guy think pointing out that "Mesopotamians did it first" is relevant. Even if they did, and even if the Greeks were extremely influenced by them, the Greeks' concept of a geometrical zero was still radically different from the Mesopotamians' notion, so it is totally irrelevant who was first. The only counterargument could be that they had the same notion, which the Greeks adopted from the Mesopotamians as a complete package.

              I don't think this is really complicated. If their understanding was radically different, then the concept couldn't have just "moved over", so to disprove my thesis, the discussion of "who was first" is obviously irrelevant.

              > With the elaborations you've made since the rest of this thread might not have happened.

              Well, the genuine curiosity and willingness to discuss the subject embodied in a single sentence dismissing my post because of an obviously true statement made me very glad to elaborate and discuss further.

              • derstander 3 months ago

                I'm going to step back to a meta level and make a couple points:

                - Several times in the context of this discussion you've made negative assumptions about someone or their motives replying to you (e.g. "I wanted to give a seriously reply, but clearly there is no point" and "What a stunning revelation, genuinely brilliant insight from you"); I don't think this fosters good discussion (and I also don't think it's particularly good for mental health)

                - You seem particularly aggrieved by someone not automatically assuming you know something. Yes, we all understand now that you know the Mesopotamians predated the Greeks with regard to the mathematical notions we're discussing here. But you know what? There's nothing wrong with NOT knowing. And some people reading the discussion might not have known.

                Now back to the main topic at hand.

                > Even if they did and even if the Greeks were extremely influenced by them, their concept of a geometrical zero was still radically different to the Mesopotamians notion, so it is totally irrelevant who was first

                Now I finally feel as though I understand the main thrust of your argument. Let me restate your hypothesis to see if I've got it: even though the Babylonians had the concept of zero, it was sufficiently lacking compared to the Greek concept as to be incomparable; the Greeks essentially invented, independently, a concept of zero more mathematically powerful than the Babylonians', rather than taking the existing Babylonian concept of nought and developing it further. Is that right?

      • kouru225 3 months ago

        This is the difference between philosophy and number systems. The Babylonian number system was by far superior to the Greek number system (which the Greeks inherited from the Egyptians). This is something even the Greeks acknowledged: they would use the Babylonian system to do their own calculations, and then convert back into Greek numerals (similar to how US scientists use the metric system and then convert back to the imperial system today).

poulpy123 3 months ago

I'm a bit baffled by the concept of "pre-Arab Sumer", given that Sumer disappeared 2500 years before the Arab conquest.

tumidpandora 3 months ago

The concept of zero wasn't a single invention, but credit for the initial zero concept goes to India. Around 500 AD, Indian mathematicians like Aryabhata figured out a way to use zero as a placeholder in their number system. This was a huge deal because, before that, nobody had a good way to show "nothing" in a way that worked with other numbers.

Even though others like the Arabs built on this idea later, India got the whole zero ball rolling.

chrisweekly 3 months ago

Two things:

1. I'm a huge fan and long-time supporter of Maria Popova; IMHO themarginalian.org is one of the finest websites ever, with a breadth and depth of consistently worthwhile content. Check it out!

2. This post reminds me of a book I really enjoyed maybe 10 years ago -- "Zero: The Biography of a Dangerous Idea". Recommended.

ArnoVW 3 months ago

If you have young kids whom you would like to infect with the virus of mathematics, I highly recommend "The Story of 1" by Terry Jones (of Monty Python fame).

It covers how we went from objects to abstract concepts, and how 1 and 0 were "invented".

https://www.youtube.com/watch?v=1Tm18iapVlI

ks2048 3 months ago

Shoutout to the Maya, who also had zero (as a placeholder in their base-20 number system), it seems by at least 36 BC.

nyc111 3 months ago

"This concept of the infinite in a sense contoured the need for naming its mirror-image counterpart: nothingness."

Is infinity the mirror image of nothingness? I cannot conceive of infinity, let alone its mirror image. Same with nothingness.

kouru225 3 months ago

I just finished Zero: The Biography of a Dangerous Idea.

One of the best books I’ve read in a while. Really gives a great story about the evolving history of thought.

cushpush 3 months ago

Don't forget `nil` for when you need something in your collections more naught than "not"