Full disclosure: Mathematics has more than one meaning of zero. While you can't divide by zero in the real numbers, you can use a process called "limits" to do something similar. For example, when you take the limit of a positive number n divided by "positive zero" (the limit as x approaches zero from the positive side), you get another limit called "positive infinity", which basically means that as the divisor gets smaller, so long as it stays above zero, the result grows without bound. Taking the limit of the same number divided by "negative zero" produces "negative infinity", and dividing by "two-sided zero" produces "projective infinity". The physical consequences of these infinities, meanwhile, lead to things like black holes.

Division by zero as a real number, though, results in some major problems. You see, division is defined as the inverse operation to multiplication; that is, a / b is the number that, when multiplied by b, produces a. Multiplication is, in turn, defined as iterated addition: a * b means b copies of a (or a copies of b, which gives the same result) added together. Addition has an identity element, zero (0), such that for all a, a + 0 = 0 + a = a. So a * 0 = 0 * a is a copies of zero added together. But zero plus zero is always zero, so no matter how many copies of zero are added, the result will always be what was initially there: zero. This, naturally, leads to problems when you want to find a number a / 0 such that (a / 0) * 0 = a for nonzero a, as every candidate makes that expression evaluate to zero, not a. You might be tempted to think that a / 0 for some nonzero a is some infinite value, but nonzero a divided by any infinite number evaluates to an infinitesimal, which is a number infinitely close to zero but not zero.
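The limit behavior above is easy to watch numerically. This little sketch (using powers of two so every division is exact in floating point) shows the quotient running off toward positive or negative infinity as the divisor shrinks toward zero from either side:

```python
# Numeric sketch (not a proof) of the limits described above.
n = 1.0
for k in [1, 10, 20]:
    x = 2.0 ** -k            # an exactly representable small positive divisor
    print(n / x)             # 2.0, 1024.0, 1048576.0 -- growing without bound

for k in [1, 10, 20]:
    x = -(2.0 ** -k)         # approaching zero from the negative side instead
    print(n / x)             # -2.0, -1024.0, -1048576.0

# IEEE-754 floats even model one endpoint directly: dividing a finite
# number by infinity collapses the "infinitesimal" result to plain zero.
print(1.0 / float("inf"))    # 0.0
```

Note that this only illustrates the trend; the limit itself is never actually reached by any finite divisor.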
One way to demonstrate that these aren't the same as zero is to think of both infinities: 1 / positive infinity = 0+, and 1 / negative infinity = 0-. If those two were equal, then their multiplicative inverses would be as well, i.e. positive and negative infinity would be equal, a clear contradiction.

Real numbers form a field, which is an algebraic structure consisting of a set that is closed under two operators + and *, each having commutative and associative properties, respective identity elements 0 and 1, respective inverse operators - and /, and a distributive property relating them. That isn't the whole definition, though: the axioms rather conspicuously state that every nonzero element has a multiplicative inverse, leaving /0 undefined.

The real numbers are not the only totally ordered field, however, and nowhere near the largest. The largest (and most general) one is usually called the surreal number system, which contains all real numbers, all infinitesimals (infinitely many of them, in fact, adjacent to every real number and every other infinitesimal), and all transfinite numbers (numbers greater than any finite value). Every number that can even begin to be imagined in one dimension, including quotients of countless infinitesimals. And guess what? The surreals still can't properly handle a value describable in two symbols!

The real projective line (the real numbers with a single infinity thrown in) does allow division by zero, but "division" there no longer carries its usual meaning. It is a useful construct in analysis, but it contradicts the layman's notion of how numbers are supposed to work, since it's almost a field, but not quite. Related to the above, there's also a rather weird kind of mathematical structure called a wheel, in which division is always possible. Some of the rules of algebra have to be sacrificed for this; for example, x/x = 1 is not always true, and neither are 0x = 0 or x - x = 0. But it is a real thing.
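The projective line's trade-off can be made concrete with a toy model. This is a hypothetical sketch, not any standard library: a division function over the reals plus one unsigned point at infinity, which permits a / 0 but still has to leave the genuinely indeterminate cases without a value (which is exactly why the structure falls short of being a field):

```python
# Toy model of division on the real projective line: the reals plus a
# single unsigned point at infinity (so "infinity" equals "minus infinity").
INF = "∞"   # sentinel for the one point at infinity (illustrative choice)

def proj_div(a, b):
    """Division on the real projective line; returns None for the
    cases that remain undefined even here."""
    if b == 0:
        return None if a == 0 else INF   # a/0 = ∞ for nonzero a; 0/0 still undefined
    if a is INF:
        return None if b is INF else INF # ∞/∞ is undefined; ∞/finite = ∞
    if b is INF:
        return 0                         # finite/∞ = 0
    return a / b

print(proj_div(1, 0))    # ∞ -- division by zero is allowed here
print(proj_div(0, 0))    # None -- but 0/0 still has no value
print(proj_div(5, INF))  # 0
```

Even in this model, expressions like ∞ + ∞ or 0 * ∞ have no consistent answer, which is the price of admitting 1/0.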
All of this leads to fun mathematics mind-benders where you can mathematically "prove" that 1=2 and all sorts of other nonsense by neatly leaving out the important bit about equations of the form x/x=1 only holding when x does not equal 0. (The classic version starts from a = b, manipulates it into (a - b)(a + b) = b(a - b), and then divides both sides by a - b, which is zero.)

From a computing perspective, dividing by zero poses a different problem: that of the “infinite loop”. Computers treat division as repeated subtraction, with most Arithmetic Logic Units basically using register shifts to perform long division in a manner similar to how third graders learn it. If you repeatedly subtract zero from a dividend, you will keep on subtracting forever, which “freezes up” that processor and keeps it from doing anything else. To prevent this, computers throw a fault whenever a program tries to divide by zero.
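The repeated-subtraction picture (the schoolbook version, rather than the shift-and-subtract routine real ALUs use) can be sketched in a few lines. Without the guard at the top, calling it with a divisor of zero would subtract zero from the remainder forever and never terminate, which is precisely the hang the fault is there to prevent:

```python
# Naive "division as repeated subtraction" for nonnegative integers.
def divide(dividend, divisor):
    if divisor == 0:
        # Stands in for the hardware fault: bail out before looping forever.
        raise ZeroDivisionError("division by zero")
    quotient = 0
    remainder = dividend
    while remainder >= divisor:
        remainder -= divisor   # subtracting 0 here would never shrink remainder
        quotient += 1
    return quotient, remainder

print(divide(17, 5))   # (3, 2): 17 = 3 * 5 + 2
```

High-level languages surface the same fault as an exception; in Python, for instance, the built-in `17 // 0` raises this same `ZeroDivisionError`.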