
Infinity

Infinity (from Latin in and finis, without end) is a quantity so unimaginably large that it has no end. It plays a prominent role especially in mathematics and philosophy. As the "largest imaginable quantity" it is sometimes seen as the opposite of the number zero, the "smallest possible quantity", though other "opposites" can be thought of too, such as minus infinity or an infinitely small non-zero number (infinitesimal). The symbol for infinity is the lemniscate, the symbol 8 turned 90 degrees (Unicode U+221E, looking a bit like oo). Keep in mind that mere lack of boundaries doesn't imply infinity -- a circle has no end but is not infinite; infinity implies there is always more, no matter how much we already have.

The concept of infinity first came to be explored by philosophers -- as an abstract concept (similar to those of e.g. zero or negative numbers) it took a while to evolve, be explored and accepted. We can't say who first "discovered" infinity; civilizations often had concepts similar to it, connected for example to their gods. Zeno of Elea (5th century BC) was one of the earliest to tackle the issue of infinity mathematically, proposing paradoxes such as that of Achilles and the tortoise.

The term infinity has two slightly distinct meanings:

- actual infinity: infinity treated as a completed "object" we can work with, e.g. the set of all natural numbers taken as a finished whole.
- potential infinity: infinity as a process that can always be continued but is never finished, e.g. counting 1, 2, 3, ... with no end.

It could be argued that potential infinity is really the reason for the existence of true, high level mathematics as we know it, as that is concerned with constructing mathematical proofs -- such proofs are needed anywhere there exist infinitely many possibilities; if there was only a finite number of possibilities, we could simply enumerate and check them all without much thinking (e.g. with the help of a computer). For example to confirm Fermat's Last Theorem ("for positive whole numbers a, b, c and n > 2 the equation a^n + b^n = c^n doesn't have a solution") we need a logical proof because there are infinitely many numbers; if there were only finitely many numbers, we could simply check them all and see if the theorem holds. So infinity, in a sense, is really what forces mathematicians to think.
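
Here's a little sketch in C of what such dumb enumeration looks like: it brute-forces the Fermat equation over a small finite range (the bound N and the exponent range here are arbitrary choices just for illustration). For a finite problem this settles the question mechanically; for infinitely many numbers no finite search can, which is exactly the point above.

```c
#include <stdio.h>

#define N 50 /* arbitrary search bound, just for demonstration */

/* computes base^exp; no overflow checks, fine for these small bounds */
unsigned long long power(unsigned long long base, int exp)
{
  unsigned long long r = 1;

  while (exp--)
    r *= base;

  return r;
}

int main(void)
{
  /* search for counterexamples to a^n + b^n = c^n, n > 2 */
  for (int n = 3; n <= 6; ++n)
    for (unsigned long long a = 1; a <= N; ++a)
      for (unsigned long long b = a; b <= N; ++b)
        for (unsigned long long c = b; c <= N; ++c)
          if (power(a,n) + power(b,n) == power(c,n))
            printf("counterexample: %llu^%d + %llu^%d = %llu^%d\n",
              a,n,b,n,c,n);

  puts("done (no output above = no counterexample in range)");
  return 0;
}
```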

Is infinity a number? Usually no, but it depends on the context. Infinity is not a real number (which is what we usually understand by the term "number"), nor does it belong to any traditionally used set of numbers like integers or rational numbers, because including infinity would break the mathematical structure of these sets (e.g. real numbers would cease to be a field), so the safe implicit answer to the question is no, infinity is not a traditional number, it is rather a concept closely related to numbers. However infinity may sometimes behave like a number and we may want to treat it so, so there also exist "special" number sets that include it -- see for example transfinite numbers that are used to work with infinite sets and whose numbers can be thought of as "sort of infinity numbers", but again, they are separated from the realm of the "traditional" numbers. This comes into play for example when computing limits with which we want to be able to get infinity as a result. The first infinite ordinal number omega is often seen as "the infinity number", but this always comes with asterisks: with infinities we have to start distinguishing between cardinal and ordinal numbers, we have to define all the basic operations again and check if they actually work (for example ordinal addition stops being commutative: 1 + omega = omega, but omega + 1 > omega), we may also have to give up some convenient assumptions we could use before as a tradeoff and so on. So ultimately everything depends on our definition of what a number is, and in some systems we can declare infinity to be a number, see also the extended real number line and so on.

An important term related to infinite is infinitesimal, or infinitely small, a concept very important e.g. for calculus. While the "traditional" concept of infinity looks beyond the greatest numbers imaginable, the concept of the infinitely small is about being able to divide (or "zoom in", see also fractals) without end, i.e. it appears when we start dividing by infinity -- this is important for limits, with which we explore values of functions that get "infinitely close" to some value without actually reaching it.
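
For a taste of this, here's a small C sketch that numerically approaches the well-known limit of sin(x)/x as x goes to 0: we can never actually plug in x = 0 (division by zero), but we can get ever closer and watch the values settle towards 1 (the step-by-ten shrinking of x is just an arbitrary choice for the demo):

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
  double x = 1.0;

  /* shrink x towards 0 and watch sin(x)/x approach 1 */
  for (int i = 0; i < 10; ++i)
  {
    printf("x = %.10f, sin(x)/x = %.10f\n", x, sin(x) / x);
    x /= 10.0; /* step "infinitely closer" to zero, never reaching it */
  }

  return 0;
}
```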

When treated as cardinality (i.e. the size of a set), we conclude that there are many infinities, some larger than others; for example there are infinitely many rational numbers and infinitely many real numbers, but in a sense there are more real numbers than rational ones -- this is very counterintuitive, but was nevertheless proven by Georg Cantor in 1874. He showed that it is possible to create a 1 to 1 pairing of natural numbers and rational numbers, so these sets are of the same size -- he called this kind of infinity countable -- and then he showed no such pairing is possible with real numbers, so there are more real numbers than rational ones -- he called this kind of infinity uncountable. Furthermore this hierarchy of "larger and larger infinities" goes on forever, as for any set we can always create a set with larger cardinality, e.g. by taking its power set (the set of all its subsets).
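
The pairing of natural and rational numbers is something we can actually compute. The following C sketch (an illustration of the idea, not Cantor's argument verbatim) walks the positive rationals diagonal by diagonal, assigning each a natural number; fractions not in lowest terms (like 2/2 = 1/1) are skipped so each rational appears exactly once:

```c
#include <stdio.h>

/* greatest common divisor, for detecting duplicate fractions */
unsigned int gcd(unsigned int a, unsigned int b)
{
  while (b)
  {
    unsigned int t = b;
    b = a % b;
    a = t;
  }

  return a;
}

int main(void)
{
  unsigned int count = 0;

  /* walk diagonals p + q = diag, pairing naturals with rationals p/q */
  for (unsigned int diag = 2; count < 20; ++diag)
    for (unsigned int p = 1; p < diag && count < 20; ++p)
    {
      unsigned int q = diag - p;

      if (gcd(p,q) == 1) /* lowest terms only, skip e.g. 2/2 */
        printf("%u <-> %u/%u\n",count++,p,q);
    }

  return 0;
}
```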

In regards to programming: programmers are often just engineers and so simplify the subject of infinity in ways that to a mathematician would seem unacceptable. For example it is often a good enough approximation of infinity to just use an extremely large value, e.g. the largest one storable in a given data type, which of course has its limitations, but in practice just werks (just watch out for overflows). Programmers also often resort to breaking the mathematical rules, e.g. they may accept that x / 0 = infinity, infinity + infinity = infinity etc. Systems based on symbolic computation may be able to handle infinity with exact mathematical precision. Advanced data types, such as floating point, often have a special value for infinity -- IEEE 754 floating point, for example, is capable of representing positive and negative infinity.
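
For instance in C (a sketch; the exact behavior hedges on IEEE 754 floats, which virtually all current platforms use -- INFINITY is a standard macro from math.h):

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
  double inf = INFINITY;

  printf("1.0 / 0.0   = %f\n", 1.0 / 0.0);   /* inf, the "broken" rule above */
  printf("inf + inf   = %f\n", inf + inf);   /* inf */
  printf("-inf        = %f\n", -inf);        /* negative infinity */
  printf("inf > 1e308 = %d\n", inf > 1e308); /* 1: bigger than any finite double */
  printf("inf - inf   = %f\n", inf - inf);   /* nan: some operations stay undefined */
  printf("isinf(inf)  = %d\n", isinf(inf));  /* nonzero: testing for infinity */

  return 0;
}
```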

WATCH OUT: an infinite universe doesn't imply the existence of everything -- it is a common fallacy to think it does. For example people tend to think that since the decimal expansion of pi is infinite and basically "random", any finite string of digits should appear somewhere in it; this doesn't follow from the mere fact that the expansion is infinite (though the conclusion MAY or may not be true, we don't actually know this about pi yet). Imagine for example the infinite sequence of even numbers -- there are infinitely many numbers in it, but you will never find any odd number there.

See Also

