What Is the Biggest Number?


TLDR: The biggest number is 16 digits long


Ask an astronomer and we travel in aeons into the universe and beyond.

Ask a microphysicist and we slip into mystical quarks and preons, the building blocks of everything.

Ask my 5 year old and you get 643.

This is a question everyone can guess at, but when you get to the nitty-gritty the answer is surprisingly personal.


Infinity

I'm discounting it. Like imaginary numbers, infinity is an abstraction, a symbol rather than a practicable figure. It could never appear on a tape measure.

Here are the parameters:

1. Every individual digit must count:

It's awesome that the universe is estimated at 93 billion light years from end to end, but it could be 93.4 or 93.5 and both would be fine. To justify the hair's breadth of detail the largest number requires, there would need to be an application that makes every digit matter. Travelling from one side of the universe to the other would be great, but…

"You're really not going to like it"

  • The Hitchhiker's Guide to the Galaxy, Douglas Adams

Even if we felt the need, this is a number we know fluctuates. The size of the universe is useful, but the figure of 93 billion light years exists to showcase pure awe – a poster child for what big means.

At its best it's merely a reference point for numbers that are far smaller, rather than something worth knowing to billions of digits of accuracy.


2. The number must have a practical, real-world application:

There are groups dedicated to determining long numbers for the sake of the number itself. As of writing, Pi is known to 62.8 trillion digits.

For perspective, to measure the diameter of the observable universe to the accuracy of a single hydrogen atom you would only need 38 digits of Pi.

This makes the trillion-digit research number just for show.


1s and 0s: Binary Beasts

Let's talk about binary as a number. As a string, 1s and 0s have as valid a place as 2, 3, 4, 5, 6, 7, 8, and 9, and that means looking at code as a singular line of digits.

When you run a program you rely on the counting speed of a string of numbers. While you might use the letter a when coding, that'll only ever be 97 in ASCII (or 1100001 for the binary fanatics). Even with the most flamboyant of coding languages, all roads lead back to machine code eventually, and right at the bottom of the pile, 1s and 0s.
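If you want to see that bottom layer for yourself, here's a minimal Python sketch of the round trip (standard library only, nothing exotic):

# The letter "a" bottoms out as a plain number.
char = "a"
code_point = ord(char)             # 97 – the ASCII/Unicode code for "a"
binary = format(code_point, "b")   # "1100001" – the same value in binary

print(code_point)  # 97
print(binary)      # 1100001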

And yet, the validity of a program's string as a singular number is only a half-truth.

Most users only graze the lightest of sequences within the main string of digits as they are shifted forward and backwards at mind-boggling speeds.

Even the greatest, most awkward programs alive – search engines and AI – are loaded with hidden gremlins adapted from dirty GitHub code. No shame here, the world is founded on cut and paste, and you could spend a lifetime in these applications and never find one – that doesn't mean they fail to exist.

This means we can discount Google's infamous 2 billion lines of code as a single number in and of itself. Within the labyrinth, in its darkest recesses, lies a dormant nook, a Schrödinger's cat which might as well forever be a 1… or a 0.

This means it is not a definite single number in which every digit counts.


Where code is in sequence – a single data file.

Media, for example, is linear. Every pixel of every frame you see on your screen relates to a number which needs to be correct – that's roughly 2 million pixels per frame and, at 60 Hz, around 120 million pixels per second. The same is true for other files too.
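For a rough sense of where those figures come from, here's a back-of-the-envelope sketch, assuming the 2-million figure refers to a standard 1080p screen:

# Rough pixel arithmetic, assuming a Full HD (1920 x 1080) screen at 60 Hz.
width, height = 1920, 1080
refresh_hz = 60

pixels_per_frame = width * height                   # 2,073,600 – "2 million pixels per frame"
pixels_per_second = pixels_per_frame * refresh_hz   # 124,416,000 – roughly 120 million per second

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second:,} pixels per second")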

The largest well-documented single file I can find is the one that solves the mathematics problem of Boolean Pythagorean triples, at a cool 200 terabytes. This is effectively reams of numbers that check every possible case of a mathematical problem. Lots more info on that here.

As a binary line, this file is a solid number… and a big one: 200 trillion bytes, an exact number in the region of 1,600,000,000,000,000 individual binary digits.
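A quick back-of-the-envelope check on that count, assuming decimal terabytes of 10^12 bytes each:

# How many binary digits sit inside a 200-terabyte file?
terabytes = 200
bytes_total = terabytes * 10**12   # 2 x 10**14 bytes
bits_total = bytes_total * 8       # 1.6 x 10**15 binary digits

print(f"{bits_total:,} bits")      # 1,600,000,000,000,000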

And yet… no one will ever see anything more than a scant touch of that, because while admirable this number does not serve any practicable purpose, resting as it does on a $100 bet from a professor back in the 1980s.

Its mathematical basis alludes to another field of big numbers. Numbers so large that the most powerful computers in existence spend years searching for a successor.


Primes

The foundation stones of cryptocurrency and the internet security you rely on every day. RSA encryption uses two primes to send hidden messages via a colossal product, effectively outgunning outrageous processing speed through the sheer size of the calculation. Primes are very much practicable, and valid to every digit.
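To make the trick concrete, here is the standard textbook RSA miniature – toy primes only; real keys use primes hundreds of digits long, which is the whole point:

# A textbook-sized RSA sketch (illustrative toy primes, not real key material).
p, q = 61, 53                     # two small primes
n = p * q                         # 3233 – the public modulus
phi = (p - 1) * (q - 1)           # 3120
e = 17                            # public exponent, coprime with phi
d = pow(e, -1, phi)               # 2753 – private exponent (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)   # 2790 – anyone can do this with the public key
recovered = pow(ciphertext, d, n) # 65 – only the private key gets you back

print(ciphertext, recovered)

The security rests entirely on how hard it is to split n back into p and q once those primes are genuinely enormous.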

The largest primes are found by the Great Internet Mersenne Prime Search, or GIMPS for short – I'm not making this up – a free collective that harnesses computing power all over the world to grind for primes. If this is your bag, join the church.

The largest known prime currently stands at 2^82,589,933 − 1, a number 24,862,048 digits long.
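You don't need to write the number out to count its digits – the standard digit-count formula does it in a line:

import math

# Decimal digits of 2**82589933 - 1: floor(p * log10(2)) + 1,
# no need to compute the 24-million-digit number itself.
p = 82_589_933
digits = math.floor(p * math.log10(2)) + 1

print(digits)  # 24862048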

Yet new technology might soon cut these numbers down to size, at least in its practicable purpose.

Quantum computing threatens to crush the practical use of primes on Q-Day – the moment when current algorithms are roasted by the giant leap in processing speed that quantum computing should (theoretically) provide. If your defence is grinding out sums, then the near-instantaneous response of these computers is your nemesis.


Hypothetically, the biggest numbers we crunch will also come from this leap forward.

Except it won't, at least not as we understand numbers to be, and not because of the need for freezing computer temperatures, but because quantum processing relies on digits being 1 and 0 in superposition, rather than one or the other, so it would arguably never be efficiently written as a singular number. Which is about as far as I'm prepared to go down that rabbit hole.

However, it does point us in the right direction.


What do we mean when we consider supersized numbers? While something may be undoubtedly practical, it is often only so within a more complex form than a raw number.

Take the media example above – these files undoubtedly represent breathtaking numbers. But to what end is that number useful as a written digit?

The latest Aquaman film in 4K represents 1,889,785,610,240 individual digits. That film is terrible, but it will forever be a nicer way of presenting those digits than an unknowable matrix of digital gibberish.


The Human Limiter

The longest number any single person has ever been able to remember stands at 111,700 digits, a fine example of humanity's extremes by Akira Haraguchi.

That has to quantifiably be the longest number that has meant anything to anyone – as it touches the fine limits of what we could numerically ingest. That has to be our ultimate bookend.


We only manage to work with bigger numbers because we've found a way to express them outside of a numerical form. Massive numbers hide in our everyday environment in ways you could never express in digits.

Take the sky above your head. The weather is one of the few problems we really can't touch. The latest attempt is a $1.7 billion supercomputer in the UK. Its role is to model the world's weather (read: a colossal number) to incredible levels of accuracy, then spit out the few choice variables that hold interest specifically to us – what is the temperature to the closest degree, and will it rain, here, at a given time?


The Sandbox

At this point, at the end user, we get closer to what we mean by the biggest number, because what is relevant to us as a practical number is determined by the ratio of our environment – the sandbox of reality – to ourselves.

The way we like to see the weather – somewhere between -20 and 80 – is a great illustration of the simplified way we like our numbers in the day-to-day – and these are rarely more than 3 digits.

It adds context to some of the more bizarre numerical quirks we still insist upon – take imperial measurements:

  • The Yard: The measurement from the end of King Henry I's fingertips to his nose.
  • An inch: Three grains of barley laid out end to end.
  • The French foot: You guessed it, Charlemagne's foot.

This wholly unreliable system would have been ripe for significant arguments, errors and, overall, pretty haphazard design – no surprise Tudor houses look drunk.

But they stayed the course for hundreds of years.

Eventually, they chose bits of wood to measure all inches by. There they remained, sat in the guildhall of London.

Except, when you think about the day-to-day purpose of numbers, imperial measurements make sense. Between the 10th and 18th centuries, there was rarely a need for sums past half a dozen digits. The idea of a chain (66ft) or anything larger, like an acre (1/640th of a square mile), was the preserve of a few limited professions.


It's only in our recent history that the metric system has become a requirement. Turns out working in tens, rather than threes and dozens, has its advantages:

  • There are 100 million square cm in a hectare.
  • There are 6,272,640 square inches in an acre.

(it is telling that the best definition of an imperial unit is given by its metric equivalent)
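In case you don't trust my arithmetic, a quick sanity check on both conversions:

# A hectare is 100 m x 100 m; an acre is 43,560 square feet.
sq_cm_per_hectare = (100 * 100) * (100 * 100)   # 100 cm per metre, squared
sq_in_per_acre = 43_560 * 12 * 12               # 144 square inches per square foot

print(f"{sq_cm_per_hectare:,} square cm in a hectare")   # 100,000,000
print(f"{sq_in_per_acre:,} square inches in an acre")    # 6,272,640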


And yet even a system which was largely disunified for most of its history, a system that was far trickier to navigate at scale, was sufficient for us to get by.

Likewise, in the modern day, small numbers are the ones you see on screen every day. What is the biggest number you're likely to encounter in your tabs? Every news summation is editorially hacked to a three-digit headline, and it's for a reason…


Even the most demanding production projects, like Lego, are only geared to microns (thousandths of a millimetre).

Within science, the longest numbers in practical, real-world use are likely to be in the region of Intel's new microchips, which come in at 2 nanometres, or 0.000000002 of a metre. It's a tremendous feat, but it only needs 10 digits written down.


There's a surety to this. With the biggest of numbers comes the biggest responsibility, and even the greatest of us can fall foul of number blindness. In 2022 a fat-fingered trader caused a ‘flash crash' which rippled across European stock markets. The error accounted for €300bn in losses at its peak. This was a place where one digit in a dozen mattered.

The sweat of a conveyancer (housing lawyer) when transferring huge sums on completion is palpable. The payday accounts at large companies could conceivably run into the billions – but this is still only a 12-digit number (including the change).


The Number

The final clue is the computer or phone you are on right now.

The standard Apple calculator will only ever give you an answer accurate to 16 digits, padding the rest with zeros. Excel rounds away anything beyond 15 significant digits. To go longer you need a specialist approach like Wolfram Alpha.
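The 15-to-16-digit ceiling isn't arbitrary: these tools lean on 64-bit floating-point numbers, which carry only about 15-17 significant decimal digits. A small sketch of where the precision runs out:

# A 15-digit whole number survives a trip through a 64-bit float; a 17-digit one doesn't.
fifteen_digits = 123_456_789_012_345
seventeen_digits = 12_345_678_901_234_567

print(int(float(fifteen_digits)) == fifteen_digits)      # True  – exact
print(int(float(seventeen_digits)) == seventeen_digits)  # False – precision lost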

Even at the far ends of accuracy, the Jet Propulsion Laboratory at NASA uses 3.141592653589793 as Pi for its biggest calculations on interplanetary navigation.

This accuracy is enough to ensure that in an orbit of 150 billion kilometres, the rate of error would be no more than the width of a finger.
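Here's a rough reconstruction of that estimate, assuming a circular path with a radius of about 20 billion kilometres (an assumed round figure, in the region of Voyager 1's distance from Earth):

from decimal import Decimal, getcontext

# Error in a circle's circumference when pi is truncated at 16 digits.
getcontext().prec = 40
pi_true = Decimal("3.14159265358979323846264338327950288")
pi_jpl = Decimal("3.141592653589793")

radius_km = Decimal(20_000_000_000)             # ~20 billion km (assumed)
error_km = 2 * radius_km * (pi_true - pi_jpl)   # circumference error
error_m = error_km * 1000

print(error_m)  # roughly 0.01 metres – about a finger's width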

This is where we get the outer edge of what the smartest among us work with – 16 digits.


At the far end, the biggest numbers are bandied about by a few characters in mathematics or finance – massive geeks who have my utmost respect. For most of us, it starts with your first number as a child and usually peaks at the terrifying 6-digit house purchase (8 with the change).

Working to an average, 16 digits covers any number you might encounter.


This was a real journey, brilliant to research and to share. If you work with bigger – even if it's a 17-digit overdraft – let me know in the comments.


See also:

The Human Odometer. How Far Can We Run?

Time is the only barrier
