-
2022-10-20, 10:05 AM (ISO 8601)
xxxxxxxxxxxx
xxxxxxxxxxxx
Last edited by gooddragon1; 2023-01-07 at 09:52 PM.
There is no emotion more useless in life than hate.
-
2022-10-20, 11:35 AM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
No. Bubble sort is very lean on memory usage, so additional RAM doesn't help it; and performing comparisons faster improves all sorting algorithms alike. Bubble sort's problem lies in how it can only swap adjacent elements, iterating from one direction, and how many comparisons that entails; any list large enough to seriously care about will be better served by more efficient sorting algorithms that can make better use of the resources you're throwing at them.
The closest scenario I can think of where this might not be the case is one where some esoteric requirement of an exotic ultra-high-speed sorting system means only adjacent elements can be compared and exchanged; and even then, a slightly more advanced algorithm like cocktail sort or gnome sort would be better (or at the very least, no worse) than bubble sort.
Last edited by Jasdoif; 2022-10-20 at 11:36 AM.
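For reference, here's a minimal sketch of both adjacent-swap algorithms in Python (illustrative only, not tuned; the function names are my own):

```python
def bubble_sort(a):
    """Repeatedly sweep left-to-right, swapping adjacent out-of-order pairs."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # the tail is already sorted after i passes
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps means we're done early
            break
    return a

def cocktail_sort(a):
    """Like bubble sort, but alternates sweep direction each pass."""
    a = list(a)
    lo, hi = 0, len(a) - 1
    swapped = True
    while swapped:
        swapped = False
        for j in range(lo, hi):  # left-to-right sweep bubbles the max up
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        hi -= 1
        for j in range(hi, lo, -1):  # right-to-left sweep bubbles the min down
            if a[j] < a[j - 1]:
                a[j], a[j - 1] = a[j - 1], a[j]
                swapped = True
        lo += 1
    return a

print(bubble_sort([5, 1, 4, 2, 8]))    # → [1, 2, 4, 5, 8]
print(cocktail_sort([5, 1, 4, 2, 8]))  # → [1, 2, 4, 5, 8]
```

Cocktail sort's advantage over plain bubble sort is just that small elements near the end of the list ("turtles") don't take a full pass each to crawl back; both are still O(n^2).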
-
2022-10-20, 12:40 PM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
If I may ask a slightly different question: has anyone checked which sorting algorithm requires the fewest steps on a Turing machine? I think in that case comparing only adjacent numbers might cut the computation time compared to some more global approach.
Also, has anyone checked which sorting algorithm can be modeled by a Turing machine with the fewest states? Not sure if such considerations are relevant anywhere, to be honest, but I am simply curious.
In a war it doesn't matter who's right, only who's left.
-
2022-10-20, 01:02 PM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
"Viable" is context dependent. There are definitely times where people will use inefficient software simply because it's easier, and the available computing power sufficiently mitigates the inefficiencies (see: everyone who use Python).
The number of atoms in the observable universe is something like 10^82. A modern 64-bit computer can address 2^64 bytes, so we'll take that as an upper limit on how large the list can be. Bubble sort is O(n^2), which means, naively, it takes something like 2^128 operations* to sort a 2^64-element list. Quicksort is, on average, O(n log n), which puts it at 2^70 operations*. So in the worst imaginable case, where you're using your entire addressable memory to store a list and sort it, bubble sort would take about 2^58 times longer than quicksort, or about 3x10^17 times longer.
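A quick back-of-the-envelope check of those numbers (naively plugging n into the big-O expressions, ignoring all constants):

```python
# Rough operation counts for sorting a list that fills a 64-bit address space.
n = 2**64             # largest list a 64-bit machine can address
bubble = n**2         # O(n^2)            -> 2**128 operations
quick = n * 64        # O(n log n), log2(2**64) = 64 -> 2**70 operations
ratio = bubble // quick

print(ratio == 2**58)  # → True
print(float(ratio))    # ≈ 2.9e17, matching the ~3x10^17 figure above
```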
The number you're suggesting is ludicrously large, many, many orders of magnitude larger than 3x10^17, so doing bubble sort on a computer 5x10^(2(10^82)) times faster** than the fastest computer would be faster than doing quicksort on the fastest computer. It would be viable for current applications.
Though again, you could just do quicksort on it, unless there was some kind of physical limitation forcing you into using bubble sort.
* this is oversimplified
** I believe this is physically impossible. If you make a computer a few orders of magnitude faster than what we have now, you start running into speed-of-light problems, as in you can't propagate the calculations through the processor fast enough to achieve that speed. Current processors are in the gigahertz range (billions of operations per second), and in one billionth of a second light (and therefore anything) can only propagate about 30 cm, or one foot. If we ever achieve single-core terahertz processing, the limit on the processor size is going to be on the order of 10s to 100s of micrometers, which severely limits your ability to actually employ every atom in the universe.
-
2022-10-20, 04:00 PM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
That's a helpful set of comparisons; I didn't know where to start on them, so that's useful.
As for the number, I meant if you took a space the size of the observable universe and filled it with hydrogen atoms.
Also, a 1 followed by 100 zeros (a googol) only has 10^2 zeros, not 10^100 of them; a 1 followed by 10^82 zeros would be far more. For the universe filled with hydrogen, I think it's more like 10^112 atoms, but that's just what I saw online somewhere. So we're talking about a number with 10^112 zeros after a 1, raised to a number that itself has 10^112 zeros after a 1. That would be the value of Z. So, (5x10^Z) * (10^Z) times faster than the current fastest supercomputer in those contexts. Which doesn't change that it would work, but it's fun to think about that number.
This was somewhat from the idea of like how increases in memory space made it less necessary to take that into consideration when programming. So, I could be inefficient/lazy and use bubble sort instead of quick sort.
Though, if that wouldn't have worked, I'd maybe have gone with a system I thought of for fun.
1x10^[100][100]
100^100; then you take that number and raise it to itself. You repeat this process 100 times, because that's the number in the second pair of brackets.
Then, after doing that, the previous process is complete. Now repeat the process with that number you got.
However, instead of using 100, using Z from the calculation I mentioned before this. Arriving at Y (as in, why am I doing this?). 1 followed by Y zeroes is B.
Then we take B (as in, Because) and use it: (5x10^B) * (10^B) times faster.
Last edited by gooddragon1; 2022-10-20 at 04:14 PM.
-
2022-10-20, 06:54 PM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
I was pretty sure I was messing something up. I will admit to having skimmed exactly how you were constructing the number; at some point, when you're stacking exponents on exponents on exponents, it's faster to just call it a BIGNUMBER and not worry about it.
If I'm reading you right, this notation already exists: it's called Knuth's up-arrow notation. If you have a fascination with large numbers in general, and with this sort of iterative/recursive approach to building them, you could also look at the Ackermann function.
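Up-arrow notation fits in a few lines as a recursive sketch (only usable for tiny arguments; anything bigger blows past what any machine can hold):

```python
def up_arrow(a, n, b):
    """Compute a ↑^n b in Knuth's up-arrow notation.

    One arrow (n=1) is ordinary exponentiation; each additional arrow
    iterates the previous level, the same kind of tower-building the
    bracket construction above describes.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3↑↑3 = 3^(3^3) = 7625597484987
print(up_arrow(2, 3, 3))  # 2↑↑↑3 = 2↑↑4 = 65536
```

Already 3↑↑↑3 is a power tower of 7625597484987 threes, which no computer can evaluate; that explosion is the whole point of the notation.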
-
2022-10-21, 07:41 PM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
On the other hand, if you have a really, really good Quantum Computer, Bogosort, which classically has far worse performance than bubble sort, might become the best possible sorting algorithm.
Bogosort is basically: randomly shuffle the numbers, see if they're in order, and if not, reshuffle. The worst case is unbounded, and the average is O((n+1)!).
However, if you had a really, really good Quantum Computer, you could just try all the random shuffles at once, check them, and return the one that passes the check. Bam, done, in O(n) time.
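For what it's worth, classical bogosort is only a few lines (a toy sketch; don't run it on anything much longer than this example):

```python
import random

def is_sorted(a):
    """Linear scan: every adjacent pair must be in order."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogosort(a):
    """Shuffle until sorted. Expected shuffles: n!, each with an O(n)
    check, giving the O((n+1)!) average mentioned above."""
    a = list(a)
    while not is_sorted(a):
        random.shuffle(a)
    return a

print(bogosort([3, 1, 2]))  # → [1, 2, 3] (eventually)
```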
-
2022-10-21, 09:36 PM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
What I meant was just the number of zeroes was that big. The computer itself would maybe be the size of a standard laptop or something.
It's just the idea of being able to be inefficient with coding algorithms and still get away with it. And isn't that what programming is really about: finding the way of doing things with the least effort invested?
Last edited by gooddragon1; 2022-10-21 at 09:38 PM.
-
2022-10-22, 07:14 AM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
Actually, where "inefficient" routines like bubblesort come into their own is for very small problems. They scale poorly, but for a list of, say, half a dozen elements, they can still come out ahead of the "efficient" routines like Quicksort (the precise crossover point depends on fine details of the implementations of the algorithms and of the computers running them). And small sorting problems actually come up fairly often, since they can be the bottom layer of some efficient algorithms, like mergesort.
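That bottom-layer trick is easy to sketch: a mergesort that hands any sufficiently small range to a quadratic sort (the cutoff value here is arbitrary; real libraries tune it empirically per machine):

```python
CUTOFF = 8  # crossover point; real implementations tune this empirically

def insertion_sort(a):
    """Quadratic, but with very low overhead: wins on tiny inputs."""
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]  # shift larger elements right
            j -= 1
        a[j + 1] = x
    return a

def hybrid_merge_sort(a):
    """Mergesort that delegates small sublists to insertion sort."""
    if len(a) <= CUTOFF:
        return insertion_sort(list(a))
    mid = len(a) // 2
    left = hybrid_merge_sort(a[:mid])
    right = hybrid_merge_sort(a[mid:])
    # Standard merge of two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(hybrid_merge_sort([9, 3, 7, 1, 8, 2, 5, 4, 6, 0]))
# → [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Python's own built-in sort (Timsort) does exactly this kind of delegation, using binary insertion sort for short runs.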
(Aside: mergesort is my favorite sorting algorithm, because it's nearly as efficient as quicksort, doesn't have quicksort's catastrophic worst case, and it's conceptually simple enough to be easily explained and understood by humans.)
Time travels in divers paces with divers persons.
—As You Like It, III:ii:328
-
2022-10-27, 09:02 AM (ISO 8601)
Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)
Just to be pedantic, the Halting Problem is the question of whether an arbitrary Turing machine eventually halts. The related function is called the Busy Beaver function, which gives the largest number of steps an n-state Turing machine can take before halting, under some constraints (the tape starts as all 0s, it only works with 0s and 1s, and there might be other restrictions I don't remember). It's worth noting that the value of BB(n) for n around 8000 or more can't even be pinned down within ZFC set theory, due to some shenanigans involving the halting problem, encoding ZFC set theory into a Turing machine, and Gödel's incompleteness theorem.
Possibly incorrect pedantry aside: besides the BB function and the Ackermann function, the usual culprit that crops up is the TREE function, which is relatively straightforward to understand but complicated enough to explain that I'll instead point at the Wikipedia page and a Numberphile video. TREE(3) is famously much, much larger than Graham's Number, and since TREE(1) = 1 and TREE(2) = 3, the function clearly grows insanely quickly.
That's very true and very important! You can't get more efficient than bubble sort on a list of length 2. You see the same thing with, e.g., matrix multiplication, where theoretically optimal algorithms have been designed down to O(n^2.4) or so, whereas the naive method is O(n^3) and the first-ever breakthrough, Strassen's algorithm, was O(n^2.8). Now, 10000x10000 matrices aren't super uncommon in data-heavy work, so you'd naively expect the naive multiplication algorithm to be ~250 times slower than the optimal method; but in practice, linear algebra libraries tend to implement either the naive algorithm or maybe Strassen's algorithm. Part of this comes down to being bottlenecked at the cache, but a lot of it is just that the constants in front of the scaling function are so much larger for the "optimal" algorithms that unless you had a matrix that took up every bit of RAM on Earth to store, it wouldn't be worth it.
Last edited by crayzz; 2022-10-27 at 09:16 AM.
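To make the constants point concrete, here's a toy comparison with entirely made-up numbers (the 1000x constant on the "optimal" algorithm is purely hypothetical, just to illustrate the crossover effect):

```python
# Operation counts under invented constants: a better exponent can still
# lose until n gets enormous.
def naive_ops(n):
    return n**3            # schoolbook matrix multiplication, constant ~1

def fancy_ops(n):
    return 1000 * n**2.4   # hypothetical "galactic" algorithm, huge constant

# With these constants the crossover sits near n ≈ 1000**(1/0.6) ≈ 10^5.
for n in (10, 10**4, 10**9):
    winner = "naive" if naive_ops(n) < fancy_ops(n) else "fancy"
    print(f"n = {n:>10}: {winner} wins")
```

So even at n = 10^4 (the 10000x10000 case above), the naive method still comes out ahead under these assumed constants; only far beyond practical sizes does the better exponent pay off.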