  1. - Top - End - #1
    Troll in the Playground
     
    gooddragon1's Avatar

    Join Date
    Dec 2005
    Location
    In the playground

    Default xxxxxxxxxxxx

    xxxxxxxxxxxx
    Last edited by gooddragon1; 2023-01-07 at 09:52 PM.
    There is no emotion more useless in life than hate.

  2. - Top - End - #2
    Titan in the Playground
     
    Jasdoif's Avatar

    Join Date
    Mar 2007
    Location
    Oregon, USA

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by gooddragon1 View Post
    Would using a bubble sort be a good idea
    No. Bubble sort is very lean on memory usage, so additional RAM doesn't help it; and performing comparisons faster improves all sorting algorithms. Bubble sort's problem lies in how it can only swap adjacent elements and iterates from one direction, and how many comparisons that entails; any list large enough to seriously care about will be better served by more efficient sorting algorithms that can make better use of the resources you're throwing at them.

    The closest scenario I can think of where this might not be the case is where some esoteric requirement of an exotic ultra-high-speed sorting system means only adjacent elements can be compared and exchanged; and even then, a slightly more advanced algorithm like cocktail sort or gnome sort would be better (or at the very least, not worse) than bubble sort.
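    For reference, here's a minimal Python sketch of the adjacent-swap family mentioned above (bubble sort and its bidirectional cousin, cocktail sort); the function names are my own:

```python
def bubble_sort(a):
    """Repeatedly sweep left to right, swapping adjacent out-of-order pairs."""
    a = list(a)
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return a


def cocktail_sort(a):
    """Like bubble sort, but alternates sweep direction on each pass,
    so small elements stuck near the end also move quickly."""
    a = list(a)
    lo, hi = 0, len(a) - 1
    while lo < hi:
        for i in range(lo, hi):       # forward sweep bubbles the max right
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
        hi -= 1
        for i in range(hi, lo, -1):   # backward sweep bubbles the min left
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
        lo += 1
    return a
```

    Both only ever compare and exchange adjacent elements, which is why they fit the "exotic hardware" scenario equally well.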
    Last edited by Jasdoif; 2022-10-20 at 11:36 AM.
    Feytouched Banana eldritch disciple avatar by...me!

    The Index of the Giant's Comments VI―Making Dogma from Zapped Bananas

  3. - Top - End - #3
    Troll in the Playground
    Join Date
    Jan 2007

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    If I may ask a slightly different question: did anyone check which sorting algorithm leads to the least number of steps on a Turing machine? I think in that case comparing only adjacent numbers might cut the computation time compared to some more global approach.

    Also, did anyone check which sorting algorithm can be modeled with a Turing machine having the fewest states? Not sure if such considerations are relevant anywhere, to be honest, but I am simply curious.
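    Short of building an actual Turing machine, one crude way to poke at this is to count elementary operations directly. A rough Python sketch, where a "step" is just one comparison or one data movement (only a loose stand-in for Turing machine steps):

```python
def bubble_steps(a):
    """Bubble sort a copy of the list, returning (sorted_list, steps),
    where a 'step' is one comparison or one swap."""
    a = list(a)
    steps = 0
    for end in range(len(a) - 1, 0, -1):
        for i in range(end):
            steps += 1                       # comparison
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                steps += 1                   # swap
    return a, steps


def insertion_steps(a):
    """Insertion sort with the same crude step counting."""
    a = list(a)
    steps = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0:
            steps += 1                       # comparison
            if a[i] > key:
                a[i + 1] = a[i]              # shift one slot right
                steps += 1
                i -= 1
            else:
                break
        a[i + 1] = key
    return a, steps


print(bubble_steps([5, 2, 4, 6, 1, 3]))
print(insertion_steps([5, 2, 4, 6, 1, 3]))
```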
    In a war it doesn't matter who's right, only who's left.

  4. - Top - End - #4
    Orc in the Playground
     
    SwashbucklerGuy

    Join Date
    Aug 2011

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    "Viable" is context dependent. There are definitely times where people will use inefficient software simply because it's easier, and the available computing power sufficiently mitigates the inefficiencies (see: everyone who use Python).

    The number of atoms in the observable universe is something like 10^82. A modern 64-bit computer can address 2^64 bytes, so we'll take that as an upper limit on how large the list can be. Bubble sort is O(n^2), which means, naively, it takes on average something like 2^128 operations* to sort a 2^64-element list. Quicksort is, on average, O(n log n), which puts it at 2^70 operations*. In the worst imaginable case, where you're using your entire addressable memory to store a list and sort it, bubble sort would take about 2^58 times longer than quicksort, or about 3x10^17 times longer.

    The number you're suggesting is ludicrously large, many many orders of magnitude larger than 3x10^17, so doing bubble sort on a computer 5x10^(2(10^82)) times faster** than the fastest computer would be faster than doing quicksort on the fastest computer. It would be viable for current applications.

    Though again, you could just do quicksort on it, unless there was some kind of physical limitation forcing you into using bubble sort.

    * this is oversimplified

    ** I believe this is physically impossible. If you make a computer a few orders of magnitude faster than what we have now, you start running into speed-of-light problems, as in you can't propagate the calculations through the processor fast enough to achieve that speed. Current processors are in the gigahertz range (billions of operations per second), and in one billionth of a second light (and therefore anything) can only propagate about 30 cm, or one foot. If we ever achieve single-core terahertz processing, the limit on the processor size is going to be on the order of tens to hundreds of micrometers, which severely limits your ability to actually employ every atom in the universe.
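    Python's arbitrary-precision integers make the back-of-envelope arithmetic above easy to check exactly (same simplifications: raw operation counts, no constant factors):

```python
n = 2**64                # list size: every addressable byte, worst case

bubble_ops = n * n       # O(n^2): 2^128 operations
quick_ops = n * 64       # O(n log n): log2(2^64) = 64, so 2^70 operations

ratio = bubble_ops // quick_ops
print(ratio == 2**58)        # True
print(f"{ratio:.2e}")        # about 2.88e+17, i.e. roughly 3x10^17
```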
    Quote Originally Posted by crayzz
    That a given person is known for his sex appeal does not mean that he is only known for his sex appeal.
    Quote Originally Posted by jere7my
    For instance, I am also known for my humility.

  5. - Top - End - #5
    Troll in the Playground
     
    gooddragon1's Avatar

    Join Date
    Dec 2005
    Location
    In the playground

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by crayzz View Post
    "Viable" is context dependent. There are definitely times where people will use inefficient software simply because it's easier, and the available computing power sufficiently mitigates the inefficiencies (see: everyone who use Python).

    The number of atoms in the observable universe is something like 10^82. A modern 64-bit computer can address 2^64 bytes, so we'll take that as an upper limit on how large the list can be. Bubble sort is O(n^2), which means, naively, it takes on average something like 2^128 operations* to sort a 2^64-element list. Quicksort is, on average, O(n log n), which puts it at 2^70 operations*. In the worst imaginable case, where you're using your entire addressable memory to store a list and sort it, bubble sort would take about 2^58 times longer than quicksort, or about 3x10^17 times longer.

    The number you're suggesting is ludicrously large, many many orders of magnitude larger than 3x10^17, so doing bubble sort on a computer 5x10^(2(10^82)) times faster** than the fastest computer would be faster than doing quicksort on the fastest computer. It would be viable for current applications.

    Though again, you could just do quicksort on it, unless there was some kind of physical limitation forcing you into using bubble sort.

    * this is oversimplified

    ** I believe this is physically impossible. If you make a computer a few orders of magnitude faster than what we have now, you start running into speed-of-light problems, as in you can't propagate the calculations through the processor fast enough to achieve that speed. Current processors are in the gigahertz range (billions of operations per second), and in one billionth of a second light (and therefore anything) can only propagate about 30 cm, or one foot. If we ever achieve single-core terahertz processing, the limit on the processor size is going to be on the order of tens to hundreds of micrometers, which severely limits your ability to actually employ every atom in the universe.
    That's a helpful use of comparisons. I didn't know where to start on them, so that's useful.

    As for the number, I meant if you took a space the size of the observable universe and filled it with hydrogen atoms.

    Also, 10000000000...0 (a 1 followed by 100 zeros) in this case only has 10^2 zeros (instead of 10^100 zeros). 10^82 zeros would be more. I think it's something like 10^112 zeros, but that's just what I saw online somewhere. So we're talking about a 1 followed by 10^112 zeros, raised to a number that has 10^112 zeros after a 1. That would be the value of Z. So, (5x10^Z) * (10^Z) times faster than the current fastest supercomputer in those contexts. Which doesn't change that it would work, but it's fun to think about that number.

    This came somewhat from the idea of how increases in memory space made it less necessary to take memory into consideration when programming. So, I could be inefficient/lazy and use bubble sort instead of quicksort.

    Though, if that wouldn't have worked, I'd maybe have gone with a system I thought of for fun.

    1x10^[100][100]

    100^100; then you take that number and raise it to itself. You repeat this process 100 times, because that's the number in the second pair of brackets.

    Then, after doing that, the previous process is complete. Now repeat the process with that number you got.

    However, instead of using 100, use Z from the calculation I mentioned before this, arriving at Y (as in, why am I doing this?). 1 followed by Y zeros is B.

    Then we take B (as in, Because) and use it, (5x10^B) * (10^B) times faster.
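    For scale, a toy version of that "raise it to itself, repeatedly" step is easy to write in Python; this is my own loose reading of the construction, shrunk to tiny seeds so it actually finishes:

```python
def iterate_self_power(x, times):
    """Repeatedly replace x with x**x. Even from tiny seeds,
    a handful of iterations is astronomically large."""
    for _ in range(times):
        x = x**x
    return x


print(iterate_self_power(2, 1))             # 2^2 = 4
print(iterate_self_power(2, 2))             # 4^4 = 256
print(len(str(iterate_self_power(2, 3))))   # 256^256 has 617 digits
```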
    Last edited by gooddragon1; 2022-10-20 at 04:14 PM.

  6. - Top - End - #6
    Orc in the Playground
     
    SwashbucklerGuy

    Join Date
    Aug 2011

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by gooddragon1 View Post
    Also, 10000000000...0 (a 1 followed by 100 zeros) in this case only has 10^2 zeros (instead of 10^100 zeros). 10^82 zeros would be more. I think it's something like 10^112 zeros, but that's just what I saw online somewhere. So we're talking about a 1 followed by 10^112 zeros, raised to a number that has 10^112 zeros after a 1. That would be the value of Z. So, (5x10^Z) * (10^Z) times faster than the current fastest supercomputer in those contexts. Which doesn't change that it would work, but it's fun to think about that number.
    I was pretty sure I was messing something up. I will admit to having skimmed exactly how you were constructing the number; at some point when you're stacking exponents on exponents on exponents, it's faster to just call it a BIGNUMBER and not worry about it.

    Quote Originally Posted by gooddragon1 View Post
    Though, if that wouldn't have worked, I'd maybe have gone with a system I thought of for fun.

    1x10^[100][100]

    100^100, then you take that number and raise it to itself. You repeat this process 100 times because it's the number in the second pair of brackets.

    Then, after doing that, the previous process is complete. Now repeat the process with that number you got.

    However, instead of using 100, using Z from the calculation I mentioned before this. Arriving at Y (as in, why am I doing this?). 1 followed by Y zeroes is B.

    Then we take B (as in, Because) and use it, (5x10^B) * (10^B) times faster.
    If I'm reading you right, this notation already exists: it's called up-arrow notation. If you have a fascination with large numbers in general and this sort of iterative/recursive approach to building them, you could also look at the Ackermann function.
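    Both fit in a few lines of Python, with the obvious caveat that anything beyond tiny arguments is hopeless to evaluate:

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ^^(n) b: n=1 is exponentiation,
    n=2 is tetration (a^a^...^a, b copies), and so on."""
    if n == 1:
        return a**b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))


def ackermann(m, n):
    """The two-argument Ackermann function; it climbs the
    up-arrow hierarchy diagonally as m increases."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))


print(up_arrow(2, 2, 3))   # 2^^3 = 2^(2^2) = 16
print(up_arrow(2, 3, 3))   # 2^^^3 = 2^^4 = 65536
print(ackermann(3, 3))     # 61
```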

  7. - Top - End - #7
    Troll in the Playground
     
    gooddragon1's Avatar

    Join Date
    Dec 2005
    Location
    In the playground

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by crayzz View Post
    If I'm reading you right, this notation already exists: it's called up-arrow notation. If you have a fascination with large numbers in general and this sort of iterative/recursive approach to building them, you could also look at the Ackermann function.
    I think that might be larger. The Ackermann function is something I'll have to look at.

  8. - Top - End - #8
    Dwarf in the Playground
    Join Date
    Jan 2021

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by Jasdoif View Post
    No. Bubble sort is very lean on memory usage, so additional RAM doesn't help it; and performing comparisons faster improves all sorting algorithms. Bubble sort's problem lies in how it can only swap adjacent elements and iterates from one direction, and how many comparisons that entails; any list large enough to seriously care about will be better served by more efficient sorting algorithms that can make better use of the resources you're throwing at them.

    The closest scenario I can think of where this might not be the case is where some esoteric requirement of an exotic ultra-high-speed sorting system means only adjacent elements can be compared and exchanged; and even then, a slightly more advanced algorithm like cocktail sort or gnome sort would be better (or at the very least, not worse) than bubble sort.
    Seems like it would be more of a hardware map issue than a chemical issue.
    The closest I get to clear and concise:
    Quote Originally Posted by Justanotherhero View Post
    Interesting read! Thanks for the post!

  9. - Top - End - #9
    Barbarian in the Playground
     
    OldWizardGuy

    Join Date
    Nov 2010
    Location
    California
    Gender
    Male

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    On the other hand, if you have a really, really good Quantum Computer, Bogosort, which classically has far worse performance than bubble sort, might become the best possible sorting algorithm.

    Bogosort is basically: randomly shuffle the numbers, see if they're in order, and if not, randomly reshuffle. The worst case is unbounded, and the average is O((n+1)!).

    However, if you had a really, really good Quantum Computer, you could just try all the random shuffles at once, check them, and return the one that passes the check. Bam, done, in O(n) time.
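    The classical version, for reference; on any real (non-quantum) machine it is only safe on very short lists, since the expected number of shuffles is factorial in the length:

```python
import random

def bogosort(a):
    """Shuffle until sorted: average O((n+1)!) shuffles, unbounded worst case."""
    a = list(a)
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        random.shuffle(a)
    return a

print(bogosort([3, 1, 2]))   # [1, 2, 3] (eventually)
```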

  10. - Top - End - #10
    Troll in the Playground
     
    gooddragon1's Avatar

    Join Date
    Dec 2005
    Location
    In the playground

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by Hyoi View Post
    I think the basic problem with this hypothetical is that if your society's computers are scaled up to these universe-scale sizes, then the sizes of the lists your society needs to sort are probably going to be similarly vast. So you will still need to use a more efficient algorithm for any serious application.
    What I meant was just the number of zeroes was that big. The computer itself would maybe be the size of a standard laptop or something.

    It's just the idea of being able to be inefficient with coding algorithms but still being able to get away with it. And isn't that what programming is really about: finding the way of doing things with the least effort invested?
    Last edited by gooddragon1; 2022-10-21 at 09:38 PM.

  11. - Top - End - #11
    Titan in the Playground
    Join Date
    May 2007
    Location
    The Land of Cleves
    Gender
    Male

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Actually, where "inefficient" routines like bubblesort come into their own is for very small problems. They scale poorly, but for a list of, say, half a dozen elements, they can still come out ahead of the "efficient" routines like Quicksort (the precise crossover point depends on fine details of the implementations of the algorithms and of the computers running them). And small sorting problems actually come up fairly often, since they can be the bottom layer of some efficient algorithms, like mergesort.

    (aside: Mergesort is my favorite sorting algorithm, because it's nearly as efficient as quicksort, doesn't have quicksort's catastrophic worst case, and it's conceptually simple enough to be easily explained and understood by humans)
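    That "simple sort at the bottom layer" idea looks roughly like this in Python; the cutoff of 8 is an arbitrary placeholder (real libraries tune it empirically, and usually pick insertion sort over bubble sort for the base case):

```python
CUTOFF = 8  # below this length, the simple quadratic sort wins


def insertion_sort(a):
    """Quadratic, but very fast on tiny or nearly-sorted lists."""
    a = list(a)
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a


def merge_sort(a):
    """Mergesort that hands small sublists to insertion sort."""
    if len(a) <= CUTOFF:
        return insertion_sort(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the sorted halves
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```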
    Time travels in divers paces with divers persons.
    As You Like It, III:ii:328

    Chronos's Unalliterative Skillmonkey Guide
    Current Homebrew: 5th edition psionics

  12. - Top - End - #12
    Barbarian in the Playground
     
    OldWizardGuy

    Join Date
    Nov 2010
    Location
    California
    Gender
    Male

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by gooddragon1 View Post
    It's just the idea of being able to be inefficient with coding algorithms but still being able to get away with it. And isn't that what programming is really about: finding the way of doing things with the least effort invested?

    No, it's about ridiculously overoptimizing things just because you can and because it's fun trying to squeeze just one cycle more out of an algorithm.

    And because then you feel like you did it right.

  13. - Top - End - #13
    Orc in the Playground
     
    Kobold

    Join Date
    Aug 2019

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by crayzz View Post
    I was pretty sure I was messing something up. I will admit to having skimmed exactly how you were constructing the number; at some point when you're stacking exponents on exponents on exponents, it's faster to just call it a BIGNUMBER and not worry about it.

    If I'm reading you right, this notation already exists: it's called up-arrow notation. If you have a fascination with large numbers in general and this sort of iterative/recursive approach to building them, you could also look at the Ackermann function.
    Does anyone know any other absurdly scaling functions, aside from the Halting Problem?

    The Halting Problem scales, by definition, at least as fast as any computable function.

  14. - Top - End - #14
    Orc in the Playground
     
    SwashbucklerGuy

    Join Date
    Aug 2011

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by Rydiro View Post
    Does anyone know any other absurdly scaling functions, aside from the Halting Problem?

    Halting Problem scales, by definition, at least as fast as any computable function.
    Just to be pedantic, the Halting Problem is the question of whether an arbitrary Turing machine eventually halts. The related function is called the Busy Beaver function, which is the largest number of steps an n-state Turing machine can take before halting, under some constraints (the tape starts as all 0s, the machine only works with 0s and 1s, and there might be other restrictions I don't remember). It's worth noting that BB(8000+) is uncomputable due to some shenanigans involving the halting problem, encoding ZFC set theory into a Turing machine, and Gödel's incompleteness theorem.

    Possibly incorrect pedantry aside, besides the BB function and the Ackermann function, the usual culprit that crops up is the TREE function, which is relatively straightforward to understand but complicated enough to explain that I'll instead leave the Wikipedia page and a Numberphile video. TREE(3) is famously much, much larger than Graham's Number, and since TREE(1) = 1 and TREE(2) = 3, the function does grow insanely quickly.

    Quote Originally Posted by Chronos View Post
    Actually, where "inefficient" routines like bubblesort come into their own is for very small problems. They scale poorly, but for a list of, say, half a dozen elements, they can still come out ahead of the "efficient" routines like Quicksort (the precise crossover point depends on fine details of the implementations of the algorithms and of the computers running them). And small sorting problems actually come up fairly often, since they can be the bottom layer of some efficient algorithms, like mergesort.

    (aside: Mergesort is my favorite sorting algorithm, because it's nearly as efficient as quicksort, doesn't have quicksort's catastrophic worst case, and it's conceptually simple enough to be easily explained and understood by humans)
    That's very true and very important! You can't get more efficient than bubble sort on a list of length 2. You see the same thing with e.g. matrix multiplication, where theoretically optimal algorithms have been designed down to O(n^2.4) or so, where the naive method is O(n^3) and the first-ever breakthrough, called Strassen's algorithm, was O(n^2.8). Now, 10000x10000 matrices aren't super uncommon in data-heavy work, so you'd naively expect the naive multiplication algorithm to be ~250 times slower than the optimal method, but in practice linear algebra libraries tend to implement either the naive algorithm or maybe Strassen's algorithm. Part of this comes down to being bottlenecked at the cache, but a lot of it is just that the constants in front of the scaling function are so much larger for the "optimal" algorithms that unless you had a matrix that took up every bit of RAM on Earth to store, it wouldn't be worth it.
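    To make the constant-factor point concrete, here's a toy pure-Python Strassen next to the naive triple loop (square power-of-two sizes only); note how many extra additions the seven-multiply trick costs, which is exactly why real libraries set a high cutoff before recursing:

```python
def mat_naive(A, B):
    """Naive O(n^3) multiply of square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]


def _add(A, B): return [[x + y for x, y in zip(r, s)] for r, s in zip(A, B)]
def _sub(A, B): return [[x - y for x, y in zip(r, s)] for r, s in zip(A, B)]


def mat_strassen(A, B):
    """Strassen's O(n^2.81) multiply; n must be a power of two.
    Seven recursive multiplies instead of eight, at the cost of
    many extra additions (the larger constant factor in practice)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    # split both matrices into quadrants
    a11 = [r[:h] for r in A[:h]]; a12 = [r[h:] for r in A[:h]]
    a21 = [r[:h] for r in A[h:]]; a22 = [r[h:] for r in A[h:]]
    b11 = [r[:h] for r in B[:h]]; b12 = [r[h:] for r in B[:h]]
    b21 = [r[:h] for r in B[h:]]; b22 = [r[h:] for r in B[h:]]
    m1 = mat_strassen(_add(a11, a22), _add(b11, b22))
    m2 = mat_strassen(_add(a21, a22), b11)
    m3 = mat_strassen(a11, _sub(b12, b22))
    m4 = mat_strassen(a22, _sub(b21, b11))
    m5 = mat_strassen(_add(a11, a12), b22)
    m6 = mat_strassen(_sub(a21, a11), _add(b11, b12))
    m7 = mat_strassen(_sub(a12, a22), _add(b21, b22))
    c11 = _add(_sub(_add(m1, m4), m5), m7)
    c12 = _add(m3, m5)
    c21 = _add(m2, m4)
    c22 = _add(_add(_sub(m1, m2), m3), m6)
    # stitch the quadrants back together
    return ([r1 + r2 for r1, r2 in zip(c11, c12)] +
            [r1 + r2 for r1, r2 in zip(c21, c22)])
```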
    Last edited by crayzz; 2022-10-27 at 09:16 AM.

  15. - Top - End - #15
    Orc in the Playground
     
    Kobold

    Join Date
    Aug 2019

    Default Re: Would this make bubble sort viable (Unlikely Hypothetical Scenario)

    Quote Originally Posted by crayzz View Post
    That's very true and very important! You can't get more efficient than bubble sort on a list of length 2. You see the same thing with e.g. matrix multiplication, where theoretically optimal algorithms have been designed down to O(n^2.4) or so, where the naive method is O(n^3) and the first-ever breakthrough, called Strassen's algorithm, was O(n^2.8). Now, 10000x10000 matrices aren't super uncommon in data-heavy work, so you'd naively expect the naive multiplication algorithm to be ~250 times slower than the optimal method, but in practice linear algebra libraries tend to implement either the naive algorithm or maybe Strassen's algorithm. Part of this comes down to being bottlenecked at the cache, but a lot of it is just that the constants in front of the scaling function are so much larger for the "optimal" algorithms that unless you had a matrix that took up every bit of RAM on Earth to store, it wouldn't be worth it.
    Usually, when working with large matrices, people will design their storage and algorithms around the problem at hand. Some linear algebra on your problem can reduce the computation significantly.
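    For instance, if the matrix is mostly zeros, storing only the nonzero entries cuts a matrix-vector product from O(n^2) work down to one multiply-add per stored entry; a minimal dict-of-keys sketch:

```python
def sparse_matvec(entries, x):
    """Multiply a sparse square matrix by a dense vector.
    entries: dict mapping (row, col) -> nonzero value."""
    y = [0] * len(x)
    for (i, j), v in entries.items():
        y[i] += v * x[j]
    return y


# 3x3 matrix with only 3 stored entries instead of 9
A = {(0, 0): 2, (1, 2): 5, (2, 1): -1}
print(sparse_matvec(A, [1, 2, 3]))   # [2, 15, -2]
```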
