
View Full Version : Multi-core and Everyday Life



Vinyadan
2017-12-11, 05:50 AM
When multi-core processors came out, I remember being somewhat confused about how we were supposed to understand which one would make your computer faster.
Now it's been a while. So I wonder, what changes did multi-core processing bring into daily lives? What can we do now that, otherwise, would have been impossible?

LamaFrancis
2017-12-11, 07:27 AM
Once upon a time I would start a DNA sequence alignment and my computer would be useless for the next hour. Now I can start a multi-sequence alignment, work on a spreadsheet, then see the alignment results in a few minutes (sometimes less than a minute!). Science comes at me fast now!
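This kind of "heavy job in the background, keep working in the foreground" pattern is exactly what extra cores buy you. A minimal Python sketch (`align` here is a hypothetical stand-in for a real CPU-heavy alignment step, just busywork over the sequence):

```python
from concurrent.futures import ProcessPoolExecutor

def align(sequence):
    # Hypothetical stand-in for a CPU-heavy alignment step:
    # just reduces the sequence to a number as busywork.
    return sum(ord(c) for c in sequence) % 97

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        # The "alignment" runs in a worker process on another core...
        future = pool.submit(align, "ACGT" * 1000)
        # ...while this process stays free for spreadsheets etc.
        print("still responsive")
        print("result:", future.result())
```

On a single core the same code would still run, but the worker would be time-sliced against everything else instead of running truly in parallel.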

gomipile
2017-12-11, 07:50 AM
I have two monitors. I can have a 3D game on one, with my messaging clients and YouTube, Pandora, etc. open on the other and not have any framerate stutter.

Also, having lots of browser tabs open and visible used to cause computers to chug.

Basically, multitasking.

Also, there are some games that require a number of CPU cores now. Far Cry 4 requires at least four cores.

That's an example of multithreading. Other multithreaded applications include photo and video editing software, and 3D modeling and rendering software. These don't require multiple cores, but they run much better if you have them--to the point where a modern workflow on a single-core machine would be almost unbearable. Today's cameras have much more resolution than those of a decade ago, and there are new photo editing filters and transformations that require lots of computation, too. An example would be Photoshop's content-aware fill tool.
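The reason photo filters parallelize so well is that most of them are per-pixel or per-row, so each chunk of the image can be processed independently on a different core. A toy sketch (not any real editor's code; the "filter" just brightens pixels):

```python
from multiprocessing import Pool

def brighten_row(row):
    # Toy per-pixel filter: add 40 to each pixel, clamped at 255.
    return [min(p + 40, 255) for p in row]

if __name__ == "__main__":
    # Fake grayscale "image": 200 rows of 400 pixel values.
    image = [[(r + c) % 256 for c in range(400)] for r in range(200)]
    with Pool() as pool:
        # Rows are independent, so they can be filtered on all cores at once.
        result = pool.map(brighten_row, image)
    print(result[0][0])  # pixel 0 was 0, brightened to 40
```

Swapping `pool.map` for a plain `map` gives the single-core version with identical output, just slower on big images.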

It might be worth mentioning that today's graphics cards are essentially massively multicore processors. They are also leveraged for photo, video, 3D modeling, and scientific workloads these days. A midrange graphics card of the current generation is a better supercomputer than the best supercomputers of the year 2000.

wumpus
2017-12-11, 08:17 AM
One thing to remember is that multi-core programming is *hard*. If you can obviously split your code into producer and consumer threads, things work easily, but there is the famous "Amdahl's law": sooner or later everything has to come back to a single thread of code, and the whole job can go no faster than that serial part can be executed.
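Amdahl's law can be stated concretely: if a fraction s of the work is inherently serial, then n cores can speed the whole job up by at most 1 / (s + (1 - s)/n). A quick sketch of the bound (the 10% serial fraction is just an illustrative assumption):

```python
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on speedup when serial_fraction of the work
    cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even with only 10% serial work, extra cores hit a wall fast:
print(round(amdahl_speedup(0.10, 4), 2))   # → 3.08 on 4 cores
print(round(amdahl_speedup(0.10, 64), 2))  # → 8.77 on 64 cores
print(1.0 / 0.10)                          # → 10.0, the limit as cores grow
```

So 64 cores on a 10%-serial workload buy you less than a 9x speedup, and no number of cores ever gets past 10x.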

Multiprocessing isn't a new thing. Back around 1990, MIPS made the R4000, which put an entire 64-bit core on a single chip, leaving behind designs that spread a processor across multiple chips (other than cache, split int/fp, and similar). After that, "big machines" would have multiple processors (in supercomputer land, multiple "processors" had been a thing since 1966). It has taken programmers a long time to catch up.

snowblizz
2017-12-11, 08:57 AM
When multi-core processors came out, I remember being somewhat confused about how we were supposed to understand which one would make your computer faster.
Now it's been a while. So I wonder, what changes did multi-core processing bring into daily lives? What can we do now that, otherwise, would have been impossible?

You can escape out of a program that has hung the computer, because it is only hogging one core and the OS can run on the other(s) to shut said program down.

The first time this happened to me I was like "wooow, multicores, they actually do stuff".

factotum
2017-12-11, 12:05 PM
The main reason multi-core processors exist is that the processor manufacturers simply couldn't keep upping the clock speeds any further, so adding cores was the only way to keep improving performance. (For instance, Pentium 4 processors hit 3GHz 15 years ago--the fastest current CPUs are around 4GHz, so not a lot of advancement.) Yes, they've increased the performance per clock cycle over those 15 years, but not by a factor of 4 or more. Because of that, the real question here ought to be "What can we only do now that our CPUs are so much more powerful than they were?", and the answer is realistically "Nothing, but everything we *do* do is faster than it would otherwise be".

Yes, Far Cry 4 wouldn't even start up unless you had a quad-core processor, but that was a limitation programmed into the game by its developers--if you could have removed that piece of code I've no doubt it would have run on a single-core processor that supported all the CPU instructions the game required, it just would have been horrendously slow.

Balmas
2017-12-16, 01:06 AM
I have two monitors. I can have a 3D game on one, with my messaging clients and YouTube, Pandora, etc. open on the other and not have any framerate stutter.

Also, having lots of browser tabs open and visible used to cause computers to chug.

Basically, multitasking.

Adding to this: I can render a video in one window while listening to a podcast in another, and playing a demanding video game with a third window.

thracian
2017-12-21, 03:08 PM
The main reason multi-core processors exist is that the processor manufacturers simply couldn't keep upping the clock speeds any further, so adding cores was the only way to keep improving performance. (For instance, Pentium 4 processors hit 3GHz 15 years ago--the fastest current CPUs are around 4GHz, so not a lot of advancement.) Yes, they've increased the performance per clock cycle over those 15 years, but not by a factor of 4 or more. Because of that, the real question here ought to be "What can we only do now that our CPUs are so much more powerful than they were?", and the answer is realistically "Nothing, but everything we *do* do is faster than it would otherwise be".

Yes, Far Cry 4 wouldn't even start up unless you had a quad-core processor, but that was a limitation programmed into the game by its developers--if you could have removed that piece of code I've no doubt it would have run on a single-core processor that supported all the CPU instructions the game required, it just would have been horrendously slow.

While clock-for-clock throughput hasn't quadrupled, the benchmarks I can find suggest roughly a 3x improvement per clock over that span. Still pretty impressive.

Also, the latest top-end i7 consumer processor clocks up to 4.7GHz before any overclocking.