Laptop Help?



Mindfreak
2012-09-22, 05:46 PM
So, I was thinking of doing something quite special for Christmas this year for my brother and sister.
My brother and sister both need and deserve laptops. They've done so much for me this year and the last, that now that I have a job I want to pay them back for it.
Since Walmart has layaway, I figured I could very well afford this.
However, I don't quite know laptops.
Which is why I'm hoping you fine folks could help me.
My brother wants:
A laptop for gaming (he plays Team Fortress 2, Minecraft, and some of the other free Steam games), and he wants the screen to be around 15".

My sister wants:
A laptop for school, watching movies on, and more general use. She doesn't want the screen or keyboard to be tiny.

I want:
This to be as cheap as possible. I love them to pieces, but I don't wanna buy them $1000 laptops. And it has to be at Walmart so I can put them on layaway and pay for them over time.

So...um...help please? :smallredface:

Rawhide
2012-09-22, 05:55 PM
Best suggestion:
Don't put them on lay-by/layaway.

Prices of computers are dropping all the time, while at the same time their speed is increasing. By the time you pay it off, they will be obsolete.

Instead, put the money aside (in an interest bearing account if possible) and then buy them when you have the money.

Hiro Protagonest
2012-09-22, 06:34 PM
You don't have to buy them $1000 laptops.

Best option: Look up the recommended system requirements for the games your brother plays, find a small locally-owned computer store in your area, and find a laptop that meets those recommended specs and has at least 4 GB of memory before any memory upgrades. I found a desktop with 6 GB of memory (expandable to 32) and better than the recommended requirements for $500. A laptop with a bit less memory and slightly slower speeds should cost about the same. For your sister, movie watching is all about the graphics card. Don't worry about memory so much, although you should probably still get 4 GB (I'm not sure if movies from DVDs take up memory).

Second best option: As above, but go to Best Buy instead, and don't listen to the salesmen's recommendations.

Brother Oni
2012-09-22, 06:56 PM
For your sister, movie watching is all about the graphics card. Don't worry about memory so much, although you should probably still get 4 GB (I'm not sure if movies from DVDs take up memory).

If you're watching from a DVD, then you should be able to define how much of a buffer you want to ensure smooth playback. A good graphics card helps (since they can all pretty much do hardware acceleration these days), but depending on the codec, some are very CPU-intensive - if she's regularly watching video files or streaming them off the internet, the OP may not want to skimp too much on the CPU.
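
To put rough numbers on the buffer side, here's a minimal Python sketch (the bitrates are assumed ballpark figures, not measurements):

# Playback-buffer arithmetic: the RAM needed to buffer N seconds of video
# is just bitrate x seconds, which is tiny next to 4 GB of system memory.
BITRATES_MBPS = {"DVD (MPEG-2)": 8, "720p stream (H.264)": 4}  # assumed ballpark bitrates

def buffer_megabytes(bitrate_mbps, seconds):
    """Megabytes of RAM needed to hold `seconds` of video at `bitrate_mbps`."""
    return bitrate_mbps * seconds / 8  # 8 bits per byte

for name, rate in BITRATES_MBPS.items():
    print(f"{name}: {buffer_megabytes(rate, 30):.0f} MB for a 30-second buffer")

Even a generous 30-second buffer is only ~15-30 MB, which is why the CPU cost of decoding, not buffer memory, is the thing to watch.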

Tonal Architect
2012-09-22, 09:42 PM
I'm sure this has been repeated ad nauseam... But "gaming" and "cheap" don't fit together in the same sentence if the word "laptop" is also there.

It used to be the case, at least.

Rawhide
2012-09-22, 09:45 PM
Gaming, cheap, laptop. Choose any two.

Flickerdart
2012-09-22, 09:51 PM
A gaming laptop isn't really a laptop, anyway, it's just a more expensive desktop with more cooling problems and a smaller screen. Playing for any length of time without having to plug the thing in would be impossible.

Rawhide
2012-09-22, 10:04 PM
A gaming laptop isn't really a laptop, anyway, it's just a more expensive desktop with more cooling problems and a smaller screen. Playing for any length of time without having to plug the thing in would be impossible.

Actually, the new MacBook Pros with Retina Display are pretty darn awesome for gaming.

Flash storage, a high-end (for a laptop) graphics card, a powerful processor, 8GB of RAM minimum, extremely light and portable, rather long battery life (which does shrink with gaming, but is still decent), and a fantastic screen.

It is actually a good price for what you get too, especially when compared to the competition. It is not, however, cheap.

Flickerdart
2012-09-22, 10:14 PM
Obligatory "lol macs gaming" joke. Steam might be bringing a bunch of games to OSX, but it's still nowhere close to the amount of stuff you can get on Windows.

Besides, I didn't say that there weren't laptops capable of gaming. I said that if you're gaming, a desktop is just better in every respect, considering that the main advantage of laptops disappears with the heavy battery drain that gaming would create.

Rawhide
2012-09-22, 10:19 PM
Obligatory "lol macs gaming" joke. Steam might be bringing a bunch of games to OSX, but it's still nowhere close to the amount of stuff you can get on Windows.

Besides, I didn't say that there weren't laptops capable of gaming. I said that if you're gaming, a desktop is just better in every respect, considering that the main advantage of laptops disappears with the heavy battery drain that gaming would create.

Obligatory "lol boot camp + parallels removes all such hinderances".

You do know that can run Windows on a Mac, right? Boot Camp will let you boot into it natively, and Parallels or VMware will let you run it in a virtual machine (it will be slower, but both can boot a Boot Camp partition for the times you only need a single application and speed isn't as much of an issue, giving you the best of both worlds).

Also, CrossOver/WINE.

Flickerdart
2012-09-22, 10:25 PM
BootCamp is (IIRC anyway) strictly inferior to just running Windows to begin with.

But this isn't an OS war thread. I'm sure the Macbook is perfectly satisfactory for gaming, insofar that a laptop can be.

Rawhide
2012-09-22, 10:49 PM
BootCamp is (IIRC anyway) strictly inferior to just running Windows to begin with.

But this isn't an OS war thread. I'm sure the Macbook is perfectly satisfactory for gaming, insofar that a laptop can be.

Boot Camp is running Windows natively. It's not a virtual machine, it's Windows running on its own partition, natively. You are "running Windows to begin with". It's the same as having a partition for Linux and a partition for Windows. Parallels and VMware Fusion are both virtual machines, and do slow things down - this may be what you're thinking of.

I wasn't discussing the OS, only the hardware.

Joneson.w
2012-09-23, 04:31 AM
What do you think of the Lenovo Y580? The price is under $900, and it features the Nvidia GTX 660M - the best GPU of any laptop under $1000.

Joneson.w
2012-09-23, 04:33 AM
You can see this page (http://gaminglaptopsunder1000.net/): there are many great laptops under $1000, but I think the Y580 is the best.

Here's why I settled for the Y580:

- Not the prettiest or lightest laptop, but it had the most value for the price. Also, purchasing an extended warranty through Lenovo is fairly cheap.
- Glossy display, but it displays colors accurately, including red. According to Notebookcheck, the display covers 91% of the sRGB spectrum, which is impressive for a consumer-level notebook. Since I do photo and video editing on the side, this was pleasing to hear.
- Room for self-upgrades. It has an mSATA slot if you want to install your own SSD. It gets SATA 3 speeds, unless you decide to remove the optical drive and install a hard drive in its place; if you do that, you'll only get SATA 2. Otherwise, the mSATA slot and primary hard drive bay get SATA 3.
- Memory is upgradeable to 16GB.
- Amazing sound quality with the JBL speakers.
- Strongest discrete GPU of the laptops I tested.
- Backlit keyboard was amazing, with good tactile feedback, travel, and little flex.
- Good cooling. I was scared at first to try it out since some owners reported problems with temperatures during heavy usage, but I'd have to disagree. After gaming for multiple hours and doing some heavy rendering, my temps were only warm and never got high enough to damage internal components.
- Battery life. Although it only comes with a 6-cell battery, on a full charge under power-saving profiles and normal usage I can get around 4.5-5 hrs, which is impressive for a laptop with so much power.

Don Julio Anejo
2012-09-23, 12:12 PM
Actually, the new MacBook Pros with Retina Display are pretty darn awesome for gaming.

Flash storage, a high-end (for a laptop) graphics card, a powerful processor, 8GB of RAM minimum, extremely light and portable, rather long battery life (which does shrink with gaming, but is still decent), and a fantastic screen.

It is actually a good price for what you get too, especially when compared to the competition. It is not, however, cheap.
Sorry, "double the price of similar-spec'd competition" is not really a good price. 1.5x the price if you factor in form factor (yay, I made a pun!). The only area the Macbook wins in is the screen, which is really only important if you're a graphic designer. $2200 is not a good price for _any_ laptop these days, really.

Case in point:

You can see this page: there are many great laptops under $1000, but I think the Y580 is the best.

Here's why I settled for the Y580:
Pretty good for gaming, although I personally don't see the point of buying a Lenovo and _not_ buying a ThinkPad, which can be customized at purchase, and there are usually lots of amazing sales (e.g. Visaperks gives you ~10-15% off in addition to whatever sale Lenovo has).

As for your sister, any laptop with an Ivy/Sandy Bridge Intel processor will work just fine for pretty much anything she can throw at it. Even a 2nd gen i3 (2xxx series) with HD 3000 graphics will be fine. Don't buy anything less than that, though; many makers tend to put in cheap CPUs that lag out after three open Firefox windows. Avoid Atoms (Intel), C- or E-series (AMD), or anything that isn't really a Core; they either use too much power, don't perform well, or both.

- Room for self-upgrades. It has an mSATA slot if you want to install your own SSD. It gets SATA 3 speeds, unless you decide to remove the optical drive and install a hard drive in its place; if you do that, you'll only get SATA 2. Otherwise, the mSATA slot and primary hard drive bay get SATA 3.
This is an awesome feature, really :) You can put in an SSD for the full SSD experience AND still keep your old huge hard drive for storage space. This way a cheap 64GB mSATA SSD (e.g. a Crucial M4 - don't buy OCZ!) will work as a system drive, with all the advantages that brings, and you won't run out of space or have to pay $200 for a 256GB drive just to make sure you have enough.

Rawhide
2012-09-23, 06:35 PM
Sorry, "double the price of similar-spec'd competition" is not really a good price. 1.5x the price if you factor in form factor (yay, I made a pun!). The only area the Macbook wins in is the screen, which is really only important if you're a graphic designer. $2200 is not a good price for _any_ laptop these days, really.

You know, I am thoroughly and totally sick of this "2x the price" myth.

I challenge you to find a laptop that has all of the same features as the MacBook Pro with Retina Display at a much cheaper price. No, I am not interested in a desktop replacement laptop. I want something light and portable with good battery life.

SarahV
2012-09-23, 10:30 PM
Apologies for interrupting the impending Mac vs. PC war :smallwink: but I would agree with the poster above that says don't put it on layaway. If you save your money you will be able to get cheaper/better stuff in a few months, especially if you can catch some Black Friday/Cyber Monday sales. HOWEVER, if you really want to do a layaway at Walmart for reasons I don't know about, I think this would work well for your sister:
http://www.walmart.com/ip/Lenovo-Gray-15.6-IGF-Idea-Z575-Laptop-PC-with-AMD-A6-3420M-Quad-Core-Processor-and-Windows-7-Home-Premium-with-Windows-8-Pro-Upgrade-Option/20701045

It's got 100% five-star reviews and the specs are perfectly good for watching movies and doing schoolwork. I have a Lenovo with the same amount of memory and a slower CPU, and I do graphics work, video and audio editing, etc. with no trouble.

I don't really know anything about gaming specs, I'm afraid, so I haven't really got a suggestion there.

It's really nice of you to do this for them, btw :smallsmile:

Mynxae
2012-09-24, 01:30 AM
I've got a gaming desktop that was only $678, built by a friend last year, which is low-to-mid range; it shouldn't be that hard to find that in a laptop, should it?

These days, the best thing to make sure you have in any gaming PC:

4GB of RAM.
A graphics card with at least 1GB of video memory.
Windows 7 Ultimate is a nice touch.
A CD drive is a must (some laptops - yes, laptops - don't have them).
And a decent battery life. Because my two-year-old laptop's battery lasts 15 mins. :smallannoyed:

factotum
2012-09-24, 06:23 AM
4GB of RAM.
A graphics card with at least 1GB of video memory.
Windows 7 Ultimate is a nice touch.
A CD drive is a must (some laptops - yes, laptops - don't have them).
And a decent battery life. Because my two-year-old laptop's battery lasts 15 mins. :smallannoyed:

And if you could have it generate world peace while you're about it, that'd be great... :smallsmile:

Seriously, that 1GB graphics card is going to absolutely drink power, so getting decent battery life as well would result in a laptop weighing about 20lb. You have to make some compromises when speccing a laptop, and battery life is going to be one of them if you want the thing to perform in all other respects as a desktop machine would.

Rawhide
2012-09-24, 06:29 AM
And if you could have it generate world peace while you're about it, that'd be great... :smallsmile:

Seriously, that 1GB graphics card is going to absolutely drink power, so getting decent battery life as well would result in a laptop weighing about 20lb. You have to make some compromises when speccing a laptop, and battery life is going to be one of them if you want the thing to perform in all other respects as a desktop machine would.

*whistles*

I know a laptop that has 8GB of RAM and 1GB of graphics RAM (GDDR5) and excellent battery life... It's also exceptionally light at 4.46lbs/2.02kg.

But it isn't cheap.

Erloas
2012-09-26, 12:09 AM
The main issue with using a Mac for gaming is that they simply aren't better at it than what you can get in a PC for the same price. They are more expensive, no matter what any Mac fan wants to try and claim. Apple does go out of their way to have AMD/Intel/Nvidia do special naming on their products so you can't easily compare them to the standard PC parts; they'll change the model number, but it will be an identical part, and unless you know the hardware really well you won't know what the equivalent PC part is. That said, they aren't 2x as expensive either: Dells generally run about... 10% more than other manufacturers, and Apple runs probably 10-20% more than Dell. And yes, Boot Camp lets you run Windows natively, but you've just added another $100 to the price of the laptop to get Windows on it, and that's a pretty significant price increase in the market segment being looked at.
And sure, Apple has some nice marketing features, but most of them do little to nothing in terms of performance. The Retina display, for instance, has a nice high resolution, but there is a very good chance that when it comes to gaming, their video card hardware isn't going to be good enough to run any even moderately demanding game at that resolution anyway, so you really haven't gained much.

As for specs... well, the amount of RAM on a video card is almost completely meaningless. As long as it is a discrete graphics card, it will have the RAM needed for what it can process, because RAM is dirt cheap and they aren't going to cripple the performance of the GPU to save almost nothing on the really cheap RAM. As an example, in desktop cards you can get video cards with 1GB of RAM from $30 to $300 - everything from cards that can do little more than codec acceleration and won't run games at all, to the newest high-end cards - and it is the standard size for the majority of video cards being made now.

As for the games your brother plays... they are all low-end games; it should be easy to find a laptop that will play them.

I bought my laptop... probably 3-4 years ago for I think $750 and it is still doing pretty well. I don't actually use it much because I prefer my desktop, but it was 17" and it is playing Borderlands 2 just fine (my brother is using it since his computer is having issues).

First step would be to go to Newegg and search for laptops: minimum 4GB system RAM, a dedicated video card, and the screen size you are looking for. (I personally wouldn't go under 17" unless it is specifically going to be a constantly traveling laptop.) That should get you the basics of what you are looking for; they aren't going to pair dedicated video cards with really low-end processors. There should be a decent number of options in the $600-800 range; look through some of the options and post them here for further review.

As for battery life, ignore it for your brother. Any computer worth gaming on isn't going to run long even on a huge battery; even ones that would otherwise claim great battery life won't last long while gaming. If the video card is powerful enough to run games well, it's going to draw enough power not to last long on battery.

For your sister... you can probably get just about anything and it will fit those requirements - at least anything that isn't a netbook. You can look a lot more at battery life here, but it mostly comes down to what you want to pay. These will start at about $300 for the most basic ones, but will still do what you've mentioned (even a netbook would, other than the small size). And really, battery life will probably be the biggest factor in cost, as they will all have pretty much the same hardware and batteries until you get into the "ultra light/portable" segment, which is a higher price range.

Rawhide
2012-09-26, 12:40 AM
The main issue with using a Mac for gaming is that they simply aren't better at it than what you can get in a PC for the same price. They are more expensive, no matter what any Mac fan wants to try and claim.

The units generally appear more expensive for what you get when you look at certain raw specs. And many gamers will decide that they are not interested in the other features you pay for with a Mac. But I dispute the whole "more expensive" claim when everything is said and done, because that money does go somewhere. They have exceptional quality screens, for example, very sturdy construction, and by far the best (multi-touch) touchpads in the industry, to name but a few. Every little bit adds up, both in creating an excellent user experience and in where that 10-20% goes.

P.S. I have never used a touchpad that I have actually enjoyed using, that wasn't on a Mac.



Apple does go out of their way to have AMD/Intel/Nvidia do special naming on their products so you can't easily compare them to the standard PC parts; they'll change the model number, but it will be an identical part

So, what's the equivalent of a GeForce GT 650M with 1GB of GDDR5 (not DDR3) RAM? I would tend to think that it would be a GeForce GT 650M with 1GB of GDDR5. Actually, from what I've read, with the clock speeds and RAM Apple has used, it compares favourably with a GTX 660M.

Don Julio Anejo
2012-09-26, 01:40 AM
@ Rawhide: Sorry, I wanted to reply a few days ago but haven't had a chance. In regards to my "twice as expensive, 1.5x if you factor in form factor": I'm not ranting on Macs. I actually like Macbooks, insofar as both the Pro and the Air were very innovative for their time (a mid-spec'd, really light laptop with good battery life, and then an inexpensive-for-the-time ultrabook), in much the same way Asus came up with netbooks. They're also the absolute best laptops for graphics work (sorry, for desktops high-end Eizo/Dell/NEC monitors still win) because of the available software and screens.

But... The reason I usually react strongly to anyone throwing out Apple suggestions is the almost cultist-like nature of a few Apple users (I have to deal with two of them at work), who keep babbling about how Apple can do no wrong and their products are perfect just the way they are. Also, the inane Samsung lawsuit, but that's beside the point and probably against forum rules. I'm not saying you're one of them (you're probably not), but specific Apple products are good for their specific uses, nothing more and nothing less. They're not an end-all. Recommending a high-end Macbook Pro to someone who specifically wanted a sub-$1000 laptop that can hopefully play games is... weird.

Don Julio Anejo
2012-09-26, 02:31 AM
Now, since I'm a Lenovo fan (and also an Asus one, but I won't address them as they have too many different models for me to care about or know), I'm going to use Lenovo as the baseline for "normal PC manufacturer that doesn't copy Apple design philosophy." The 13" $1200 Macbook Pro compares unfavorably to my (somewhat custom configured) Lenovo X230:

- Same CPU/GPU
- Same RAM, but it's easy to expand, so W/E.
- X230 has a smaller (12.5") screen, both use IPS. Mac wins because it's 16:10 though.
- Mac has an optical drive, X230 doesn't since it's technically an ultrabook, but..
- X230 weighs less than 3 pounds. Dimensions are comparable.
- 7 or so hours of battery life with a 4-cell battery, 6-cell is $20 and brings it up to 9 hours.
- Won't touch ergonomics, they're personal for many people. Both brands have their proponents and detractors. Trackpoint was the dealmaker for me.

Price: $1200 vs $832 (via the US Apple/Lenovo websites). Paying by VISA will save you another 20%, which brings the total down to $670 and is in line with my statement of "pay twice more." As far as I'm aware, Apple very rarely has anything in terms of sales or discounts; the student discount at my school is a measly 5% (might be different in the US though). For Lenovo, the current coupon code is (if I'm not mistaken) THINK920, might be CAPTHINK920. There's almost always a 10-20% discount on some models; right now it's 12%, which is what I've used above to get $832. The VISA discount is automatic at transaction.
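
If you want to sanity-check that stacking, here's a minimal Python sketch (the list price is an assumed figure, back-calculated so the 12% coupon lands on $832 - it's not an official Lenovo number):

# Stacked-discount arithmetic for the X230 figures above.
LIST_PRICE = 945.45  # assumed ballpark: 832 / 0.88

def apply_discounts(price, *rates):
    """Apply each fractional discount in turn (multiplicative stacking)."""
    for rate in rates:
        price *= 1 - rate
    return price

after_coupon = apply_discounts(LIST_PRICE, 0.12)   # about 832
after_visa = apply_discounts(after_coupon, 0.20)   # about 666, i.e. the ~$670 quoted
print(f"${after_coupon:.0f} after coupon, ${after_visa:.0f} after VISA")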

If an optical drive is a big deal, the T430s is also comparable to this Macbook, with 2 options (one has a 520M video card, the other an i5 3320 instead of a 3210). The difference is it lacks an IPS screen, but then Lenovos aren't meant for graphics by any stretch. It's a bit bigger at 14.1" and weighs 3.74 pounds (my friend just got one). The weight they give on the site is for the heaviest possible configuration (2 extra batteries in expansion bays), so it can be ignored. It does have an option for a 1600x900 screen, though. It costs a bit more than the X230, but the configuration with the Blu-ray writer can still be purchased for around $900 or so with a Visa card.

Both laptops are fully customizable during purchase, so you can get more features (e.g. a fingerprint scanner), or fewer. Both also have an mSATA slot, which can be used for a SATA 3 SSD in addition to the regular hard drive, or for an integrated 4G card.

Won't touch the $2200 Macbook Pro Retina - for one, I never claimed anything about very specific, or very high-end, models. For two, no one else makes Retina-class screens. And for three, I don't see any use for them on a Windows machine: everything is going to be extremely tiny (goodbye eyesight), or isn't going to be scaled properly.

OracleofWuffing
2012-09-26, 05:28 AM
As much as I adore and would love to sing praises of Mac products... Wal-Mart does not sell Macbooks. Even if we ignore Mindfreak's budget issues (double-price discussion aside, Apple doesn't directly make a sub-900 notebook), that's a big problem for Mindfreak.

That said, I would second all caution about buying laptops on Wal-Mart layaway. The computers you see on their shelves are likely going to be what the industry refers to as "consumer grade," which is a nice way of saying "the cheapest we can get, because we don't like spending money on you." This is a sweeping generalization that isn't always true, but if you want to get a computer with some amount of reliability, you want to start looking at an Office or Small Business line. The major players already offer their own payment plans or Bill Me Later if you absolutely must buy on layaway.

From the information you've given us, one thing I'd like to point out is that your brother does not necessarily want a "gaming" laptop. There are apparently people that run Team Fortress 2 on Windows 98. Virtually any modern laptop with an i3 processor and a single gig of RAM can run the game. Maybe not at the highest graphics settings (that'll be more a function of getting good video card/chipset with the computer), but it'll be able to play the game and you can still frag your hordes or whatever it is you do in that game perfectly fine.

If it were me (and I rarely spend even $75 on someone else - much less so for my siblings - so yeah, maybe I'm just a mean ol' scrooge), I'd actually get something like this (http://www.newegg.com/Product/Product.aspx?Item=N82E16834200545) for both of your siblings. A quick search got me to a fan site saying it'll run TF2 on medium to high settings. I personally prefer a 7200 RPM hard drive, but it looks like the market's heading towards 5400 RPM and I can't have nice things. Potentially because of the ubiquity of tablets, a lot of the 17" laptops (the kind that would have a number pad, which your sister might want) made now are made with "desktop replacement" in mind, and focus on being bulky and expensive (which your sister might not want). That's not to say they're not there, just that you'll have to look harder than the 5 minutes I have. :smallwink:

Here's (http://www.walmart.com/ip/Asus-Matte-Dark-Brown-15.6-K55A-RBR6-Laptop-PC-with-Intel-Core-i5-2450M-Processor-and-Windows-7-Home-Premium-with-Windows-8-Pro-Upgrade-Option/21002708) an option that's from Wal-Mart. Given that it's a 15" with a numpad, I don't really see it being a comfortable keyboard, but perhaps your store would have a display unit to see for yourself.

Brother Oni
2012-09-26, 06:23 AM
Also, the inane Samsung lawsuit, but that's beside the point and probably against forum rules.

The whole iPhone debacle in China must have you positively giddy with schadenfreude, then. :smalltongue:

factotum
2012-09-26, 06:35 AM
I personally prefer a 7200 RPM hard drive, but it looks like the market's heading towards 5400 RPM and I can't have nice things.

These days, if you really want decent disk performance from a laptop you need to go SSD - 7200 RPM 2.5" drives are usually intended for large servers, so they don't usually put them in laptops. Consider yourself lucky; I remember when 4200 RPM was the standard spindle speed for a laptop hard drive! :smallsmile:

The Succubus
2012-09-26, 06:40 AM
Hey Rawhide, is there any chance we could change the forum rules to forbid Mac vs Windows and iPhone vs Android debates? It could fall under "religion" for some folks. :smalltongue:

Rawhide
2012-09-26, 06:58 AM
Recommending a high-end Macbook Pro to someone who specifically wanted a sub-$1000 laptop that can hopefully play games is... weird.

I haven't recommended a high-end MacBook Pro to anyone. It was a comment that it could do all the things someone claimed laptops couldn't. It's well out of the original poster's budget (something I made sure to comment on) and was never recommended for them.

---

What is the quality of the screen like on the non-Apple laptop? What about the quality of the keyboard? And the quality of the touchpad (I've used them before, the ThinkPad ones are terrible)? How about the connectors and the quality of the power supply (you know that MacBook power supply connectors are magnetic?)? The little touches on all of the other little things?

As for sale prices, I have seen them discounted by 10% or more (that includes brand new models about a month old), and even more commonly with bundles/included extras (which, if you want what's in the bundle, can be a really good deal). The Apple store online is a terrible way to compare, as it almost never discounts, and this is for a reason: they do not want to undercut their resellers. What you see there is RRP, not street price.

There are many, many elements that go into the creation of these laptops that increase their price within reason, when those things are important to you. If they aren't important to you, then the expense will be wasted. But the whole claim of being 2x the price is a load of bull.

I'm not saying that Macs are for everyone, and I'm definitely not claiming them to be cheap. All I am saying is that the prices are justified when everything in the construction and development of the systems is taken into consideration. If the features they have don't matter to you, then the expense won't be worth it for you.

I'm no Mac fanboi, nor do I think they are the perfect answer to every question, but I do hate the bashing of Apple/Mac products based on myths and untruths, the same way in which, while I don't think it's the best browser by any stretch, I also hate the bashing of IE based on myths and untruths.

---

Also, why won't you consider the retina display ones? Especially since those were exactly the machines I was talking about before you said anything, and the machines I challenged people to match.

Heliomance
2012-09-26, 07:18 AM
What exactly is retina display? I figured it was just meaningless marketing jargon, but you're talking about it like it's something special.

Rawhide
2012-09-26, 07:29 AM
What exactly is retina display? I figured it was just meaningless marketing jargon, but you're talking about it like it's something special.

A high-DPI display. The full resolution of the 15-inch MacBook Pro with Retina Display screen is 2880 by 1800 pixels. Though I personally was referring to the whole system (which is called "MacBook Pro with Retina Display"), rather than just the screen.
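
To put a number on "high DPI", here's a quick Python sketch (the 15.4" diagonal is the published panel size; the second resolution is just a typical non-Retina 15" panel for comparison):

import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'Retina 15": {ppi(2880, 1800, 15.4):.0f} ppi')    # ~220 ppi
print(f'Typical 15": {ppi(1440, 900, 15.4):.0f} ppi')    # ~110 ppi

So "Retina" here means roughly double the linear pixel density of a conventional panel.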

Erloas
2012-09-26, 09:24 AM
So, what's the equivalent of a GeForce GT 650M with 1GB of GDDR5 (not DDR3) RAM? I would tend to think that it would be a GeForce GT 650M with 1GB of GDDR5.
Well, they might not do it on everything, and it might change from generation to generation. My bet is that they didn't change that one because it was a higher-end part. But I know, having looked when this discussion came up before, that there are a lot of cases where they do change the model name slightly, and Apple is the only one with that SKU/designation.

One other thing that is probably part of the problem is that you are from Australia, which is well known for having price issues compared to a lot of the rest of the world, and with that, sometimes the price differences are more or less than what they are in other parts of the world. But it is also well known that Apple isn't putting more money into their laptops compared to their price, because they have by far the highest profit margins of any computer manufacturer, so they are clearly pocketing more of your money than they are spending on giving you more for your money. No doubt that is good for the company and for the stockholders, but it's not to the benefit of the consumer.

What I don't really understand is why Macs even came up in this topic, considering that they don't make a single model that fits the criteria being asked for by the original poster. A quick check of Newegg shows they have all of 2 computers under $1000, both of which are essentially netbooks in terms of size and pretty close in terms of specs. They don't have a full-sized laptop (15"+), or one with a dedicated video card, until you get to $1800. So of the very basic requirements from the OP, they don't meet any of the criteria other than being a laptop. Granted, Newegg might not have the full line of Apple products, but they should at least have a decent cross-section of what Apple offers.

Rawhide
2012-09-26, 10:02 AM
What I don't really understand is why Macs even came up in this topic, considering that they don't make a single model that fits the criteria being asked for by the original poster.

They weren't brought up in this topic as a recommendation for the original poster; this much has been discussed to death already. The relevant part was a gaming laptop with excellent battery life. It's possible, it exists, and the MacBook Pro w/ Retina is one of them; they just aren't cheap.


But it is also well known that Apple isn't putting more money into their laptops compared to their price, because they have by far the highest profit margins of any computer manufacturer, so they are clearly pocketing more of your money than they are spending on giving you more for your money. No doubt that is good for the company and for the stockholders, but it's not to the benefit of the consumer.

Not so fast with that profit margins claim. (http://www.tgdaily.com/technology/43029-teardown-reveals-thin-profit-margin-on-mac-mini)

Also, how is having more money from the previous generation of devices being re-invested into the current generation not of benefit to the consumer?

Drumbum42
2012-09-26, 01:13 PM
Boot Camp is running Windows natively. It's not a virtual machine, it's Windows running on its own partition, natively. You are "running Windows to begin with". It's the same as having a partition for Linux and a partition for Windows. Parallels and VMware Fusion are both virtual machines, and do slow things down - this may be what you're thinking of.

I wasn't discussing the OS, only the hardware.

Just to clear things up really quick: I'm a Mac tech. I work on them all day and spend lots of time "under the hood" working on things. Using Boot Camp, Windows is running "mostly" natively. Windows is not running on bare metal (or talking directly to the CPU, motherboard, etc.). There is an emulator between the OS and hardware that is not there on PC hardware, mostly because of video cards (well, any cards really).

A Mac video card cannot run on PC motherboards, and a PC video card cannot be used on a Mac motherboard. It's a firmware thing, and unless you mod the firmware on the card, it doesn't work. That said, they both support the x86-64 instruction set, so the CPU commands need no translation - thus the "mostly" native.

To answer the original gaming aspect of this question: a Windows install will run just like normal on a Mac 99% of the time, BUT there will be some things that just won't run the same. Sorry, but that was bothering me. Not that I blame anyone, because it's not something that's advertised.

Don Julio Anejo
2012-09-26, 01:41 PM
Comparing to my X230...


What is the quality of the screen like on the non-Apple laptop?
About the same. Anti-glare IPS screen (don't you have to pay extra for anti-glare on Macs? Not sure.). Probably wouldn't use it for photo editing, but contrast is comparable to my (specifically bought for photography) Dell U2212HM. It is a lot brighter than most Macs I've seen though.

What about the quality of the keyboard?
X230 is better by a mile, you can't even compare them. Doesn't help that Macs sacrifice key placement for design, so no PgUp/PgDown, Delete (as in delete letter in front, not the Backspace key called Delete on Macs), Home and End keys. Might be a Mac thing in general, but still annoying for any serious amount of reading or word processing.

And the quality of the touchpad (I've used them before, the ThinkPad ones are terrible)?
ThinkPads have a TrackPoint mouse in the middle of the keyboard. I'll agree on the point of the touchpad itself, but generally it's only there just in case. I'll give it parity though, since ergonomics usually = personal preference.

How about the connectors and the quality of the power supply
2x USB 3.0, 1x 2.0, VGA, mini-DisplayPort, built-in b/g/n (A is available for another $20) with a 1x1 or 2x2 antenna, LAN port, SD card reader, and a headset jack. The power supply is the only thing it loses on; it's a more conventional one, but it's designed so that if you yank on it, it pops out instead of breaking something.

The little touches on all of the other little things?
On the other hand, no upgradeability whatsoever. Probably doesn't matter to about 80-85% of people who buy computers, but it matters to tech geeks like me, or to normal people who like DIY to save money. Mac: only the RAM is upgradeable, 2x4GB max. X230: RAM: 2x8GB. Hard drive: completely removable/upgradeable with SATA 3, so you can put in a bigger hard drive or an SSD. mSATA slot: can house an integrated HSPA or 4G card, or a SATA 3 SSD, so you can use one as your system drive while still keeping the hard drive. Now, Lenovos are in the top tier for upgradeable PCs, but even lower-tier machines all let you change the hard drive. On many Pros I've seen, you can't even remove the battery without opening the case, which may or may not void the warranty.

Also, any Lenovo with an optical bay lets you stick another battery or hard drive there instead.

As for sale prices, I have seen them discounted by 10% or more (that includes brand new models about a month old), and even more commonly with bundles/included extras (which, if you want what's in the bundle, can be a really good deal). Apple store online is a terrible way to compare, as it almost never discounts, and this is for a reason, they do not want to undercut their resellers. What you see there is RRP, not street price.
True enough, although several years ago I wanted a Macbook Pro (my main concern was battery life and back then mid-spec PC's didn't run for more than 3 hours or so), couldn't find any sales, gave up and got a netbook.

There are many, many elements that go into the creation of these laptops that increase their price within reason, when those things are important to you. If they aren't important to you, then the expense will be wasted. But the whole claim of being 2x the price is a load of bull.

Didn't I just show you that the price is if not double, then at least 1.5x higher?


I'm not saying that Macs are for everyone, and I'm definitely not claiming them to be cheap. All I am saying is that the prices are justified when everything in the construction and development of the systems is taken into consideration. If the features they have don't matter to you, then the expense won't be worth it for you.

I know exactly what I'm paying for. I can, for example, dump a cup of coffee on my keyboard, then wash it down with a cup of water and the only thing I'll get is sticky keys. If that starts to annoy me, I can pop the keyboard and put in a new one in 5 minutes.

I honestly don't see what there is worth paying for that much in Macs except screens and a really nice design and maybe form factor for higher-spec models.

Also, why won't you consider the retina display ones? Especially since those were exactly the machines I was talking about before you said anything, and the machines I challenged people to match.
I don't see any point to having a Retina screen unless you're an artist and want lines to look smoother or something. There's a reason pretty much no one makes desktop monitors beyond 2560x1440, and that's on 27" models - they're not needed (and chances are they will mess up text scaling). But since no one else makes them, it's also difficult to compare. I did a short comparison of specs by themselves, and ended up with $1500 or so after purchase of a separate SSD for a Lenovo T530, but the comparison is moot since the T530 is 3/4 of a pound heavier and the video card is only a 540M.


Also, how is having more money from the previous generation of devices being re-invested into the current generation not of benefit to the consumer?
Because the money isn't? All the R&D is done by manufacturers like Intel or Nvidia. And the link only hinders your argument: they specifically state that the low profit margin was for basically the absolute cheapest Mac you could get at the time. When you look at custom configurations for the 2012 Pros, adding 4GB of RAM costs something like $100, while a stick of RAM by itself is $20 if I buy one as a consumer, and probably a lot less for a major company.

Erloas
2012-09-26, 02:14 PM
I did a short comparison of specs by themselves, and ended up with $1500 or so after purchase of a separate SSD for a Lenovo T530, but the comparison is moot since the T530 is 3/4 of a pound heavier and the video card is only a 540M.
Well, I did a bit of a check from earlier on the $1800 Macbook Pro (not the Retina display - that's at least $400 more).
We had 3 options with the same video card and processor and all 3 are 15" laptops.
http://www.newegg.com/Product/Product.aspx?Item=N82E16834100226
http://www.newegg.com/Product/Product.aspx?Item=N82E16834152339
http://www.newegg.com/Product/Product.aspx?Item=N82E16834230411
The MSI is $1040 before MIR, the ASUS is $1150, and the Apple is $1800. The Macbook is heavier than the MSI by a little bit (the Asus doesn't have a weight listed). The Asus has 2x as much RAM, the MSI 1.5x as much. Both have 1.5x more hard drive space.
The Macbook has the lowest resolution of the 3. Both the MSI and Asus have 2GB of video memory; the Apple has 512MB (though, as I said, it's almost meaningless).
It looks like the battery advantage goes to Apple, because they have a 77 Wh battery whereas the other 2 have 6-cell batteries (a quick search shows those being 56 Wh) - so about a 30% better battery. The battery is going to be the determining factor for runtime, since Nvidia makes the video card, Intel makes the processor, and there isn't much difference in motherboard power draw. (Apple doesn't make their own motherboards - I assume Foxconn does. As an aside, Apple doesn't make any internal components; all they make is the case, and there are only a few places that make LCD panels, Apple not among them - those few places sell panels to everyone else.) And all 3 have 5400 RPM hard drives.
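
To turn those watt-hours into rough runtimes, here's a minimal Python sketch (the power draws are assumed ballpark figures for light use vs. gaming, not measurements):

# Battery-runtime arithmetic: hours is roughly capacity (Wh) / average draw (W).
BATTERIES_WH = {"Apple": 77, "MSI/Asus 6-cell": 56}
DRAWS_W = {"light use": 15, "gaming": 60}  # assumed ballpark power draws

for name, wh in BATTERIES_WH.items():
    for load, watts in DRAWS_W.items():
        print(f"{name}, {load}: {wh / watts:.1f} h")

# Either way gaming runtime collapses to about an hour, so the extra
# 21 Wh buys far more time at idle than it does in a game.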
The Asus has a Blu-ray drive in it; the other two, I think, are just DVD (the Apple doesn't specify). All 3 have Bluetooth, wireless (obviously), and USB 3.0 ports. Apple has a Thunderbolt port, which can be changed to an actually useful HDMI port with an adapter; the other two have HDMI directly, and Apple has FireWire ports. So in theory the Apple has the advantage in ports, but Thunderbolt is rarely used and doesn't seem to be growing with any real speed, and FireWire is all but dead and only useful if you very specifically already need it.

Personally, $650 is way too much to pay for a little bit better battery (you could buy about 8-10 spare batteries for that cost) and a brushed aluminum case. So I wasn't too far off: that's a 35% difference, and I figured it would be in the 20-30% range.

OracleofWuffing
2012-09-26, 05:18 PM
These days, if you really want decent disk performance from a laptop you need to go SSD - 7200 RPM 2.5" drives are usually intended for large servers, so they don't usually put them in laptops. Consider yourself lucky; I remember when 4200 RPM was the standard spindle speed for a laptop hard drive! :smallsmile:
Yeah, I think I used to use one for an IRC bot.:smalltongue:

Maaan, that sucks, though. Are decently-sized SSD prices still through the roof? *Checks Newegg* Yes, yes they are. :smallfrown:

Rawhide
2012-09-26, 07:41 PM
Just to clear things up really quick: I'm a Mac tech. I work on them all day and spend lots of time "under the hood" working on things. Using Boot Camp, Windows is running "mostly" natively. Windows is not running on bare metal (or talking directly to the CPU, motherboard, etc.). There is an emulator between the OS and hardware that is not there on PC hardware, mostly because of video cards (well, any cards really).

A Mac video card cannot run on PC motherboards, and a PC video card cannot be used on a Mac motherboard. It's a firmware thing, and unless you mod the firmware on the card, it doesn't work. That said, they both support the x86-64 instruction set, so the CPU commands need no translation - thus the "mostly" native.

To answer the original gaming aspect of this question: a Windows install will run just like normal on a Mac 99% of the time, BUT there will be some things that just won't run the same. Sorry, but that was bothering me. Not that I blame anyone, because it's not something that's advertised.

Didn't think it was worth mentioning that Macs use EFI rather than BIOS, which Windows XP and Vista didn't support (but later editions do). Though I still wouldn't want to use Windows 7 on an EFI device without manufacturer support, as it's not really mature enough. It's such a minor point, however.


Comparing to my X230...

About the same. Anti-glare IPS screen (don't you have to pay extra for anti-glare on Macs? Not sure.). Probably wouldn't use it for photo editing, but contrast is comparable to my (specifically bought for photography) Dell U2212HM. It is a lot brighter than most Macs I've seen though.

I can't do a direct comparison, not having them in front of me, but in general I've found all the monitors on Macs to be of relatively high quality, and usually of higher or much higher quality than similar systems people claim are "better" or "cheaper".


X230 is better by a mile, you can't even compare them. Doesn't help that Macs sacrifice key placement for design, so no PgUp/PgDown, Delete (as in delete letter in front, not the Backspace key called Delete on Macs), Home and End keys. Might be a Mac thing in general, but still annoying for any serious amount of reading or word processing.

I'm talking quality of construction, not placement of keys. And you are aware that you can plug any USB keyboard in? Laptops specifically have a reduced number of features (also, try pressing Fn-Delete; it deletes the character in front).


ThinkPads have a TrackPoint mouse in the middle of the keyboard. I'll agree on the point of the touchpad itself, but generally it's only there just in case. I'll give it parity though, since ergonomics usually = personal preference.

I hate trackpoint mice almost as much as I hate touchpads. Well, except for the touchpad on a Mac, which they actually sell separately for desktops - and people actually buy them!


2x USB 3.0, 1x 2.0, VGA, mini-DisplayPort, built-in b/g/n (A is available for another $20) with a 1x1 or 2x2 antenna, LAN port, SD card reader, and a headset jack. The power supply is the only thing it loses on; it's a more conventional one, but it's designed so that if you yank on it, it pops out instead of breaking something.

Again, I'm talking quality of construction.


On the other hand, no upgradeability whatsoever. Probably doesn't matter to about 80-85% of people who buy computers, but it matters to tech geeks like me, or to normal people who like DIY to save money. Mac: only the RAM is upgradeable, 2x4GB max. X230: RAM: 2x8GB. Hard drive: completely removable/upgradeable with SATA 3, so you can put in a bigger hard drive or an SSD. mSATA slot: can house an integrated HSPA or 4G card, or a SATA 3 SSD, so you can use one as your system drive while still keeping the hard drive. Now, Lenovos are in the top tier for upgradeable PCs, but even lower-tier machines all let you change the hard drive. On many Pros I've seen, you can't even remove the battery without opening the case, which may or may not void the warranty.

Also, any Lenovo with an optical bay lets you stick another battery or hard drive there instead.

Yes, upgradeability is one area where it loses out. But, as you mention and in the laptop market in particular, this is one area it doesn't matter so much. However, if that is something that matters to you, then it can be a huge negative.

The trade off here is that you get smaller, lighter, devices of higher quality (e.g. the monitors have the glass attached to them, which reduces glare, reduces thickness, and produces a generally better screen, but has to be replaced entirely if broken).


True enough, although several years ago I wanted a Macbook Pro (my main concern was battery life and back then mid-spec PC's didn't run for more than 3 hours or so), couldn't find any sales, gave up and got a netbook.

Every single time I went to upgrade my PC, I checked Macs out. Every single time I looked at the features and quality of a Mac and wanted it. Every single time I decided to buy a PC with Windows on it. Why? Because the features I want and placed importance on could be purchased for much cheaper elsewhere.

They're not for everyone and they weren't for me. But they are good systems and they are reasonably priced for what you get. The only reason I have one now as just one of my systems is for software development and testing.


Didn't I just show you that the price is if not double, then at least 1.5x higher?

No, you cherry picked the features/specs that you favour and two systems which don't compare on the other features (one of which has become a budget supplier), then compared the RRP of one to the street price of the other.

If those are only the features you care for, then buy the cheaper one, it's as simple as that. But the price difference isn't as simple as saying you're paying more for less, because you're not.

It is also, again, not the system I was talking about before you said anything, nor was it the system I challenged people to match.


I know exactly what I'm paying for. I can, for example, dump a cup of coffee on my keyboard, then wash it down with a cup of water and the only thing I'll get is sticky keys. If that starts to annoy me, I can pop the keyboard and put in a new one in 5 minutes.

I honestly don't see what there is worth paying for that much in Macs except screens and a really nice design and maybe form factor for higher-spec models.

Then don't buy one. No one is forcing you, least of all me. But other people can see what they're paying for, do like those features, and do decide to purchase one.


I don't see any point to having a Retina screen unless you're an artist and want lines to look smoother or something. There's a reason pretty much no one makes desktop monitors beyond 2560x1440, and that's on 27" models - they're not needed (and chances are they will mess up text scaling). But since no one else makes them, it's also difficult to compare. I did a short comparison of specs by themselves, and ended up with $1500 or so after purchase of a separate SSD for a Lenovo T530, but the comparison is moot since the T530 is 3/4 of a pound heavier and the video card is only a 540M.

Honestly, as I have already said, I wasn't talking about the Retina display, but the whole package: industry-leading touchpad, thin, light, long battery life, flash storage, high-quality construction, a high-end (for a laptop) graphics card, a fast processor, and all the other features that make using it just so much easier and more pleasant. The Retina display is also awesome, by the way, even for non-"artist" tasks, and you should not dismiss it so easily. OS X scales things very well, and Apple provides drivers for Windows (which already partially supports high DPI and should soon support its own high-DPI features better).


Because the money isn't? All the R&D is done by manufacturers like Intel or Nvidia. And the link only hinders your argument: they specifically state that the low profit margin was for basically the absolute cheapest Mac you could get at the time. When you look at custom configurations for the 2012 Pros, adding 4GB of RAM costs something like $100, while a stick of RAM by itself is $20 if I buy one as a consumer, and probably a lot less for a major company.

You obviously don't know how Apple's R&D works then.

That article doesn't hinder it at all. In general, all higher end systems will have higher profit margins, and those potentially higher profit margins aren't as high as you claim.

And let's not forget what they are doing for their social responsibility (http://www.apple.com/supplierresponsibility/). Yes, they publish the reports and have actually terminated contracts for non-compliance (http://www.telegraph.co.uk/technology/apple/8324867/Apples-child-labour-issues-worsen.html).

Don Julio Anejo
2012-09-26, 09:01 PM
I can't do a direct comparison, not having them in front of me, but in general I've found all the monitors on Macs to be of relatively high quality, and usually of higher or much higher quality than similar systems people claim are "better" or "cheaper".
That's because all Macs use IPS display panels. Most PCs use cheaper TN panels (less motion blur and really good response times, but really bad contrast/color reproduction and viewing angles).

I'm talking quality of construction, not placement of keys. And you are aware that you can plug any USB keyboard in? Laptops specifically have a reduced number of features (also, try pressing Fn-Delete; it deletes the character in front).

And I'm talking ergonomics, where, I'm sorry, but the Mac loses significantly (IMO only, of course):
a) Keys are a little too hard to press.
b) Keys are too far apart (then again, someone who's used a Mac their entire life will find PC keyboards cramped).
c) Lots of buttons I'm missing. Sorry, but when I (or anyone, really) have to type a 10-50 page paper, it gets really annoying having to use 2 hands and 2 keys to do something that's standard on 95% of keyboards and should already be there. Also, PgUp/PgDn keys on many PCs are now near the arrow keys to simplify navigation.
d) I shouldn't have to plug in a separate keyboard. Mouse, maybe, but keyboard? Plus then the screen would be too far away unless I'm at a computer desk. Plus I prefer to type in an armchair/my bed.

I hate trackpoint mice almost as much as I hate touchpads. Well, except for the touchpad on a Mac, which they actually sell separately for desktops - and people actually buy them!

Which is why I didn't want to address ergonomics to begin with: they're really personal. I don't like large touchpads, for example, I prefer smaller ones, like 2"x2" or so. I also hate all kinds of multitouch gestures - I prefer to do the action on the keyboard, and if the touchpad has that feature, anytime I drag my wrist, hold the finger sideways, or my hands are wet, something multitouch happens and screws up whatever I was doing.

Again, I'm talking quality of construction.
A brushed aluminium case pretty much falls under design. Overall quality is about the same as higher-end Asus models, and it loses to ThinkPads. Why?
a) Shockproof. The system itself is also built in such a way that any fall's shock to the hard drive is absorbed, so worst case scenario, data should still be recoverable.
b) Spillproof. The top part of the laptop is completely separate from the internals, there are grooves leading to the bottom of the case for liquids to drain from, and the keyboard/touchpad/trackpoint are all easily replaceable.
Nice shiny exterior? You got me here, but I've admitted that from the start. And even then, the Asus Zenbook (admittedly, probably a Macbook Air clone) has the same exterior.

The trade off here is that you get smaller, lighter, devices of higher quality (e.g. the monitors have the glass attached to them, which reduces glare, reduces thickness, and produces a generally better screen, but has to be replaced entirely if broken).
What you're describing is called an anti-glare (aka matte) screen. Most business laptops have them (e.g. standard on most Dells, on all ThinkPads, most Fujitsus, HP's business line, etc.), although most consumer-grade ones you see at Walmart/Best Buy won't, since glare (aka glossy) screens are cheaper to make and look better on the showroom floor. That said, very few companies put in IPS panels as standard (a few Asus/ThinkPad models have them, and I think that's pretty much it). Lighter? Depends on the model, as shown to you by both me and Erloas, but in general you can find a similarly spec'd laptop with similar weight. The main loss is usually battery life.

Every single time I went to upgrade my PC, I checked Macs out. Every single time I looked at the features and quality of a Mac and wanted it. Every single time I decided to buy a PC with Windows on it. Why? Because the features I want and placed importance on could be purchased for much cheaper elsewhere.
Which is exactly our point. No-one is saying Macs are bad (they're not). We are, however, saying that they're more expensive for the same features.


No, you cherry picked the features/specs that you favour and two systems which don't compare on the other features (one of which has become a budget supplier), then compared the RRP of one to the street price of the other.
I compared my X230 to the cheapest Macbook Pro, given that size/weight/battery life/tech specs are comparable, with the lack of an optical drive (which many people don't use anymore) compensated by weighing a pound less. I also went with the price off the Lenovo website (to be fair, you can only get ThinkPads off the official website). Also, in Canada there are very few resellers, and except for university bookstores, they're still technically an Apple branch (i.e. displays at Future Shop/Best Buy usually have a special Mac guy with them instead of normal salespeople). I'm about 90% certain that prices everywhere are set by Apple and not the reseller. No idea about the States; I haven't lived there in something like 12 years.


It is also, again, not the system I was talking about before you said anything, nor was it the system I challenged people to match.
Erloas matched it.


Honestly, as I have already said, I wasn't talking about the retina display, but the whole package. Industry-leading touchpad, thin, light, long battery life, flash-memory hard drive, high-quality construction, high-end (for a laptop) graphics card, fast processor, and all the other features that make using it just so much easier and more pleasant. The retina display is also awesome, by the way, even for non-"artist" tasks, and you should not dismiss it so easily. OS X scales things very well, and Apple provides drivers for Windows (which already partially supports high-DPI and should soon support its own high-DPI features better).

As I've said, it's only high-quality construction when compared to El Cheapo consumer models that usually have to sacrifice at least something. While not every manufacturer places a high priority on it, many companies do, in fact, have really durable models/lines (ThinkPad, Zenbook, etc). I also admitted quite freely that on a Windows machine, there's no point to a Retina screen, at least not yet. After Windows 8, maybe. Now, not so much. For comparison to non-retina Macbook, see Erloas' post.


You obviously don't know how Apple's R&D works then.
Maybe not, but I started hating Apple as a company after the Samsung lawsuit; after the iPhone 5 turning out to be essentially an iPhone 4S with 4G (which should have come with the 4S to begin with) for $150 more at release; and after Apple Maps. All of them show a lack of R&D and a lot of marketese, which is the worst thing I can think of for a tech company to focus on.

Rawhide
2012-09-26, 09:20 PM
We're getting way off topic for this thread, and I would like to wind this up as much as possible, there's just a few things I'd like to address first.


d) I shouldn't have to plug in a separate keyboard. Mouse, maybe, but keyboard? Plus then the screen would be too far away unless I'm at a computer desk. Plus I prefer to type in an armchair/my bed.

I'm not saying that you should plug in a keyboard, just that the option is there. I use external keyboards on all laptops when they are at a desk, because the laptop keyboards are just terrible in comparison. Well, except for the Mac, which works really well and feels really good. All the keys you need are there when you learn the differences (also, the arrow keys can be modified by a keypress to do PgUp, PgDn, Home, and End).

It's actually the only laptop I've used that I enjoy using on my lap or lying down in bed.


Which is why I didn't want to address ergonomics to begin with: they're really personal. I don't like large touchpads, for example, I prefer smaller ones, like 2"x2" or so. I also hate all kinds of multitouch gestures - I prefer to do the action on the keyboard, and if the touchpad has that feature, anytime I drag my wrist, hold the finger sideways, or my hands are wet, something multitouch happens and screws up whatever I was doing.

The multi-touch gesture recognition is actually really good, and it's just as good at recognising accidental touches. However, if it's still not your cup of tea, every single gesture can be switched off granularly (want them all off? Sure! Want only this one on? That can be done too).


What you're describing is called an anti-glare (aka matte) screen.

No, I'm describing a glass screen, glossy, with greatly reduced glare.


Which is exactly our point. No-one is saying Macs are bad (they're not). We are, however, saying that they're more expensive for the same features.

No, they're not. They may be more expensive for the features you place value on, but that does not make them "more expensive for the same features".


Erloas matched it.

No, he didn't.

Erloas
2012-09-26, 10:01 PM
Erloas matched it.
To be fair, I matched a different MacBook Pro. The Retina display version is $2200, has a 256GB SSD (instead of a 500GB normal hard drive), has 8GB of RAM, and is about 1 lb lighter; all other specs are essentially the same.
The extra 4GB of RAM retails for about $20 (4GB is $16, 8GB is about $34). The 256GB SSD is about $160-250 depending on brand and specs, so $90-180 more than the standard hard drive. It's hard to put a price on weight, though I wouldn't be surprised if the majority of that weight saving came from the SSD. And the last bit is the Retina display... which is really just not possible to price. The only desktop monitors with close to that resolution are all 30", and there aren't any 30" monitors with a lower resolution to compare price against for the resolution change rather than the size change.
Of course, the prices given are retail, what any consumer could go to Newegg and buy the parts for; surely Apple gets a better price on them.
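To put that arithmetic in one place, a quick sketch (Python; the $70 drive price is just what the $90-180 delta above implies, not a quoted price):

ram_delta = 34 - 16            # 8GB stick vs 4GB stick, ~$18 retail
ssd_low, ssd_high = 160, 250   # 256GB SSD retail range
hdd = 70                       # implied price of the standard 500GB drive
print("RAM upgrade: ~$%d" % ram_delta)                            # ~$18
print("SSD premium: ~$%d-%d" % (ssd_low - hdd, ssd_high - hdd))   # ~$90-180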

One thing I hadn't caught right away (though I did notice it earlier when I posted, I just didn't mention it at the time) is that those are the GT NVIDIA cards rather than the GTX cards. The GTX line is their more powerful line, and it's hard to give a good comparison of what that change actually means. The significantly more powerful (and pretty much top-of-the-line laptop GPU) GTX 670M can be found in laptops starting around $1400, though those are all 17" desktop-replacement types of laptops, and it's hard to find anything from Apple that really compares to them in any meaningful way. There are a lot of options with something more powerful than the GT 650M but less powerful than the GTX 670M (including AMD's line of GPUs), but that just makes the comparisons much more complicated again. One of the main reasons is that very few people are going to spend the money for a high-end gaming laptop and stick with the smaller 15" form factor, so no one makes anything smaller than 17" with the higher-end video cards.

Rawhide
2012-09-26, 10:11 PM
One thing I hadn't caught right away (though I did notice it earlier when I posted, I just didn't mention it at the time) is that those are the GT NVIDIA cards rather than the GTX cards. The GTX line is their more powerful line, and it's hard to give a good comparison of what that change actually means.

From what I've read, with the clock speeds Apple has used, and the GDDR5 RAM, the GT 650M used on the MacBook Pro line compares favourably to the GTX 660M. Also, let's not forget that they actually use two graphics cards - switching between them as required.

Don Julio Anejo
2012-09-26, 11:03 PM
From what I've read, with the clock speeds Apple has used, and the GDDR5 RAM, the GT 650M used on the MacBook Pro line compares favourably to the GTX 660M. Also, let's not forget that they actually use two graphics cards - switching between them as required.
All Ivy/Sandy Bridge systems switch between the integrated and discrete GPUs if the latter are available. The main thing about the MacBook's GT 650M is that it uses GDDR5 while the normal 650M doesn't, which is about a 15-20% difference in framerate (oh, the things you learn while arguing on an internet forum.. :smalltongue:). As for clock speeds, though, they are easily adjustable with a program like RivaTuner as long as the actual card is the same. The Mac 650M is overclocked a little more than stock PC cards.

Rawhide
2012-09-27, 03:52 AM
All Ivy/Sandy Bridge systems switch between the integrated and discrete GPUs if the latter are available. The main thing about the MacBook's GT 650M is that it uses GDDR5 while the normal 650M doesn't, which is about a 15-20% difference in framerate (oh, the things you learn while arguing on an internet forum.. :smalltongue:). As for clock speeds, though, they are easily adjustable with a program like RivaTuner as long as the actual card is the same. The Mac 650M is overclocked a little more than stock PC cards.

Not all include a GPU, but most do. However, that's beside the point. That's a specific architecture designed to include a graphics card, which not all companies use if they are going to use a dedicated graphics card. You do have to pay extra (over the cost of a CPU without GPU) in order to get that included GPU, and most companies will try to save those few bucks by just having one or the other.

They also have special on the fly graphics card switching technology (http://arstechnica.com/apple/2010/04/inside-apples-automatic-gpu-switching/). They're not the only company to do that, and that article is old, so technology for all parties has improved, but they do have their own technology to help take care of that.


The difference with the clock speeds is that these are factory-set clock speeds. They are sourced from NVIDIA as being guaranteed to work at that speed (and generally higher). Basically, they are designed and warranted to work at those clock speeds. But that, too, is almost beside my point. The point is that the model number is clear, and it is in fact better than most of the cards of the same model.

Erloas
2012-09-27, 09:57 AM
From what I've read, with the clock speeds Apple has used, and the GDDR5 RAM, the GT 650M used on the MacBook Pro line compares favourably to the GTX 660M. Also, let's not forget that they actually use two graphics cards - switching between them as required.

I haven't seen the specific breakdowns of the clock rates and such, but I know at least the MSI laptop had GDDR5 as well, and the Asus isn't listed (neither is Apple's) on Newegg. Most that I have found listed do show GDDR5, though. A quick check shows the GTX 660M to be about 10% faster than the GT 650M, the GTX 670M about 20-25% faster, and the GTX 680M about 25-30% faster. The GTX 680M is pretty rare, though. The GTX 670M seems pretty common and can be found in a number of laptops in the $1300-1500 range.

As for the integrated graphics switching, the Intel HD 4000 is integrated into every Ivy Bridge laptop processor, and it's the same processor virtually every mid- to high-end laptop uses, including both of the ones I linked to. They can all switch between the lower-powered integrated graphics and the more power-hungry discrete card as needed. It used to be a less common feature, but it is pretty much standard now with Ivy Bridge. It has been common in $1000+ laptops for quite a while too.

And as far as that goes, I would be interested to see the actual battery run times of any of these systems running games. Apple's site says up to 7 hours, but there are always a lot of caveats with any battery test, and it wasn't very clear what their use case was. What it says is "up to 7 hours wireless web," which to me says only light usage gets you close to that: no discrete graphics card use and limited CPU utilization. I would guess that it's not going to run a game without being plugged in for more than 1 hour, maybe 2.

Rawhide
2012-09-27, 11:18 AM
I haven't seen the specific breakdowns of the clock rates and such, but I know at least the MSI laptop had GDDR5 as well, and the Asus isn't listed (neither is Apple's) on Newegg. Most that I have found listed do show GDDR5, though. A quick check shows the GTX 660M to be about 10% faster than the GT 650M, the GTX 670M about 20-25% faster, and the GTX 680M about 25-30% faster. The GTX 680M is pretty rare, though. The GTX 670M seems pretty common and can be found in a number of laptops in the $1300-1500 range.

Here's the clock speeds. (http://images.anandtech.com/reviews/mac/retinaMacBookPro/gpuz.jpg)

You can see that they are very aggressively clocked, at 900MHz, which is higher than the almost identical GTX 660M (http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-660m/specifications).


And as far as that goes, I would be interested to see the actual battery run times of any of these systems running games. Apple's site says up to 7 hours, but there are always a lot of caveats with any battery test, and it wasn't very clear what their use case was. What it says is "up to 7 hours wireless web," which to me says only light usage gets you close to that: no discrete graphics card use and limited CPU utilization. I would guess that it's not going to run a game without being plugged in for more than 1 hour, maybe 2.

Apple's "wireless web" methodology is a bit more aggressive in battery use than other manufacturers generally use for their tests (and some reviewers use, with one site testing it at over 8 hours of constant web browsing). In the tests, it's constantly using the wireless adaptor to browse "25 popular sites".

That said, it's still not real world for all people, and heavy use with the dedicated graphics card will reduce that time (light use will increase it). But the battery is definitely no slouch, even in moderate to heavy use.
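As a rough sanity check on those numbers (Python; the 95 Wh figure is Apple's published battery capacity for the rMBP, and the draw figures are assumed purely for illustration):

battery_wh = 95.0     # rMBP battery capacity, per Apple's spec sheet
web_draw_w = 13.0     # assumed average draw: light web browsing on integrated graphics
game_draw_w = 70.0    # assumed average draw: CPU + GT 650M under load
print("web browsing: %.1f h" % (battery_wh / web_draw_w))    # ~7.3 h
print("gaming:      %.1f h" % (battery_wh / game_draw_w))    # ~1.4 h

Those line up with both the "up to 7 hours" claim and the 1-2 hour gaming guess above.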

Neftren
2012-09-27, 11:49 AM
Here's the clock speeds. (http://images.anandtech.com/reviews/mac/retinaMacBookPro/gpuz.jpg)

You can see that they are very aggressively clocked, at 900MHz, which is higher than the almost identical GTX 660M (http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-660m/specifications).

The GT 650M is extremely aggressively (over)clocked. You could push it even higher, though. I've been reading some results from Windows OC'ing, with people pushing it over 1000 MHz core (with the shader clock around 1200, but I'd have to double check).

I've got my 2011 MBP with the AMD 6750M clocked at 750/…@…v for reference.


Apple's "wireless web" methodology is a bit more aggressive in battery use than other manufacturers generally use for their tests (and some reviewers use, with one site testing it at over 8 hours of constant web browsing). In the tests, it's constantly using the wireless adaptor to browse "25 popular sites".

That said, it's still not real world for all people, and heavy use with the dedicated graphics card will reduce that time (light use will increase it). But the battery is definitely no slouch, even in moderate to heavy use.

Apple's Wireless Web methodology is a pretty lousy standard as is. I get anywhere from 3 hours to 11 hours on my (2011) MBP depending on what I'm doing. The average is around 5 hours, though I am compiling code on this machine and such... I think the machine was advertised at 7 hours of battery life, so I'm coming in under (granted I'm not just "browsing the web").

Rawhide
2012-09-27, 12:23 PM
Apple's Wireless Web methodology is a pretty lousy standard as is. I get anywhere from 3 hours to 11 hours on my (2011) MBP depending on what I'm doing. The average is around 5 hours, though I am compiling code on this machine and such... I think the machine was advertised at 7 hours of battery life, so I'm coming in under (granted I'm not just "browsing the web").

It's better than both their previous methodology and the methodologies that are used by the industry in general.

And the tests aren't designed to be for heavy use, but rather for constant light to moderate use, which is actually quite a common way people use laptops.

That said, I'm definitely not claiming it's a perfect test.

Erloas
2012-09-27, 02:34 PM
Here's the clock speeds. (http://images.anandtech.com/reviews/mac/retinaMacBookPro/gpuz.jpg)

You can see that they are very aggressively clocked, at 900MHz, which is higher than the almost identical GTX 660M (http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-660m/specifications).
Well, I'm going to assume that AnandTech picture is of a stock MacBook Pro, because knowing AnandTech, they could very well be trying to OC stuff. But looking at the NVIDIA site you linked to, the GT 650M has a clock rate of up to 900MHz, same as the MacBook shows. The question is whether the MacBook is the only one using the 900MHz core clock, or how many other manufacturers are using it. There is then also the question of whether Apple uses that with all of their GT 650Ms, or only in the Retina display model.

But even beside that, PC-based laptops with the GTX 660M (what the Apple one is close to with its clock rate, according to what you said earlier) range from $900 (http://www.newegg.com/Product/Product.aspx?Item=N82E16834246628) to $1600 (http://www.newegg.com/Product/Product.aspx?Item=N82E16834230405), and of course all of the GTX 660Ms use GDDR5. They all have the same CPU. The MacBook Pro with a normal display starts at $1800 ($200 more than the most expensive of those, and twice the price of the cheapest), and the Retina display version has the same hardware in terms of processing/gaming and is $400 more (with the advantage of the better display and SSD).

The other issue is when you start looking at the Retina display: it has about 2.5x as many pixels as a 1920x1080 display, so to actually run games at its native resolution your GPU needs roughly that much more power, plus more video RAM. According to notebookcheck.net (the first place I found with a lot of benchmarking for laptop GPUs), the GTX 660M runs well at 1440x900 but is really struggling by the time you get to 1920x1080, and it is going to be nowhere near capable of running any game at the Retina display's 2880x1800. So all the extra resolution you're getting with the Retina display is lost if you're playing games, while the 1920x1080 that most other laptops run at can actually be used. And by the time you get to the $1800-$2200 of the MacBook Pros, you are competing against the GTX 670M and 675M, which are a lot more powerful.
As an aside, 2880x1800 has the same aspect ratio as 1920x1200, which I prefer over 1920x1080, but again, that's an extra ~10% of pixels to fill when gaming, so it requires about 10% more GPU power. I would definitely look at a 2880x1800 monitor for my computer when they start making them.
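For the curious, the pixel math behind those figures (Python, plain arithmetic):

retina = 2880 * 1800   # 5,184,000 pixels
fhd = 1920 * 1080      # 2,073,600 pixels
wuxga = 1920 * 1200    # 2,304,000 pixels
print("Retina vs 1080p: %.2fx" % (retina / float(fhd)))     # 2.50x
print("1920x1200 vs 1080p: %.2fx" % (wuxga / float(fhd)))   # 1.11x, the ~10% above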

Neftren
2012-09-27, 07:23 PM
It's better than both their previous methodology and the methodologies that are used by the industry in general.

And the tests aren't designed to be for heavy use, but rather for constant light to moderate use, which is actually quite a common way people use laptops.

That said, I'm definitely not claiming it's a perfect test.

Good point. :smallbiggrin:


Well, I'm going to assume that AnandTech picture is of a stock MacBook Pro, because knowing AnandTech, they could very well be trying to OC stuff. But looking at the NVIDIA site you linked to, the GT 650M has a clock rate of up to 900MHz, same as the MacBook shows. The question is whether the MacBook is the only one using the 900MHz core clock, or how many other manufacturers are using it. There is then also the question of whether Apple uses that with all of their GT 650Ms, or only in the Retina display model.

Anand does not overclock their review units, unless explicitly doing an overclock test.

I doubt very many other manufacturers factory clock their GT 650M at 900 MHz. If anything, I would underclock and undervolt it, plus tack on NVidia Optimus (for non-Apple laptops) for improved battery life.

I would be inclined to believe that Apple is factory-clocking all MBP models at 900 MHz core, just to make it easier to manufacture everything.


But even beside that, PC-based laptops with the GTX 660M (what the Apple one is close to with its clock rate, according to what you said earlier) range from $900 (http://www.newegg.com/Product/Product.aspx?Item=N82E16834246628) to $1600 (http://www.newegg.com/Product/Product.aspx?Item=N82E16834230405), and of course all of the GTX 660Ms use GDDR5. They all have the same CPU. The MacBook Pro with a normal display starts at $1800 ($200 more than the most expensive of those, and twice the price of the cheapest), and the Retina display version has the same hardware in terms of processing/gaming and is $400 more (with the advantage of the better display and SSD).

The other issue is when you start looking at the Retina display: it has about 2.5x as many pixels as a 1920x1080 display, so to actually run games at its native resolution your GPU needs roughly that much more power, plus more video RAM. According to notebookcheck.net (the first place I found with a lot of benchmarking for laptop GPUs), the GTX 660M runs well at 1440x900 but is really struggling by the time you get to 1920x1080, and it is going to be nowhere near capable of running any game at the Retina display's 2880x1800. So all the extra resolution you're getting with the Retina display is lost if you're playing games, while the 1920x1080 that most other laptops run at can actually be used. And by the time you get to the $1800-$2200 of the MacBook Pros, you are competing against the GTX 670M and 675M, which are a lot more powerful.
As an aside, 2880x1800 has the same aspect ratio as 1920x1200, which I prefer over 1920x1080, but again, that's an extra ~10% of pixels to fill when gaming, so it requires about 10% more GPU power. I would definitely look at a 2880x1800 monitor for my computer when they start making them.

As a note, under OSX, the display isn't actually running at 2880x1800 (easy to verify if you judge using font/interface DPI), at least, not in the traditional sense. I don't actually own a Retina MBP, and without having done any additional research on the matter, my theory is that Apple is just super-sampling every interface element, then downscaling it back to 1440x900. I could be totally wrong.

Also, you're probably not going to be gaming at 2880x1800. 10% additional screen != 10% more GPU power. That's just not how it works. It's relatively easy to draw more pixels; what's not easy is post-processing. Advanced lighting, shadows, cloth simulation, and shaders are the primary bottleneck. The GT 650M has proven to be quite capable of rendering games at 2880x1800. How it will fare in a few years? Who knows. It probably won't be running the latest framerate-intensive FPS, but for running something like Civilization, it's perfect.

As an aside, if you're buying a MacBook Pro, you're probably not primarily gaming on it. A Retina display is phenomenally useful for web design and other art-related disciplines. Might I add that by buying an ASUS G7x, you're getting what is essentially a baking-dish-sized slab that weighs two (three?) times as much as the thin slate of a MacBook Pro.

Buying computers is a series of tradeoffs, factoring in performance, weight, form factor, cost, and so on. Apple chose "slim, strong performance, light, extremely expensive" while ASUS chose "brick, strong performance, HEAVY, more reasonably priced"... Laptops aren't all about the insides, you know... :smallbiggrin:

Rawhide
2012-09-27, 08:02 PM
Erloas: Before the rMBP was released, the NVIDIA site listed the GT 650M as having the following clock speeds:
GDDR3 version: Up to 850MHz
GDDR5 version: Up to 735MHz

And that was an image of the stock speeds to confirm what they were for review.

Karoht
2012-09-27, 09:05 PM
Instead, put the money aside (in an interest bearing account if possible) and then buy them when you have the money.

Seconded.
Put the money aside, wait until black friday or some other sale.

It's basically how I buy any computer component I'm looking for. Save up, put the money away, wait for it to be on sale, spend and feel smart. For example, because I waited all of 2 weeks, I recently bought a pair of video cards for 100 bucks each. I saved over 120 dollars buying them when I did, which covers the cost of my router upgrade. The cards now cost 160 each after the sale. And guess what? I'll do the same with the router money when I upgrade my router. Wait for a good one to be on sale, then jump on it.

Lastly, don't shop at Walmart for electronics if you can avoid it. If it is the only store you can get to easily that has laptops, fine, but I would try Costco if there is one in the area, as the return policy (6 months, no questions asked) and warranty (2 years normal, 3 years concierge, I think) are much better than Walmart's, where you are subject to the manufacturer's whims.

Good luck, happy hunting!

Erloas
2012-09-27, 10:12 PM
As a note, under OSX, the display isn't actually running at 2880x1800 (easy to verify if you judge using font/interface DPI), at least, not in the traditional sense. I don't actually own a Retina MBP, and without having done any additional research on the matter, my theory is that Apple is just super-sampling every interface element, then downscaling it back to 1440x900. I could be totally wrong.

That could be... but then what is the point? If you're lucky you lose very little in upscaling or downscaling, but it is never as good as running at the native resolution of a panel. You aren't going to get a better image from a 2880x1800 monitor upscaling a 1440x900 source than from a straight 1440x900 source on a 1440x900 monitor.


Also, you're probably not going to be gaming at 2880x1800. 10% additional screen != 10% more GPU power. That's just not how it works. It's relatively easy to draw more pixels; what's not easy is post-processing. Advanced lighting, shadows, cloth simulation, and shaders are the primary bottleneck. The GT 650M has proven to be quite capable of rendering games at 2880x1800.
If you aren't running at 2880x1800 then, again, what is the point of having a monitor at that resolution? The monitor is then just a marketing gimmick and doesn't actually give you anything. You aren't even gaining any working space on the screen. I seriously doubt that is what they are doing... as much as I hate most of what Apple does, I'll give them the benefit of the doubt on this one.
And while 10% more resolution (not screen size) isn't exactly 10% more GPU power, it does work out fairly close to that, at least in GPU-limited games. If you have some other bottleneck then it doesn't make as much difference. And while you are right that drawing the pixels isn't the big issue, it's applying filters, AA, post-processing, etc.; all of those activities go up linearly with increased resolution. If you have to apply AA to 1000 pixels instead of 2000 pixels, it's going to take half as much time. Every one of those activities runs on every pixel on the screen, so the more pixels, the more of each of those tasks you have to do. Then there is also the fact that the higher the resolution, the more video memory you need, and while 1GB is good for 1920x1080, it's probably going to be lacking for 2880x1800. I couldn't say that for sure; it's more speculation from the fact that most high-end video cards on PC are going over 1GB of VRAM at this point, and I could probably find out if I wanted to look more into the multi-display options that are being used.
I also don't see how you can claim the card runs games just fine at 2880x1800 when the benchmarks show that the GTX 660M struggles on a lot of games at 1920x1080.


Erloas: Before the rMBP was released, the NVIDIA site listed the GT 650M as having the following clock speeds:
GDDR3 version: Up to 850MHz
GDDR5 version: Up to 735MHz

And that was an image of the stock speeds to confirm what they were for review.

OK, the image alone didn't tell me that. I assumed that was the case; I was just stating that I couldn't know for sure what those specs actually meant. As for the base clock speeds, that seems backwards, but maybe that is how it is. But of course it's still their top mid-range card rather than one of their high-end cards. No matter what clock rate they have that card at, it's not going to beat what other companies are offering in the same price range.

Don Julio Anejo
2012-09-27, 11:37 PM
I also want to point out that an MBP is portable first, mid-range graphics second. Most gaming laptops are the reverse: they're not meant to be carried around. They're meant to be moved around the house a bit, maybe taken to college or a LAN party, but they're not meant to live in your backpack so you can whip them out anytime you're at Starbucks. Hence, much less focus on portability.

The MBP got its niche as a portable, high-spec laptop, which was in demand among photo/video people and software developers. Hence, they're not going to sacrifice size/weight for other options, as that's counter-productive to the marketing niche it occupies; they sacrifice other things instead, such as price and upgradeability.

Retina screen... I'm with the "I don't see any point to it beyond a marketing gimmick" crowd. Sure, it looks a little smoother for some things, but there are too many drawbacks, plus the cost/performance hit makes it not worth it IMO.

Flickerdart
2012-09-28, 09:53 AM
A Retina Display is phenomenally useful for web design or other art-related disciplines.
How so? A tiny fraction of a percentage of the population will have a display like yours, so anything you see on your Retina display is almost guaranteed not to be what anyone else will see (unless you're designing WebsiteForPeopleWithRetinaDisplays.com). I can see non-interactive, where the pixels won't be the final mode of display, but with web, you can't really work on an atypical machine and expect good, consistent results without a ton of cross-platform testing you could have easily avoided.

noparlpf
2012-09-28, 12:33 PM
A gaming laptop isn't really a laptop, anyway, it's just a more expensive desktop with more cooling problems and a smaller screen. Playing for any length of time without having to plug the thing in would be impossible.

Yeah, in general for gaming you'll want a desktop. And if you're serious about it you should probably just build one, because it's cheaper and more customisable that way.

I don't have any actual suggestions besides NOT Asus. They're buttheads. By which childish vulgarity I mean their customer support system is best summarised by this picture:
http://www.xtremesystems.org/forums/attachment.php?attachmentid=16307&stc=1&d=1061275625
I am not a computer person, but I am clever and good with my hands, and it still took me over five hours to open up an Asus (the thing is designed to be impossible to open rather than for actual performance) to replace the damaged video cable, after the oh-so-helpful "support" people replaced the perfectly functional LCD, when five seconds on Google would have told them the damage was most likely to the video cable, and when I had told them in the description of the issue that the damage was to the video cable. Then they claimed it was out of warranty when I tried to send it back, which is why I ended up buying the fifty-dollar cable myself and installing it. The new one also wore out about three months later, which tells me the cable itself and/or the screen's hinge is just badly designed.
All in all, next time I'm just going to do the research, build a desktop myself for performance (because with the attached monitor this thing is basically a desktop now), and get a cheap netbook of some sort for portability.

Neftren
2012-09-28, 01:50 PM
That could be... but then what is the point? If you're lucky you lose very little in upscaling or downscaling, but it is never as good as running at the native resolution of a panel. You aren't going to get a better image from a 2880x1800 monitor upscaling a 1440x900 source than from a straight 1440x900 source on a 1440x900 monitor.

My point is, the panel has the same dimensions as all previous MBP panels, i.e. the 15" 1440x900 layout, only with four times the pixels (double the density in each dimension). You could run the panel at 1440x900 and get exactly the same image quality as an older MBP (setting aside colour quality for the moment). It's not about losing a little or a lot by upscaling/downscaling. You're supersampling and then displaying at your effective native resolution. Basically, the rMBP is running at 1440x900, only supersampling all available graphics at 2880x1800 and rendering at an effective 1440x900. I would be more than happy to go take some screenshots of, say, Guild Wars 2 with supersampling on and off, and prove to you the definite increase in image quality.


If you aren't running at 2880x1800 then, again, what is the point of having a monitor at that resolution? The monitor is then just a marketing gimmick and doesn't actually give you anything. You aren't even gaining any working space on the screen. I seriously doubt that is what they are doing... as much as I hate most of what Apple does, I'll give them the benefit of the doubt on this one.

I'm going to invoke a logical fallacy here, but: "If your car has an engine capable of running at 120mph, why aren't you running at 120mph?" Forget the legal restrictions for the moment. What if my car is more fuel efficient at 40mph? What if my car isn't as loud at 40mph? I could come up with an infinite number of hypotheticals as to why I shouldn't run my car at 120mph, with the corollary being that I can come up with a number of reasons why I want a 2880x1800 display, but not actually run it at the full 2880x1800 resolution (chief reason being the standard DPI would render interface elements illegible).

The maximum resolution the rMBP can be set to is 1920x1200, if I remember correctly. That's a massive increase in available working space compared to previous generations of MBPs. Or are you saying that because it's a 15" panel, all 15" panels therefore have exactly the same amount of working space?

As much as I'd love to bash Apple (and I do want to bash Apple for some asinine design decisions they've made in the past), adding a Retina display makes a lot of sense.


And while 10% more resolution (not screen size) isn't exactly 10% more GPU power, it does work out fairly close to that, at least in GPU-limited games. If you have some other bottleneck then it doesn't make as much difference.

No. I can render at 1680x1050 and get 100 FPS, but downscaling to 640x480 doesn't instantly give me three times that framerate (or 2.62-ish, if we want to be more precise, not accounting for the variation in aspect ratio).
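(For reference, a quick ratio check in Python; the 2.62 is the linear ratio, and the pixel-count ratio is much bigger:)

linear = 1680.0 / 640.0                       # 2.625x, the "2.62-ish"
pixels = (1680.0 * 1050.0) / (640.0 * 480.0)  # ~5.74x as many pixels
print("%.2fx linear, %.2fx pixels" % (linear, pixels))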


And while you are right that drawing the pixels isn't the big issue, it's applying filters, AA, post-processing, etc.; all of those activities go up linearly with increased resolution. If you have to apply AA to 1000 pixels instead of 2000 pixels, it's going to take half as much time.

Anti-aliasing is a relatively cheap procedure, and that's not how AA works. You don't apply it on a per-pixel basis. The short version of AA is a rapid interpolation of curves to apply offset-colors, providing an illusion of a real curve. With a higher pixel density, Anti-Aliasing mostly becomes a non-issue as you start to have enough pixels to render curves to greater and greater precision. That's part of what the whole "Retina" display is about, in that to the average person with 20/20 vision, they will be unable to distinguish a single pixel from a standard viewing distance.


Every one of those activities runs on every pixel on the screen, so the more pixels, the more of each of those tasks you have to do. Then there is also the fact that the higher the resolution, the more video memory you need, and while 1GB is good for 1920x1080, it's probably going to be lacking for 2880x1800. I couldn't say that for sure; it's more speculation from the fact that most high-end video cards on PC are going over 1GB of VRAM at this point, and I could probably find out if I wanted to look more into the multi-display options that are being used.

As I previously stated, you don't exactly "anti-alias every pixel" but rather a series of pixels. Also, AA is overall a relatively cheap process. As for video RAM, now we're charging into the Megahertz wars discussion. More RAM != Faster Card, just as More GHz != Faster CPU. You have to take into account whether the card is running GDDR5, make and manufacturing quality, clock rate, onboard cache, PCIe interconnect speed (typically x8 on laptops).

I've got friends running 2560x1600 on 1GB of VRAM. Or at that point, if you have multiple monitors, take advantage of AMD's Eyefinity scaling (or whatever NVIDIA's is called) to get the same high-res effect. Most high-end video cards have exactly 1GB of VRAM. Top-of-the-line GPUs might have 2GB, but those are typically overkill (and why would I spend $800+ on one card when I could buy two cheap cards and overclock them?). Besides, we're comparing apples and oranges if we invoke desktop PC cards, because a desktop PC card is meant to play games.

Let's re-evaluate the purpose of the MBP: 1) professional production work, photography, film editing, all of these FIRST; 2) gaming (secondary). Let's look at the G7x gaming laptops: 1) for gaming, yes. The two laptops fill two radically different purposes.


I also don't see how you can claim the card runs games just fine at 2880x1800 when the benchmarks show that the GTX 660M struggles on a lot of games at 1920x1080.

Diablo III and Civilization V both run fine at 2880x1800. I've seen it personally. You might not be running them at the absolute maximum settings, but again, the MBP isn't a gaming laptop. I think you really need to step back and reconsider what the Retina MacBook Pro is primarily used for (hint: not gaming).


OK, the image alone didn't tell me that. I assumed that was the case; I was just stating that I couldn't know for sure what those specs actually meant. As for the base clock speeds, that seems backwards, but maybe that is how it is. But of course it's still their top mid-range card rather than one of their high-end cards. No matter what clock rate they have that card at, it's not going to beat what other companies are offering in the same price range.

It's not backwards. GDDR3 is based off DDR2 SDRAM technology, if I remember correctly, whereas GDDR5 is based off DDR3 SDRAM, which is twice as fast as DDR2. So factoring in the effective memory clocks...

GDDR3@850MHz = 1700 MHz Effective Memory Clock
GDDR5@735MHz = 2940 MHz Effective Memory Clock
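A one-line sketch of where those effective figures come from (GDDR3 transfers twice per base clock, GDDR5 four times):

gddr3_base, gddr5_base = 850, 735   # MHz, the figures above
print(gddr3_base * 2)   # 1700 MHz effective (double data rate)
print(gddr5_base * 4)   # 2940 MHz effective (quad-pumped)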

You should be able to verify this under Windows using your GPU-profiler of choice (I'm a personal fan of GPU-Z and CPU-Z), which should show both the base and effective clock rates. I think OSX's System Profiler will report base memory clock as well, but I'm currently in Linux, so I can't check.

There's only one other company making GPUs out there, and it's AMD. Team Red is doing quite poorly right now compared to Team Blue and Team Green.


How so? A tiny fraction of a percentage of the population will have a display like yours, so anything you see on your Retina display is almost guaranteed not to be what anyone else will see (unless you're designing WebsiteForPeopleWithRetinaDisplays.com). I can see non-interactive, where the pixels won't be the final mode of display, but with web, you can't really work on an atypical machine and expect good, consistent results without a ton of cross-platform testing you could have easily avoided.

Flickerdart, regarding the primary area of Web Design, it's important to have a display with a pixel density in the Retina range primarily for targeting mobile devices, for instance, the iPad. It's extremely important to be able to see what it looks like on your screen and get a rough feeling for how it will look on a mobile device, without actually having to do a whole lot of copying back and forth to say, an actual iPad.

Also, there is a gradual trend towards high definition displays. Most "Gamers" typically already have 1680x1050 or 1920x1080 displays. A helpful place for statistics is the Steam Hardware Survey. Also, many tech-oriented professions have workstations with 2560x1600 or multi-monitor setups. If I'm running a website, I'd like to be ahead of the game. Essentially, my site should already be serving high-definition media before 51% of internet users have HD capable displays (I think of it in my mind as "all my soldiers should have ammunition in hand" before the battle starts, as opposed to issuing ammunition when the battle starts).

Regarding the web, it's about building standards-compliancy into your website, so that when users reach that stage, you won't have any issues. It's why font-rendering is currently such a huge pain, because all primary operating systems have different methods of font rendering (OSX > Linux >= Windows in that regard).

Also, another big challenge that Retina displays helps with is accurate rendering. In the past, people pre-rendered banners and such in pixels. Browsers now have support for SVG and other vector-based graphics, so it's important to have a display that can accurately render SVG. It's not so much the high resolution anymore so much as it is the high pixel density associated with Retina displays.

As I previously noted, the Retina Display actually runs at an effective 1440x900 resolution (so if you put a Retina MBP next to a 2010 MBP, they'd have exactly the same size interface elements, only the Retina display would have much crisper rendering). Oh, and did I mention color quality? I can only think of two panels that beat out the Apple IPS displays. 1) Sony's Z11 panel (also a $2000+ laptop) and 2) the Alienware M17x with RGBLED panel (also $1700+ laptop).


That's just for web design though. A Retina Display is quite helpful for editing high resolution footage out in the field, and so on. I could sit here all day, but I'll stop ranting for now.

Erloas
2012-09-28, 02:55 PM
I'm going to invoke a logical fallacy here, but: "If your car has an engine capable of running at 120mph, why aren't you running at 120mph?"

That analogy isn't even close to the same thing. For one, once you have the GPU running, running it at full speed or half speed doesn't change the power usage much. It's actually so far off I'm not even sure how to debate it.


The maximum resolution the rMBP can be set to is 1920x1200, if I remember correctly. That's a massive increase in available working space compared to previous generations of MBPs. Or are you saying that because it's a 15" panel, all 15" panels therefore have exactly the same amount of working space?

Now the thing runs at 1920x1200? Is there no consistency? I could see 1920x1200 being the max resolution a number of games or certain other programs support, simply because there was no higher option when they were created. And work space has little to do with screen size: a 15" and a 45" screen both running at 1920x1080 are going to have the same work area. A 20" screen running at 1920x1080 is going to have more work area than a 25" screen running at 1440x900.



No. I can render at 1680x1050 and get 100 FPS, but downscaling to 640x480 doesn't instantly give me three times that framerate (or 2.62-ish, if we want to be more precise, not accounting for the variation in aspect ratio).

That is because, as I said, that is only the case when a program is GPU-limited. If you are running at 1680x1050 you probably aren't GPU-limited in most games on most mid-range video cards, and at 640x480 there is no way you are GPU-limited; your framerate cap is coming from something else. Take a look at the well-known GPU-limited games and you can see very clearly how resolution affects framerates.



Anti-aliasing is a relatively cheap procedure, and that's not how AA works. You don't apply it on a per-pixel basis. The short version of AA is a rapid interpolation of curves to apply offset-colors, providing an illusion of a real curve. With a higher pixel density, Anti-Aliasing mostly becomes a non-issue as you start to have enough pixels to render curves to greater and greater precision.

There are a lot of different forms of AA, and they are done in different ways, but they all work by looking at the pixels that make up an object and doing something to them, so the more pixels that make up that object, the more work there is to do. If your curve is 600 pixels long instead of 300, it's going to take that much more work.
But that is just AA; many of the post-processing and other shader operations work on every pixel on the screen.


As for video RAM, now we're charging into the Megahertz wars discussion. More RAM != Faster Card, just as More GHz != Faster CPU.

I talked a lot about the fallacy and marketing behind VRAM, but it does serve its purpose. Again, it's only an issue when the limit is reached; it is a fairly quick drop between being under the limit and over it. But a screen of 5M pixels takes up twice as much space in VRAM as a screen of 2.5M pixels, and in most cases there are at least a couple of copies of the screen in VRAM at any given time. Every element that makes up the screen is saved in VRAM as well: if your gun is 200x100 pixels, it's going to take up 4x as much space as a 100x50 pixel gun. Multiply that out by every single element on the screen and you can see where VRAM starts to run out as you increase screen resolution.
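To put rough numbers on just the framebuffer part (Python; this deliberately ignores textures and render targets, which are the real VRAM hogs):

def framebuffer_mb(w, h, bytes_per_pixel=4, buffers=2):
    # 32-bit RGBA, double-buffered; textures and render targets excluded
    return w * h * bytes_per_pixel * buffers / (1024.0 * 1024.0)

print("%.1f MB" % framebuffer_mb(1920, 1080))   # ~15.8 MB
print("%.1f MB" % framebuffer_mb(2880, 1800))   # ~39.6 MB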



Diablo III and Civilization V both run fine at 2880x1800. I've seen it personally. You might not be running them at the absolute maximum settings, but again, the MBP isn't a gaming laptop.

OK... I'm just going to be a bit contrary here...
But didn't you just say it doesn't run anything over 1920x1200, or 1440x900?



It's not backwards. GDDR3 is based off DDR2 SDRAM technology, if I remember correctly, whereas GDDR5 is based off DDR3 SDRAM, which is twice as fast as DDR2. So factoring in the effective memory clocks...

GDDR3@850MHz = 1700 MHz Effective Memory Clock
GDDR5@735MHz = 2940 MHz Effective Memory Clock
If that was how he meant it, then it would make sense. The way I read it was that the GDDR3 version had the GPU core clocked at 850MHz while the GDDR5 version had it clocked at 735MHz; that is, the core clock, not the memory clock.


And it was lost somewhere in the cutting and editing of your post, and I'm out of time to fix it... but supersampling and upscaling/downscaling are not the same thing. Supersampling is also something that can be, and is, easily done with all other types of displays; it is not something you gain by the Retina display having a higher resolution than the signal you are feeding it. The screen has nothing at all to do with supersampling.

Neftren
2012-09-28, 04:22 PM
That analogy isn't even close to the same thing. For one, once you have the GPU running, running it at full speed or half speed doesn't change the power usage much. It's actually so far off I'm not even sure how to debate it.

I'm not even sure how to debate this either. Let's try the real world test.

Let's take my laptop at full battery charge for comparison. I get, on a good day, approximately 9 hours of continual usage, just idling or surfing the web, and so on. Basically, the extent to which my GPU is working is near zero, other than to render my GUI. You and I can both agree that my GPU is "running", yes? Utilization is effectively 0%.

I can guarantee you that if I ramp my GPU up to 100%, my battery life will go from 9 hours to a very, very short one or two. The easiest way to prove this is to run something extremely GPU-intensive that doesn't stress my CPU at all. It took me a while to think one up, but the simplest option is to load up my OpenCL matrix multiplication kernel and loop it for a few million iterations. The host program is effectively "load in a sample set of data, wait until clFinish returns".

In both instances, my GPU is running, so by your argument, running at full speed or half speed shouldn't change my battery life, correct?

Let's consider the mathematical approach.

Power (in Watts) = V^2 / R, or in computer-friendly terms...

P(W) = Capacitive Load * V^2 * Clock Rate (in Hz)

By definition, running a GPU at full speed versus half speed WILL change the power requirements by a very large amount. Want some easy data verifying that running computer components at a faster speed consumes more power? I think SPEC maintains an archive of performance-per-watt charts somewhere.
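A minimal sketch of that formula in Python (the capacitance and voltage figures are invented for illustration, not real chip specs):

def dynamic_power(cap_load, voltage, clock_hz):
    # Dynamic switching power only: P = C * V^2 * f
    return cap_load * voltage ** 2 * clock_hz

full = dynamic_power(1e-9, 1.00, 900e6)
half = dynamic_power(1e-9, 0.85, 450e6)   # a lower clock usually permits a lower voltage too
print("half speed draws %.0f%% of full" % (100 * half / full))   # ~36%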

It's for this exact reason that data centers aren't run at 100% load, because it isn't power or cost-efficient (they're mostly the same thing once you reach that scale) to do so. I would be surprised if say, a datacenter went above 50% computation load at any given time.


Now the thing runs at 1920x1200? Is there no consistency? I could see 1920x1200 being the max resolution a number of games or certain other programs support, simply because there was no higher option when they were created. And work space has little to do with screen size: a 15" and a 45" screen both running at 1920x1080 are going to have the same work area. A 20" screen running at 1920x1080 is going to have more work area than a 25" screen running at 1440x900.

If you've played around with one at all, you'll note that in OSX Mountain Lion, the resolution selection has changed from a list of resolutions to a radio button picker optimized towards various things. You can still get a list of course. Here (http://imgur.com/29gCc,BD034) is what it looks like on my 1680x1050 MBP. I think the Retina MBPs have an extra option, but I'm not positive on that.

I'm not going to take a side on the whole work area dispute. I think there are people who swing on both sides on the line. Some people prefer bigger screens. I'm a personal fan of smaller screens, higher resolution, but that's just me. I think both are valid approaches to "bigger work area" ...


That is because, as I said, that is only the case when a program is GPU-limited. If you are running at 1680x1050 you probably aren't GPU-limited in most games on most mid-range video cards, and at 640x480 there is no way you are GPU-limited; your framerate cap is coming from something else. Take a look at the well-known GPU-limited games and you can see very clearly how resolution affects framerates.

I can assure you that I am at present GPU-limited, by virtue of my CPU being a near-top-of-the-line quad core, and the only way I can increase framerate in games is to overclock my GPU (from 600/…@…v to 750/850@1v, but this is mostly irrelevant). I'm rather disappointed at the presumption that my framerate limit is based on something else, and that I haven't already tried benchmarking in GPU-limited games.

As a side note, if anyone here also has an AMD 6750M and is willing to do some testing, I'm trying to figure out whether it's the Catalyst 12.8 drivers that are giving me lousy OC headroom, as others have reported 850/…@…v. So either I got an underperforming chip, or the drivers are unstable.


There are a lot of different forms of AA, and they are done in different ways, but they all work by looking at the pixels that make up an object and doing something to them, so the more pixels that make up that object, the more work there is to do. If your curve is 600 pixels long instead of 300, it's going to take that much more work.
But that is just AA; many of the post-processing and other shader operations work on every pixel on the screen.

Let's be more specific about AA, then. You're referring to AA as a general graphics optimisation, when it really isn't. So, here we go...

SSAA - Supersampling, then downsampling with a filter. Jagged edges become smooth automagically when you shrink the image. SSAA is great because you can use it practically anywhere, but it's extremely expensive and rarely used to its full potential (unless you have a $700+ GPU).

MSAA - What basically everyone uses. Advantages? The pixel shader is run just once! Also, depending on which textbook you read, multisampling can refer to the special case in which not all parts of the rendered frame are supersampled. I'm not a big fan of that latter description, but again, you don't necessarily have to anti-alias everything on the screen. Now, my area of expertise isn't anti-aliasing or multisampling, so I'll leave it at that, and maybe someone can correct me if I made any technical mistakes. I'm more of a scientific computing (OpenCL) person.

FXAA - The new kid on the block. It works a bit differently, in that you're not anti-aliasing edges and polygons; you're anti-aliasing the entire frame! Which is what you're thinking of. In this case, if you have 2000 pixels, you're anti-aliasing all 2000 pixels (as opposed to rendered textures in the previous case). Now, this can induce some blurring in the output image, as FXAA runs entirely after the frame has been rendered, but on HiDPI displays you won't really notice it much. It's also ridiculously cheap compared to other forms of AA.


I talked a lot about the fallacy and marketing behind VRAM, but it does serve its purpose. Again, it's only an issue when the limit is reached; it is a fairly quick drop between being under the limit and over it. But a screen of 5M pixels takes up twice as much space in VRAM as a screen of 2.5M pixels, and in most cases there are at least a couple of copies of the screen in VRAM at any given time. Every element that makes up the screen is saved in VRAM as well: if your gun is 200x100 pixels, it's going to take up 4x as much space as a 100x50 pixel gun. Multiply that out by every single element on the screen and you can see where VRAM starts to run out as you increase screen resolution.

Can we please not invoke the megapixel wars now too? Memory is cheap nowadays. The problem isn't with (V)RAM, let alone DRAM (or GDDR# if you want to mince words). Memory is helpful, yes, but what I would kill for is some brilliant savant to write me an efficient way to raytrace quickly.

Yes, you are correct in that a bigger gun (pixel-wise) will take more resources to render. In the case of the MBP (where the primary application is 2D manipulation), this is mostly irrelevant. If you want to go into games, then fine, but you should take into account that the problem isn't that the gun now takes four times the resources to render; it's that it probably takes twenty times the resources to process the shadows, particle effects, shaders, and lighting that all take place around the gun. In computer science speak, think Big-O notation (not that I wouldn't die to reduce a few constants here and there!).



OK... I'm just going to be a bit contrary here...
But didn't you just say it doesn't run anything over 1920x1200, or 1440x900?

Ah, I glossed over that. Okay, so the display's native resolution is 2880x1800, but runs at an effective 1440x900 (again, as far as I am aware -- I don't actually own a Retina MBP), or can be overridden in OSX to run at 1920x1200. I don't know why Apple chose 1920x1200 as the upper limit, as numerous Hackintoshes I've put together are fully capable of driving 2560x1600 displays (granted XQuartz doesn't work entirely well, but that's a Hackintosh problem). My best guess is that Apple hasn't rendered any interface elements in Retina-level DPI for anything over 1920x1200. The panel is still fully capable of running at 2880x1800, but the graphics subsystem in OSX isn't written to actually output at that resolution. After all, why render at a resolution at which point the system DPI causes all interface and text elements to be illegible?

Now, games on the other hand can run at 2880x1800 because elements aren't necessarily pre-rendered, but instead rendered on the fly (as games usually are). Blizzard and Firaxis have both added additional graphics settings that enable 2880x1800 rendering in their respective game settings. The games themselves play reasonably well at that resolution. I will admit that Civ5 is mostly a CPU-bound game at later stages. I couldn't comment on Diablo 3, as I don't own it.


If that was how he meant it, then it would make sense. The way I read it was that the GDDR3 version had the GPU core clocked at 850MHz while the GDDR5 version had it clocked at 735MHz; that is, the core clock, not the memory clock.

Hmm, I can see the confusion. Rawhide, are those figures the Shader Clock or the Core Clock? Can you clarify please? Actually I should go look those numbers up myself after I post this.

In theory, the core clock could be the same across both cards. As long as your memory is running fast enough to keep pace with the onboard graphics processor cores, there shouldn't be any problem. Most new cards have done away with the "Maintain X ratio of Core to Mem when OC'ing" business.


And it was lost somewhere in the cutting and editing of your post, and I'm out of time to fix it... but supersampling and upscaling/downscaling are not the same thing. Supersampling is also something that can be, and is, easily done with all other types of displays; it is not something you gain by the Retina display having a higher resolution than the signal you are feeding it. The screen has nothing at all to do with supersampling.

Okay, I should have been more precise. My definition of supersampling: rendering elements at a higher resolution, then downsampling them to the desired resolution. Upscaling: rendering below native resolution and then projecting onto a larger resolution (up to native), i.e. the "projector" effect. Downscaling: rendering at a higher resolution than native and then scaling down, using the extra pixel data to improve image quality.

Supersampling isn't done on a per-display basis (as far as I am aware). Supersampling is done on the GPU, by rendering interface elements at, say, 1920x1080, even though my native (max) resolution is 1680x1050. The Retina display means you can render at ultra-high resolutions, and then use that signal to improve effective image quality at 1440x900.
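As a toy version of that definition (a sketch only -- a real GPU does this in hardware, with better filters than a plain box average):

import numpy as np

def downsample_2x(frame):
    # Average each 2x2 block of pixels into one output pixel --
    # the crudest form of supersampled anti-aliasing.
    h, w = frame.shape[0] // 2, frame.shape[1] // 2
    return frame.reshape(h, 2, w, 2).mean(axis=(1, 3))

# Render at 2880x1800 (random noise standing in for a real frame),
# then filter down to the 1440x900 target.
rendered = np.random.rand(1800, 2880)
print(downsample_2x(rendered).shape)  # (900, 1440)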

Flickerdart
2012-09-28, 06:02 PM
Flickerdart, regarding the primary area of Web Design, it's important to have a display with a pixel density in the Retina range primarily for targeting mobile devices, for instance, the iPad. It's extremely important to be able to see what it looks like on your screen and get a rough feel for how it will look on a mobile device, without having to do a whole lot of copying back and forth to, say, an actual iPad.
So it's important to have an expensive low market penetration device in order to design for another expensive low market penetration device? At the cost of ruining your perception for every other device? That's kind of...not logic.



Also, there is a gradual trend towards high-definition displays. Most "gamers" typically already have 1680x1050 or 1920x1080 displays. A helpful place for statistics is the Steam Hardware Survey. Also, many tech-oriented professions have workstations with 2560x1600 or multi-monitor setups. If I'm running a website, I'd like to be ahead of the game. Essentially, my site should already be serving high-definition media before 51% of internet users have HD-capable displays (I think of it as "all my soldiers should have ammunition in hand" before the battle starts, as opposed to issuing ammunition when the battle starts).
You will notice that none of the setups you mention are, or resemble, the Retina display. As of January 2012, 15% of devices accessing the web had 1024px (13%) or lower (2%) horizontal resolution. Of the remaining 85%, over three quarters were either 1280, 1440 or 1366 horizontals. HD resolutions make up a tiny sliver of web-surfing devices.

Your argument seems to boil down to "this screen that looks nothing like other people's screens is great for making content for other screens that don't exist yet". Personally, I would rather make content that doesn't alienate 80% of all web users (even if that means not perfectly previewing the experience of the other 20%), and take the extra pain of loading the page on an iPad once in a while to make sure that it's still shiny at high res... but whatever floats your boat.

Rawhide
2012-09-28, 06:30 PM
Hmm, I can see the confusion. Rawhide, are those figures the Shader Clock or the Core Clock? Can you clarify please? Actually I should go look those numbers up myself after I post this.

Core clock. But the DDR3 memory version is limited to a 128-bit interface.
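To show why that matters, some back-of-the-envelope bandwidth math (the clocks and the 256-bit bus below are illustrative assumptions, not the specs of any particular card):

# Theoretical memory bandwidth = effective transfer rate * bus width.
# DDR3 performs 2 transfers per clock; GDDR5 performs 4.

def bandwidth_gbs(mem_clock_mhz, transfers_per_clock, bus_bits):
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

print(bandwidth_gbs(900, 2, 128))  # DDR3, 128-bit:  28.8 GB/s
print(bandwidth_gbs(900, 4, 256))  # GDDR5, 256-bit: 115.2 GB/s

So even with an identical core clock, the DDR3 version can starve its shaders for data.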

---

The rMBP isn't outputting 1440x900 on a 2880x1800 screen when you use the Finder (their equivalent of Explorer). It is actually outputting it at 2880x1800, it's just made everything (text, icons, etc.) larger so you can see it. It is also possible to override this and display with no "upscaling". At the default setting, you get the equivalent screen-space of a 1440x900 display, but with much higher pixel density.

This is different to running a game at 1440x900.

Erloas
2012-09-28, 08:36 PM
Let's take my laptop at full battery charge for comparison. I get, on a good day, approximately 9 hours of continual usage, just idling or surfing the web, and so on. Basically, the extent to which my GPU is working is near zero, other than to render my GUI. You and I can both agree that my GPU is "running", yes? Utilization is effectively 0%.

Actually, as Rawhide pointed out much earlier, most mid- to high-end laptops now have an integrated graphics card (built directly into the newest Intel and AMD CPUs, and built into the Northbridge in previous generations) which runs everything up until the more powerful GPU is needed. The discrete GPU is almost entirely powered down, and the system can switch between the two, on the internal screen or an external display, with no change at all noticeable to the user.


In both instances, my GPU is running, so by your argument, running at full speed or half speed shouldn't change my battery life, correct?
Let's consider the mathematical approach.

Power (in Watts) = V^2 / R, or in computer-friendly terms...

P(W) = Capacitive Load * V^2 * Clock Rate (in Hz)
I'll start this with the fact that I have a degree in electronics engineering technology and have built a number of digital circuits, though nothing nearly as complex as anything in a computer.

To start with, you are partially correct: when a GPU isn't fully utilized, it has a power-saving mode (true of desktop GPUs as well, even when they are the only GPU present) which shuts down parts of the chip, including most if not all of the shader units, and cuts the whole processing pipeline down. It also automatically cuts back the clock rate and reduces the voltage to the device.

Where you are wrong, though, is in the belief that the frequency the device runs at scales with the load on the system. My GPU, for instance (a desktop GTX 275, for reference), alternates between 400MHz standby and 650MHz under load; memory goes from 100MHz to 1200MHz, and the shader clock from 600MHz to 1450MHz. I just checked to make sure what I knew was right, and even at only 50% GPU utilization (monitoring via GPU-Z), all of the clock speeds jumped up to that max rate. It's pretty hard to scale clock speed. It used to not be possible to do on the fly at all; it had to be done in the BIOS and then reset. Now it can be changed on the fly, but it isn't highly granular: you can adjust the base frequency, but the multipliers used to get it up into the hundreds-of-MHz range aren't really adjustable. You have some control, but not really fine control.
So whether your GPU is running at 30% or 80% or 100%, once it switches out of that power-saving mode it runs at the same frequency all the time. If your GPU doesn't have anything to do for a couple of clock cycles, it just runs a "wait" or "no action" command. This is true of the CPU as well, though CPUs can generally handle a bit more load in their lower power state before switching higher.
The GPU just doesn't have the capability to shut down 100 of its 280 shaders because they aren't all being fully used. And for the most part it doesn't take that much more power to run the wait instruction compared to an actual command. You save some power when not running at full utilization, but it is nowhere near a linear curve from 0 to 100% matching load.
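To put numbers on the quoted P = C * V^2 * f relationship (the voltages here are pure assumptions for illustration; I don't know the GTX 275's actual core voltages):

# Dynamic power: P = C * V^2 * f. The capacitive load C is fixed
# by the silicon; the driver only switches V and f between states.
C = 1.0  # arbitrary units -- only the ratio between states matters

standby = C * 0.90**2 * 400e6  # 400MHz low-power state (assumed 0.90V)
load    = C * 1.10**2 * 650e6  # 650MHz 3D state (assumed 1.10V)

print(load / standby)  # ~2.4x the dynamic power between just two P-states

The point stands either way: the card hops between a handful of discrete states rather than smoothly tracking utilization.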


I'm not going to take a side on the whole work area dispute. I think there are people who fall on both sides of the line. Some people prefer bigger screens. I'm a personal fan of smaller screens with higher resolution, but that's just me. I think both are valid approaches to "bigger work area" ...

Resolution is work area; there are no sides to it. A 640x480 image/window/etc. is going to take up 1/4 of a 1280x960 screen whether that screen is 15" or 30". I'm personally a fan of big monitors (big for a monitor, medium to small if you were to compare them to a TV) with high resolution. I'm currently running a 26" screen at 1920x1200. If I had the room for a bigger desk, I would go for a 3-monitor setup. Either that, or wait until more people start releasing higher resolution monitors; I'm pretty sure both Samsung and Sony have some in the works.




I can assure you that I am at present GPU-limited, by virtue of my CPU being a near-top-of-the-line quad-core, and the only way I can increase framerate in games is to overclock my GPU (from 600/[email protected] to 750/850@1v, but this is mostly irrelevant). I'm rather disappointed at the presumption that my framerate limit is based on something else, and that I haven't already tried benchmarking in GPU-limited games.

You might have to find the right games; some do have upper limits. It is also very possible that the game isn't making 100% use of your CPU; a lot of games can't completely use 4 cores yet. There are also other considerations: your RAM bandwidth could be the issue too. There are quite a few potential limitations; GPU and CPU are just the most common. The thing is, though, if you want to test GPU-limitedness and the effects of increased resolution, you need to go the other direction: try increasing the resolution while keeping all other settings the same, and see what that does to your framerate.
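A crude way to read the results of that experiment (the threshold is an arbitrary rule of thumb, not an established figure):

def looks_gpu_bound(fps_at_low_res, fps_at_high_res, threshold=0.85):
    # If framerate falls substantially when only the resolution goes
    # up, the GPU is the bottleneck; if it barely moves, the limit is
    # elsewhere (CPU, RAM bandwidth, or an engine frame cap).
    return fps_at_high_res / fps_at_low_res < threshold

print(looks_gpu_bound(60, 38))  # True -- big drop, GPU-limited
print(looks_gpu_bound(60, 58))  # False -- something else is the cap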


Can we please not invoke the Megapixel wars now too? Memory is cheap nowadays. The problem isn't with (V)RAM, let alone DRAM (or GDDR# if you want to mince words). Memory is helpful, yes, but what I would kill for is some brilliant savant to write me an efficient way to quickly raytrace.

I know RAM is cheap, but I've also followed the development trends enough to know that 1GB of video RAM is starting to be a limit at 1920x1080. It does depend a lot on the game, though; some games have tons of different objects and skins that are all loaded at the same time.
A lot of the low end of the top-line cards are running 2GB now because we're seeing the need for it over 1GB even at resolutions lower than 2880x1800. I'm just saying that I see 1GB as being a limiting factor for gaming at 2880x1800, because that's roughly 2.5x the pixels of 1920x1080, where 1GB was pretty much the required amount. And even though VRAM is cheap, there is no way at all to change it out on the MBP (or any other video card, for that matter).
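For what it's worth, the framebuffer portion of that is easy to estimate (a rough sketch; textures, z-buffers, and render targets -- the things that actually eat VRAM -- come on top of this):

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # Colour buffers only, e.g. triple buffering at 32 bits per pixel.
    return width * height * bytes_per_pixel * buffers / 2**20

print(framebuffer_mb(1920, 1080))  # ~23.7 MB
print(framebuffer_mb(2880, 1800))  # ~59.3 MB

The buffers themselves are small; it's everything that scales with them (AA samples, higher-resolution render targets, bigger textures to look sharp at that size) that pushes a 1GB card to its limit.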



Now, games on the other hand can run at 2880x1800 because elements aren't necessarily pre-rendered, but instead rendered on the fly (as games usually are). Blizzard and Firaxis have both added additional graphics settings that enable 2880x1800 rendering in their respective game settings. The games themselves play reasonably well at that resolution. I will admit that Civ5 is mostly a CPU-bound game at later stages. I couldn't comment on Diablo 3, as I don't own it.

I could see Blizzard games running well at that resolution, because Blizzard is well known for making their games very light in the demands department. Almost anything right now can run WoW at 1920x1080 reasonably well.



Okay, I should have been more precise. My definition of supersampling: rendering elements at a higher resolution, then downsampling them to the desired resolution. Upscaling: rendering at below-native resolution and then projecting it onto a larger resolution (up to native), or the "Projector" effect. Downscaling: Rendering at a higher resolution than native, and then scaling down, using the extra pixel data to improve image quality.

Supersampling isn't done on a per-display basis (as far as I am aware). Supersampling is done on the GPU, by rendering interface elements at, say, 1920x1080, even though my native (max) resolution is 1680x1050. The Retina display means you can render at ultra-high resolutions, and then use that signal to improve effective image quality at 1440x900.
You are pretty much right about supersampling, but it can be done regardless of the resolution of the screen. You don't have to have a screen that does 2880x1800 to render at that resolution and then rescale to a lower one. There really is no such thing as "downscaling", though; that is pretty much exactly what supersampling is. Though I think there is also a use of "supersampling" where a scene or section is rendered multiple times and the results are "averaged" to give you a smoother image. It is either a very similar name or the same name used for different things; I can't actually remember right now.
Upscaling is done a lot; in fact, it is done every time you run anything at all at something other than the screen's native resolution, though I think in virtually every case the monitor handles the upscaling. If you give a monitor a signal higher than it can handle, it either doesn't show anything or it shows however many pixels fit onto the screen, with the rest "off to the side".
You are right in that most of the time supersampling is done on specific areas of a scene and not the whole picture, because it isn't generally a "thing" on its own; it is a method used when applying other effects to make them better. But with that, if the area you are supersampling to add extra lighting effects to (or whatever) is 400x200 pixels instead of 200x100 pixels (for the same object/part of a scene at a resolution twice as high), it has four times the pixels and is going to take roughly four times as much processing power.

The point is that you don't gain anything by rendering at 2880x1800 and then downscaling it, compared to what you would get by just rendering and outputting directly at 2880x1800. You don't improve performance, and you don't improve image quality; the only thing that might change is a slight loss in image quality from the resolution change.