These days, 300dpi at phone viewing distance is acceptable. Above that, since "they're so small you can't see the dots," why would it make any visible difference? The eye can't resolve more than that.
Today's laptops, even the "Retina" ones, have over-the-top resolutions, and Apple itself pitches "showing off photos" as the main use case...
We'll reach a point where, as resolutions keep increasing, displays have to get bigger, because you can no longer fit that many pixels within the same-size display, so you have to increase the viewable area (either shrink the bezels, or go to a bigger screen).
Then you'll get to the point of "these phones are too big, it's an 8-inch phone," and there'll be an uproar: we want these much bigger resolutions, but to get them we must have a massive phone.
Thus we're going from the "brick" phones of the old days to "big" displays.
End result? Times haven't changed much.
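The ~300dpi figure above is just geometry. A quick sketch, assuming the commonly cited ~1 arcminute of visual acuity (that acuity number is an assumption; real eyes and viewing distances vary):

```python
import math

def max_resolvable_dpi(viewing_distance_in, acuity_arcmin=1.0):
    """Smallest dot pitch the eye can separate at a given distance,
    expressed as dots per inch, assuming ~1 arcminute acuity."""
    pitch_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pitch_in

# Typical phone viewing distance of roughly 12 inches:
print(round(max_resolvable_dpi(12)))  # roughly 286 dpi
```

Which is why ~300dpi is usually quoted as the point of diminishing returns for a phone held at arm's length; hold it closer, or assume sharper eyes, and the number goes up.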
Yep -- there are people (like myself) who only want a single machine that can be used both for work and for some gaming as well. My own needs are pretty pedestrian at this point (Diablo III, Minecraft, TF2, other Source games), but a dGPU will come in handy for D3 at least.
It also has 8GB of RAM, which is more than I'll ever need.
I study electronics and electrical engineering, so don't think of me as some peasant who knows nothing of lithography and semiconductors. If you knew anything about what's been happening in semiconductor fabrication, you'd understand Intel hasn't exactly been putting the pedal to the metal to push these things out. Why? Because it has zero competition and would only be competing with itself. That's why I made that comment. I realise any shrinking of transistors is difficult; in fact it's one of the most advanced pieces of technology we humans have. I also know it's governed as much by economics as it is by science.
Intel was meant to have these things cooking years ago; now it looks like a 2015 launch. Just look at what happened when AMD handed them their asses: the way they accelerated Core 2 was pretty incredible.
I'm more bothered by the ever growing list of Apple products that look increasingly semi-abandoned.
So if the current 8GB/256GB mid-range 13 inch gets replaced with a 16GB/512GB model for the same price I think I'll pull the trigger.
This will be the last update before Yosemite hits, and after beta testing it since DP 1 I've decided I'll stick with Mavericks until I'm forced to upgrade.
What's your point?
Name an Apple laptop that has:
The 4th generation Intel® Core i7 Processor
17.3" Full HD anti-glare LCD
GeForce GTX 880M with 8GB GDDR5
Exclusive Super RAID 2: three SSDs in RAID 0 with 1500MB/s read speed!
32GB of RAM
Support for 3 External displays
Pro tip: you can't.
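For what it's worth, the 1500MB/s figure in that list is just three drives striped: RAID 0 spreads sequential reads across all members, so throughput scales roughly linearly with drive count. A rough sketch, assuming ~500MB/s per SATA SSD (that per-drive number is my assumption, typical for the era; real-world results fall a bit short of the ideal):

```python
def raid0_read_throughput(drive_mb_s, n_drives):
    """Ideal sequential read throughput for a RAID 0 stripe:
    reads are spread across all members, so it scales ~linearly
    with drive count (real arrays lose a little to overhead)."""
    return drive_mb_s * n_drives

print(raid0_read_throughput(500, 3))  # 1500 (MB/s, ideal case)
```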
Nice update, but I think I'll hold out another year before replacing my mid-2010 MBP.
I'm in the same boat as you. I have a hackintosh with similar specs as your iMac and the same 2010 MBP as well as a retina iPad mini. I really want a new MacBook but this minor refresh completely turns me off. I'll have to tough it out and wait until Broadwell comes out.
Intel has competition, it's just not in the desktop space. Their primary competitive threat right now is from ARM. Hence the push for low-power processors.
And a battery life of 10 whole minutes! Enough time for you to find a wall outlet to plug it in, so your cool ELITE GAMER LEDs can light up the room.
$700 for a mediocre GPU upgrade? Bit rich.
I'd say you're overestimating the skills of the people who run Intel's manufacturing department here... AMD spun off its manufacturing division back in 2009 as a contract manufacturing company, and they've got no reason not to get the 14nm process they've been working on with Samsung to work.
SCOLANATOR said:
I understand fully the implications of a 14nm transistor size and how advanced that is. But as in my previous reply to another comment, this has more to do with economics and lack of competition than engineering challenges.
14nm second gen FinFET is an engineering marvel, as is the doped silicon it's made from, but if AMD offered a modicum of competition then Broadwell would likely be available right now, not in 2015.
He has a point. At the present moment Apple doesn't really have a top of the line mobile solution for graphics intensive stuff or video editing.
Apple's offerings are thin, small and light. Unfortunately not too powerful either.
I also don't understand why some of you are so against people wanting an MBP that would be great for gaming, too. It's not as if hanging out on MacRumors or drinking lattes in Starbucks all day is morally superior in any way.
But the continued lack of decent graphics hardware (and I'm not even saying good or up-to-date) for the premium price we pay is going to force me to switch back to PC after 10 years of using Macs.
It started with that integrated-GPU nonsense.
I know tons of video editors and photographers who use rMBPs. It's literally the standard on most productions, second only to the Mac Pro.