MacBook Pro with i9 chip is throttled due to thermal issues, claims YouTuber (macrumors.com)
411 points by electic on July 18, 2018 | 401 comments


Dave Lee's YouTube video is a withering takedown, presented dispassionately.

His argument is such a slam dunk that the article concludes with stunned disbelief, speculating that maybe there's "something wrong with the MacBook Pro with Core i9 chip that Lee received".

Yes there's something wrong with it, that was the exact point of his video. Do people really think he just got a lemon?


Quinn Nelson (a.k.a. Snazzy Labs) recently did a couple of videos where he replaced the stock thermal paste on a 2017 MacBook Pro with a liquid metal thermal paste [1] and then 3 months later with a standard thermal paste [2].

With the liquid metal paste the CPU temps dropped 10°C, ran 200MHz faster and the MacBook Pro didn't thermal throttle. The standard thermal paste also saw a 7-10% improvement in thermals compared to Apple's paste.

I wonder if replacing the paste with a better one would help with the latest i9 MacBook Pro?

[1] https://www.youtube.com/watch?v=iw4gqfrBN4c

[2] https://www.youtube.com/watch?v=JNoZNzOQpVw


When I was working as an intern at an Apple store more than 10 years back, Apple's reassembly instructions would always say to put a huge amount of thermal grease on the CPU. So much that it would squirt out from under the heat block in large blobs. I always found it odd, as I was used to applying just as much as was needed without spilling over. I never read any research on it though, so I'm curious what the right approach was. I can't imagine Apple just getting this part of the thermal puzzle wrong when everything else is fully optimized, or skimping a few cents on better grease.


If a bare-die CPU or GPU is not fully covered, the uncovered portions will rapidly become hot and the chip may die (especially if the thermal sensors are covered and cannot detect this). This is different from modern desktop processors where there is an IHS to help spread the heat around - there is a very real danger to insufficient paste application on a bare die. In contrast, since modern thermal pastes are non-conductive, there is no real downside to an enthusiastic application of paste except for making a mess, so you want to default to putting too much on rather than having some dumb intern kill a PC with an insufficient application. There is virtually no effect on thermals.

https://www.youtube.com/watch?v=rAid5G30-WM

LM is conductive, so you need to be much more careful with the application, as over-application will kill the chip. It also can migrate over time, especially if you put a bit too much on, and is particularly unsuitable for laptops that are going to be moved around a lot.

Putting LM on a laptop is a gimmick, and one with fairly considerable risk to the hardware, not something that is suitable for mass production.


> since modern thermal pastes are non-conductive, there is no real downside to an enthusiastic application of paste except for making a mess, so you want to default to putting too much on rather than having some dumb intern kill a PC with an insufficient application. There is virtually no effect on thermals.

The effect on thermals is absolutely not insignificant. Thermal pastes are actually extremely bad at transferring heat, and too much paste really does have negative effects.

Yes, it is safer to put on too much than too little. With too much, the worst case is just that the CPU throttles and performance is sluggish, and that rarely leads to an RMA.

From https://www.ekwb.com/blog/thermal-compound-guide/

For example, the thermal conductivity of a high-grade thermal paste is 8.5 W/mK, and the heat conductivity of copper is 385 W/mK, or for aluminum 205 W/mK. You can see that the thermal compound is actually a poor heat conductor and that is exactly the reason why you only need a very thin layer of paste to fill the micro imperfections between the IHS and heatsink.
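To put rough numbers on that quote: the conductive resistance of a flat layer is R = t / (k * A), so a quick sketch (the contact area, layer thicknesses and the 45 W heat load are my own illustrative assumptions, not figures from the article) shows what a paste layer costs relative to the copper it sits on:

```python
# Back-of-the-envelope layer resistances using the conductivities quoted
# above: 8.5 W/mK for high-grade paste, 385 W/mK for copper.
def thermal_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Conductive resistance of a flat layer: R = t / (k * A), in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

AREA = 30e-3 * 30e-3  # ~30 mm x 30 mm contact patch (illustrative)
paste_thin  = thermal_resistance(50e-6,  8.5,   AREA)  # thin 50 um paste layer
paste_thick = thermal_resistance(200e-6, 8.5,   AREA)  # 4x as much paste
copper_base = thermal_resistance(5e-3,   385.0, AREA)  # 5 mm copper heatsink base

for name, r in [("50 um paste", paste_thin),
                ("200 um paste", paste_thick),
                ("5 mm copper", copper_base)]:
    # at 45 W of CPU heat, the temperature drop across the layer is Q * R
    print(f"{name}: R = {r:.4f} K/W, dT at 45 W = {45 * r:.2f} K")
```

Even the quadrupled layer only adds a degree or so at 45 W with this idealized geometry, which lines up with comments elsewhere in the thread that mounting pressure squeezing the layer thin matters more than the exact amount or brand of paste.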


> For example, the thermal conductivity of a high-grade thermal paste is 8.5 W/mK, and the heat conductivity of copper is 385 W/mK, or for aluminum 205 W/mK. You can see that the thermal compound is actually a poor heat conductor and that is exactly the reason why you only need a very thin layer of paste to fill the micro imperfections between the IHS and heatsink.

This has always been my rule of thumb whether it's metallic or non-metallic paste. I throw on a pair of vinyl gloves, squirt a dab on the heatsink (not the CPU), and spread an even layer with my finger. I go back and fill in low spots with a smaller dab, trying for an even surface. This way, it's thin enough that it doesn't squirt out the sides when it's installed, but it's a uniform spread that covers any gaps between the heatsink and the CPU surface. Also, by putting it on the heatsink instead of the CPU, I don't risk accidentally getting paste on the non-contact parts of the CPU.

I've also used the semi-solid thermal pads similar to what comes from the factory on most OEM heatsinks, but they are far less efficient than paste.


For what it's worth, Arctic Silver recommend not to use your finger, on account of dead skin cells and grease.

Page 4, http://www.arcticsilver.com/pdf/appmeth/int/vl/intel_app_met...


To be fair, the Poster did mention they wore gloves.


Right, hence the vinyl gloves. It also protects my skin from any harmful chemicals or minerals that may be in the paste.


so IIUC thermal paste is only 'better' than air gaps :)

anyone ever friction soldered a heatsink to an IHS ?


This is possible, but friction welding needs temperatures to reach above the melting point for a little while - that's 1085°C for copper and 660°C for aluminium. The machinery needed to do this would be insanely expensive, and it would need to be done with the IHS detached from the CPU, as the CPU gets ruined at around 115°C, I think. It would be better for enthusiast-grade heatsinks to carve out an IHS in their blocks, but that would be made for mainstream processors only: the 8700k, some Intel HEDT parts and AMD Threadrippers.

Edit: selling custom IHS-integrated heatsinks with LM applied and nail polish around it would be a small and probably fairly inconsistent business to invest in. Someone buying a well-binned 3790k kit would see very few people upgrade until something like the 8700k, which showed gains significant enough to be worthy of an upgrade, so you would run dry for a few years. Especially given the premium you would ask for such a service, people would use these things for a long, long, long time; I know people who are still using a 2600k just because the majority of games are poorly optimized for multithreading.

It would be great if AMD collaborated with EKWB: a TR2 Liquid and a Navi 96 Liquid. Intel has too much ego to collab.


High mounting pressure will squeeze out excess thermal paste, which greatly narrows the difference between the most and least conductive compounds when both surfaces are fairly smooth and even while in contact.

https://www.overclockers.com/heatsink-mounting-pressure-vs-p...


> since modern thermal pastes are non-conductive, there is no real downside to an enthusiastic application of paste except for making a mess

I recently re-applied thermal paste (an old Arctic MX-2 I had lying around) on a late 2013 MacBook Pro which has bare-die CPU and GPU. The first time I did it, I applied too much paste and it had an observable negative effect on thermal dissipation. Once I applied the correct amount, I got slightly better results than stock paste.


That's strongly mitigated by mounting pressure which is very high in macbooks.


> there is a very real danger to insufficient paste application on a bare die

So, I've repaired and cleaned many, many laptops and computers over the years. For bare die components (CPU/chipset) I use something like "half a grain of rice per finger nail area". This is already plenty. I sometimes checked after mounting, and there was always some minor squeeze-out. (For desktop CPUs I also use just half a grain of rice. The higher mounting force results in about a ~3-4 cm dia circle; this doesn't extend all the way to the corners of the IHS, but it doesn't have to, since nothing is dissipating heat there.)

> there is no real downside to an enthusiastic application of paste except for making a mess

That's not my experience.

With laptops, the pressure applied by the cold plate is not that much and thermal pastes have high viscosity. (The cold plate is often not that stiff, too, so it tends to flex). Applying too much paste results in a too thick layer of paste, which results in higher temps, hotter laptops, fans spinning faster and thus more noise and annoyance.

Indeed, often I found the issue with manufacturer-applied paste was that it was simply too much. I think the reason they do this is that it's harder to get wrong: it makes the product more annoying, but doesn't break it. If they applied it sparingly, there would be some chance of getting it wrong, resulting in a broken product. For the same reason they use thermal pads wherever they can get away with it: they're practically impossible to misapply.


> if the thermal sensors are covered and cannot detect this

Modern chips seem to have sensors specifically detecting hotspots. E.g. with an AMD Polaris GPU, you can see "Hotspot Temperature" in HWiNFO.


Yes but they might not have as high a density of sensors at the periphery/uncore.


I think that might be the reason indeed. It's better to err on the safe side.


The best scenario is full contact between the CPU/GPU/whatever makes the heat and the heatsink. No product that is put between those elements will make heat dissipation get any better, as it won't change the composition of the heatsink or of the CPU/GPU surface.

Now, in real life, CPU, GPU and heatsink don't have the perfect, more-than-mirror finished surface. There are small gaps and small peaks that prevent a full contact. And those gaps are filled with air. Air is a bad heat conductor. So, to reduce this bad, bad air effect, you fill the gaps with thermal paste.

Thermal paste will be better than air pockets, but worse than metal-to-metal (CPU/GPU/heatsink) contact. So put as little thermal paste as needed, just enough to fill the gaps.

But put on too little paste, or don't cover everything, and you'll end up with more gaps: the paste will prevent those metal-to-metal peak contacts, and if it's not spread evenly it will leave bigger gaps.

If you can't hit the ideal, it's better to have a little more paste (but covering everything) than too little or badly spread paste (leaving bigger gaps).


It's even better to add more paste and high mounting pressure which is what Apple does.

https://www.overclockers.com/heatsink-mounting-pressure-vs-p...


"Better grease" makes a fairly limited difference over standard[0], and it requires way more care if it's a silver-based compound (as that's electrically conductive).

I'm guessing their thinking is they want to avoid air at all costs, so requiring way excessive amounts ensures it will never be lacking, which they probably think is significantly worse than excesses.

[0] the difference in thermal conductivity between an entry-level thermal grease and a high-quality one is 2x~3x (entry level is ~2W/(m.K) while top-of-the-line might reach ~8W/(m.K)). The difference between air and a low-end thermal grease is 2 orders of magnitude (~0.025 W/(m.K) versus ~2)
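Those orders of magnitude are easy to sanity-check: the temperature drop across a thin interface gap is dT = Q * t / (k * A). With a made-up 20 micron gap, ~30x30 mm contact area and 45 W of heat (all illustrative assumptions; only the conductivities come from the footnote above):

```python
# dT across the CPU/heatsink interface gap for air vs. entry-level vs.
# high-end thermal grease, using the W/(m.K) figures from the footnote.
GAP, AREA, POWER = 20e-6, 30e-3 * 30e-3, 45.0  # 20 um gap, ~30x30 mm, 45 W

def delta_t(k_w_mk):
    """Temperature drop across the gap: dT = Q * t / (k * A), in kelvin."""
    return POWER * GAP / (k_w_mk * AREA)

dt_air   = delta_t(0.025)  # air
dt_cheap = delta_t(2.0)    # entry-level grease
dt_good  = delta_t(8.0)    # top-of-the-line grease
print(f"air: {dt_air:.1f} K, entry grease: {dt_cheap:.2f} K, "
      f"high-end grease: {dt_good:.3f} K")
```

An air-filled gap costs tens of degrees; once any grease fills it, the further jump from entry-level to high-end compound is comparatively small, which is the point being made above.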


It also causes structural breakdown of aluminium, so if there's any of that anywhere nearby (or in the heatsink) it's a no-go.


Came here to say this.

Former employees of Apple I know recommend reapplying the thermal paste immediately after getting a MacBook Pro.


They recommend I open up the device myself? Thereby voiding warranty? Isn't that getting more and more difficult?


This may have changed, but for years the Apple warranty was "as long as you don't damage anything, you're allowed to open it". I did some research, and at least the manual for my machine still states this, but mine is also from 2012.


I took a 2011 model in since I just needed a replacement battery due to bulging. The Apple employee was flabbergasted that I had removed the screws and bottom plate saying I had voided the warranty... on a severely out of warranty device. He was insistent that any device that had screws in it would have the warranty voided if those screws were removed and the device opened, even after showing him the manual which instructs the user on how they can open the device to replace the hard drive in the section labeled "Boost your MacBook Pro" starting on page 35. https://manuals.info.apple.com/MANUALS/1000/MA1602/en_US/mac...


They're getting a bit sloppy with the people they hire recently, it seems. I had someone from AppleCare basically tell me I had 90 days after a repair to find an issue with it (violating Australian Consumer Law, which gives you 2 years). Mortified staff in-store gave me my refund though.


> So much that it would be squirted out from under the heatblock in large blobs.

Is this stuff electrically conductive?


No


Here's a typical warning from a manufacturer: http://www.arcticsilver.com/as5.htm

> While much safer than electrically conductive silver and copper greases, Arctic Silver 5 should be kept away from electrical traces, pins, and leads. While it is not electrically conductive, the compound is very slightly capacitive and could potentially cause problems if it bridges two close-proximity electrical paths.


The correct answer is "it depends on which paste you use".

If you're using a "liquid metal" compound, you probably want to be very careful, as it is conductive.[1]

[1] https://www.ekwb.com/blog/thermal-compound-guide/


> The correct answer is

A correct answer takes context into account. In this case, "vendors that always tell to put a huge amount of thermal grease on the CPU".


Apple can't make good keyboards anymore; are they now failing at thermal paste too?

I really don't feel these machines are professional level.


It's not just Apple; the thermal paste between the die and the heat spreader on newer Intel CPUs isn't getting good reviews either. Replacing the paste under the heat spreader is a lot trickier though, and requires special tools.

Why neither Apple nor Intel opts for a better thermal compound is baffling. Tests continue to show CPU temperatures dropping 10+ degrees (C) just by replacing the thermal compound. Even if they opted for the very best thermal compound, I can't see it eating into either company's profit.


I recently built a new gaming PC, and it was so bad that I couldn't even run stock speeds without thermal throttling, let alone any overclocks. A swift delid and some new TIM and I had shaved 25°C off the stock temps while no longer throttling. Woe betide any average user who isn't willing to void their warranty on a £500 part to get it to run as it should - enjoy that RMA process.

Half the issue is the shit TIM they use; the other half is the way they attach the IHS to the CPU. It's attached with a huge amount of sealant around the edges of the IHS, and simply removing it will gain you a good 10°C improvement because you're moving the IHS closer to the die. Combine that with good TIM and you're golden; replace it with some liquid metal TIM and you're off to the OC races.

I could maybe forgive them if this was an isolated incident on one generation of chips but this has been the case for about 4 years now.


It may not hurt their margins much to use more expensive thermal compound, but it definitely won't help them in any meaningful way. These products are optimized for real-world bursty workloads that basically never hit a serious thermal limit that better paste could help with. Even on the desktop chips, delidding and replacing the thermal compound provides no benefit until you're pretty far into overclocking territory.


But the price of this machine is crazy, and it doesn't support a use case where you actually use the i9 CPU for long periods of time... so why do you need it? To open the browser a few milliseconds faster?

It's not a professional machine if it throttles heavily under load so you can't use its resources. It boggles my mind that it's even allowed to be sold.

It's like selling a supercar which can't go at top speed for more than 20 seconds, then it throttles speed to 60 mph.


Not disagreeing with you, but as it happens the Bugatti Veyron can't do the full top speed for very long (Michelin won't guarantee the tires to run above 250 mph for more than 15 minutes.).


It will run out of gas way before then.


They'd avoid bad press like this story though


The problem is actually that the silicone adhesive they use to attach the IHS is too thick. The act of shaving the adhesive down is what really produces thermal improvements.

Delidding, shaving the adhesive, and replacing with fresh TIM produces nearly as good a result as using LM (without the risks of LM). And using LM without shaving the adhesive only produces a few degrees of improvement.


The first edition MacBook Pro had widespread issues with thermal paste back in 2006 that led to random shutdowns and led to a recall to re-apply new paste (I had one and had to get the paste re-applied).

Future generations of MBPs didn’t have the same problems.

The point is: shit happens with any product, the question is will you be supported well? Apple’s history here has generally been good but occasionally they’ve made major blunders (eg. the iBook motherboards about 17-18 years ago).

Their size leads to amplification of issues beyond reasonable response. The keyboard issues, for example, are way overblown. We have thousands of MBPs and hundreds of Touch Bar MBPs at our company and there haven't been widespread keyboard failures.


>Their size leads to amplification of issues beyond reasonable response.

This is a ridiculous sentiment IMO.


> Their size leads to amplification of issues beyond reasonable response. The keyboard issues for example are way overblown.

I have a MacBook pro next to me that is on its fourth keyboard.


Replacing the thermal paste on my old Lenovo was a night and day improvement. It wouldn't surprise me if that was the issue here.


In fairness, Lee criticized the Dell XPS 15's thermals in a video a couple of days ago. Of course, he also posted a video last week about how he was starting to hate Apple products.

If you watch a lot of his videos, you'll notice that he almost always talks about thermals, so it's not surprising to see him focus on that aspect of the MBP.


In all fairness: If you sell a >3000€ laptop, people should expect to get the full value out of it. Maybe it's hard to cool CPUs in thin laptops. But then don't let people pay for expensive upgrades that have no performance benefit.


Seems like it's hard to cool GPUs in desktops too if you're Apple

>their maximum FP32 compute performance is 11 TFLOPS (which points to around 1340 MHz clock-rate for the Vega 64) and their peak memory bandwidth is 400 GB/s (indicating about 1600 MT/s memory speed), which is slower when compared to the Radeon RX Vega cards for desktops. The main reasons why Apple downclocks its GPUs are of course power consumption and heat dissipation

https://www.anandtech.com/show/12152/apple-starts-imac-pro-s...

Sad to see that this is becoming a recurring theme from Apple's pro line.


Downclocking, by itself, tends to increase component lifetime. It makes sense for a professional machine that's not supposed to break after a couple years.

Of course, if the component still overheats, that completely defeats the purpose of the downclocking.


Exactly. In other tech this is called headroom.

It's just a pragmatic approach to a physical limitation. Whether it's the right choice given the clientele is debatable. But just the same, it's not unique to computer manufacturers.


Or, they've gotten a really good deal on binned chips.


It depends on workload. If I'm typing code for a minute and then spending a few seconds compiling and launching it, that won't trigger throttling, but reducing each compilation from 3 seconds to 2 seconds is certainly a benefit.


Exactly. It seems that fast chips in slim laptops are like fast chips in phones. They may be worth having for performance on 2-second tasks, and might keep up with a desktop there.

If what you care about is performance on 2-hour tasks, there's just no chance, that's a completely different thing.


“Creative” workloads require sustained CPU tho’ and that’s who Apple claims this is for


Or people whose projects require several minutes of compilation rather than just 2 seconds...


Which "creative" workloads? Video, 3D, and photography mostly require sustained GPU.

For audio, it's more the CPU.


It's both, it's actually hard to have a very high GPU load and a low CPU load (unless you're doing crypto mining, I guess)

For typical loads, both will be at a high level of usage


I expect the GPU generates heat as well. Which is why all the interest in external enclosures.

This entire thread is baffling to me. Everyone rushing to justify buying a new high performance laptop which they don’t mind throttling because they never needed the performance in the first place??


> Everyone rushing to justify buying a new high performance laptop which they don’t mind throttling because they never needed the performance in the first place??

Time scale is relevant. I want interactive tasks to be faster. I won't notice a 5% improvement in a job that takes long enough for a coffee break, but I will appreciate if more short tasks can cross the threshold of being indistinguishable from instantaneous.


We're talking 100% improvement here. This could be reducing a render from an hour to 20 minutes, a qualitative difference.


Does the internal GPU generate enough heat to throttle under typical conditions? Having the option for external GPUs is a completely different story.


If it's the 13 inch model with an iGPU only then yes. The CPU and GPU will share their total TDP.


Video encoding and photography processing mostly use CPU.


> But then don't let people pay for expensive upgrades that have no performance benefit.

The number of people who want performance stats, or at least think they need them, far outnumbers the number of people who actually need the performance. It makes absolute sense to sell these lemons, even though it is unethical.


> or at least think they need,

Irrelevant. If you pay premium for performance expecting to get the level of performance you've paid for then you better get what you've paid for.

Otherwise we're dealing with something very similar to consumer fraud covered with a thin paternalist veil of "corporation knows best".


> pay premium [...] get what you've paid for

That makes no sense. "Premium" strictly means "at a higher cost." You can't possibly get what you have paid for with premium products, because some of your money has been spent purchasing the premium.

Regardless of sour reviews such as this one, most people will continue to buy Apple products because Apple sales have managed to transcend the quality of their products. Apple will continue to push the boundaries of what is acceptable and people will continue to buy their products, defend them and praise them - no different from the past decade.

The fault lies squarely with consumers. Apple is just doing what they are being asked to do: flatter things, bigger numbers, higher prices.


> it is unethical

The only part that matters. Maybe it's even illegal.


My ThinkPad T470P can run at 100% on a hot day, the CPU gets hot but it doesn't throttle down.

Stockfish (chess engine) will batter a CPU like that.


Actually, it throttles as well, see notebookcheck review:

https://www.notebookcheck.net/Lenovo-ThinkPad-T470p-Core-i7-...


>In all fairness: If you sell a >3000€ laptop, people should expect to get the full value out of it.

The full value out of it includes:

1) It not being short-lived due to rampant overheating (even if it's within CPU tolerances)

2) It being as quiet as you expect a MBP to be

It's not supposed to render 6K 3D all day at full speed in the tropics...


I think it's supposed to render fairly fast for a few dozen minutes though, and it fails at that. And I don't think it was 6K 3D footage either; considering the render time, it was probably closer to 1080p60.

"You're using it wrong" (ie, "it's not supposed for" with different wording) has been a long time excuse for many Apple failures at this point and I'm tired of it.


> It's not supposed to render 6K 3D all day at full speed in the tropics

Hyperbole aside, their advertising shows it hooked up to a 5K external display running the Unity debugger. So it should at least be able to render 5K 3D for a typical work day.


And it doesn't? But that was not part of the test in TFA, which was quite simplistic.


I have a Dell XPS 15 (2017) and the thermals are really bad. I saw constant throttling in my kernel logs and the laptop sometimes slowed to a crawl when compiling, running a ton of docker containers, using Light Table or even just having a ton of browser tabs open.

I finally ditched it a few weeks ago, finally replacing it with a Ryzen 7 desktop.

I'll admit my XPS 15 was a factory refurbished model, so maybe I just got a lemon. It got so hot the battery is now swelling and it popped out the trackpad (I think there might be a recall; or if not take it in since that's gotta be under warranty).

My current laptop is an HP Spectre. It's thicker than the XPS, but I think it's got a much better cooling solution and performs better. I don't think I'd recommend a Dell XPS again.


I don't know if this is sad or funny but my $350 Chromebook[1] that I modified to run Linux happily runs large Docker based Rails apps and has overall been a great development laptop for the last 2+ years. It has a 1080p IPS panel, a keyboard that I like more than my workstation, a real SSD and it weighs under 3 pounds.

[1]: https://nickjanetakis.com/blog/transform-a-toshiba-chromeboo...


YMMV, but in addition to a dodgy (manufacturer defect) and oddly designed (strangely shaped modifier keys) keyboard, my HP Spectre got really hot under comparatively minor load.


The higher end Latitudes seem like a better comparison to the MBP, I'd be interested to know how they stack up (thermal and otherwise).


I have the new 2018 9570 with an i9, and throttling is very modest. My workloads tend to be bursty rather than sustained, so I probably wouldn't notice in normal use anyway. But running the Intel XTU stress test, I only get very short periods of throttling.

Perhaps the 9570 is improved, or, as many suggest, Dell's QC is poor and I got lucky.


Oh man, it's not a lemon... My experience, with a pricey 7700HQ, 4K touch model, is exactly the same. First a faulty SSD, which the first replacement of the SSD and motherboard didn't solve (the second drive replacement did), then throttling, and then the swollen battery pushing out the touchpad. I even got a Ryzen desktop instead as well (1700X)!


Thermals on my i7 XPS 13 are abysmal. Never again.


Thermals on my early 2018 9370 i7 are perfect, and I haven't really seen any throttling, even doing video rendering, unless it's 85°F+ in my apartment (Seattle still doesn't know what AC is) and it's on my lap.

The fans do spin up and they're audible, but not louder than my Thinkpad or any of the Macbook Pro's we have at work. It is dead silent unless I'm actually doing intensive work.

It seems people have a lot of issues with older models though, so YMMV.


Same. It's hot, loud, and still never feels as fast as it should.


Loud? it's the quietest machine I ever had. No problems so far (9370 model)


Mine is the 9360 with the i7-8550U. In high performance mode, it's a hair dryer. In balanced, the fan is a constant annoying whine if I have, for example, Webpack in watch mode, or am typing code quickly and keeping the IDE's language service active, or if any browser tab has any active JS at all. Basically, if the CPU is at a reasonable speed and at 20% load, there's constant fan noise.

It's quiet if I have it in "silent" mode, but then I might as well be using a $350 machine, as far as performance is concerned.

Maybe you have a slower CPU, or they fixed the thermals in your model?


Then I'm glad I have the '18 model, despite the USB-C tyranny. I don't know if they fixed stuff, but it's a pleasant experience so far. The CPU is an i5-8250U, and with Windows 10 it feels snappy.

https://imgur.com/a/afvqtbQ

The fan is running at around 10%, barely noticeable.


I have 9360 with i5-7200U. I've just checked, fan kicks in after 90 seconds of sha1sum /dev/zero and stays always on lowest RPMs. 25C ambient. With 50% core load at 3.1GHz it stays silent most of the time...


50% at 3.1GHz would make for one noisy experience here. Either mine's a lemon, or, more likely, there's no way to put a CPU like the i7-8550U into such a small case without causing thermal issues.


Same here.


My old xps 13 with the i5 had great thermals on the other hand, but was still snappy enough for almost everything I threw at it. Didn't like it b/c the screen was just a tad too small for me and I had some problems getting good audio on the machine. Also, coil whine. It was awful.


Same here. 13 inch XPS, the coil whine is insane. Had the motherboard replaced 3 times, no improvement. It was a replacement for my ageing 13" Air, and I honestly wish I hadn't swapped. At first glance it's just as nice, the build quality and form factor are identical - and then you start using it, and see that Dell charges just as much if not more than Apple, but the quality of several different elements is ages behind Apple's.


I was thinking about getting an XPS 15 soon so this gives me pause.

> Dell charges just as much if not more than Apple

Not even close. XPS 15 with 6 core, 16GB RAM, 256GB SSD is $1500. MBP is $2400 for the same specs.

What I really like on the XPS 15 is the 97 WHR battery. I haven't seen any other notebook with this capacity.


I bought a Nvidia/i7 XPS 15 in March and have never understood the coil whine complaints, yeah there is some but nothing too bad (comparable to my 2012 MacBook Air). Alternately everyone else has super hearing or I’m deaf, YMMV.


Yes, I mean obviously right now the brand new MBP is absurdly expensive. But when I got the XPS 13, it was actually more expensive than an equivalent MacBook Air (with the caveat that the XPS had a higher resolution screen).


MacBook Air is not comparable to an XPS...


The one I have (9333) is literally identical in form factor and internal construction to the Air - if you open the bottom cover, the layout of everything is exactly the same; it's as if Dell copied the whole design as-is. Nowadays the XPS 13 has exceeded everything the Air can offer, but just a couple of years ago that was not the case.


You can buy an older (but unused) MacBook Pro with these specs (except it's 4-core instead of 6) for $1000, and it has way more battery life, a nicer screen and a better touchpad.


The Aero 15 has a 94 Wh battery and a better GPU.


What model XPS 13? My 2018 9370 has no coil whine even when I hold the keyboard to my head.

I think it's a fixed issue, but only in 2018 models, so YMMV.


Thermals are a real issue in the race to make the thinnest laptops possible. My ThinkPad T450s throttles just running Chrome and a bunch of tabs. I don't see how a competitor having thermal problems exonerates Apple from having similar problems.


A big part of why I got the T480 instead of the T480s was to avoid the thermal issues. Seems okay so far (I've only had it a couple of days), but who knows how long that'll last.


In biology thin would imply a high surface area to volume ratio, and great performance if you're trying to get rid of internally generated heat. Here, I guess we're talking about room for cooling systems so as to avoid blistering users' skin or charring their pants?


We were able to shave 1mm off of the skin by removing the sweat glands. When you start to feel hot just stop what you're doing.


A CPU in a laptop destined for any target market that doesn't even hold base clock speeds should not have made it out of the QA dept.


But does the i9 version have performance issues (i.e. does it underperform compared to the i7)? The XPS 15 with the i9 configuration comes with 32GB RAM and is much cheaper than an equivalent MacBook Pro. I can deal with it getting hot!


The XPS 15 i9 does do a bit of throttling, but off the boost clock (it manages to mostly stick to 4GHz in the stress test that Lee shows) while the MBP i9 apparently can't even maintain the 2.9GHz base clock.

That, along with performance that's below the previous-gen i7 but charging $300 more is the real kicker - honestly, I'm not sure how anyone at Apple would be able to justify that as not just bilking their customers out of money.

It seems like these days, the mantra is "buyer beware" if you're looking to buy a Mac.


IMO the last good laptops Apple made were the pre-Touch Bar Retina MBPs. I've used three different ones, two for work and one that I purchased myself, and they were all solid performers with excellent screens, excellent keyboards, excellent performance, adequate ports, and adequate portability. The new MacBook, with its terrible everything-aside-from-thickness-and-weight, showed that Apple has lost interest in making a usable laptop in favor of any kind of differentiation they can get. Even if it's pointless or apparently even worse than nothing.


In the video he shows the i9 rendering slower than the i7. When he runs the render on the i9 in a freezer then it runs faster.


A couple of weeks into f/t use, mine (XPS 15" 9570, i9, 32GB RAM) doesn't get hot at all (fairly typical dev stuff - Docker, VMs, browsers, IntelliJ). It throttles intermittently, for short periods (perhaps 5% of the time) under a stress test.


I have had thermal issues both in my old Lenovo T61p and in a more recent MacBook Pro 15, both with dedicated graphics cards.

No amount of replacing thermal paste on my part was ever able to improve the situation, with the result that I had machines that were powerful only on paper, brought to their knees even by light workloads.

My Mac becomes impossibly hot and slows down playing games like Hearthstone and Civ5, FFS.

I have long decided that my next laptop (regardless of the brand) will have integrated graphics and a medium TDP CPU, and I will just limit heavy workloads and gaming to my desktop.


> I have long decided that my next laptop (regardless of the brand) will have integrated graphics and a medium TDP CPU, and I will just limit heavy workloads and gaming to my desktop.

You get more bang for your buck this way, too. Desktop hardware costs a fraction of a comparable laptop's.


Of course. A laptop, no matter how powerful and well designed, is always going to be a compromise compared to a desktop, in terms of raw power and relative cost.

The tradeoff is convenience, where you would have single machine that is able to fill all your needs (power and mobility), instead of two.

But, for me at least, in the end it's not worth it. Also, cloud and sync services made working in parallel on multiple machines almost painless.


> Do people really think he just got a lemon?

Well, other sites have done benchmarks... have any of them reported it being slower than last year’s model?


Dave Lee's tests usually come from a gamer and media creator's perspective.

In most, if not all, of his reviews he makes a point to check thermals and throttling related to heat, because that matters to the gaming segment of his audience.

As for media creation, he always tests Adobe Premiere for his cross-platform media creation tests while acknowledging that Premiere on Mac is less performant than it is on Windows, and that there's a significant number of Mac media creators who prefer Final Cut.

The better question to ask is if any of those other sites who did benchmarks did heavy workload testing on the computers for extended periods of time. Running a CPU hot for short bursts can yield different comparative results than for a task that takes 30 minutes or more.


> The better question to ask is if any of those other sites who did benchmarks did heavy workload testing on the computers for extended periods of time.

I would expect so—benchmarks often have long-running tasks; if a task only takes a few seconds it's hard to distinguish the overhead from the task itself.

But if nothing else—they will do so going forward.


>> benchmarks often have long-running tasks

Yes, but depending on who you're talking to, "long-running" could mean a few minutes or several hours.

Throttling doesn't usually have a major impact on tasks that take a few minutes. But on heavy, long running loads, the effect of throttling starts to add up quickly, which is why the i9 MBP couldn't outperform the 2017 i7 MBP until it was stuffed in a freezer.

While the use of Premiere might raise eyebrows, I don't think it matters in Lee's testing, because he ran the same test on both the 2018 i9 MBP and a 2017 i7 MBP. Either way, it was a heavy, taxing, long running task.

I don't think anyone will dispute that the i9 has significant performance gains for people whose CPU utilization spikes are relatively short -- and that might be the majority of people who will opt to buy an i9 equipped MBP.

For content creators and other users who need to tax the processor for long periods of time, however, those users might be better off with the i7 version of the MBP.
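To make the sprint-vs-marathon point concrete, here's a rough model in Python; every number below (boost clock, boost duration, sustained clock) is invented for illustration, not measured from either machine:

```python
# Average effective clock over a task, assuming full boost until the
# heatsink saturates, then a flat sustained (throttled) clock.
def effective_clock(task_seconds, boost_ghz, boost_seconds, sustained_ghz):
    boosted = min(task_seconds, boost_seconds)
    sustained = task_seconds - boosted
    return (boosted * boost_ghz + sustained * sustained_ghz) / task_seconds

# Hypothetical "i9-like" chip: high boost, poor sustained clock.
# Hypothetical "i7-like" chip: lower boost, better sustained clock.
for task in (30, 1800):  # a 30 s benchmark vs a 30 min render
    bursty = effective_clock(task, boost_ghz=4.1, boost_seconds=60, sustained_ghz=2.2)
    steady = effective_clock(task, boost_ghz=3.6, boost_seconds=60, sustained_ghz=3.3)
    print(f"{task:>4} s task: bursty {bursty:.2f} GHz, steady {steady:.2f} GHz")
```

In the 30 s run the bursty chip looks faster (4.1 vs 3.6 GHz effective); over 30 minutes the averages flip (roughly 2.26 vs 3.31 GHz), which is the pattern a long render exposes.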


> Dave Lee's youtube video is a withering takedown, presented dispassionately.

You've said this in two threads that I've read and I truly do not know what you mean by it. Is it a bad thing that Dave doesn't scream at his viewers when making a video? Or do you mean something else entirely?


It's impossible he had a faulty model?


Considering it ran fine with a lower ambient temperature, likely not...


It's summer now and as soon as I plug my MacBook Pro 2017 in to charge, it starts throttling itself due to the heat of the charging battery. Same thing happened to the 2012 I had before.

These things just don't have adequate cooling.


I often work outdoors, and merely being in sunlight makes my 13” MBP go to max fan speed after about ten minutes, and insufferably slow after a half hour or so.


That doesn't strike me as particularly noteworthy/unexpected.


You don’t think it’s reasonable to expect a laptop to work in daylight?


A 15" laptop's body could easily be absorbing 75 watts of heat from the sun, equivalent to running both the CPU and GPU of a MacBook Pro pegged at 100%.

I think throttling in this scenario is a reasonable tradeoff, much more so than overengineering the cooling solution to handle it and thereby increasing cost, weight and bulk. Sit in the shade.
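The 75 W figure holds up as a back-of-envelope estimate; all three inputs below are rough assumptions (peak sea-level irradiance, an approximate 15" lid footprint, and a guessed absorptivity for dark anodized aluminum):

```python
# Back-of-envelope: solar heat absorbed by a 15" laptop lid in full sun.
irradiance = 1000        # W/m^2, rough peak solar irradiance at sea level
lid_area = 0.35 * 0.24   # m^2, approximate 15" laptop footprint
absorptivity = 0.9       # guessed for dark anodized aluminum

absorbed_watts = irradiance * lid_area * absorptivity
print(round(absorbed_watts, 1))  # about 75.6, in the ballpark of 75 W
```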


Doesn't the operating temperature of a Mac laptop max out at something like 35°C? That's woefully low for anywhere it gets hot, like Australia for example. It basically means that on a hot day in summer you cannot use your laptop unless you're indoors in air-conditioned comfort.

EDIT: Yes, I checked, the MacBook Air has an operating temperature of 10-35°C and a storage temperature of -25-45°C, which means that it's technically unusable or out of warranty if you use it in a hot country on a hot day. Like in the tropics on a beach, for example.


I think you'll find that any device with a lithium ion battery will have similar temperature ratings if the manufacturer is being honest. Everything else in the machine is fine with much hotter temperatures, but cooking the battery is a bad idea.


The battery heat issue is especially problematic with LiPo cells, since they tend to swell up and break open the surrounding device enclosure.


35ºC == 95ºF - I'm at the upper end of that range sometimes, but not exceeding it.

I typically work from a hammock, in the shade, until it gets to 90-95ºF. The throttling happens even when the temperature is in the range of 80-85ºF, and CPU load makes it much worse. It's not uncommon for me to have to seek air conditioning if I'm running a large test suite or rebuilding docker containers, else my machine throttles to the point that I can't even use Safari.

I've never had this problem with any other laptop I've owned, including previous generations of MacBooks and MacBook Pros.


> like 35c

> you cannot use your laptop unless you're indoors in air conditioned comfort

Maybe it's just me being used to colder climate, but I feel that laptop's thermal throttling is going to be a relatively small issue if trying to work in sunlight when air temperature is +35C out there.

I'd really seek some shade even when it's merely +25C...


Not necessarily in the sun at that temperature, but, say, indoors where there is no air conditioning. It's not bright sunlight, but it can easily get to +35°C. You yourself can cool down with a fan and a cold drink, but the laptop is sitting around in hot temperatures in the shade.


"You're holding it wrong!"

Other laptops work in the sun.


I live on a beach in Thailand and it can get pretty hot at times. It is a beach in Thailand after all. I have no thermal issues with my Lenovo Yoga when sitting directly in the sun. Lots of MacBook Pro users around here, and they do run into thermal issues (random shutdowns mostly) if they sit in the sun doing something CPU intensive. Around here that would be using Ableton (a music program), with up to 4 tracks loaded at the same time. We are blessed with amazing DJs around here and they all use a Mac.


My experience with various portable computing devices over the years is that they overheat and shutdown to protect themselves when used in direct sunlight combined with warm to hot ambient air temperatures.

I do not expect the manufacturers to overbuild the thermal capacity such that this isn't a problem, not when we're talking about portable devices which (appreciably) prioritize compactness and lightness.

In the past I would have to shade my laptop while using it to play MP3s from the passenger seat with the convertible top down while on road trips through Nevada, to prevent it from overheating in the direct sunlight. It never crossed my mind that this was some kind of failure on IBM's part; the machine was already pushing the limits of what was possible at the time in the interests of being small and light.


As a mechanical engineer, no. It’s basic heat transfer physics. You can’t throw money at the laws of thermodynamics and expect a physics bending solution.


I think it's unreasonable that a laptop should work at all -- knowing the myriads of components and software that makes it work and has all kinds of complexity.

As for the daylight, the sun is not the same as daylight. Devices have tolerances for heat, tropical heat in Miami is not the same as daylight in Chicago.


I'm in the middle - northern Arkansas.

To be clear, I don't make a habit of working in direct sunlight. Screen glare makes that impractical, even if the temperature warrants it. I do generally work from a hammock in the shade, and I deal with thermal issues constantly.

Even in the shade, at around 85ºF, I find that I start throttling noticeably when I run large test suites, recreate complex python environments, or build docker images. I keep a close eye on my system's sensor temps through iStat Menus, and try to tailor my laptop's workload around it. For example - I use Slack on my iPhone or iPad instead of on my laptop, and keep Safari closed unless I'm actively testing something in it.

Still, it seems like my machine throttles to uselessness right around noon. That's when my battery hits 20% or so unless I've been plugged in to my external battery, and about the time the temperature hits 85-90ºF.


In the summer I have to unplug one of my external monitors because my (maxed out) 2014 Macbook Pro gets too hot.

I'm going to try using my linux desktop when it happens next, that at least has adequate cooling.


Same here. 2014 maxed out, doesn't do well in the Spanish summer. The external screen mostly stays disconnected, otherwise it throttles down after 30 mins. Are there not some kind of cooling pads to help with this shit?


I've had to do the same this summer in the UK. It's the first time we've had weather hot enough to trigger a problem, but it has been CRAWLING. :(


Same problem, I resorted to just have a fan blow on it to help with getting the heat away...


Do you have a way to monitor or measure the CPU throttling on macOS?

I've found Intel Power Gadget [1] which looks like it might help, however, I'm not really sure what these graphs should look like when throttling vs not.

[1]: https://software.intel.com/en-us/articles/intel-power-gadget...


Look up the specs for the processor in your system, then chart the temps alongside per-core clocks.

On Intel chips with a 1-to-4-core boost, no core will rise above the base clock when thermally throttled; voltages and frequencies will drop. You can run a CPU-bound benchmark until temps rise and note when the performance begins to decrease.

On desktop Intel processors, this usually happens between 70-80°C.
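A minimal sketch of that detection logic in Python; the base clock and the log samples here are hypothetical (real data would come from a logging tool like Intel Power Gadget):

```python
# Detect thermal throttling from logged (temp_C, per-core GHz) samples:
# when throttled, no core rises above the chip's base clock.
BASE_CLOCK_GHZ = 2.9  # hypothetical; look up your CPU's spec sheet

def is_throttled(core_clocks_ghz):
    return all(clock < BASE_CLOCK_GHZ for clock in core_clocks_ghz)

log = [
    (65, [4.1, 4.0, 3.9, 4.1]),  # cool: full boost
    (92, [3.1, 2.9, 3.0, 2.9]),  # hot, but still holding base clock
    (99, [2.2, 2.1, 2.2, 2.0]),  # heatsink saturated: throttled
]

flags = [is_throttled(clocks) for temp, clocks in log]
print(flags)  # [False, False, True]
```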


That should work for revealing throttling -- you will see that it tops out at 1.1 GHz or something.


I have to run an accessory fan stand under my MBP.


But they’re so thin!


This is crazy. They have a metal body to use as a heat sink. Just put shielding on the parts that touch human skin and let the rest go toasty.


> Just put shielding on the parts that touch human skin

That’s literally the entire case. There is no part of a laptop designed to never touch human skin.


There is another video [1], in which a 2017 i7 outperforms a 2018 i9 in video compression due to its severe throttling. Sure the benchmarks look great and are better on the i9, but as someone mentioned, that's like looking at sprint compared to a longer race.

[1] https://www.youtube.com/watch?v=ip-sZfWaVo0&t=04m37s


I keep seeing thermal issues come back to bite some of the smartest companies with products designed by the world's best engineers. Everything from batteries to laptops to phones is severely affected by heat, and improper thermal design can lead to some huge issues.

For example, the biggest thing affecting battery life is thermal load. The biggest thing affecting the maximum sustained power of a laptop or phone is the thermal design. Anyone can slap a cheap, powerful chip in; good thermal design is much harder.

I think people are going to start realizing this more and more, especially as powerful chips get extremely cheap.


Don't forget the infamous Xbox 360 Red Ring of Death, caused by poor thermal management: the board flexed, and balls on a BGA package lifted from the pads underneath.


Which is ironically also how you fixed it. I'll never forget the "magic" of wrapping my Xbox 360 in several towels and letting it overheat for 30 minutes to supposedly resolder it back together. It worked for another 6 months after that until it completely and finally died.


The problem was more precisely that thermal variations, combined with the way the heatsink was attached and pushed on the middle of the CPU, repeatedly warped the board through thermal expansion, causing the balls to unseat. Heating the thing would help reseat the BGA thanks to thermal expansion, but would certainly not reflow the solder (not enough heat).

The proper fix was to try to reseat the BGA by whatever means and change the way the heatsink was attached so that pressure on the BGA/board was even. I did that on two units and they survived for years after their first RRoD, and on a third unit right from the get go, and that one never suffered of any issue.


The Xbox 360 is the only thing I’ve ever bought the 3rd-party warranty for at an electronics store. I don’t really know why I did it that day, but I sure was happy to have paid the $20 or whatever when I wound up exchanging FIVE of them before I got one that lasted more than a few months. What a mess.


Is the Intel upgrade treadmill starting to show its age?

How much longer can we keep making the chips "faster" when the tradeoff is increasingly just more thermal issues?

Can't wait for a decent A-series chip MacBook.


Well, we could go back to having laptops that are thicker than a Bic pen and have actual cooling systems in them. I have physical spiral bound paper notebooks that are thicker and heavier than my work notebook.


The 'intel upgrade treadmill' consists of reducing transistor size and upping clock; the former of which reduces thermal issues, and the latter of which increases them.


> consists of reducing transistor size

It used to. The problem here is that they have not been able to ship next gen chips and so they started shipping a 6-core chip on the same process as the old 4-core, and that doesn't work so well.


The chip works fine. Apple used an insufficient cooling solution, Dell XPS laptops with a better cooling solution are doing fine.


The new chip has the exact same TDP rating as the old one. Unless Apple has made their thermal solution worse (highly unlikely) it is Intel that is fudging the TDP ratings.


The old chip pulled 33% less than its rated TDP in the real world.

Also, lol if you actually think TDP means anything anymore. Over the last 5 years it's become a marketing number more than anything else. It gives you a vague category of what power consumption will look like... at base clocks, under a non-AVX workload. Modern processors will happily blast past their TDP by 50-100% if there is thermal headroom available.

(And that's where laptops come up short - they can't actually cool a steady-state load, so these laptops will happily turbo themselves into a wall, and then once the heatsink is saturated they'll thermally throttle. Looks great in a 30-second benchmark though!)
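That saturate-then-throttle behavior can be sketched with a toy lumped thermal model; every constant below is invented for illustration, not taken from any real chip or chassis:

```python
# Toy model: chip turbos far past its rated TDP until the heatsink
# saturates, then drops to the power the cooler can shed continuously.
TURBO_W     = 90.0   # burst package power
SUSTAINED_W = 35.0   # what the cooler dissipates at steady state
HEAT_CAP    = 60.0   # heatsink thermal capacitance, J per degree C
AMBIENT_C   = 25.0
LIMIT_C     = 100.0  # throttle trigger temperature

temp = AMBIENT_C
powers = []
for second in range(300):
    power = TURBO_W if temp < LIMIT_C else SUSTAINED_W
    temp += (power - SUSTAINED_W) / HEAT_CAP  # net heat accumulation
    powers.append(power)

first_throttle = powers.index(SUSTAINED_W)
print(f"full turbo for {first_throttle} s, throttled thereafter")
```

With these made-up numbers the heatsink soaks up the excess for a bit over a minute, so a 30-second benchmark never sees the wall, while anything longer runs mostly at the sustained budget.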


> The old chip pulled 33% less than its rated TDP in the real world.

That sounds slightly made up. As you mention yourself chips pull vastly different amounts of power, and put off vastly different amounts of heat depending on workload.

> lol if you actually think TDP means anything anymore

Yes, it does not mean anything because Intel releases CPUs that exceed their ratings because they can't get their process right, which is my whole point.


You're assuming Apple's thermal solution was adequate to begin with.


No one has been able to reduce transistor size as easily as they used to.


Why do you think putting an ARM chip in a laptop will solve anything? If you want 5-8 year old x86 performance, buy an HP Envy and see how well your full-blown desktop apps run.

https://www.cnet.com/reviews/hp-envy-x2-snapdragon-835-revie...


Apple's A11 benchmarks twice as fast as the Snapdragon 835. And if Apple was going to put an ARM in a laptop, they would surely design a chip specific to that use case with even higher performance than one that has to go into a passively-cooled smartphone.

(not saying I'm convinced an Apple ARM would actually outperform the x86 but that waving around Snapdragon machine doesn't prove your point)


I don’t want an iPad for my laptop (I love my iPad but it’s utterly unfit for anything but browsing around)


That's really interesting. Do you have any suggestions on some further reading?


Some of it, in Apple's case, can be ascribed to their quest for ever-thinner laptops: not only did we lose ports, it may also be affecting their ability to adequately cool them.


At this point, people in the Mac hardware engineering leadership need to start submitting their resignations or being forced out. This kind of sloppy engineering and lack of product focus is the antithesis of what Apple has stood for.

If I were Tim Cook I would fire Dan Riccio, and then resign myself and hand the company over to someone who cares about products and not supply chain optimization.

Also the whole executive Memoji thing is incredibly stupid and childish. Please take your jobs more seriously -> https://www.apple.com/leadership/


Why immediately jump to firing Dan Riccio? He was in charge during the glory days of Apple design.

Instead, I'd probably begin by taking a look at Tim Cook and Jony Ive's input and approach to product design. It shouldn't be hard.

Look for the people championing the "go thinner, lighter!" "form over function" cause and act accordingly.


I don't think there has ever been a MacBook Pro that I can safely call a mobile workstation: that is, one that runs for long periods of time with no thermal throttling and no burning chassis. Every time I see Apple further thinning the chassis I shrug, because no matter how much more efficient Intel makes their CPUs at idle, or how fast the system races to idle, sustained loads remain the same (~35 watts), and Apple keeps taking more and more of the tiny headroom available.


The executive team had nothing to do with the Memojis. It was probably someone from marketing or their web team. Why even bring that up? It doesn't help your argument at all.


Actually, I agree with him. It's fairly unprofessional; I'm usually one for casual, but it just feels immature.


> The Executive team had nothing to do with the Memoji’s.

I think Steve Jobs's own philosophy would override that assertion: Steve Jobs On The Difference Between A Vice President And A Janitor https://www.sfgate.com/news/article/Steve-Jobs-On-The-Differ...


The thing is, the MBP is fulfilling Apple's design goals: they are optimizing for thinness, lightness, and battery life, not performance or thermal dissipation. As long as the thing doesn't break (in a way that can't be fixed trivially by the Geniuses at the Genius Bar) everything's fine as far as Apple's business is concerned.


But if you charge people for a CPU that has been modified or set up so it cannot achieve the advertised specs, then it is mis-selling.


Intel's CPUs are likely not modified (very much). All Intel CPUs automatically downclock themselves when they start to overheat as a self-protection measure.

It is up to the case manufacturer to provide adequate fans and cooling for the chip. Otherwise, Intel's chips are forced to protect themselves and start to slow themselves down.


Last I heard, TDP was a manufacturer-configurable item that could be set within a pretty wide range, and the limits would act accordingly.


This seems the same as someone selling you a 4K monitor (for 50% more than other 4K monitors) without ever mentioning that you can only actually have 4K for 5 min max, otherwise it's QHD. It sure sounds like something that should be illegal to me...


I like how you note the customer needs have no role in the Apple design process.


The fact that Apple is the most successful company in the world seems to suggest that their method of thrilling the customer by subverting and surpassing their expectations beats building what the customer says they want. In the case of laptops, it turns out that lightness and battery life -- portability -- is what people actually want, otherwise they'd just use desktops.


I’d guess the software, as that’s the primary distinguishing factor. I don’t know of anyone who runs windows on a mac, even though that’s relatively easy. However, there’s a thriving hackintosh community.

It’d be easier to tell if Apple offered consumers choice themselves.


> otherwise they'd just use desktops

That's not an option for many. If you use a desktop machine in your office, you can't ever work from home, and you can't carry your machine into meetings to demo something.


> If I were Tim Cook I would fire Dan Riccio, and then resign myself and hand the company over to someone who cares about products and not supply chain optimization.

You mean, if you were someone who cared about product, couldn't execute on it yourself, but also had the capacity to direct Tim Cook’s actions, you would cause Tim Cook to do all that.

If you were Tim Cook, you would either not have the concern that motivates this or you wouldn't need to resign to have someone who cares about product in charge.


If Tim Cook hasn't stepped down over all apple failures he's overseen, he's not going to do it because the new MacBook gets hot.


Here's the catch -- those with a financial interest in Apple (i.e., shareholders, directors, etc) would likely consider the sales performance of [insert Apple product here] a success when those of us who care deeply about the products themselves would consider [insert Apple product here] a failure.


Here's a review from a dev who works at NASA:

http://hrtapps.com/blogs/20180712/

Maybe the key point is he isn't running Adobe Premiere.

The thermal issues in modern laptops certainly concern me, but they've been running hot for a long time now, and being thin doesn't actually change the surface area significantly.


Well the single-core test is unlikely to max out the thermal loading on the CPU. You can see how the scaling from 4-6 cores provides severely diminishing returns, perhaps due to throttling.


Uh, it turbos in the 4+ GHz range. That by itself is probably going to blow past the 45W thermal rating the entire chip has. My gut instinct says that MacBook probably can't even dissipate that over a long period of time. The linked test seems to indicate as much, and it's not unusual for a Mac product; the last one that could probably continuously dissipate the combined CPU+GPU heat was that crazy ugly Mac Pro.

Intel didn't invent some new physics to get "high" performance cores crammed into a laptop; they are just pulling the same stuff the phone market has been doing for the past few years. Put enough thermal capacitance against the chip itself to absorb a few multiples of the chip's dissipation, allow it to get really hot, and then throttle it hard. Hope people keep their benchmark runs short.

There is a difference between socket 2011/2066-based i7/i9s and their insufficiently cooled, memory- and IO-bandwidth-starved cousins found in laptops. Even a fairly low-end desktop is going to stomp the most expensive laptop you can find, simply due to physics. Being able to dissipate more heat means higher performance. It's possible to move the curve up or down depending on fab process or microarchitecture, but these days there aren't any miracle cures. More power equals more performance.


Isn't the 4 GHz meant as a boost frequency as thermals allow? The more CPUs are active the rarer this happens. It becomes problematic when the CPU can't maintain its base clock speed anymore.


Look at how fast the curves on the 2018 flatten out after 3 or 4 cores.


> being thin doesn't actually change the surface area significantly

No, but it reduces the space for a better cooling system (pipes, fans, etc).


Yes, I also think we need to see more tests with other apps than Adobe Premiere.

Does Premiere use the GPU as well? The thermal system of the MBP needs to handle the combined thermal load of the GPU and CPU.


One of the many downsides of making all your computers like two millimeters thin.


Indeed. What ever happened to product differentiation?


To be fair they’ve always been bumping up against this since they don’t like to put tons of fans in their laptops.

While this obviously shouldn’t happen, it seems like something you could easily predict might be a flaw if you knew that one had to exist.


Oh they put the fans in there, they just don't spin them up until components are half desoldering themselves off the board.


They were already bumping up against this back in the Apple II days, and systems would overheat then, too. All in the name of Steve Jobs emphasizing case aesthetics and not wanting fan noise.


Eh, that reminded me of the paper chimneys some people advocated for better air flow on the original Mac or MacPlus.

(e.g. https://www.google.com/search?q=mac+chimney , namely https://www.flickr.com/photos/idontlikewords/275104234 - some case aesthetics ;-)


Every MacBook Pro I've ever owned locked itself down to a low frequency after running a CPU-intensive task for a minute or so.

I thought that was common knowledge? There are NO laptops out there that wouldn't do that (and NO mobile devices either). This is why desktop workstations are still so much faster even though they carry same-frequency chips.


What makes you think this? Lots of laptops have severe throttling issues and the thin-and-light trend of the last few years has not really helped the situation but there are certainly laptops out there that have adequate cooling systems and do not thermally throttle. I was kind of shocked when I got the original Thinkpad X1C (14" screen and weighs 3lb) that it maintained 100% boost clock with no throttling over the entire span of hours-long video encoding workloads.


Take Intel's CPU frequency tool, run a compilation on your MacBook, and see how the clock gets locked to < 2GHz (the exact frequency depends on the CPU you have) after the CPU temperature reaches a barrier (usually around 80°C).


I don't have a Macbook? I have a Thinkpad that remains at 100% of the 2.6Ghz max turbo clock indefinitely at ~60C because it has a good cooling solution.

Notebookcheck reviewed the i7 version which is clocked higher than mine. Max boost is 3Ghz and in their tests it maintains 2.8Ghz running Prime95, drops to 2Ghz while running simultaneous CPU and GPU stress tests.

https://www.notebookcheck.net/Review-Lenovo-ThinkPad-X1-Carb...


Sure, but the amount of throttling his tested configuration was running at was fairly significant.

Most laptops can at least run their base clock most of the time; his MacBook couldn't even do that (2.2 GHz vs 2.9 GHz). The XPS 15 is considered a PC laptop with subpar thermals, and even its issues only crop up during turbo, not at base clock (same CPU).


I've been doing hours-long CPU+GPU intensive tasks (compiling ghc, number crunching, playing games like Bioshock Infinite and StarCraft II on a HD4000) on a bunch of pre-retina MBP and rMBP 13" and they never throttled. Thermal design and management is spot on, fans stay silent most of the time and kick in only when load requires it.

Beware of anecdata. Given the stories I hear I would squarely put the various issues in the realm of thermal paste gone bad, busted fans, or clogged airducts, and certainly not on intrinsic bad design because I have personal counterexamples of dozens of various Apple laptops from 2008 to 2016 being perfectly fine out of the box.

Obviously though, a desktop will beat a laptop's thermal performance any day but I think this is missing the point.


> I've been doing hours-long CPU+GPU intensive tasks (compiling ghc, number crunching, playing games like Bioshock Infinite and StarCraft II on a HD4000) on a bunch of pre-retina MBP and rMBP 13" and they never throttled.

I find that hard to believe. I have been gaming a lot on MacBook Pro unibodies and retinas, with and without an external GPU. Especially when the internal GPU is active, the thermal load is just too high for many games and applications.

Throttling normally starts in a very subtle way by disabling clock boost and is hard to notice. Throttling below the base clock is rare and this is where it becomes really noticeable.


I thought that was one of the points of Turbo Boost? To take advantage of high clock frequencies for short high-intensity bursts while still running at the base clock for long-term work. If the clock speed is staying above the base clock, I wouldn't consider it a problem.


It isn't throttling if it is operating at or above base clock.


My data comes directly from monitoring the CPU frequencies as I was doing development work. There's an intel tool that gives you that data in realtime.

I did mostly only have the 15" MBPs, but thermal throttling is a fact of life on pretty much all mobile devices. I do mobile development work and having mobile CPUs run at full frequency for more than a minute or two is pretty much impossible.


>There's NO laptops out there that wouldn't do that (and NO mobile devices either).

My 10-year-old HP nc6400 doesn't do that. It requires occasional (once a year) cleaning of the fan assembly though (not that hard to access). The fan is loud and the laptop is bulky, but it never ever throttles. The CPU has a 34W TDP; I do not think that's on the low side.


I've only ever had issues with 5yo+ laptops, and that's due to thermal paste degradation. I wouldn't pay for a laptop if it thermal throttles. I'll forgive the passive cooling on my phone for not keeping up with the CPU, but if the fans on a laptop are running at full power and the chip still thermal throttles, why should I even consider buying it?

Keep in mind, there's a difference between thermal throttling and low boost clocks on multi-core workloads.


That, and also battery life (and the extrapolated time left until empty) takes a severe hit when doing CPU intensive task for more than a minute.

Does anyone know if thermal throttling is as much of an issue for iPads?


I use smcfancontrol on my imac to force the fans to do their thing. The problem seems to be that osx prefers throttling over turning the machine into a vacuum cleaner. For me this is the only way to play games. Without smccontrol throttling kicks in and the fans never go on fully.


This is why the pro desktop market is not dead.

Rendering real-time path tracing while designing a house, visualising point clouds, video editing, sound design: you can do it on a laptop, but the pro market will buy a desktop because it's 'cheaper', more reliable, and upgradeable.


I guess the key is that it under-performed vs last model's i7... not that it has thermal issues. I would think most laptops have some level of thermal trade offs.


I’d say that not even being able to maintain the advertised base clock is a pretty key issue. Being slower than last year’s model just adds insult to injury.


The key is that the thermal issues are so bad that they cause it to underperform against the last model's i7.


My guess is Premiere already had optimizations for the 2017 MacBook Pro to lessen the number of hits to the throttle threshold temperature.


Isn't it the case that access to 32GB currently requires more power use (hence the larger battery), and that more power use necessarily means more thermal stress? In which case there may be nothing wrong with the thermal paste or the engineering; it may simply be that Apple came to the conclusion that the 32GB was more important than the issue of thermal throttling. For my use, I would readily take a computer that is a little slower than the 2017 model on extremely compute-intensive, long-running tasks, but that lets me use 32GB of memory.

Whether they are dishonestly marketing the machine as being faster is another matter.


I have been watching Dave2D videos for years. He is one of the best laptop reviewers on YouTube.


I suspected as much which is why I didn't bother with the ~£300 upgrade from the default 2.6Ghz i7 to the 2.9Ghz i9.

The MBP already struggled with the old i7 and with no redesign I couldn't see how they could handle the i9 without extreme thermal throttling. Turns out they can't but ship it anyway!


After seeing those benchmarks, I'm feeling even better about having just decided to go for an Aero 15.

I quite like Mac laptops, but I've concluded that I don't really need one. I'll just grab one of the new (rumored) Mac minis to continue MacOS development as needed.


The 2018 MacBook Pro having thermal issues doesn't surprise me after seeing what my 2015 model does under load, but it's really disappointing since I've been anxiously waiting for Apple to release new laptops.

This 2015 MacBook Pro will avoid turning the fan on at any cost. I frequently watch the CPU package heat up to about 190F while the fan is still at 0RPM.

Further, the SMC chip and/or kernel are set to use fairly slow fan speeds compared to the amount of heat being produced, so even when it does come on, the system remains hotter than necessary.

But, here's the major problem: it's also become clear that there is a thermal window where the system is hot enough to cause it to throttle itself, perhaps somewhere between 140F and 180F, but not hot enough for the SMC to turn the fan on fast enough or at all, even though it could and should.

So in addition to being really hot and not using the fan to get the heat away from your lap, it gets slow at the same time, right when it's being used. The system becomes faster soon after the fan is turned up to the max.

And to top it off, there appear to be two different throttling mechanisms, the CPU speed can adjust up or down (you can see this with the Intel Power Gadget), and Apple has some kernel system that will use CPU cycles for idling instead of scheduling threads that are waiting, which reduces overall CPU usage, and therefore also heat production.

All combined, it looks like the system is constantly bouncing between ramping up the CPU frequency to handle load, then wasting some of the additional CPU time idling to avoid turning the fan on, then turning the fan on just long enough to get the heat down before turning the fan back off again, and repeat.
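That bouncing behaviour can be reproduced with a toy control-loop model. Every constant below is invented purely for illustration; the point is only that a firmware that prefers throttling to fan noise will oscillate between a throttle point and a fan trigger, exactly as described:

```python
# Toy thermal model. Thresholds and coefficients are assumptions, not
# real SMC values: throttle at 85 degC, fan forced on at 95 degC, fan
# kept on (hysteresis) until the chip cools below 70 degC.

def simulate(ticks=300, ambient=30.0):
    temp, fan_on, history = ambient, False, []
    for _ in range(ticks):
        throttled = temp >= 85                         # assumed throttle point
        fan_on = temp >= 95 or (fan_on and temp > 70)  # assumed fan hysteresis
        power = 20.0 if throttled else 45.0            # watts, invented
        cooling = 1.5 if fan_on else 0.25              # W per degC, invented
        # Newton's-law-style update: heat in minus heat carried away
        temp += 0.1 * (power - cooling * (temp - ambient))
        history.append(temp)
    return history

hist = simulate()
print(f"late swing: {min(hist[150:]):.0f}-{max(hist[150:]):.0f} degC")
```

The late-run temperature swings up slowly under throttled load, gets knocked down by a brief fan burst, and climbs again: the same sawtooth the comment above describes, if only in caricature.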


Well I guess my next machine has to be a PC running Linux now. Been waiting to replace my aging MBP with this latest gen, but this is the last straw for me. Charging such a premium and being so lazy (or thin-focused) that it can't meet its base clock outside of a freezer is ridiculous.

I just wish I could run OSX in a VM. Linux as a primary desktop isn't quite there yet.


You could run an OSX VM in either Windows or Linux (I'm doing fullscreen Linux VM on Windows with a Thinkpad, and it's quite good/fast). Idk if there are issues with licensing, though.

Also, give Linux as a primary desktop a try! What do you believe that's lacking, from your perspective?

Granted, it's not great out of the box, and some software isn't available, but if you have a couple of weekends to spare/invest with tinkering, you can have a pretty and solid desktop workstation setup.


I've been thinking the same for some time. Razer makes supposedly quite good premium laptops with some degree of linux compatibility. They also seem to have a community of enthusiasts behind them so I will likely try that or the Surface Pro.


Been waiting to replace my aging MBP

I replaced my MBP with a Surface and I’m very happy with it, so add that to your list. WSL beats my previous setup of VirtualBox easily for Linux stuff.


Macbook “Pro”.


It pains me how correct this feels. The MacBook Pro becomes an echo of itself every generation.

Soon, the beat fades and we're left in the dells


I’m sticking with my 2015 15” until it bites the dust and then probably getting something else if Apple hasn’t figured this out yet. Even their software is degrading. I feel like I’m doing better by not running updates (I know this is not so, for security reasons, but...)


My favorite is the latest Air (years old now) which doesn’t even have a retina screen yet, a perk in my book.

I could watch Netflix on my bare stomach with my Air. Latest Pro is quickly too hot to handle. Of course not the only issue but the computer never bounced back from that first impression.

The dedicated gfx card doesn’t even seem to run games like Civ 5 better than my Air which I’m guessing is due to all the extra screen pixels.


> I could watch Netflix on my bare stomach with my Air

Don’t try that with a first gen Air. It had a very significant problem with thermal related issues. Several times a week it’d get so hot I could make pancakes off it, the rest of the time it was in a throttled state. Was a very common problem among folks I worked with (albeit mine was a personal, theirs was a work machine) too.


The Air story is incredible. It went from compromising, underpowered, overheating, pricey, loathed and ridiculed, to everyone's favorite laptop, with excellent performance across the board at a stupefyingly low price point.


I do wish they would bring back the Air’s keyboard on higher end models.


> Latest Pro is quickly too hot to handle.

When watching Netflix?


PS4 "Pro".


This!

They ought to rename it, and actually sell a professional laptop.


"Pro" is a pretty broad term. I suspect the MacBook Pro is powerful enough for most professionals who need a computer to do their jobs. I use one professionally for software development, and it suits my needs just fine.

Sure - there are some professions out there that require more power than a MacBook Pro can offer. But is it really fair to say it's not "Pro" just because it can't meet the needs of 100% of all working professionals?


Is it really fair to say it's not "Pro" just because it doesn't deliver on the stock rated performance of the hardware it's spec'ed with? Yes, that's fair.


> But is it really fair to say it's not "Pro" just because it can't meet the needs of 100% of all working professionals?

Apparently it cannot even meet the needs of its own spec sheet. It gets throttled back below its base clock speed according to the video. That's the real issue.

Apple claims this is "More Pro" but it turns out to be just more of the same from Apple lately. One issue after another.


That would require a complete and total culture change. As it is, and I know Apple has a war chest, the clock is ticking on their survival.


Yes, they can only survive so long with their revenues shrinking like they have been...


Nothing lasts forever. The bigger you are, the harder you'll fall. The mindset and innovation of Apple products have been hinting at a sea change for years; these things don't happen overnight.


So... they discovered that there is a reason desktop computers have a large heatsink with a fan many times larger attached... Don't buy a laptop and expect its sustained performance to be anywhere near that of a proper desktop computer. It simply won't be.


That's not really it at all. His point is that this laptop is significantly worse than other laptops in terms of throttling. To the point where the i9 renders slower than the i7 in his testing. That's pretty damning for a $300 upgrade.


Not just desktops. A few years ago we had thick laptops with good cooling, too.

Unlike newer thin laptops, they were, and still are, viable desktop replacements. Some of them even allowed upgrading the CPU. I once inserted an i7-3612QM in the 13” HP ProBook 4340s of a family member (it had an i3-3110M initially), also replacing the fan assembly. It runs well under sustained loads, and the performance with the i7 is very close to much newer 15W mobile CPUs such as the i7-8550U found in the current 13” MacBook Pros.

It’s a shame we no longer have these, not Apple, not Windows, nobody.


Imagine someone selling you a 4K monitor (for 50% more than other 4K monitors) without ever mentioning that you can only actually have 4K for 5 min max, then it falls to QHD.


I bought the MacBook Pro 2017 13" without touch bar with an i7. I'm pretty happy with it, but the only thing that bothers me is that it gets hot. Hot as hell. Skype sharing the screen plus JetBrains CLion and you get a beautiful roaster.


FYI, I went through the same experience as you and returned it after discovering the non-touchbar model has one less fan. Apple clearly hates the customers who don't want to buy the touch bar model.


The non-touchbar laptop uses a lower-TDP part (15W vs 27W or something).


Too late. It's been a while since I have it.

The temperature is something that bothers me but is usable, I guess.


I have a 2016 Touchbar 13", and doing anything in Eclipse gets it hot. I also have a 2016 15" that gets hot once you plug a monitor in. Not unusable or super hot, but just hot.


Anybody knows if Apple's ARM chips (iPhone) have better thermal efficiency (watts/GFLOPs) than the chips in this latest gen?

If this is the case, I think it's just a matter of time for Apple replacing Intel with their own custom ARM chips.


You can't really compare ARM and x86_64 chips. I'm going to use a crude example. Think of the ARM chip being one of those USPS Trucks (https://upload.wikimedia.org/wikipedia/commons/3/37/Small_US...) while the x86_64 is a F150 pickup.

Both can drive around and transport things, but the USPS truck is purpose-built to deliver mail, stop and start frequently, and be reliable. It does a few things well. On the other hand, the F150 is widely produced, can be customized in many different ways, and is a general-purpose vehicle. Could it do the job of the USPS truck? Sure, but it would be slower, cost more to run, and be less efficient. Could the USPS truck haul a trailer? Maybe, but it wouldn't be very good at it. Both are trucks, but they are designed to do different things and excel at different things.

That crude comparison is ARM vs x86_64. ARM is getting there, and for the specific ARM functions it's more performant than the equivalent x86_64 functions. But once you go outside of what the ARM chip has been specifically designed for, your performance takes a massive hit. Will ARM be able to take off in a Macbook? Maybe in a few years, but Apple would need to dump a massive amount of R&D into their designs before I would think they would consider it (but who knows?)


What? They're CPUs. They're designed to run instructions. Intel's chips spend more energy on the really fancy stuff like huge reorder buffers and extensive speculation. They get higher IPC at lower MIPS/watt. An i7 isn't going to run iOS more slowly than whatever Apple calls their latest ARM SoC. It _is_ going to blow its lid without a big ole heatsink brick on there and a fan too, which is why smartphones use ARM cores designed for efficiency.

P.S. if anything, ARM is the "general purpose vehicle" in your analogy. x64 is all about high throughput and high power on a chip you buy and plug in. ARM is the one where you can get an architecture license and customize the shit out of it.


Not all instructions (or instruction sets) are equivalent.


They're not that different, either. It's been a long time since high-performance cores were executing the instructions as fetched and not translating into µops. The stuff Intel does on their chips works on ARM too. People are doing it, and it's working[0]. That makes a hell of a lot more difference on throughput than what ISA is implemented.

[0] https://fuse.wikichip.org/news/1316/a-look-at-caviums-new-hi...


Actually the x86 instruction format is key to Intel's performance dominance. It happens to be rather dense in terms of instruction per byte, and it turns out that instruction cache hit rate (and iTLB hit rate) are extremely important to high performance. Rather than the naive view that x86 is somehow saddled with its ancient and ugly instruction set, it's actually all of the RISC-like architectures that turn out to be hampered by their overly beautiful instructions.
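As a back-of-envelope illustration of the density argument: the x86-64 average instruction length below is an assumption (real averages vary considerably by workload), while AArch64 instructions are a fixed 4 bytes.

```python
# Denser encodings fit more instructions into the same L1 instruction
# cache, which tends to raise the hit rate on large working sets.

ICACHE_BYTES = 32 * 1024  # a typical L1i size

for isa, avg_len in [("x86-64 (~3.7 B/insn, assumed)", 3.7),
                     ("AArch64 (4 B/insn, fixed)", 4.0)]:
    print(f"{isa}: ~{ICACHE_BYTES / avg_len:,.0f} instructions resident")
```

The gap is modest on these numbers; the argument hinges on the x86 average being materially shorter than 4 bytes for the hot paths in real code.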


The other way to think about it is like the v7 thumb modes in comparison to the original 32-bit predicated instruction set. X86 has a lot of less than ideal instructions, but they are uncommonly used. So the core instruction set is quite dense and high performance.

Intel is learning (or knows, depending on your perspective, see goldmont) how to build highly efficient cores too, but like ARM haven't quite figured out how to build a super high performance one that is crazy efficient. ARM's continue to be quite efficient but not particularly performant, while intel's continue to be quite peformant but not particularly efficient.

Either way, the trend is pretty clear at this point more power dissipation=more performance.


Most memory is data.

Code density is important, but you can just double the icache. Given that icache is often smaller than dcache, it seems that's not the major bottleneck.

Also, if code density is so important, why did arm drop Thumb when they switched to 64bit?


There is nothing inherently energy efficient about x86 or ARM instructions sets, the silicon merely optimizes for different performance.

[ https://ieeexplore.ieee.org/abstract/document/6522302/ ]


> They're designed to run instructions

ARM can do a small subset of what x86_64 can do. Let's say for a minute ARM doesn't have a built-in H.264 decoder (they both do, but that's the one feature that popped into my head).

The x86_64 chip will be able to execute the decoding of an H.264 file faster and more efficiently than the ARM chip, because the x86_64 has purpose-designed instructions to do this or make the job easier.

Another Example. x86_64 can multiply and divide while ARM can't. Both chips can do the same work, but x86_64 can multiply 20*4 in one instruction while the ARM chip has to do 20 + 20 + 20 + 20, with each one being an addition instruction. ARM might be able to add faster than x86_64, but x86_64 will still be able to multiply faster. That's extremely simple, but when you get into more complex operations ARM has to spend more time doing what x86_64 can do.

> An i7 isn't going to run iOS more slowly than whatever Apple calls their latest ARM SoC.

An i7 will run iOS slowly because iOS is built specifically for ARM. Just like how PS3 emulators take a lot of effort to run on x86_64 chips, there is an emulation overhead.

> It _is_ going to blow its lid without a big ole heatsink brick on there and a fan too, which is why smartphones use ARM cores designed for efficiency.

Yeah, because it's designed to run at a higher thermal envelope. To get ARM anywhere near the Floating Point performance of an i7 you will need to increase the power and throw some active cooling on it.

> P.S. if anything, ARM is the "general purpose vehicle" in your analogy.

I don't think so, as seen above and below.

> x64 is all about high throughput and high power on a chip you buy and plug in.

Yes, because OSs have been designed to use as much as the chip can offer. But I can get a 5 watt Intel CPU that will run circles around a 5 watt ARM chip.

> ARM is the one where you can get an architecture license and customize the shit out of it.

Yes, so you design it to fit your specific functions. Apple's CPUs only implement what the engineers need, so they can save die space by not implementing unused instructions. You are proving my example here. ARM is the customized, purpose-built USPS truck, while x86_64 is a general-purpose F150. The USPS builds its trucks to fit its needs, just as ARM chips are built to meet the needs of their customers.


This isn't a problem solely with the MacBook Pro - it affects most of the "ultralight" laptops these days.

My XPS 13 (lovely machine, thermal issues aside) has been RMA'd 3 times because it overheated. It still gets far too hot and throttles itself so I've had to underclock it and turn off "turbo" mode. Other users have mentioned that simply repasting the CPU has caused massive improvements in temps, but I'm personally not comfortable with messing around inside a $2000 laptop.

Users shouldn't have to cripple their hardware or actually fix components to make it work properly.


I'm only comfortable doing that after I've got at least 3 years of value out of it. My current laptop is an aging Zenbook Pro where I've replaced the custom SSD with an adapter plus an m.2 board.

So maybe if you still want to use the laptop after a few years you can open it up and give it a bit of a facelift. Maybe applying a whole bunch of thermal paste will give it a slightly new lease of life?


Absolutely - once it is out of warranty and starting to lose performance I'm happy to mess around. But less so just a few months after I've bought it!


Which generation of XPS 13 is that?


9360 so the previous generation. Made the mistake of going with the higher res screen - apparently they have more issues.


I’m not surprised. I’m hitting 90C avg, 95% CPU, fan maxed, 25C room temp, running W10 on a 2016 15in MBP.


That... seems way too high? Are your vents clogged?


This is pretty normal on laptops, my 6700HQ can go up to 95°C. These chips are rated for up to 100°C (TJunction).


The heat off that MBP reminds me of how hot my early 2011 15" MBP got. I suspect that the heat was the culprit for the GPU failures that gave the 2011 model its notoriety.


I thought that was gpu manufacturer issue? I recall Nvidia specifically had a lot of issues across many oems.


You are right, NVIDIA replaced the die for many OEMs, at the time my Vaio was affected and out of warranty, but Sony was nice enough to replace the motherboard for me and the laptop still works..roughly 10 years later..


Hard to say. Some people got temporary fixes by baking their logic boards and/or reballing the GPU.

IIRC the "recall" involved replacing the logic boards with refurbs having the same GPUs. The CPUs on the boards seemed to be throttled after the replacement.


It was likely a bit of both, but there's one thing that can't be denied: Apple products are built for form over function.


Apple was by no means the only manufacturer to have trouble with those Nvidia parts; it was an industry-wide problem.


Macs don't even start running fans until you hit >80C.


I think the correct terminology is “don’t start running the fans at high (audible) speeds till >80C”.


Apparently they don't start at all until 80C here: https://www.youtube.com/watch?v=wgeh7ZJRhZU


Mine are running at 2k RPM ~40 C.


Mine runs at >6k RPM after 50-something Celsius.


It really depends on the model. My older MacBook Pros would idle at 2K RPM, while my new MacBook Pro can go all the way down to zero in the winter or with very few things open but generally stay at 1200 RPM during daily usage. All of these are silent, of course.


Depends on the model. The MacBook Escape (non-touchbar Pro) doesn't run fans at all at low temperatures. I actually thought it was broken when I first got it, but it's intentional behaviour.


It's probably because Toronto, where he lives, has been around 27 degrees and 70% humidity the last few days. That's around 7 degrees and 20% humidity above the typical office climate of around 20 degrees and 50% humidity. That difference can result in roughly 20% lower dissipation power if the temperature at the heatsink is around 50-60 degrees which it appears to have been historically.

That 20% extra dissipation power might be enough to get it from 2.2 to 3 GHz considering the base clocks are usually where processors have the most efficient marginal frequency scaling.

IMO, designing the thermal solution + chip combination to reach base clocks at typical office climates seems reasonably smart/defensible because most places will be cooler than that most of the year even before AC.
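The ~20% figure checks out as a back-of-envelope: the heat a heatsink can shed scales roughly linearly with its temperature delta to ambient (Newton's law of cooling). Taking 55 degC as the heatsink temperature (the midpoint of the 50-60 range mentioned above, an assumption):

```python
# Ratio of dissipation capacity in a 27 degC room vs a 20 degC office,
# assuming the heatsink holds at 55 degC in both cases.

heatsink = 55.0                 # degC, assumed
office, warm_room = 20.0, 27.0  # degC ambient temperatures

ratio = (heatsink - warm_room) / (heatsink - office)
print(f"dissipation at 27 degC ambient is {ratio:.0%} of that at 20 degC")
```

(55 − 27) / (55 − 20) = 28/35, i.e. 20% less heat removed at the warmer ambient, matching the estimate above.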


The i7 was tested in the same circumstances I assume. So it shouldn't matter how hot or humid it was. The comparison should be valid.


Yeah I'm not sure what's going on there, the cores on the 2017 had to be maintaining a sustained 3.7 GHz clock or 0.6 GHz boost for the highest end option.

Might just be silicon lottery, might be the GPU thermals are interfering or maybe the 2018 would run faster with 2 cores disabled.


What’s to say he didn’t perform the benchmarking in a room with AC?


I'm guessing here because he appears to be in a house/apartment and didn't show ambient temperatures, and because Toronto is surrounded by the Great Lakes, which should make it more temperate than Montreal and Quebec City, which recently experienced a spate of heat deaths and heat-induced early mortality due to lack of widespread AC adoption.


He wasn’t testing it outside in the sun, though …


But if his place has no AC, it's even worse than outside (in the shadow)


Notebookcheck also looked into it in their recent review: https://www.notebookcheck.net/Apple-MacBook-Pro-13-2018-Touc...


This is not a first for Apple.

I remember buying the 2006 Macbook with a CPU upgrade. A Core 2 Duo with 2GHz instead of 1.83. Due to heat, the CPU throttled under load and ended up with the same speed as the standard model.


Is this due to their focus on the iPhone and iPad? Probably; if I remember correctly, most of their hardware talent was pulled at some point to work on the things that bring in more money, so these products get less attention. Looks like the Mac line's upgrade cycles suffer because of that, don't quote me though...

But on the other hand, it's good that this came out; it should be fixed in the next upgrade, I would assume. No wonder we got a huge spec bump, people voiced their concerns loudly.


I think it was the software team. The macOS dev team was inexplicably merged with the iOS dev team, and that's when the quality of macOS really plummeted, with laughable security issues every other week.


Is this throttling in the chip itself? Or is it something that e.g. Linux (the kernel) needs to be taught specifically for make/model? Or some combination of the two? And is it possible that the in-chip limiter is more of an emergency limit, and hitting it regularly can result in long term consequences if the kernel doesn't do aggressive enough limiting?

I just see really different behavior with heat, fans, and kernel behavior between XNU vs Linux, and Windows vs Linux on the same hardware.


I am not sure why you are referring to Linux (the macOS kernel is XNU), but the firmware of the chip is in charge of controlling the CPU clock speed, and the only reason it is hitting this limiter is that the CPU does not have adequate cooling; nothing to do with the OS.


In the source video, it shows that the chip is downclocking to 2.2GHz under sustained load. There is nothing any software can do about that as that's hardware doing it.

The OS can govern how the fans spin up, but all CPUs will throttle down when they hit a predetermined temperature.


All three 15-inch Mid 2018 MBP configurations benchmarked are at the top of Geekbench for multi-core, only behind iMac Pro and Mac Pro models [1].

Here they are in isolation compared to the 2017 and 2016 15-inch models sorted by score descending overall:

    | Model                           | Configuration                            | Score |
    |---------------------------------|------------------------------------------|-------|
    | MacBook Pro (15-inch Mid 2018)  | Intel Core i9-8950HK @ 2.9 GHz (6 cores) | 22547 |
    | MacBook Pro (15-inch Mid 2018)  | Intel Core i7-8850H @ 2.6 GHz (6 cores)  | 21266 |
    | MacBook Pro (15-inch Mid 2018)  | Intel Core i7-8750H @ 2.2 GHz (6 cores)  | 21096 |
    | MacBook Pro (15-inch Mid 2017)  | Intel Core i7-7920HQ @ 3.1 GHz (4 cores) | 15550 |
    | MacBook Pro (15-inch Mid 2017)  | Intel Core i7-7820HQ @ 2.9 GHz (4 cores) | 15251 |
    | MacBook Pro (15-inch Mid 2017)  | Intel Core i7-7700HQ @ 2.8 GHz (4 cores) | 14374 |
    | MacBook Pro (15-inch Late 2016) | Intel Core i7-6920HQ @ 2.9 GHz (4 cores) | 14134 |
    | MacBook Pro (15-inch Late 2016) | Intel Core i7-6820HQ @ 2.7 GHz (4 cores) | 13706 |
    | MacBook Pro (15-inch Late 2016) | Intel Core i7-6700HQ @ 2.6 GHz (4 cores) | 13008 |

[1]: https://browser.geekbench.com/mac-benchmarks


The Macbook Pro has a great 100m sprint, but it can't complete a 1km race without crawling across the finish line.

Geekbench takes a few minutes to run. If you ran it back to back to back without letting the laptop cool down, you'd see different numbers.
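A sketch of such a back-to-back harness; the `geekbench4` CLI name is an assumption (Geekbench Pro ships a command-line runner), and any command or injected runner works:

```python
import subprocess
import time

def run_back_to_back(cmd, runs=3, cooldown_s=0, runner=subprocess.run):
    """Run `cmd` `runs` times, timing each run. With cooldown_s=0 the
    runs are back to back; if later runs take noticeably longer, the
    machine is likely thermal throttling under the sustained load."""
    durations = []
    for i in range(runs):
        start = time.monotonic()
        runner(cmd, check=True)
        durations.append(time.monotonic() - start)
        if cooldown_s and i < runs - 1:
            time.sleep(cooldown_s)  # optional rest to let the chip cool
    return durations

# e.g. run_back_to_back(["geekbench4", "--cpu"], runs=5)
```

Comparing the first run against the fifth is a crude but effective proxy for the sprint-versus-distance gap described above.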


Were the benchmarks run in a cooled room? Some years ago, as a student, I paid $$$ for a Toshiba with 4 cores and hyperthreading, and I could only run them well in the winter or in a very cool room. In the summer it would reach 100C and constantly throttle, so I do think that temperature is a key issue, esp. now with slimmer laptops.


Not sure on an individual basis but the Geekbench numbers are aggregate across all users of the machine.

I reached out to the video author to see if he'd be willing to run Geekbench on it in a freezer vs not.

https://twitter.com/kicksopenminds/status/101942058846774067...


Two suggestions were made:

1. Run GeekBench back to back (realistic: a prolonged workload... which would show the Mac as being worse)

2. Working in a freezer (completely unrealistic situation... which would make the Mac seem better).

Only one suggestion gets passed on to the video author. Science (Apple flavour).


I can't speak for everyone but I already run Geekbench multiple times when I use it and always have. I think most users do the same. The author is experienced in testing machines so I felt this point was redundant to state.


> 100C

Are you sure?? That sounds a bit...hot. Do you mean 100F?


My MBP will hit 99 C before it starts ramping up the fans or throttling the CPU to keep cool https://imgur.com/a/GFkMXfg


The insides of computers can frequently reach that range.


My thinkpad shuts down with a CPU temperature of about 100-105. It's currently idling (98% idle according to top) at 80C, with the CPU clocked down to 1199MHz. The chip is an i5 CPU M 560 @ 2.67GHz

The fan is broken, so it can reach shutdown temperature even when I have the "powersave" governor set (which I normally do).


So far the last several MacBooks have throttled after X minutes of rendering. The only real surprise is that the 2018 MacBooks ended up with worse rendering times. But it does make sense: the i9 is a hotter chip, and combined with the same thin design... you'll reach the threshold faster, which means it throttles sooner.


Problems with MBP unibody cooling are known for years... https://www.google.com/search?q=cooling+macbook+drilled&tbm=...


That seems to contradict a recent benchmark/post [1] which showed impressive CPU performance from the i9 MacBook Pro.

[1] http://hrtapps.com/blogs/20180712/


They really do test different things. In this case he is testing video renders from Premiere Pro, which renders for tens of minutes at 100%. It's apparently an outrageously inefficient app as well, as the Windows version renders the same output in a fraction of the time (which invalidates it as a test, really).

All modern chips thermally throttle. Run a heavy AVX2 workload and you'll often go below the so-called "base" frequency. Run one or two threads and it can dramatically "overclock". It is the nature of all current chips.

Eh. For the overwhelming majority of users I doubt this would ever be an issue. Compile a kernel. Apply a filter. Do some convolutions. Every normal thing will be ridiculously fast. Run a server or workstation style load and it won't be so good. That has been the case with laptops for time eternal.


BS! All modern chips do not thermally throttle!

Only ones in systems that have not been designed correctly!

The fact that the Aero x15 he tested, with the 2.2GHz base clock 6-core i7, can all-core boost to 3.1GHz in the same workload that brings the MacBook Pro far below its base clock speed proves that!

P.S. It does not matter what program it is running; throttling is a hardware flaw!


Depends on the workload and amount of time spent processing the workloads.

Dave Lee ran some tests that took a half hour to complete with the CPU pretty much maxed out. I would imagine the CPU was running hot for most of that time.


The only impressive one was single core. Otherwise it was in line with a laptop, and not a desktop.


When the heading said Youtuber, I thought for sure it would be Louis Rossmann


I don't know a lot about heat dissipation, but could the slimness of the MBP be a factor in this? I would think that after a certain point, you wouldn't be able to get good airflow in such a small space.


There are multiple factors, but slimness is probably not the only culprit.

I don't think Apple (or most other PC vendors) use the best thermal compound on their CPUs.

When my early 2011 MBP was having heat problems, I replaced the compound over the CPU and GPU and got noticeable although not spectacular reductions in heat.

Of course, that's not something you can do on a "modern" Macbook Pro, since they're not intended to be user serviceable any more.


They can use an okay-quality thermal joint compound if it is applied well. My Dell's thermal joint compound looked like someone was trying to make a PBJ sandwich with the stuff. Applied too thickly, it becomes an insulator. Macs used to have this problem too. I repasted my Dell and got a 5°F reduction at load right away.

The problem with all high-end mobile CPUs is that they are given a tiny heatsink and heatpipes which are simply not up to the job. They quickly become heat-soaked, the fans are unable to shed the heat, and the units throttle.


The i9 CPU is a K part, so I would think one could conceivably undervolt/underclock it to change the machine's thermals to their liking. Is there such a tool for Mac OS X?


You can actually undervolt and underclock any Intel CPU; unlocked ones are just also unlocked in the "up" direction.
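For intuition on why undervolting tames thermals at all: dynamic CPU power scales roughly with the square of core voltage (P ≈ C·V²·f), so even a small negative voltage offset buys a disproportionate heat reduction. A minimal first-order sketch of that relationship (illustrative only; real chips add leakage and other static terms):

```python
# First-order model: dynamic power P = C * V^2 * f, so at a fixed clock
# an undervolt of fraction x leaves (1 - x)^2 of the stock dynamic power.
# Illustrative only; leakage and other static terms are ignored.

def relative_dynamic_power(undervolt_fraction):
    """Dynamic power relative to stock, at the same frequency."""
    return (1.0 - undervolt_fraction) ** 2

# e.g. a -100 mV offset on a ~1.0 V core is roughly a 10% undervolt:
for label, frac in [("stock", 0.00), ("-50 mV (~5%)", 0.05), ("-100 mV (~10%)", 0.10)]:
    print(f"{label:>15}: {relative_dynamic_power(frac):.0%} of stock dynamic power")
```

So a ~10% undervolt shaves roughly 19% off dynamic power before the chip ever has to drop clocks, which is why undervolting is a popular fix on thermally constrained laptops.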


We're talking about a laptop here, which most definitely doesn't contain an overclockable i9 K processor.


It is according to Intel (otherwise it would not be labeled as i9-8950HK): https://www.intel.com.au/content/www/au/en/gaming/overclocki...


And you would be wrong there. There are mobile K SKUs that can be overclocked.


What’s a K part?


The Intel processors that have the suffix “K” are unlocked, and can be under or overclocked: https://www.intel.com.au/content/www/au/en/gaming/overclocki...


What an embarrassment. This particular generation of MacBook Pro can't be abandoned fast enough. Flush the whole thing and start over.


Apple it seems is happy to be known for iPhones and iPads but for many people Apple is really about a rich history in computing, computers, good engineering, design and well built products.

Now especially with the mac business they are reducing the scope and value of their brand, and becoming purveyors of high value trinkets. This is nothing short of a tragedy.

There is a difference between a small premium for good engineering, components choice and design, and profiteering to the point of becoming an extremely overpriced luxury product with more brand than real value.


The biggest shock for me with these laptops came when I tried to do some light gaming. I'm not talking about recent titles either; my 2017 MBP gets brought to the brink by KSP at minimum settings with no mods. I understand that I would be quite rightly criticised for trying to run something like Fortnite, but we're talking about a game with roots in 2013.

The hardware they put inside the later models has no hope in hell of running normally with all the thermal issues.


KSP can lag on my desktop with a 1080ti. Don't feel too bad.


KSP is largely CPU-intensive; the GPU barely matters. (You can run KSP on maximum settings on an RX460 very comfortably with low part counts.)


KSP is CPU-constrained and IIRC uses only one thread because of Unity 3D.

Try fewer boosters :)


More struts!


Not surprised given how thin the thing is.


Is this news? MacBooks aren't particularly known for their excellent thermal design...


The Retina model is pretty good, actually. A huge improvement over the original unibody. And the Touchbar models are even better (when comparing 2015 vs 2016)

https://images.anandtech.com/reviews/mac/retinaMacBookPro/pe...


"...'this degree' of thermal throttling..." Very impressive!


Genuinely curious: I don't work in the space of CAD, video editing or any other field that people tend to use to justify the MacBook Pro's ludicrous pricetag that has some shiny numbers on the tech specs sheet. Do these really do anything a sufficiently equipped laptop from anyone else can't? Plenty of options with latest gen Intel and Nvidia 10xx series are out there for less insane prices. Granted, I don't see many with that much storage space and RAM but I'd love to hear how some of you have experienced the same workload on non-Apple devices and felt you should keep buying MacBook.


> justify the MacBook Pro's ludicrous pricetag that has some shiny numbers on the tech specs sheet

If you think people buy MacBook Pros because of their tech specs, you are missing the point. MacBooks are by far the best general purpose, well-rounded laptops on the market from a build quality perspective. Everything from the screen to the touchpad to sleep/hibernate (and, until recently, the keyboard) is finely tuned to the point that you can't find another laptop on the market that just feels anywhere as nice as a whole. The tight integration between the hardware and the software doesn't hurt, either.

If all you need is a powerful laptop, or if you don't care about any of those details, you are probably better off getting a more cost effective machine, but you will always sacrifice some combination of things for it. (For me personally, the biggest one is the touchpad.)

I say this as someone who doesn't own any other Apple products, but I will likely never buy a laptop other than a MacBook Pro.


I have a handful of laptops but by far my most reliable have been my macbook pros. I'm writing this on a mid-2010 mbp that's run like a champ since day one. I also have a 2014 mbp for work, but I really like the 2010. It's not a skinny ultra-portable but it's not a tank either. It only runs hot when I have it hooked up to a large display. Eight years with no notable problems for a daily driver is pretty good in my eyes.

I agree completely on the touchpad. I picked up an XPS 13 that I'd intended to use as a replacement. The touchpad is as good as I've had on a non-Apple product, but it's still not on par with Apple's. On the other hand, I've started playing with OpenBSD with ratpoison & qutebrowser, and it's pretty remarkable how far you can get without a touchpad at all.

A couple years ago I wouldn't have considered anything but an mbp, but with the recent touchbar and butterfly keyboard shenanigans I would absolutely consider something other than an mbp. I hope Apple comes to recognize that reliability is more important than super skinny unrepairable machines.


I appreciate your comment and sentiment, and agree to some extent about previous MBPs, but saying that "it only gets hot when attached to a large display" is quite shocking... I normally use 3 external screens for work on my Dell XPS (previously a Thinkpad), and don't understand why a powerful laptop would get hot just by outputting to displays.


In general, I would agree that Macbook Pros are very reliable.

Having said that, it's not like Apple hasn't had their share of lemons. I was unfortunate enough to have an early 2011 15" Macbook Pro. Without getting into too much detail, it's a known lemon that has been discussed here on HN many times. My 2011 MBP was the first laptop I've ever had that actually stopped working. FWIW, most of my other laptops were PC laptops that didn't cost nearly as much as the MBP.


Seconding the touchpad. If I'm spending eight or so hours a day using a computer you can bet I'm going to get the one that feels best. CPU performance matters, but I'm not running at 100% CPU all day. I am clicking and typing all day, though.


Have you taken the Surface Book 2 for a spin? Pricey, but gives the MBP a good run for its money, and does a couple tricks the MBP cannot, such as having an excellent touch/pen surface for a screen. Real ports, including SD card. The outstanding downside is, of course, the abominable Windows 10.


How often do you rely on the touch screen? I know it seems like a naive question, but I have an older surface pro and I rarely used the touch screen. The machine itself was bulkier than an iPad or Nexus9, so I tended to use those instead for tooling around and mindless consumption. The surface pro was pretty much useless for actual work unless I had a second monitor, actual keyboard, and trackpad.

I do native Windows dev for a living, but I have no urge to pick up a machine with a touch screen unless it's a tablet. Is a touch pen/surface actually useful in day to day work?


I use the touch screen all the time. They made it very easy to do one important thing: they let you take a screenshot by pressing the pen eraser twice, and immediately begin annotating it by drawing right on top of it. For doing things like UI prototypes this is a huge boost. Someone sends me a test build or concept, and I can leap right into graphical annotation. Why waste time typing when unambiguous visual communication is this easy?

As a pen input device it's at least as good as my Intuos2 which it replaced, plus you get to draw right on the screen. (I don't like the side button though, it requires too much force to press.)


The surface pro seemed designed to be a tablet that could also be your computer. The surface book is a laptop that can also be a tablet. This difference is really important for interaction. That said I feel like the digitizer is only really useful for taking notes that contain diagrams, or things like digital art. I also barely use my surface book, despite thinking it's one of the best alternatives to my beloved 2015 15" MBP.


I have a touchscreen on my Dell E7470. I use it daily and I wouldn't want to miss it. It's perfect for things like web browsing, navigating diagrams, and presenting. It's not useful for actual coding itself, only for supporting activities. I only use the laptop monitor itself; as soon as an actual monitor becomes the primary one, the usefulness of a touchscreen declines massively.


I have, and to give Microsoft credit, it's the closest thing to a MBP I've ever used. That said, it still doesn't just quite feel as polished (the touchpad isn't as accurate, for one). But if I had to use a non-Apple laptop, I would probably go for a Surface Book.


I do some Photoshop and Lightroom work, but generally don't need an overly powerful laptop. I used to detest OS 9 or whatever Apple had before OS X and stuck with Windows for some time. But after switching, and when I last upgraded, price was no issue. There might've been a powerful $500 Windows laptop, and I would have ignored it. The MacBook Pro has simple, clean lines. It works well. The OS stays out of my way (I shrink and hide the Dock and don't use many of the other features - Exposé, etc. - can't even remember what they're called). I can't trust that to be the same for Windows-based laptops or other hardware.

If it costs $3-5k and I use it for hours and hours each day, years on end, it's peanuts as a professional tool. I'd rather pay and enjoy using it than take a discount but be frustrated each day.


OS X, really. The loss of productivity from the little stuff in Windows just isn't manageable for me. OS X has problems, but I'm used to it and familiar with it, and it works as expected.


I agree with the loss of productivity in Windows. For me it started with Win7, really took hold with Win8/server 2012 "metro", and only got a little better in Win10.

I don't understand why there are two significantly different ways to do simple things like connect to a network or configure an existing connection. The OS throws the pretty/flat metro inspired applet at you, but it has 10% of the actual functionality you're looking for, so you end up popping open the control panel anyway. The start menu ads are an insane addition. The apps in the app store are pretty much pointless and anything you actually need you can install via chocolatey.

If MS were to produce an actual "pro" version of Windows that removed all of the fluffy inconsistent stuff, ripped out all vestiges of Win8, and put PowerShell front and center, it would appeal to a lot of people that just want to get stuff done.


OSX, as sibling pointed out, but let me be much more specific:

- iMessage

- excellent integration between Contacts, Maps, iMessage, Mail, and Spotlight

- Mail.app, the best desktop mail client (IMO)

- Siri > Cortana > Whatever tarball you are expected to run autoconf inside of to get your linux box to act like a mac from the 90s

- excellent HiDPI support (neither Windows nor Linux have figured this out yet)

- plug and play external display support

- configuration-free lid switch sanity

- configuration-free trackpad sanity

- Lack of vendor adware (Windows obvs, did Ubuntu ever remove those Amazon search things?)

- painless FDE (extremely important)

- good fonts

- iOS development possible

- the most common modifier key is where my thumb can reach it and is not overloaded with the unix modifier key (is ctrl-c copy or SIGINT?)

- copy and paste work between apps using consistent keystrokes (windows figured this one out, linux not so much)

- less malware than windows

- an order of magnitude better a11y than anyone else in the history of OSes (it’s literally not even a race anymore, Apple is so far ahead here)

Finally, in not-OSX:

- build quality, even with keyboard and thermal issues, is still simply a head and shoulders higher than everyone else

The only people who could reasonably challenge Apple’s level of fit and finish from an OS perspective are Google, which is why I carry an rMBP and the 16gb Pixelbook. ChromeOS these days is a fucking marvel, up there on quality par with google search before answer cards, google assistant, and google maps. It’s a major achievement and is the first actually workable/usable Linux desktop.


Genuinely can someone explain why this is downvoted? The answer says “the software” which people constantly discount - just look at the cycle predicting Nintendo’s death every few years.


Comsol and sometime CAD user here, on a 2015 MacBook Pro, the latest of many Apple laptops for me.

My workflow runs faster on many laptops outside the walled garden. But I've gotten used to an exceptional build quality that's reliable under hard use, and OSX lets me think & work more like I want than other OSes. Also, my customer experience with Apple has been outstanding for decades.

For me, Apple has a mix of quality and aesthetics that I like better than what I've experienced from other manufacturers. It's been worth the price premium for me.

That said, I'm avoiding the butterfly keyboards until they really get them figured out.


In general, unless you are running a long computation (in CAD, that could be a stress analysis; in graphic design that could be rendering a frame) you won’t notice any difference in snappiness of workflow.

I run Autodesk Inventor 2018 on a bootcamped mid-2017 MacBook 12", which has a built-in Intel video card (650 I believe). The only time I wish I had more processing power is when doing huge FEA analyses: my 15W processor goes into a down-clocked limp mode, and computations take 4X longer than on a latest-gen desktop.


> Do these really do anything a sufficiently equipped laptop from anyone else can't?

Yes, mac touchpads are light years ahead of non mac touchpads.


Yep, it's just a million little details that make it such quality. The awesome UI of the OS. The trackpad is the best trackpad I've ever used. Every other trackpad feels clumsy after using one. The fact that it's a TRUE linux machine not just retrofitted Windows linux. The fact that everything just works. The chance of getting a virus is incredibly small. The time saved in troubleshooting is worth the price. Operating system upgrades are like 30 bucks vs the 1-200 dollars of every new windows release. Also, the build quality is the best. I have a macbook that has lasted 10 years and still works great. Never had a windows machine do that. So the quality, UI, attention to detail, upgrades, and more make it more than worth the upfront price tag in hassle saved down the line.


> Operating system upgrades are like 30 bucks vs the 1-200 dollars of every new windows release.

When was the last time you upgraded macOS? It's been free for several years. :)

Also, to be fair to Microsoft, Windows 10 is supposed to be the last version of Windows, so you should only have to buy it once and never upgrade again.


The fact that it's a TRUE linux machine

To be pedantic, a GNU/Darwin/Mach/BSD machine.


Yessir. A flavor of Linux but Linux nonetheless.


A flavour of UNIX. BSD (and OS X) evolved mostly separately from GNU or Linux, with some limited cross-over. Even basic CLI tools have subtle differences. The only Linus Torvalds code in a Mac would be git.


Sorry, you're right, my bad. I was thinking Unix but kept typing Linux for some reason.

A more accurate statement is that it's a true POSIX-compliant, Unix-based system down to the metal.


The "retrofitted Windows linux" is more Linuxy than the Mac (and both suck compared to an actual Linux OS).


Try interfacing in depth with your hardware using the Windows Linux.


> Also, the build quality is the best. I have a macbook that has lasted 10 years and still works great

I have a Mid-2010 MBP that is a large and ugly paperweight.

Due to shonky quality control, it hard-locks every 5 minutes when using the GPU. This only started when Mountain Lion was released, but good luck rolling back if you want to actually use Xcode.

Given that it was such a piece of garbage, and that (at the time) workers were jumping to their deaths rather than keep building Apple products, and more recently the crap with sneaky performance reductions on iPhone, I have sworn completely off Apple products.


I was a systems engineer in an enterprise environment with over a thousand users and about a 50/50 Mac/PC split. You can guess which were fewer headaches from a hardware perspective. And by fewer headaches I mean virtually non-existent hardware failures.

Turning them into Enterprise friendly machines from a software perspective was a different story.


You know, I'm so beat down by the state of apple laptops, I don't even care. I was all ready to spring for a new machine the day these dropped. Until I found out that they max the 13" model at 16 gigs of ram. I'm so pissed. I don't want the 15". That's what external monitors are for. I like the portable to be fucking portable. Jesus. I give up.


Who makes a 13" laptop with 32 GB RAM at ThinkPad build quality, or that of the better Asus models? That much RAM in such a small form factor, regardless of thickness, also seems a hefty thermal challenge, but if someone out there is doing it, I'd like to take a look. I swung by the Lenovo and Dell sites, and only the Dell 14" will let you go up to 32 GB; I've yet to see a 13" model.


Why do you need 32 GB of ram?


Not the op, but I need 32 GB for:-

- running virtual machines.

- running MemSQL which requires a minimum of 8GB to be available.

- running video editing software.

There are probably other reasons, but those are the ones off the top of my head.


Similar to VMs, I run a lot of docker containers, mostly with JVMs in them.

I bought an Intel Hades Canyon with 32GB to run Linux on after the new MBPs didn't turn up at WWDC. Feeling somewhat relieved right now.


This problem is easily solved by just having an external cooling solution like a USB fan tray that sits under the MacBook, used only when you are doing some heavy tasks.

In normal day to day use most people will not do enough processing to be affected by thermal issues.

Keeping the form factor of a MacBook is much more important than eliminating every thermal edge case. People take for granted how much we’ve improved on old big and heavy laptops (with worse performance even).


This problem is easily solved by just having an external cooling solution like a USB fan tray that sits under the MacBook

So, making it an inch or two thicker?


Louis Rossmann made a video[1] a while ago showing how badly optimized the cooling on new MacBooks is. In short: Apple prefers their MacBook CPU and GPU to run hot and thermally throttle so that the user is happy because the machine is quiet, and then, more importantly, Apple is also happy, because when it overheats often it has a shorter lifespan so you’ll have to buy a new MacBook after 2-3 years.

https://www.youtube.com/watch?v=wgeh7ZJRhZU

edit: Bear in mind these videos are about new MacBooks so there's no way to predict their usual lifespan. And I'm not saying all of them die after 2-3 years. Apple is simply optimizing for the 90% of users here, people doing mostly web browsing and using Microsoft Office. This does not put a sustained load on the CPU/GPU so these users will be fine.


>and then, more importantly, Apple is also happy, because when it overheats often it has a shorter lifespan so you’ll have to buy a new MacBook after 2-3 years.

Conspiracy theory BS. MacBooks don't die after "2-3 years", and they retain much better resale value than Wintel laptops too.

That said, Apple could very well prioritise quiet over full-speed.


> more importantly, Apple is also happy, because when it overheats often it has a shorter lifespan so you’ll have to buy a new MacBook after 2-3 years.

And yet, MacBooks tend to be extremely reliable for many years.


I switched from a MacBook Air to a similarly spec’ed (4260U vs 5200U) ThinkPad a couple of months ago, and noticed a vast performance difference when playing games. The processor gets up to 85°C but doesn’t throttle. It’s either that or Linux is a lot better for gaming than macOS.


Running your CPU hotter doesn't significantly affect the lifespan of your device, even over longer sustained periods, unless there's some glaring design defect, as happened to Nvidia.

So that's actually pretty well optimized.


> Running your CPU hotter doesn't significantly affect the lifespan of your device

Maybe, but I've seen reliability formulas with fifth powers of temperature, namely for equivalent ageing of devices.

And 14nm fabrication processes don't exactly improve my expectations of durability ...
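For a concrete version of that "equivalent ageing" idea, the classic Arrhenius acceleration factor is the usual starting point (the fifth-power models mentioned above are stronger still). A sketch with an assumed activation energy of 0.7 eV, purely for illustration:

```python
# Equivalent-ageing sketch: the Arrhenius acceleration factor,
#   AF = exp( (Ea/k) * (1/T_ref - 1/T_hot) )
# with temperatures in Kelvin. Ea = 0.7 eV is an assumed, illustrative
# value; real failure mechanisms each have their own activation energy.

import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(t_ref_c, t_hot_c, ea_ev=0.7):
    """How many times faster a part ages at t_hot_c vs t_ref_c (degC)."""
    t_ref = t_ref_c + 273.15
    t_hot = t_hot_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_ref - 1.0 / t_hot))

# Running a die at 95C instead of 75C:
print(f"AF = {acceleration_factor(75, 95):.1f}x faster equivalent ageing")
```

Under these assumed numbers, a die held at 95°C ages several times faster than one at 75°C; the exact factor is very sensitive to the activation energy chosen.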


It helps if you know that 14nm fabrication isn't really 14nm. The biggest culprits are the electrolytic capacitors and the LiPo batteries, but the batteries are far away and the other components are cooled by the air headed into the fan, so neither will actually reach anywhere near the temperature of the CPU. Apple also doesn't skimp on components like these, especially when it's a core and obvious part of optimizing for noise, so I wouldn't expect it to have an appreciable impact on the lifespan of the device.


This madness is something Apple can only blame themselves for.

It was Steve Jobs' ploy to kill the laptop market. He didn't need the laptop market; he had the iPhone, and he knew of the iPad when he demoed the MacBook Air. Today the entire Mac business https://www.statista.com/statistics/382260/segments-share-re... is below 10% of Apple's revenue.

See, he started the thin craze and everyone copied. But no one needs a thin laptop. There's no advantage to it whatsoever. People do need lightweight laptops, and well before the MacBook Air we had shockingly lightweight ones; my Panasonic CF-Y5 was 1.5kg including a DVD-ROM more than ten years ago. It had a protrusion on the lid to create a cushion, which led one review to call "the exterior design of the machine's casing" "reminiscent of a Sherman tank cross-bred with a 1970s sports saloon". That made it 45mm thick at the thickest point. Still, it was 1.5kg and indestructible.

So, instead of focusing on materials science and building lightweight laptops at whatever thickness a good cooling system would require, everyone is falling over themselves to produce a thin laptop. Which, of course, can't be properly cooled. It's just physics. In this case, the MacBook "Pro" is 0.61" high where the Lenovo P52 (which is Pro for real) is 0.96". This thinness craze also killed off the convenient bottom dock ports on business laptops, creating these awkward side docks.

Fantastic trick, Mr Jobs, well played.


> But no one needs a thin laptop. There's no advantage to it whatsoever.

I own an Air, and it's been my go-to machine in university. Yes, the weight does a lot to help me just pick it up and grab it, but the thinness helps a ton too. Thin means that I can throw it in a bag and still fit textbooks, notebooks, hoodies, and whatever else inside. I've had bags packed full and needed every inch of space. The Air handles that.

It's not a huge market, and I think only the Air models should focus on it, but it does exist. A bigger but equally light laptop wouldn't make nearly the difference that the Air does; how it fits matters, and the Air ends up being the smallest thing in many of my bags.


That's a serious tin-foil hat you got there.



I really doubt even Jobs thought his RDF extended to the entire laptop market.

He just had a great idea and Apple executed on it. The early MBA's (2010) were absolutely fantastic, quiet and reasonably powerful (read completely usable).

As an anecdote, new hires at my company are given either a somewhat heavy ThinkPad or a higher-end MBP. Many have complained about the weight of the ThinkPad. None complain when they choose the MBP.


This just shows Apple products are fashion items you can't do a technical comparison with. The ThinkPad T580 is 4.29lbs where the MacBook Pro 15" is 4.02lbs; I am sure the four ounces made a huge difference. If you wanted to compare it to the 5.4lbs P52, then may I point out the GTX 1060-level GPU, which is quite a bit faster than the MacBook Pro's? Not to mention the 128GB RAM capacity or the multiple M.2 SSD support. They are entirely different machines for different purposes.


Panasonic CF Y5, interesting machine with some uncommon features https://www.trustedreviews.com/reviews/panasonic-toughbook-c...


I don't know why there aren't docks on the market. The Henge USB-C dock has been pre-order for two years, so I just made my own in an afternoon by cutting a slot into a 4x4, and drilling a couple holes for a usb hub cable and a monitor+power cable.


When I bought my Lenovo P50 I was unpleasantly surprised by its weight and thickness after having used MacBooks before. However I did get over it pretty quickly and swapping out batteries so I can keep working is awesome. I do need the power.


I develop on a MacBook Pro for 12 hours a day. I have a bajillion Chrome tabs open, 2-3 IDEs, Sequel Pro, Photoshop, Slack/Skype, uTorrent, Spotify, and more...

My 2016 MacBook Pro (i7/16GB) doesn't even skip a beat.

Unless you're doing 3D graphics or machine learning or something... I think for 99.9999999% of POWER USERS a throttled i9 with 32 gigs of RAM is plenty for the year 2018.

This throttle issue in my opinion is such a minor quibble it feels like demonization of Mac.

You can focus on the very very very few negatives of Macbooks but in my opinion at the end of the day it's an incredible machine.


I think the main quibble is that the laptop can't even hold its base clock speed without throttling.


Under sustained load. That's the key part here.


Yeah but that's like buying a race car and being upset that it's throttled at 300 mph.

99.9999% of people will never go 300mph..unless you're using it in a very specialized way.


If you're spending $400 to upgrade from an i7 to an i9, you are expecting i9 performance. If the i9 throttles, you are not getting that performance.


But it's supposedly _slower_ than its predecessor in sustained situations like this. What's the point of having a "more powerful" 2018 machine over the 2017 if, when trying to use that power, you end up worse off?


For most users, sacrificing some sustained performance for the sake of higher burst performance is absolutely the right choice if such a tradeoff is possible. People who keep the CPU at 100% with a multi-threaded workload for minutes on end have a workload that requires a desktop form factor. For the other 97% of users, the CPU will spend far more time idle than fully-loaded.
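That tradeoff is easy to see with a toy model: a CPU that boosts until a fixed thermal budget is exhausted, then falls back to a sustained clock. All numbers below are made up for illustration:

```python
# Toy model of burst vs sustained performance: the CPU holds its boost
# clock until a thermal "budget" of seconds is spent, then drops to a
# sustained clock for the remainder of the run. Illustrative numbers only.

def effective_clock(load_seconds, boost_ghz=4.8, sustained_ghz=2.3,
                    boost_budget_s=30.0):
    """Average clock (GHz) over a fully-loaded run of the given length."""
    boosted = min(load_seconds, boost_budget_s)
    throttled = load_seconds - boosted
    return (boosted * boost_ghz + throttled * sustained_ghz) / load_seconds

for t in (5, 30, 300, 1800):
    print(f"{t:>5}s load: avg {effective_clock(t):.2f} GHz")
```

A 5-second burst sees pure boost clock, while a half-hour run converges on the sustained clock, which is why quick benchmarks and Dave Lee's long render test tell such different stories.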


No it's like buying a truck for your business and being upset when it can't tow its advertised load.


It's like buying a Chinook to tow a skateboard and then being upset that the Chinook can't go its full 200 mph.


Apple didn't advertise a guaranteed minimum time to sustain 2.9GHz at 100% load.

It's not much different than my iPhone shutting down after recording 4K/60 video for 2 minutes in the hot summer sun at 110°F (in fact, that's actually more frustrating than this MacBook "issue", but there was no outrage over that).


Yeah, but I don't think buying a race car and using it to race should be considered "using it in a very specialized way".


I think the better car analogy here would be buying an expensive luxury sports car and being disappointed that it couldn't keep pace with a real race car on the track.

No laptop can be an equal to a real workstation unless you don't actually need a workstation to begin with.


Or buying something like a Corvette Z06, taking it to the track, and realizing it goes into limp mode if you try to hot lap the thing as a decent driver... and this is a real thing, they do.



