Hi,
I am thinking of getting a Razer Blade Stealth 13 with the i7-1065G7 and maybe the dedicated Nvidia GPU. To save some money and battery life, I am only looking at the FHD version. After reading some reviews, the following statement struck me: the version without a dedicated GPU can power the CPU at 25 W, whereas the version with the dedicated GPU limits the CPU to 15 W, no matter whether the GPU is active or not. See for example (under "Processor").
Is this information really correct and still current? It looks like an artificial restriction to me; maybe a BIOS update fixed it?
If the CPU TDP were set the same as on the base model without a dGPU, it would definitely run hotter, since the GPU is the main heat source in a laptop. But you can disable the dGPU if you want to reach CPU performance similar to the base model's 25 watts. The Dell 2-in-1 and the Lenovo Yoga without a dGPU already run hot using the 25 W cTDP; take a look at their thermal reviews. I think Linus recently showed an HWiNFO64 sensor readout in the 90-degree range, which I believe you won't see on the 1065G7 in the Stealth. But most reviewers are hypnotized by the form factor first.
Joikansai
you can disable the dGPU if you want to reach CPU performance similar to the base model's 25 watts.
Has this been tested? This would be a game-changer for me. It would mean you could still get excellent eGPU performance as long as you disabled the 1650.
figrin1
Has this been tested? This would be a game-changer for me. It would mean you could still get excellent eGPU performance as long as you disabled the 1650.
Yes, I believe so. This other guy is using the base Stealth (the 256 GB storage model).
https://www.3dmark.com/compare/fs/21421626/fs/20766402#
figrin1
Has this been tested? This would be a game-changer for me. It would mean you could still get excellent eGPU performance as long as you disabled the 1650.
After I upgraded Synapse 3 to ver. 3.5 (Jan 14th, 2020), the bonus 25 W TDP while the 1650 is unused was gone. Before this update I could get around 740 points in Cinebench R15 on the dGPU model; now it is ~400. If you don't play games much, just don't buy the GTX model.
Wow, this is bad. Thanks for sharing this!
I am disappointed, this notebook could be so great.
Homurachyan
After I upgraded Synapse 3 to ver. 3.5 (Jan 14th, 2020), the bonus 25 W TDP while the 1650 is unused was gone. Before this update I could get around 740 points in Cinebench R15 on the dGPU model; now it is ~400. If you don't play games much, just don't buy the GTX model.
That's incredibly unfortunate. Definitely a solid reason for anyone with a Stealth not to update to 3.5.
Homurachyan
After I upgraded Synapse 3 to ver. 3.5 (Jan 14th, 2020), the bonus 25 W TDP while the 1650 is unused was gone. Before this update I could get around 740 points in Cinebench R15 on the dGPU model; now it is ~400. If you don't play games much, just don't buy the GTX model.
Did you try restarting the system? Once, even after I removed the power limit, the clock was stuck at 1.5 GHz, which I think resulted in a lower Cinebench score, around 500-ish. And did you try Gaming mode as well? 400 cb is just too low for that CPU.
thomas001le
Wow, this is bad. Thanks for sharing this!
I am disappointed, this notebook could be so great.
If you really want 25-watt performance and don't want the base model, there's a workaround, with a risk and higher CPU temperatures as the trade-off, of course. I can still get 850 cb on the 1650 model on the latest Synapse 3.5.
Joikansai
Did you try restarting the system? Once, even after I removed the power limit, the clock was stuck at 1.5 GHz, which I think resulted in a lower Cinebench score, around 500-ish. And did you try Gaming mode as well? 400 cb is just too low for that CPU.
If you really want 25-watt performance and don't want the base model, there's a workaround, with a risk and higher CPU temperatures as the trade-off, of course. I can still get 850 cb on the 1650 model on the latest Synapse 3.5.
I just ordered the 1650 model. Can you confirm that the workaround of disabling the dGPU in Device Manager still works and lets the CPU operate at 25 W instead of 15 W? I was planning to keep the dGPU disabled unless I'm running games, in order to get better CPU performance and battery life during normal use, as described in this thread on Reddit. If that workaround doesn't work anymore, it's a deal breaker for me.
soloAntiqueBrassdepot968
I just ordered the 1650 model. Can you confirm that the workaround of disabling the dGPU in Device Manager still works and lets the CPU operate at 25 W instead of 15 W? I was planning to keep the dGPU disabled unless I'm running games, in order to get better CPU performance and battery life during normal use, as described in this thread on Reddit. If that workaround doesn't work anymore, it's a deal breaker for me.
I'll look into that; first I have to set the BIOS values back to default. That Reddit thread is interesting, though for me modding the BIOS is faster and lets Optimus do its job.
How can I test whether the CPU is running at 15 W or 25 W? I'll see if this is true on my machine as well.
Ddoogg
How can I test whether the CPU is running at 15 W or 25 W? I'll see if this is true on my machine as well.
You can download Intel Power Gadget, which will show you the CPU power usage in real time. You can also test this by simply running a CPU-only benchmark like Cinebench R20 while the 1650 is enabled in Device Manager and the laptop is connected to the eGPU, then disabling the 1650 and running the test again. If the theory is correct, the score with the dGPU disabled should be higher than the first run.
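To make the comparison less of an eyeball judgment, here's a trivial sketch that reports the relative difference between the two runs. The function name and the scores are just illustrations (the Cinebench R15 numbers reported earlier in the thread); plug in your own two results from any CPU benchmark:

```python
def relative_gain(score_disabled: float, score_enabled: float) -> float:
    """Percentage gain of the dGPU-disabled run over the dGPU-enabled run."""
    return (score_disabled - score_enabled) / score_enabled * 100

# ~740 points reported at 25 W vs ~400 points at 15 W:
print(f"dGPU disabled is {relative_gain(740, 400):.0f}% faster")  # 85% faster
```

Anything in the single-digit percent range is probably run-to-run noise; the 15 W vs 25 W gap should be far larger than that.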
soloAntiqueBrassdepot968
You can download Intel Power Gadget, which will show you the CPU power usage in real time. You can also test this by simply running a CPU-only benchmark like Cinebench R20 while the 1650 is enabled in Device Manager and the laptop is connected to the eGPU, then disabling the 1650 and running the test again. If the theory is correct, the score with the dGPU disabled should be higher than the first run.
Sorry for the late answer. At defaults, disabling the dGPU (1650 Max-Q) still gives 25 watts on Synapse 3.5. But the score is still below what the power-limit-removal BIOS mod gives, due to lower frequencies (3.5 GHz all-core peak vs a 2.9 GHz average), though at defaults the temperatures are much better.
I've gotten better boost clocks from mine while running certain engineering software by disabling the GTX 1650 through Device Manager. Keep in mind, however, that the trick will not survive sleep or a reboot; you will need to enable and disable it again to get the boost. Not the most convenient, but it's a workaround.
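Since the toggle doesn't survive sleep or a reboot, one way to make it less painful (a sketch, not anything Razer supports) is to script it with Windows' built-in pnputil, whose /enable-device and /disable-device switches exist on Windows 10 2004 and later, and run the script at logon via Task Scheduler. The instance ID below is a placeholder; look up the real one with `pnputil /enum-devices /class Display`.

```python
import subprocess

def pnputil_cmd(instance_id: str, enable: bool) -> list[str]:
    """Build the pnputil command line that enables or disables a device."""
    action = "/enable-device" if enable else "/disable-device"
    return ["pnputil", action, instance_id]

def toggle_dgpu(instance_id: str, enable: bool) -> None:
    # Must be run from an elevated (administrator) prompt on Windows.
    subprocess.run(pnputil_cmd(instance_id, enable), check=True)

# Placeholder only -- find your GTX 1650's real instance ID first:
GTX1650_ID = r"PCI\EXAMPLE_VENDOR&EXAMPLE_DEVICE\EXAMPLE_INSTANCE"
# toggle_dgpu(GTX1650_ID, enable=False)  # re-disable after each boot/resume
```

Whether the 25 W limit actually comes back when the device is disabled this way (as opposed to via Device Manager) is something I haven't verified, so test it with a benchmark first.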
ShredBird
I've gotten better boost clocks from mine while running certain engineering software by disabling the GTX 1650 through Device Manager. Keep in mind, however, that the trick will not survive sleep or a reboot; you will need to enable and disable it again to get the boost. Not the most convenient, but it's a workaround.
Other users have mentioned that this no longer works after updating to Synapse 3.5. Curious what version of Synapse you're running?
I've finally resorted to completely disabling the power limits on my Stealth via the BIOS as some users have recommended elsewhere. It leaves a bad taste in my mouth because it's clearly not how processors are meant to operate, but I was tired of never being able to predict or control performance. Now I just leave it on Balanced in Synapse and Throttlestop controls power usage via temperature limits.
figrin1
Other users have mentioned that this no longer works after updating to Synapse 3.5. Curious what version of Synapse you're running?
I've finally resorted to completely disabling the power limits on my Stealth via the BIOS as some users have recommended elsewhere. It leaves a bad taste in my mouth because it's clearly not how processors are meant to operate, but I was tired of never being able to predict or control performance. Now I just leave it on Balanced in Synapse and Throttlestop controls power usage via temperature limits.
Ah, I missed that part. Yeah, the trick definitely worked when I first got the machine but I haven't had to do a CPU heavy export in a while. I'll have to check what version of Synapse I'm running and if it still works with certain programs.