Razer Blade Pro 2017 Low Battery / Running Hot

Discussion in 'The Linux Corner' started by jccantele49, Jun 13, 2017.

Thread Status:
Not open for further replies.
  1. Hi all,

    I have a 2017 Razer Blade Pro running Elementary OS with the 4.11 Linux kernel upgrade. It is running silky smooth, and I'm in love.

    However, the battery life is sitting around 2 hours, and the laptop is running very hot. Can anyone help me troubleshoot this?

    Looks like we've got the overheating issues under control by installing TLP and thermald:

    sudo add-apt-repository ppa:linrunner/tlp
    sudo apt-get update
    sudo apt-get install tlp tlp-rdw
    sudo apt-get install thermald


    Reboot after installing.

    Will report back on what effect this has on battery life.
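
    If you want to check that both services actually came up after the reboot, something like this should do it (assuming a systemd-based install, which Elementary OS is):

    sudo tlp-stat -s            # TLP's own status summary
    systemctl status thermald   # confirm the thermald daemon is running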
     
    Last edited by a moderator: Jul 7, 2017
    sceleus likes this.
  2. sceleus

    sceleus New Member

    After running the above commands plus powertop --calibrate and doing a full recharge, I'm still getting roughly 1-2 hours of battery life at most.
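
    For anyone repeating this, the powertop part was roughly the following (run it on battery; calibration takes a while and will blank/flicker the screen):

    sudo powertop --calibrate   # measure the hardware's idle power states
    sudo powertop --auto-tune   # apply all of powertop's suggested tunables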
     
  3. linuxrazer2_no_id

    linuxrazer2_no_id New Member

    I'm getting about 2-2.5 hours doing standard stuff on the Razer Blade Kaby Lake (not the Pro or Stealth; I have the middle 13/14" version).

    I'm currently running the nvidia driver all the time.

    nvidia-smi reports the GPU is using about 26W when I'm working with various windows open.
    nvidia-settings reports the GPU is scaling between perf levels 0-4 depending on what's going on.
    Perf level 4 seems to use about 26W, with the GPU at 1400MHz and memory at 8000MHz.
    Perf level 0 uses about 5W, with the GPU at 80-150MHz and memory transfer at ~1000MHz.
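
    If anyone wants to watch the same numbers, something like this should work (double-check against your driver version):

    # live power draw and clocks, refreshed every second
    nvidia-smi --query-gpu=power.draw,clocks.gr,clocks.mem --format=csv -l 1

    # current PowerMizer performance level
    nvidia-settings -q "[gpu:0]/GPUCurrentPerfLevel"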

    With a 50Wh battery, 26W is basically the entire thing gone in 2 hours of typing, which is crazy. But the good news is, it means there's possibly 2x more battery life available if I tweak it right.

    Now, I don't need a strong GPU to type, but the Nvidia driver seems to think I do.

    Two experiments I will be running to improve battery life:

    1. Switching to the Intel driver using the Nvidia tool (see the sketch after this list), which I suspect will make external monitors impossible?

    2. Unlocking the option to force the Nvidia card to stay in a lower performance mode.
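
    For experiment 1, I expect the actual switch to be something like this (assuming your distro ships Ubuntu's nvidia-prime package, which provides prime-select):

    sudo prime-select intel   # tell X to use the Intel iGPU
    sudo prime-select query   # confirm which GPU is currently selected
    # then reboot (or restart X) so the change takes effect
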
    ---

    Another bonus of controlling power consumption:

    No external battery pack seems to work for charging the Razer Blade or Razer Blade Pro.
    That's because the Blade needs 165W and the Pro needs even more.

    That's presumably so it can power the system at full draw and charge it at the same time.

    If I can get total consumption of system + charging under 100W then I can use the Omni20 external battery pack.
     
    jccantele49 likes this.
    I'm in exactly the same boat as @linuxrazer2. I've installed powertop and I'm running TLP with default settings, plus the little tweaks that powertop provides, but my power consumption is sitting around 40-50W doing nothing but simple browsing in Chrome. That gives me around 1 hour 50 minutes of battery life, which seems pretty ridiculous given that I'm getting about 5-6 hours on Windows with the same kind of activity.

    I'm also going to try setting up some graphics card switching with Bumblebee, as I only need the GPU for running terminal applications with CUDA and cuDNN.

    Let me know how you go with your experiments.

    So I tried to get Bumblebee working... it didn't happen; it was too complicated and buggy. So I tried switching to Intel graphics instead of discrete graphics via the PRIME settings in the NVIDIA X Server Settings app that comes with the drivers. I'm now running at 20-25W and getting a lot more battery life, maybe 3-4 hours at most. I just have to restart the computer when I need to use discrete graphics.

    Note that after changing your settings, the 'log out and log back in' approach to switching graphics cards (suggested by the NVIDIA settings app) doesn't work, and the machine freezes on startup. You have to actually restart your computer to avoid the freeze.
     
    Last edited by a moderator: Jun 21, 2017
    jccantele49 likes this.
    Running the Blade (not Pro/Stealth), I get an improvement to 7-9W draw on Intel, which equates to 8-10 hours of battery life.

    If only there was a way to switch without a reboot, or to load the driver purely for deep learning but not for display. I'll dig further.

    @sam_hains @jccantele49 @sceleus
    The GTX 1060 draws between 5-50W depending on what it's doing, and seems to hover around 35W for browsing/videos, which is insane.

    Here are some steps that are meant to allow you to set performance levels, but I can't seem to get them to work.
    http://z-issue.com/wp/nvidia-linux-...bits-performance-levels-and-gpu-fan-settings/
    https://askubuntu.com/questions/4662/where-is-the-x-org-config-file-how-do-i-configure-x-there
    https://devtalk.nvidia.com/default/topic/820497/-solved-coolbits-without-xorg-conf-/

    Let me know if you get anywhere with them.
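
    The common first step in those links is enabling Coolbits in xorg.conf; if you want to try, something like this should write it in (untested by me so far, and back up any existing xorg.conf first):

    sudo nvidia-xconfig --cool-bits=12   # writes /etc/X11/xorg.conf with Option "Coolbits" "12"
    # then restart X / reboot and look for the clock controls in nvidia-settings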

    For Blade (not Pro/Stealth)
    I get:
    7-9W on Intel (8-10 hours)
    15W minimum possible on Nvidia (4.5 hours)
    35W typical on Nvidia (2 hours)

    If we can unlock the ability to set performance levels, then we can get down to maybe 15W.
    If we can run deep learning without loading the Nvidia driver for X, then this thing could run brilliantly all day
    (we'd probably still need a reboot for external monitors, since I think they require the Nvidia driver, although that might be possible with Bumblebee, which I haven't looked at yet).
     
    Last edited by a moderator: Jun 21, 2017
    @linuxrazer3 thanks again for your response! How are you getting 7-9W on Intel? Is that with your brightness all the way down and no apps running? That seems unattainable to me. Doing nothing but browsing the web in Chrome at a comfortable brightness, I'm sitting at more like 18-25W on Intel. But all I have done to optimize my battery life is toggle the powertop tunables and set the graphics card to Intel.
     
    Alright, I hit some good info for the folks looking to do CUDA and have tonnes of battery life :)

    I probably won't get a chance to hack on this for a few days (and still haven't started getting my deep learning running on this yet), but check this out. Not only will it save you battery life, it should mean your neural nets get all of the GPU and don't have to fight over it with X.
    ---
    It looks like it is possible to use the Intel HD GPU to handle displays while reserving the NVIDIA devices for computing on the same machine. However, after following the procedures on two 14.04 boxes, I could not get this to work - either the Intel driver is used, or the nvidia driver, but not both at the same time.

    Is this approach feasible at all? Does it only work with hybrid graphics systems? Does anyone have experience?
    ...

    An update: problem solved.

    All I needed to do was add the CUDA driver's path (in my case /usr/lib/nvidia-375) to LD_LIBRARY_PATH.

    A side problem is that the libGL.* in the nvidia driver will take priority over the mesa libGL. The fix is to remove or rename/relocate libGL.so*/libGLX.so*/libGLdispatch.so* to a different folder, or not to install them at all if you install from the .run file (I prefer to install from apt-get).

    In summary, to make this work, you need to:

    1. Make sure you have enabled onboard graphics in the BIOS settings (or set it as primary).
    2. Install both the xorg intel driver and the nvidia/cuda drivers.
    3. Start nvidia-settings, go to the PRIME settings page, and set Intel (Power Saving Mode) as the default.
    4. Modify your .bashrc and set LD_LIBRARY_PATH to at least contain /usr/local/cuda/lib64:/usr/lib/nvidia-XXX, where XXX in my case is 375.
    5. Log out to restart X, or reboot.
    6. Run ldd $( which glxinfo ) to make sure your GL libraries point to mesa, or run glmark2 to confirm GL status.
    7. (update) If the libGL printed in step 6 points to nvidia's driver folder, you need to remove/rename the libGL.so*/libGLX.so*/libGLdispatch.so* under the nvidia driver folder so that your OS can pick up the mesa libGL library.
    8. Run nvidia-smi to list your dedicated NVIDIA GPU, then run your CUDA program; you should not see any errors.

    https://devtalk.nvidia.com/default/...phics-is-this-possible-/post/5079872/#5079872
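
    For steps 4 and 6 above, on our machines that would look something like this (the nvidia-375 path is the quoted poster's; substitute whatever driver version you actually have installed):

    # appended to ~/.bashrc: let CUDA programs find the driver libraries
    export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/usr/lib/nvidia-375:$LD_LIBRARY_PATH

    # after restarting X, confirm GL still resolves to mesa rather than nvidia
    ldd $(which glxinfo) | grep -i libgl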

    @sam_hains I have the Blade (not Pro/Stealth), so maybe that's where the difference is? You have the Kaby Lake Razer Blade as well, right?

    I have also installed the standard Linux stuff that's meant to help -- have you tried TLP and thermald?

    Are you running an external monitor by any chance? I'm not even sure that's possible on Intel, but when I did it on Nvidia it doubled the power draw.

    Finally managed to unlock this. I used nvidia-xconfig to dump the xorg files and then added Coolbits 12 by hand. It's a pretty inelegant setup (the laptop display runs at 640x480); I'm sure it's fixable, but I'm not sure this route is worth pursuing further.

    Sadly it's not fully settable. All I was able to do was set the GPU offset to -200MHz and GPU RAM to -2000MHz, but even that didn't seem to take effect.
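
    For the record, my attempt was along these lines (the [3] index targets perf level 3, and the offsets are the values I mentioned; the commands were accepted without error, but the clocks didn't move):

    nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=-200"
    nvidia-settings -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=-2000"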

    These links seem to be more promising and have a way to force the performance level down:
    https://hashcat.net/forum/thread-4569.html

    For Windows (maybe something to learn from here):
    http://wiki.step-project.com/Guide:NVIDIA_Inspector
    http://forums.guru3d.com/showthread.php?t=397347

    However, I'm thinking my setup will be:
    Intel driver for X, with the Nvidia card reserved for CUDA, if my post above about running Nvidia compute-only works out.

    The only problem is multiple monitors; I'll have to see if that can be worked out.

    EDIT: I looked into multi-monitor support. Bumblebee is not an option, as it requires that the HDMI port not be attached directly to the Nvidia card, which it seems it is.

    However, there is the USB-C port. I've ordered this, as Linux supports MST, which should allow two 4K screens to run. https://www.amazon.com/StarTech-com-USB-C-HDMI-Splitter-Thunderbolt/dp/B06XPVGQKY
     
    Last edited by a moderator: Jun 21, 2017
    jccantele49 likes this.
    I have the same computer as you, running on the Intel integrated graphics, and I am not running an external monitor. I am running TLP and thermald on their default settings. Browsing in Google Chrome, watching YouTube videos - really basic browsing - I'd say I'm jumping around at 15-30W. Is that reading coming from powertop? Curious why you're getting such lower power usage :/.

    I'm also very excited to try out your method of using the discrete card just for CUDA/cuDNN stuff; it seems we probably bought this computer for the same reasons! First I would love to get my wattage down, but thanks a lot for documenting that.
     
    jccantele49 likes this.
    Seems like today I'm getting 7-10W with occasional spikes up to around 15-20W... not sure what has changed! But I'm happy with that as a baseline battery drain on the Intel chipset for now. Looks like I can probably get around 5-6 hours doing basic web/music/coding, which is pretty similar to what I'm getting on Windows, I think. I am pretty happy manually switching between graphics cards, as I only really need the GPU in Ubuntu for deep learning/machine learning stuff. I'm someone who is either REALLY using the GPU or not using it at all.

    @jccantele49 I might be a little thick when it comes to some of this hardware stuff, but I'm not entirely sure those links are relevant to me; they look like they're for people trying to do GPU passthrough for an external monitor? To me, the @linuxrazer3 route of running integrated graphics except when I need CUDA/cuDNN is the dream. Unless Bumblebee would also let me do CUDA with similarly good battery life without needing to restart my machine, but that just kind of seems like magic to me lol. Bumblebee was also poorly documented and complicated to install, and I failed to get it to work. I'm gonna give the @linuxrazer3 solution a shot tonight!
     
    Last edited: Jun 19, 2017
  10. sceleus

    sceleus New Member

    I was able to get a pretty good setup using the following options in the device section of xorg.conf:
    Option "Coolbits" "8"
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x3; PowerMizerDefaultAC=0x1"

    explanation:

    Coolbits 8 enables PowerMizer management on the PowerMizer screen of nvidia-settings. You can overclock or underclock the GPU there. Without this setting on, the next settings weren't working for me.

    PowerMizerEnable=0x1 enables PowerMizer control.
    PerfLevelSrc=0x2222 is for maximum performance unlocking on AC.
    PowerMizerDefault=0x3 is the default on battery, which is the lowest performance (PowerMizer level 0).
    PowerMizerDefaultAC=0x1 is the default on AC power, which is the highest performance.
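
    If anyone wants to drop this straight in, the whole thing can live in a single conf fragment; the Identifier here is my own guess at a sensible name, so adjust to match your setup:

    sudo tee /etc/X11/xorg.conf.d/20-nvidia.conf > /dev/null <<'EOF'
    Section "Device"
        Identifier "nvidia"
        Driver "nvidia"
        Option "Coolbits" "8"
        Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x3; PowerMizerDefaultAC=0x1"
    EndSection
    EOF
    # restart X / reboot afterwards so the options are picked up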
     
    jccantele49 likes this.
  11. Hi @linuxrazer3,

    Wich distribution and kernel are you using?

    Thank you.
     
    @hiperjp I am using Ubuntu 17.04. I upgraded the kernel to 4.11 and the Nvidia drivers to 381, and used the Intel Graphics Update Tool to update to the latest. I have not yet put CUDA on, and I know that may be difficult/impossible under 17.04, so I may need to go to 16.04 and repeat all the work.

    @sam_hains nice on the 7-10W!!

    I did some more measurements today.

    ~16W
    on battery, doing light web work (google sheets and updating my accounts)
    with two external monitors running via the Startech dual HDMI hub, on Intel only (no Nvidia).

    ~7W
    sleeping with the Startech dual HDMI device connected

    ~10-15W
    light web + video, no Startech hub (i.e. built-in screen only).

    ~9-10W
    light web (reading this forum), without Startech hub.

    You can see them represented, in order, on this graph:
    [image: graph of the power measurements above]

    @sceleus

    This is super helpful. Can you now dynamically adjust the performance levels across 0-4?

    How did you generate the xorg.conf? Most methods I've tried for this have resulted in loads of glitches.
     
    Last edited by a moderator: Jul 7, 2017
  13. sceleus

    sceleus New Member

    @linuxrazer3 I'm still tweaking the PowerMizer configuration. For some reason the lightest amount of work on battery jumps me up to level 2. I have successfully locked it at level 4 on AC. I'm thinking the trick is somewhere within the following PowerMizer settings:

    PerfLevelSrc=0x2222 <- from what I understand, this is two digits each for battery/AC, and it determines whether the card should scale to meet demands or stay set at what you've defined. It doesn't, however, seem to be working for me in battery mode.

    PowerMizerDefault=0x3 <- confusingly, 3 is the lowest setting and 0 is the highest, and from what I have gathered, in AxB, A = highest perf and B = lowest perf. So I've been trying to tweak this to things like PowerMizerDefault=2x3 (only allow the two lowest settings but scale between them), to no avail. Anyone with more experience with PowerMizer settings, feel free to correct me.

    To generate the xorg.conf file, I generated a file using nvidia-settings, then copied it to /etc/X11/xorg.conf.d/20-nvidia.conf. I restarted my computer and presto, it does SOMETHING.

    Coolbits=8, by the way, allows you to modify the performance metrics in nvidia-settings - not changing the level from 0, 1, 2, 3, or 4, but actually SETTING the memory transfer rate/graphics clock values manually. GLHF, YMMV.
     
    linuxrazer3 likes this.
  14. wesley27

    wesley27 Active Member

    Apologies for the necro-thread, but I tried to make a new post similar to this one and it was deleted because "Razer Insider is not a support forum, contact our support team". I don't quite understand this, because it was almost identical to this thread, and I'm pretty sure Razer's support team can't and won't help me with running software they don't support on their hardware. Isn't that the purpose of this forum corner, so we can help each other?

    Anyways:

    @jccantele49 Did you have to install Bumblebee or PRIME to handle the Nvidia GPU?

    I'm running Fedora 26, and when I boot up, my CPU temps range from 44 to 57 °C (I get 44 by running lm_sensors, 57 by running tlp-stat -t). I have TLP and thermald installed, both running. Within 5-15 minutes of booting up, my Blade gets pretty warm (70-90 °C). Could anyone help me troubleshoot why it's running so hot? I feel like at idle/normal usage it shouldn't be higher than 45 °C. The fans never increase in speed; they stay at the same speed as when I boot.

    I thought it might be because I don't have Bumblebee, and therefore the Nvidia card is running even though it isn't being used. Is this reasoning right - should I install Bumblebee? That said, the CPU side of the computer (the left side) still gets much hotter much faster, so is there still a CPU issue?

    Finally, lm_sensors and tlp-stat give me varying temperature results; sometimes lm_sensors is 10-15 degrees lower than tlp-stat -t, and sometimes they are the same. Has anyone else seen this happen? My Blade doesn't feel super hot, but it's definitely pretty warm. Is it possible that they're both wrong? (It doesn't feel like 85 °C when they both say that, and then 2 minutes later it'll say 65.)
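
    In case anyone wants to reproduce the two readings I'm comparing, this is all I'm running (lm_sensors needs a one-time sensors-detect probe before sensors reports anything useful):

    sudo sensors-detect   # one-time hardware probe for lm_sensors
    sensors               # lm_sensors temperature readout
    sudo tlp-stat -t      # TLP's temperature and fan report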

    Also, I don't know if it matters, but all of this is while plugged in. And when I boot back into Windows, the fans instantly kick in pretty high for a few minutes to cool everything down.
     
    Last edited: Sep 5, 2017
    Wesley27, no need to apologize at all for the thread necromancy. Razer deleted my thread once too, with the same faulty logic. I wrote back to the moderators that this was part of the Linux forum, and they realized their mistake and restored the thread. I advise you to write back to them and explain the situation. I also reached out to Min Liang Tan (Razer's CEO) on Facebook, and he was helpful in getting this thread restored.

    No, we did not use Bumblebee. The Razer Blade Pro has a single desktop-grade GPU, so Bumblebee doesn't apply.
     
    wesley27 likes this.
  16. GreatAttractor

    GreatAttractor New Member

    Hi Wesley, have you solved the overheating problem?

    I had a Razer Blade Pro 2016 (various distributions, always the latest nvidia driver), but it was unusable during the spring/summer period because of overheating, so I returned it.

    Now I have a Razer Blade 14 2017, and I had the same problem. In the end I disabled Nouveau and turned off the NVIDIA chip via ACPI (with bbswitch; you won't be able to do this, though, because you don't have Optimus).

    I played with all the NVIDIA driver options, but in the end I grew tired and disabled the NVIDIA chip for good.

    I believe your problem may be related (as always with notebooks) to ACPI. If you post your 'dmesg' output after boot (yes, all of it), we can analyze it and see if something went wrong.
     
  17. wesley27

    wesley27 Active Member

    @Rameil Thanks a lot for the reply, I appreciate it. I actually was able to solve my heat issues. They only occurred when I had my laptop plugged into the charger or when I was using an external monitor. The cause was my GPU being powered on and drawing more power under those conditions, regardless of whether nouveau was disabled, even though it wasn't in use.

    I did a fresh install of Fedora and then disabled nouveau and installed bumblebee, and the issues were fixed. Bumblebee now automatically keeps my GPU off unless I need it, which is pretty much never.

    I did try Bumblebee originally and it didn't work; I am assuming this is because I had installed the NVIDIA proprietary drivers ahead of time. With the fresh install, I used Bumblebee's drivers.
     
  18. GreatAttractor

    GreatAttractor New Member

    The huge difference for me is when I disable the NVIDIA card via bbswitch:

    cat /proc/acpi/bbswitch
    0000:01:00.0 OFF

    As you can see, it is off, and X11 runs with the Intel drivers.
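
    For completeness, toggling it looks like this (this assumes the bbswitch kernel module is loaded, and as noted above it won't apply to the Optimus-less Blade Pro):

    # power the discrete card off, then verify
    echo OFF | sudo tee /proc/acpi/bbswitch
    cat /proc/acpi/bbswitch

    # power it back on when needed
    echo ON | sudo tee /proc/acpi/bbswitch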
     
  19. KazWolfe_

    KazWolfe_ New Member

    Installing tlp/thermald on Ubuntu 16.04 had some, er, very negative effects on my system. I would regularly reach 90 °C with no sign of the fans turning on. Uninstalling them and falling back to native power management seems to keep my system reasonably cool.

    I'm sure power management could be better, but this is okay (for now).

    This is running on a Razer Blade Pro (2017, 4k) with Ubuntu 16.04 on kernel 4.10.0-37-generic. I do not (yet) know how kernel 4.11 will behave, so this is limited to those still stuck on 4.10.
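
    In case anyone wants to back them out the same way, removal is just the mirror of the install commands earlier in the thread, followed by a reboot so the stock power management takes over:

    sudo apt-get remove tlp tlp-rdw thermald
    sudo reboot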
     
    Last edited: Oct 14, 2017