Some Core benchmarks with the Blade 14 and Stealth.

Discussion in 'Systems' started by KillerFry, May 29, 2016.

Thread Status:
Not open for further replies.
  1. KillerFry

    KillerFry New Member

    Hello; I already posted these results on the Notebook Review forums, but I thought I would share them here.

    Here is my desktop, which is a somewhat beefy system, nothing is overclocked (it's currently summer here; I only overclock my CPU during winter):

    Intel Core i7-5930K
    Asus X99-Deluxe
    16GB Corsair Vengeance LPX DDR4 2666MHz (4x4GB sticks)
    256GB Samsung 950 Pro (boot + Blizzard games)
    1.2TB Intel 750 SSD (PCIe card version, holds most of my Steam library and work files)
    750GB Seagate Momentus XT
    Nvidia 980 Ti (reference cooler, which is the same one I use in the Core)
    Monitor is an Asus ROG Swift PG278Q.

    Here are the USB devices that I use on a regular basis:
    • Finalmouse 2016
    • Corsair K65 RGB
    • Mayflower Electronics Objective2 (headphone amp)
    • Tascam US-1800 (audio recording and game comms, heh)
    • Vantec 7 Port USB 3.0 Hub
    First, these are my results using 3DMark's normal Fire Strike, all of them using the Blade 14: Clicky clicky! There is some variance in the Blade 14 results; I ran them back-to-back, and my theory is that temperatures affected the scores, since the first runs had a cooler GPU. My desktop was also included.
    1. First result is my desktop.
    2. Second one is the Core with both the internal and external displays.
    3. Third one is the Core with just the external display.
    4. Fourth one is the Core with both displays and all my USB devices plugged in to the Core as well as physical network.
    5. Added a fifth column that has the results for the integrated 970m in the Blade.
    A friend let me use his Stealth for the weekend, and I added it in a new batch of benchmarks using 3DMark's Fire Strike Ultra: Clicky clicky! Something of note: I ran the benchmark three times for each case, but I'm only adding the third run to the comparison. I did this so that the first two runs would heat the GPU up and simulate a more "real" gaming session scenario. It's still a small test sample, though; but hey, I don't have all the time in the world to bench, I also have to game for fun!

    Here's the column order for this:
    1. My desktop PC - 3,878.
    2. Blade 14 using the laptop's internal display - 3,460.
    3. Blade 14 using my external display - 3,692.
    4. Stealth using the external display - 3,487.
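    For reference, here's a quick sketch of how the relative losses versus the desktop work out from the overall Ultra scores listed above (the numbers are taken straight from that list; the labels are just my shorthand):

    ```python
    # Fire Strike Ultra overall scores from the list above.
    scores = {
        "Desktop (980 Ti in PCIe slot)": 3878,
        "Blade 14 + Core, internal display": 3460,
        "Blade 14 + Core, external display": 3692,
        "Stealth + Core, external display": 3487,
    }

    baseline = scores["Desktop (980 Ti in PCIe slot)"]

    def loss_vs_desktop(score: int) -> float:
        """Percent of overall score lost relative to the desktop baseline."""
        return round((baseline - score) / baseline * 100, 1)

    for name, score in scores.items():
        print(f"{name}: -{loss_vs_desktop(score)}%")
    ```

    Note these are the combined scores (graphics + physics), so the deltas land a little higher than the 7-8% I mention below for the graphics tests alone.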
    Now for some real games and general fps. Right now I'm rotating my gaming time between Heroes of the Storm, Overwatch and Doom; so those are the only ones I tried for now. All of them were played on the external display and with the same highest quality settings that I use on my desktop (which is to say, everything maxed out! ;)):
    • Heroes of the Storm
      • Desktop - 80-100 fps, staying mostly in the 90's.
      • Blade 14 - 60-90 fps, staying mostly in the upper-70's.
      • Stealth - 50-70fps, staying mostly in the mid-60's.
    • Overwatch
      • Desktop - 80-90 fps, staying mostly in the upper-80's.
      • Blade 14 - 70-90 fps, staying mostly in the lower-80's
      • Stealth - 60-80 fps, staying mostly in the mid-70's.
    • Doom
      • Desktop - 130-170 fps, staying mostly in the 140's
      • Blade 14 - 90-120 fps, staying mostly in the 100's
      • Stealth - 30-50 fps, staying mostly in the upper-30's
    Notice the biggest difference was the Physics score, since we're going from a 5930K to the 6700HQ and the 6500U. This is of particular interest because, when going to actual games - such as Overwatch and Heroes of the Storm - I believe physics impacts fps the most, particularly on the Stealth. Maybe if I lowered the physics settings the fps would improve. I didn't try it, but I'm fairly confident that's the case.

    Now, something that got me thinking a lot: if you compare the results of regular Fire Strike to Ultra, you will notice that the difference between the desktop and the Core varies wildly. My guess is that in Ultra more time is taken by the GPU rendering the actual scene; therefore, bandwidth limitation is not an issue. Notice how the actual graphics test fps in Ultra only lose around 7-8%, even on the Stealth! But when you move to lower graphical settings, such as regular Fire Strike, where the GPU doesn't spend as much time rendering, it is then that bandwidth might become the bottleneck; we're talking about 20% and 50% differences for the Blade and Stealth, respectively. This, too, reinforces my belief that physics has something to do with fps loss in actual games.

    I find this particularly interesting because it means that the higher the settings you run your game at, the smaller the performance gap might be - barring a CPU bottleneck, that is. Also, it will be cool to find out the implications of Vulkan.

    Other things: it seems like the experience is a little more refined with the Stealth; it is more plug-and-play. I think the reason for this might be the 970M on the Blade 14. Actually, Nvidia GeForce Experience is very, very confused on the Blade; sometimes it thinks I have a 970M, sometimes a 980 Ti.

    Also, I noticed that my USB headphone amp would behave weirdly. When there was nothing playing, it... "disconnected". Not from Windows; there was no USB unplug sound. A soft-disconnect/sleep, if you will. As soon as I started playing something, it would take a few seconds before actual sound came out. As soon as the music stopped, it would "disconnect" again. I wonder if, in order to save bandwidth, the Core soft-disconnects devices that are not being used.

    Noise! The GPU is usually inside a desktop case, so when fans ramp up one can only hear what escapes from the case. The Core is not only smaller but there are grills just in front of the GPU. It is much more audible; maybe a little bit annoying, if I'm honest. Most of the time I am wearing headphones, but when I take 'em off, it is pretty noticeable.

    Another small (pun intended) detail: the USB-C cable is short. I believe this was a conscious decision in order to maintain the 40 Gbps of the Core. In my mind I would have the Core on the right of my desk - where my desktop currently is - and the Blade to my left - where I have open space on my desk. That ain't gonna happen, no sir. Maybe both Blade and Core will have to be to the left of me. So, keep that in mind.

    If the laptop goes to sleep, clicking a mouse or pressing a key on a keyboard attached to the Core will not wake the Blade. You have to press a key on the laptop's own keyboard.

    And there's some funky DPI scaling action going on. If the Type-C cable is disconnected while you are using both displays - the laptop's internal and an external display - and the external is set as the main Windows display, then after the disconnect everything moves to the laptop's internal display with the DPI scaling options of the external display. That is, I use my ROG Swift with 100% scaling (which is to say... none), and the Blade's display uses 150% scaling; when the Core is disconnected, the Blade's display defaults to the 100% of the previous main display (the ROG Swift in my case). A user log-out or a reboot fixes this... heck, I just re-read this and I'm not sure I made sense.

    Oh, and in Doom, loading screens seem to freeze. If I press Ctrl+Alt+Del then the game loads fine, but I have to do it every time there is a loading screen. Weird.

    Well, there you have it folks! A somewhat light more in-depth-ish look at the Core. All in all, I am pretty satisfied and excited for this sort of technology. Around 13 years ago when I went to college out of town, I had to take my desktop and my laptop to school; obviously I used the laptop for school related activities (a solid eMachines M6807 with ATI Mobility 9600 capable of running Doom 3!) and my desktop for gaming. I dreamed that something like this would exist some day.

    And now it does!
  2. JETcoolCoolBlack681

    JETcoolCoolBlack681 Active Member

    Sorry, can't read that much: did you turn off the laptop display completely?
    k0sek likes this.
  3. KillerFry

    KillerFry New Member

    I am not sure what exactly you mean by "completely"? What I did was the good ol' Win+P and set it to "Second screen only".
    k0sek and marching_cow like this.
  4. JAK3407D1NG_xf_rzr

    JAK3407D1NG_xf_rzr New Member

    What resolution did you play with on the Razer Blade 14's internal screen? 1440p or 1800p? Btw thanks for the benchmarks above, they're great
    k0sek likes this.
  5. KillerFry

    KillerFry New Member

    The only times I used the internal display was for Fire Strike; regular FS uses 1080p, and Ultra uses 4K.

    All gaming was done using my external display; the ROG Swift is 1440p. My take is that using the Core with the internal display is a 5-10% extra hit, depending on the settings. If my observations regarding quality settings and bandwidth are correct, the performance hit at 1800p while using the internal display should be smaller than at 1440p.
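    A rough pixel-count comparison supports that intuition: a higher-resolution panel keeps the GPU busy rendering for longer per frame, so proportionally less time is spent waiting on the Thunderbolt link. The panel resolutions here are my assumptions: 3200x1800 for the Blade's "1800p" internal display and 2560x1440 for the ROG Swift.

    ```python
    # Assumed panel resolutions (not stated explicitly in the thread):
    # "1800p" internal display -> 3200x1800, ROG Swift -> 2560x1440.
    internal = 3200 * 1800   # 5,760,000 pixels
    external = 2560 * 1440   # 3,686,400 pixels

    ratio = internal / external
    print(f"1800p pushes {ratio:.2f}x the pixels of 1440p")  # ~1.56x
    ```

    More pixels per frame means a more GPU-bound workload, which is the same reason Ultra showed a smaller Core penalty than regular Fire Strike.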

    I'll see if I can check it out later and add it to the original post.
  6. BarrothHS

    BarrothHS Active Member

    Wow! Thank you! Finally some numbers I can work with! I hope the 1070 and 1080 perform better and aren't bottlenecked too much by the CPU, because those Doom fps on the Stealth are making me sad... T_T
    k0sek likes this.
  7. JETcoolCoolBlack681

    JETcoolCoolBlack681 Active Member

    That's what I was asking.
  8. Sidedoor

    Sidedoor Member

    Thanks for the benchmarks. Seems like we have all been waiting an age for them.
  9. fakeian

    fakeian New Member

    I was seeing the exact same "soft locking" issue in DOOM with the Skull Canyon NUC + Core. Hope to see that issue fixed because otherwise the game was running great.
  10. ididntmemeto

    ididntmemeto Member

    Those poor Doom frame rates with a 980 Ti are quite concerning.
  11. Road_Runner86

    Road_Runner86 Member

    Has anybody forced PhysX to run on the GPU? It increases the frames in Doom significantly at 1080p, with a GTX 1080 at least. I don't have a better display to test with yet.
    Gojjang likes this.
  12. colddevil324

    colddevil324 New Member

    Do you think the Core will take better advantage of AMD chips, since they were a huge help in creating this tech?
    One thing I'd like to see is how the Core performs when hooked up to a high-end desktop system vs the Blade and Blade Stealth. I'm guessing OP probably doesn't have an X99 board with Thunderbolt 3 or he would've tried it. There's only been one X99 board with TB3, the Gigabyte X99P-SLI; I have one with an i7-5820K in it and am really interested to see how my 980 Ti performs hooked up via the Core.
  14. Thank you KillerFry for the benchmarks. Hopefully we can see a few more reviews in the coming week. I am not very clear on all the specific numbers, but overall, it sounds like the core is functioning as expected?

    Any thoughts on what is making the RBS perform so much more poorly on Doom than the others? Is that related to the physics section --> "Notice the biggest difference was the Physics score, since we're going from a 5930K to the 6700HQ and the 6500U."
  15. KillerFry

    KillerFry New Member

    Heh, yeah, the X99-Deluxe does not have TB3; I probably would've tried it for giggles, but if you have that high-end a board and components, why would you gimp GPU performance? I mean, you'd have a PCIe slot right there.

    Unless... it could open the possibility of NUC like devices with high end CPU's using external graphics... yeah, I could see that. Someone make it happen!

    This is only what I think, and I could be completely wrong: so far I've noticed that fps in Doom - regardless of whether it's on the desktop or the Core - take a dip when there are many light sources, especially when I shoot the Heavy Assault Rifle with the micro missiles mod (lots of explosions, lots of light sources), which leads me to believe there are a lot of lighting and physics calculations being done on the CPU.

    Don't get me wrong, I think the 6500U is a great CPU, but I do understand that by design it is an ultra-low-voltage part, and its performance had to be limited in order to meet that 15W TDP.

    Up until now I have not tried what Road_Runner86 suggests, and it is also the reason why I think Vulkan/DX12 could help improve game performance on the Stealth; physics calculations could be offloaded to the GPU. There is an upcoming Vulkan patch for Doom; we'll see how that goes.
    nathanvollmer and Lost Dreamer like this.
  16. JETcoolCoolBlack681

    JETcoolCoolBlack681 Active Member

    It could simply be DRM, so the 6500U is indeed a bottleneck. I would say it's more than likely. Say thanks to Bethesda in this case.
  17. KillerFry

    KillerFry New Member

    Well, if you look at Doom's minimum recommended specs, you'll find an i5-2400. Then you could also see this:

    The 6500U might have 4 threads, but it has only 2 physical cores and a lower base frequency. So whatever the reason, technically it is below the minimum recommended spec. There's no reason to get out the pitchforks.

    Sent from my Nexus 6P using Tapatalk
  19. AmazingSpanoMan

    AmazingSpanoMan Active Member

    In a video review I heard that Doom is actually pretty CPU intensive.
  20. AKbrandonMC

    AKbrandonMC New Member

    "Oh, and in Doom, loading screens seem to freeze. If I press Ctrl+Alt+Del then the game loads fine, but I have to do it every time there is a loading screen. Weird."

    Do you use Evolve? I heard about some issues with Evolve and Doom.