Blade Stealth + Core + GTX 1080 benchmarks, first impressions

Discussion in 'Systems' started by vitorious, May 27, 2016.

Thread Status:
Not open for further replies.
  1. Alright, so to answer my original question, "Does Thunderbolt 3 present some form of bottleneck, based on benchmarks / numbers?":



    Dave Lee backed up my claim with his Amp and Core setups (identical specs: i7-6700HQ / 16GB / 1080 FE): the Amp consistently gets slightly better frame rates, around 4%-6% better in normal gaming scenarios.

    I'm happy with my Core, and I'm okay with the 6%-8% slower performance in real-world gaming (accounting for the fact that the Amp itself is already 2%-4% slower than a standard desktop setup). I'm glad the story is set straight that the Core does see a performance loss over TB3 compared to a direct-PCIe external eGPU; it's just the nature of TB3 overhead / latency. A quick sketch of what those percentages mean in frame rates follows this post.
     
    Last edited: Jul 22, 2016
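    To put those percentages in concrete terms, here is a minimal sketch that applies the deltas claimed in this thread (6%-8% versus a desktop, 4%-6% versus the Amp) to a hypothetical 100 FPS baseline. The baseline and the helper name are illustrative, not measurements.

        # Minimal sketch: applying the performance deltas reported in this thread
        # to a hypothetical baseline. The 100 FPS figure is illustrative only.

        def apply_loss(baseline_fps, loss_range):
            """Return the FPS band left after a (low, high) fractional loss."""
            low, high = loss_range
            return baseline_fps * (1 - high), baseline_fps * (1 - low)

        desktop_fps = 100.0              # hypothetical desktop baseline
        core_vs_desktop = (0.06, 0.08)   # thread's claim: Core is 6-8% slower than a desktop
        core_vs_amp = (0.04, 0.06)       # thread's claim: Core is 4-6% slower than the Amp

        print("Core vs desktop: %.0f-%.0f FPS" % apply_loss(desktop_fps, core_vs_desktop))
        print("Core vs Amp:     %.0f-%.0f FPS" % apply_loss(desktop_fps, core_vs_amp))

    With those assumed numbers, a 100 FPS desktop result lands around 92-94 FPS on the Core: a gap that shows up in benchmarks but is hard to feel in play.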
  2. BuckRogers_no_id

    BuckRogers_no_id New Member

    Yeah, I agree. I'm just so used to everyone telling me what bad choices I make because I don't pick what I buy based on FPS-per-dollar charts. I didn't think $635 was bad once you consider how much it costs to build a nice ITX rig. I'm a Lian Li fan, and getting the really nice stuff gets expensive quickly. Of course people slop together some cheapo case and parts and say "better bang for buck!", but that's not for me. At ~$650, Skull Canyon carries maybe a $100 premium, but there's nothing else like it on the market (same with the Core). I'd personally pay $100 extra just to get all-Intel parts, or that form factor. For me, there really is no price premium on this thing because I don't view things from a gaming/bang-for-buck perspective alone.

    No problem with 4K. It doesn't GPU-accelerate 10-bit HEVC (it does accelerate everything else HEVC), and the VP9 acceleration is hybrid GPU/CPU, but as with most new systems any slack is picked up easily by the CPU itself. I've never had 4K stutter. A quick way to check which of your files are 10-bit HEVC is sketched after this post.
     
    Last edited: Jul 22, 2016
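    Since "everything except 10-bit HEVC" is the dividing line for Skull Canyon's hardware decode, here is a small sketch of how you might check a file. It assumes ffprobe (part of FFmpeg) is installed and on the PATH; the file name is just an example.

        # Sketch: use ffprobe (from FFmpeg, assumed installed and on PATH) to report
        # whether a file's first video stream is 10-bit HEVC, i.e. the case the
        # Skylake iGPU does not fully hardware-decode. The file name is an example.
        import json
        import subprocess
        import sys

        def first_video_stream(path):
            out = subprocess.run(
                ["ffprobe", "-v", "error", "-select_streams", "v:0",
                 "-show_entries", "stream=codec_name,profile,pix_fmt",
                 "-of", "json", path],
                capture_output=True, text=True, check=True,
            ).stdout
            return json.loads(out)["streams"][0]

        if __name__ == "__main__":
            stream = first_video_stream(sys.argv[1] if len(sys.argv) > 1 else "movie.mkv")
            is_10bit_hevc = stream.get("codec_name") == "hevc" and "10" in stream.get("pix_fmt", "")
            print(stream)
            print("10-bit HEVC (hybrid/CPU decode on Skylake):", is_10bit_hevc)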
  3. Deathalo

    Deathalo Active Member

    Interesting, so does that mean it still supports 10-bit HEVC, just without GPU acceleration? Have you played 10-bit HDR 4K content through the NUC (and does it have HDMI 2.0a)? That will be a big thing for me, as I'll likely be getting one of the new Vizio P-series sets, so I'd like to make sure it's HDR-ready.
     
  4. Firebat246

    Firebat246 The One

    Unfortunately, until the Core is more mature and has newer drivers and firmware, we can't say the reason for the performance loss is a TB3 bottleneck. It may be the easiest assumption at the moment... but given the research done on bandwidth comparisons, it's still very likely the gap is being caused by something else.
     
  5. Deathalo

    Deathalo Active Member

    Exactly. I fully believe that performance can increase with a firmware update and/or better driver support. It's just the beginning, and it already performs marvelously; exciting to see where it goes from here.
     
  6. kurohyo

    kurohyo Active Member

    Actually, you are right. I bought that USB hub that was supposedly compatible with the Stealth (the manufacturer told me it would be), and it wouldn't even charge the Stealth with the normal charger.
    So @Diblegs, I advise you not to buy it.
     
  7. So the original basis for the argument was "there is no performance loss". Now, based on reviewer info, we can all agree that there is a performance loss, even compared to "other" eGPU solutions.

    Possibly it can be fixed with Thunderbolt firmware, Razer firmware, etc., but until then we do have a performance loss. So at this time a Thunderbolt 3 eGPU solution (the Core) does lose performance in real-world gaming tests on duplicate test systems: 6%-8% compared to a desktop setup and 4%-6% compared to a competing PCIe-based eGPU solution.

    That accounts for the performance deltas that I, and others who *own* a Razer Core, have noted.
     
    Last edited: Jul 22, 2016
    Destrok likes this.
  8. Deathalo

    Deathalo Active Member

    Right....
     
  9. Did you not look at the reply Dave Lee gave me on his Razer Core review video? Instead of giving a smart remark, maybe you should look at the numbers presented by a reviewer who has both an Alienware Amp and a Razer Core and tested both extensively on identically spec'd laptops (i7-6700HQ, 16GB DDR4, 1080 FE, etc.).

    Do you even own a Razer Core? Do you own any equipment to do any form of testing to question the performance, or are you just trolling the thread with vague opinions?
     
  10. BuckRogers_no_id

    BuckRogers_no_id New Member

    As noted, you are the one person who SHOULDN'T have a Core. My god, what a problem for Razer over nothing. The Core performs... like the Core performs. It will not perform like a Z170 ATX build or Dell's Amp, and I don't think Razer ever claimed it would be faster or slower.
    You wanted an untested, sight-unseen product so badly that you forked over $400 or $500 for it, so it's tough to sympathize with any concerns you have now. You should have copied, part for part, those top 3DMark rigs if that's what you wanted.
    The product is fine though. Literally nothing to see here folks.

    I'm not a big HTPC guy, but I don't believe it supports HDR; it's HDMI 2.0, not 2.0a. In my earlier comments I was talking about what it supports in GPU acceleration. I only know enough about HTPC stuff to be dangerous; my HTPC right now is an Amazon Fire TV.

    If you're looking strictly for an HTPC and want an x86 computer rather than Shield/Fire TV boxes, I'd probably skip my Skull Canyon and wait for the i5 or i7 'Baby Canyon'.
    http://www.anandtech.com/show/10492/intel-readies-new-nucs-based-on-kaby-lake-and-apollo-lake-socs

    It should be here at the end of the year. For me this is my desktop rig, so I wanted a quad-core; the Baby Canyons are dual-cores, and the next quad-core-or-better NUC is end of 2017 at the earliest, maybe later.
    Baby Canyon will have 10-bit (Main 10) HEVC hardware decode, including HDMI 2.0a I believe, and full VP9 GPU acceleration.

    According to that roadmap, Baby Canyon should be here Q4 '16 or Q1 '17. I'd personally wait for that. Mine will work well and be the better general-purpose machine, but I'd sacrifice those two cores for "all" of the current codec acceleration in the GPU (in NUC form) if it's a dedicated HTPC.
     
  11. Razer sure didn't! Wasn't it you who claimed to everyone concerned that there was no bottleneck? With the imaginary Core you've tested? So you've finally come to the conclusion that the Core performs like the Core performs, and not like the Amp or a desktop? That there is performance loss?

    I'm simply giving the real facts for those who have concerns about performance loss compared to A and B above. This doesn't mean it's not impressive technology, doesn't mean I'm dissatisfied with my purchase, and doesn't mean I'm not impressed, by any means.

    The consensus is a total performance loss of around 6-8%, which falls in line with AnandTech's mention of 10%. It's roughly 4-6% slower than a competing direct-PCIe eGPU with an equal setup in real-world 1440p gaming, according to Dave Lee. That's completely acceptable for this slick piece of technology. Those who are concerned or curious about the performance comparisons, and who are contemplating dropping $400/$500 on this, now know what to expect.

    You don't have to get so wound up over it.
     
  12. Firebat246

    Firebat246 The One

    The only thing is that it's not a bottleneck. The problem here is that you almost want to believe, so you can rest easy, that the Core is bottlenecked compared to other eGPUs. We have done the research showing there is not going to be a bandwidth bottleneck with TB3 (the raw link budgets are sketched after this post). The performance loss is being caused by something else... which we are hoping will get fixed so it DOES perform in line with the Amp, etc. There is a line between a TB3 bottleneck and tweaking that needs to be done to bring it up to par. Please try not to mix the two together.
     
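    For reference, here is a rough back-of-the-envelope comparison of the raw link budgets involved. The ~22 Gb/s figure for PCIe payload over Thunderbolt 3 is the commonly quoted ceiling rather than a measurement, and the Amp is treated as a PCIe 3.0 x4 link, which is how it is usually described.

        # Back-of-the-envelope link budgets (approximate, not measured).
        # PCIe 3.0 moves roughly 0.985 GB/s per lane after 128b/130b encoding.
        GBPS_PER_PCIE3_LANE = 0.985 * 8  # ~7.9 Gb/s per lane

        links = {
            "Desktop PCIe 3.0 x16":                16 * GBPS_PER_PCIE3_LANE,
            "Alienware Amp (PCIe 3.0 x4)":          4 * GBPS_PER_PCIE3_LANE,  # as usually described
            "Thunderbolt 3 link (total)":          40.0,
            "TB3 PCIe payload (commonly quoted)":  22.0,  # assumption, not a measurement
        }

        for name, gbps in links.items():
            print(f"{name:38s} ~{gbps:6.1f} Gb/s")

    Even if those figures are roughly right, they only describe peak bandwidth, not latency or protocol overhead, which is the point being made here: raw bandwidth alone doesn't explain the gap.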
  13. Well, I look forward to the tweaking that needs to be done then :), seriously. I'd be thrilled if they can tweak that performance back in, but for all intents and purposes, those looking to throw down cash now should understand that the performance loss is real. That's all; no reason for everyone to get so defensive or upset.

    http://www.anandtech.com/show/10133/amd-xconnect-external-radeons

    I'm going based on Ryan Smith's analysis too. I hope AnandTech does a deep dive at some point; they always have excellent testing methods and great data.
     
    Last edited by a moderator: Jul 22, 2016
  14. BuckRogers_no_id

    BuckRogers_no_id New Member

    Pot calling the kettle black. No, there is no "performance loss." I said from the start that you don't have another Thunderbolt 3 eGPU case to test against, and if you're looking for evidence of performance lost, you need that. You can't fairly compare against PCIe 3.0 x16 or the Amp, and the time for weighing that was pre-purchase, not post-purchase. The Core performs like the Core. There's no contradiction in that statement.

    While there's no evidence that Razer's TB3 case has anything wrong with it, my opinion is that someone else will most likely bring a faster Thunderbolt 3 eGPU case to market; they're not going to be identical, and switching motherboards can affect performance by a few percent as well. The difference is that you didn't wait for a better one. You bought this one sight unseen and got it a year or two earlier, and that's worth a lot. Thus nothing to see here. Expecting massive gains on the existing Core won't happen, so try to accept your decision. It really wasn't a bad one, but the lesson learned is that you aren't an ideal early adopter of new stuff. It's imperfect (and always will be, TBH). The rest of us are used to this scenario.

    Replacing my ATX mid-tower with a Skull Canyon NUC was an early-adopter move. They will be cooler and quieter in V2 and V3. It's part of the deal; products iterate, and some companies are better than others at doing it. Instead of waiting, I got one now, and I'm glad I did.
     
    Last edited: Jul 22, 2016
    Eason85 likes this.
  15. Firebat246

    Firebat246 The One

    I still expect that eventually the Core will perform as well as the Amp. I will be at least somewhat disappointed if it never reaches that level.
     
  16. BuckRogers_no_id

    BuckRogers_no_id New Member

    The Amp is a dead man walking and will never be repeated; it's being scrapped, and Dell is building its successor on Thunderbolt 3 rather than the Amp's proprietary setup. So in effect, buying the Core was stepping into the future in that respect.

    I hope that for $400-$500 they improve it, if it can be improved, so I agree with you. But to keep expectations in check: those engineers are more likely to be put on Core 2 development, so I'd assume it won't happen.
    What I disagree with is this guy demanding quite a bit when the easy answer is just to buy the same hardware that was netting that top 3DMark score to begin with. It's silly.

    "Wahhhh!! Mommy I wanted that lollie!"
    "Ok, go get off your *** and get that that one instead"

    I posted a link to a guy who moved from an i7-5960X, an 8-core, 16-thread desktop machine with a 980 Ti. He moved the 980 Ti to a Razer Core; he loves it and he's not going back. Accepting that is tougher than accepting that one eGPU case gets different results than another.
     
    Last edited: Jul 22, 2016
  17. Destrok

    Destrok Well-Known Member

    I am really confused about the distinction you're trying to make... His argument is that, compared to a normal desktop or the Amp, the performance you get out of the same GPU and CPU is 4 to 6% less when using the Razer Core. He has provided multiple sources showing this, so why is it being fought tooth and nail? The Core is great, but all he is saying is that you get 4 to 6% less performance out of your GTX-whatever than if you threw it in a normal system, and he has benchmarks and multiple testers to back that up. He isn't even bashing it, just pointing it out. What is the conflict?
     
  18. BuckRogers_no_id

    BuckRogers_no_id New Member

    Because it's not an apples-to-apples comparison.
    The Core is based on Thunderbolt. The Amp is proprietary Dell. PCIe 3.0 is what's in a desktop, unfettered by any additional complexity.

    Comparing three different ways of hooking a GPU up to a computer and expecting the same results is just crazy. Or better results, I guess, because he wanted the better-performing option.

    There's nothing broken here to fix, though, that we're aware of. Maybe that's just the way Thunderbolt is going to be? We don't have crystal balls, but at a minimum we need more TB3 eGPU cases on the market to discuss this further.

    Razer did it first and it's legitimately good. I'm not sure what the problem is, because I don't see one that could even be discussed or addressed until someone beats the Core at its own game (Thunderbolt 3 eGPU performance). Today, nothing beats it. The Core is still the only game in town in the Thunderbolt realm.

    It's dishonest to frame the comparison as Amp/Core/desktop when the honest comparison and complaint would be between differing Thunderbolt implementations. But the Core arrived first, and it's clearly within range of where it should be already.

    Maybe I'm the crazy one here, but look at the 4K results. This is using Firestrike too, to make everyone happy here.


    1080p drops because it has long been known and proven that limiting PCIe bus bandwidth can cap max FPS. Even though it is capped, the performance is already far more than you'd need. Games do not go from "playable" to "unplayable" once you move to a Razer Core at 1080p; they're just slightly less overpowered at insane framerates. If that bothers people, well, now you know, if you didn't do your research already as I did. It doesn't bother me; I don't buy a $500 Razer Core and a matching $650 GTX 1080 to run 1080p. A rough model of why the gap shrinks with resolution is sketched after this post.
    We saw this for decades with 640x480 testing: bus bandwidth having an impact. But no one cares.

    At 1440p the gap is already within "doesn't matter" range. Anyone using this should be gaming at 1440p or higher anyway.

    At 4K the gap is nonexistent.

    This is a real comparison using two powerful machines, going from pretty much the best desktop you could build (give or take), with the overclocked 5960X, to Skull Canyon with the Core.

    For me, this is definitive. Muddying the waters isn't necessary; there's no problem with this product, and the fuss is ridiculous. The next step for people who are still unhappy is to file a complaint with Razer and ask for their money back, file a complaint with the Better Business Bureau, or hire a lawyer and sue the Razer Corporation for having a Thunderbolt product that isn't exactly like the Amp or a desktop.

    Which is what he really wanted. So that's what he should get.
    It's much easier to wait for benchmarks on things before you buy them.
     
    Last edited by a moderator: Aug 13, 2016
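    One way to see why the 1080p gap shrinks at 1440p and vanishes at 4K is to model the eGPU link as a small, roughly fixed per-frame cost added on top of the GPU's own render time. The numbers below are made up for illustration; the actual per-frame cost of TB3 isn't publicly characterized.

        # Illustrative model (made-up numbers): a fixed per-frame link cost hurts
        # most when the GPU's own frame time is short (high FPS at 1080p) and
        # disappears into the noise when the GPU itself is the limit (4K).
        LINK_COST_MS = 0.8  # assumed per-frame overhead of the external link

        gpu_frame_time_ms = {   # hypothetical render times for the same game and GPU
            "1080p": 6.0,       # ~167 FPS if the link were free
            "1440p": 10.0,      # ~100 FPS
            "4K":    20.0,      # ~50 FPS
        }

        for res, gpu_ms in gpu_frame_time_ms.items():
            native_fps = 1000.0 / gpu_ms
            egpu_fps = 1000.0 / (gpu_ms + LINK_COST_MS)
            loss_pct = 100.0 * (1 - egpu_fps / native_fps)
            print(f"{res}: {native_fps:5.1f} -> {egpu_fps:5.1f} FPS ({loss_pct:4.1f}% loss)")

    With those assumed numbers the hit is about 12% at 1080p but under 4% at 4K, which matches the shape of the results being discussed even though the specific values are invented.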
  19. Destrok

    Destrok Well-Known Member

    Dude... no one said there was a problem; you're being so hyper and defensive. I understand what you are saying, but no one here was ever, at any point, trying to bash the product. I understand that proprietary connectors, PCIe, and Thunderbolt are different, but that is not the point. The point was literally that, compared to a desktop, there is a very small loss of performance. All this was about was determining how well the Core performs, and it performs well, but it isn't perfect. No one here is complaining about their purchase, just noting that so far there is a minuscule performance loss, for those who are curious. Absolutely at no point did anyone complain about their purchase... at all... yet you are so determined to yell about how they should have done more research before they bought. All that is happening here is noting a slight difference. Calm down.
     
  20. BuckRogers_no_id

    BuckRogers_no_id New Member

    I might be hyper or defensive; I can live with that. But I am also right: the complaints are mountains out of molehills. There's no problem here, and it's legitimate to say that people need to research before buying, not buy first and complain after. That's just a lesson learned here.

    There's nothing we can do for him or his Core. It is what it is. It's just something to either decide to live with, or sell and build that perfect Firestrike machine.
     