Blade Stealth + Core + GTX 1080 benchmarks, first impressions

Discussion in 'Systems' started by vitorious, May 27, 2016.

Thread Status:
Not open for further replies.
  1. Why would a benchmark need to be optimized for the Core? It's simply a Thunderbolt 3 eGPU enclosure based on Intel's spec: http://www.anandtech.com/show/10133/amd-xconnect-external-radeons . Running the numbers while discussing the technology, AnandTech roughly estimated a 10% loss, since Thunderbolt technically introduces latency (overhead).

    Others are saying that TB3 is not the bottleneck for the eGPU, which is the whole reason I've been looking into it. Based on these comparisons and what AnandTech said about the technology, I'm assuming the 10%-12% loss is within the realm of reason. If that's the case, I'm completely satisfied with the Core's performance, as it's right in line with what to expect from a TB3 eGPU setup using a PCIe 3.0 x4 link.

    Now, if others are firm that they are correct and no bottleneck exists with regard to latency, etc., then myself, nightmyth, and others are right to complain that we aren't getting the performance we should be seeing (possibly an issue, then, with firmware / software). The Alienware Amp is the best comparison since it's the same spec, a PCIe 3.0 x4 slot, only it utilizes a raw PCIe connection instead of Thunderbolt 3.

    And why do we keep saying "real world data"? Most reviewers use 3DMark, Heaven, etc. as a real-world "test" scenario. I promise those numbers translate to real-world gaming performance differences, especially in GPU-heavy applications like VR (stereoscopic rendering, etc.). AnandTech even notes that higher latency and GPU-heavy tasks add more burden and could skew that 10% number further. If that's the case, the 10-12% performance loss we're seeing is normal.
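
    For a rough sense of what that 10-12% means in framerate terms, here's a quick back-of-envelope sketch in Python (the desktop framerates are made up, purely illustrative):

    Code:
    # Hypothetical sketch: what a 10-12% Thunderbolt overhead would mean
    # for average framerates, given a desktop baseline.
    def egpu_fps(desktop_fps, overhead):
        return desktop_fps * (1.0 - overhead)

    for fps in (60, 90, 144):
        lo, hi = egpu_fps(fps, 0.12), egpu_fps(fps, 0.10)
        print(f"desktop {fps} fps -> eGPU ~{lo:.0f} to {hi:.0f} fps")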
     
  2. Deathalo

    Deathalo Active Member

    Actually plenty of reviewers use real, current games as tests, and benchmark scores are just thrown in for people who like to see them. If you "promise" they translate into real world gaming performance differences, then put your money where your mouth is and actually show the FPS differences, until then I have to take every result you post with a grain of salt.
     
    BuckRogers likes this.
  3. I guarantee 110% that if I track down one of the Amp guys on NotebookReview to run the GTA V bench on his same-spec'd Alienware laptop + Amp + 1080 @ 4K, and I do the same, he'll have a roughly 10-15% higher average framerate than me. I promise.

    The key is that there is a bottleneck here, whatever I keep hearing to the contrary; and if there isn't a bottleneck, then we have a problem that needs to be addressed via firmware / updates from Razer. That's the whole point of digging into this.
     
  4. Deathalo

    Deathalo Active Member

    Ok, do it and get back to us with the results. If there is an issue that can be fixed with firmware then the more evidence you provide the more likely they are to address it anyway.
     
    BuckRogers likes this.
  5. BuckRogers_no_id

    BuckRogers_no_id New Member

    Then buy the Amp?

    But if you were this concerned, why not wait for the benchmarks from others and the reviews before purchasing?

    It's really difficult to sympathize with a ~10% or less performance gap against a non-comparable, full-ATX system's synthetic benchmark result when it was all purchased sight-unseen. A blind purchase, 100%.

    I think just about everyone else here agrees there's no problem. Neither your system nor your video card matches the comparisons, and yours isn't an *TX system build, so that comparison isn't really apples to apples either. The only thing to do is build a traditional rig or wait for competing products. You were the wrong guy to go in on this ASAP if you're this picky.

    I'm guessing the Core was purchased with your parents' money, because anyone else would've researched more and waited for benchmarks, such as the ones Linus Tech Tips says they're working on.
     
  6. Destrok

    Destrok Well-Known Member

    Play nice kids.
     
  7. KillerFry

    KillerFry New Member

    Nah, I'm just a single guy with a lot of disposable income to enjoy in my hobbies. When and if that changes, my purchasing process might change.

    But yes, if that is not the case for someone, researching is good. I knew more or less what I was getting with the Core - even before reviews came out, just with educated guesses - and I was ok with that when I clicked the "Buy" button.

    I understand your frustration when something doesn't pan out as expected. You could return it, or sell it, and look for an option that is more aligned with what you need/want [emoji6]

    Sent from my Nexus 6P using Tapatalk
     
  8. Deathalo

    Deathalo Active Member

    I don't think that comment was directed at you man...
     
  9. BuckRogers_no_id

    BuckRogers_no_id New Member

    Oops, yes, that wasn't meant for you, KillerFry. :stuck_out_tongue_winking_eye: Probably an unnecessary addition to my post, TBH. I'm just getting frustrated reading this thread; Razer has done nothing wrong here that I can see. And I am not a Razer fan; I'd normally be the first hater on the block. But they're impressing me with the Razer Blade and Core, and I'm taking a second look for sure.

    Before the Core's launch, I expected a small gap between the Core and desktop builds as well (KillerFry). I already knew about the 4GB/sec of bandwidth vs. a full PCIE 3.0 x16 link. Roughly speaking, this is the breakdown:

    PCIE 3.0 x16: 15.75GB/sec (Z170, so your i7-6700Ks)
    PCIE 2.0 x16: 8GB/sec
    Thunderbolt 3: 5GB/sec (yay!)
    PCIE 3.0 x4: 4GB/sec (this is your PCIE bus link for NVME M.2 SSDs)
    PCIE 2.0 x8: 4GB/sec (I think historically the vast majority of SLI rigs ran on dual PCIE 2.0 x8 links)
    PCIE 1.0 x16: 4GB/sec

    There are various system overheads in all implementations, so I usually quote the Razer Core as having 4GB/sec. That's full-duplex (for those old enough here to remember dial-up modems in the late 80s/early 90s), meaning 4GB/sec each way simultaneously.
    That's why Razer can let you hook the Core up to your laptop and still use your laptop's screen. For anyone who wants to check the theoretical numbers above, there's a quick sketch below.
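
    It's all derived from lanes x transfer rate x encoding efficiency (PCIE 1.0/2.0 use 8b/10b encoding, 3.0 uses 128b/130b). A minimal Python sketch, theoretical per-direction numbers only:

    Code:
    # GT/s per lane and encoding efficiency for each PCIe generation
    GEN = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}

    def pcie_gb_s(gen, lanes):
        gt, eff = GEN[gen]
        return gt * eff * lanes / 8  # gigatransfers/sec -> GB/sec per direction

    print(f"3.0 x16: {pcie_gb_s(3, 16):.2f} GB/sec")  # ~15.75
    print(f"2.0 x16: {pcie_gb_s(2, 16):.2f} GB/sec")  # 8.00
    print(f"3.0 x4:  {pcie_gb_s(3, 4):.2f} GB/sec")   # ~3.94, the Core's link
    print(f"1.0 x16: {pcie_gb_s(1, 16):.2f} GB/sec")  # 4.00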

    Saturating 4GB/sec full-duplex is a tough task for a GPU unless you do it on purpose (with a synthetic benchmark, as opposed to games). There's a lot of texture compression going on, even color compression now, and other driver optimizations that have been made over the years.

    This was really put in place to allow NV/AMD to use cheaper memory and cheaper memory buses (like 128-bit vs. 256-bit), keep the prices the same, and make more money. But that drive for efficiency ensures that Thunderbolt will never have a problem with bandwidth. I'm sure a 1080 Ti will also work perfectly in a Razer Core next year, too.

    GPU bus bandwidth hasn't been a problem since PCIE 1.0 x16 was released back in 2004.

    I made the mistake of being an early adopter of SLI back then, when I paid $850 (USD) for two 6800GTs in SLI. AGP was still dominant at the time, and SLI was a joke (few recommend SLI today either, as everyone eventually figured out it wasn't as good an idea as a single high-end GTX 1080 or similar). Sometimes early adoption pays off, though: in 1996 I bought a new 3dfx Voodoo 4MB card, and it was worth every cent of its $299. I think the Core is as big a product launch as something like the 3dfx Voodoo.
    Looking back in 10 years, we'll see how the Core changed everything. Kudos to Razer.

    Not to derail the thread, but I wanted to add the PCIE vs. Thunderbolt bandwidth comparisons. My earlier links show that for 12 years now, bandwidth for GPUs has been a solved problem; PCIE 3.0 x16 is dramatic overkill for GPUs, at least. Part of the reason is people like me, who now use Intel Iris Pro 580 graphics (roughly a GeForce 750 in performance) in an Intel Skull Canyon NUC: game engines and developers can't go too crazy streaming textures with lots of us Intel graphics users out there, among all the other lower-end and older cards in use today.

    You gain some by using PCIE 3.0 x16, but only a few percent, never over 5%, and that's meaningless because it only shows up at 1080P or lower in situations where you're already at 150FPS+. As mentioned by many here, the gap is nonexistent at 2K ultrawide or 4K.
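
    One way to see why the gap shrinks at higher resolutions: treat the link cost as a roughly fixed per-frame overhead. This is a toy model with a made-up 0.7ms figure, not a measurement, but it shows the shape of it - the longer each frame takes, the smaller the link's share:

    Code:
    OVERHEAD_MS = 0.7  # hypothetical fixed per-frame link cost, not measured

    for fps in (150, 60, 40):  # think 1080P, 1440P, and 4K-ish workloads
        frame_ms = 1000 / fps
        loss = OVERHEAD_MS / (frame_ms + OVERHEAD_MS)
        print(f"{fps:>3} fps -> ~{loss:.1%} lost to the link")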
     
    Last edited: Jul 21, 2016
  10. I see it's honestly useless to post on here anymore, since it's not generating any useful discussion.

    - I'm married with children (32)
    - I've been building (and overclocking) PCs since 1998
    - I've been in the IT industry professionally for 15 years
    - "Fun" money isn't a problem for buying toys; I mean, the Core honestly is a toy (along with all these high-dollar parts)
    - I bench and test components all the time; I understand how it all works
    - I'm simply trying to understand why two products based on a PCIe 3.0 x4 link have such a vast performance delta between them (possibly PCIe vs. TB3 connection / overhead)
    - BuckRogers, you obviously skipped my direct comparison that put two laptops side by side and pulled a 22% difference in GPU score; the arithmetic is spelled out below. The difference is real. (Where you came up with ATX, I do not know.)

    Obviously we are making mountains out of molehills and getting no real discussion on benchmarks and performance. I noted the same latency / overhead issue from AnandTech in the past, the one that could cause a 10% loss; I'm seeing that now, or more, and I'm just curious whether this is standard fare or not (we won't know until more TB3 eGPU enclosures launch).
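
    For what it's worth, the 22% is just the relative delta between the two GPU scores. Placeholder numbers here, not the actual results from the two laptops:

    Code:
    amp_gpu_score = 20000   # hypothetical Alienware + Amp + 1080 GPU score
    core_gpu_score = 16400  # hypothetical Blade + Core + 1080 GPU score

    delta = (amp_gpu_score - core_gpu_score) / core_gpu_score
    print(f"Amp leads by {delta:.0%}")  # ~22%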
     
    Last edited: Jul 21, 2016
  11. BuckRogers_no_id

    BuckRogers_no_id New Member

    Just different implementations, with a lot of moving parts and complexity on a new frontier. Check benchmarks on both and pick whichever suits you better (Amp, Core, or ITX build). I'm personally with the Core, from everything I've read here and elsewhere online. If you offload the Core for cheap, hit me up. :D You already accepted that Razer deserves time to fine-tune, but even if it stays as it is, I'd say it's a worthy purchase.

    Remember, there's no refund coming even if the gap doesn't exist in competitors' products. Unless it's low-hanging fruit to fix (it's not), they aren't going to "fix" it either.

    Me too, over a somewhat longer timespan, though I'm a former game developer. I've worked on some games you've probably played, so I have a pretty good understanding of what's going on here with the Core. I'm still a full-time dev today, just no more games. :)

    Well, yeah, what did you expect? We can't "fix" something that isn't broken. It works right; it's just not going to be exactly a desktop machine. I really don't see anything quantifiable here that can be fixed.

    You could gather evidence as Deathalo suggested, but the end result is a lot of serious time investment and engineering by Razer to improve anything. They may update it if they can, or may wait till the next release. I'd bet they'll do everything they can; it's just a non-trivial product.

    The fact that Linus unplugged the Core while the system was running and it didn't bluescreen impresses me all on its own. I'm glad I didn't have to work on the team that engineered the Core at Razer. Poor souls, getting whipped to make something like this work up to everyone's extremely high standards.
     
    Last edited: Jul 21, 2016
  12. - BuckRogers, you obviously skipped my direct comparison that put two laptops side by side and pulled a 22% difference in GPU score. The difference is real. (Where you came up with ATX, I do not know.)
    Say what you want about "synthetic" benchmarks, but a 22% GPU score difference is huge; it does equate to a gaming performance difference no matter how you skin it.
     
  13. BuckRogers_no_id

    BuckRogers_no_id New Member

    Then buy the Alienware. Yeesh. I'd love to see a TWENTY-TWO PERCENT difference in a real game. Just 1 game, please. Won't happen unless you're at 640x480.
     
  14. Deathalo

    Deathalo Active Member

    Look, you presented some theories, but when asked to provide more concrete evidence, you just said to trust you and promised your benchmarks matched real-world gaming results. That's just not going to cut it, and you should understand that given your background. If you're going to come complain and raise issues without actually comparing numbers from real-world scenarios, then we have nothing to discuss.
     
    BuckRogers likes this.
  15. BuckRogers_no_id

    BuckRogers_no_id New Member

    Watch this.

    He has a Skull Canyon NUC and a Razer Core with a 980 Ti, and he benchmarks it, including 3DMark from 1080P to 4K. You don't see him complaining. Enjoy your system.
     
  16. Deathalo

    Deathalo Active Member

    That actually makes me want to get a NUC and hook it up to my TV. Then I could use the Core with both my laptop and the NUC, and use the NUC as a sort of media player with Plex as well...
     
  17. BuckRogers_no_id

    BuckRogers_no_id New Member

    I can confirm Skull Canyon is awesome. I have mine hooked up to three thin-bezel LCDs (Dell U2414Hs) in a daisy-chained DisplayPort config. It's fast; I put 32GB of DDR4-3000 in mine. I'm using an old SSD right now but will be putting in a 1TB Samsung 960 Pro NVME M.2 when it lands later this year.
    I have my NUC mounted to the back of my LCD too, so it's a very clean setup.

    I ain't skeered of this Razer Core stuff; all this FUD doesn't bother me a bit. If I get the itch to pick up a GTX 1080, I'll drop $500 on a Razer Core in a heartbeat. That thing is a revolution; can't believe people are b'ching about it. Razer would need to flipping stand on their HEADS to please some people. Amazing, really.

    One thing the Core enables: you could put the NUC and Core at your desk, then either run a 50' HDMI to your TV or hook up a Steam Link at your TV for streaming, and plug your laptop into the Core when you have it there. I'd probably recommend a Steam Link; it's $50 and simplifies things a bit with controls/remotes.

    The NUC is powerful enough that any streaming or basic gaming you want to do with a Steam Link or straight HDMI will work pretty well (unless you're at 4K). If you're at 4K, the NUC will definitely need the Core/GTX 1080.

    NUCs (pronounced NUKE, for those who don't know) and laptops are definitely the future; it's inevitable with the mainstream desktop market collapsing. These mini-PCs and laptops like Razer's are going to take over for sure, even if people don't like it.
     
  18. Deathalo

    Deathalo Active Member

    I was actually thinking of getting an Nvidia Shield for the living room, but the Skull Canyon may be even more tempting. Hmmmm, decisions, decisions...
     
  19. BuckRogers_no_id

    BuckRogers_no_id New Member

    I like the NV Shield, but Skull Canyon is more versatile. I figured it was ideal for my next system: I can plug in the Core, or not. I can repurpose it as a pure HTPC. It's great as a shared home computer. Anyone would be happy to take it off your hands down the road; I can't imagine a better point-of-sale machine or high-end digital signage. No one wants some kid's old ATX/ITX desktop that's crashing and BSODing, but NUCs are easy sells. These things are the future; this one just isn't cheap. My full setup is $1350, not including monitors, which actually isn't that bad, but people will complain about anything.

    It's not the best bang for the buck, but I'm not a bang-for-the-buck guy. I buy what I want/need. If the Iris Pro 580 does what I need, great. If I need a GTX 1080, I'll buy that. I don't look at frames-per-dollar charts and then buy the best option there. I think "price/performance" is a mental illness. I don't get it.
     
    Last edited: Jul 21, 2016
  20. Deathalo

    Deathalo Active Member

    Aren't they only like 600 bucks? That seems pretty reasonable to me, and of course you can add on the Core for more graphics power, but it should still be able to play 4K video on its own, correct?
     