Razer Core - $399 for Blade users/$499 standalone

Discussion in 'Systems' started by Min-Liang Tan, Mar 16, 2016.

Thread Status:
Not open for further replies.
  1. mmocarski

    mmocarski Active Member

    So, does anyone actually know how much bandwidth the card requires under full load? Until we know that, everything anyone says is little more than a guess.

    Is it likely to be bottlenecked to some extent? Yes, but if that only results in a 5-10% decrease in performance, the net performance is still a huge leap over any of the Maxwell cards.
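    For rough context, a quick back-of-the-envelope comparison (a sketch with nominal figures only; the usable PCIe share of a TB3 link is commonly reported around 22 Gb/s rather than the full 40 Gb/s link rate):

        # Rough bandwidth comparison: PCIe 3.0 slots vs. a Thunderbolt 3 link.
        # Nominal figures; the ~22 Gb/s TB3 PCIe-tunnel value is a commonly
        # quoted estimate, not an official specification.

        PCIE3_PER_LANE_GBPS = 985 * 8 / 1000  # ~7.9 Gb/s usable per PCIe 3.0 lane

        links = {
            "PCIe 3.0 x16 (desktop slot)":          16 * PCIE3_PER_LANE_GBPS,
            "PCIe 3.0 x4":                           4 * PCIE3_PER_LANE_GBPS,
            "Thunderbolt 3 (raw link)":             40.0,
            "Thunderbolt 3 (PCIe tunnel, typical)": 22.0,
        }

        for name, gbps in links.items():
            print(f"{name:40s} {gbps:5.1f} Gb/s  (~{gbps / 8:.2f} GB/s)")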
     
  2. venturePinkLacespot810

    venturePinkLacespot810 Active Member

    Well, that's... unfortunate.
     
  3. campfrontBeaver226

    campfrontBeaver226 New Member

    Will this have its own power button to power on and off your computer? One annoyance about hooking up my laptops to monitors, keyboards, etc., is having to open and close my laptop lid every time I reboot.
     
  4. SirThomasKrown

    SirThomasKrown Active Member

    No. There hasn't been anything mentioned or shown about that. I'd assume it's just a power switch for the Core itself.
     
  5. PastelGraywatchsharp937

    PastelGraywatchsharp937 Active Member

    No, but some (Dell) laptops have an option to turn on when docked to a TB3 port, or to turn on when receiving AC power.

     
  6. MarcusChai

    MarcusChai Active Member

    I think you have to open the lid every time to power on, but you don't need to open it to shut down.
     
    SirThomasKrown likes this.
  7. SirThomasKrown

    SirThomasKrown Active Member

    Really? I wasn't aware of that.

    @DouglasHK Maybe that will be included as an option in XConnect?
     
    Last edited by a moderator: May 13, 2016
  8. TrueLightbringer

    TrueLightbringer Active Member

    Correct me if I'm wrong, but isn't XConnect just AMD's term for their external graphics card technology? I mean, it's not like Nvidia doesn't have it; they just haven't given it a fancy name like AMD has. Why is everybody using 'XConnect' as an interchangeable term for eGPU tech when it likely has nothing to do with Nvidia (and thus most of the cards that people here are going to get)?
     
  9. JETcoolCoolBlack681

    JETcoolCoolBlack681 Active Member

    Your question is invalid. The GPU is a separate processing unit; it doesn't need any bandwidth at all to run something at full speed. However, the CPU pushes textures (data) and shaders (subprograms) back and forth over that TB3 link, so it becomes as much of a bottleneck as game developers let it be in their code.
     
  10. mmocarski

    mmocarski Active Member

    The question is not invalid; it's well known that 16 lanes of PCIe 3.0 is far more bandwidth than the CPU needs to push data to the GPU for processing. The question is: if 16 lanes is far more than necessary, how limiting are the 4 lanes available over TB3?

    I'm well aware that the GPU will always run at full speed, but that doesn't matter if it needs, say, 6 lanes of PCIe to be fully utilized and it only has 4. That's what we're trying to figure out here.
     
  11. JETcoolCoolBlack681

    JETcoolCoolBlack681 Active Member

    "Too much" or "too few" is not a question neither it is an answer. It can be an opinion, though.
    No, GPU has PU (processing units), it has a memory and an internal bus. It can be fully utilized with very minor usage of PCI.
     
  12. Derek712

    Derek712 Active Member

    If you want to test it out, can't you change your motherboard settings on your desktop to force the slot to x4 and see what happens?
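    If you do try it, one way to confirm the slot actually negotiated x4 is to query the driver. A minimal sketch, assuming an Nvidia card with nvidia-smi available on the PATH:

        # Check the negotiated PCIe link on an Nvidia card (sketch).
        import subprocess

        fields = ("pcie.link.gen.current,pcie.link.width.current,"
                  "pcie.link.gen.max,pcie.link.width.max")
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        gen_cur, width_cur, gen_max, width_max = [s.strip() for s in out.stdout.split(",")]
        print(f"Link running at PCIe gen {gen_cur} x{width_cur} "
              f"(card supports gen {gen_max} x{width_max})")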
     
  13. mmocarski

    mmocarski Active Member

    That's actually a good idea, and it could provide some useful numbers. Too bad no one has a GTX 1080 yet to run that type of comparison.

    Not sure I'm following here. It can be an answer: if the lanes are too few, the card will be throttled; if there are enough, it won't be. And that's the whole question we were originally trying to address. It's no different from a highway: put a heavy volume of cars on a two-lane highway and there may be a traffic jam; put that same volume on a four-lane highway and there wouldn't be.
     
  14. Derek712

    Derek712 Active Member

    Maybe not a 1080, but someone with a 980 or a Titan could try.
     
  15. mling001

    mling001 Active Member

    Does anybody know if the new 1080 will be too much for the bandwidth?
     
  16. Derek712

    Derek712 Active Member

    Everyone can speculate, but there's no way anyone could possibly know for sure until both are released or Razer says something.

    Heck, we don't even know for sure if the current-gen cards will be too much.
     
    mling001 likes this.
  17. ididntmemeto

    ididntmemeto Member

    There has been a lot of speculation regarding the hardware/bandwidth/bottlenecking of the TB3 interface. From my observations, it actually won't be as bad as some might think. eGPUs aren't a new concept; they have been around on computers using early Thunderbolt revisions, and TB3 is 4x faster than the original. The bottleneck I saw in benchmarks over TB1 was around 10-15 fps, and over TB2 around 5-10 fps, so I presume the bottleneck over TB3 will be negligible. The more important factors will be how well the game is optimized for the hardware (CPU/GPU architecture) and the hardware itself (mainly the CPU, as some games are more CPU-intensive than others).
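    For reference, a rough comparison of the nominal per-port link rates (a sketch; the actual PCIe throughput of each generation is lower than the advertised rate):

        # Nominal Thunderbolt link rates vs. approximate PCIe 3.0 lane equivalents.
        PCIE3_LANE_GBPS = 7.88  # ~985 MB/s usable per PCIe 3.0 lane

        thunderbolt = {"TB1": 10.0, "TB2": 20.0, "TB3": 40.0}

        for gen, gbps in thunderbolt.items():
            lanes = gbps / PCIE3_LANE_GBPS
            print(f"{gen}: {gbps:4.1f} Gb/s  (~{lanes:.1f} PCIe 3.0 lanes)")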

    TLDR, we shouldn't be worrying about TB3.

    NOW PLEASE LET ME JUST GIVE YOU MY MONEY RAZER!
     
    Demthios likes this.
  18. Demthios

    Demthios New Member

    I'm getting ready to pull the trigger on a Stealth; I'm just debating which screen to get, but I'm pretty positive I'll be getting the Core as well. I'll probably pick up the 1080 for my desktop and move my 290X over to the Core.
     
  19. GoneDrinkin

    GoneDrinkin Active Member

    Just wondering, why would you need a Core for your Stealth then, when you'll be gaming on your desktop? I assume the Stealth is for when you're out, but at home, wouldn't you use the desktop?
     
  20. Demthios

    Demthios New Member

    We only have one desktop in the house and just sold off an Asus gaming laptop that was too big for everyday use. When Crowfall comes out, my wife and I will want to play together; this way we'll have two decent units to choose from. Ultimately I'd like to hook it up to our TV in the living room and grab a Turret to use when playing out there, as sometimes she likes to backseat-watch while I play PC games.
     