Transform your laptop with the Razer Core X

Discussion in 'Systems' started by Razer.WolfPack, May 22, 2018.

Thread Status:
Not open for further replies.
  1. czesio007p

    czesio007p New Member

    Oh my god ....
     
    vampirivan likes this.
  2. Dark-Rider

    Dark-Rider New Member

    Hello all,

Sorry to bring the bad news, but the Razer Core V2 is still not the revolutionary product we were all waiting for.

I own a 2016 Razer Blade 14 with a GTX 1060 and wanted to eventually get a Razer Core to increase performance. In my case it is a total fail:
- The USB-C connection acts as a bottleneck, hurting the performance of any external graphics card, even the most advanced ones (Blade 14 + GTX 1060 = more fps than Blade 14 + Core + GTX 1070, or in some cases even a 1080 Ti)

- The Razer Core acts as a USB-C hub that includes a Gigabit network adapter and lets you connect an external monitor (via the graphics card, or DisplayPort). So if you already use a USB-C hub with a DisplayPort output that supports 120 Hz, the Razer Core becomes redundant, just a noisy add-on to your desk

- The USB-C cable of the Core V2 is 50 cm long (to support the 40 Gbps data transfer rate), so you will have to keep the Razer Core on your desk to use it (= fan noise), and you will have to face its ugliest and noisiest part (the back) because of this short cable

- From a connectivity perspective, I will just say that the Core X is the one you should never buy (it has no external USB ports and no Ethernet port).
The Core V1 is a bit old and requires software to switch from the embedded graphics card to the external one.
The Core V2 does this more smoothly, without touching any software (hot plug)

Basically, the Razer Core (V2, that is) is (very) good ONLY with a laptop that has USB-C and a basic graphics card (for instance the Blade Stealth with an Intel GPU), or any laptop without a proper 3D-accelerated GPU, for instance one with an old NVIDIA card.
(Side note: I still struggle to imagine a laptop with a Thunderbolt connector and an "old" GPU, but that is a different topic...)

If your laptop has a decent 3D card (GTX 1060 and above), then using the Razer Core is a step down for your setup: fps will drop and the operating noise level will go up.

If you want to improve your gaming performance, you will need to buy a more recent Razer Blade (gen 7) or eventually switch to a more conventional gaming desktop.

    D8
     
    Last edited: Dec 3, 2018
  3. Joikansai

    Joikansai Well-Known Member VANGUARD

Yes, the Razer Core was basically made to boost laptops without a dGPU, i.e. ultrabooks. But keep in mind there are some advantages to pairing it with a gaming laptop. To me, the advantages are:
- CPU temperature improvement of up to 20 °C, which may extend your Blade's lifespan; my 2017 Blade spent almost half of its usage time on a Razer Core, and I sold it in mint condition.
- Triple-monitor gaming: a normal Blade has only one or two ports for external monitors. You can use a dongle, but some dongles reduce or fail to maximize performance.
- 4K gaming: with a 1080 Ti or a more powerful card you get access to this, which is impossible for the most powerful Blade GPU at the moment, the 1070 Max-Q.
It may be an expensive upgrade, but a Core is worth having with a gaming laptop: you still have portability plus power, and at home you can add some juice and extra CPU cooling, which is awesome imo.
Unfortunately my RTX card is still on the way, so my Core sits empty for now. I believe with it I can get more performance out of my 144 Hz 1440p ultrawide Predator via Surround; with the 1070 Max-Q alone I get only around 60 fps on high, with CPU temperatures in the high 80s. (screenshot attached)
     
    PhrostByt likes this.
  4. Dark-Rider

    Dark-Rider New Member

    Hello Joikansai
Thank you for your reply! I am very curious to hear your feedback on the Razer Core V2 once you have your graphics card in it.

Don't get me wrong, I am a Razer fan, despite having had issues with two mouse cords (Orochi and Diamondback), and despite Razer Synapse not recognizing my Orbweaver if I unplug and replug the gamepad's USB cable. Or even just the fact that my Diamondback is incapable of retaining its Chroma configuration for more than 10 minutes: it disconnects or reconnects and switches back to the factory-default DPI and spectrum cycling whenever it wants...

My point is, there is still room for improvement (a lot of it) in Razer products, and the Razer Core is no exception.
Imo, it is important for the audience to understand that purchasing the Razer Core will not "just solve everything" the moment you plug the Thunderbolt cable into your laptop.
It will reduce laptop heat, act as a USB-C hub and let you drive multiple screens, yes, but that is really all the added value I see in a Razer Core. Multiple screens in high resolution I would like to see with my own eyes :smile_:

For my Blade 14, I would want a Core (V3?) that supports the following:
- wide screens (> 34") at a 140 Hz refresh rate in 4K
- room for not one but two (or even three) graphics cards, as I seriously doubt you can run three screens at 140 Hz in 4K on a single graphics card
- or alternatively, a Core that can combine the processing power of the internal graphics card with the external one
- a more practical design with a decent USB-C cable length, so I can put the Core at the side of my desk and still get 40 Gbps of data transfer (the 50 cm cable is one big issue for me; I believe you will discover this "limitation" soon enough)
- a noiseless enclosure (I mean really silent, like a modern desktop PC; the Core is very noisy compared to my current setup, and for that price I expect noise to be reduced to the max)
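To put the multi-screen wishlist above in perspective, here is a rough back-of-the-envelope sketch of the raw video bandwidth involved (my own numbers, assuming 24-bit colour and ignoring blanking intervals and compression such as DSC):

```python
# Raw, uncompressed pixel bandwidth for the screens discussed above.
# Assumptions: 24 bits per pixel, no blanking overhead, no DSC compression.

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel bandwidth in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

one_screen = video_gbps(3840, 2160, 140)    # one 4K screen at 140 Hz
print(f"One 4K screen @ 140 Hz : {one_screen:.1f} Gbit/s")     # ~27.9
print(f"Three such screens     : {3 * one_screen:.1f} Gbit/s")  # ~83.6
print("For comparison, a DisplayPort 1.4 HBR3 link carries ~25.9 Gbit/s of payload")
```

Even one uncompressed 4K 140 Hz stream already exceeds a single DP 1.4 link, which is why driving three of them from one card, over one cable, is unrealistic without compression.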

    Cheers
    D8
     
    Last edited: Dec 4, 2018
    PhrostByt and Joikansai like this.
  5. Joikansai

    Joikansai Well-Known Member VANGUARD

Unfortunately eGPU isn't there yet for 4K at 144 Hz; even the most powerful single-GPU desktop setup, with an RTX 2080 Ti, will struggle to do that on a single monitor in some titles. I don't know yet about the rumored new Titan RTX though.
eGPUs were aimed at laptops without a dGPU, like ultrabooks, to give them gaming performance; for example, I can "abracadabra" ;) my old Stealth 2016 into a Blade 14 2017, in terms of performance, with a 1080 Ti in the Core.
Yes, there is a lot of improvement needed on the Razer Core, especially the V2 with its loud PSU fan noise; on the Core X, users can easily change the fans and PSU if they don't like the sound, since it uses standard parts.
There are already longer full-bandwidth TB3 cables, like this one. (For eGPU graphics data the usable bandwidth is actually 22 to 28 Gbps; Intel limited it somehow, imo to reserve bandwidth for other peripherals. You can check it with CUDA-Z.) There are also other eGPU enclosures that are silent, but not with a 500 W or larger PSU, which means they can't drive a high-end GPU well.
If you use it for CUDA work, both the laptop's dGPU and the GPU in the enclosure can work together, but not in games as SLI.
TB3 is still limited to a maximum of 4 lanes at the moment. You would probably need a real full-desktop PCIe 3.0 x16 eGPU link for that. I have seen some enthusiasts connect directly to internal parts of the laptop, like the ExpressCard slot, losing portability as a trade-off, since the bottom lid stays open and it can't simply be unplugged like a normal USB port, and of course losing whatever was connected to that slot before, like the wireless card. Some other big brands also make their own GPU connectors that are almost the same as desktop PCIe, but you need to reboot the laptop every time you want to switch GPUs, which to me is a deal breaker compared to TB3's plug-and-play.
     
    PhrostByt and Dark-Rider like this.
  6. Dark-Rider

    Dark-Rider New Member

Joikansai, we definitely understand each other! Even on magic spells :wink_:

Thx for the 2 m TB3 cable link :smile_: While 80 USD is very expensive for a cable, that is nothing compared to a Titan RTX graphics card, which today goes for 2700.00 euros apiece :cool_:
Adding 500.00 euros for a Razer Core, we are talking about 3300.00 euros for an eGPU (including the TB3 cable above), with still no screen(s) included.
Alternatively, the same setup with a 2080 Ti will cost you about 1800.00 euros.
That is getting very, very pricey for gaming! I am not that rich! :rolleyes:
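For what it's worth, here is that arithmetic laid out as a sketch. Prices are the ones quoted in this thread and will drift over time; the 2080 Ti card price is inferred from the ~1800 euro total, not a quoted figure:

```python
# Price comparison of the two eGPU builds discussed above (EUR, late 2018).
enclosure = 500.0     # Razer Core
tb3_cable = 80.0      # 2 m full-bandwidth TB3 cable (quoted in USD, treated as EUR)

titan_rtx = 2700.0
rtx_2080ti = 1220.0   # inferred from the ~1800 EUR total quoted above

titan_build = titan_rtx + enclosure + tb3_cable
ti_build = rtx_2080ti + enclosure + tb3_cable

print(f"Titan RTX build : {titan_build:.0f} EUR")            # 3280
print(f"2080 Ti build   : {ti_build:.0f} EUR")               # 1800
print(f"Premium for the Titan: {titan_build - ti_build:.0f} EUR")
```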
     
    PhrostByt likes this.
  7. Dark-Rider

    Dark-Rider New Member

A small detail that has its importance:
The Razer Core V1 and V2 have a 500 W PSU, and that is a bit short on power for installing, for instance, a GTX 1080 Ti, whose recommended system power is 600 W; the same goes for the 2080 Ti.

On the bright side, the Razer Core X seems to be the only eGPU case that supports these top-of-the-line cards, as it is equipped with a 650 W PSU, pretty much enough for any kind of power-hungry graphics card for now.

That implies that owners of a Razer Core V1 or V2 would have to upgrade the PSU (from a Razer Core X?) if they want to install a GTX 1080 Ti or above.
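As a sketch of that compatibility check (the wattages here are the vendor-recommended system PSU ratings, which include host overhead, not the cards' board power alone; treat them as assumptions):

```python
# Hedged sketch: does an enclosure's PSU meet a card's recommended system power?
# "Recommended" figures are NVIDIA's published PSU guidance for these cards.

CARDS = {                 # card -> recommended system PSU (W)
    "GTX 1080 Ti": 600,
    "RTX 2080 Ti": 650,
}
ENCLOSURES = {            # enclosure -> built-in PSU (W)
    "Core V1/V2": 500,
    "Core X": 650,
}

for card, needed in CARDS.items():
    for box, psu in ENCLOSURES.items():
        verdict = "OK" if psu >= needed else "underpowered"
        print(f"{card} in {box}: {psu} W vs {needed} W recommended -> {verdict}")
```

Note that the enclosure also charges the laptop over TB3, which eats further into the power budget, so even an "OK" verdict leaves less headroom than the raw numbers suggest.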

    D8
     
    PhrostByt likes this.
  8. l4v4l4mp3

    l4v4l4mp3 New Member

Just want to agree with @Dark-Rider after owning my new Razer Blade 15 Advanced with a 1070 and now adding a Core X with an RTX 2080. Gaming performance is lower with my 2080 than with my 1070. It seems like 40 Gbit/s = 5 GB/s through TB3 really is a bottleneck compared to 15.8 GB/s for PCIe 3.0 x16, and even more for newer versions.

So you will end up with less performance from your Razer Core if you have something like a 1060 or better installed in your main system.

That makes the Core more or less worthless. Sad but true.
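The numbers behind that comparison can be reproduced from the link rates. This is a sketch; the 22-28 Gbps figure for usable eGPU traffic is what tools like CUDA-Z tend to report in practice, not something this calculation derives:

```python
# TB3 vs PCIe 3.0 bandwidth, from first principles.
# TB3's 40 Gbit/s is the total link rate; PCIe traffic tunnelled inside it
# is capped at x4 lanes and shares the link with other protocols.

tb3_link_GBps = 40 / 8                # 5.0 GB/s total link rate

# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
lane_GBps = 8 * (128 / 130) / 8       # ~0.985 GB/s per lane
pcie3_x4 = 4 * lane_GBps              # ~3.9 GB/s (the most TB3 can tunnel)
pcie3_x16 = 16 * lane_GBps            # ~15.8 GB/s (a desktop slot)

print(f"TB3 link     : {tb3_link_GBps:.2f} GB/s")
print(f"PCIe 3.0 x4  : {pcie3_x4:.2f} GB/s")
print(f"PCIe 3.0 x16 : {pcie3_x16:.2f} GB/s")
```

So even before protocol overhead, an eGPU sees roughly a quarter of the host bandwidth a desktop card gets, which matches the fps regressions reported in this thread.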

Adding two example screenshots of the Forza Horizon 4 benchmark.

(screenshots attached: 1070 vs 2080)
     
    Dark-Rider likes this.
  9. Joikansai

    Joikansai Well-Known Member VANGUARD

It's not only the Razer Core; you should blame Intel for limiting TB3 bandwidth in eGPU settings. It's not even 30 Gbps; it's max 28 with full TB3 PCIe 3.0 x4 lanes. So don't compare it with a desktop or a direct PCIe 3.0 x16 link. Another thing to add: at 1080p the bottleneck is even higher. If you have time you can read this. You can check the bandwidth with CUDA-Z: look at the performance tab, host-to-device and device-to-host bandwidth. Blame Intel! ;) The rest of the bandwidth, I think (but am not sure), is allocated to other things like peripherals.
High FPS is also the eGPU's big enemy, and there are games that render badly through TB3, like Forza Horizon and PUBG in my experience; correct me if I'm wrong though. For those titles I would play on the dGPU; they still play well on the 1070 Max-Q at 1440p high.
(screenshot attached)
With the RTX 2070 at 1440p it's sharper but loses a lot of FPS. (screenshot attached)

Another eGPU advantage on laptops with a dGPU is improved CPU thermals, since the fans can give more attention to the CPU while the dGPU is off. For example, here is a CPU thermal comparison between dGPU and eGPU in gaming mode (better performance), around 15 minutes of Forza Horizon 4 benchmarks. dGPU... ouch ;)
dGPU: (screenshot attached)
eGPU: (screenshot attached)

So it depends on you: less heat for a longer Blade lifespan, or better performance for a hotter machine :big_grin_:
I personally split the two: for open-world games I use the eGPU, and for esports and racing the dGPU. This setup keeps my Blade from the last 14" generation rocking like a champ :cool_:
Btw, I forgot to mention that the 1070 can't run ultrawide 5120x1440 well; it's only around 30 fps, or worse, compared to the 2070, which goes from the high 40s to a maximum of 60 fps. With a 2080 I think it'll be better.
(screenshots attached)
Cheers :smile_:
     
  10. l4v4l4mp3

    l4v4l4mp3 New Member

I am not directly blaming Razer here; I am more a bit sad about the current situation. The only thing I might complain about is the transparency around these facts, whether from Intel or Razer. They are happy to sell the Core to people who do not need it. And by "need" I just mean "want", knowing that no one is really dying without it :D

So yeah, the performance gain might be better at 4K and in other games.

But on the one hand I am using a C49HG90DMU, so more or less double FHD, or half 4K, roughly equal to 1440p, and I am not swapping that out soon.

And on the other hand I had really bad experiences with BF V and SotTR, too. (Pro: it makes not much difference whether raytracing is enabled or disabled.)

All in all, I am just expecting a bit more from a setup costing over 2k for the Blade, nearly 1k for a bigger SSD and a memory swap-out (both not directly impacting the performance issue described here […]), and over another 1k for the Core + GPU.

So I will send the Core and GPU back and just go with the Blade. With the savings I will be able to simply swap out the Blade once something new is on the market (maybe a new Blade with a 2080, or at least a 70, by the middle of the year, so the Blade has no need to stay alive for years) :D

But yeah, no reason to hate Razer or Intel; I just want to clarify this again, because if I had known all the facts upfront, it would have stopped me from buying the Core. But again, I am happy that I got the Blade 15 Advanced and not a Stealth with the Core ;)

(screenshot attached)
     
    Last edited: Jan 4, 2019
    Joikansai likes this.
  11. Dark-Rider

    Dark-Rider New Member

    Hello Joikansai

    First of all, Happy new year 2019 to you, and all forum members

Now back to the subject. I don't think it is a question of blaming Intel or Razer. In my mind both are actually responsible, as @l4v4l4mp3 said, for the lack of transparency, in order to sell you something you don't need.

Now, the argument that overheating impacts the laptop's lifespan is highly questionable. For instance, I don't overclock and I use my laptop within the boundaries of "normal usage", with a little plus: my laptop stays stationary on my desk 11 months out of 12. Hence there is no reason for my laptop to "break" or fall.
If heat implied a shorter laptop lifespan, then we Blade owners would have a serious problem, but I doubt that.

(With regard to heat and GPU questions, I recommend reading this to understand that GPUs used for mining don't see a drop in performance even while generating heat 24/7 for a long period of time.)

Side note: I thought the 2080 could be a game changer for this setup, but, thanks to l4v4l4mp3 for trying it, we now know that is not the case (and I am not even that surprised).

    Cheers,
    D8
     
    Last edited: Jan 4, 2019
  12. Joikansai

    Joikansai Well-Known Member VANGUARD

Yes, Happy New Year :tada: to you too.
If you can keep monitoring your laptop without an eGPU, it's fine having a Blade; keep in mind it's a gaming laptop in a MacBook-sized body. Heat is the worst enemy in this category of laptop.
For most people who are aiming for FPS, a 1080 Ti or 2080 isn't a game changer if you have a decent GPU in your laptop, but it is for some who want a better resolution, like 4K gaming, which otherwise couldn't be done well. I did that on my previous Blade 14, playing Rise of the Tomb Raider in 4K with a 1080 Ti, which was unplayable on the 1060 dGPU. That said, I also play a lot of esports, so my gaming usage was around 50/50 between dGPU and eGPU. As a result I could sell it in mint condition, like day one, even though I used it outside as well.
Heat generated inside may wear the other parts as well, so there is also a luck factor in whether you get great parts.
It is similar with GPUs used for mining: from my understanding, using SoftMiner recently, it runs at 100% GPU usage, like running a phone at 100% battery; the phone will be fine, but battery performance will degrade.
Anyway, to me the eGPU isn't a game changer either, but it is for some users. Take a look at egpu.io; a lot of Mac users have become enthusiasts about this, since they can finally game on their machines, and as an ex-MacBook user I can really understand that: gaming on a Mac was really painful in terms of performance and heat. To me, on the Blade 15, the eGPU is a gaming revolution, especially with an RTX card; the reflections are just natural, and it can be done on a laptop. I played the last night stories (for online competitive I prefer over 70 fps) on BFV; it just works ;) around 40 fps at 1440p medium. I think with a 2080 or higher it'll be better. On the Stealth, with its weaker CPU, it's totally unplayable, laggy.
     
  13. l4v4l4mp3

    l4v4l4mp3 New Member

Yep, I agree with you too (at least partly).

For Mac users, or people with just an ultrabook without a mid- to high-end dGPU, a Core and eGPUs in general make sense and bring performance. But with a 1070, at least for gaming, the gain is not much relative to the cost.

If you are rendering, mining or something like that, it might be different, because in those scenarios I guess the bandwidth is less of a bottleneck than for high-fps gaming.

So just adding a Core with an RTX card for medium raytracing might be fun, but it is not worth it (at least for me).

So I am saving the money and waiting for RTX cards in a new Blade or something like that,
since I doubt that even TB4, whenever it becomes available, will deliver the performance we need for high fps.

Just for reference, here is the answer from Razer to my question about any limiting factor when using a high-priced GPU in the Razer Core.

    Greetings from Razer!

    We understand your concern with regard to the performance of your Razer core X. We apologize for the inconvenience that the issue may have caused you.

    With regard to your question. The Razer core X does not limit eGPU. It also varies on the card. You can actually check the link below.

    https://support.razer.com/gaming-laptops/razer-core-x/

    Please feel free to ask if you have any further questions, Have a good day!
     
  14. Dark-Rider

    Dark-Rider New Member

    Hello Joikansai, l4v4l4mp3

Apologies, but none of us three uses a MacBook, and comparing GPU usage to battery usage doesn't make any sense to me.

The only benefits I see in using an external GPU box are:
- using one single TB3 cable to carry everything (charging, video, mouse, gamepad, keyboard, etc.); valid with a Core V1 or V2 but not the Core X
- moving heat generation from the dGPU to an external GPU box (in connection with the next point)
- using very large screen(s) connected directly to the back of the graphics card (as is the case for you, l4v4l4mp3)

I am myself looking for a solution to connect my Blade to an ultrawide screen, but it is annoying/concerning that an eGPU that costs a kidney (or an eye) will reduce overall gaming performance.
I expect at least the same level of performance, ideally far better.

So it makes sense to wait for a new generation of Blade 14 or 15 with a higher-end embedded GPU,
or alternatively, until then, to use an Omen Accelerator (far quieter than the Core) with a GTX 1080 Ti and a 700 W PSU enabling hot swap.

    Cheers,
    D8
     
  15. l4v4l4mp3

    l4v4l4mp3 New Member

Yeah, only one cable would be the dream. :]

But since I need an additional power cable, due to the high consumption of the new Blade 15 Advanced,

and since I wanted to save throughput anyway and not connect the mouse, NIC and so on through this one cable, I have already made my peace with several cables again.

But in summary, I guess we are now more or less on the same page, all three of us.

Just one question, since you bring up the Omen: I read a lot about it earlier and found many compatibility issues and so on.

But besides that, is there any advantage to the Omen over a Core?
Due to the TB3 connection, it will have the same bottleneck as the Core.
Also, you need to tweak the PSU to get more than 500 W.
     
  16. Dark-Rider

    Dark-Rider New Member

Indeed, I have also heard the Omen has some compatibility issues with some GPUs. I understand it seems to be connected to the firmware version of the Omen motherboard and to the space available to fit the card.

For me, the advantages of using the Omen:
- it is silent compared to the Razer Core(s)
- it contains a TB3 hub (like the Core V1 and V2), all connected through the single TB3 cable, as follows:
1. Gigabit Ethernet 10/100/1000
2. Thunderbolt 3
3. USB 3.1 Type-C™
4. 4 x USB 3.1 Type-A
- you can install an SSD inside; you can even boot from it, or just use it as storage for your games and data, and that is a big, big plus compared to the Core
- there is room inside for a big GPU (330 x 140 x 53 mm for the Omen vs 330 x 144 x 43 mm for the Core V1 and V2)
- the box is bigger than the Core (all versions), which is good in case the GPU produces a lot of heat, as RTX cards do

    PS: I like your ultra-wide 49" screen ! :)
     
    l4v4l4mp3 likes this.
  17. l4v4l4mp3

    l4v4l4mp3 New Member

If I had not gotten a good deal on the Core X, and had I needed additional storage (I don't, thanks to my NAS and the 2 TB Evo 970 in my Blade), I guess I would have chosen the Omen over the Core.

Those benefits are really good advantages besides the main aspect of GPU power.

PS: Thx! Yeah, I bought that screen with the money I saved by downgrading my Blade from the 4K to the 144 Hz model, plus a good Black Friday deal. So I re-invested the savings :D
But today Samsung announced the new model with 2x 1440p. Then again, more pixels cost more GPU power and will lower my frames again, so I think there is nothing to worry about that much :D
And sadly my spare One X cannot output an ultrawide resolution, only 16:9. So I can run both of my Xbox One Xs on the same screen, but that makes totally no sense :D
     
    Dark-Rider likes this.
  18. Joikansai

    Joikansai Well-Known Member VANGUARD

The HP Accelerator is for someone who has a big room ;) Design-wise it really takes up space. It has nice features and a nice price tag, but it often has compatibility issues with its internal parts, I/O ports etc.; you'll need to keep checking for firmware updates, I believe. At first it even had issues with HP's own laptops, the HP Spectre if I'm not wrong. So going with a same-brand setup would be better in terms of compatibility.
Concerning card heat and fan behavior, it depends on the GPU brand. An Asus Turbo 2080 Ti made my Core V2 scream, since its maximum GPU temperature is around 80 °C; a Zotac Mini 1080 Ti was also not quiet, but acceptable. But with an EVGA 2080 Gaming and a 2070 Black, the 2070 even at max OC, the gaming temperature, or even the mining temperature (SoftMiner) at 100% GPU usage, unbelievably never leaves the 60s. At idle with the Blade Stealth I was even shocked at how silent the PSU fan is, unlike before with the other cards. I believe EVGA now uses glass material for the RTX series, changing their metal aluminum design, because they know these cards run hotter than the Pascal series.
     
  19. l4v4l4mp3

    l4v4l4mp3 New Member

As expected, but a bit earlier than planned, I sent my stuff back today.
And Razer also announced the all-new Blade 15 2019.
So I will trade my laptop to a friend of mine and upgrade to the new top tier with the 2080 integrated...

I think the 1k I saved by returning the Core and the desktop 2080 will be more or less the amount I need to add for the upgrade :D
     
    Last edited: Jan 7, 2019
    Dark-Rider likes this.
  20. Joikansai

    Joikansai Well-Known Member VANGUARD

Another update on RTX eGPU (Core V2) usage: after almost a month of using an RTX 2070 Black as a happy user, last night it unfortunately died. Thanks to EVGA's well-known support, I got a ticket today and it is ready to ship for replacement.
Here are my experiences with RTX cards:
1. October to November, 3 weeks: 2080 Ti. Worked fine, but due to the build quality (for the price), high temperatures and the huge bottleneck compared to a desktop setup, I sold it.
2. November, 2 weeks: 2080 FE. Worked fine, great build quality (the best in my experience), but due to its active fans it was loud in my Core V2, so I returned it.
3. November to December: XC 2080 Gaming. Worked fine, quiet GPU, but because someone offered a higher purchase price, and the FPS gain was not so huge compared to a 2070 at my settings (1440p), I sold it.
4. December until yesterday: 2070 Black. Worked fine, the best performance for the money at my settings, a quiet and really cool GPU (I always use OC mode); it never went above the low 60s °C doing anything, including long hours of SoftMiner (which I do rarely). But the first card was DOA, and the second one, the replacement, did not last a month. Maybe I'm just unlucky, or there is something wrong with RTX cards in eGPU setups.
Conclusion: get a GPU that is listed on the support page ;) I can't refund my 2070 though and must use it, but I'll give it a third chance since it's my favorite card. If I'm still unlucky, I'll sell it and get a 1080 Ti (my previous setup, over a year without issues), even if that means not enjoying the new-gen GPU features; it will last in my setup until the next-gen 7 nm chips are out. If you still want an RTX card, get it from somewhere with good support, for a faster solution and replacement.
Cheers
     
    l4v4l4mp3 likes this.