Transform your laptop with the Razer Core X



This topic has been closed for comments

166 Replies

Hello all,

Sorry to bring the bad news, but the Razer Core V2 is still not the revolutionary product we were all waiting for.

I own a 2016 Razer Blade 14 with a GTX 1060 and I wanted to eventually get a Razer Core to increase performance. In my case it is a total fail:
- The USB-C connection acts as a bottleneck, hurting the performance of any external graphics card, even the most advanced ones (Blade 14 + GTX 1060 = more fps than Blade 14 + Core + GTX 1070, or in some cases even a 1080 Ti).

- The Razer Core acts as a USB-C hub that includes a Gigabit network adapter and lets you connect an external monitor (via the graphics card, or DisplayPort). So if you already use a USB-C hub with a DisplayPort output to drive a 120 Hz display, the Razer Core becomes a useless and noisy add-on to your desk.

- The USB-C cable of the Core V2 is 50 cm long (to support the 40 Gb/s data transfer rate), so you will have to keep the Razer Core on your desk to use it (= fan noise), and you will be facing its ugliest and noisiest side (the back) because of this short cable.

- From a connectivity perspective, I will just say that the Core X is the one you should never buy (it has no external USB ports and no Ethernet port).
The Core V1 is a bit old and requires software to switch from the embedded graphics card to the external Core.
The Core V2 does this more smoothly, without the need to touch any software (hot plug).

Basically the Razer Core (the V2, that is) is (very) good ONLY when used with a laptop that has USB-C and a basic graphics card (for instance the Blade Stealth with its Intel GPU), or any laptop without a proper 3D-accelerating GPU, for instance a laptop with an old NVIDIA card.
(Side note: I still struggle to imagine a laptop with a Thunderbolt connector and an "old" GPU, but that is a different topic...)

If your laptop has a decent 3D card (GTX 1060 and above), then using the Razer Core is a step down for your setup: fps will drop and operating noise will go up.

If you want to improve your gaming performance, you will need to buy a more recent Razer Blade (gen 7) or eventually switch to a more standard gaming desktop.

D8
l4v4l4mp3
Just want to agree with @Dark-Rider after owning my new Razer Blade 15 Advanced with a 1070 and now adding a Core X with an RTX 2080. Gaming performance is lower with my 2080 than with my 1070. It seems like 40 Gbit/s = 5 GB/s through TB3 is really a bottleneck compared to 15.8 GB/s for PCIe 3.0 x16, and even more for newer PCIe versions.
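As a quick back-of-the-envelope check of those link rates (a minimal sketch; the figures are the nominal maximums from the post above and common spec sheets, ignoring protocol overhead and the fact that TB3 only exposes a PCIe 3.0 x4 data path):

```python
# Rough, nominal link-rate comparison (ignores encoding/protocol overhead
# and the fact that TB3 shares its 40 Gbit/s with DisplayPort/USB traffic).
def gbit_to_gbyte(gbit_per_s: float) -> float:
    """Convert Gbit/s to GB/s (8 bits per byte)."""
    return gbit_per_s / 8.0

links = {
    "Thunderbolt 3 (total link)": 40,    # Gbit/s -> ~5 GB/s
    "PCIe 3.0 x4 (TB3 data path)": 32,   # Gbit/s -> ~4 GB/s before overhead
    "PCIe 3.0 x16 (desktop slot)": 126,  # Gbit/s -> ~15.8 GB/s
}

for name, gbit in links.items():
    print(f"{name}: {gbit} Gbit/s ≈ {gbit_to_gbyte(gbit):.1f} GB/s")
```

Even in the best case, then, a TB3 eGPU sees roughly a quarter to a third of the bandwidth a desktop x16 slot provides.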

So you will end up with less performance from the Razer Core if you have something like a 1060 or above installed in your main system.

That makes the Core more or less worthless. Sad but true.

Adding two example screenshots from the Forza Horizon 4 benchmark.


It's not only the Razer Core; you should blame Intel for limiting TB3 bandwidth in eGPU setups. It's not even 30 Gb/s; it maxes out at 28 Gb/s with the full TB3 PCIe 3.0 x4 lanes. So don't compare it with a desktop or a direct PCIe 3.0 x16 connection. Another thing to add: at 1080p the bottleneck is even greater. If you have time you can read this. You can check the bandwidth with CUDA-Z; look at the host-to-device and device-to-host bandwidth on the Performance tab. Blame Intel! ;) The rest of the bandwidth, I think (but I'm not sure), is allocated to other things like peripherals.
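If you don't want to install CUDA-Z, a rough host-to-device / device-to-host check can also be scripted; this is only a sketch and assumes PyTorch with CUDA support is installed (the absolute numbers will differ a bit from CUDA-Z, but the TB3 ceiling shows up the same way):

```python
# Rough host<->device bandwidth check, similar in spirit to CUDA-Z's
# "Performance" tab. Assumes PyTorch with CUDA support is installed and
# that the eGPU is the active CUDA device.
import time
import torch

def measure(direction: str, size_mb: int = 256, repeats: int = 20) -> float:
    """Return the average transfer bandwidth in GB/s for one direction."""
    n = size_mb * 1024 * 1024
    host = torch.empty(n, dtype=torch.uint8, pin_memory=True)  # pinned host buffer
    dev = torch.empty(n, dtype=torch.uint8, device="cuda")
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(repeats):
        if direction == "host_to_device":
            dev.copy_(host, non_blocking=True)
        else:
            host.copy_(dev, non_blocking=True)
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start
    return (n * repeats) / elapsed / 1e9

print(f"host -> device: {measure('host_to_device'):.2f} GB/s")
print(f"device -> host: {measure('device_to_host'):.2f} GB/s")
```

On a TB3 eGPU you would expect to see roughly 2-3 GB/s here, versus well over 10 GB/s for the same card in a desktop PCIe 3.0 x16 slot.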
High FPS is also the eGPU's big enemy, and there are also games that render badly through TB3, like Forza Horizon and PUBG in my experience (correct me if I'm wrong though). For those titles I would play on the dGPU; they still play well on the 1070 Max-Q at 1440p high.

With the RTX 2070 at 1440p it's sharper, but you lose a lot of FPS.

Another eGPU advantage on laptops with a dGPU is improved CPU thermals, since the fans can give more attention to the CPU because the dGPU is off. For example, here is a CPU thermal comparison between dGPU and eGPU in Gaming mode (Better Performance) after around 15 minutes of Forza Horizon 4 benchmarks. dGPU... ouch ;)
(Screenshots: dGPU vs eGPU CPU temperatures)

So it depends on you: lower heat for a better Blade lifespan, or better performance for a hotter machine 😁
I personally do both: for open-world games I use the eGPU, and for esports and racing the dGPU. This setup has kept my Blade, since the last 14" model, rocking like a champ 😎
By the way, I forgot to mention that the 1070 can't run ultrawide 5120x1440 well; it's only around 30 fps or worse, compared to the 2070 at high settings hitting 40-ish up to 60 fps. With a 2080 I think it'll be better.

Cheers 🙂
Hello Joikansai,

First of all, Happy New Year 2019 to you and all forum members.

Now back on the subject: I don't think it is a question of blaming Intel or Razer. In my mind both are actually responsible, as @l4v4l4mp3 said, for the lack of transparency, in order to sell you something you don't need.

Now, the argument of overheating impacting the laptop's lifespan is highly questionable. For instance, I don't overclock and I use my laptop within the boundaries of normal usage, with a little plus: my laptop stays stationary on my desk 11 months out of 12. Hence there is no reason for my laptop to break or fall.
If heat implies a shorter laptop lifespan, then we Blade owners have a serious problem - but I doubt that.

(With regard to heat and GPU questions, I recommend reading this to understand that GPUs used for mining don't see a drop in performance while generating heat 24/7 for a long period of time.)

Side note: I thought the 2080 could be a game changer for this setup, but, thanks l4v4l4mp3 for trying it, we now know that it is not the case (and I am not even that surprised).

Cheers,
D8

Yes, Happy New Year 🎉 to you too.
If you can keep monitoring your laptop, it's fine having a Blade without an eGPU; just keep in mind it's a gaming laptop in a MacBook-sized chassis. Heat, especially in this category of laptops, is the worst enemy.
For most people chasing FPS, a 1080 Ti or 2080 isn't a game changer if you already have a decent GPU in your laptop, but it is for some who want higher resolutions, like 4K gaming, that couldn't otherwise be done well. I did that on my previous Blade 14, running 4K with a 1080 Ti in Rise of the Tomb Raider, which was unplayable on the 1060 dGPU. That said, I also play a lot of esports, so my gaming usage was roughly 50/50 between dGPU and eGPU. As a result I could sell it in mint condition, like day one, even though I used it outside as well.
Heat generated inside may wear the other parts as well, so it also comes down to luck whether you got great parts.
This is similar to GPUs used for mining: from my understanding, a soft miner will keep GPU usage at 100%, like keeping a phone at 100% battery load; the phone will be fine, but battery performance will degrade.
Anyway, to me the eGPU isn't a game changer either, but it is for some users. Take a look at egpu.io: a lot of Mac users have become enthusiasts about this because they can finally game on their machines, and as an ex-MacBook user I can really understand that, since gaming on a Mac was really painful in terms of performance and heat. To me, on the Blade 15 the eGPU is a gaming revolution, especially with an RTX card; the reflections just look natural and can be done on a laptop. I played the story missions on BFV last night (for online competitive play I prefer over 70 fps) and it just works ;) around 40 fps at 1440p medium; I think with a 2080 or higher it'll be better. On the Stealth, with its weaker CPU, it's totally unplayable, laggy.
Yep, I agree with you too (at least partly).

For Mac users or people with just an ultrabook, without a mid- to high-end dGPU, a Core and eGPUs in general make sense and bring performance. But with a 1070, at least for gaming, the gain is not that big relative to the cost.

If you are rendering, mining or something like that, this might change, because in those scenarios I guess the bandwidth is less of a bottleneck than it is for high-fps gaming.

So just adding a Core with an RTX card for medium ray tracing might be fun, but it is not worth it (at least for me).

So I'll save the money and wait for RTX cards in a new Blade or something like that, since I doubt that even TB4 - if and when it becomes available - will deliver the performance we need for high fps.

Just for reference, the answer from Razer regarding my question about any limiting factor when using a high-end GPU in the Razer Core:


Greetings from Razer!

We understand your concern with regard to the performance of your Razer Core X. We apologize for the inconvenience that the issue may have caused you.

With regard to your question: the Razer Core X does not limit the eGPU. It also varies with the card. You can actually check the link below.

https://support.razer.com/gaming-laptops/razer-core-x/

Please feel free to ask if you have any further questions. Have a good day!
Hello Joikansai, l4v4l4mp3

Apologies, but none of the three of us uses a MacBook, and comparing GPU usage with battery usage doesn't make any sense to me.

The only interest I see in using an external GPU box is to:
- use a single TB3 cable and have everything go through it (charging, video, mouse, gamepad, keyboard, etc. - valid when using a Core V1 or V2 but not the Core X)
- move the heat generation from the dGPU to an external GPU box (in connection with the next point)
- drive very large screen(s) connected directly to the back of the graphics card (as is the case for you, l4v4l4mp3)

I am myself looking for a solution to plug my Blade into an ultrawide screen, but it is annoying/concerning that using an eGPU that will cost a kidney (or an eye) will reduce overall gaming performance.
I expect at least the same level of performance, ideally far better.

Then it makes sense to wait for a new generation of the Blade 14 or 15 with a higher-end GPU embedded,
or alternatively to use an Omen Accelerator (far quieter than the Core) with a GTX 1080 Ti and a 700 W PSU enabling hot swap until then.

Cheers,
D8
Yeah, only one cable would be the dream. :]

But since I need an additional cable for power, due to the high power consumption of the new Blade 15 Advanced,

and since I wanted to preserve throughput and not connect mouse, NIC and so on to that one cable, I've already made my peace with having several cables again.

But in summary I guess we are now more or less on the same page, all three of us.

Just one question, since you brought up the Omen: I read a lot about it earlier and found a lot of compatibility issues and so on.

But besides that, is there any advantage to the Omen over using a Core?
Because of the TB3 connection it will have the same bottleneck as the Core.
Also, you need to tweak the PSU to get more than 500 W.
Dark-Rider
Indeed, I heard as well that the Omen has some compatibility issues with certain GPUs. I understand it seems to be connected with the firmware version of the Omen motherboard and the space available to fit the card.

For me, the advantages of using the Omen are:
- it is silent compared to the Razer Core(s)
- it contains a TB3 hub (like the Core V1 and V2), all connected through the single TB3 cable, as follows:
1. Gigabit Ethernet 10/100/1000
2. Thunderbolt 3
3. USB 3.1 Type-C™️
4. 4 x USB 3.1 Type-A
- you can install an SSD inside - you can even boot from it, or just use it as storage for your games and data, and that is a big, big plus compared to the Core
- there is room inside the box for a big GPU (330x140x53 for the Omen vs 330x144x43 for the Core V1 and V2)
- the box is bigger than the Core (all versions), which is good in case the GPU produces a lot of heat, as RTX cards do

PS: I like your ultrawide 49" screen! :)

The HP Accelerator is for someone who has a big room ;) It really takes up space. Design-wise it has nice features and a nice price tag, but it often has compatibility issues with the parts inside, like the I/O ports etc., and you'll need to keep checking for firmware updates, I believe. At first it even had issues with HP's own laptops, the HP Spectre if I'm not wrong. So going with a same-brand setup would be better in terms of compatibility.
Concerning card heat and fan behaviour, it depends on the GPU brand. An Asus Turbo 2080 Ti made my Core V2 scream, since its GPU temperature maxes out around 80°C; a Zotac Mini 1080 Ti was also not quiet, but acceptable. With an EVGA 2080 Gaming and a 2070 Black, though - the 2070 even at max OC - the gaming temperature, or even the mining temperature (soft miner) at 100% GPU usage, unbelievably never leaves the 60s. At idle with the Blade Stealth I was even shocked at how silent the PSU fan is, unlike before with the other cards. I believe EVGA now uses a glass material for the RTX series, because they knew these cards would run hotter than the Pascal series, and changed their metal/aluminium design.
Hello Joikansai,
Impressive test! Didn't you need to upgrade your PSU to power all those cards?

Cheers
D8

No; max power consumption at full load, even for a 2080 Ti, is 300 W, and the Core V2 can supply up to 375 W to the GPU (see here). In an eGPU setup the enclosure's PSU serves the GPU alone (aside from any peripherals you add to it), unlike a desktop PSU, which has to supply the other parts as well. The only card the Core V2 can't supply is the Vega 64, unlike the Core X, which has a bigger PSU.
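For what it's worth, a trivial headroom check against the 375 W figure quoted above (a sketch only; the TDPs are approximate reference board-power numbers, and transient spikes can exceed them, which is presumably why cards like the Vega 64 are excluded despite a nominal TDP under 375 W):

```python
# Nominal board power vs. the Core V2's ~375 W GPU budget (sketch only;
# TDPs below are approximate reference figures, and real cards can spike
# above them, so nominal TDP alone is not the whole story).
CORE_V2_GPU_BUDGET_W = 375

card_tdp_w = {
    "GTX 1080 Ti": 250,
    "RTX 2080": 225,
    "RTX 2080 Ti": 260,
}

for card, tdp in card_tdp_w.items():
    headroom = CORE_V2_GPU_BUDGET_W - tdp
    print(f"{card}: ~{tdp} W TDP, about {headroom} W of headroom")
```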
woop woop!!! will keep an eye on your next post!

D8
l4v4l4mp3
So now I can share the first results!

Starting with new benchmarks, at least from Forza Horizon 4, since I've shared the same one here before.

One is at FHD on the internal display with settings on High, so similar to the screenshots shared earlier.
The other is on my external double-FHD screen with Extreme settings.

So you can see that now the CPU is the bottleneck :D
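For anyone who wants to verify the CPU-vs-GPU bottleneck themselves rather than reading it off the benchmark overlay, here is a minimal monitoring sketch (assuming the psutil and pynvml / nvidia-ml-py packages are installed; run it in the background while the benchmark loops):

```python
# Minimal sketch: log CPU and GPU utilization side by side while a game or
# benchmark is running. Assumes psutil and pynvml (nvidia-ml-py) are installed.
import psutil
import pynvml

pynvml.nvmlInit()
# Index 0 here is an assumption; on a laptop with both a dGPU and an eGPU,
# pick the index that corresponds to the external card.
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        cpu = psutil.cpu_percent(interval=1.0)            # averaged over 1 s, all cores
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)  # .gpu is a percentage
        print(f"CPU {cpu:5.1f}%  |  GPU {util.gpu:3d}%")
        # Sustained high CPU with the GPU well below ~95% points to a CPU
        # (or TB3 bandwidth) limit rather than the graphics card itself.
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```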

If you want me to test and share something with you, just ask.

// Adding the older screenshots here as a reference too, for easy viewing.

Wunderbar!! 🙂
But unfortunately the eGPU is lame at high FPS, and the bottleneck is smaller at higher resolutions; besides, FH4 isn't great over TB3 in my experience.
You may want to look at this; they're all lame. The MacBook is slightly faster because it has a unique TB3 architecture that connects directly to the CPU instead of going through the PCH controller, which reduces performance a bit on most laptops with a dGPU. If you can't see the pictures, you can use a Google account to register easily. Note the last benchmark was from an eGPU super-tweaker, not a TB3 eGPU: he uses the M.2 slot for a faster graphics connection and flashes the card BIOS to unlock a higher power limit, and that is really not recommended at all - you'll lose the warranty and may badly mess up the device.
Sounds like the iPhone X haha
Wow, luckily I haven't bought the V2.
Does it work with Synapse 3? Does it still have to run two drivers?
Shame it doesn't get the USB hub and LAN connection. It would have been the best enclosure if you could have an eGPU, LAN, device charging/power and all your USB stuff connected with just a single cable. The X charges at 100 W while the Core V2 only does 65 W, which might not be enough for the new RB15 considering its Coffee Lake CPU.
Good price! Me likey!
no chroma btw.
It'll work with ANY laptop?! 😯
very nice! 😃
Danzle1991
Shame it doesn't get the USB hub and LAN connection. It would have been the best enclosure if you could have an eGPU, LAN, device charging/power and all your USB stuff connected with just a single cable. The X charges at 100 W while the Core V2 only does 65 W, which might not be enough for the new RB15 considering its Coffee Lake CPU.


The new RB15 has a new power connector, so you won't use the X to charge it.

But still, more connectivity would have been appreciated.
A bit bulky, I might say; you really need a lot of space to carry it. But as long as it does its job, I like it.
vampirivan
no chroma btw.

Plus Chroma means plus price; Razer was working hard to get it under $300 ;)
FrozenFireVR
It'll work with ANY laptop?! 😯

Yes, as long as it has a TB3 port. But I think older Thunderbolt, like 2 and 1 (with Apple's TB3 adapter), should also work, as with other eGPU enclosures, just with lower bandwidth: around 16 Gb/s for TB2 and 8 Gb/s for TB1 instead of 32 Gb/s for TB3.
Danzle1991
Shame it doesn't get the USB hub and LAN connection. It would have been the best enclosure if you could have an eGPU, LAN, device charging/power and all your USB stuff connected with just a single cable. The X charges at 100 W while the Core V2 only does 65 W, which might not be enough for the new RB15 considering its Coffee Lake CPU.

Plus, the price wouldn't be under $300 then; remember the Razer tax ;) Gaming laptops aren't designed for USB-C charging, and even if they could do it, it would be super slow, like on the XPS 15. The Blade 15 has its own charging connector, so it doesn't matter when pairing both host and device.
Not a fan.
I was waiting for a Razer Core, knowing a new one was coming out. But no USB or Ethernet!? :(
Personally I don't see this as much of an upgrade from the V2. Sure, it has extra space, but that means nothing to me when I would otherwise have to purchase a separate laptop dock for the USB and Ethernet.

Did the V2 have issues with its Ethernet/USB? The simplicity of having one cable for eGPU, Ethernet, and USB is a game changer. Why did they get rid of it in this model? I am just wondering if there is some reason I shouldn't buy the V2. Is this an upgrade, or just a cheaper model with fewer features?
The good part about having nothing but the GPU connected is that you don't share your bandwidth,
so less bottleneck.

Is the bandwidth between the Core X and the laptop the same as with the old Core V1/V2, or is there new connectivity allowing less of a bottleneck?

How much bandwidth would a 1080 Ti need to not suffer any bottleneck at all when used in a Core?