Hello! I already posted these results on the Notebook Review forums, but I thought I would share them here.

Here is my desktop, a somewhat beefy system; nothing is overclocked (it's currently summer here; I only overclock my CPU during winter):

- Intel Core i7-5930K
- Asus X99-Deluxe
- 16GB Corsair Vengeance LPX DDR4 2666MHz (4x4GB sticks)
- 256GB Samsung 950 Pro (boot + Blizzard games)
- 1.2TB Intel 750 SSD (PCIe card version; holds most of my Steam library and work files)
- 750GB Seagate Momentus XT
- Nvidia 980 Ti (reference cooler, the same one I use in the Core)
- Monitor: Asus ROG Swift PG278Q

Here are the USB devices that I use on a regular basis:

- Finalmouse 2016
- Corsair K65 RGB
- Mayflower Electronics Objective2 (headphone amp)
- Tascam US-1800 (audio recording and game comms, heh)
- Vantec 7 Port USB 3.0 Hub

First, these are my results using 3DMark's normal Fire Strike, all of them using the Blade 14: Clicky clicky!

There is some variance in the Blade 14 results; I ran them back-to-back, and my theory is that temperatures affected the scores, since the first runs started with a cooler GPU. My desktop was included too. The first result is my desktop. The second is the Core with both the internal and external displays. The third is the Core with just the external display. The fourth is the Core with both displays, all my USB devices plugged into the Core, and wired network. I added a fifth column with the results for the Blade's internal 970M.

A friend let me use his Stealth for the weekend, so I added it in a new batch of benchmarks using 3DMark's Fire Strike Ultra: Clicky clicky!

Something of note: I ran the benchmark three times for each case, but I'm only adding the third run to the comparison. I did this so the first two runs would heat the GPU up and simulate a more "real" gaming session. It's still a small test sample, though; but hey, I don't have all the time in the world to bench, I also have to game for fun!
Here's the column order for this:

- My desktop PC - 3,878
- Blade 14 using the laptop's internal display - 3,460
- Blade 14 using my external display - 3,692
- Stealth using the external display - 3,487

Now for some real games and general fps. Right now I'm rotating my gaming time between Heroes of the Storm, Overwatch and Doom, so those are the only ones I tried for now. All of them were played on the external display with the same highest quality settings that I use on my desktop (which is to say, everything maxed out!):

Heroes of the Storm
- Desktop - 80-100 fps, staying mostly in the 90s
- Blade 14 - 60-90 fps, staying mostly in the upper 70s
- Stealth - 50-70 fps, staying mostly in the mid 60s

Overwatch
- Desktop - 80-90 fps, staying mostly in the upper 80s
- Blade 14 - 70-90 fps, staying mostly in the lower 80s
- Stealth - 60-80 fps, staying mostly in the mid 70s

Doom
- Desktop - 130-170 fps, staying mostly in the 140s
- Blade 14 - 90-120 fps, staying mostly in the 100s
- Stealth - 30-50 fps, staying mostly in the upper 30s

Notice that the biggest difference was the Physics score, since we're going from a 5930K to the 6700HQ and the 6500U. This is of particular interest because in actual games - such as Overwatch and Heroes of the Storm - I would expect physics to impact fps the most, particularly on the Stealth. Maybe lowering the physics settings would improve fps; I didn't try it, but I'm fairly confident it would help.

Now, something that got me thinking a lot: if you compare the regular Fire Strike results to Ultra, you will notice that the differences between the desktop and the Core vary wildly. My guess is that in Ultra the GPU spends more time rendering the actual scene, so the bandwidth limitation is not an issue. Notice how the actual graphics test fps in Ultra only lose around 7-8%, even on the Stealth!
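For what it's worth, the gaps in the Ultra scores above are easy to work out yourself. This little snippet is just my own illustration (the labels are mine, not 3DMark's); note these are overall scores, which fold in the Physics test, so they won't match the graphics-test-only fps deltas exactly:

```python
# Fire Strike Ultra overall scores from the runs above.
scores = {
    "Desktop (5930K + 980 Ti)": 3878,
    "Blade 14 + Core, internal display": 3460,
    "Blade 14 + Core, external display": 3692,
    "Stealth + Core, external display": 3487,
}

baseline = scores["Desktop (5930K + 980 Ti)"]
for name, score in scores.items():
    gap = (baseline - score) / baseline * 100  # percent behind the desktop
    print(f"{name}: {score} ({gap:.1f}% behind)")
```

That puts the Blade on the external display about 4.8% behind my desktop, and the Stealth about 10.1% behind, overall.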
But when you move to lower graphical settings, such as regular Fire Strike, where the GPU doesn't spend as much time rendering each frame, bandwidth may become the bottleneck; we're talking about 20% and 50% differences for the Blade and Stealth, respectively. This also reinforces my belief that physics has something to do with the fps loss in actual games. I find this particularly interesting because it means that the higher the settings you run your game at, the smaller the performance gap might be - barring a CPU bottleneck, that is. It will also be interesting to find out the implications of Vulkan.

Other things: the experience seems a little more refined with the Stealth; it is more plug-and-play. I think the reason might be the 970M in the Blade 14. Actually, Nvidia GeForce Experience is very, very confused on the Blade; sometimes it thinks I have a 970M, sometimes a 980 Ti.

Also, I noticed that my USB headphone amp would behave weirdly. When nothing was playing, it... "disconnected". Not from Windows - there was no USB unplug sound. A soft-disconnect/sleep, if you will. When I started playing something, it would take a few seconds before actual sound came out; as soon as the music stopped, it would "disconnect" again. I wonder if, in order to save bandwidth, the Core soft-disconnects devices that are not being used (it behaves a lot like Windows' USB selective suspend, if you've ever seen that).

Noise! The GPU is usually inside a desktop case, so when the fans ramp up you only hear what escapes the case. The Core is not only smaller, but there are grills right in front of the GPU. It is much more audible; maybe a little annoying, if I'm honest. Most of the time I'm wearing headphones, but when I take 'em off, it is pretty noticeable.

Another small (pun intended) detail: the USB-C cable is short. I believe this was a conscious decision in order to maintain the Core's 40 Gbps.
In my mind I would have the Core on the right of my desk - where my desktop currently is - and the Blade to my left, where I have open space. That ain't gonna happen, no sir. Maybe both the Blade and the Core will have to go to my left. So, keep that in mind.

If the laptop goes to sleep, pressing a mouse or keyboard attached to the Core will not wake the Blade. You have to press a key on the laptop's own keyboard.

And there's some funky DPI scaling action going on. If the Type-C cable is disconnected while you are using both displays - the laptop's internal and an external display - and you have the external set as the main Windows display, then after the disconnect everything moves to the laptop's internal display, but with the DPI scaling of the external display. That is, I use my ROG Swift at 100% scaling (which is to say... none), and the Blade's display at 150%; when the Core is disconnected, the Blade's display defaults to the 100% of the previous main display (the ROG Swift, in my case). A user log-out or a reboot fixes this... heck, I just re-read this and I'm not sure I made sense.

Oh, and in Doom, loading screens seem to freeze. If I press Ctrl+Alt+Del the game loads fine, but I have to do it every time there is a loading screen. Weird.

Well, there you have it, folks! A somewhat light, more in-depth-ish look at the Core. All in all, I am pretty satisfied and excited about this sort of technology. Around 13 years ago, when I went to college out of town, I had to take both my desktop and my laptop to school; obviously I used the laptop for school-related activities (a solid eMachines M6807 with an ATI Mobility 9600, capable of running Doom 3!) and my desktop for gaming. I dreamed that something like this would exist some day. And now it does!