Simple answer about the GTX970, for the under-informed.

Discussion in 'Off Topic Chat' started by xByakko, Feb 5, 2015.

Thread Status:
Not open for further replies.
  1. BORUzer

    BORUzer New Member

    You need to read up more on the issue. https://forums.geforce.com/default/board/155/geforce-900-series/

    You won't notice it on a 1080p setup unless you're really pushing the card (mods, mainly). At the moment you're probably not even using 3.5GB, but they have since updated the specifications, because the specifications they published at launch were inaccurate.
     
  2. _ramsey_

    _ramsey_ Active Member

    If they have the ability to address 8GB of RAM on a mobile card at full speed, then why not include that technology on the desktop line? It doesn't make sense.
     
  3. RyuMakkuro

    RyuMakkuro New Member

    Another one...
    Have you benchmarked the 8GB version of the GTX 970? I assume not, since there isn't even a release date for it, so moving on.

    Watch the video, understand why it works like this, and then please stop spouting nonsense based on misinformation.
     
  4. _ramsey_

    _ramsey_ Active Member

    See the question mark? I asked a question. You didn't answer it, you merely accused me of spreading misinformation. I have a picture for you:

    [image]

    It's a very straightforward question. If the 980m can address 8GB of memory at full speed, then why wasn't that same setup used on the desktop 970? If you don't know the answer, that's cool, but attacking me makes you look bad.
     
  5. RyuMakkuro

    RyuMakkuro New Member

    Yes, I did. Like I already said, watch the video. It explains why. Your question itself shows that you are misinformed.

    In case you don't realise this, the 980M and the 970M were released after their desktop equivalents, not before.

    This whole thread was meant to help people understand the whole "problem", or rather the fact that what they call a problem only becomes one when you push the card well outside its comfort zone, and even then it can easily be corrected.
     
  6. _ramsey_

    _ramsey_ Active Member

    http://www.theonion.com/articles/friend-who-sent-link-to-8minute-youtube-video-must,32442/

    But seriously, I watched that whole video a second time. Nowhere in it does it mention the 9xxm graphics cards. Anywhere. At all.

    The issue is a disabled L2 module. Am I to believe that a 980m with 8GB of VRAM has 2x the L2 modules of a desktop 980 or that the 980m L2 modules are twice as wide as the desktop version?

    Actually, no, I don't believe either, because I see people benchmarking the 980m and hitting the 3.5GB wall.

    http://tech4gamers.com/nvidia-geforce-gtx-980m-also-memory-problem/

    Since the new Blade ships with 3GB of VRAM, the point is moot to me. I think I'd be pretty upset if I'd paid an extra couple hundred bucks to get an MSI GS60 with a 970M + 6GB instead of a 970M + 3GB.
     
  7. Wow, this really escalated quickly. Flame on.
     
  8. IceStorm_III

    IceStorm_III Member

    It's not the L2 modules per se, it's how they connect to the crossbar. In the 970, the L2 is missing for one specific memory controller (MC), requiring that MC to go "sideways" through its sister MC's L2. Two MCs sharing one L2 is a bottleneck.

    Combine this with the fact that memory addresses do not "fill up" chips in sequence; they stripe. Instead of filling up the first chip, then the second, then the third, data is striped across all available chips. That's a problem when your last two memory banks are sharing bandwidth across a single L2.

    The solution was to stripe the first 3.5GB across the seven L2s. The last 0.5GB is allocated solely to the gimped MC that has to share its sister's L2. For the first 3.5GB of memory, the 970 performs at 7/8ths the speed of a 980. For the last 0.5GB of memory it performs at 1/8th the speed of a 980.

    http://techreport.com/review/27724/nvidia-the-geforce-gtx-970-works-exactly-as-intended
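    To put some rough numbers on that, here's a quick back-of-the-envelope sketch (my own illustration, not anything official; I'm assuming the 970's quoted ~224 GB/s peak bandwidth is spread evenly across its eight 32-bit memory controllers):

    ```python
    # Back-of-the-envelope view of the GTX 970's split memory pool.
    # Assumption: ~224 GB/s quoted peak bandwidth, spread evenly over
    # eight 32-bit memory controllers (~28 GB/s each).

    TOTAL_BANDWIDTH_GBPS = 224.0
    CONTROLLERS = 8
    PER_MC_GBPS = TOTAL_BANDWIDTH_GBPS / CONTROLLERS

    # The first 3.5GB is striped across the seven controllers that have
    # their own L2 slice, so accesses hit seven controllers in parallel.
    fast_pool_gbps = 7 * PER_MC_GBPS   # ~196 GB/s, i.e. 7/8 of the full bus

    # The last 0.5GB sits behind the one controller that has to borrow its
    # sister's L2, so it only ever runs at a single controller's speed.
    slow_pool_gbps = 1 * PER_MC_GBPS   # ~28 GB/s, i.e. 1/8 of the full bus

    print(f"0.0-3.5GB pool: ~{fast_pool_gbps:.0f} GB/s (7/8 of the bus)")
    print(f"3.5-4.0GB pool: ~{slow_pool_gbps:.0f} GB/s (1/8 of the bus)")
    ```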

    The 8GB 980M shouldn't be an issue. That should just be a doubling of the DRAM density, which should not impact striping performance the way disabling an L2 does. It seems the 970M lops off a whole lot more wholesale than the 980M does (192-bit memory vs 256-bit).
     
    Last edited: Feb 6, 2015
    xByakko and _ramsey_ like this.
  9. BobbyMike

    BobbyMike Member

    I bought my GTX 970 last November and am still quite happy with it. No problems with Far Cry 4 maxed out, and I'm looking forward to playing GTA V when it's released. I think it was (and still is) a great card for $375 (especially since I got Far Cry 4 with it).
     
  10. RyuMakkuro

    RyuMakkuro New Member

    Because it doesn't have to. It's like writing 2 + x = 4; the result is obvious.

    The GTX 970 came out BEFORE the mobile version, so naturally they can't recall all the GTX 970s and "fix" them, especially when the problem only occurs when you forcibly make it appear. You can expect the 8GB version to come out with a fix (and thus it may be more expensive) that negates this issue, or you'll simply have 7 or 7.5GB out of 8GB running at full speed. Not a huge issue, since, again, 99% of users will never use that amount of VRAM.

    Also, notice how some benchmarks of the 980M show that issue while some don't. AMD fanboys trying to fake results? Possible. nVidia fanboys faking results? Possible. Do I care about those? No.
    I suggest the same. Whichever version of the GTX 970 you get, you won't be disappointed while using it "the way it was meant to be used". In games ;)
     
  11. I got my 2 GTX 970's changed to GTX 980's by my supplier. They tried to talk me out of it, but were happy to do it since I really wanted to, so I did.

    I do honestly feel that this may have been an overreaction; however, I just wanted to be reassured a bit. If DirectX 12 and Mantle start allowing users to stack their multi-GPU VRAM, I would prefer to have a full-speed 8GB of VRAM. Obviously this is just speculation at this time.

    I would still recommend the GTX 970 to anyone looking who is OK with the information provided.
     
  12. Manahotep

    Manahotep Active Member

    The 970M is a turd compared to the full-blown 970... also, after half an hour the 970M will heat up and get even slower. There is no, nor has there ever been, an equal comparison between "M"/mobile cards and their full desktop brothers; mobile models get their butts handed to them in EVERY benchmark. Mobile has ONLY one strength, and that is its name: "Mobile". If you value mobility and the ability to take your computer places, then a laptop is for you. But laptops have never and will never compete with their desktop equals.
    The 970M has more memory, THAT'S it, and it's useless at any resolution.
    The 970 is better because:
    - Significantly better PassMark score: 8,613 vs 4,573 (around 90% better)
    - Higher clock speed: 1,050 MHz vs 924 MHz (around 15% higher)
    - Higher effective memory clock speed: 7,012 MHz vs 5,012 MHz (around 40% higher)
    - Better floating-point performance: 3,494 GFLOPS vs 2,365 GFLOPS (around 50% better)
    - Significantly higher turbo clock speed: 1,178 MHz vs 993 MHz (around 20% higher)
    - Higher pixel rate: 58.8 GPixel/s vs 44.4 GPixel/s (more than 30% higher)
    - Significantly higher memory clock speed: 1,753 MHz vs 1,253 MHz (around 40% higher)
    - Better PassMark DirectCompute score: 4,109 vs 2,693 (around 55% better)
    - More render output processors: 56 vs 48 (8 more)
    - More shading units: 1,664 vs 1,280 (384 more)
    - More texture mapping units: 104 vs 80 (24 more)
    - Wider memory bus: 256-bit vs 192-bit (around 35% wider)
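    If anyone wants to sanity-check those "around X%" figures, here's a quick sketch that recomputes them from the numbers quoted above (desktop GTX 970 on the left, GTX 970M on the right):

    ```python
    # Recompute the rough "around X% better/higher" figures from the
    # GTX 970 vs GTX 970M numbers quoted in the list above.

    specs = {
        # spec name: (GTX 970, GTX 970M)
        "PassMark score":               (8613, 4573),
        "Clock speed (MHz)":            (1050, 924),
        "Effective memory clock (MHz)": (7012, 5012),
        "Floating-point perf (GFLOPS)": (3494, 2365),
        "Turbo clock speed (MHz)":      (1178, 993),
        "Pixel rate (GPixel/s)":        (58.8, 44.4),
        "Memory clock speed (MHz)":     (1753, 1253),
        "PassMark DirectCompute":       (4109, 2693),
        "Render output processors":     (56, 48),
        "Shading units":                (1664, 1280),
        "Texture mapping units":        (104, 80),
        "Memory bus width (bit)":       (256, 192),
    }

    for name, (gtx970, gtx970m) in specs.items():
        advantage = (gtx970 / gtx970m - 1) * 100
        print(f"{name:30s} {gtx970:>8} vs {gtx970m:>8}  (~{advantage:.0f}% in the 970's favour)")
    ```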
     
  13. Manahotep

    Manahotep Active Member

    When people say they "max" stuff out, they need to say the size, the resolution, the panel type and, if they run multiple monitors, how many. 3x27" 1080p TN panels and a single 30" IPS 2560x1600 panel put very different loads on a GPU.
    I am running a 34" ultrawide curved IPS panel @ 3440x1440 and can max out ANY game currently available with a single Gigabyte G1 970. The ridiculous 3.5GB vs 4GB memory debacle doesn't change the amazing performance of the 970.
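    For reference, here's a quick sketch of the raw pixel counts behind the resolutions mentioned in this thread; it's just width x height (times the number of monitors), ignoring panel size and panel type, which don't affect GPU load:

    ```python
    # Pixels the GPU has to render each frame for the setups mentioned
    # in this thread (width x height x number of monitors).

    setups = {
        "1x 1920x1080 (1080p)":      1 * 1920 * 1080,
        "3x 1920x1080 (surround)":   3 * 1920 * 1080,
        "1x 2560x1440":              1 * 2560 * 1440,
        "1x 2560x1600 (30in IPS)":   1 * 2560 * 1600,
        "1x 3440x1440 (ultrawide)":  1 * 3440 * 1440,
        "1x 3840x2160 (4K)":         1 * 3840 * 2160,
    }

    for name, pixels in setups.items():
        print(f"{name:28s} {pixels / 1e6:5.2f} million pixels per frame")
    ```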
     
  14. Manahotep

    Manahotep Active Member

    I looked at the 980... but I limit my monitors to 60fps, and in no game did the 980 provide any benefit over the 970 under 60fps. The 970 hit the 60fps cap as often as the 980 did at 3440x1440, 2560x1440 and 1920x1200. Granted, at 4K the 980 wins hands down... but when I bought my card ALL the 4K monitors were crappy TN panels and none were ultrawide. AND a single 980 is going to get dogged by an AAA game at 4K, so you would need two of them anyway. In two years, when the 970 slows down, I will buy the 2nd-tier card once again; I will have spent just slightly more on two cards than on one 980, and the new-gen 2nd-tier card, as always, will hand the previous-gen champ its butt.
     
    Ryu Makkuro likes this.