Results 1 to 16 of 16

Thread: 980 vs 1080

  1. #1
    Boycott shampoo! Demand the REAL poo! AOD Member AOD_DCC70's Avatar
    Rank
    Private First Class
    Division
    PlayerUnknown
    Status
    Active
    Join Date
    Dec 2013
    Location
    Florida (Go Gators!!!!!)
    Age
    20
    Posts
    109

    Default 980 vs 1080

    I currently have a GTX 980 (I can get the exact model later). Would it be worth upgrading to a 1080? I play BF4, iRacing, and some other non-demanding games. My 980 can run everything at 60+ frames on ultra at up to 2K resolution. So my question is: would it be worth upgrading to the 1080 before BF1 drops, or should I wait for the next gen?

    Thank you.

  2. #2
    Criminal Lawyer is a redundancy TekNicTerror's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2014
    Location
    SC, USA
    Posts
    234

    Default

    Well, can the 980 run BF1?
    Of course, then it becomes: can the 980 run BF1 on ultra settings? If that's how you want to run it...

    BTW, is it just a 980 or a 980 Ti? If it's the Ti, no matter what, I'd say wait for the GTX 1100 or even 1200 series; a 980 Ti should be fine for a few more years even if you're rich.

    If it's just a 980, compare it to a 1070 and see if it's worth upgrading to that, because I don't feel the price difference between the 1070 and 1080 is worth the performance gain; that's why my new PC has a 1070 in it instead of a 1080.

  3. #3
    Boycott shampoo! Demand the REAL poo! AOD Member AOD_DCC70's Avatar
    Rank
    Private First Class
    Division
    PlayerUnknown
    Status
    Active
    Join Date
    Dec 2013
    Location
    Florida (Go Gators!!!!!)
    Age
    20
    Posts
    109

    Default

    It's just a normal 980. I pull 130 fps in Battlefield 4 with settings completely maxed at 1080p, and 80 at 2K. Guess I'll have to wait until BF1 drops. My only concern is that I won't be able to sell my 980 once these 1060s take off, since they're on par with the 980.

  4. #4
    #Superhuman AOD Member AOD_Tymplar's Avatar
    Rank
    Specialist
    Division
    Destiny 2
    Status
    Active
    Join Date
    Dec 2015
    Location
    Atlanta, GA
    Age
    37
    Posts
    138

    Default

    Quote Originally Posted by AOD_TekNicTerror View Post
    Well, can the 980 run BF1?
    Of course, then it becomes: can the 980 run BF1 on ultra settings? If that's how you want to run it...

    BTW, is it just a 980 or a 980 Ti? If it's the Ti, no matter what, I'd say wait for the GTX 1100 or even 1200 series; a 980 Ti should be fine for a few more years even if you're rich.

    If it's just a 980, compare it to a 1070 and see if it's worth upgrading to that, because I don't feel the price difference between the 1070 and 1080 is worth the performance gain; that's why my new PC has a 1070 in it instead of a 1080.
    I agree on the 980 vs. 980 Ti point. There's absolutely no comparison between the 980 and the 1080. The 1080 will eat the 980's lunch all day long.

    The only 10x0 card we're waiting on now is the GTX 1080 Ti. Well, the one that EVERYONE is waiting on, heh. I'm still just wondering what they're doing with their super sped-up release cycle (aside from just swatting away AMD like the fly that they are) and it makes me think that we could very well see the GTX 1080 Ti just in time for Christmas this year.

    The only way they could get away with that would be if the other change they're making is something like a "Tick/Tock" release cycle (like what Intel used to do). If that were the case, then what we'd most likely see would be the GTX 1100 series as the "Tock" to the GTX 1000 series' "Tick", but still on Pascal. That would most likely be where they finally incorporate HBM2, if manufacturing gets sorted out. Or they do what Intel just changed to, a three-step release cycle (Process --> Architecture --> Optimization), and then we could hypothetically see something like the GTX 1000, GTX 1100, and GTX 1200 series, all based on Pascal.

    But then, I guess the only question remaining at that point is...will there be a GTX 1090? :)

    (Duh...yeah there will be)

    Pascal hasn't even begun to hit what it's capable of, so I say they should sit back and enjoy it and not rush to get Volta out the door. I mean, they are LITERALLY using partially disabled chips in almost all of the cards.

    And to your other point about the performance gain from a 980 Ti to a GTX 1080...trust me...it's there. I did it and I don't regret it one bit.
    i7-6850K Broadwell-E 6-Core 3.60 GHz | Asus ROG Strix X99 Gaming LGA2011-v3 | 128 GB G.Skill Ripjaws V DDR4-3200 | nVIDIA GTX 1080 Ti 11 GB Founders Edition | NZXT H440 ATX Mid-Tower (Matte Black & Red) | Corsair HX1000i Modular PSU
    Samsung 960 Pro 512 GB NVMe SSD | (x3) Samsung EVO 960 1 TB SSD
    Asus ROG Swift 34" PG348Q Ultrawide | (x2) Asus MG279Q 27"

  5. #5
    Boycott shampoo! Demand the REAL poo! AOD Member AOD_DCC70's Avatar
    Rank
    Private First Class
    Division
    PlayerUnknown
    Status
    Active
    Join Date
    Dec 2013
    Location
    Florida (Go Gators!!!!!)
    Age
    20
    Posts
    109

    Default

    So would you advise waiting for either a 1080 Ti or the 11xx series?

  6. #6
    Boycott shampoo! Demand the REAL poo! Time's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Aug 2016
    Age
    30
    Posts
    140

    Default

    There's really no major reason to upgrade just yet. The 10xx series is a massive gain over the previous gen, with the drop to a 16nm fabrication node - but we've yet to see anything really take advantage of it. However, one 1080 is still not enough for 4K/Ultra or 1440p/144Hz/Ultra. SLI can be, provided the title is decently optimised.

    If you were on a 780 Ti or below, I'd say definitely invest in the upgrade... but the 980 -> 1080 jump just isn't worth it yet.

    I'd say if you're on a 980/980 Ti, wait for the 1180 Ti. It'll likely be the first "real" 4K/Ultra / 1440p/144Hz/Ultra single-GPU solution we see. We're also likely to see similar 9xx -> 10xx leaps due to the looming introduction of HBM2 replacing GDDR5/X... so instead of a ~30% gain from 980 -> 1080, you'll be seeing closer to a 50-60% gain. And at that point, you'll be safe to sit back for a few years, as honestly, we're hitting a point of diminishing returns with graphical fidelity in games. Our hardware is getting more powerful, but consumer-facing software just doesn't have a use for it.

    Now, if you're running lots of complex renders, or machine learning algorithms through CUDA... then yes, the upgrade would probably be worth your while :) Otherwise, I'd say hold off for the next gen!
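    Not from the thread itself - just to put rough numbers on the "diminishing returns" idea, here's a toy Python sketch (my own arithmetic, not benchmarks) of how much frame time each fps jump actually buys you:

```python
# Toy frame-time arithmetic: each jump in fps shaves off less
# absolute time per frame than the one before it.

def frame_time_ms(fps):
    """Time one frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for lo, hi in [(30, 60), (60, 144), (144, 240)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: {saved:.1f} ms saved per frame")
```

    30 -> 60 buys you ~16.7 ms per frame, 60 -> 144 only ~9.7 ms, and 144 -> 240 a mere ~2.8 ms - roughly why each successive upgrade feels smaller than the last.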

  7. #7
    Looks like I picked the wrong week to quit sniffing glue Z3ROFR0ST's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Age
    36
    Posts
    48

    Default

    Quote Originally Posted by AOD_Time View Post
    There's really no major reason to upgrade just yet. The 10xx series is a massive gain over the previous gen, with the drop to a 16nm fabrication node - but we've yet to see anything really take advantage of it. However, one 1080 is still not enough for 4K/Ultra or 1440p/144Hz/Ultra. SLI can be, provided the title is decently optimised.

    If you were on a 780 Ti or below, I'd say definitely invest in the upgrade... but the 980 -> 1080 jump just isn't worth it yet.

    I'd say if you're on a 980/980 Ti, wait for the 1180 Ti. It'll likely be the first "real" 4K/Ultra / 1440p/144Hz/Ultra single-GPU solution we see. We're also likely to see similar 9xx -> 10xx leaps due to the looming introduction of HBM2 replacing GDDR5/X... so instead of a ~30% gain from 980 -> 1080, you'll be seeing closer to a 50-60% gain. And at that point, you'll be safe to sit back for a few years, as honestly, we're hitting a point of diminishing returns with graphical fidelity in games. Our hardware is getting more powerful, but consumer-facing software just doesn't have a use for it.

    Now, if you're running lots of complex renders, or machine learning algorithms through CUDA... then yes, the upgrade would probably be worth your while :) Otherwise, I'd say hold off for the next gen!
    A GTX 1080 can easily do 1440p and break 60 fps, depending on the game of course. For games like BF4, BF1, and Overwatch, a single GTX 1080 does 1440p no problem.

  8. #8
    Don't piss me off! I'm running out of places to hide the bodies AOD Member AOD_ZED's Avatar
    Rank
    Private First Class
    Division
    AOD Racing
    Status
    Active
    Join Date
    Jul 2016
    Location
    Toronto, Canada
    Posts
    1,996

    Default

    Quote Originally Posted by AOD_Z3ROFR0ST View Post
    A GTX 1080 can easily do 1440p and break 60 fps, depending on the game of course. For games like BF4, BF1, and Overwatch, a single GTX 1080 does 1440p no problem.
    I just ran Project CARS, all Ultra with MSAA on, at 4K and was around 60 fps with a GTX 1070.
    My YouTube Channel: https://www.youtube.com/channel/UC_0...dKywFi90AiVjAQ
    CPU: i7 7700k @ 4.8 Ghz 1.25v
    GPU: ASUS ROG STRIX GTX 1070 @ 2050 Mhz / 8400 Mhz
    Motherboard: ASUS Z170 Maximus VIII Hero
    RAM: CORSAIR Vengeance LPX 16GB (2 x 8GB) DDR4 3000MHz

  9. #9
    Boycott shampoo! Demand the REAL poo! Time's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Aug 2016
    Age
    30
    Posts
    140

    Default

    Quote Originally Posted by AOD_Z3ROFR0ST View Post
    A GTX 1080 can easily do 1440p and break 60 fps, depending on the game of course. For games like BF4, BF1, and Overwatch, a single GTX 1080 does 1440p no problem.
    I'm talking about 1440p/144Hz. 60 fps isn't worth much when you've got a monitor that can display 144 fps. I'm running Overwatch with 3-way SLI 980s, and I'm only hitting around 110 fps on Ultra/1440p. Similarly, on my 4K display I need to sacrifice some visual settings to hit 60 with that setup.

    I was considering switching them out for tri-SLI 1080s, but it's just not worth the investment right now. Without HBM2, higher resolutions are always bottlenecked. I was going to invest in the Titan XP, but again, the decision to run with GDDR5X rather than HBM2 stopped me from making the investment. The new generation of memory (with its much wider buses) is just too big a leap to miss out on... and I'm trying not to get in the habit of purchasing 3 brand-new GPUs every generation. It's an expensive hobby.

  10. #10
    Looks like I picked the wrong week to quit sniffing glue Z3ROFR0ST's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Age
    36
    Posts
    48

    Default

    Quote Originally Posted by AOD_Time View Post
    I'm talking about 1440p/144Hz. 60 fps isn't worth much when you've got a monitor that can display 144 fps. I'm running Overwatch with 3-way SLI 980s, and I'm only hitting around 110 fps on Ultra/1440p. Similarly, on my 4K display I need to sacrifice some visual settings to hit 60 with that setup.

    I was considering switching them out for tri-SLI 1080s, but it's just not worth the investment right now. Without HBM2, higher resolutions are always bottlenecked. I was going to invest in the Titan XP, but again, the decision to run with GDDR5X rather than HBM2 stopped me from making the investment. The new generation of memory (with its much wider buses) is just too big a leap to miss out on... and I'm trying not to get in the habit of purchasing 3 brand-new GPUs every generation. It's an expensive hobby.
    Currently, you can only do 2-way SLI with the 1080s without having to tinker with the software. Also, what you're talking about is 144Hz, not FPS. That just gives it a smoother look by filling in frames. The human eye can't really see past 60 FPS.
    Last edited by Z3ROFR0ST; 08-14-2016 at 09:12 AM.

  11. #11
    Boycott shampoo! Demand the REAL poo! Time's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Aug 2016
    Age
    30
    Posts
    140

    Default

    Quote Originally Posted by AOD_Z3ROFR0ST View Post
    Currently, you can only do 2-way SLI with the 1080s without having to tinker with the software. Also, what you're talking about is 144Hz, not FPS. That just gives it a smoother look by filling in frames. The human eye can't really see past 60 FPS.
    Actually, a small portion of my dissertation was dedicated to disproving the human-eye fps myth. From peer-reviewed data (including my own), the consensus is that the human eye can, on average, see up to a frequency of 83.68Hz before diminishing returns kick in. I've uploaded an extract from my dissertation here for you to peruse: http://imgur.com/a/iWWL3

    144Hz also doesn't automatically fill in frames in any meaningful way - if you're running at 20 fps on a 144Hz monitor, you're still only seeing 20 frames. It won't flicker between the blank periods; it'll just hold the last frame, as all monitors do. G-Sync/FreeSync use an adaptive refresh rate to provide a smoother image and "hide" bad frame jumps, but no technology "fills in frames" with anything other than the source image. At 144Hz, provided you're not using a cheap "3D-Ready" monitor that fluffs the numbers, you can display up to 144 fps - well, provided your GPUs are capable of delivering it.

    And there's a gigantic difference between 144Hz/fps and 60Hz/fps. Blind testing has been done on it for years. Even mouse movements on your desktop are noticeably smoother. The difference between 60 and the "point of diminishing returns" is also what sets the refresh frequency on HMDs such as the Vive and the Oculus Rift. 60 is jarring inside an HMD; 85-95 is much more comfortable.
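    Purely as an illustration (my own toy model, not anything official): a fixed-refresh panel only ever repeats the most recent rendered frame, so refresh rate caps - but never adds to - the distinct frames you see per second:

```python
# Toy model of a fixed-refresh monitor: on every refresh tick it shows
# the newest frame the GPU has finished; it never invents frames.

def distinct_frames_shown(refresh_hz, render_fps, seconds=1):
    """Count distinct source frames displayed over `seconds`."""
    shown = set()
    for tick in range(refresh_hz * seconds):
        # Index of the newest frame finished by this refresh tick
        # (integer arithmetic to avoid float rounding).
        shown.add(tick * render_fps // refresh_hz)
    return len(shown)

print(distinct_frames_shown(144, 20))   # 20 fps on a 144Hz panel -> 20
print(distinct_frames_shown(144, 144))  # 144 fps on a 144Hz panel -> 144
print(distinct_frames_shown(60, 144))   # 144 fps on a 60Hz panel -> capped at 60
```

    In other words: a high refresh rate raises the ceiling; only the GPU's render rate fills it.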

  12. #12
    Don't piss me off! I'm running out of places to hide the bodies AOD Member AOD_ZED's Avatar
    Rank
    Private First Class
    Division
    AOD Racing
    Status
    Active
    Join Date
    Jul 2016
    Location
    Toronto, Canada
    Posts
    1,996

    Default

    Guys, guys, guys. No arguing about how Hz and FPS work. Here's a perfect answer by Linus on how it all works.

    My YouTube Channel: https://www.youtube.com/channel/UC_0...dKywFi90AiVjAQ
    CPU: i7 7700k @ 4.8 Ghz 1.25v
    GPU: ASUS ROG STRIX GTX 1070 @ 2050 Mhz / 8400 Mhz
    Motherboard: ASUS Z170 Maximus VIII Hero
    RAM: CORSAIR Vengeance LPX 16GB (2 x 8GB) DDR4 3000MHz

  13. #13
    Boycott shampoo! Demand the REAL poo! Time's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Aug 2016
    Age
    30
    Posts
    140

    Default

    ..but he's saying what I just said :(

  14. #14
    #Superhuman AOD Member AOD_Tymplar's Avatar
    Rank
    Specialist
    Division
    Destiny 2
    Status
    Active
    Join Date
    Dec 2015
    Location
    Atlanta, GA
    Age
    37
    Posts
    138

    Default

    Quote Originally Posted by AOD_Time View Post
    ..but he's saying what I just said :(
    This :)

    He did another video where he had someone else try to tell whether BF4 was running at specific refresh rates, and that person mostly couldn't.

    Most people won't ever be able to tell the difference, but the eye CAN see more than 60 FPS / 60Hz. It all depends on how "trained" it is.

    This whole scenario is why I tell people who upgrade their GPU time and time again but never go with a higher-end monitor that they're doing themselves a HUGE disservice. It's like buying a $200,000 car to drive around the city with no intent of ever getting it out on the open road and experiencing what it's fully capable of (in other words, a complete waste of money).

    Yeah - I can stick with a 1080p / 60Hz monitor and upgrade my GPU year after year, and while the gameplay itself may be capable of more and more FPS, the overall experience will never change.

    When I made the jump to 144Hz in January, I felt it was the most significant change in overall gameplay experience since the introduction of 3D rendering back in the late 90's. Right now, nothing beats gaming at 144Hz, period. That's why I'm more of a fan of that vs. 4K at the moment, because we haven't yet gotten to the same type of fluid gameplay at 4K. The absolute best you can do for 4K gaming right now is 100Hz w/ G-Sync, and there's really only one monitor that can do that (the Asus PG27AQ). Asus has a new monitor coming soon (I hope later this year) that does 4K at 144Hz w/ G-Sync (and it'll be 27"). And before anyone references the Dell UP3017Q, that's in no way meant for gaming, it still only supports HDR at 60Hz, and it's $5,000.

    At the end of the day, if a monitor has a max refresh rate of 60Hz, your in-game FPS may get higher and higher, but your overall experience with, say, a GTX 1080 won't really be any different than if you were getting over 60 FPS on your GTX 770 or the like.

    The biggest "LOL" I've seen, though, is someone who bought two of the Maxwell-version Titan X cards and did a YouTube video trying to benchmark and showcase the setup using 1080p, 60Hz monitors.
    i7-6850K Broadwell-E 6-Core 3.60 GHz | Asus ROG Strix X99 Gaming LGA2011-v3 | 128 GB G.Skill Ripjaws V DDR4-3200 | nVIDIA GTX 1080 Ti 11 GB Founders Edition | NZXT H440 ATX Mid-Tower (Matte Black & Red) | Corsair HX1000i Modular PSU
    Samsung 960 Pro 512 GB NVMe SSD | (x3) Samsung EVO 960 1 TB SSD
    Asus ROG Swift 34" PG348Q Ultrawide | (x2) Asus MG279Q 27"

  15. #15
    Boycott shampoo! Demand the REAL poo! Time's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Aug 2016
    Age
    30
    Posts
    140

    Default

    Quote Originally Posted by AOD_Tymplar View Post
    When I made the jump to 144Hz in January, I felt it was the most significant change in overall gameplay experience since the introduction of 3D rendering back in the late 90's. Right now, nothing beats gaming at 144Hz, period. That's why I'm more of a fan of that vs. 4K at the moment, because we haven't yet gotten to the same type of fluid gameplay at 4K.
    Fully agree. I traded out my 4K/60 monitor for a 1440p/144Hz setup. The difference is just overwhelming. It almost felt like cheating the first time I played an FPS on it - you just see and react to things far quicker. Thinking about it, I really should've benchmarked my K/D in BF4 before and after the switch to 144Hz - it would've been one of the more interesting benchmarks I could think of. It went up around 20% instantly.

    Conversely to that, while I was working over in Israel I hopped on to play some Overwatch. Decent enough setup, could hold 1080p/60 fine, but that was as far as the monitor I was using could go.

    I sucked. I sucked hard. I did four matches, and got my ass plain handed to me each and every time. Unless all the Israeli heroes covertly knew Krav Maga, the only point of difference was the monitor I was running on. Everything felt sluggish, and laggy.

    4K/144Hz sounds like a dream, but I think we're easily a good 3 generations away from single-GPU setups able to render at that pace. For now though, I love my 1440p/144Hz/G-Sync setup. Long may it last.

  16. #16
    Don't piss me off! I'm running out of places to hide the bodies AOD Member AOD_ZED's Avatar
    Rank
    Private First Class
    Division
    AOD Racing
    Status
    Active
    Join Date
    Jul 2016
    Location
    Toronto, Canada
    Posts
    1,996

    Default

    Quote Originally Posted by AOD_Tymplar View Post
    This :)

    He did another video where he had someone else try to tell whether BF4 was running at specific refresh rates, and that person mostly couldn't.

    Most people won't ever be able to tell the difference, but the eye CAN see more than 60 FPS / 60Hz. It all depends on how "trained" it is.

    This whole scenario is why I tell people who upgrade their GPU time and time again but never go with a higher-end monitor that they're doing themselves a HUGE disservice. It's like buying a $200,000 car to drive around the city with no intent of ever getting it out on the open road and experiencing what it's fully capable of (in other words, a complete waste of money).

    Yeah - I can stick with a 1080p / 60Hz monitor and upgrade my GPU year after year, and while the gameplay itself may be capable of more and more FPS, the overall experience will never change.

    When I made the jump to 144Hz in January, I felt it was the most significant change in overall gameplay experience since the introduction of 3D rendering back in the late 90's. Right now, nothing beats gaming at 144Hz, period. That's why I'm more of a fan of that vs. 4K at the moment, because we haven't yet gotten to the same type of fluid gameplay at 4K. The absolute best you can do for 4K gaming right now is 100Hz w/ G-Sync, and there's really only one monitor that can do that (the Asus PG27AQ). Asus has a new monitor coming soon (I hope later this year) that does 4K at 144Hz w/ G-Sync (and it'll be 27"). And before anyone references the Dell UP3017Q, that's in no way meant for gaming, it still only supports HDR at 60Hz, and it's $5,000.

    At the end of the day, if a monitor has a max refresh rate of 60Hz, your in-game FPS may get higher and higher, but your overall experience with, say, a GTX 1080 won't really be any different than if you were getting over 60 FPS on your GTX 770 or the like.

    The biggest "LOL" I've seen, though, is someone who bought two of the Maxwell-version Titan X cards and did a YouTube video trying to benchmark and showcase the setup using 1080p, 60Hz monitors.
    I totally agree. I'd also rather have a 144Hz monitor. As soon as I got mine 4 months ago, I noticed the huge difference right away and got a totally different gaming experience. I'd rather stick with my 1080p 144Hz monitor and have 100+ fps than go with 4K 60Hz and have 40-60 fps. Moreover, 4K is still too expensive and still not there for a single-GPU setup. 1440p starts to make some sense, but the jump from 1080p is small and barely makes a difference.
    My YouTube Channel: https://www.youtube.com/channel/UC_0...dKywFi90AiVjAQ
    CPU: i7 7700k @ 4.8 Ghz 1.25v
    GPU: ASUS ROG STRIX GTX 1070 @ 2050 Mhz / 8400 Mhz
    Motherboard: ASUS Z170 Maximus VIII Hero
    RAM: CORSAIR Vengeance LPX 16GB (2 x 8GB) DDR4 3000MHz


 
