  1. #1
    AOD 4 Life! AOD Member AOD_Creelo's Avatar
    Rank
    Specialist
    Division
    PlanetSide
    Status
    Active
    Join Date
    Jan 2015
    Location
    West Virginia
    Posts
    473

    AMD’s $200 Radeon RX 480 video card

    I'm really watching this one. June 29 is the release date.
    http://www.pcworld.com/article/30774...he-masses.html

  2. #2
    Save the whales. Collect the whole set KaosC57's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Location
    Texas
    Age
    25
    Posts
    494

    Default

    Admittedly, AMD is putting out a very aggressive price point for that GPU. One thing I will caution Planetside 2 and Warframe players about: even if PhysX is disabled in your settings, the game still uses underlying PhysX calculations. So for Planetside 2 and Warframe players, your framerates will drop by a good margin if you switch to an AMD GPU, even if it's a raw computational power increase, because AMD GPUs don't accelerate PhysX calculations.

    So if you are thinking of buying the RX 480, make sure you are not buying/playing any games that rely on PhysX; otherwise you might see performance DROPS instead of increases.

  3. #3
    Banned from Forums Zeus121's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Feb 2016
    Age
    31
    Posts
    24

    Default

    Quote Originally Posted by AOD_KaosC57 View Post
    Admittedly, AMD is putting out a very aggressive price point for that GPU. One thing I will caution Planetside 2 and Warframe players about: even if PhysX is disabled in your settings, the game still uses underlying PhysX calculations. So for Planetside 2 and Warframe players, your framerates will drop by a good margin if you switch to an AMD GPU, even if it's a raw computational power increase, because AMD GPUs don't accelerate PhysX calculations.

    So if you are thinking of buying the RX 480, make sure you are not buying/playing any games that rely on PhysX; otherwise you might see performance DROPS instead of increases.
    Not sure if this is still the case with PS2, but if you go into useroptions.ini, under [rendering] set GpuPhysics=0 (0 = off, 1 = PhysX on).

    That should disable it.
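
    For reference, the relevant bit of useroptions.ini should look something like this (section and key names exactly as above; only the one setting is shown, the rest of the file stays as it is):

        [rendering]
        GpuPhysics=0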

  4. #4
    Save the whales. Collect the whole set KaosC57's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Location
    Texas
    Age
    25
    Posts
    494

    Default

    Quote Originally Posted by Zeus121 View Post
    Not sure if this is still the case with PS2, but if you go into useroptions.ini, under [rendering] set GpuPhysics=0 (0 = off, 1 = PhysX on).

    That should disable it.
    Nope. PhysX is still used for the physics calculations even if you turn it off, and I don't think that option even exists anymore anyway.

  5. #5
    Rock and Stone AOD Member AOD_lanius424's Avatar
    Rank
    Specialist
    Division
    Helldivers
    Status
    Active
    Join Date
    Jun 2014
    Location
    Durham, NC
    Age
    27
    Posts
    1,567

    Default

    From what I remember, the devs actually disabled PhysX completely because of how terrible the performance was on both client and server. So I'm not sure you're correct, but I could be wrong.


    Sent from the Matrix using telepathic communication

  6. #6
    Save the whales. Collect the whole set KaosC57's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Location
    Texas
    Age
    25
    Posts
    494

    Default

    Quote Originally Posted by AOD_lanius424 View Post
    From what I remember, the devs actually disabled PhysX completely because of how terrible the performance was on both client and server. So I'm not sure you're correct, but I could be wrong.


    Sent from the Matrix using telepathic communication
    They didn't actually remove or disable it. The game still runs its basic physics calculations through the engine; it would take a complete rewrite to get rid of PhysX entirely.

  7. #7
    If I'm not back in 5....wait longer! R6L84F3J's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Apr 2016
    Location
    Alberta, Canada
    Posts
    35

    Default

    Just buy 2 and crossfire them.

  8. #8
    Let the rivers flow red with the blood of our enemies PS_Anubis's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Oct 2014
    Location
    Parry sound, Ontario, Canada
    Age
    39
    Posts
    107

    Default

    Ya, I can't wait for this sucker. ATM I'm using an R9 290, and in Firestrike Ultra I get a score of ~2000.
    I was gonna upgrade to the 390 series, which would get me up to ~3200, and the Fury would net me a score of ~4000.

    But this sucker is reporting in at a 14,461 score with an i7-4770 (same as me). Even if I got half that score, I'd have almost 4x the performance, it would seem, for a mere 200 bucks....
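
    (Quick sanity check on that ratio, taking the quoted scores at face value; a throwaway sketch, nothing official:)

        # Fire Strike scores quoted above, taken at face value
        my_r9_290_score = 2000
        rx_480_claimed_score = 14461
        ratio = (rx_480_claimed_score / 2) / my_r9_290_score
        print(round(ratio, 1))   # ~3.6, so "almost 4x" checks out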

  9. #9
    If I'm not back in 5....wait longer! R6L84F3J's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Apr 2016
    Location
    Alberta, Canada
    Posts
    35

    Default

    I plan on buying two RX 480 8GB cards when they are released to the public. Going to be epic!

  10. #10
    "Oh great, here comes Captain Dipshit in a LAV" - Pyle986 Grady666's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Apr 2015
    Location
    US
    Age
    28
    Posts
    1,455

    Default

    I'm sorry, but after the 6/7K-series cards and the Athlon II/Phenom II, everything AMD went south. The only thing promising about the Polaris-series GPUs is HBM. HBM is absolutely fucking awesome: HBM2 per the JEDEC spec is 8Gb (1GB) per die at 256GB/s of bandwidth per stack. We can expect to see GPUs/APUs with 16-32GB of VRAM on a 1024-bit-wide bus in the next few years.
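
    (If you want to sanity-check that per-stack figure, it falls straight out of the interface width and pin speed. A rough back-of-the-envelope sketch, assuming the JEDEC HBM2 numbers of a 1024-bit interface at 2 Gbps per pin:)

        # HBM2 per-stack bandwidth, back-of-the-envelope
        bus_width_bits = 1024      # HBM2 interface width per stack
        pin_rate_gbps = 2.0        # assumed HBM2 data rate per pin
        print(bus_width_bits * pin_rate_gbps / 8)   # 256.0 GB/s per stack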

  11. #11
    Looks like I picked the wrong week to quit sniffing glue Ghztr_11's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    May 2016
    Location
    Norway
    Age
    34
    Posts
    39

    Default

    Read some hype about that GPU, looks interesting.

    Would love to buy the RX 480, but with my AMD FX-8320 I just can't see the point of upgrading the GPU when the CPU will bottleneck it -..-

  12. #12
    "Oh great, here comes Captain Dipshit in a LAV" - Pyle986 Grady666's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Apr 2015
    Location
    US
    Age
    28
    Posts
    1,455

    Default

    Quote Originally Posted by AOD_Ghztr_11 View Post
    Read some hype about that GPU, looks interesting.

    Would love to buy the RX 480, but with my AMD FX-8320 I just can't see the point of upgrading the GPU when the CPU will bottleneck it -..-
    Well, I would think that with your experience (even if it's only with 1-2 games) you would never buy an AMD GPU again. Their performance is annoyingly inconsistent, seriously.

  13. #13
    Looks like I picked the wrong week to quit sniffing glue Ghztr_11's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    May 2016
    Location
    Norway
    Age
    34
    Posts
    39

    Default

    Quote Originally Posted by AOD_Grady666 View Post
    Well, I would think that with your experience (even if it's only with 1-2 games) you would never buy an AMD GPU again. Their performance is annoyingly inconsistent, seriously.
    Spot on :P They might talk the RX 480 up sky high, but seriously, at the end of the day, what Nvidia and Intel provide is stable and reliable.
    AMD likes to keep up, in theory. But that's it...

  14. #14
    Very funny Scotty, now beam down my clothes AOD Member AOD_BritishBob's Avatar
    Rank
    Lance Corporal
    Division
    Helldivers
    Status
    Active
    Join Date
    Oct 2014
    Posts
    698

    Default

    That's a bit harsh... AMD GPUs have been consistent in games. Launch titles have been a bit hit or miss due to poor optimisation, but that's an issue between the devs and AMD rather than with the cards themselves.

    The CPUs aren't great for games; games want strong single/dual cores, maybe quad if you're lucky. So AMD's "weak but many" approach doesn't work for gaming, though it's great for server-style applications.


  15. #15
    If I'm not back in 5....wait longer! CallMEKoKo's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Feb 2016
    Location
    California
    Posts
    64

    Default

    Yeah I think I will pick one of these up when they come along.

  16. #16
    All the years to come I want you to remember the one man who beat you. Cropels's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Apr 2014
    Location
    México City
    Age
    28
    Posts
    65

    Default

    I'm waiting for the GTX 1060 6GB; let's see how it goes.

  17. #17
    Criminal Lawyer is a redundancy TekNicTerror's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2014
    Location
    SC, USA
    Posts
    0

    Default

    Spending the extra cash on a 1070 would be better than a 1060.

  18. #18
    #Superhuman Tymplar's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Location
    Atlanta, GA
    Age
    43
    Posts
    194

    Default

    Quote Originally Posted by AOD_TekNicTerror View Post
    Spending the extra cash on a 1070 would be better than a 1060.
    THIS.

    Don't fall into the marketing spin that AMD's trying to pull, that more RAM = a better-performing card. They like to selectively leave out things like memory bus speed and raw bandwidth.

    (Note that GDDR5X offers up to roughly 2x the per-pin bandwidth of GDDR5 with very minimal additional power requirement; quick math on how the raw-bandwidth figures fall out is below the list.)

    GTX 1080 = 8 GB GDDR5X w/ 256-bit bus for 320 GB/sec raw bandwidth
    GTX 1070 = 8 GB GDDR5 w/ 256-bit bus for GB/sec raw bandwidth
    GTX 1060 = 6 GB GDDR5 w/ 192-bit bus for GB/sec raw bandwidth
    GTX 1060 = 3 GB GDDR5 w/ 192-bit bus for GB/sec raw bandwidth
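
    (Quick math, as promised: raw bandwidth is just per-pin data rate x bus width / 8. A minimal sketch using the GTX 1080's numbers, since those are the ones I have; the 10 Gbps pin speed is my assumption:)

        # Raw memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8
        pin_rate_gbps = 10.0       # assumed GDDR5X speed on the GTX 1080
        bus_width_bits = 256
        print(pin_rate_gbps * bus_width_bits / 8)   # 320.0 GB/s, matching the figure above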

    The two GTX 1060 cards are supposed to launch on July 7th at around $250 (3 GB model) and $299 (6 GB model), I believe. Perhaps since they're only using the more widely-available GDDR5, stock will be immediately available?

    RIP Polaris, only a week after launch. Thanks for playing, AMD. Perhaps when you stop trying to compare two cards to one (and only squeak out 4 additional FPS... don't get me started on that "lower utilization / distributed workload" bullshit), and stop comparing your "flagship" card to the absolute lowest-end Maxwell-based card to try and show a superior performance-per-watt metric... All you actually show is that the lowest-end Maxwell-based card is capable of keeping up with your new "flagship" card, and no one gives a damn about PPW at that point. But hey... at least you found a formidable single card to compare to (with lower power consumption being the only thing you win on).

    P.S. FinFET technology is supposed to lead to improved energy efficiency at a higher performance point, yet you struggled against a non-FinFET chip that was designed to be slower.

    But hey, multi-GPU configurations with two cards, configured BY DESIGN and OPTIMIZED to distribute the workload across them, all in an effort to say, "Just imagine if we were both running at 100% load!!!" Yeah... you'd f'n lock up my rig and then I'd fry an egg on your shroud :)

    Fun times, indeed :)

  19. #19
    #Superhuman Tymplar's Avatar
    Rank
    Forum Member
    Division
    None
    Status
    Active
    Join Date
    Dec 2015
    Location
    Atlanta, GA
    Age
    43
    Posts
    194

    Default

    Couldn't find the actual memory bandwidth metrics before being able to update my post, and didn't want to do the math :)

    I was just thinking, though, that either way, MAYBE we're getting close to seeing those early-2000s "GPU wars" right back in front of us. AMD launches the RX 480; a week later, nVIDIA launches the GTX 1060.

    Granted, I'm willing to bet they previously had the 1060s ready to go as part of the original launch, but knew this would happen :)

  20. #20
    Very funny Scotty, now beam down my clothes AOD Member AOD_BritishBob's Avatar
    Rank
    Lance Corporal
    Division
    Helldivers
    Status
    Active
    Join Date
    Oct 2014
    Posts
    698

    Default

    Quote Originally Posted by AOD_Tymplar View Post
    Couldn't find the actual memory bandwidth metrics before being able to update my post, and didn't want to do the math :)

    I was just thinking, though, that either way, MAYBE we're getting close to seeing those early-2000s "GPU wars" right back in front of us. AMD launches the RX 480; a week later, nVIDIA launches the GTX 1060.

    Granted, I'm willing to bet they previously had the 1060s ready to go as part of the original launch, but knew this would happen :)

    I don't know what to say... The RX 480 8GB has 256 GB/s of memory bandwidth...
    http://arstechnica.co.uk/gadgets/201...olaris-review/

    As far as the amount of memory goes, yes, more memory typically means better performance: the larger frame buffer lets the card hold more pre-rendered frame data and larger amounts of data per frame. The cost-per-performance is fine: within 3-5 fps of a 980, better than the 970, and cheaper than all the other cards of similar performance. It fits perfectly into where it was marketed.

    It all depends on how gimped the 1060 is.
    Last edited by AOD_BritishBob; 06-30-2016 at 02:56 AM.



 