I've been following the news a bit about high-end cards as well as the new APU chips from AMD and Intel. I started thinking about it when I was checking for a replacement card (not needed yet, just exploring what's out there) and read a comment like "you might need a nuclear reactor to run the latest RTX". A joke for sure, but still a lot of electricity just to have nicely moving models and textures in games. I use a low-end GT 730, which is good enough for some light gaming and older games.
When I look through all these comparisons, reviews and news, and then at the prices, the one thing that doesn't really make sense to me is the power usage of these dedicated chips, especially at the lower end. The APUs might be slower by only 30-40%, yet they use a fraction of the dedicated card's power. What puzzles me is why the dedicated cards aren't more energy efficient. Why do they need so much power compared to an APU? I would expect them to be at 15 W, maybe 20 W, not 30 W or 47 W and more.
In the past this made more sense, since the graphics on an APU wasn't anywhere near the performance of dedicated cards, but that's no longer the case. I mean, the Ryzen U chips use less than 30 watts in total, while a GPU card only slightly faster than the one included on the CPU needs nearly twice that power for the card alone. AMD looks particularly wasteful if you compare the TDP of its cards to Nvidia's, and Intel's integrated graphics is close to entry-level GPU chips as well, with a lot less power needed.
And look at the real performance in games: there is a difference, sure, but nowhere near as big as the power difference. Add to that the fact that even with a dedicated GPU you still need enough system RAM anyway, and it makes less and less sense.
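To make the comparison concrete, here's a quick back-of-envelope sketch of performance per watt. All the numbers below are made-up placeholders just to show the math, not real benchmark results; plug in actual TDP and FPS figures from reviews to check your own case.

```python
# Rough perf-per-watt comparison. All numbers are hypothetical
# placeholders, not measured data.
def perf_per_watt(fps: float, watts: float) -> float:
    """Average frames per second divided by package/board power."""
    return fps / watts

# Hypothetical case: an APU hitting 60 fps at 15 W vs a low-end
# card that is ~40% faster but draws 47 W on its own.
apu = perf_per_watt(60.0, 15.0)    # 4.00 fps/W
dgpu = perf_per_watt(84.0, 47.0)   # ~1.79 fps/W

print(f"APU:  {apu:.2f} fps/W")
print(f"dGPU: {dgpu:.2f} fps/W")
```

With these (invented) numbers the card is ~40% faster but delivers less than half the frames per watt, which is exactly the mismatch I'm talking about.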
Anyone else noticed this? Or am I reading the power figures wrong?