
What Makes The GPU Hotter?


I feel like the majority of the OP's post has been addressed, but I wanted to comment on this specific argument from his friend. I've never seen anything like this except when playing Overwatch. Seriously? There is also a related Overclock.net thread asking why running dual monitors makes the GPU hotter.
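A likely culprit in the dual-monitor case, worth ruling out before suspecting a fault, is that many cards hold their memory clock at full 3D speed whenever two displays are attached, even on an idle desktop. A minimal sketch for checking this, assuming an NVIDIA card and the pynvml Python package:

```python
import pynvml

# Read the current clocks and temperature of the first GPU.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
print(f"core {core} MHz, memory {mem} MHz, {temp} C")
pynvml.nvmlShutdown()
```

If the memory clock sits at its full 3D value on an idle two-monitor desktop, the higher idle temperature is expected behavior, not a dying card.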

The reason your temps are higher on the CPU has more to do with the fact that it's thermally inefficient; temperature by itself has no bearing on heat output. It sounds as if your loop goes from radiator to GPU to CPU and back to the radiator. I currently have automatic fan speeds enabled, and it holds the speed at 35% at idle. Also note that the water at the CPU and the water at the GPU aren't at the same temperature unless the loop reaches both at the same point.
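The temperature-versus-heat point is easy to check for yourself, since heat output is simply the electrical power the chip is drawing, and the two can be logged side by side. A minimal sketch, again assuming an NVIDIA card and the pynvml package (NVML reports power in milliwatts):

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
# Log temperature and power draw once a second for ten seconds.
for _ in range(10):
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
    print(f"{temp} C at {watts:.1f} W")
    time.sleep(1)
pynvml.nvmlShutdown()
```

A card with a big cooler can draw 250 W while reading cooler than a card drawing 150 W through a weak one, which is exactly why temperature alone says nothing about heat output.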

My Graphics Card Is Getting Too Hot

The whole thing started with me bringing up how I don't like running games at unlocked frame rates: my monitor is 60 Hz, so it can only show 60 fps, yet games like Counter-Strike: Global Offensive will happily render far more than that. Now, if we have more energy going into the GPU, it would only make sense to me that more waste heat must be produced than from a CPU. Frankly, I can live without AA, but HDR really adds to the game (Oblivion). In every other AAA release I get a max of 50-52 °C on ultra settings.
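Capping the frame rate at the monitor's refresh is exactly how you avoid that waste: the GPU sleeps instead of rendering frames a 60 Hz panel can never display. A minimal sketch of the sleep-based limiter logic (render_frame is a hypothetical stand-in for the game's real work, not an actual engine API):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame at 60 Hz

def render_frame():
    """Placeholder for the game's actual per-frame work."""
    pass

for _ in range(600):  # ten seconds at 60 fps
    start = time.perf_counter()
    render_frame()
    # Sleep off whatever is left of the budget, so the GPU idles
    # instead of producing frames the monitor cannot show.
    leftover = FRAME_TIME - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```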

  • Rockdpm (Jul 3, 2012): Well, noise-wise, yes...
  • But for the past couple of weeks, I have noticed that my GPU idles at 54 to 55 degrees.
  • Well, it's good to know that my card isn't dying, lol.
  • [deleted]: Unless you're fucking with firmware, it's hard to kill a card.

It's not "lagging", it's running as fast as it can. *sigh* Reply ↓ Leave a Reply Cancel reply Your email address will not be published. On the NVIDIA side of things, based on the ROPs in their cards, I recommend at least a GTX 970 for resolutions above 1080P, and a GTX 980 TI for resolutions Is it bad to run the fan at 75% at all times, or should I turn manual off when im done gaming? Hardware Monitor This can include operations such as anti-aliasing.

The earlier generation 40 nm GPUs can reach temps of 88-95 °C, though they are made to withstand temps of up to 95-100 °C. doyll (Sep 4, 2014): An H55 really isn't designed to handle 200+ watts, so it will run hot. How is your cooling loop configured?
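The H55 remark follows from simple steady-state math: chip temperature is roughly ambient plus power times the total thermal resistance of the cooling path, in °C per watt. A worked sketch with an assumed resistance value (0.25 °C/W is an illustrative guess, not an H55 spec):

```python
# Steady state: T_chip ~ T_ambient + P * R_total, where R_total lumps
# die-to-coldplate, coldplate-to-water, and radiator-to-air resistance.
def chip_temp(power_w: float, r_total: float, ambient_c: float = 25.0) -> float:
    return ambient_c + power_w * r_total

for watts in (100, 150, 200, 250):
    print(f"{watts} W -> {chip_temp(watts, 0.25):.1f} C")
# 200 W through 0.25 C/W is a 50 C rise: 75 C in a 25 C room,
# and every extra watt pushes the steady state higher.
```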

Hopefully this clears up many misconceptions. Many people will call games poorly optimized or badly made if they cause their graphics card to run abnormally hot. In addition to internal bottlenecks within the graphics card, some games may contain CPU bottlenecks that prevent the graphics card from running at its full potential, or may simply leave it underutilized for other reasons.
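A CPU bottleneck is usually visible from the GPU side as utilization sitting well below 100% while the game runs. A sampling sketch, assuming an NVIDIA card and the pynvml package:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
samples = []
# Sample GPU utilization once a second for half a minute while playing.
for _ in range(30):
    samples.append(pynvml.nvmlDeviceGetUtilizationRates(handle).gpu)
    time.sleep(1)
pynvml.nvmlShutdown()

# A GPU stuck far below 100% during gameplay usually means the CPU
# (or a frame cap) is the limiter, not the graphics card.
print(f"average GPU utilization: {sum(samples) / len(samples):.0f}%")
```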


I recently added another fan to my computer, and my 8600GT now reaches nearly 75 °C under load instead of nearly 90 °C, so I wanted to know which special effect makes the card run hotter.

The game is fine. The electrical resistance of copper also increases as the temperature goes up, so a hotter chip wastes slightly more power doing the same work. Now I'm genuinely curious whether it's NVIDIA with their drivers or the dev team not taking this problem seriously, since we complained in the open beta too. Like with my 40" TV, it idles around the 50s.
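That copper point is easy to put numbers on: resistance rises roughly linearly with temperature, at about 0.39% per °C for copper. A quick worked example:

```python
# R(T) = R0 * (1 + alpha * (T - T0)); alpha for copper is ~0.00393 per C.
ALPHA_CU = 0.00393

def resistance(r0_ohm: float, temp_c: float, t0_c: float = 20.0) -> float:
    return r0_ohm * (1 + ALPHA_CU * (temp_c - t0_c))

# A trace that measures 1.000 ohm at 20 C is about 1.255 ohm at 85 C,
# roughly 26% more resistive loss for the same current.
print(f"{resistance(1.0, 85.0):.3f} ohm")
```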

A stress test pushes your graphics card to use the vast majority of its processing units at once, which creates a "worst case scenario" in terms of power consumption and heat generated.
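For illustration, here is a toy version of what such a stress test does, assuming PyTorch with CUDA support is installed. Real tools like FurMark are purpose-built native code; treat this only as a sketch of the idea:

```python
import torch

# Assumes a CUDA-capable GPU; a toy stand-in, not a FurMark replacement.
assert torch.cuda.is_available()
a = torch.rand(4096, 4096, device="cuda")
b = torch.rand(4096, 4096, device="cuda")

# Back-to-back large matrix multiplies keep most execution units busy,
# which is the worst-case power and heat scenario described above.
for _ in range(1000):
    a = torch.matmul(a, b)
    a = torch.tanh(a)       # squash values so they never overflow
torch.cuda.synchronize()    # wait for all queued work to finish
print("stress loop finished; watch temperatures while it runs")
```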

So, strictly speaking, yes, higher temperatures will have an impact on the lifetime of the card. Is the temperature range seen in normal use a significant impact on the expected lifetime of a consumer device? I don't think so. You can complain that the game is 'using too much GPU resources' if you want.
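The lifetime trade-off can be given rough numbers with the common rule of thumb that failure rates in electronics roughly double for every 10 °C of sustained temperature increase. This is a coarse model, not a datasheet figure:

```python
# Rule of thumb: expected lifetime roughly halves per +10 C sustained.
def relative_lifetime(temp_c: float, ref_temp_c: float = 60.0) -> float:
    return 2 ** ((ref_temp_c - temp_c) / 10.0)

for t in (60, 70, 80, 90):
    print(f"{t} C -> {relative_lifetime(t):.2f}x the lifetime at 60 C")
# Running 20 C hotter around the clock cuts the rough estimate to a
# quarter, yet the card likely becomes obsolete long before it fails,
# which is why the practical impact is small.
```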

I've heard this is because AMD was trying to position their price/performance differently; I just don't see why they would purposefully handicap their cards like that.

djscribbles (Jul 3, 2012): Not really. I would just let it run on auto all the time. At least for me, manual mode seems to just be broken.

I read in some post here, about something which I don't quite recall, that the GPU's transistors come into use when you enable AA or HDR. For a GPU, 80 °C isn't really a big deal; when it gets up there, your fan should kick up higher (become audible) to cool it back down.

reze-daniel (Jul 5, 2007): I read in some post here about something which I don't quite remember, but it said that when the card uses its transistors it gets hotter. Still, running cooler means using less power and producing less heat and noise, which is a good thing in itself. If temperature tracked power draw, you'd expect an FX-8350 that loads at 60 °C to use less power than an i7-3770K at 75 °C, and it doesn't.

The game is fine. Now, as far as ventilation goes, I haven't moved anything in years. 86 °F ambient? Wut, hot weather is disgusting. cemges: Gaming software is built around maxing out the capabilities of GPUs; as such, GPUs are also built to operate at their maximum capability, within reason.

This is what happens when you get consoles involved.