3 Replies Latest reply on Oct 22, 2012 9:14 AM by Jason

    Is it possible for a CPU to become sluggish over time as a result of gradual damage?

    Sesshy

      My friend and I are having a debate about whether CPU performance can actually diminish as a result of damage accumulated over time. In my experience, CPU damage first results in errors and eventually in total failure, neither of which has anything to do with sluggishness. He claims that a CPU can somehow "decide" not to use damaged transistors and thereby continue to function at diminished capacity. He claims, in particular, that this problem affects smartphone hardware more than other computer hardware because of the form factor and the associated complications with heat.


      Can anybody provide us with an authoritative answer to settle our debate?

        • 1. Re: Is it possible for a CPU to become sluggish over time as a result of gradual damage?
          Jason

          I don't know the answer to your question off the top of my head, but are you asking in the context of overclocking?

          • 2. Re: Is it possible for a CPU to become sluggish over time as a result of gradual damage?
            Sesshy

            No, it's in the broader context of the heat that a CPU normally incurs, which gradually shortens the component's lifespan. My friend's position is that with smartphones in particular, heat takes a more dramatic toll than it ever could on a desktop or a laptop.

            I know this isn't true because I beat the hell out of my E6600 running it at 3.4GHz for several years, and the result is that if I don't overvolt it enough, it becomes really unstable, not that it runs more "slowly" than it normally would. I've also been a repair tech for a number of years, and I know from experience that when hardware components go bad over time, they either start to produce serious errors or they stop working altogether.

            My friend disputes this and asserts that hardware like CPUs can simply become slower through what he seems to think is a gradual loss of overall transistor count, and he went on to point out how sensitive transistors are and how easily they're damaged by things like UV radiation. He argues that because the transistors aren't "wired in a series," the CPU can continue to function just fine even if it loses a significant portion of its count.

            Personally, I think he has absolutely no idea what he's talking about and is making wild extrapolations from what little knowledge he does have of electronics and circuitry, but we need somebody like an engineer who can authoritatively settle the matter one way or the other.

            • 3. Re: Is it possible for a CPU to become sluggish over time as a result of gradual damage?
              Jason

              After I read your post, I did a little searching. As it turns out, after a certain point in time, over-voltage is required for stability, as you say. I still can't say that I know for sure, but have a look at this article.

              AnandTech - Intel's 45nm Dual-Core E8500: The Best Just Got Better