Power Consumption

The nature of reporting processor power consumption has become, in part, a dystopian nightmare. Historically, the peak power consumption of a processor, as purchased, has been given by its Thermal Design Power (TDP, or PL1). For many markets, such as embedded processors, that TDP value still signifies the peak power consumption. For the processors we test at AnandTech, whether desktop, notebook, or enterprise, this is not always the case.

Modern high-performance processors implement a feature called Turbo. This allows a processor, usually for a limited time, to go beyond its rated frequency. Exactly how far the processor goes depends on a few factors, such as the Turbo Power Limit (PL2), whether the peak frequency is hard-coded, the thermals, and the power delivery. Turbo can sometimes be very aggressive, allowing power values 2.5x above the rated TDP.
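As a rough illustration of how an Intel-style PL1/PL2 budget typically plays out, here is a toy model. The constants PL1, PL2, and TAU below are illustrative values, not the specification of any particular chip, and the real firmware algorithm is more involved: the idea is that the package may draw up to PL2 while an exponentially weighted moving average of its power stays below PL1, and is clamped back to PL1 once the average catches up.

```python
# Toy model of Intel-style turbo power limits. Illustrative constants only;
# real parts expose these as configurable package power limit settings.
PL1 = 125.0   # sustained limit (W), i.e. the rated TDP
PL2 = 250.0   # short-term turbo limit (W)
TAU = 56.0    # time constant (s) of the moving average

def allowed_power(avg_power: float) -> float:
    """While the moving average is below PL1, the package may draw up to
    PL2; once the average reaches PL1, draw is clamped back to PL1."""
    return PL2 if avg_power < PL1 else PL1

def simulate(demand_w: float, seconds: int, dt: float = 1.0) -> list[float]:
    """Step the moving average forward and record the power actually drawn
    each interval for a workload that wants demand_w watts."""
    avg, trace = 0.0, []
    for _ in range(int(seconds / dt)):
        draw = min(demand_w, allowed_power(avg))
        avg += (dt / TAU) * (draw - avg)  # EWMA update with time constant TAU
        trace.append(draw)
    return trace

trace = simulate(demand_w=300.0, seconds=120)
# Early samples sit at PL2; once the average catches up, draw falls to PL1.
```

In this sketch a 300 W workload starts at the 250 W turbo limit and settles at the 125 W sustained limit once the budget is exhausted, which is the "limited time" behaviour described above.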

AMD and Intel define TDP differently, but broadly speaking the two definitions are applied in the same way. The difference comes down to turbo modes, turbo limits, turbo budgets, and how each processor manages that power balance. These topics are 10,000-12,000 word articles in their own right, and we've got a few articles worth reading on the topic.

In simple terms, processor manufacturers only ever guarantee two values, which are tied together: when all cores are running at base frequency, the processor should be running at or below its TDP rating. All turbo and power modes above that are not covered by warranty. Intel kind of screwed this up with the Tiger Lake launch in September 2020 by refusing to define a TDP rating for its new processors, instead going for a range. Obfuscation like this is frustrating for press and end-users alike.

For our tests in this review, we measure the power consumption of the processor in a variety of different scenarios. These include full peak AVX workflows, a loaded render test, and others as appropriate. These results are intended as comparative data points between processors. We also note the peak power recorded in any of our tests.
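Our internal tooling differs, but as a sketch of how package power can be derived in software: on Linux, Intel's RAPL interface exposes a wrapping microjoule energy counter (e.g. `/sys/class/powercap/intel-rapl:0/energy_uj`), and average power is simply the energy delta over the sampling interval. The wrap value below is an illustrative default; the real rollover point is reported by the interface itself.

```python
def watts_from_energy(e0_uj: int, e1_uj: int, dt_s: float,
                      wrap_uj: int = 2**32) -> float:
    """Average package power (W) between two readings of a wrapping
    microjoule energy counter. wrap_uj is the counter's rollover point
    (illustrative default; RAPL reports it via max_energy_range_uj)."""
    delta_uj = (e1_uj - e0_uj) % wrap_uj   # modulo handles counter wrap
    return delta_uj / 1e6 / dt_s

# Example: 1,500,000 uJ consumed over a 0.5 s interval -> 3.0 W average.
print(watts_from_energy(1_000_000, 2_500_000, 0.5))  # 3.0
```

Sampling this at a fixed interval over a benchmark run gives a power trace, from which both sustained and peak figures like those below can be read off.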

First up is our loaded render test, designed to push power draw to its peak.

In this test the 3995WX with only 64 threads actually uses slightly less power, given that running one thread per core doesn't keep every part of the chip active. Despite this, the 64C/64T benchmark result is ~16000 points, compared to ~12600 points with all 128 threads enabled. Also in this chart, we see that the 3955WX with only sixteen cores hovers around the 212 W mark.

The second test is from y-Cruncher, our AVX2/AVX-512 workload. It also has some memory requirements, which can lead to periodic power cycling on systems with lower memory bandwidth per core.

Both of the 3995WX configurations perform similarly, while the 3975WX shows more variability as it requests data from memory, causing the cores to idle slightly. The 3955WX peaks around 250 W this time.

For peak power, we report the highest value observed from any of our benchmark tests.

Peak Power

As with most AMD processors, there is a total package power tracking (PPT) value, and for Threadripper Pro that is the same as the TDP at 280 W. I have included the AVX2 values here for the Intel processors; however, under AVX-512 these will turbo to 296 W (Core i9-11900K) and 291 W (Xeon W-3175X).

98 Comments

  • Thanny - Thursday, July 15, 2021 - link

    Your Blender results for the 3960X are off by a lot. I rendered the same scene with mine in 173 seconds. That's with PBO enabled, so it'll be a bit faster than stock, but not 20% faster.

    My guess is that you didn't warm Blender up properly first. When starting a render for the first time, it has to do some setup work, which is timed with the rest of the render, but only needs to be done once.

    I'd expect a stock 3960X to be in the neighborhood of 180 seconds.
  • 29a - Thursday, July 15, 2021 - link

    "Firstly, because we need an AI benchmark, and a bad one is still better than not having one at all."

    I 100% disagree with this statement. Bad data is worse than no data at all.
  • arashi - Saturday, July 17, 2021 - link

    But but but what about the few (<10) clicks they'd lose for not having lousy CPU based AI benchmarks!
  • willis936 - Thursday, July 15, 2021 - link

    Availability of entry level ECC CPUs (AMD Pro and Intel Xeon E-2200/W) is really low. It's unfortunate. People don't have the cash for $10k systems right now but the need for ECC has only gone up. I hope for more editorials calling for mainstream ECC.
  • Threska - Thursday, July 15, 2021 - link

    Linus is mainstream enough.

    https://arstechnica.com/gadgets/2021/01/linus-torv...
  • Mikewind Dale - Thursday, July 15, 2021 - link

    At least mainstream desktop Ryzens tend to support ECC, even if not officially validated.

    What frustrates me is that laptop Ryzens don't support ECC at all - not even the Ryzen Pros.

    Every Ryzen Pro laptop I've seen lacks ECC support, and some of them even have non-ECC memory soldered to the motherboard.

    If you want an ECC laptop, it appears you have literally no choice at all but a Xeon laptop for $5,000.
  • mode_13h - Friday, July 16, 2021 - link

    > laptop Ryzens don't support ECC at all - not even the Ryzen Pros.

    It probably depends on the laptop. If its motherboard doesn't have the extra traces for the ECC bits, then of course it won't.
  • Mikewind Dale - Saturday, July 17, 2021 - link

    It depends on the laptop, yes. But I haven't found a single Ryzen Pro laptop from a single company that supports ECC.

    AMD's website ("Where to Buy AMD Ryzen™ PRO Powered Laptops") lists HP ProBook, HP EliteBook, and Lenovo Thinkpad. But none of them support ECC.
  • mode_13h - Saturday, July 17, 2021 - link

    > I haven't found a single Ryzen Pro laptop from a single company that supports ECC.

    Thanks for the datapoint. Maybe someone will buck the trend, but it's also possible they judged the laptop users who really care about ECC would also prefer a dGPU and therefore won't be using APUs.
  • mode_13h - Friday, July 16, 2021 - link

    > I hope for more editorials calling for mainstream ECC.

    You'll probably just get inferior in-band ECC.
