Sandy Bridge Celerons

Intel released Sandy Bridge-based Celeron CPUs in early September, and these started appearing in retail channels by the middle of that month; we provided a brief overview of these parts. The Celeron that stands out is the G530, a dual-core CPU clocked at 2.4GHz with 2MB L3 cache and on-die Intel HD Graphics. This processor lacks Hyper-Threading and Quick Sync support, and it has a TDP of 65W (though it will generally use far less power than that). While Intel's suggested pricing is a meager $42, retail prices have held steady since its release at $55-60. It is Intel's least powerful dual-core CPU, with only the single-core G440 available for less money.

If you've been building and using computers for years, you know there is a stigma attached to the Celeron name. For a long time, Celerons were crippled to the point of near-unusability for even the most basic tasks. That has changed: as our basic benchmarks indicate, the G530 is anything but an almost-garbage CPU. The Celeron stigma is dead.

Athlon II X2s

AMD's Athlon II X2 Regor-based 45nm dual-cores have been a mainstay of budget computing since their introduction in 2009. The Athlon II X2 250, clocked at 3.0GHz with 2MB L2 cache, is essentially as capable today as it was two years ago for basic usage. For example, 1080p videos on YouTube are no more difficult to decode and Microsoft Office 2010 isn't much more CPU-hungry than Office 2007 was. Given that most computers I assemble are budget systems, I've now used the Athlon II X2 250 for more builds than any other CPU. Is that about to change?

Llano APUs

AMD's most recent APUs (accelerated processing units) have also expanded into the budget processor range. These Fusion APUs combine both the CPU and Radeon "cores" on a single die. Anand reviewed the most capable APU back in June, and compared the A6 model to Intel's Sandy Bridge Pentium in late August. The more recently released 32nm A4-3300 chip (overviewed by Anand in September) is a dual-core part clocked at 2.5GHz with 1MB total L2 cache and featuring AMD's Radeon HD 6410 graphics—160 GPU cores clocked at 443MHz. Its nominal TDP is 65W. Priced around $70, the A4-3300 is only about $10 more than the Celeron G530 and Athlon II X2 250. It promises better graphics performance—but how does the least expensive A-series APU compare to inexpensive discrete video cards, and do you sacrifice processor performance for better graphics?

Battle of the Budget Processors: Benchmarks

While we didn't put the Celeron G530 and A4-3300 through our extensive Bench suite, here are a few benchmarks that show how they stack up against the venerable Athlon II X2 250. All benchmarks were performed using an Antec Neo Eco 400W power supply, a Western Digital Blue 500GB WD5000AAKX hard drive, and a 2x2GB kit of DDR3-1333 with a clean installation of Windows 7 Enterprise 64-bit, with only the manufacturer-supplied drivers installed.

Conversion of a PowerPoint Presentation to a PDF

For this benchmark, I converted a 100-slide, 25MB PowerPoint file to a PDF using Microsoft Office 2010's integrated "Save as PDF" option. As you can see, the Athlon II CPU performs this task slightly faster than the Celeron, though in practice you'll only notice the difference when converting extremely large PowerPoint files. The Fusion APU is substantially slower; this is a difference you will notice in real-world usage scenarios.

7-Zip performance

These values were obtained using 7-Zip's built-in benchmark function with a 32MB dictionary. AMD's Athlon II CPU has a more noticeable advantage over the Celeron here; you will notice a difference when compressing or decompressing many files or large files. The A4-3300 again performs palpably worse, which is no surprise given its lower 2.5GHz clock compared to the Athlon's 3.0GHz.

FastStone image resizing

For this test, I resized 50 pictures from 4200p down to 1080p resolution using FastStone's batch image conversion function. Again, the two CPUs perform similarly, though this time Intel takes the lead. The AMD APU once again lags significantly behind the two CPUs.

x264 HD encode test

Graysky's x264 HD test (v. 3.03) uses x264 to encode a 4Mbps 720p MPEG-2 source. The focus here is on quality rather than speed, thus the benchmark uses a 2-pass encode and reports the average frame rate in each pass. The difference between the Athlon II and Celeron CPUs is essentially nil; both offer better performance than the AMD APU.

Power consumption

As with the above benchmarks, all components except the CPU and motherboard were held equal for power consumption testing. The Athlon II platform used an ASRock 880GM-LE motherboard, the Intel platform an ASRock H61M-VS, and the APU an ASRock A55M-HVS. This is where the efficiency of the newer architectures truly outshines the older Athlon II design. Measurements were taken using a P3 International P4400 Kill A Watt meter and reflect total system power draw, not just the CPU.

Intel's Celeron still leads for low power use, but Llano is at least within striking distance. The older Athlon II X2 uses around 50% more power than Llano for these two tests, or around 17 to 30W more. Taking the lower number and assuming a system that's only powered on eight hours per day, we end up with a difference of around 50kWh per year, or $4 to $15 depending on how much you pay for electricity. If you're in a market where power costs more, there's obviously a lot to be said for going with the more efficient architectures.
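The arithmetic behind that estimate is easy to check. Here's a quick sketch: the 17W delta and eight-hour duty cycle come from our measurements above, while the $0.08 and $0.30 per kWh electricity rates are illustrative assumptions, not measured figures.

```python
# Back-of-envelope annual cost of a constant power-draw difference.
# delta_watts and hours_per_day come from the article; the rates are
# assumed examples spanning cheap and expensive electricity markets.
def annual_cost(delta_watts, hours_per_day, rate_per_kwh):
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000
    return kwh_per_year, kwh_per_year * rate_per_kwh

kwh, low = annual_cost(17, 8, 0.08)   # inexpensive electricity
_, high = annual_cost(17, 8, 0.30)    # expensive electricity
print(round(kwh, 1), round(low, 2), round(high, 2))
# → 49.6 3.97 14.89
```

So the 17W floor alone works out to roughly 50kWh per year, and the larger 30W gap or a longer duty cycle pushes the dollar figure toward the top of that $4-$15 range.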

Gaming benchmarks

Next, we test how the AMD A4-3300 APU's graphics prowess stacks up against a budget GPU. The AMD Athlon II and Intel Celeron CPUs were paired with an AMD Radeon HD 5670 512MB GDDR5 discrete card, as neither of their integrated graphics solutions can deliver a tolerable gaming experience. The A4-3300 was not paired with a discrete GPU.

Left 4 Dead 2

For the Left 4 Dead 2 benchmark, we used a 1024x768 resolution with all settings at maximum (but without antialiasing). The AMD APU delivers almost 40 frames per second by itself, so no discrete graphics card is required. Subjectively, gameplay was smooth and fluid on the APU. However, bumping up the resolution to even 720p could be an issue, even with less demanding games.

DiRT 3

For the DiRT 3 benchmark, we used DirectX 11 at 1024x768 resolution, but this time graphics options were set to the low preset. Even then, the AMD APU struggled to breach the 30 frames per second threshold, and DiRT 3 clearly did not run as smoothly as Left 4 Dead 2. That said, it remained playable, and if you're tolerant of lower resolutions, it performs fine in windowed mode.

Keep in mind that we're using the bottom-rung Llano APU for these tests, and it's a pretty major cut from the A6 models: half the shader cores (though at a slightly higher clock) and only a dual-core CPU. Where the A6 and A8 can legitimately replace budget discrete GPUs, the same cannot be said for the A4 APUs. The lowest-priced A6-3500 will set you back around $100, but it drops the CPU clock to 2.1GHz and only adds a third core. Meanwhile, the quad-core A6-3650 will run $120 ($110 with the current promo code), and it sports a 2.6GHz clock with HD 6530D graphics (and a higher 100W TDP). At that point, you might also be tempted to go for the A8-3850, with the full HD 6550D graphics and a 2.9GHz clock, bringing the APU total to $135. All of these APUs will work in the same base setup as our Llano build, but obviously the price climbs quite a bit. If you'd like added processing and graphics power, though, the quad-core parts make sense.


As you can see, the Athlon II and Celeron CPUs are very evenly matched across a range of basic productivity tests, while the Fusion APU typically lags behind, at least for office productivity and encoding tasks. That said, the A4-3300 is capable of delivering an acceptable gameplay experience for casual gamers without necessitating a discrete GPU. Additionally, Intel's newer Sandy Bridge architecture and AMD's newer Llano architecture result in dramatically lower total system power consumption at both idle and load compared to the aging AMD Regor architecture.

So which CPU should you buy for your budget build? In terms of upgradeability, socket AM3 is still viable. In the short term, Phenom II quad-cores are already inexpensive, starting at just over $100—so they will be even cheaper in another year or two. Of course, Bulldozer CPUs are compatible with many AM3+ motherboards and could be a wise upgrade in a few years as well. Intel's LGA 1155 socket is also very upgrade-friendly—the Celeron G530 is, after all, the least powerful Sandy Bridge CPU (aside from the sole single-core SKU). The Core i3-2100 will likely sell for less than $100 in another year or so (at least on the secondhand market), and more powerful Core i5 and i7 chips could keep today's Intel budget build alive and well for maybe as much as five more years. As with the Celeron G530 on LGA 1155, AMD's socket FM1 has nowhere to go but up from the A4-3300 APU. That said, LGA 1155 currently offers far more powerful CPUs than the high-end A8-3850.

I think in this case, given how evenly the CPUs perform (aside from power consumption), and that both offer lots of upgrade potential, the decision will come down to overall platform cost and features. The A4-3300 APU does offer an acceptable general, basic computing experience—its real strength is its ability to play less resource-intensive games without the extra cost of a discrete GPU. We cover a few budget AMD and Intel platform motherboards on the next page.
Comments

  • buildingblock - Tuesday, November 8, 2011 - link

    We now have the curious situation where AMD is selling both the A6-3650 APU and the Athlon II X4 631 on socket FM1, the latter being the same unit with the graphics disabled. Because of the design constraints of Llano, and I suspect because the die-shrink to 32nm didn't really work out that well, the CPU part of the current Llano range is puny compared to the socket 1155 processors, even the low-end budget Gxxx range. At my local hardware dealer, the X4 631 is priced higher than the Intel G-series equivalent, but that seems to be the theme of AMD's current APU/CPU offerings: uncompetitive performance and uncompetitive pricing.
  • Iketh - Tuesday, November 8, 2011 - link

    You have the 500MHz difference, and the A4 also has half the L2 cache of the X2. 1MB of L2 cache with no L3 cache is anemic.

    Ignore slayernine, he's a babbling idiot.
  • Wierdo - Wednesday, November 9, 2011 - link

    Ah, if the cache structure is different then I could see one potential reason for variation in same-core performance. Thanks, I didn't spot that.
  • slayernine - Tuesday, November 8, 2011 - link

    May I suggest an interesting alternative build that costs a bit more but is still within reach of most budgets. This system build is very tiny, good for those with limited space or in want of a portable machine:

    AMD A8-3850 2.9GHz $139
    ASRock A75M-ITX $94
    G.SKILL Ripjaws Series 8GB $34
    XFX HD-667X-ZHF3 6670 $83 (not including $25 MIR)
    SILVERSTONE SG05BB-450 (incl 450w PS) $129
    Crucial M4 CT064M4SSD2 64GB $119

    This system is tiny and takes advantage of AMD's Dual Graphics between the onboard GPU and the 6670. I normally shop but I bought this system from because they actually had AMD Mini ITX boards. Please note these are Canadian prices as well. I would suggest a Momentus XT 500GB drive for this system if it were not for the insane prices right now. In this build I'm actually not purchasing a new hard drive; I'm reusing a 60GB OCZ that I just got back from RMA. The RMA business is a big reason why I don't recommend OCZ; Intel and other brands are so much more reliable.
  • A5 - Tuesday, November 8, 2011 - link

    Your system costs double those in this article (once you take out the Win7 license). Also, the A8 is a waste if you're going to use a dGPU anyway.
  • slayernine - Tuesday, November 8, 2011 - link

    1. Canadian prices are higher than American ones, e.g. a $60 mobo turns into a $90 mobo. This is not a currency value issue; it's more that once things cross the border they magically cost more.

    2. The A8 processor is not a waste if you know about Dual Graphics. You effectively get a 6690D2, which offers performance similar to the 6770 without paying more in money or power usage.

    Educate yourself on Dual Graphics (sorry for the non-AnandTech link):

    3. I think $400 is not enough to spend on a system, even if it is a budget computer. Also, I did forget about the OS, as I had previously purchased one.
  • silverblue - Tuesday, November 8, 2011 - link

    Asymmetric CrossFire is (or was... any change?) hit-and-miss. In some cases, it can actually harm performance to the point that the iGPU alone isn't much slower. However, in other cases, it does work very well. Going by your second link, WoW works better, but Metro 2033 loses performance.

    The following AT link provides more data on aCF's performance (admittedly, things may have changed since then):
  • slayernine - Tuesday, November 8, 2011 - link

    Thanks for the link; for some reason I couldn't find that article in my quick Google search. Check out this article, which actually reviews the 6690D2 configuration I've been talking about (I hate their graphs, love the AnandTech ones). Rage3D doesn't compare enough games, unfortunately, but the ones it does use show the 6690D2 > 6670:

    The other option I was also considering for this build was to go with Intel plus a 6770 which you can also find single slot cards for:

    However, you will notice much higher power requirements on the 6770, and it needs a 4-pin power connector on the end of the card, something which caused me a lot of hassle when taking my 4850 out of my previous mini PC build.
  • Paul Tarnowski - Tuesday, November 8, 2011 - link

    3. That is your choice. This is about building a budget system. When a client asks me to supply an office computer, putting in Hybrid Crossfire is not going to make them magically want to spend double. Likewise for home use for the grandparents or so the little kids have something to write their homework on (they tend to play on iPads if they have them).

    Budget means that you have a low amount allotted to the project. Otherwise you miss the entire point of the article.
  • slayernine - Tuesday, November 8, 2011 - link

    I'm looking at this from the perspective of a budget gamer. I realize that the average Joe who just surfs the web doesn't give a crap about crossfire or gaming performance.

    What I'm saying is that without breaking the bank you can get significantly improved performance with AMD's new Dual Graphics (Hybrid CrossFire, Asymmetric CrossFire, whatever else people want to call it). Also note that some games see this benefit more than others, so it depends what you play.
