"The 5500 is really the first chip to escape from the personal computing bias of the original x86 chips. It has a memory controller built onto the chip instead of off-loaded to a separate dedicated chip, reducing latencies encountered as a VM's operating system manages the memory that its application is using."
That's all well and good ... except AMD chips have had the memory controller on the die since 2003, so Intel is "cutting edge" by staying six years behind the curve. Granted, the article doesn't explicitly say "AMD has yet to do this," but it also goes to great lengths not to say "the first Intel chip to not suck, by finally including 20-year-old technology."
Makes me wonder just how much ad money Information Week gets from Intel, considering anyone who knows anything about chip architecture would tell you this has been a huge advantage for AMD for the last several years: the CPU was so much faster than everything else in the box that you had to do everything you could to feed it data faster. And when you're talking computers, a shared, one-talker-at-a-time 1980s front-side bus is not usually mentioned in the same sentence as "fast."
Of course, Information Week targets the people writing the checks, who know nothing about hardware and will tell their peons, "We should buy Intel servers because they made this great technological leap that no one else has yet." Kind of like how at work we moved from Legato backup to CommVault backup (as far as I can tell, the only backup product worse than Legato) right when the Legato ads in Information Week stopped and the CommVault ads started up. It's always a sad day when marketing outweighs technology, but if it's good enough for Microsoft, why shouldn't everyone else do it too, right? I hope they publish my response to the editors.