
Debunking Hype


Josh


I am usually very excited to read about new graphics techniques, and even new hardware approaches, even if they are presently impractical for real use. I was very interested in Intel's Larrabee project, even though I didn't expect to see usable results for years.

 

However, sometimes articles get published that are nothing but snake oil to raise stock prices. The uninformed reader doesn't know the difference, and these articles are usually written in such a way that they sound authoritative and knowledgeable. It's unfair to consumers, it's unfair to stockholders, and it hurts the industry, because customers become unable to differentiate between legitimate claims and marketing nonsense. This one is so over-the-top that I have to say something.

 

In an attempt to stay relevant in real-time graphics, Intel, the company that single-handedly destroyed the PC gaming market with their integrated graphics chips, is touting anti-aliasing on the CPU.

 

There's a nice explanation with diagrams that make this sound like an exciting new technique Intel engineers came up with. The algorithm looks for edges and attempts to smooth them out:

[Image: Intel's diagram of the edge-detection anti-aliasing technique]

 

It's so advanced that I wrote this exact same algorithm back in 2007, just for fun. Below are my images from it.

 

Original:

[Image: original unprocessed render]

 

Processed:

[Image: the same render after the edge-smoothing pass]
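For anyone curious, here's a rough sketch of the kind of filter we're talking about. This is not my original 2007 code and not Intel's; the buffer layout, luminance weights, threshold, and function names are just illustrative choices:

```cpp
#include <cmath>
#include <vector>

// One pixel of a linear RGB image stored as floats in [0, 1].
struct Pixel { float r, g, b; };

// Approximate luminance, used to decide where the edges are.
static float Luminance(const Pixel& p)
{
    return 0.299f * p.r + 0.587f * p.g + 0.114f * p.b;
}

// Very simple post-process anti-aliasing: find pixels whose luminance differs
// sharply from their neighbors, and blend those pixels toward the average of
// the surrounding 3x3 block. Interior pixels only, for brevity.
void SmoothEdges(std::vector<Pixel>& image, int width, int height, float threshold = 0.1f)
{
    std::vector<Pixel> source = image; // read from a copy, write to the original

    for (int y = 1; y < height - 1; ++y)
    {
        for (int x = 1; x < width - 1; ++x)
        {
            const Pixel& center = source[y * width + x];
            float centerLum = Luminance(center);

            // Measure the strongest luminance step to the four direct neighbors.
            float maxDiff = 0.0f;
            const int offsets[4][2] = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };
            for (const auto& o : offsets)
            {
                const Pixel& n = source[(y + o[1]) * width + (x + o[0])];
                maxDiff = std::fmax(maxDiff, std::fabs(Luminance(n) - centerLum));
            }

            if (maxDiff < threshold)
                continue; // not an edge pixel, leave it alone

            // Edge pixel: blend it halfway toward the 3x3 neighborhood average.
            Pixel avg = { 0.0f, 0.0f, 0.0f };
            for (int dy = -1; dy <= 1; ++dy)
            {
                for (int dx = -1; dx <= 1; ++dx)
                {
                    const Pixel& n = source[(y + dy) * width + (x + dx)];
                    avg.r += n.r; avg.g += n.g; avg.b += n.b;
                }
            }
            avg.r /= 9.0f; avg.g /= 9.0f; avg.b /= 9.0f;

            Pixel& out = image[y * width + x];
            out.r = 0.5f * (center.r + avg.r);
            out.g = 0.5f * (center.g + avg.g);
            out.b = 0.5f * (center.b + avg.b);
        }
    }
}
```

A production-quality MLAA pass is smarter about following edge shapes, but the core idea is exactly this: find a discontinuity, then blend across it.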

 

The reason this is dishonest is that you would never do this in real time on the CPU. It may be possible, but you can always perform anti-aliasing on the GPU an order of magnitude faster, whether the device is a PC, a netbook, or a cell phone. I don't think Sebastian Anthony has any clue what he is writing about, nor should he be expected to, since he isn't a graphics programmer.

 

Furthermore, swapping images between the GPU and the CPU requires the CPU to wait for the GPU to "catch up" to the current instructions. You can see they completely gloss over this important aspect of the graphics pipeline:

This means that a pipeline can be created where the graphics hardware churns out standard frames, and the CPU handles post-processing. Post-processed frames are handed back to the GPU for any finishing touches (like overlaying the UI), and then they’re sent to the display.

Normally, graphics are a one-way street from the CPU, to the GPU, to the monitor. The CPU throws instructions at the GPU and says "get this done ASAP". The GPU renders as fast as it can, but there is a delay of a few milliseconds between when the CPU says to do something and when the GPU actually does it. Sending data back to the CPU forces the CPU to wait and sync with what the GPU is doing, causing a delay significant enough that you NEVER do this in a real-time renderer. This is why occlusion queries have a short delay when used to hide occluded objects: the CPU doesn't get the results of the query until a few frames later. If I made the CPU wait for the results before proceeding, the savings gained by hiding occluded geometry would be completely negligible compared to the enormous slowdown you would experience!
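To make that concrete, here's a rough OpenGL sketch of the two situations. None of this is taken from Intel's proposal, and the surrounding context setup, scene rendering, and texture binding are assumed: the first function is the kind of readback round trip their pipeline would need, and the second pair shows how occlusion queries deliberately avoid blocking by accepting results that are a frame or two stale.

```cpp
#include <GL/glew.h> // assumes an OpenGL context has already been created
#include <vector>

// 1) The round trip a CPU post-process pipeline requires: read the finished
//    frame back to system memory so the CPU can filter it. glReadPixels cannot
//    return until the GPU has finished everything queued before it, so the CPU
//    sits idle waiting for the GPU, and then the GPU sits idle waiting for the
//    CPU's filter and the re-upload.
void PostProcessOnCPU(int width, int height)
{
    std::vector<unsigned char> frame(width * height * 4);

    // Forces a full CPU/GPU synchronization before it returns.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, frame.data());

    // ... run the anti-aliasing filter on 'frame' here, on the CPU ...

    // Upload the filtered image back (to a bound texture) so the GPU can
    // composite the UI on top and present it.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGBA, GL_UNSIGNED_BYTE, frame.data());
}

// 2) How occlusion queries avoid that stall: issue the query, keep rendering,
//    and only consume the result later, once it is available. The price is
//    that the visibility information is always slightly out of date.
GLuint occlusionQuery = 0;

void IssueOcclusionQuery()
{
    if (occlusionQuery == 0)
        glGenQueries(1, &occlusionQuery);

    glBeginQuery(GL_SAMPLES_PASSED, occlusionQuery);
    // ... draw the object's bounding box here ...
    glEndQuery(GL_SAMPLES_PASSED);
}

bool ObjectWasVisible()
{
    GLuint available = 0;
    glGetQueryObjectuiv(occlusionQuery, GL_QUERY_RESULT_AVAILABLE, &available);
    if (!available)
        return true; // result not ready yet; assume visible rather than stall

    GLuint samples = 0;
    glGetQueryObjectuiv(occlusionQuery, GL_QUERY_RESULT, &samples);
    return samples > 0;
}
```

The query path stays fast precisely because nothing blocks; the readback path has no such escape hatch, which is the whole problem.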

 

What Intel is suggesting would be like going to the post office to mail a letter and not being allowed to leave the building until the recipient had received your letter and written back. They're making these claims with full knowledge of how ridiculous they are, counting on the public's ignorance to let it slide by unchallenged.

 

So no, Sebastian, this is not going to "take a little wind out of AMD’s heterogeneous computing sails". Please check with me first next time before you reprint Intel's claims on anything related to graphics. If any Intel executives would like to discuss this with me over lunch (your treat) so that I can explain to you how to turn the graphics division of your company around, I live near your main headquarters.


5 Comments


Recommended Comments

Whoop! Well said Josh!

I find that most things you write or speak about are quite understandable, which is a big reason I like your products. It is excellent indeed that you give us your viewpoint on the technological corners of the industry, and for this I applaud you!


Intel would be a much better company if they didn't always commit such ridiculously easy-to-detect crimes:

1) Making the Intel C++ compiler deliberately produce slower code for AMD CPUs with some useless NOPs

2) Claiming that a single multicore CPU can ever achieve the speed of 128 or more GPU cores

3) Killing Project Offset

4) Killing the Motorola-based Macintosh, Amiga, and Atari by supporting the PC


Don't blame Intel for the Mac. Apple chose to switch to Intel's new processors. No one held a figurative gun to Apple's head.

 

 

But hey, I've stuck with AMD since 2001. A processor that works and is sold on its own merit rather than through corporate bullying, and now outright lies...


It's still Intel's fault that the Mac chose Intel, because if they had let Motorola finish its 68060 series, Apple wouldn't have needed to switch. At that time Motorola had been quiet on new CPUs for some years and people thought the line was dead, and it became dead because people thought so and acted in panic.

 

The Motorola was so much better than the Intel because it could address the whole 32-bit memory range with a single instruction, while on Intel you needed those bloody segment and offset pointers, which were just ridiculous and completely useless, and it was only 16-bit even then.

 

That's the same panic reaction that happens when stocks go down, or when the media says there is a global depression: everyone acts accordingly and actually causes it.

 

I think my next CPU will be AMD too. The last one was already a tough decision, and I ended up with Intel, but afterwards I thought AMD would actually have been the better choice, because it has more cores, and programs that can utilize them run much faster than on an Intel.


I have been on AMD for at least 10 years as well and have no intention of switching now or in the future. The quad core I have had for over a year has given me plenty of power at a fraction of the cost.

 

IMO, it was Steve Jobs' sole decision to dump Motorola, and it was a marketing ploy, pure & simple. Intel spent millions on its own marketing strategy. It became necessary for Joe Consumer to have a computer with "Intel Inside": the company that brought you the Pentium, the Core 2 Duo, and other fancy marketing gimmicks. Motorola had the latest 68..something, something that went in a PowerMac.

 

For Apple to align itself with that behemoth meant a solid boost to its short-term market share. Their short-sightedness has pretty much cost them the desktop market (Rosetta comes to mind). Not that they care, mind you, because that diversion bought them enough time to turn Apple into THE gadget company. Apple Computer, Inc. becomes Apple, Inc., and their desktop OS will eventually fade into oblivion. It's sad too, because I feel OS X is vastly superior in many ways. It has the stability & security Windows lacks and the cohesiveness & unified branding the *nix community lacks.

 

It seems as though Intel is hell-bent on destroying the PC market as thoroughly as they did the Mac's. Let them just try to pry my AMD PC from my cold, dead fingers :D
