Shawn Hargreaves Blog
Those of you who have tried to render point sprites on Xbox will have noticed they don't work quite the same as on Windows.
We are working on updating the XNA documentation to cover the various Xbox HLSL extensions, but in the meantime, here's what you need to know to make point sprites work.
First a reminder of the things that are common to both platforms. To draw point sprites, you need to set the PointSpriteEnable renderstate to true. In your vertex shader you must write the sprite center position to the POSITION0 output semantic, and the sprite size to the PSIZE0 semantic.
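As a sketch of that vertex shader side, something like the following should work on both platforms (the function, matrix, and size parameter names here are my own, not from any particular sample):

```hlsl
float4x4 WorldViewProjection;
float PointSize = 32;

struct VS_INPUT
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
};

struct VS_OUTPUT
{
    float4 Position : POSITION0;
    float Size : PSIZE0;
    float4 Color : COLOR0;
};

VS_OUTPUT PointSpriteVS(VS_INPUT input)
{
    VS_OUTPUT output;

    // Write the sprite center position to POSITION0.
    output.Position = mul(input.Position, WorldViewProjection);

    // Write the screen-space sprite size, in pixels, to PSIZE0.
    output.Size = PointSize;

    output.Color = input.Color;

    return output;
}
```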
On Windows, your pixel shader will typically take a float2 input parameter using the TEXCOORD0 semantic. But on Xbox, you should use the SPRITETEXCOORD semantic, and the parameter type is float4. The coordinate values you want will be in the Z and W fields, and may be negative. Crazy, huh?
Here is a typical point sprite pixel shader that will work the same on both Windows and Xbox:
sampler Sampler : register(s0);

struct PS_INPUT
{
#ifdef XBOX
    float4 TexCoord : SPRITETEXCOORD;
#else
    float2 TexCoord : TEXCOORD0;
#endif
    float4 Color : COLOR0;
};

float4 PixelShader(PS_INPUT input) : COLOR0
{
    float2 texCoord;

#ifdef XBOX
    texCoord = abs(input.TexCoord.zw);
#else
    texCoord = input.TexCoord.xy;
#endif

    return tex2D(Sampler, texCoord) * input.Color;
}
One final gotcha may occur if you are using dynamic vertex buffers to render point sprite particles. On Windows that is typically done using the Discard vertex buffer locking semantic, which is not supported on Xbox. The best way to render dynamic geometry on Xbox is using the GraphicsDevice.DrawUserPrimitives method (which works on Windows as well).
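For reference, a minimal sketch of that approach looks like this (XNA 1.0-era API; the particles array, effect, and vertexDeclaration variables are hypothetical names assumed to be set up elsewhere):

```csharp
graphicsDevice.RenderState.PointSpriteEnable = true;
graphicsDevice.VertexDeclaration = vertexDeclaration;

effect.Begin();

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Begin();

    // Copies the vertex data straight from the user array each call,
    // so there is no dynamic vertex buffer and no Discard lock needed.
    // For PointList, the primitive count equals the vertex count.
    graphicsDevice.DrawUserPrimitives(PrimitiveType.PointList,
                                      particles, 0, particles.Length);

    pass.End();
}

effect.End();
```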
DrawUserPrimitives is typically considered a performance no-no on Windows. On the Xbox, I'd assume it is still slower than DrawPrimitives, correct?
I'm curious about this... are the User methods OK because of the 360's shared memory architecture?
On Xbox, user primitives are not only the fastest way to do this, they're the only way! Without a discard locking semantic, you just can't get any kind of performance from dynamic vertex buffers.
On Windows, user primitives are only slow because of crappy drivers. They're generally a lot faster now than they used to be a few years ago, anyway: fast enough that I wouldn't have any qualms about using them in a real game.
Does anybody know whether a GeForce 6600 GT card supports pointsprites or not? I cannot run XNA games that use them, and I have found no info confirming one thing or the other.
I can confirm point sprites work on GeForce 6600 GT.
Interesting, thanks for the clarification.
"I can confirm point sprites work on GeForce 6600 GT."
Me too, now!
I've found an old MDX 1 example that uses point sprites, and everything runs just perfectly.
Thanks Catalin ...
I don't suppose anyone has an example implementation of the above shader?
I want to implement point sprites in my game but I'm not sure how to do it.