XNA Game Studio on Windows Phone

I set out to write an article about the unique features of XNA Game Studio on Windows Phone 7 Series, but while trying to populate my outline, I realized, wow, there actually aren’t many things different about Game Studio 4.0 on the phone compared to Windows and Xbox 360! That’s bad news for this article, but I think good news for our platform as a whole. With Game Studio 4.0, we made a concerted effort to increase portability and consistency across our target platforms.

In some areas it was obvious how to achieve consistency. For instance, Windows Phone 7 Series includes a Zune media client, so we could port our existing Zune media APIs. Input is similarly straightforward. If you only want to access a single touch point, you can use our existing Mouse API. If you need multitouch (the phone supports four simultaneous touches), you can use an API similar to what we previously shipped for Zune HD. This touch API works on Windows 7 PCs with multitouch displays, too.
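As a sketch of how this looks in a game's Update method, using the Mouse and TouchPanel types from the shipped XNA 4.0 input namespaces (exact names assumed from that release):

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Input.Touch;

// Single touch point: the familiar Mouse API also works on the phone,
// reporting the primary touch as the cursor position.
MouseState mouse = Mouse.GetState();

if (mouse.LeftButton == ButtonState.Pressed)
{
    Vector2 tapPosition = new Vector2(mouse.X, mouse.Y);
}

// Multitouch: TouchPanel reports every active touch point.
TouchCollection touches = TouchPanel.GetState();

foreach (TouchLocation touch in touches)
{
    if (touch.State == TouchLocationState.Pressed ||
        touch.State == TouchLocationState.Moved)
    {
        Vector2 position = touch.Position;
    }
}
```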

Other areas, especially graphics, were more challenging to design. Had we just ported the Game Studio 3.1 graphics API, we would have been left with a confusing mess of non-overlapping caps that would make it hard to port code between Windows, Xbox, and the phone. But we didn’t want to force a lowest common denominator approach: it would make no sense to limit Xbox 360 games to only those features which are also available on mobile hardware!

Our solution was to take a long, hard look at our graphics API, tweaking, polishing, and refactoring to increase consistency wherever possible, and relying on a coarse bucketing of features into Reach and HiDef profiles for things that just weren’t possible on the phone. I am tremendously proud of the results. This tuning is valuable even for developers who are not targeting the phone, as it fixes common causes of error on all platforms, and helps make your game compatible across different Windows graphics cards.
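Opting into a profile is a one-line decision in the Game constructor; this sketch assumes the GraphicsProfile enum from the final XNA 4.0 API:

```csharp
// Choosing Reach validates your game against the common feature set,
// so code that runs on Windows will also run on the phone.
graphics = new GraphicsDeviceManager(this);
graphics.GraphicsProfile = GraphicsProfile.Reach;
```

A HiDef-only Xbox 360 or Windows game would pick GraphicsProfile.HiDef instead, unlocking the features that mobile hardware cannot support.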

The phone supports full hardware accelerated 3D, but we are not exposing programmable shaders in this release. Charlie Kindel summed up the reason for that in a great article about focus and priorities:

“We will do a few things and do them very, very well; we are better off not having a capability than doing it poorly. There are always future versions.”

Instead of programmable shaders, we augmented the existing BasicEffect with four new configurable effects: SkinnedEffect, EnvironmentMapEffect, DualTextureEffect, and AlphaTestEffect. These are designed to run efficiently on the mobile GPU hardware, and I think do a good job of providing enough flexibility for developers to create awesome looking games, while also meeting our goals of being able to ship a robust and well tested product on schedule.
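To give a flavor of how these configurable effects are used, here is a sketch of DualTextureEffect (names taken from the XNA 4.0 API; the world/view/projection matrices and textures are assumed to be set up elsewhere):

```csharp
// DualTextureEffect blends two textures, typically a base texture
// plus a prebaked lightmap, without needing a custom shader.
DualTextureEffect effect = new DualTextureEffect(GraphicsDevice);

effect.World = world;
effect.View = view;
effect.Projection = projection;

effect.Texture = baseTexture;
effect.Texture2 = lightmapTexture;

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    // Draw geometry that carries two sets of texture coordinates here.
}
```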

The phone features an image scaler which allows games to render to any size backbuffer they like and have it automatically stretched to fill the display, with black bars along the edges if the backbuffer and display have different aspect ratios (an idea that will be familiar to Xbox developers). This scaling is handled by dedicated hardware, so it does not consume any GPU resources, and it uses a high quality image filter that gives much better results than the bilinear filtering you would get if you did this yourself on the GPU. The scaler is important for two reasons:

  • At launch, all phones will have a 480x800 (WVGA) display resolution, but we will add 320x480 (HVGA) in a future update. Of course you can detect the native resolution and program your game to adapt to this if you want, but the scaler allows games to pick just one resolution, always render at that fixed size, and still run correctly on phones with different native screen sizes. For bonus points, we automatically scale touch input to match your chosen resolution. 

  • 480x800 is a lot of pixels! This is a great resolution for displaying text, browsing the web, etc, but it can be a challenge for intensive 3D games to render so much data at a good framerate. To boost performance, some games may prefer to render at a lower resolution, then scale up to fill the display.
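Picking a fixed resolution is just a matter of requesting a backbuffer size in the Game constructor; this sketch assumes the standard GraphicsDeviceManager properties:

```csharp
// Ask for a fixed 480x800 portrait backbuffer. The hardware scaler
// stretches (and letterboxes if necessary) this to the physical
// display, and touch input is scaled to match the chosen resolution.
graphics = new GraphicsDeviceManager(this);
graphics.PreferredBackBufferWidth = 480;
graphics.PreferredBackBufferHeight = 800;
```

A performance-hungry 3D game could instead request something smaller, say 320x533, and let the scaler fill the screen.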

We also implemented an automatic rotation feature, so (unlike Zune) you don’t have to write special code to handle portrait, landscape left, and landscape right modes. Just tell us which way up you want to be, and we’ll adjust your graphics rendering and touch input accordingly. This is implemented via special magic in the graphics driver, so there is no performance penalty from choosing a rotated orientation.
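In code, declaring your preferred orientation is a single property assignment. This sketch uses the SupportedOrientations property as it appears in the final XNA 4.0 API (note a comment below mentions this feature is not yet in the MIX CTP):

```csharp
// A landscape game: the framework rotates rendering and touch input
// for whichever landscape orientation the user is holding the phone in.
graphics.SupportedOrientations = DisplayOrientation.LandscapeLeft |
                                 DisplayOrientation.LandscapeRight;
```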

When I think about Game Studio on the phone, a recurring theme is that we did a lot of furious paddling beneath the water in order to ship an API that glides smoothly over the surface of the lake. Quoting Charlie Kindel again:

“We will build on the shoulders of giants; where possible integrate instead of create.”

(that’s XNA he is talking about there! He called XNA a giant!!! tee hee…  :-)

In addition to the XNA Framework itself, we integrated existing pieces of awesomeness such as the Direct3D runtime from Windows. But there were also many places where we had to build large and complex things from scratch. For instance, in cooperation with our hardware partners we created an entirely new graphics driver stack, optimized from the ground up for mobile GPU hardware. The strange thing for me is that, while I plan on writing many articles about the API improvements and new features, I probably won’t be talking much about the hard implementation problems I have been working on this last year. Our goal was not only to solve these problems, but to make them go away so thoroughly our customers need never know they existed in the first place.

Did we succeed? Ultimately, you will be the judge of that.

I had a great experience yesterday, working on the demo for my GDC talk. I coded most of it last Thursday, using our Windows framework, with mouse input to switch between the five built-in effects. When I moved this code over to the phone, it “just worked” the first time I tried it on an actual device. Yesterday I found myself with a couple more spare hours, but I had the wrong build on my phone at the time, so I went back to the Windows version of the demo, blinging it up with rendertarget transition effects. Once again, this updated version ran exactly the same on the phone as it did on Windows, which made me feel pretty good!

I can’t wait to show my demo at GDC this week, and later on to show you all the things we have been building so you can try them out yourself.

  • Are the new shaders (SkinnedEffect, EnvironmentMapEffect, DualTextureEffect, and AlphaTestEffect) designed in such a way that they can be easily combined with the BasicEffect and/or each other?

    ps. Is such an idea even possible with the current framework?

  • > Are the new shaders (SkinnedEffect, EnvironmentMapEffect, DualTextureEffect, and AlphaTestEffect) designed in such a way that they can be easily combined with the BasicEffect and/or each other?

    It depends what you mean by "combined". You can only use one effect at a time when rendering graphics, but these effects are designed to combine well when used together in the same scene, both from a coding perspective (we provide things like a standard interface for setting the projection matrices regardless of the effect type), and also visually (for instance all 5 use the same fog computations, and the ones that support lighting all use the same lighting model).
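    As a sketch of what that standard interface buys you, using the IEffectMatrices interface from XNA 4.0 (the helper method name is made up for illustration):

```csharp
// All five built-in effects implement a common matrices interface,
// so camera setup can be written once and reused for any of them.
void SetCamera(Effect effect, Matrix world, Matrix view, Matrix projection)
{
    IEffectMatrices matrices = (IEffectMatrices)effect;

    matrices.World = world;
    matrices.View = view;
    matrices.Projection = projection;
}
```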

  • WVGA 800x480 has a 5:3 ratio whereas HVGA 480x320 has a 3:2 ratio. How could automatic scaling be achieved? Games and applications will be stretched when downscaled...

  • > WVGA 800x480 has a 5:3 ratio whereas HVGA 480x320 has a 3:2 ratio. How could automatic scaling be achieved? Games and applications will be stretched when downscaled...

    Games will be letterboxed (with black bars along the edge) if their backbuffer and display have different aspect ratios.

  • All you have to do to get to market and be PAR with the NATIVE development on Android, mobile OS X, and webOS, is to support an industry standards API, OpenGL ES 2.0 first, then XNA.

    Not only is it already implemented by ATI/Qualcomm, but you would be left with only the work to integrate it into the new OS's display stack while giving everybody custom shaders from day Negative One (because we can write/port the graphics engine now, and OS bindings later).

    Instead, you guys picked the mobile DirectX route which leaves you behind on schedule (giving us another shoddy product like mobile DirectX), immature and unproven feature set (because you had to upgrade your non-forward looking API), and running on performance sapping .NET.

  • "Just tell us which way up you want to be, and we’ll adjust your graphics rendering and touch input accordingly."

    How do you go about setting the orientation? I have been digging around and must have skipped over it.

  • > How do you go about setting the orientation? I have been digging around and must have skipped over it.

    The auto-orientation functionality is not supported in the MIX CTP release.

  • Where are the SkinnedEffect, EnvironmentMapEffect, DualTextureEffect, and AlphaTestEffect documented?  I'm having a hard time finding out information about these.

    Another approach, if a custom shader is the main sticking point, would be to use Silverlight, since it appears that custom shaders are supported for Windows Phone 7 in WPF.

  • I'm finding the documentation in the MSDN now for the new effects.  Maybe I was doing something wrong before.  

    While I would love to be able to use my own custom shaders, I don't think this is day one killer.  Probably very few phone apps actually use custom shaders.  

    However, that being said, they need to be in there someday soon, I hope. I would also like to see Silverlight on the Xbox!

  • The MSDN XNA documentation really needs some work!
