Spectre Gruedorf Challenge Blog

[Feb. 23rd, 2008|01:35 am]
[mad_and_crazy]
To my joy and delight, Spectre runs on my girlfriend's new laptop. Accordingly, I've been working on Spectre this weekend instead of the stuff I should have been working on (math; Dredmor). The good news: Spectre runs under unextended OpenGL - including my mad, half-baked 3D terrain engine. You just get no special effects, which is fine for the casual gaming sector anyway.
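(For anyone wondering how you detect "unextended": the usual approach is to sniff the extension string at startup and flip the effects off when the fancy bits aren't advertised. A rough sketch of that shape - the names and the particular extensions below are illustrative, not the actual Spectre code:)

```
// Sketch: decide at startup whether the fancy rendering path is available.
// Names and extensions are illustrative, not Spectre's actual code.
#include <GL/gl.h>
#include <cstring>

static bool HasExtension(const char* name)
{
    // GL 1.x/2.x style: extensions are reported as one space-separated string.
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all && std::strstr(all, name) != nullptr;
}

struct RendererCaps
{
    bool fragmentPrograms;
    bool framebufferObjects;
};

RendererCaps DetectCaps()
{
    RendererCaps caps;
    caps.fragmentPrograms   = HasExtension("GL_ARB_fragment_program");
    caps.framebufferObjects = HasExtension("GL_EXT_framebuffer_object");
    return caps;
}

// Later, the renderer just skips the eye candy:
//   if (!caps.fragmentPrograms) { /* draw terrain with the fixed-function path */ }
```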

Question: how much eye candy is my 'minimum spec' machine entitled to? Discuss.

Anyhow, I've spent the weekend hacking on unexciting, graphically unintensive things for Spectre. In particular, I've been playing with the hot-editing code I mentioned to a few of you. The idea is like CryEngine's Sandbox: at any time you can switch between SLED and the game to inspect gameplay data and/or edit stuff. Essentially this means the level loader is pointed at the internal representations of data seen in SLED, as opposed to the compiled binary versions you get from files. This sucks to a certain degree, because:

a) the compiled binary version has 'additional stuff' done to it (tri-stripping, batching, texture atlasing, assorted other crap) that the XML version doesn't have (because it needs to be editable), and
b) it's not 100% WYSIWYG.

However, expecting to hot-edit compiled data is unrealistic at best. In any case, you can now create a level with bricks and then edit their locations in SLED: whee!
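The loader split itself boils down to one interface with two backends - something like the sketch below (hypothetical names, not the actual SLED/Spectre classes):

```
// Sketch of the idea: the engine asks a LevelSource for its geometry and doesn't
// care whether it came from a compiled file or from SLED's live data.
// All names here are hypothetical.
#include <string>
#include <vector>

struct Brick { float x, y, z; std::string mesh; };

class LevelSource
{
public:
    virtual ~LevelSource() {}
    virtual std::vector<Brick> GetBricks() const = 0;
};

// Normal game path: reads the baked, tri-stripped/batched/atlased binary.
class CompiledLevelSource : public LevelSource
{
public:
    explicit CompiledLevelSource(const std::string& path) : m_path(path) {}
    std::vector<Brick> GetBricks() const override
    {
        std::vector<Brick> bricks;
        // ... deserialize from the compiled file ...
        return bricks;
    }
private:
    std::string m_path;
};

// Hot-editing path: points straight at the editor's in-memory data,
// so edits made in SLED show up on the next reload without a recompile.
class EditorLevelSource : public LevelSource
{
public:
    explicit EditorLevelSource(const std::vector<Brick>* liveData) : m_live(liveData) {}
    std::vector<Brick> GetBricks() const override { return *m_live; }
private:
    const std::vector<Brick>* m_live;
};
```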

For those who still want a preview build, the plan is as follows: complete the support for sectors/portals, get the hot-editing far enough along that it's worth showing, and throw together a little interior level with a camera path of some sort. That seems like it would be cool enough.

Other stuff done recently: Ruby scripting is now callback driven. That was a mountain of suckage that I'm saving to chronicle at a future date.
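("Callback driven" here just means the engine invokes Ruby handlers when events happen, rather than the script driving the engine. A very rough sketch of what that generally looks like with embedded Ruby - illustrative names, not Spectre's actual binding code:)

```
// Crude sketch of callback-driven embedded Ruby; names are illustrative.
#include <ruby.h>

// The engine holds a Ruby object (a script-side handler instance)
// and pokes it when engine events happen.
static VALUE g_handlers = Qnil;

// Ruby-side code calls Spectre.register_handlers(obj) once at load time.
static VALUE register_handlers(VALUE self, VALUE obj)
{
    g_handlers = obj;
    rb_gc_register_address(&g_handlers);  // keep the handler object alive across GC
    return Qnil;
}

void InitScripting()
{
    ruby_init();
    VALUE mod = rb_define_module("Spectre");
    rb_define_module_function(mod, "register_handlers",
                              RUBY_METHOD_FUNC(register_handlers), 1);
    rb_require("scripts/level.rb");
}

// Engine-side event: call back into Ruby instead of having the script poll us.
void OnEntityTouched(int entityId)
{
    if (!NIL_P(g_handlers))
        rb_funcall(g_handlers, rb_intern("on_touch"), 1, INT2NUM(entityId));
}
```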

Incidentally: nothings, I second Casey's request for the demo and slides from your MT talk. I'm interested to see how it compares to my work. Something I've been playing with lately is the tools side of things: how do you produce unique texturing data for an entire indoor level? Discuss.

Comments:
From: nothings
2008-02-23 09:34 pm (UTC)
I'm in the middle of uploading a video of another take of the talk (which will take a few hours to finish). The demo and source will follow.
From: nothings
2008-02-24 12:03 am (UTC)
From: mad_and_crazy
2008-02-24 01:47 am (UTC)
Interesting - actually, your implementation is very different from mine (although possibly that's just coding style). One thing I did notice: you're using ddx/ddy, and these apparently just aren't supported on some ATI cards - R300 through R420 - and I'm not even sure they're supported on my current video card. Maybe I'm missing some sort of Exciting GLSL flag. So on those cards you do need to fake it with a miplevel-packed texture, which works fine.
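(For anyone reading along: the miplevel-packed texture is, roughly, a texture where each mip level stores its own index, so an ordinary filtered fetch tells you which LOD the hardware picked without needing ddx/ddy. A rough sketch of setting one up - the format and scaling are illustrative:)

```
// Sketch: build a texture where mip level N stores the value N, so a normal
// trilinear fetch in the shader returns (approximately) the selected mip level.
// Format, size and scaling here are illustrative.
#include <GL/gl.h>
#include <vector>

GLuint CreateMipLevelTexture(int size /* power of two, e.g. 256 */)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    for (int level = 0; (size >> level) >= 1; ++level)
    {
        int dim = size >> level;
        // Every texel in this level holds the level index, scaled so the shader
        // can read it back as a 0..1 value and recover the LOD.
        std::vector<unsigned char> texels(dim * dim, static_cast<unsigned char>(level * 16));
        glTexImage2D(GL_TEXTURE_2D, level, GL_LUMINANCE, dim, dim, 0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, texels.data());
    }
    return tex;
}

// In the fragment shader, sampling this texture with the same UVs as the real
// texture yields the mip level the hardware chose, standing in for a
// ddx/ddy-derived LOD on hardware that lacks gradients.
```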
From: nothings
2008-02-24 01:59 am (UTC)
It only worked on one of the three pieces of hardware I have, and specifically not on the one I developed it on. I only added the gradient stuff for the talk, so people could see more explicitly what to look forward to on future hardware. (In fact I don't know what the 360 and PS3 support, and their developers were presumably a large percentage of my audience.)

The GF 7300 supports GLSL, but not ddx()/ddy() or gradients. Those are all part of an extension. I do an appropriate #ifdef in my GLSL code and use the gradients only if they're available.
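(Roughly the shape of it - the define name and the plumbing below are illustrative, not the literal code:)

```
// Prepend a define when the driver advertises the extension, and let the
// GLSL preprocessor pick the code path. Names here are illustrative.
#include <string>

std::string BuildFragmentSource(const std::string& body, bool hasGradients)
{
    std::string src;
    if (hasGradients)
        src += "#define HAS_GRADIENTS 1\n";
    src += body;
    return src;
}

// The shader body then branches on the define, something like:
//
//   #ifdef HAS_GRADIENTS
//       vec2 dx = dFdx(uv), dy = dFdy(uv);   // explicit gradients
//       // ... use them for the dependent texture fetch ...
//   #else
//       // ... fall back to a miplevel-packed-texture style lookup ...
//   #endif
```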

Unfortunately, that extension adds a HUGE set of features to GLSL, so there's probably hardware that supports partials but not the rest - and on those cards there's NO WAY to access those functions in GLSL!

For writing ARB_fragment_program, there's an NVIDIA extension that provides derivatives and a lot less other functionality. It might even be available on the 7300; I'm not sure, I didn't bother trying to use it. (It's basically shameful that there is hardware out there using finite differencing for dependent texture partials, and they don't let you hand them explicit derivatives instead - or even an exact mipmap level, only a bias. That lack of orthogonality is just insane. If you could at least specify the mipmap level explicitly, there are other things you could do.)

GLSL extension
Nvidia fragment shader extension

ATI doesn't seem to have any corresponding extension.



From: mad_and_crazy
2008-02-24 02:03 am (UTC)
Yeah - I guess better handling of the mipmapping would really be the nice IHV feature to have (isn't that what Ignacio asked for?). The implementation I have without mipmapping is elegant, but the mipmapping turns a lot of things into a trainwreck...
From: nothings
2008-02-24 02:11 am (UTC)