16 Apr, 2008, David Haley wrote in the 41st comment:
Votes: 0
Also, pushing games so close to the hardware defeats the purpose of having the OS abstraction layer in the first place. Part of the reason why things evolved from the DOS way of doing things was that everybody had to write their own way of talking to the hardware, which is a huge waste of time. Sure, using something like DirectX means that you might incur a slight overhead versus talking directly to the hardware, but you're saving immeasurable amounts of developer time. Developer time is arguably worth quite a bit more than the relatively small overhead you're incurring.
16 Apr, 2008, Guest wrote in the 42nd comment:
Votes: 0
Sadly, in the DOS days, I recall games being far more stable: they required only minor tweaks from time to time with boot disks, and generally didn't require 50 patches before they worked as advertised. This whole abstraction layer thing has made people lazy and careless with their code. A lot like me with mine :P
16 Apr, 2008, quixadhal wrote in the 43rd comment:
Votes: 0
How much developer time is spent looking up the 5 billion API calls you have to wade through to get anything done under Windows and DirectX? How many different sets of code do you STILL have to write to ensure that your game runs under everything from Windows98 and DX7 to Vista64 and DX10?

The vast majority of development time in most games today is spent in custom engines and toolkits anyway. Most studios don't write their own engines, and most have to learn a new engine from one product to the next as well.

In any case, I'm not advocating running something as simple as DOS. I would instead suggest an embedded version of Windows (or OSX, or Linux, or whatever the developer is happy with) which runs in single-user mode with cooperative multitasking. That is, a program has to give up CPU cycles when it knows it can afford to do so, rather than having them wrenched away by the clock.

Pre-emptive multi-tasking was designed for mainframes where lots of users complained because Joe got his 6 hour batch job in ahead of everyone else's 5 minute jobs. It was written for a hostile environment which should NOT exist for a single user desktop machine.

IM and voice chat would be integrated into the game… in fact, it is headed that way now. As for trainers and cheats, I'd be happy if they went down in flames anyways.
16 Apr, 2008, Kjwah wrote in the 44th comment:
Votes: 0
Personally, if I had to start booting into my own game instance, I'd stick to games that used the "old" way of doing things. I do too many things to have to deal with that just for a game. Now if we're talking plug in the hard drive and play while still running linux/windows/whatever, fine, but then what kind of a performance hit do you take for that? It may work for some games but not all.
16 Apr, 2008, David Haley wrote in the 45th comment:
Votes: 0
If you think looking up one set of API calls is bad, you should try looking up API calls for all the hardware you want to target. The problem was a lot worse than it is today. One of the things that allowed so many people to produce hardware, and for it to "just work" for the most part, was the abstraction layer on top of the hardware. The fact that people spend their time on the engines, and not talking to the hardware, is proof enough IMO.

As for DOS being more stable, I have memories both ways. Some crashes would cause the entire system to die so miserably that the power needed to be pulled. Games were simpler back then too in terms of the number of resources used, so that probably helped. But on the flipside, the constrained environment caused people to be more careful, I think. Well, anyhow…

Quixadhal said:
It was written for a hostile environment which should NOT exist for a single user desktop machine.

While I agree that a game should be cooperatively multi-threaded, I disagree that a single-user machine should be. I run all kinds of processes concurrently that I want to run concurrently, not one at a time. I don't want my browser to have to be well-behaved enough to let my email client run from time to time while my IM program does its thing in the background and my compiler is happily compiling away. While the environment is not "hostile" like a multi-user environment, it is hostile in the sense that not all programs are necessarily completely well-behaved.

Preemptive multi-threading makes sense not really for multiple people, but for multiple processes – the former is a subset of the latter. A game is just one process and indeed it would (usually) make sense for it to have complete control over what runs when.
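That last distinction can be sketched in a few lines: under a preemptive scheduler, processes (or threads) that never voluntarily yield anywhere still both make progress. This is an illustrative Python toy, relying on CPython's preemptively switched threads, not code from any game:

```python
# Under preemptive scheduling, tasks need not cooperate: the interpreter
# and the OS switch between these threads on a timer, even though neither
# loop contains any explicit yield or sleep.
import threading

counts = {"a": 0, "b": 0}

def spin(key, n):
    for _ in range(n):
        counts[key] += 1  # busy work with no voluntary yield point

t1 = threading.Thread(target=spin, args=("a", 500_000))
t2 = threading.Thread(target=spin, args=("b", 500_000))
t1.start(); t2.start()
t1.join(); t2.join()

# Both threads completed without either one being written to "play nice".
print(counts)
```

Neither loop was written to be well-behaved, yet neither can starve the other; that is the property a purely cooperative scheduler gives up.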
16 Apr, 2008, Guest wrote in the 46th comment:
Votes: 0
I actually kind of like quixadhal's idea of dedicated plugin drives. When I'm gaming I don't want the rest of the system spending time trying to interrupt that. I want to play, and I want the system to devote every last shred of power it has to that task. Windows currently does not allow for this. It instead insists on giving away my precious clock cycles to processes I often didn't want running anyway, along with giving them away to my AV software and other things that have to run to protect the PC.

Multiplayer games would obviously complicate this ideal of mine since that would require the internet link to be active and all the protection that goes with it. Which means the game would have to sacrifice precious clock cycles to other things to keep the whole mess afloat. Of course, since nearly everyone who plays MMORPGs makes MUDders look like angels from heaven, I don't play multiplayer games :)

And despite the demonization of DOS and talking directly to the hardware, games were more stable back then. Hardware drivers were less prone to crashing. You didn't have to worry about whether or not the video card supported DX10 on the OS you were running. If it did, the game could talk to it ( or whatever, you get the idea ). Instead of dealing with direct hardware calls most companies had driver APIs that did just as well, if not better, than the mess we have now with games needing dozens of patches before they work. QA used to mean something back then. Now it just means Alpha is over and the public beta… er… release is ready.
16 Apr, 2008, David Haley wrote in the 47th comment:
Votes: 0
Samson said:
You didn't have to worry about whether or not the video card supported DX10 on the OS you were running. If it did, the game could talk to it ( or whatever, you get the idea )

Sure, you didn't have to worry about the card supporting DX10. You just had to worry about the game supporting your card. :wink:

Samson said:
Instead of dealing with direct hardware calls most companies had driver APIs that did just as well, if not better,

Perhaps the abstraction layer and the proliferation of card development are linked, no?

Samson said:
than the mess we have now with games needing dozens of patches before they work.

Game logic patches and issues unrelated to the hardware are not a factor of DOS vs. XP, or DirectX, etc. For whatever reason, people patch games more today than they used to, perhaps because they can, whereas before it was not practical in the slightest to distribute patches.

But the point is that these QA issues are not always related to the hardware side. So it's not fair to say that the DOS environment is somehow responsible for making the games more stable.

Talking directly to the hardware is (usually) a Bad Idea ™. That is why OSes exist in the first place. We've known for a long time that you don't want to talk to the hardware unless you really, really, really have to. That is the point of higher-level environments, higher-level languages (even C vs. assembly…), operating systems, libraries, etc. This fact really isn't being disputed by engineers these days.
16 Apr, 2008, quixadhal wrote in the 48th comment:
Votes: 0
DavidHaley said:
Quixadhal said:
It was written for a hostile environment which should NOT exist for a single user desktop machine.

While I agree that a game should be cooperatively multi-threaded, I disagree that a single-user machine should be. I run all kinds of processes concurrently that I want to run concurrently, not one at a time. I don't want my browser to have to be well-behaved enough to let my email client run from time to time while my IM program does its thing in the background and my compiler is happily compiling away. While the environment is not "hostile" like a multi-user environment, it is hostile in the sense that not all programs are necessarily completely well-behaved.

Preemptive multi-threading makes sense not really for multiple people, but for multiple processes – the former is a subset of the latter. A game is just one process and indeed it would (usually) make sense for it to have complete control over what runs when.


That's a common misconception. Co-operative multi-tasking does NOT mean only one application runs at a time (well, technically that happens for ALL multi-tasking unless you have multiple CPUs). It means the task scheduler does not yank control away from a process just because the wall clock ticked.

If you write your code properly, YOU know when the best time to give up control is. Assuming you actually DO that, any number of processes can run just as efficiently in co-operative multi-tasking as they can in pre-emptive multi-tasking. The problems come when lazy programmers don't yield control when they can, or (in the case of multi-user mainframes), they intentionally try to keep resources locked so their work finishes faster.
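The yield-when-you-can scheme described above can be sketched with Python generators. This is a toy illustration under stated assumptions, not any real kernel's scheduler: each task chooses its own yield points, and the scheduler never interrupts anyone.

```python
# Toy cooperative scheduler: tasks are generators that run until they
# decide to give up control by yielding. Nothing ever preempts a task,
# so a task that never yields would starve all the others.
from collections import deque

def task(name, steps):
    for i in range(steps):
        # ... a slice of work the task knows it can afford to pause after ...
        yield f"{name}:{i}"  # voluntary yield point

def run(tasks):
    """Round-robin: resume each task until it yields or finishes."""
    queue, trace = deque(tasks), []
    while queue:
        t = queue.popleft()
        try:
            trace.append(next(t))  # resume until the task's next yield
            queue.append(t)        # still alive: back of the line
        except StopIteration:
            pass                   # finished; drop it
    return trace

print(run([task("A", 2), task("B", 2)]))  # → ['A:0', 'B:0', 'A:1', 'B:1']
```

The two tasks interleave perfectly here, but only because each one yields after every slice of work; delete the `yield` and task A runs to completion before B ever starts.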

As for not wanting to reboot… well… if a game is properly immersive, you won't really want to be doing anything else while playing it. If you *have* to be doing something else, perhaps you should be doing that instead of playing the game, eh? :wink:
16 Apr, 2008, David Haley wrote in the 49th comment:
Votes: 0
Cooperative multi-tasking means that applications control when they release control. That means that if the application was not explicitly written to do that, only one thing will run at a time. Given that the vast majority of software does not do that – indeed, most threading libraries do not even give you a means of doing it! – it does not make sense to assume that such a scheme would work now.

Just because you know when the best time for you to give up control is doesn't mean that is good for other people. I don't want my compiler to decide that the best time to give up control is in five minutes when I want to be checking my email and have IMs come in at the same time.

Sure, if everybody did it right, it would be fine. But that's a big if. If you're going to argue the point that multiple people requires preemptive multithreading for things to run correctly, then the same argument applies almost exactly to one person running multiple processes that need to happen more or less simultaneously. As I said, multiple people each with their process is really a subset of multiple processes.
17 Apr, 2008, quixadhal wrote in the 50th comment:
Votes: 0
Current threading libraries don't give you the option because they're written with the assumption that the kernel they run under will handle it. You can use nanosleep(), which will yield the CPU for at least the given number of nanoseconds, and achieve almost the same effect.
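That yield-by-sleeping trick looks roughly like this, sketched in Python with `time.sleep` standing in for `nanosleep()`; the chunk sizes are made up for illustration:

```python
# A compute loop that voluntarily re-enters the OS scheduler at points it
# chooses, by sleeping for (near) zero time. time.sleep() in CPython also
# releases the GIL, so other threads and processes get a chance to run.
import time

def cooperative_work(chunks, chunk_size=10_000):
    total = 0
    for _ in range(chunks):
        for _ in range(chunk_size):  # a slice of real work
            total += 1
        time.sleep(0)                # chosen yield point: let others run
    return total

print(cooperative_work(3))  # → 30000
```

The placement of the sleep is the whole game: put it inside the inner loop and you yield too often; leave it out and you never yield at all.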

Trying to argue that such an OS can't work because current software, written for a pre-emptive multi-tasking OS, won't play nice under such a scheme is just silly.

In any case, a co-operative multi-tasking kernel would work just fine for my original suggested purpose. You wouldn't be running "rogue" programs on your machine, because you'd boot the game drive and get the GameOS (whatever that might be). It might be a full WindowsXP installation with the kernel running in single-user mode so you still get your network stack, your video drivers, etc… just that the game executable would be your GUI and wouldn't let you launch anything outside its framework.
17 Apr, 2008, Justice wrote in the 51st comment:
Votes: 0
In most languages I've worked with, you simply call yield to signal that this is a good time for other processes to handle their work, as opposed to sleep, which requires an argument. Most of these languages don't let you use nanosecond precision though.
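In Python terms the distinction looks like this (`os.sched_yield` is the wrapper for POSIX sched_yield(2) and is Unix-only, hence the guard; the timings are illustrative):

```python
# yield: give up the CPU but stay runnable, no argument.
# sleep: block for at least the given duration, argument required.
import os
import time

start = time.monotonic()
if hasattr(os, "sched_yield"):  # Unix-only POSIX wrapper
    os.sched_yield()            # "someone else can go now"
yield_cost = time.monotonic() - start

start = time.monotonic()
time.sleep(0.01)                # blocked for at least 10 ms
sleep_cost = time.monotonic() - start

print(f"yield ~{yield_cost:.5f}s, sleep ~{sleep_cost:.5f}s")
```

A yield returns as soon as the scheduler decides you can run again, possibly immediately; a sleep keeps you off the run queue for the whole requested duration.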
17 Apr, 2008, David Haley wrote in the 52nd comment:
Votes: 0
There is still the problem of trusting the programs that you run to actually use the capabilities that exist. That is a large assumption. If you're assuming that for a multi-process desktop environment, I maintain that you have also solved the multi-user problem, so it's not really correct to be opposing them as such different situations.
17 Apr, 2008, Justice wrote in the 53rd comment:
Votes: 0
Taking into account that Quixadhal's entire idea is essentially a bootstrap for a game… I think by now we've established that there are two process models being discussed. And I don't see how you've added anything beyond "pre-emptive is automatic and has additional overhead". Okay, obviously when you're using a manual system, you have to trust that the operator understands when to push in the clutch to shift gears before blowing out their transmission.

Let's consider for example that we have a computer that runs a very small specialized OS. We'll say it runs an MS-based OS that supports DirectX; we'll call this "imaginary" device… an XBOX. Now, let's assume that this box "boots" itself and then loads something from a disc…

Let's consider for example that you put a bootable CD into your computer. Guess what? You can supply your own OS. It'd be a simple matter to reserve a section of hard drive space for configuration of this OS. I've seen versions of linux that will run straight off the install CD, so I don't think it'd be terribly difficult to bootstrap a game in this way. Of course, this still doesn't avoid the issues of hardware drivers, but if you're willing to not support the bleeding edge… many OSes come with effective drivers.

I think I've summarized everything said up to this point… anyone got something fresh, or will we be chasing our tails a little longer?
17 Apr, 2008, David Haley wrote in the 54th comment:
Votes: 0
I'm still not convinced that the overhead from preemptive scheduling of a game's threads is worth all of the trouble of bootstrapping the OS etc. Not having extraneous processes (anti-virus, etc.) might make it worth it, but really, I'm not sure that removing preemptive scheduling gains enough to make it worth all the time it would take to set this up for PCs. It's what consoles are for, as you pointed out.
17 Apr, 2008, quixadhal wrote in the 55th comment:
Votes: 0
DavidHaley said:
I'm still not convinced that the overhead from preemptive scheduling of a game's threads is worth all of the trouble of bootstrapping the OS etc. Not having extraneous processes (anti-virus, etc.) might make it worth it, but really, I'm not sure that removing preemptive scheduling gains enough to make it worth all the time it would take to set this up for PCs. It's what consoles are for, as you pointed out.


It doesn't take much time at all. In fact, I'd be willing to bet it takes MORE time to properly build and maintain an InstallShield setup that roots around your system looking for prerequisites and figures out what to download and install, and where to put it.

I don't know what kind of console you have, but mine doesn't have a flat panel monitor with a decent resolution, nor does it have a keyboard and mouse set comfortably on a desk. It also doesn't have a 5.1 surround system attached to it. It DOES have a network adapter and a hard drive. As such, it's very nice for playing Gauntlet Legends, and Destroy All Humans was pretty neat too. I'd be pretty unhappy trying to play EVE-Online, Civilization IV, or even Warcraft on it though.

The only problem with consoles is that if I add the monitor/keyboard/mouse and stick it on a desk… it's an awful lot like my computer except I can't boot to "desktop" mode to get work done. My computer is also an awful lot like a console except I can't run the games in single-user mode and can't disable all the junk that doesn't really need to be running while I'm playing the game.

The other part of my point was that if you can get the public to accept the idea of plugging in a bootable drive and booting from it to play a game, the developers are then free to use whatever they WANT to use. Some will use Windows and just embed it so they have all the same tools they have now. Others might use Linux. A few might even write their own micro-OS if they really wanted to do so. The end-user wouldn't have to know or care, so long as it ran and updated itself.
17 Apr, 2008, David Haley wrote in the 56th comment:
Votes: 0
I don't have a console at all. I don't really want one, actually; I'm somewhat opposed out of principle to them, I believe for similar reasons as you listed. The only thing that would push me to buy one would be a game not available on the PC.

As for time it would take to develop these things, I'm not sure what exactly you were referring to, but the idea of constructing an entire operating system to be based around cooperative multi-threading, with all applications needing to yield instead of be interrupted, sure seems like a lot of work to me…

Another issue with this bootable disk setup is that you would get slower access speeds to the data than you would for a local drive. You would need to copy things to the internal hard drive, but now you need to be able to talk to it in some common way.

Frankly, all of this is sounding like a lot of trouble for rather little gain in terms of speed. But I guess I've said that before and am starting to be repetitive so I will probably step out of the conversation at this point. :smile: