09 Jun, 2007, Conner wrote in the 41st comment:
Votes: 0
So, without resorting to switching to Mac, who do you feel is worth switching to?

I know I've certainly had pretty bad experiences with HP/Compaq and eMachines lately. What do you think of Toshiba?
09 Jun, 2007, Dorian wrote in the 42nd comment:
Votes: 0
bunabiros said:
If the gaming market ever moves away from DirectX we'll see a huge abandonment of Windows OS, but until then it's doubtful we'll see anything of the sort.


Gaming? I would think it would be the business world, by far, that would cause a major shift in the most popular OS in the world. Businesses hinge on the necessity for backwards compatibility and easy revision of old software. I distinctly remember a recent talk I attended with Federated Insurance, Inc. They recently did a six-year redesign of their banking software. Their old software was in production use for 30 years! And they didn't change it because it was 30 years old, either. They did it because the language they used (the name escapes me now, not a popular one) had no one updating the language and compiler anymore.

No, it is likely the business world that is the driving force behind Microsoft. For all the crap that we give it, Windows does a pretty damn good job of letting businesses run their legacy apps way beyond their intended lifespan (whether that is good or bad, I'll let you be the judge). As long as Microsoft can do that, businesses will be locked into Windows.

Gaming is a large industry, but hardly a driving force. I would think most people would sooner turn to consoles than to another operating system if Microsoft were to drop DirectX.
09 Jun, 2007, Brinson wrote in the 43rd comment:
Votes: 0
Comparing IBMs to Dells is like comparing fine French wine to beer. ;)

IBMs are, of course, better than pretty much every other company's machines, in my (limited) experience.

I got my Dell E1405 for half the price of an IBM at the time and am pretty happy. Never had a problem with it. I don't game, but I do some processor-heavy things like video editing and the occasional compile.

I wanted to buy a ThinkPad… but wasn't willing to spend that much more.

I've always thought that the way a computer is treated makes a bigger difference in its usability than anything else. For years I ran an eMachines box (a piece of shit, actually; the one with the faulty power supply, which I had to replace) with a 400MHz Celeron and like 256 megs of RAM. I would go over to a friend's house… and be annoyed at the slowness of their PC… then I'd check their specs and it'd be like a Pentium 4… then I'd open their Start menu and get spammed by pages of useless apps…
09 Jun, 2007, Guest wrote in the 44th comment:
Votes: 0
Dorian,

Gaming has more of an influence on the evolution of hardware and processing power than anything else. For the same reasons you cite for software, business is not in the habit of picking up new PCs every 18 months the way avid gamers are. If it weren't for Microsoft constantly releasing operating systems that cater to gamers and therefore require more juice to run, we'd probably all still be using underpowered CPUs because business would prefer the platform remain stable.

However, the reason business AND gaming both stick to Windows is because that's what was literally forced down everyone's throats in the 1990s. Microsoft managed to attain monopoly status and wielded that power to craft some very unfair OEM contracts, forcing OEMs to pay for Windows licenses even if they didn't install it on the PCs they sold. So rather than waste money, they all pre-installed Windows and would only change that if the user put up enough of a fuss ahead of time. Windows dependence was not because it was what users wanted, or what the industry wanted, or what business wanted. It's what Bill Gates wanted, and he abused his monopoly status to make it happen. By the time the DoJ took action to remedy this, it was far too late to undo the damage. So we're stuck with it now. The only way it's going to change is if there's a mass exodus to some platform other than the PC, and right now that's the Mac. Vista just might be the trigger for this, since Microsoft has become too big and bulky to change, and their forced dominance has actually left them without the ability to do so anyway. Strangled by their own greed.

Brinson,

You should have gone with the ThinkPad. It's a much more reliable and stable platform. The only problems we've had with those in the field are when the users abuse them. The Dells would fail at a far greater rate under the same abuse, but just as often would fail for no real reason other than substandard parts. Our corporate machines aren't doing anything especially intense, other than perhaps running SAP. But we had plenty of problems when it was just the AS/400 clients and the standard Office apps, and the machines were not being physically or environmentally abused.
09 Jun, 2007, bunabiros wrote in the 45th comment:
Votes: 0
Samson is right, and none of Dorian's arguments mention anything about how much Microsoft would in fact suffer if they lost the game development industry. Sure, they're still the driving force in the business world, but all of that is changing, while gaming is not. Online office collaboration suites, CMS software, courseware, things like Google Apps, plus the portability of OpenOffice, thin Linux clients, and the fact that file compatibility is not 10% of the problem it used to be… it all spells change in the business arena. To maintain its supremacy, Microsoft is having to try harder and harder. The disaster that Vista's been has not helped much.

The gaming industry, on the other hand, is still where it was 10 years ago. The majority of games (almost all the big hits) are released for the PC and lazily ported to some other platform if it's worth the investment. Ever talked to a group of game programmers? They LOVE Microsoft more than any IT consultant I've ever met. They only know how to talk in terms of Visual Studio, DirectX, etc… the tools of their trade!
09 Jun, 2007, Kayle wrote in the 46th comment:
Votes: 0
On Dells:

Dell is a proprietary developer, and 90% of the time just opening your case and upgrading even the smallest thing will void your warranty. Not to mention, they show you these really nice case models and tell you one thing is in it, but when you actually get it, you find out there's about 1/4 of what they told you in it. For example, the description of my latest PC from Dell said that it had an AMD Turion 64 Dual Core with 2GB of RAM and 1GB of video RAM on a GeForce 7800. What it actually has is a P4 with 512MB of RAM and some cheesy-ass half-integrated video card that doesn't even HAVE dedicated video RAM.

So what did I do about this? Nothing. I'm just waiting it out, utilizing this POS as best I can until I can afford to hire a lawyer and file suit against them for false advertising. And then I'll get myself a set of four top-of-the-line towers: two to play web server with… and two to turn into gaming machines. >.> Meanwhile this POS will become strictly work-related BS. >.>

On what really controls OS development and where OSes are headed:

It's gaming. More people buy new computers sooner for gaming than for business. A business computer can run on a Pentium 3, and a multimedia computer (one used for visual effects in movies or the like) can still function rather well on a P4, but gaming… gaming requires cutting edge. Every new game that comes out is designed to push the limits of every aspect of the PC; it has to have better graphics than the last game to come out, and it has to run faster, smoother, and more efficiently. The bigger and badder these games get, the bigger and badder computers need to get. And when the computer gets bigger and badder, the OS needs to get bigger and badder. How many businesses look at a computer and say, "No way. There's not a chance in hell that P4 can run Microsoft Word fast enough for our people to work"? They won't, because Word isn't processor intensive. World of Warcraft is processor intensive. The business world can still run on computers that are "out-dated" by gaming standards; hell, a lot of companies still do.

Gaming drives the need for PCs to evolve, and when PCs evolve, OSes evolve, because the PCs get bigger and the OS needs to be adapted to handle the new level of power that PCs have attained. And as for gaming switching away from DirectX, I don't see that happening anytime soon.
10 Jun, 2007, Brinson wrote in the 47th comment:
Votes: 0
Did you contact Dell about it?

I imagine if it were that different from what you ordered, they'd replace it…
10 Jun, 2007, Guest wrote in the 48th comment:
Votes: 0
I've heard of Dell doing that to lots of people. They usually hope you won't notice that what you got isn't what you ordered. They're less likely to try that with a corporate customer, because corporations tend to have very good legal teams at the ready. But for them to give you something so completely out of spec? You should at least call them and let them know. Someone out there got your system and is rather happy with what they received :P

As far as legal action goes, if you make no effort at all to resolve the situation before getting to court, you could be in for a surprise. If the judge is feeling particularly unhappy that day, you might find your case dismissed or your damage award severely lacking because you didn't at least try to resolve it on your own first. If you call Dell, explain it, and still get fucked, then your case has considerably more weight to it. Especially if you can document it.
10 Jun, 2007, Brinson wrote in the 49th comment:
Votes: 0
It's possible the judge wouldn't even hear the case if he knew you hadn't contacted the company…

If you didn't notice the absence of some serious RAM or a couple GHz of power, then you might deserve to be screwed :-p
11 Jun, 2007, Dorian wrote in the 50th comment:
Votes: 0
Samson said:
Gaming has more of an influence on the evolution of hardware and processing power than anything else.


Last time I checked, businesses do far more than just word processing, which seems to be the only thing you assume computers are good for in the business world. For example, crash simulations in the auto industry put measly games to shame. Modeling of proteins during drug discovery is one of the most difficult practical problems known to mankind. Online shopping has become a huge phenomenon, and a huge headache. Business needs and interests are far more diverse than gaming is or ever will be. Do you really think that Microsoft and Intel are more concerned with the (large in its own right, but small in the overall context) gaming industry than with large corporations like Amazon and eBay, which distribute massive amounts of products and need reliable, fast components on both sides of the connection?

Samson said:
For the same reasons you cite for software, business is not in the habit of picking up new PCs every 18 months the way avid gamers are.


And? This shows nothing. A business will buy a new PC when the problem scope becomes more difficult. Businesses may certainly have a lot of areas where the problem scope doesn't change much over the course of five years. On the other hand, some areas change rapidly to stay on the cutting edge.

Samson said:
If it weren't for Microsoft constantly releasing operating systems that cater to gamers and therefore require more juice to run, we'd probably all still be using underpowered CPUs because business would prefer the platform remain stable.


Again, this implies that businesses do nothing more than word processing and that their task never changes. Making such a generalization hardly does justice to the wide scope of what is done in the business world.

bunabiros said:
… none of Dorian's arguments mention anything about how much Microsoft would in fact suffer if they lost the game development industry.


Because I was trying to point out that business is a far larger part of their market, which you seem to have missed? I'm not saying they wouldn't suffer, but it would be laughable to suggest that gaming would then be more important than business.

Anyway, I don't even think that Microsoft would suffer too much. Like I said, many would turn to other systems, including the Xbox. Even if they did that, they would still likely have a computer, which would still likely run Windows. I know plenty of gamers who could not easily make the Linux jump. Some might make the Mac jump, but not all. Even if Microsoft dropped gaming from their platform, I would be surprised to see a hit in their revenues, because of the necessity of having a computer today. Unless Microsoft makes money directly from DirectX being used, which I don't think is the case, the real loser would just be the gaming industry.

bunabiros said:
Sure, they're still the driving force in the business world, but all of that is changing, while gaming is not. Online office collaboration suites, CMS software, courseware, things like Google Apps, plus the portability of OpenOffice, thin Linux clients, and the fact that file compatibility is not 10% of the problem it used to be… it all spells change in the business arena. To maintain its supremacy, Microsoft is having to try harder and harder. The disaster that Vista's been has not helped much.


Wow. You're so right. Let me go write my boss a memo of impending change in Microsoft Word while I listen to tunes through Windows Media Player and surf the web with Internet Explorer!

Sarcasm aside, change is inevitable. Regardless, don't you think that the need for Microsoft to try harder in the business world means that's where their efforts are focused? Are you going to tell me that gaming is a larger market share than business?

Kayle said:
It's gaming. More people buy new computers sooner for gaming than for business. A business computer can run on a Pentium 3, and a multimedia computer (one used for visual effects in movies or the like) can still function rather well on a P4, but gaming… gaming requires cutting edge.


Right. It's not like companies like Google have ever needed a cutting-edge infrastructure.

Kayle said:
Every new game that comes out is designed to push the limits of every aspect of the PC; it has to have better graphics than the last game to come out, and it has to run faster, smoother, and more efficiently. The bigger and badder these games get, the bigger and badder computers need to get.


Please. At best, games target the upper-middle range of what computers are going to be like. Do you really think that games are developed in a vacuum? No! Games are developed according to how the industry is at the time and where it is going. Gaming merely follows the flow. Why in the world would such a small fraction of the entire world of computers, and not even the one with the most money, dictate how the industry is going to develop?

Kayle said:
How many businesses look at a computer and say, "No way. There's not a chance in hell that P4 can run Microsoft Word fast enough for our people to work"? They won't, because Word isn't processor intensive. World of Warcraft is processor intensive. The business world can still run on computers that are "out-dated" by gaming standards; hell, a lot of companies still do.


Maybe because not all aspects of business are word processing? Sorry, World of Warcraft is hardly intensive compared to running perfect alignment queries on DNA sequences. World of Warcraft can run on most regular PCs. You need one hell of a supercomputer to do the alignment, and even then it isn't necessarily perfect.
11 Jun, 2007, Scandum wrote in the 51st comment:
Votes: 0
Running Firefox takes between 50 and 100 MB of RAM with the occasional slow moment when my CPU gets stressed out.

Now it's quite possible that Firefox is lousy software, but even small software like a modern MUD client with a scrollback buffer and other fashionable features will end up using around 4 MB, which would have been impossible for most people to run 12 years ago. For many, Firefox would have been an impossible application to run 6 years ago, when plenty of machines still had 64MB of RAM.
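
For a sense of scale, here's a back-of-the-envelope sketch in Python of why a scrollback buffer alone can eat a meaningful chunk of those 4 MB. Every number in it is an assumption for illustration, not a measurement of any real client.

# Rough estimate of scrollback buffer memory. All figures are
# assumptions for illustration, not measurements of a real client.
from collections import deque

MAX_LINES = 10000        # assumed scrollback depth
AVG_LINE_BYTES = 80      # assumed average stored line length
PER_LINE_OVERHEAD = 50   # assumed per-line object/allocator overhead

scrollback = deque(maxlen=MAX_LINES)   # oldest lines fall off the back

def estimated_megabytes():
    return MAX_LINES * (AVG_LINE_BYTES + PER_LINE_OVERHEAD) / (1024.0 * 1024.0)

print(round(estimated_megabytes(), 2), "MB")   # ~1.24 MB for the text alone

The remaining megabytes would be code, fonts, and GUI widgets, which is why even a "small" client lands in the multi-megabyte range.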

I'd guess the maturing of software and operating systems is one of the biggest drivers behind hardware development. Most computers sold in stores don't have the graphics cards required to play the latest batch of games anyway, which implies that most people don't play them.

Hardware requirements will level out eventually though, just as the power of the average car engine hasn't changed in the past 20 years. I estimate that 4GB of memory, a quad-core 4GHz CPU, and a terabyte disk will do the trick for most people. After that, the reliability of hardware and the speed of hard disks will become increasingly important.

If hardware development stagnates, companies like Microsoft will find themselves out of business. That might be an interesting development.
11 Jun, 2007, Brinson wrote in the 52nd comment:
Votes: 0
Holy shit… 50 to 100 megs to run Firefox?
11 Jun, 2007, bunabiros wrote in the 53rd comment:
Votes: 0
Dorian said:
Sarcasm aside, change is inevitable. Regardless, don't you think that the need for Microsoft to try harder in the business world means that's where their efforts are focused? Are you going to tell me that gaming is a larger market share than business?

No. I'm telling you that Microsoft is slaughtering the competition in the gaming market and slowly losing in the business market, and that Vista is only exacerbating both of these trends.

Also, with regard to DNA modeling and advanced stuff like that: I've never read of anyone using Windows for it. But I've frequently read and heard about companies and universities that use *nix and even Apple machines for this stuff… just my own experience.
11 Jun, 2007, Caius wrote in the 54th comment:
Votes: 0
Scandum said:
Hardware requirements will level out eventually though, just as the power of the average car engine hasn't changed in the past 20 years.

That may not be a perfectly valid analogy, though. Speed limits and a general improvement in the quality of roads in most countries make an overly powerful car engine pointless. Computer hardware doesn't have these limitations.
11 Jun, 2007, Scandum wrote in the 55th comment:
Votes: 0
Caius said:
That may not be a perfectly valid analogy, though. Speed limits and a general improvement in the quality of roads in most countries make an overly powerful car engine pointless. Computer hardware doesn't have these limitations.

There's at least one limitation commonly known as the energy bill.

Another limit is the human eye and brain, which will be quite pleased with the video rendering capabilities that will soon be reached.

A good example of this limit is audio hardware, which has already come to a standstill. I'm sure there's still room for improvement, but I know few people who would buy a $500 audio card that is 2.3% better than a $50 audio card.
11 Jun, 2007, Guest wrote in the 56th comment:
Votes: 0
@Dorian

Wow. Ok. So apparently your worldview is a bit limited, because last I checked, most of the industries you cite as examples are not using Windows computers to do the simulation work you describe. There's a reason RISC processors and other ultra-high-end hardware exist. They tend to run customized operating systems for the bulk of what they do. That doesn't mean the clients looking in on the data aren't using Windows, but you certainly aren't doing the actual work in it. These are not what people think of when the words "business computing" come up. Business computing is usually taken to mean the average office slave who uses Word to write documents, Excel to build spreadsheets, and PowerPoint to throw it all together for the next big presentation for the big boss. None of this requires an especially powerful computer, because these types of apps don't typically do anything while the user is idle.

Games, on the other hand, generally have a lot of background jobs to process while the user is idle. And oftentimes the user will find that idling gets them killed, so they tend to avoid it unless they have no choice. Therefore it is in fact safe to say that WoW is more processor intensive than having Word, Excel, and PowerPoint all open at the same time. Since games are generally very visually intense, the quality of the CPU, RAM, and graphics hardware is important for a good play experience. So this tends to drive the demand for higher and higher powered cards. Do you honestly think nVidia and ATI are marketing their entire lineup to automotive crash simulators or DNA folding? Their entire business depends on a market you've dismissed as insignificant. AMD and Intel both put things into their CPUs to optimize gaming performance. Why would they do this if the market weren't large enough to matter?
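
To make the idle-time difference concrete, here is a minimal Python sketch contrasting the two models. It is purely illustrative; the function names are made up, and no real office app or game engine is structured this simply.

import time

# Event-driven model (typical office app): block until input
# arrives, burning essentially no CPU while the user is idle.
def office_app(get_event, handle_event):
    while True:
        event = get_event()      # blocks; near 0% CPU while waiting
        handle_event(event)

# Game-loop model: tick the simulation every frame whether or not
# the player does anything, so the CPU stays busy even during idle.
def game_loop(poll_input, update_world, render, fps=60):
    frame_time = 1.0 / fps
    while True:
        poll_input()              # non-blocking input check
        update_world(frame_time)  # AI, physics, other players keep moving
        render()
        time.sleep(frame_time)    # crude frame pacing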

@Scandum and Brinson:

I bet if you took all of the components involved in running IE, you'd find that the resource usage is much, MUCH higher than 50MB. You can't rely solely on what Task Manager tells you, because it'll only report what the iexplore.exe process itself uses. And I've seen even just that get pretty high. It all depends on what you do with the browser. Firefox can't offload work onto the system components the way IE does, so its entire footprint is visible in one process. Then again, I use Firefox precisely because it doesn't rely on the IE components to run :)
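
If you want to total it up yourself rather than eyeball Task Manager, here is a minimal Python sketch. It assumes the third-party psutil package is installed, and RSS is only one imperfect measure of "memory used", so treat the output as a rough indicator.

# Sum resident memory (RSS) across all processes whose name matches,
# e.g. a browser that splits its work over several processes.
import psutil

def total_rss_mb(name_fragment):
    total = 0
    for proc in psutil.process_iter(['name', 'memory_info']):
        pname = proc.info['name'] or ''
        mem = proc.info['memory_info']   # may be None if access is denied
        if mem and name_fragment.lower() in pname.lower():
            total += mem.rss
    return total / (1024.0 * 1024.0)

print(round(total_rss_mb('firefox'), 1), "MB")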

I've also seen MUD clients consume vast amounts of resources. When I still used zMud, it would regularly consume as much as 100MB. I never figured out why, and Zugg insisted I was lying when I tried to report it. He even went so far as to accuse me of doctoring the screenshots of Task Manager. As if I care enough to waste that kind of time on a bug report. Several other MUD clients aren't much better.

I think far too many application and game developers are caught in the "well, everyone has 2GB" mentality. There's really no reason for it. We'd all be much happier if people paid attention to resource optimization and stopped writing bloatware.
13 Jun, 2007, mann_jess wrote in the 57th comment:
Votes: 0
Quote
Hardware requirements will level out eventually though

That's been said before… many times. The issue you have to take into account is that, as computer "ability" increases, the things a computer can do will increase. Simply put, this means that new things will be created and utilized that we can't prepare for now. There was a time when no one could have envisioned Photoshop, and the same is true now for other applications that will one day be commonplace.

Anyway, I'm rather surprised that no one has seriously suggested Linux as a solution to the Vista thing. It's been brought up, and seemingly dismissed, a few times. I'm curious: why does everyone feel that way?

From my personal experience: I just recently moved my laptop over to Ubuntu, and I figured I'd do a dual boot with XP while I was at it. Looking at it solely from a "computer-illiterate standpoint" (as much as I can, anyway), the Ubuntu installation was infinitely easier than XP, and frankly, after installation, I didn't once *have to* use the command line for anything. (I did anyway, but that's beside the point.) XP, on the other hand, was riddled with configuration options and then had problems on top of it all.

Personally, I think Linux is really coming up fast. Excepting some really odd or outdated hardware, the most recent Ubuntu and Fedora run out of the box with GUIs for everything. Honestly, I think the next couple of releases of a few choice Linux distributions are going to really pull (at least some of) the market over… especially with Vista being so disappointing.

Best of Luck,
-Jess
13 Jun, 2007, Scandum wrote in the 58th comment:
Votes: 0
mann_jess said:
Quote
Hardware requirements will level out eventually though

That's been said before… many times. The issue you have to take into account is that, as computer "ability" increases, the things a computer can do will increase. Simply put, this means that new things will be created and utilized that we can't prepare for now. There was a time when no one could have envisioned Photoshop, and the same is true now for other applications that will one day be commonplace.

Not necessarily. There's a significant limitation to the complexity of the software that can be developed. OOP hasn't changed that one bit.
13 Jun, 2007, Guest wrote in the 59th comment:
Votes: 0
mann_jess said:
Anyway, I'm rather surprised that no one has seriously suggested Linux as a solution to the Vista thing. It's been brought up, and seemingly dismissed, a few times. I'm curious: why does everyone feel that way?


When recommending new systems to people, I tend to avoid prodding them in the direction of something that still requires elevated tech knowledge to use. As good as Ubuntu and Fedora have become lately, they're still Linux, and many things are still approached from the geek standpoint. Like program installs. I don't like doing the whole "configure/make/make install" crap myself. How am I supposed to convince a new user to like it? :P

People want to use the computer, not spend half their time working on the computer. I had hoped that by now Linux in general would have woken up to this fact, but I don't think it ever will. Kernel got upgraded during a system update? Oops. Need new drivers. Drivers require you to compile from source? Ugh. I think this is probably the major reason you won't see Linux widely adopted on the desktop for some time to come. So in the meantime that leaves people with Windows PCs or Macs.
13 Jun, 2007, mann_jess wrote in the 60th comment:
Votes: 0
Samson:
Actually, I haven't had to do any of that on my Ubuntu install, which is precisely why I think it's making progress. There have only been a few times I've had to install from source, and that was because I was doing something "techy" anyway. Have you used Ubuntu recently? There's a GUI frontend to "aptitude" called "Synaptic Package Manager" which handles everything. It searches every repository for available packages, and when requested, downloads a package, checks and installs all of its dependencies, and does any configuration. Plus, it puts everything in the right place, all without user involvement. That's actually *easier* from a non-techy standpoint than Windows installs.
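
For the curious, the dependency-checking part of what a package manager does can be sketched in a few lines of Python. This is a toy with a made-up package table, not how apt or Synaptic is actually implemented.

# Toy dependency resolution: compute an install order in which every
# package's dependencies are installed before the package itself.
# The DEPENDS table is invented purely for illustration.
DEPENDS = {
    'mud-client': ['libcurses', 'zlib'],
    'libcurses': ['libc'],
    'zlib': ['libc'],
    'libc': [],
}

def install_order(package, done=None):
    if done is None:
        done = []
    for dep in DEPENDS.get(package, []):
        if dep not in done:
            install_order(dep, done)
    if package not in done:
        done.append(package)
    return done

print(install_order('mud-client'))
# ['libc', 'libcurses', 'zlib', 'mud-client']

Real tools solve a much harder versioned-constraint problem on top of this, but the "figure out what else you need and fetch it first" idea is the same.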

That said, I did mention Fedora, but I haven't fooled around with it all that recently. Last I checked, it broke pretty consistently due to the vast number of unstable or unconfigured updates. In fact, the first time I installed it, the "new release" was so new that the repositories weren't even enabled, which prevented me from obtaining *any* software and (due to another issue) from using the OS at all. I've heard that's improved a bit, though, which is why I'm sort of casually watching it.

Honestly, though, based on my experience with Ubuntu 7, I've been just on the verge of recommending it to a few non-techy people, including my family. The fact that there is a GUI for everything alone makes it, I think, worth a shot as an alternative to Vista.

*shrug*

Quote
Not necessarily. There's a significant limitation to the complexity of the software that can be developed. OOP hasn't changed that one bit.

It's not really a matter of OOP, and yes, *logically speaking*, there is a limit to the complexity of anything. However, that limit isn't a realistic one. Photoshop can *always* run faster and process bigger images. A program can *always* use more memory to make it just a little bit better. A lot of innovation has come from the gaming industry over the years, and if you look just at that, we aren't anywhere *near* having a virtual world that is indistinguishable from reality. However, the gaming industry is pursuing that, and I guarantee you, hardware will improve to accommodate it.

Best of Luck,
-Jess