but i hit another snag: my dad saw the electric bill from last month and he's pissed at the amount of power we used. so i don't think i'll be able to run two computers at the same time right now.
Tell your Dad to turn off a friggin light bulb. A typical old computer, with no monitor and just a single hard drive, will use a whopping 100W or so of power continuously. That's one bright incandescent bulb, or two not-so-bright lamps. Your Dad is probably using more electricity sitting in front of the TV than a server is likely to use.
Yes, they do add up… but there are plenty of ways to cut your power consumption in other areas too. Do those first. :)
the cheapest apartment in my area is $900 a month with only heat included. water, power, internet, and cable are extra.
figuring that my income is only $950 a month after taxes, and my truck goes through $85 a month in gas (even with coasting downhill in neutral and getting a discount for working at the gas station i fuel up at), i don't think i can afford to move out just yet.
and before you say get a better paying job: with this economy, i'm lucky to have the one i've got.
The point he was making is that an old computer typically uses about as much power as a 100 watt light bulb. An added point I think he was getting at (and if not, I'll state it now) is that the energy use, and therefore the cost, depends on how much power a device draws and how long it's on, not just on whether it's on (a nightlight is not the same thing as a water heater).
A watt is a unit of power; what you buy is energy. A watt is the rate of use of energy, so time is only a factor in energy, not power. Normally you buy in kWh, which is 1000 watts used for 1 hour (or 1 watt for 1000 hours, or 100 watts for 10 hours, etc.). In short, the old computer will use about as much energy as the 100 watt light bulb running the same amount of time.
A typical old computer, with no monitor and just a single hard drive, will use a whopping 100W or so of power continuously.
is that an hour?
No, that's continuous draw. Your power company charges by the kilowatt hour. To determine what the effect on your power bill will be, the formula is watts * hours per day * days per month / 1000 = kilowatt hours for one month, and then kilowatt hours for the month * price per kilowatt hour = total addition to power bill.
To use quix's example, if your server drew 100 watts (which is a high estimate for a barebones server; mine draws ~60ish):
100 watts * 24 hours per day * 30 days per month / 1000 = 72 kilowatt hours per month
72 kilowatt hours per month * $0.12 per kilowatt hour (national average, yours may vary) = $8.64 total addition to the power bill
That's to run the server 24/7/365, as well. If you're not actually hosting the game yet, and just want to use the box for development, then you can shut it off when you're done with it every day and cut this amount drastically.
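The formula above can be sketched as a small Python helper. The 24-hour day, 30-day month, and $0.12/kWh rate are just the thread's example numbers, passed in as defaults you can override:

```python
def monthly_cost(watts, hours_per_day=24, days_per_month=30, price_per_kwh=0.12):
    """Monthly electricity cost in dollars for a device with a given continuous draw."""
    kwh = watts * hours_per_day * days_per_month / 1000  # kilowatt hours for one month
    return kwh * price_per_kwh

# A 100 W server running around the clock:
print(f"${monthly_cost(100):.2f}")                    # prints $8.64

# The same box powered on only 8 hours a day for development:
print(f"${monthly_cost(100, hours_per_day=8):.2f}")   # prints $2.88
```

That second call is the "shut it off when you're done" case: cutting the runtime to a third cuts the cost to a third.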
19 Sep, 2010, quixadhal wrote in the 26th comment:
Votes: 0
Yep, and my point is that the amount of energy a server typically uses is far less than most people waste by leaving extra lights on, holding the door open while they bring in groceries or chat with someone who's leaving while the AC or heat is running, or firing up the oven just to cook one item (like a frozen pizza).
19 Sep, 2010, Ssolvarain wrote in the 27th comment:
Votes: 0
But trying to explain this to a grumpy old man is nearly impossible.
I used to wake up with all my stuff just unplugged.