
Watts are a measure of the rate at which electrical energy is being used (consumed or released). The amount of power depends on the voltage and the resulting current. In fact, to calculate watts, you simply multiply the voltage by the current. For example, if you had a 120-volt source and a 5-amp current, you would have a 600-watt rate of energy consumption.
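The multiplication above can be sketched in a couple of lines of Python; the 120 V and 5 A figures are the example from the text:

```python
def watts(volts, amps):
    """Power in watts is voltage (volts) multiplied by current (amps)."""
    return volts * amps

# The example from the text: a 120-volt source driving a 5-amp current.
print(watts(120, 5))  # 600 watts
```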

When you have 1,000 watts, that's equal to 1 kilowatt (kW). Five thousand watts equals 5 kilowatts (kW).

We pay for electricity based on the amount we consume. The amount we consume is measured by taking the rate at which we consume it and keeping track of how long we consume it.

We pay for electricity in units called kilowatt-hours (kWh). If you use electricity at the rate of 1,000 W for one hour, you will have consumed 1 kilowatt-hour (kWh) of energy. This may cost $0.10 from your electrical utility (this varies dramatically depending on where you are).
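As a small sketch of that definition (the $0.10/kWh price is the illustrative figure from the text, not a real tariff):

```python
def kwh_consumed(power_watts, hours):
    """Energy in kWh is power converted to kW, times hours of use."""
    return (power_watts / 1000) * hours

PRICE_PER_KWH = 0.10  # illustrative price from the text; real rates vary widely

energy = kwh_consumed(1000, 1)   # 1,000 W for one hour
cost = energy * PRICE_PER_KWH
print(energy, cost)              # 1.0 kWh, costing $0.10
```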

Remember how your parents used to tell you to turn off the lights? Let's figure out what it costs to leave a 60-watt light bulb on for a week. Sixty watts equals 60/1000 = 0.06 kW, so our rate of consumption is 0.06 kW. There are 168 hours in a week, so we've consumed 0.06 × 168 = 10.08 kWh by leaving one 60-watt light bulb on for one week.
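The same weekly-bulb arithmetic, step by step:

```python
power_kw = 60 / 1000       # a 60-watt bulb expressed in kW
hours_per_week = 24 * 7    # 168 hours in a week
energy_kwh = power_kw * hours_per_week

print(round(energy_kwh, 2))  # about 10.08 kWh for the week
```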

We said that a kWh may cost $0.10. Therefore it costs about $0.10 × 10.08 kWh, or roughly $1.00, to leave a 60-watt bulb on for a week. If you left the light bulb on for a full year, it might cost about $52. Were your parents overreacting? On the other hand, if you left all the lights in the house on, that might be 3,000 watts (say, 50 light bulbs rated at 60 watts each). That is 50 times as much electricity, so the cost for the year is about $52 × 50, or roughly $2,600! Maybe we should turn those lights off.
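Putting the whole yearly estimate together (again assuming the text's illustrative $0.10/kWh price and a hypothetical house of 50 bulbs at 60 W each):

```python
price_per_kwh = 0.10        # illustrative price from the text
watts_per_bulb = 60
bulbs = 50                  # assumed number of bulbs in the house
hours_per_year = 24 * 365   # 8,760 hours

one_bulb_year = (watts_per_bulb / 1000) * hours_per_year * price_per_kwh
all_bulbs_year = one_bulb_year * bulbs

print(round(one_bulb_year, 2))   # about $52 per bulb per year
print(round(all_bulbs_year))     # about $2,600 for all 50 bulbs
```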
