Electricity used at any moment is measured in watts: a 100-watt bulb, naturally, draws 100 watts of electricity. A desktop computer uses about 65 watts and a central air conditioner about 3,500 watts. Because the watts add up quickly, the term kilowatt is used to represent 1,000 watts. To understand how much energy you're using, you also have to consider how long you run your appliances. When you use 1,000 watts for an hour, that's a kilowatt-hour. The key is to reduce the number of kilowatt-hours you use each month in order to save money and our natural resources… and a teacher in New York has a clever idea.
How Much is a Kilowatt?
Electricity is usually measured in kilowatt-hours (kWh). One kWh represents the amount of energy a 1,000-watt device, such as a clothes iron or a microwave oven, needs to operate for one hour. Leaving a 100-watt light bulb on for 10 hours likewise consumes 1 kilowatt-hour (kWh) of energy. According to the Organization of American States' Office of Sustainable Development, the average American household uses about 10,000 kWh yearly. Electric utilities typically charge their customers by the kilowatt-hour, and the rate fluctuates over time and varies dramatically by region. In the United States, for example, the average residential cost of a kilowatt-hour is between 8 and 10 cents. Check your latest utility bill to find out what you're paying per kilowatt-hour.
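The arithmetic above can be sketched in a few lines of Python: energy in kWh is power in kilowatts times hours of use, and cost is kWh times your utility's rate. The 9-cent rate below is just the midpoint of the 8-to-10-cent range mentioned above; substitute the rate from your own bill.

```python
def energy_kwh(power_watts, hours):
    """Convert a device's power draw and running time into kilowatt-hours."""
    return power_watts / 1000 * hours

def cost_dollars(kwh, rate_per_kwh):
    """Electricity cost at a given utility rate ($ per kWh)."""
    return kwh * rate_per_kwh

# A 100-watt bulb left on for 10 hours uses 1 kWh...
bulb_kwh = energy_kwh(100, 10)

# ...which costs about 9 cents at an assumed $0.09/kWh rate.
print(bulb_kwh, cost_dollars(bulb_kwh, 0.09))
```

The same two functions work for any appliance: a 3,500-watt air conditioner running 8 hours a day, for instance, is `energy_kwh(3500, 8)`, or 28 kWh.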
Ken Luna, a science teacher in New York, had a great idea. What if every student in the United States replaced one standard 60-watt bulb with a compact fluorescent (CFL) bulb? Ken believes that this simple, inexpensive action will help fight global warming by reducing carbon emissions from electric power plants, save Americans AT LEAST 2.3 BILLION dollars in electricity costs, and help put America on the path to environmental sustainability. Learn more about Mr. Luna's Bright Idea.
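A back-of-the-envelope sketch shows how a savings figure of this magnitude can arise. The numbers below are assumptions for illustration, not figures from Mr. Luna: roughly 50 million U.S. students, a 60-watt incandescent swapped for a typical 13-watt CFL, a 10,000-hour rated CFL lifetime, and a 9-cent-per-kWh rate.

```python
# All constants are ASSUMED for illustration, not taken from the article.
STUDENTS = 50_000_000        # rough U.S. K-12 enrollment
WATTS_SAVED = 60 - 13        # incandescent draw minus typical CFL draw
CFL_LIFETIME_HOURS = 10_000  # typical rated CFL lifetime
RATE = 0.09                  # assumed electricity rate, $ per kWh

# Energy saved over one bulb's lifetime, in kWh.
kwh_saved_per_bulb = WATTS_SAVED / 1000 * CFL_LIFETIME_HOURS

# Dollar savings per bulb, then scaled to one bulb per student.
dollars_per_bulb = kwh_saved_per_bulb * RATE
total_savings = dollars_per_bulb * STUDENTS

print(f"${total_savings / 1e9:.1f} billion")
```

Under these assumptions each bulb saves 470 kWh (about $42) over its lifetime, and the nationwide total lands around two billion dollars, the same order of magnitude as the figure above.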