In many homes, the television is on for hours every day. While TVs aren't as power-hungry as appliances like air conditioners or washing machines, it's still worth knowing how much electricity your TV uses when you're looking at your whole home's energy usage.
Key takeaways about TV wattage
On average, TVs use 50 to 200 watts of electricity, though the exact figure depends heavily on the model you have.
100 watts is a safe average power draw to assume for modern TVs from top manufacturers.
To find how much it costs to run your TV, multiply its yearly electricity use (in kWh) by the average electricity rate in your area.
Using a 100 W TV 21 hours a week (3 hours per day) will use about 109.5 kilowatt-hours of electricity per year.
It costs an average of $1.30 per month, or $15.54 per year, to run a TV.
The best way to save on electricity is to install solar panels. Start comparing your options on the EnergySage Marketplace today.
How much electricity does a TV use?
Generally, TVs use between 50 and 200 watts (W) of electricity, depending on the model. Most TVs draw less than one amp and connect to a standard 120 volt outlet. Older plasma and CRT TVs are much less efficient than newer LED and LCD TVs.
How much you run your TV has the biggest impact on how much electricity it uses over time, and households have all sorts of television watching schedules. Interestingly, the average time U.S. adults spend watching TV has been falling for several years, and nowadays, it's around three hours per day. Assuming an average TV wattage of 100 W (there's a short calculation sketch after this list):
If you watch TV 1.5 hours per day, that's 1.05 kilowatt-hours (kWh) of electricity per week, about 4.56 kWh per month, and 54.75 kWh per year.
Running that same TV 3 hours per day comes to 2.1 kWh per week, about 9.13 kWh per month, and 109.5 kWh per year.
If you run a 100 W TV for 4.5 hours per day, that's 3.15 kWh of electricity per week, about 13.69 kWh per month, and 164.25 kWh per year.
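If you'd like to reproduce these figures yourself, here's a minimal Python sketch of the arithmetic (the function name is just for illustration), assuming the 100 W average used above:

```python
# Estimate a TV's electricity use from its power draw and daily viewing time.
def tv_energy_kwh(watts: float, hours_per_day: float) -> dict:
    """Return weekly, monthly, and yearly electricity use in kWh."""
    daily_kwh = watts * hours_per_day / 1000  # watts x hours / 1,000 = kWh
    return {
        "weekly_kwh": daily_kwh * 7,
        "monthly_kwh": daily_kwh * 365 / 12,
        "yearly_kwh": daily_kwh * 365,
    }

# A 100 W TV running 3 hours per day:
print(tv_energy_kwh(100, 3))  # ~2.1 kWh/week, ~9.1 kWh/month, ~109.5 kWh/year
```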
Different wattage TVs use different amounts of electricity over the course of a year. Assuming you run your TV an average amount (3 hours per day, every day), here’s how much electricity you’ll use over the course of a year:
How many watts do different TVs use in a year?

TV wattage | Hours run per year | Yearly electricity use
50 W | 1,095 | 54.75 kWh
75 W | 1,095 | 82.13 kWh
100 W | 1,095 | 109.5 kWh
125 W | 1,095 | 136.9 kWh
150 W | 1,095 | 164.3 kWh
175 W | 1,095 | 191.6 kWh
200 W | 1,095 | 219 kWh
We’ll mostly be referring to the electricity used by TVs in terms of kWh in this article. The reason is simple: your electric bill is measured in kWh, and you get charged based on the kWh of electricity you use per month!
How many volts and amps does a TV use?
The wattage of an appliance is determined by its voltage and amperage. You can use the yellow EnergyGuide label on your TV to work out the volts and amps it uses.
For example, suppose the label estimates 180 kWh of electricity use per year. Here's how you can calculate volts and amps:
Translate energy consumption to watt-hours (Wh) by multiplying the label’s kWh by 1,000. This gives you 180,000 Wh.
Divide 180,000 Wh by the number of days in a year you use your TV (likely 365), which gives you about 493 Wh per day. TVs run an average of 3 hours per day, so divide by 3 to get a power draw of about 164 W.
TVs usually plug into 120 volt outlets. Divide 164 W by 120 volts to get the amperage for your appliance: 164 W / 120 V ≈ 1.37 amps.
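Here's that label math as a short Python sketch, using the 180 kWh example figure and the 3 hours per day average from above:

```python
# Work backward from an EnergyGuide label to watts and amps.
label_kwh_per_year = 180   # estimated yearly use from the example label
hours_per_day = 3          # average daily viewing time
outlet_volts = 120         # standard U.S. outlet voltage

wh_per_day = label_kwh_per_year * 1000 / 365  # ~493 Wh per day
watts = wh_per_day / hours_per_day            # ~164 W while the TV is on
amps = watts / outlet_volts                   # ~1.37 A

print(f"{watts:.0f} W, {amps:.2f} A")  # 164 W, 1.37 A
```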
Watts, amps, voltage, and more: what do they mean?
There are a lot of terms you can use to describe how electricity flows and is used by appliances. We’ve already mentioned most of them – here are a few definitions to keep things straight:
Volts (V): volts measure voltage, the difference in electrical pressure between two points. Put simply, voltage is the "pressure" pushing electricity through a circuit.
Amps (A): amps (short for amperes) are a measure of electrical current. Put simply, amps are the amount of electrons (which make up electricity) flowing through a circuit each second.
Watts (W) and kilowatts (kW): multiplying volts x amps gets you watts (or wattage). Put simply, watts are the rate of electricity consumption. A kilowatt is just 1,000 watts.
Kilowatt-hours (kWh): lastly, kilowatt-hours are how your electric bill measures your energy usage. Simply put, kilowatt-hours are electricity consumption over time.
You can think of all of these terms like water flowing through a pipe. Voltage is the water pressure, amps are the amount of water flowing past any point, and wattage is the overall rate of water flow through the pipe.
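To tie the definitions together, here's a quick sketch in Python using the example TV's numbers from the label calculation above:

```python
# How volts, amps, watts, and kilowatt-hours relate for the example TV.
volts = 120                    # electrical "pressure" (the water pressure)
amps = 1.37                    # current (the amount of water flowing)
watts = volts * amps           # ~164 W: the rate of consumption (flow rate)
hours_on = 3                   # daily viewing time
kwh = watts / 1000 * hours_on  # ~0.49 kWh: consumption over time
print(f"{watts:.0f} W -> {kwh:.2f} kWh per day")
```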
How much does it cost to power a TV?
When you get your monthly electric bill, you only get to see the total amount you’re charged, not how much each appliance contributes to your final bill. Based on an average wattage of 100 W for TVs (amounting to 109.5 kWh/year if you use your TV like an average household would) and using state average electricity rates, here’s how the cost to run a TV pans out over the course of a month and a year:
Monthly and yearly costs to run a TV by state

State | Average electricity rate | Cost per month | Cost per year
California | 22.00 ¢/kWh | $2.01 | $24.09
New York | 20.59 ¢/kWh | $1.88 | $22.55
Texas | 12.56 ¢/kWh | $1.15 | $13.75
Massachusetts | 22.59 ¢/kWh | $2.06 | $24.74
Florida | 12.21 ¢/kWh | $1.11 | $13.37
Virginia | 12.58 ¢/kWh | $1.15 | $13.78
New Jersey | 16.20 ¢/kWh | $1.48 | $17.74
Maryland | 14.48 ¢/kWh | $1.32 | $15.86
Washington | 10.38 ¢/kWh | $0.95 | $11.37
US Average | 14.19 ¢/kWh | $1.30 | $15.54
Note: average electricity rates are based on October 2021 data from the U.S. Energy Information Administration (EIA).
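To reproduce a few rows of this table yourself, here's a minimal sketch; results may differ from the table by a cent here and there due to rounding:

```python
# Yearly cost = yearly kWh x electricity rate; monthly cost = yearly / 12.
yearly_kwh = 109.5  # 100 W TV running 3 hours per day
rates_cents_per_kwh = {"California": 22.00, "Texas": 12.56, "US Average": 14.19}

for state, cents in rates_cents_per_kwh.items():
    yearly_cost = yearly_kwh * cents / 100  # convert cents to dollars
    print(f"{state}: ${yearly_cost / 12:.2f}/month, ${yearly_cost:.2f}/year")
```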
Looking to offset your electric bills (and the energy these appliances use) with solar? When you sign up (for free!) on the EnergySage Marketplace, you can compare solar quotes from high-quality, local solar installers. For the most accurate quotes, keep your current and future electricity usage in mind, and talk with your installer about how it could change.
How to calculate how much energy your TV uses
Remember that yellow EnergyGuide label we mentioned above? If you want to know how much electricity your TV uses (or at least is supposed to use), the estimated yearly electricity use in kWh printed on the label is probably your best bet for an accurate number. Simply multiply that number by the average electricity rate in your area to estimate how much you spend to power your TV each year. For an estimated monthly cost, divide the yearly cost by 12.
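As a sketch, using a hypothetical label that estimates 180 kWh per year and the U.S. average rate from the table above:

```python
# Estimate yearly and monthly TV costs straight from the EnergyGuide label.
label_kwh_per_year = 180       # estimated yearly electricity use on the label
rate_dollars_per_kwh = 0.1419  # U.S. average rate used in this article

yearly_cost = label_kwh_per_year * rate_dollars_per_kwh
print(f"${yearly_cost:.2f}/year, ${yearly_cost / 12:.2f}/month")  # $25.54, $2.13
```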
See what electricity costs near you
The more expensive your electricity is, the more you’ll pay to power your television and other appliances. Curious how much electricity costs near you? Click on your state to learn more:
Arkansas, Arizona, California, Colorado, Connecticut, Washington D.C., Florida, Georgia, Iowa, Idaho, Illinois, Indiana, Louisiana, Massachusetts, Maryland, Maine, Michigan, Minnesota, Missouri, North Carolina, New Hampshire, New Jersey, New Mexico, Nevada, New York, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Texas, Utah, Virginia, Washington, Wisconsin
Frequently asked questions about powering a TV
What’s the best time to run a TV?
If you’re on a time-of-use (TOU) rate plan, you are charged different amounts for electricity throughout the day. In general, it’s cheaper to use appliances during “off-peak” hours, which are usually overnight.
What size battery do you need to back up a TV?
All popular home batteries are capable of powering a TV: most lithium-ion batteries, like the Tesla Powerwall or Generac PWRcell, have a power rating of 4 to 5 kW or higher and 10+ kWh of usable capacity. A TV draws only about 100 W (0.1 kW) at any one time, so a home battery can easily back up and power your TV, even for long periods of time.
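Here's a rough sketch of that math, assuming 10 kWh of usable capacity and a 100 W TV; real backup time also depends on what else the battery is powering:

```python
# Rough backup time: usable battery capacity divided by the TV's draw.
battery_usable_kwh = 10  # typical lithium-ion home battery
tv_kw = 0.1              # ~100 W TV
print(f"~{battery_usable_kwh / tv_kw:.0f} hours of TV per charge")  # ~100 hours
```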
How many solar panels does it take to run a TV?
Most TVs draw between 50 and 200 W of power to stay on. Residential solar panels are typically rated around 350 W, so a single panel can easily generate enough electricity to run a TV.
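Here's a back-of-the-envelope sketch; the 4 peak sun hours per day figure is an assumption that varies a lot by location:

```python
# Compare one panel's yearly output to a TV's yearly electricity use.
panel_kw = 0.350            # typical residential panel rating
peak_sun_hours_per_day = 4  # assumption; varies by location
panel_kwh_per_year = panel_kw * peak_sun_hours_per_day * 365  # ~511 kWh
tv_kwh_per_year = 109.5     # 100 W TV, 3 hours per day

print(f"One panel covers ~{panel_kwh_per_year / tv_kwh_per_year:.1f} TVs")  # ~4.7
```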
What are ENERGY STAR appliances?
ENERGY STAR is a U.S. government-backed program that certifies how energy efficient appliances are. If an appliance beats the average appliance in its category by a certain amount, it is labeled "ENERGY STAR certified." ENERGY STAR appliances cost less to run because they use electricity more efficiently.
How much money can solar panels save you?
Solar savings vary widely, and your savings depend on factors like your electricity usage, location, electric rates and plans, and more. In general, most homeowners can expect to save somewhere between $10,000 and $30,000 over the lifetime of a solar panel system. On average, it takes 7 to 8 years for homeowners who shop for solar on EnergySage to see their panels pay for themselves.
Going solar is one of the most effective ways to reduce or eliminate your electric bill, and you should make sure you are getting several quotes from reputable installers before you decide to move forward. Visit the EnergySage Marketplace to get solar quotes from installers in your area and begin comparing options.