A certain amplifier requires 200 watts when it is being used. How much would it cost to run it for 55 minutes at a rate of $0.11 per kWh? Round to the nearest cent.

Answer:

2 cents

Step-by-step explanation:

Given:

A certain amplifier requires 200 watts when it is being used.

Cost of 1 kilowatt-hour = $0.11

Question asked:

How much would it cost to run for 55 minutes?

Solution:

First we will convert 200 watts into kilowatts, then find the cost of running 200 watts for 55 minutes using the unitary method.

As we know:

1 kilowatt = 1000 watts

1 watt = [tex]\frac{1}{1000}[/tex] kilowatt

200 watts = [tex]\frac{1}{1000}\times200=\frac{1}{5} \ kilowatt[/tex]

Cost of 1 kilowatt-hour = $0.11   (given)

Cost of [tex]\frac{1}{5}[/tex] kilowatt for 1 hour = [tex]0.11\times\frac{1}{5} =0.022[/tex]

Cost of [tex]\frac{1}{5}[/tex] kilowatt in 60 minutes = 0.022

Cost of [tex]\frac{1}{5}[/tex] kilowatt in 1 minute = [tex]\frac{0.022}{60}[/tex]

Cost of [tex]\frac{1}{5}[/tex] kilowatt in 55 minutes = [tex]\frac{0.022}{60}\times55=\frac{1.21}{60} \approx 0.0202[/tex]

Now, to convert dollars into cents, we multiply by 100:

[tex]0.0202\times100\approx2.02[/tex]

Therefore, rounded to the nearest cent, it would cost 2 cents to run the 200-watt amplifier for 55 minutes.
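The unitary-method steps above can be sketched in Python as a quick check (variable names are illustrative, not part of the original solution):

```python
# Unitary method: cost per hour, then per minute, then for 55 minutes.
power_kw = 200 / 1000        # 200 W expressed in kilowatts (0.2 kW)
cost_per_kwh = 0.11          # dollars per kWh (given)

cost_per_hour = power_kw * cost_per_kwh   # cost for 60 minutes = 0.022
cost_per_minute = cost_per_hour / 60      # cost for 1 minute
cost_55_min = cost_per_minute * 55        # cost for 55 minutes

print(round(cost_55_min * 100))  # cost in cents, rounded to the nearest cent
```

Running this prints 2, matching the answer of 2 cents.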

The cost of running the 200-watt amplifier for 55 minutes at $0.11 per kWh is 2 cents.

How to determine the energy

  • Power (P) = 200 watts = 200 / 1000 = 0.2 kW
  • Time (t) = 55 mins = 55 / 60 ≈ 0.917 h
  • Energy (E) = ?

E = Pt

E = 0.2 × 0.917

E = 0.1834 kWh

How to determine the cost

  • Cost per kWh = $0.11
  • Energy (E) = 0.1834 kWh
  • Cost = ?

Cost = energy × cost per kWh

Cost = 0.1834 × 0.11

Cost = $0.02

Multiply by 100 to express the cost in cents

Cost = 0.02 × 100

Cost = 2 cents
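The energy-first method (E = Pt, then cost = E × rate) can be sketched the same way (a minimal check; names are illustrative):

```python
# E = P * t method: compute energy in kWh first, then multiply by the rate.
power_kw = 200 / 1000        # 0.2 kW
time_h = 55 / 60             # 55 minutes in hours
rate = 0.11                  # dollars per kWh (given)

energy_kwh = power_kw * time_h   # ≈ 0.1833 kWh
cost = energy_kwh * rate         # dollars

print(f"${cost:.2f}")  # rounds to $0.02, i.e. 2 cents
```

Both methods agree because multiplying power by rate and then by time gives the same product in either order.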

Learn more about buying electrical energy:

https://brainly.com/question/16963941