I wonder if my system is good or bad. My server needs 0.1 kWh.

      • d_k_bo@feddit.org · 20 hours ago

        It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your device instead draws 360 000 W for a single second, it consumes the same 0.1 kWh of energy.
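
        Not part of the original comment, but a quick sketch of that arithmetic in Python (the helper name is made up for illustration): energy is power multiplied by time, so very different power levels over very different durations can add up to the same 0.1 kWh.

        ```python
        # energy = power * time: 1 W for 1 s is 1 J, and 1 kWh is 3 600 000 J
        def energy_kwh(power_watts: float, duration_seconds: float) -> float:
            joules = power_watts * duration_seconds
            return joules / 3_600_000

        print(energy_kwh(100, 3600))     # 100 W for an hour      -> 0.1 kWh
        print(energy_kwh(360_000, 1))    # 360 000 W for a second -> 0.1 kWh
        ```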

        • GravitySpoiled@lemmy.ml (OP) · 19 hours ago

          Thank you for explaining it.

          My computer uses 1 kWh per hour.

          It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4 W in 15 minutes to 16 Wh, because it would use 16 W per hour if it ran that long.

          Why can’t you simply assume that I mean 1 kWh per hour when I say 1 kWh, and not 1 kWh per 15 minutes?

          • 486@lemmy.world · 18 hours ago

            kWh is a unit of energy consumed. It doesn’t say anything about time, and you can’t assume any time period; that wouldn’t make any sense. If you want to say how much power a device consumes, just state how many watts (W) it draws.
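
            As a rough illustration (hypothetical Python, not from the comment): watts describe a rate, and only multiplying that rate by a number of hours produces a kWh figure.

            ```python
            # A device drawing a constant 100 W (a rate) consumes power * time
            # of energy, so the kWh number depends entirely on the duration.
            power_w = 100
            for hours in (0.25, 1, 24):
                print(f"{power_w} W for {hours} h -> {power_w * hours / 1000} kWh")
            # -> 0.025 kWh, 0.1 kWh and 2.4 kWh respectively
            ```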

      • elmicha@feddit.org · 19 hours ago

        0.1 kWh per hour can be written as 0.1 kWh/h, which is the same as 0.1 kW.
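
        A small sketch of that cancellation (hypothetical Python, assuming the 0.1 kWh was measured over exactly one hour): dividing energy by the time it took gives the average power back.

        ```python
        energy_kwh = 0.1   # energy consumed during the period
        period_h = 1.0     # length of the period in hours

        average_power_kw = energy_kwh / period_h   # kWh / h = kW
        print(average_power_kw)                    # 0.1 kW, i.e. 100 W
        ```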