o2484680@rtrt wrote...
10 years ago
I intend to heat some quantity of water (say 1 liter) by applying electrical energy to a resistance. How much power is needed? From the power, I will deduce what electrical resistance is necessary for using, say, a 120 V electrical network outlet.
4 Replies
Replies
wrote...
10 years ago
That depends upon the efficiency of heat transfer from the electrical device, and upon the time it takes to heat the water.

A low power device can heat the water to the same temperature as a high power device, except that it takes longer.

You need to better define the constraints of your experiment:
Starting temperature of the water.
Desired end temperature.
How fast you want to change temperature.
Material characteristics of your heating unit.

Done.
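
For reference, here is a minimal sketch of the calculation those constraints feed into, in Python. The specific heat of water (about 4186 J/(kg·°C)) is standard; the temperature rise, heating time, efficiency, and variable names are illustrative assumptions, not values from the thread.

# Sketch: electrical power needed to heat water in a given time
mass_kg = 1.0        # 1 liter of water is roughly 1 kg
c_water = 4186.0     # specific heat of water, J/(kg*degC)
delta_t = 80.0       # assumed rise, e.g. 20 degC to 100 degC
time_s = 300.0       # assumed heating time of 5 minutes
efficiency = 0.9     # assumed fraction of electrical energy reaching the water

energy_j = mass_kg * c_water * delta_t        # heat required, in joules
power_w = energy_j / (time_s * efficiency)    # electrical power, in watts
print(round(power_w), "W")                    # about 1240 W for these assumptions

A slower heat-up (a larger time_s) drops the required power proportionally, which is the point of the reply above.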
Answer accepted by topic starter
smileylady
wrote...
10 years ago


wrote...
10 years ago
Electrically speaking, you would not want to draw much more than 1000 watts from your 120 V outlet, so the minimum resistance should be V^2 / P = 14 ohms or so. 10 ohms would give you 1440 watts at about 12 amps.
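
A quick check of those numbers, assuming an ideal, purely resistive load on a 120 V supply (the variable names are illustrative):

# Sketch: power and current for a resistive heater on 120 V
voltage = 120.0
for resistance in (14.4, 10.0):           # ohms
    power = voltage ** 2 / resistance     # P = V^2 / R
    current = voltage / resistance        # I = V / R
    print(resistance, "ohm ->", round(power), "W at", round(current, 1), "A")
# 14.4 ohm -> 1000 W at 8.3 A; 10.0 ohm -> 1440 W at 12.0 A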
wrote...
10 years ago
The definition of a BTU (British Thermal Unit) is the amount of energy required to raise the temperature of 1 pound of water by 1 degree F. There are 3,412 BTU in 1 kilowatt-hour.

As a previous poster said, you need to define starting and ending temperatures and some other variables; perform some conversions based on the above and you should have your answer.
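
As a worked example of that conversion, assuming roughly 1 liter (about 2.2 lb) of water heated from 68 °F to 212 °F; the start and end temperatures are assumptions, not values from the thread:

# Sketch: BTU to kilowatt-hour conversion for heating 1 liter of water
pounds = 2.2                  # about 1 liter of water
temp_rise_f = 212.0 - 68.0    # assumed start 68 F, end at boiling
btu = pounds * temp_rise_f    # 1 BTU raises 1 lb of water by 1 degree F
kwh = btu / 3412.0            # 3,412 BTU per kilowatt-hour
print(round(btu), "BTU =", round(kwh, 3), "kWh")   # about 317 BTU = 0.093 kWh

This agrees with the metric figure (1 kg x 4186 J/(kg*degC) x 80 degC is about 0.093 kWh), so either unit system gives the same energy; the required power then follows from how fast you want that energy delivered.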