o2484680@rtrt
wrote...
Posts: 19
Rep: 0
11 years ago
I intend to heat some quantity of water (say 1 liter) by applying electrical energy to a resistance. How much power is needed? From the power, I will deduce what electrical resistance is necessary for use with, say, a 120 V electrical network outlet.
Replies (4)
wrote...
11 years ago
That depends upon the efficiency of heat transfer from the electrical device, and upon the time it takes to heat the water.

A low-power device can heat the water to the same temperature as a high-power device; it just takes longer.

You need to better define the constraints of your experiment:
- Starting temperature of the water.
- Desired end temperature.
- How fast you want to change the temperature.
- Material characteristics of your heating unit.

Done.
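To illustrate how those constraints determine the power, here is a short sketch; the starting temperature, target temperature, and heating time below are assumed values for illustration, not part of the question:

```python
# Energy to heat water: Q = m * c * dT;  power: P = Q / t
# (assumes 100% transfer efficiency; real heaters lose some heat)
m = 1.0          # kg of water (1 liter)
c = 4186.0       # J/(kg*K), specific heat of water
t_start = 20.0   # deg C, assumed starting temperature
t_end = 100.0    # deg C, assumed target temperature
t_heat = 300.0   # s, assumed heating time (5 minutes)

q = m * c * (t_end - t_start)  # joules required
p = q / t_heat                 # watts
print(f"Energy: {q:.0f} J, Power: {p:.0f} W")  # Energy: 334880 J, Power: 1116 W
```

With a longer heating time the required power drops proportionally, which is the point made above.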
Answer accepted by topic starter
smileylady
wrote...
11 years ago
[reply content hidden behind the site's sign-in wall]

wrote...
11 years ago
Electrically speaking, you would not want to draw much more than 1000 watts from your 120 V outlet, so the minimum resistance should be V^2 / P = 14 ohms or so. 10 ohms would give you 1440 watts at about 12 amps.
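Those figures can be checked directly with Ohm's law; the 1000 W ceiling is the poster's assumption about a safe draw from a household circuit:

```python
# R = V^2 / P: resistance needed to limit power to a target draw
v = 120.0               # volts, mains outlet
p_max = 1000.0          # watts, assumed safe draw
r_min = v ** 2 / p_max  # minimum resistance for that power
print(f"{r_min:.1f} ohms")  # 14.4 ohms

# Checking the 10-ohm example from the post:
r = 10.0
i = v / r               # amps drawn
p = v * i               # watts dissipated
print(f"{i:.0f} A, {p:.0f} W")  # 12 A, 1440 W
```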
wrote...
11 years ago
The definition of a BTU (British Thermal Unit) is the amount of energy required to raise the temperature of 1 pound of water by 1 degree F. There are 3,412 BTUs in 1 kilowatt-hour.

As a previous poster said, you need to define starting and ending temps and some other variables, perform some conversions based on the above and you should have your answer.
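As a sketch of that conversion applied to the original 1-liter question (the temperature range and the 2.2 lb/liter figure are assumptions for illustration):

```python
# 1 BTU raises 1 lb of water by 1 deg F; 1 kWh = 3,412 BTU
lbs = 2.2            # pounds of water in 1 liter (approx.)
dt_f = 144.0         # deg F rise, e.g. 68 F to 212 F (assumed)
btu = lbs * dt_f     # BTU required
kwh = btu / 3412.0   # kilowatt-hours
print(f"{btu:.1f} BTU = {kwh:.3f} kWh")  # 316.8 BTU = 0.093 kWh
```

Note this agrees with the metric calculation (about 335 kJ, i.e. roughly 0.093 kWh) for the same temperature rise.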