
Ask Me Anything...I Have An Answer

Yoga Face

New member
Jun 30, 2009
6,328
19
0
here is something I do not understand


when a transformer increases voltage it lowers amperage

that way there is no free energy

as power is volts * amps (watts)


but amperage is not preset, as it depends on ohms (the resistance to the current), since amperage = volts / ohms

so what amperage are they talking about when they say a transformer that increases voltage lowers amperage?

for example, if there is no usage of electricity after the transformer transforms, then there are no ohms to speak of (the resistance is infinite, meaning no current), so how can a transformer decrease that which does not exist?


to readdress my question from another angle



if a line has 120 v and 20 ohms resistance then amperage is 120/20 or 6 amps

if a transformer is introduced that raises voltage to 240 then 240/20 is 12 amps

so increasing the volts increases the amperage; it does not decrease it!

the only answer I can think of is that when increasing voltage you increase the resistance to the current (even though it is the same thing still resisting, i.e. the same light bulb)
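to make those numbers concrete, here is a quick sketch of the fixed-resistance case (Python; the 20-ohm load is just my example figure):

```python
# Ohm's law with a fixed resistance: I = V / R and P = V * I.
# The 20-ohm load is just an example figure for illustration.
R = 20.0  # ohms -- the same device "still resisting" at both voltages

for V in (120.0, 240.0):
    I = V / R   # amps
    P = V * I   # watts
    print(f"{V:.0f} V across {R:.0f} ohms -> {I:.0f} A, {P:.0f} W")
```

so with the same 20 ohms, doubling the volts doubles the amps, which is exactly what puzzles me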
 

Ceiling Cat

Well-known member
Feb 25, 2009
28,526
1,306
113
Compute to the last digit the value of π.
 

Yoga Face

New member
Jun 30, 2009
6,328
19
0
Compute to the last digit the value of π.
that is not a question

here is an example of a question


"Is it alright to pet on the first date if both parties are mature and respect each other ? "


got it ?
 

Ceiling Cat

Well-known member
Feb 25, 2009
28,526
1,306
113
that is not a question

here is an example of a question

"Is it alright to pet on the first date if both parties are mature and respect each other ? "

got it ?
So, OK Alex! What is the last digit of π?
 


buttercup

Active member
Feb 28, 2005
2,569
4
38
Yoga Face said:
here is something I do not understand ... when a transformer increases voltage it lowers amperage ... (full question quoted above)



You are correct that watts = volts x amps. Also, ohms = volts / amps.

Suppose you apply 120 volts to a 60-watt light bulb -- the current flowing through the light bulb would be 0.5 amps. It follows that the 60-watt bulb has a resistance of 240 ohms (120 / 0.5 = 240). Similarly, a 40-watt bulb has a resistance of 360 ohms and draws a current of 0.33 amps at 120 volts.

What happens if you subject the 60-watt bulb to 240 volts rather than 120 volts? Its resistance stays the same, at 240 ohms. So the 60-watt bulb now draws one amp of current at 240 volts.

The bulb being rated only for 120 volts, the 240 volts will probably cause it to quickly fail. But while it lasts, the bulb will dissipate four times its rated power and shine far brighter.

The 60-watt bulb is actually incorrectly described as such. Strictly speaking, it is a 240-ohm bulb; it is only a 60-watt bulb if/when it operates at 120 volts.

So now, suppose you transform the electricity supply to the 240-ohm bulb up to 240 volts, as you suggest. Now, the 240-ohm bulb is operating at 240 volts, so it draws a current of one amp, and (before it blows) it burns electricity at the rate of 240 watts -- you would pay for the 240 watts it draws from the power station, not 60 watts.

Equally, if you transformed the power supply to the bulb down to 60 volts, the 240-ohm bulb now draws only 0.25 amps and burns electricity only at the rate of 15 watts.

You suggest that, in the case of a bulb having a fixed resistance, if you double the voltage applied to the bulb, you will double the current flowing through the bulb. That is a correct statement. You will also quadruple the rate (watts) at which the bulb burns electricity, since both the volts and the amps have doubled.
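Here is a quick sketch of those bulb numbers (Python; the one assumption is that the resistance stays a fixed 240 ohms -- a real filament's resistance actually rises as it heats up):

```python
# Model the "60-watt" bulb as a fixed resistance: R = V_rated**2 / P_rated.
# Assumption: constant resistance (a real filament's resistance rises with temperature).
R = 120.0 ** 2 / 60.0   # 240 ohms from the 120 V / 60 W rating

for V in (60.0, 120.0, 240.0):
    I = V / R           # amps drawn at this voltage
    P = V * I           # watts burned
    print(f"{V:.0f} V -> {I:.2f} A, {P:.0f} W")
```

That reproduces the 15 W at 60 volts, 60 W at 120 volts, and 240 W at 240 volts.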


But now, take the case of a power station that generates 500 megawatts. As we all know, the electricity from the power station is transformed to high voltage for long-distance distribution across country over power lines. If they were to put the 500MW on the grid at 1000 volts, only say 250MW would be available by the time the power got to the houses and factories; the rest would be lost as heat in the wires. But if they put the 500MW on the grid at 250,000 volts, now perhaps 490MW of the 500MW will make it to the consumers.

If you send the 500MW out at 1000 volts, the current in the wires would be 500,000 amps. If you send the 500MW out at 250,000 volts, the current in the wires is an easy 2000 amps.

So, they transform the 500MW from the power station to high voltage / low current. You can transform 500MW to e.g. 2000 amps at 250,000 volts, or to 1000 amps at 500,000 volts, or to any other volts x amps that computes to 500,000,000. The point is, when you are talking about transforming power, the power remains constant (apart from some loss) -- whereas when you are talking about doubling the voltage you apply to a load that has a fixed resistance, you double the amps and quadruple the watts.

Watt could be simpler?
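And to put rough numbers on the line-loss point, a small sketch (Python; the 0.5-ohm line resistance is a made-up figure, not from anything above):

```python
# Loss in the wires when sending power P at line voltage V:
# current I = P / V, heat loss = I**2 * R_line.
P = 500e6        # 500 MW from the power station
R_line = 0.5     # ohms -- made-up illustrative line resistance

for V in (1_000.0, 250_000.0, 500_000.0):
    I = P / V                  # amps in the wires
    loss = I ** 2 * R_line     # watts lost as heat
    print(f"{V:,.0f} V -> {I:,.0f} A, loss {loss / 1e6:,.1f} MW")
```

With this made-up line, the 1,000-volt case computes to a loss far bigger than the 500MW itself (i.e. you simply could not deliver the power that way), while at 250,000 volts the loss is only about 2MW.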
 

LuvDaKimChi

New member
Apr 18, 2008
54
0
0
buttercup said:
There's a question that's always puzzled me. The mystery is, how do horses eat food from nosebags?

I can see that, if the food is at just the right level in the bag, Harry the horse can eat and breathe at the same time. But, if you look at the geometry of a horse's head, its nostrils are at about the same level as its mouth. So, if the food in the nosebag is at the level that the horse can reach it with his lips and tongue, then he can't breathe. Equally, if Harry can breathe, he can't eat.

So, what does Harry actually do? Does he blow out through his nose at the same time as he breathes in through his mouth? Possible, but how does he know to do that?

Or, is it that when the chap starts tying the nosebag on, Harry takes a great gulp of air, and then very quickly munches his way down through the pile of oats in the nosebag, until he can take another breath? If not that, then what?

Of course, feeding horses from nosebags has become less prominent recently, what with the invention of the internal combustion engine and the advent of motor transport. But still, if you have the answer for me, I will be forever in your debt.
Very good question....after studying how horses eat from a nosebag, and watching the fat girls eat here at work...I can safely say that the horses have "elongated" lips that stretch down to grab the food.
 

LuvDaKimChi

New member
Apr 18, 2008
54
0
0
Yoga Face said:
here is something I do not understand ... when a transformer increases voltage it lowers amperage ... the only answer I can think of is that when increasing voltage you increase the resistance to the current (full question quoted above)
My answer is .... You've already answered it... "when increasing voltage you increase the resistance to the current (even though it is the same thing still resisting, i.e. the same light bulb)"
 

red

you must be fk'n kid'g me
Nov 13, 2001
17,572
8
38
LuvDaKimChi said: ... I can safely say that the horses have "elongated" lips that stretch down to grab the food.
that's true - but I have also seen horses push the feedbag against their stall to get at the food, and some horses tip it up so the grain falls into their mouth
 

Yoga Face

New member
Jun 30, 2009
6,328
19
0
buttercup said:
You are correct that watts = volts x amps. ... Watt could be simpler? (full answer quoted above)
it seems to me a step-up transformer uses less amperage to transmit the same power, so there is less loss through heat; then they step it down for the consumer

this is why we use AC, because DC cannot be stepped up by a transformer, I think

as to watt happens to current when voltage is stepped up ...


let me try to figure this out...


First, some terminology - the side of the transformer connected to the supply is called the primary, and the side connected to the resistor is called the secondary.


What I have done is compare the secondary current before I connected the transformer to the secondary current after I connected the transformer.

What I should be doing is comparing the primary and secondary currents after I connected the transformer.

So if you measure the primary current, you'll find that it equals 24A, or twice what it is on the secondary side, so the transformer does step down the current.

However, if you leave the secondary circuit open there is no power consumed, and therefore no current on either side of the transformer, so the ratio becomes trivial -- zero amps on both sides.

the current on the primary side will always be double the current on the secondary side if the transformer has doubled the voltage (ignoring losses)

does this sound correct ?


or do you find my answer shocking
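here is a small sketch of that reasoning (Python; assumes an ideal lossless transformer -- a real one also draws a small magnetizing current on the primary):

```python
import math

# Ideal transformer: Vp * Ip = Vs * Is, so Ip = (Vs / Vp) * Is.
# Assumption: lossless; a real transformer also draws a small magnetizing current.
Vp, Vs = 120.0, 240.0                # a voltage-doubling transformer

for R_load in (20.0, math.inf):      # a 20-ohm resistor, then an open secondary
    Is = Vs / R_load                 # secondary current (0 A when open)
    Ip = (Vs / Vp) * Is              # primary current -- double the secondary
    print(f"load {R_load} ohms -> secondary {Is:.0f} A, primary {Ip:.0f} A")
```

the 20-ohm load reproduces the 12 A / 24 A figures, and the open secondary gives zero on both sides -- the "nothing to step down" case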
 

Yoga Face

New member
Jun 30, 2009
6,328
19
0
My answer is .... You've already answered it... "when increasing voltage you increase the resistance to the current (even though it is the same thing still resisting, i.e. the same light bulb)"
I think that is the wrong answer


 

IM469

Well-known member
Jul 5, 2012
11,139
2,469
113
if a line has 120 v and 20 ohms resistance then amperage is 120/20 or 6 amps

if a transformer is introduced that raises voltage to 240 then 240/20 is 12 amps

so increasing the volts increases the amperage; it does not decrease it!
If I may ....

A transformer transforms power to a different voltage, but the power going in always equals the power going out (ignoring small losses): V*I comes to the same number of watts on both sides of the transformer. If you plug a transformer that puts out 240 V into a 20 ohm resistor, the current will be 12 amps and the power will be 240 x 12 = 2,880 Watts. On the 120 volt side of the transformer, with the same power demand, the current is 2,880/120 or 24 Amps (not 6 amps).
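A quick check of those numbers (Python; ideal transformer, losses ignored):

```python
# Power balance across an ideal transformer (losses ignored).
Vs, R = 240.0, 20.0   # secondary voltage and load resistance
Is = Vs / R           # 12 A through the 20-ohm resistor
P = Vs * Is           # 2,880 W demanded by the load
Vp = 120.0            # primary (supply) voltage
Ip = P / Vp           # 24 A on the primary side -- not 6 A
print(f"secondary {Is:.0f} A at {Vs:.0f} V, primary {Ip:.0f} A at {Vp:.0f} V, {P:.0f} W both sides")
```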
 