Originally Posted by joshoisasleep
Originally Posted by the_experience03
For what it's worth there are plenty of uninterruptible power supplies for computers that use ten 12 volt batteries to reach 120 volts, but they still use inverters to produce the AC frequency computers need.
OK, so now you're just converting from DC to AC and not having to go from 12V to 110V anymore. Does this make it more energy efficient than constantly stepping the voltage up? I think I read on here that you get something like 1 watt at 110V for every watt you put in at 12V after running it all through the inverter and whatnot? I guess the key is more or less to use as few 110V items as possible.
On another note, what's the deal with running laptops off 12V? Their batteries run around that voltage anyway, don't they? It seems to me that converting to 110V and then back down to what they use with the AC adapter is a major waste.
I'm going to try and take this in pieces...
Ok, as for the large UPS systems...I do believe that running 120 vdc and then inverting to 120 vac would be more efficient. Power inverters are basically just a step-up transformer followed by an inductor-capacitor "oscillator" circuit. The step-up transformer is where a lot of the heat is generated, and every watt spent making heat instead of 120 vac is a watt wasted. Unfortunately, ten 12 volt batteries in series border a bit on the obnoxious side in terms of weight, size, and cost, and I don't know of any commercially available inverter meant for direct DC-to-AC conversion at the same voltage.
Oh...and 1 watt at 12 volts DC is the same as 1 watt at 120 volts AC. A watt is a unit of power and is universal across the board. Watts=amps*volts. I think the trick you're looking for is that 1 amp at 120 vac works out to nearly 10 amps on the DC side in our systems once you factor in inverter losses, input voltages, etc.
So...build around watts. Once you figure out how many watts you need to supply, you can work out the DC current side of things: just divide the total watts by 12 to get amps, and leave a bit of headroom for inverter losses.
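Here's that sizing math as a quick Python sketch. The 85% inverter efficiency and the 600 W load are just assumed ballpark figures for illustration, not from any particular unit:

```python
# Rough DC-side current sizing for a 12 V battery bank feeding an inverter.
def dc_amps(total_watts, system_volts=12.0, inverter_efficiency=0.85):
    """Approximate DC current draw for a given AC load, padding for losses."""
    return total_watts / (system_volts * inverter_efficiency)

# Example: 600 W of 120 VAC loads.
ac_amps = 600 / 120            # 5 A on the AC side
battery_amps = dc_amps(600)    # ~59 A on the 12 V side, i.e. 10x or more
print(round(ac_amps, 1), "A AC ->", round(battery_amps, 1), "A DC")
```

That factor of ten-plus between the AC and DC current is why the wire sizing on the battery side matters so much.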
Realistically I don't think you lose a tremendous amount by going to AC. Obviously there ARE in fact losses, but DC systems are not without fault either. By the time you run a cable 30 feet to the back of the bus you are talking about some SERIOUS line drop issues unless you spend some serious money on grossly oversized wire. AC doesn't suffer from these line drop issues to NEARLY the same extent making it cheaper and easier to wire in.
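To put a number on the line drop, here's a rough Ohm's-law sketch. The ~1 ohm per 1000 feet figure is the standard value for 10 AWG copper; the 20 A load is just an assumed example:

```python
# DC voltage drop over a cable run: V_drop = I * R. The resistance covers
# the round trip (out to the load and back), so double the one-way length.
AWG10_OHMS_PER_FT = 0.001  # ~1 ohm per 1000 ft for 10 AWG copper

def voltage_drop(amps, one_way_feet, ohms_per_ft=AWG10_OHMS_PER_FT):
    return amps * (2 * one_way_feet * ohms_per_ft)

drop = voltage_drop(20, 30)  # 20 A over a 30 ft run to the back of the bus
print(round(drop, 2), "volts lost")
```

Losing over a volt out of 12 is a 10% hit; the same 30 ft run at 120 vac would lose the same volt or so out of 120, which is why AC tolerates long runs on cheaper wire.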
Notebook computers run all over the board in terms of battery voltage. 3.6 nominal volts per cell is the generally accepted value for lithium ion cells. So...battery voltages will be multiples of this value. I've seen a few 10.8 vdc computer batteries, but most are 14.4 or higher. You will see some variation on this number because various manufacturers use different nominal voltages (3.65, 3.7, etc), but that gets you in the ballpark.
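Since pack voltage is just the nominal cell voltage times the number of series cells, the common pack voltages fall right out (3.6 V nominal assumed here, per the above):

```python
NOMINAL_CELL_V = 3.6  # typical lithium ion nominal cell voltage

# Common series cell counts in laptop packs and the resulting pack voltage.
for cells in (3, 4, 6):
    print(cells, "cells ->", round(cells * NOMINAL_CELL_V, 1), "V")
```

Three cells gives the 10.8 vdc packs mentioned above, and four cells gives the more common 14.4 vdc.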
Now for many laptops there are car cords available that just plug into the cigarette lighter. Unfortunately there is a problem with this. The average car charges at 14.4 volts or less, and even the most ambitious GM charging system rarely gets above 15 volts. The cords contain no step-up circuitry, and since charging a battery requires a voltage higher than the battery's own, you will rarely see a car cord fully charge a laptop battery, and what charging it does manage is slow. However, it will run the computer perfectly well.
The AC cords for laptops typically have a big power brick inline. This steps the AC voltage down and rectifies it to DC. Typical output on these is 16 vdc and up. Clearly they do a better job of charging thanks to the higher voltage, but you are absolutely correct that this is an inefficient way of doing things.
As a side note...the typical 18650 cell used in laptop packs should only be charged at a 0.8C rate. Most chargers go much higher than this because people want fast recharge times. The results are very hot batteries (which affects everything else in a cramped laptop environment) and poor lifespan.

Also, leaving your computer plugged in all the time with the battery in it is bad juju. Lithium ion batteries don't really suffer from memory-type effects and are very efficient at recharging, but they still only have so many charge cycles in them. All batteries self discharge over time and laptop batteries are no exception. The problem comes from them self discharging while plugged in: when the voltage drops enough, the charger kicks on and brings them back up. These short cycles are pretty easy on a battery, but over time they add up and eat into lifespan. The best bet is to pop the battery out if the computer is going to be plugged in for a while. Completely discharging the battery once in a while is also a good idea, if for no other reason than to reset the smart charger so you get more accurate time-remaining figures.
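For reference, a C-rate is just the cell's capacity per hour, so the 0.8C figure converts to amps directly. The 2200 mAh capacity below is a common 18650 rating used purely as an example:

```python
def charge_current_amps(capacity_mah, c_rate=0.8):
    """Charge current in amps for a given C-rate (1C = full capacity in one hour)."""
    return capacity_mah / 1000 * c_rate

print(round(charge_current_amps(2200), 2), "A max charge current")
```

Anything much above that ~1.8 A per cell is where the heat and lifespan problems start.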